Nationality: Any Nationality
Gender: Not Mentioned
Vacancies: 1
Job Description
Roles & Responsibilities
- Design, build, and maintain scalable data ingestion pipelines and ETL/ELT workflows using tools such as Apache NiFi and Cribl to ensure timely and reliable data delivery.
- Develop and manage data transformation logic and semantic data models using platforms such as Dataiku and Dataflow to create consistent and trustworthy datasets.
- Architect and manage data lake and data warehouse solutions on GCP, including BigQuery and Cloud Storage, applying best practices for performance, security, and cost optimisation.
- Implement and maintain monitoring and log analytics solutions, such as Cloud Logging, to proactively manage data quality and pipeline health.
- Collaborate closely with data scientists, analysts, and engineering teams, acting as a technical authority on data architecture and contributing to cross-functional decision-making.
What We Offer
- Opportunities to work with modern, cloud-native data platforms at scale.
- Exposure to complex, enterprise-level data challenges across multiple markets.
- A collaborative environment that values learning, innovation, and continuous improvement.
- The chance to influence data architecture decisions that underpin business-critical insights.
What You Will Learn
- Advanced optimisation techniques for large-scale cloud data platforms.
- Best practices for designing resilient, cost-efficient data pipelines on GCP.
- Enhanced stakeholder engagement and technical leadership skills within cross-functional teams.
- Deeper expertise in data quality, monitoring, and operational excellence.
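To make the transformation and modelling duties above concrete, here is a minimal, purely illustrative Python sketch of the kind of dimensional-modelling logic the role describes: splitting raw events into a deduplicated dimension table and a fact table joined by surrogate keys. All record fields and function names here are hypothetical examples, not part of the actual VOIS stack.

```python
# Illustrative only: a toy star-schema transform. In the real role this logic
# would live in a platform such as Dataiku or Dataflow; the field names below
# (customer, market, amount) are invented for the example.

raw_events = [
    {"customer": "alice", "market": "UK", "amount": 10.0},
    {"customer": "bob",   "market": "DE", "amount": 25.5},
    {"customer": "alice", "market": "UK", "amount": 7.25},
]

def build_star_schema(events):
    """Return (dim_rows, fact_rows) in a simple star-schema shape."""
    surrogate = {}             # natural key -> surrogate key
    dim_rows, fact_rows = [], []
    for ev in events:
        nk = (ev["customer"], ev["market"])
        if nk not in surrogate:
            surrogate[nk] = len(surrogate) + 1
            dim_rows.append({"customer_sk": surrogate[nk],
                             "customer": ev["customer"],
                             "market": ev["market"]})
        fact_rows.append({"customer_sk": surrogate[nk],
                          "amount": ev["amount"]})
    return dim_rows, fact_rows

dims, facts = build_star_schema(raw_events)
print(len(dims), len(facts))   # 2 dimension rows, 3 fact rows
```

The same shape scales to a warehouse setting: the dimension table is loaded once per batch and the fact table references it by surrogate key, which is the normalised/dimensional distinction the profile below asks for.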
Desired Candidate Profile
- An experienced data engineer with strong hands-on expertise in building and managing data ingestion pipelines and ETL/ELT workflows.
- Skilled in data transformation and data modelling, with a solid understanding of dimensional and normalised modelling approaches.
- Proficient in GCP data services, including BigQuery and Cloud Storage, within large-scale data lake and warehouse environments.
- Highly competent in advanced SQL, including query optimisation and working with complex data structures.
- A collaborative communicator who takes ownership of delivery and works effectively with both technical and non-technical stakeholders.
- Holds a Bachelor's degree (or equivalent) in Computer Science, Computer Engineering, or a related field; relevant cloud data certifications are advantageous.
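As a small, hedged illustration of the "advanced SQL, including query optimisation" skill above, the sketch below reads a query plan before and after adding an index. It uses SQLite from the Python standard library purely as a stand-in for a warehouse engine such as BigQuery; the table, column, and index names are invented for the example.

```python
import sqlite3

# Illustrative sketch only: inspecting a query plan to verify that an index
# turns a full table scan into an index search. SQLite stands in for a real
# warehouse engine; all names here are hypothetical.

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, i % 100, float(i)) for i in range(1_000)])

QUERY = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

def plan(sql: str) -> str:
    """Return the textual plan detail for a statement."""
    rows = con.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " | ".join(row[-1] for row in rows)  # last column holds the detail text

before = plan(QUERY)   # full table scan: every row is read
con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(QUERY)    # index search: only matching rows are read

print("before:", before)
print("after: ", after)
```

The same discipline applies in BigQuery, where partitioning and clustering (rather than indexes) are the usual levers, but the habit of checking how much data a query touches before and after a change is identical.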
Company Industry
- Telecom
- ISP
Department / Functional Area
- IT Software
Keywords
- Senior Data Engineer (VOIS)
Disclaimer: Naukrigulf.com is only a platform to bring jobseekers and employers together. Applicants are advised to independently research the bona fides of the prospective employer. We do NOT endorse any requests for money payments and strictly advise against sharing personal or bank-related information. We also recommend you visit Security Advice for more information. If you suspect any fraud or malpractice, email us at abuse@naukrigulf.com.
Vodafone