Data Engineer

ATOM Insurance

Posted 30+ days ago

Experience

3 - 5 Years

Education

Bachelor's in Computer Application (Computers)

Nationality

Any Nationality

Gender

Not Mentioned

Vacancy

1 Vacancy

Job Description

Roles & Responsibilities

  • Job description

    You will be working on projects with teams from across our ATOM Platform and Business Intelligence/Data Science practice. Reporting to the Chief Product Officer, you will deliver digestible, contemporary and immediate data content to support and drive business decisions. The key focus of the role is to deliver reports, dashboards and custom solutions for various business critical requirements. You will be involved in all aspects of data engineering from delivery planning, estimating and analysis, all the way through to data architecture and pipeline design, delivery and production implementation. From day one, you will be involved in the design and implementation of complex data solutions ranging from batch to streaming and event-driven architectures, across cloud, on-premise and hybrid client technology landscapes.

    As a Data Engineer, you will:

    Build different types of data warehousing layers based on specific use cases

    Lead the design, implementation, and successful delivery of large-scale, critical, or difficult data solutions involving a significant amount of work

    Build scalable data infrastructure and understand distributed systems concepts from a data storage and compute perspective

    Utilize expertise in SQL and have a strong understanding of ETL and data modeling

    Ensure the accuracy and availability of data to customers and understand how technical decisions can impact their business's analytics and reporting

    Be proficient in at least one scripting/programming language to handle large volume data processing

    We are looking for 5+ years of experience in data engineering in a customer or business facing capacity.

  • Knowledge and Skills

    To be considered for this role you will have:

    Ability to understand and articulate requirements to technical and non-technical audiences

    Stakeholder management and communication skills, including prioritising, problem solving and interpersonal relationship building

    Strong experience in SDLC delivery across waterfall, hybrid and Agile methodologies, including hands-on delivery in an Agile environment

    Experience of implementing and delivering data solutions and pipelines on AWS Cloud Platform

    A strong understanding of data modelling, data structures, databases, and ETL processes

    An in-depth understanding of large-scale data sets, including both structured and unstructured data

    Strong SQL and Python knowledge

    Knowledge and experience of delivering CI/CD and DevOps capabilities in a data environment

    Basic qualifications:

    At least 3-5 years of consulting or client service delivery experience on Amazon Web Services (AWS)

    At least 3-5 years of experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL and data warehouse solutions

    Extensive experience providing practical direction within the AWS native and Hadoop ecosystems

    Experience with private and public cloud architectures, pros/cons, and migration considerations

    Minimum of 5 years of hands-on experience in AWS and big data technologies such as Java, Node.js, C#, Python, SQL, EC2, S3, Lambda, Spark/SparkSQL, Hive/MR, Pig, Oozie, and streaming technologies such as Kafka, Kinesis, NiFi, etc.

    3-5+ years of hands-on experience in programming languages such as Java, C#, Node.js, Python, PySpark, Spark, SQL, Unix shell/Perl scripting, etc.

    Experience working with DevOps tools such as GitLab, Jenkins, CodeBuild, CodePipeline, CodeDeploy, etc.

    Bachelor's or higher degree in Computer Science or a related discipline

    Must have at least one AWS certification:

    AWS Certified Developer - Associate

    AWS Certified DevOps Engineer - Professional

    AWS Certified Big Data - Specialty

    Nice-to-have skills/qualifications:

    DevOps on an AWS platform. Multi-cloud experience a plus

    Experience developing and deploying ETL solutions on AWS using tools like Talend, Informatica, Matillion

    Strong in Java, C#, Spark, PySpark, Unix shell/Perl scripting

    IoT, event-driven, microservices, containers/Kubernetes in the cloud

