Do you want to help build a world-class institution from the region, experience the thrill of being part of a high-growth technology company, and improve people's lives?
There is only one way to build an awesome institution: attract exceptionally talented people who are aligned with the mission of the organisation and make them partners in success. At Careem, our mission is to simplify and improve the lives of people, initially through solutions that make transportation in the region reliable, and over time, through disruptions in payments and logistics. In the process, we want to build an organisation that inspires and becomes a world-class institution from the region.
Founded in 2012 by former entrepreneurs and McKinsey alums, Careem is the MENA region's leading ride-hailing service and its newest tech unicorn. With 30% monthly growth, we now operate in 80+ cities across 13+ countries and host over 12 million users. With our recent Series D funding success, we are positioned on the cusp of significant scale and well on target to deliver our goal of creating one million jobs in the region by 2018.
ABOUT THE ROLE
You will develop cross-platform ETL processes, maintain systems for tracking data quality and consistency, and use databases in a business environment with large-scale, complex data sets.
You will be building end-to-end, high-performance, robust and scalable data processing systems using varied forms of data infrastructure, including: RDBMS (PostgreSQL, MySQL, Redshift); NoSQL databases; Elasticsearch; Redis; the Hadoop ecosystem; and logging/messaging systems (Kafka, Firehose).
You will have an opportunity to interact with a talented team of engineers at all levels of experience, particularly in the area of large-scale distributed systems. You will also optimize and improve existing data pipelines to support our growth and our initiatives around performance and scalability.
• Develop data architecture, data modeling, and ETL mapping solutions within AWS cloud
• Provide support by writing complex SQL queries against large amounts of data to answer business questions
• Monitor the DWH and BI systems performance and integrity, provide corrective and preventative maintenance as required
• Develop reporting applications using various BI tools
• Serve as (or grow into) a subject matter expert
• Develop new data pipelines and work with data scientists, data engineers and product managers to add new data sources or new views in our data
• Implement and maintain data models using business intelligence tools in order to allow our business users to access the data directly and drive decisions
ABOUT YOU
• Excellent knowledge of, and a strong track record working with, relational database systems
• Very strong SQL skills
• Experience with data warehousing, and knowledge of ETL processes and tooling
• Hands-on experience with both relational and dimensional database modelling
• Experience with building and operating data marts/warehouses
• Experience working with multiple Business Intelligence tools, building reports and dashboards
• Ability to articulate technical challenges and solutions effectively, deal with loosely defined problems and fast-changing requirements, and think abstractly
• Good experience in programming or scripting languages
• Passionate about learning new technologies and working on a product of massive scale and impact
• Familiarity with Big Data technologies (e.g. Hive, Spark, Presto, Athena, BigQuery) is a plus
WHAT WE OFFER
• Competitive remuneration and equity shares
• Premium medical insurance (including spouse and children)
• 25 working days annual leave
• Discounted Careem rides plus free credits in line with company growth
• Entrepreneurial working environment
• Flexible working arrangements
• Mentorship and career growth
Industry Type: Logistics / Transportation / Warehousing / Courier
Functional Area: Corporate Planning / Consulting / Strategy / M&A