Do you want to help build a world-class institution from the region, experience the thrill of being part of a high-growth technology company, and improve people's lives?
There is only one way to build an awesome institution: attract exceptionally talented people who are aligned with the mission of the organisation and make them partners in success. At Careem, our mission is to simplify and improve the lives of people, initially through solutions that make transportation in the region reliable, and over time through disruptions in payments and logistics. In the process, we want to build an organisation that inspires and becomes a world-class institution from the region.
Founded in 2012 by former entrepreneurs and McKinsey alums, Careem is the MENA region's leading ride-hailing service and newest tech unicorn. With 30% monthly growth, we now operate in 80+ cities across 12+ countries and host over 6 million users. With our recent Series D funding success, we are positioned on the cusp of significant scale and well on target to deliver our goal of creating one million jobs in the region by 2018.
ABOUT THE ROLE:
• Answer complex analytical questions from big datasets to help Careem shape its products and services
• Implement robust data pipelines for machine learning algorithms that will be used in production
• Design real-time/streaming jobs for analytics and production purposes
• Implement large-scale machine learning algorithms
• Design, implement, and support ETLs/ELTs that process terabytes of data each week
• Build data-driven services and solutions that empower Careem's different products, either by working on them directly or by improving our large-scale data infrastructure
• Transform raw data from different sources, using different big data tools, into meaningful insights
• Challenge the status quo, continually investigate new data-processing technologies, and ensure that we follow industry best practices
ABOUT YOU:
• Experience with distributed analytic processing technologies (Hadoop, Hive, Spark, Presto, Redshift, etc.)
• Strong problem-solving skills and the ability to troubleshoot swiftly
• Ability to write well-abstracted, reusable and clean code components
• Working knowledge of ETL and data warehousing concepts
• A high degree of comfort leveraging SQL to manipulate data
• Exceptional communication skills, with the ability to manage multiple tasks and work to tight deadlines
• Experience in implementing scalable machine learning data pipelines is a plus.
• Experience with AWS is desirable.
We offer an attractive total compensation package, with an emphasis on equity compensation, excellent health benefits, and monthly Careem credits. You will have a unique opportunity to join a fast-growing company on the ground floor and shape its direction.
Industry Type:
Logistics / Transportation / Warehousing / Courier
Functional Area:
DBA / Data Warehousing (IT Software)