Vezeeta is the leading Digital Healthcare Platform in the Middle East and Africa, transforming healthcare accessibility across the region. Having served more than 10 million patients to date across 78 cities in 6 countries through cutting-edge digital solutions, we empower patients throughout the entirety of their healthcare journey, allowing them to access the quality care they deserve.

Our organization is growing quickly, creating demanding roles for like-minded, forward-thinking individuals - with plenty of room for professional and personal growth. If you share our belief in turning healthcare pain points into productive tech-enabled solutions, and can add to our positive culture, we’d love to hear from you.

The Role

The engineering department at Vezeeta is made up of highly skilled technology engineers with the business knowledge and technical expertise to implement strategies, evaluate processes, and build an infrastructure that supports the ever-changing needs of our customers. In this role, you will work with different systems, designing, developing, and integrating them at every stage of the life cycle. If you can reliably deliver mission-critical solutions with accessibility, optimization, and security in mind, you could be our ideal candidate. This position will see you gain experience and skills in a wide range of areas while working at the forefront of the health-tech industry.

Responsibilities:

  • Provide analytical reports and recommendations to improve product features or build new data products based on review and analysis of real-time and historical data.
  • Report regularly on the quality of data acquired internally and externally.
  • Build analytics tools that utilize the data pipeline to provide insights into operational efficiency and key business performance metrics.
  • Develop KPIs for analytics teams that help them turn data into decision-making tools.
  • Create and maintain optimal data pipeline architecture for real-time and batch analysis; develop, construct, test, and maintain architectures aligned with business requirements.
  • Develop and automate data acquisition processes.
  • Identify ways to improve data reliability, efficiency, and quality.
  • Prepare data for predictive and prescriptive modeling.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
  • Perform other related duties and responsibilities associated with the position.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.

Requirements

  • Bachelor's degree in Computer Science, Computer Engineering, Statistics, Informatics, Information Systems or related subject from a reputable university.
  • Advanced working knowledge of SQL and experience with relational databases and query authoring, as well as working familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Strong understanding of Statistical algorithms and their applications.
  • Experience designing and developing ETL processes, data pipelines, and workflow management tools.
  • Data modelling experience building logical and physical data models.
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift.
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Experience with one or more programming languages: Python, Java, C++, Scala, R, etc.
  • Highly analytical thinking with demonstrated talent for identifying, scrutinizing, improving, and streamlining complex work processes.
  • Exceptional listener and communicator who effectively conveys information verbally and in writing.

Areas that Add to Your Strength

If you are...

  • Agile: You can live with, adapt to, and help others navigate every change around you in the organization.
  • Enthusiast: You possess a zest for the job; you smile easily and have a positive, eager, and responsive attitude.
  • Negotiator: You navigate the negotiation process to achieve mutually beneficial results.
  • Organized: You own the world if you can leave your car keys and find them in the same place every time.
  • Influencer: When you communicate with others, they listen, process, and then act on what you said.