Job Title: Data Engineer
Location: Lagos
Job type: Full-time
Reporting line: Head of Data
About ENGIE Energy Access:
ENGIE Energy Access is one of the leading Pay-As-You-Go (PAYGo) and mini-grid solutions providers in Africa, with a mission to deliver affordable, reliable and sustainable energy solutions and life-changing services with an exceptional customer experience. The company is the result of the integration of Fenix International, ENGIE Mobisol and ENGIE PowerCorner, and develops innovative, off-grid solar solutions for homes, public services and businesses, enabling customers and distribution partners to access clean, affordable energy.
The PAYGo solar home systems are financed through affordable installments from $0.19 per day, and the mini-grids foster economic development by enabling productive use of electricity and triggering business opportunities for entrepreneurs in rural communities. With over 1,700 employees, operations in nine countries across Africa (Benin, Côte d'Ivoire, Kenya, Mozambique, Nigeria, Rwanda, Tanzania, Uganda and Zambia), over 1.2 million customers and more than 6 million lives impacted so far, ENGIE Energy Access aims to remain the leading clean energy company, serving millions of customers across Africa by 2025.
Job Purpose/Mission
- This position will be part of the Global Data team. This is an incredible opportunity to join a high-performing team that is passionate about pioneering expanded financial services to off-grid customers at the base of the pyramid.
- Key responsibilities will include building and maintaining data pipelines between our main transactional and analytics databases, and ingesting IoT data delivered from devices, PBX call data, and data from our in-house ticketing system.
- You will also be responsible for building pipelines that deliver data in real time to our field team mobile application, allowing data-informed decisions to be made in the field, and for working with members of the data team to ensure high code quality and sound database design.
- Your work will make a meaningful impact by enabling ENGIE to continuously innovate on how we support our customers in their repayment journey.
Responsibilities
Data Modelling and Extract, Transform and Load (ETL):
- Design and implement robust data models to support analytics and reporting requirements.
- Develop and maintain scalable ETL processes from various sources, including multiple ERP systems, into a data warehouse (a minimal sketch follows this list).
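For illustration only, the sketch below shows the general shape of a minimal batch ETL job in Python, moving ERP data into a warehouse staging table. The table names, columns and connection strings are hypothetical placeholders, not ENGIE's actual systems.

```python
# Illustrative sketch: extract recent ERP invoices, normalise them, and load
# them into a warehouse staging table. All names and URLs are placeholders.
import pandas as pd
from sqlalchemy import create_engine

ERP_DB = "postgresql://user:password@erp-host:5432/erp"              # hypothetical source
WAREHOUSE_DB = "postgresql://user:password@dwh-host:5439/analytics"  # hypothetical target


def run_invoice_etl() -> int:
    """Extract new invoices, standardise amounts, load them into the warehouse."""
    source = create_engine(ERP_DB)
    target = create_engine(WAREHOUSE_DB)

    # Extract: pull the last day's rows from the ERP system.
    df = pd.read_sql(
        "SELECT invoice_id, customer_id, amount_cents, currency, created_at "
        "FROM invoices WHERE created_at >= NOW() - INTERVAL '1 day'",
        source,
    )

    # Transform: convert cents to a decimal amount and derive a reporting date.
    df["amount"] = df["amount_cents"] / 100.0
    df["invoice_date"] = pd.to_datetime(df["created_at"]).dt.date
    df = df.drop(columns=["amount_cents", "created_at"])

    # Load: append the cleaned rows into a staging table in the warehouse.
    df.to_sql("stg_invoices", target, if_exists="append", index=False)
    return len(df)


if __name__ == "__main__":
    print(f"Loaded {run_invoice_etl()} invoice rows")
```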
Data Ingestion and Pipeline Automation:
- Implement data validation and quality checks to ensure accuracy and completeness.
- Design, build, and maintain automated data pipelines to streamline data processing and transformation.
- Utilize orchestration tools to schedule and monitor pipeline workflows (see the illustrative sketch after this list).
- Collaborate with data analysts to understand data requirements and support their analysis needs.
- Optimize data structures and queries to enhance performance and usability for analytical purposes.
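For illustration only, the sketch below shows a minimal Apache Airflow DAG (Airflow is part of the stack listed under Technology) that chains an ingest step with a simple data-quality check. The DAG, task and function names are hypothetical, and the validation is a placeholder.

```python
# Illustrative sketch: a daily Airflow DAG that ingests payment events and then
# validates that rows were actually loaded. Names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_payments(**context):
    # Placeholder: pull the previous day's payment events into the warehouse.
    print("ingesting payments for", context["ds"])


def validate_payments(**context):
    # Placeholder data-quality check: fail the run if nothing was loaded.
    rows_loaded = 100  # in practice, query the staging table for a row count
    if rows_loaded == 0:
        raise ValueError("No payment rows loaded; failing the pipeline")


with DAG(
    dag_id="payments_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ style; older versions use schedule_interval
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_payments)
    validate = PythonOperator(task_id="validate", python_callable=validate_payments)

    ingest >> validate  # the validation task only runs after a successful ingest
```

The design point is the dependency arrow: quality checks run as their own task so a bad load fails visibly in the scheduler instead of silently reaching dashboards.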
Data Warehousing:
- Design and optimize data warehousing solutions to support business intelligence and analytics needs.
- Implement data modelling techniques to organize and structure data for optimal querying and analysis (a star-schema sketch follows this list).
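For illustration only, the sketch below outlines a hypothetical star schema for repayment analytics in Redshift-flavoured DDL, kept as a Python string so it can live alongside pipeline code. Every table and column name is a placeholder.

```python
# Illustrative sketch: a dimensional (star-schema) model for repayment
# analytics. Fact rows join to dimensions via surrogate keys so dashboards can
# slice payments by customer attributes and time.
FACT_AND_DIMENSIONS = """
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_key   BIGINT IDENTITY(1,1),
    customer_id    VARCHAR(64) NOT NULL,   -- natural key from the ERP
    country        VARCHAR(32),
    signup_date    DATE
);

CREATE TABLE IF NOT EXISTS dim_date (
    date_key       INT NOT NULL,           -- e.g. 20240131
    full_date      DATE NOT NULL,
    month          SMALLINT,
    year           SMALLINT
);

CREATE TABLE IF NOT EXISTS fact_repayment (
    customer_key   BIGINT NOT NULL,
    date_key       INT NOT NULL,
    amount_usd     DECIMAL(12, 2),
    channel        VARCHAR(32)             -- e.g. mobile money, cash
)
DISTKEY (customer_key)
SORTKEY (date_key);
"""

if __name__ == "__main__":
    # In practice this DDL would be applied to the warehouse by a migration
    # tool or SQL client; printing keeps the sketch self-contained.
    print(FACT_AND_DIMENSIONS)
```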
Optimization and Performance Tuning of Data Dashboards:
- Troubleshoot and fix issues on existing reports/dashboards while continuously building improvements.
- Optimize dashboard performance and ensure responsiveness for large datasets and complex queries (a pre-aggregation sketch follows this list).
- Design and build data visualizations and dashboards.
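For illustration only, the sketch below shows one common way to keep dashboards responsive over large datasets: pre-aggregating the fact table into a daily summary that dashboards query instead of scanning raw events. It reuses the hypothetical table names from the star-schema sketch above; a materialized view would achieve the same effect.

```python
# Illustrative sketch: build a daily summary table for dashboards so that
# typical charts scan a few thousand aggregated rows instead of raw events.
DAILY_SUMMARY_SQL = """
CREATE TABLE agg_daily_repayments AS
SELECT
    f.date_key,
    c.country,
    COUNT(*)          AS payments,
    SUM(f.amount_usd) AS total_usd
FROM fact_repayment f
JOIN dim_customer c
  ON c.customer_key = f.customer_key
GROUP BY f.date_key, c.country;
"""
```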
Qualifications and Experience
- Bachelor’s or Master’s Degree in Computer Science, Machine Learning, or related field
- 5+ years of industry experience working on data engineering with a focus on data ingestion, data warehousing, pipeline automation, and ETL development
- Experience building infrastructure to support streaming or offline data.
- Extensive programming experience in Python/Scala/Java
- Experience with SQL in addition to one or more of Spark/Hadoop/Hive/HDFS
- Working knowledge of databases, data systems, and analytics solutions, including proficiency in SQL, NoSQL, Java, Spark and Amazon Redshift for reporting and dashboard building.
- Experience with implementing unit and integration testing.
- Ability to gather requirements and communicate with stakeholders across data, software, and platform teams.
- Ability to develop a strategic vision for data pipelining and infrastructure.
- Experience managing a team of mid-level engineers.
- Sense of adventure and willingness to dive in, think big, and execute with a team
Language(s):
- English
- French is a plus.
Technology:
- Python, Java, SQL, NoSQL, Amazon Redshift, Kafka, Apache Beam, Apache Airflow, Apache Spark