Job Title: Principal, Data Engineer
Location: Lagos
Job type: Full-time
Department: Technology
About Cellulant Nigeria Limited:
Cellulant is Africa’s no. 1 company in the payments & transfers category (Fintech Awards 2016). We are a PPISP (Payment Platform Infrastructure Service Provider) regulated by the Central Bank of Nigeria (CBN) and insured by the Nigerian Deposit Insurance Corporation (NDIC).
Job Description:
- As the Principal Engineer in the Database & Data Engineering Department, you will be responsible for building and maintaining Cellulant’s rapidly expanding data infrastructure.
- You will architect, implement, and maintain our data infrastructure while ensuring high availability, performance, and security of our databases.
- The ideal candidate will have deep expertise in database design, data pipelines, ETL processes, AWS data solutions, on-prem database setups, and cloud-native architectures.
Core Responsibilities:
- Build different types of data warehousing solutions to meet Cellulant’s data needs.
- Lead the design, implementation, and successful delivery of large-scale data solutions involving multiple data sources.
- Build scalable data infrastructure and understand distributed systems concepts from a data storage perspective.
- Utilize expertise in SQL, ETL, and data modeling.
- Ensure the accuracy and availability of data and understand how technical decisions impact analytics and reporting.
- Deploy and maintain relational and NoSQL databases, including MySQL, PostgreSQL, Oracle, and Redis.
- Optimize database performance, indexing, partitioning, and query tuning.
- Manage database migrations and upgrades.
- Implement high availability, clustering, replication strategies, and disaster recovery solutions for on-prem and cloud databases.
- Work closely with networking and security teams to ensure secure database access.
- Develop and manage ETL pipelines using tools like AWS Glue and Lambda (see the sketch after this list).
- Ensure data quality, governance, and compliance with standards like GDPR, HIPAA, and SOC 2.
- Mentor and coach junior data engineers and DBAs, fostering a culture of automation and DevOps best practices.
- Leverage AWS Database Migration Service (DMS) for seamless data replication and migration.
- Utilize data streaming platforms like Kafka.
- Monitor and analyze database performance using Datadog and other observability tools.
- Implement infrastructure as code (IaC) using tools like Terraform and Ansible for database provisioning and automation.
- Establish data security and encryption best practices across database environments.
- Develop strategies for cost optimization and resource efficiency across cloud-based and on-prem data infrastructure.
- Define SLAs, performance benchmarks, and monitoring standards for data services.
- Collaborate with product and analytics teams to ensure scalable data solutions align with business goals.
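For illustration only, the ETL bullet above might look like the following minimal Python sketch: a Lambda handler that starts an AWS Glue job via boto3 when a new file lands in S3. The job name, bucket layout, and job arguments are hypothetical placeholders, not Cellulant’s actual pipeline.

    # Minimal sketch only: trigger a (hypothetical) AWS Glue ETL job from a Lambda
    # handler when a new file arrives in S3. Names and arguments are placeholders.
    import boto3

    glue = boto3.client("glue")

    def handler(event, context):
        # S3 put-event: extract the bucket and key of the newly arrived file.
        record = event["Records"][0]["s3"]
        bucket = record["bucket"]["name"]
        key = record["object"]["key"]

        # Start the Glue job, passing the file location as job arguments.
        response = glue.start_job_run(
            JobName="payments-transactions-etl",   # hypothetical job name
            Arguments={
                "--source_path": f"s3://{bucket}/{key}",
                "--target_database": "analytics",  # hypothetical warehouse database
            },
        )
        return {"JobRunId": response["JobRunId"]}

In practice a handler like this would sit behind an S3 event notification, with the heavy transformation work done inside the Glue job itself.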
Key Relationships:
- Software Engineers.
- Infrastructure Engineers.
- CI/CD Engineers.
- Data Warehouse Team.
- Customer Success Teams.
- Service Operations Team.
Qualifications:
- A Degree in Computer Science or a related field.
- ISEB/ISTQB Foundation Level certification or better.
- Working knowledge of Docker.
- An interest in growing your knowledge and skills in Test Environment Provisioning and Configuration using technologies like Terraform, Ansible, Kubernetes, GCP, or AWS.
- Knowledge of Continuous Integration systems (e.g., Jenkins, Travis, GitLab); programming languages and tools such as Python, Selenium, Java, XML, SQL, and JavaScript; and REST API testing tools such as Postman, SOAP UI, and JMeter.
Experience:
- 8+ years of experience in database administration and data engineering.
- Expertise in SQL database administration, tuning, and optimization.
- Experience with AWS RDS, Aurora, and DynamoDB.
- Vast experience deploying large-scale SQL databases and data warehouse platforms.
- Experience deploying services on Managed Databases & Data Warehouses using cloud providers like AWS, GCP, and Azure.
- Proven experience with on-prem database architecture, high availability, and disaster recovery strategies.
- Familiarity with security best practices and compliance frameworks.
- Strong experience with IaC tools like Terraform, Ansible, or CloudFormation.
- Experience with deploying non-relational (NoSQL/NewSQL) databases is an added advantage.
- Solid experience in the administration of Linux environments.
- Solid experience in the administration of data streaming platforms, e.g., Kafka.
- Experience with data programming/scripting languages, e.g., Python or R, is desirable.
- A solid understanding of Internet-based technologies (TCP/IP, DNS, security, HTTP/HTTPS).
- Strong security awareness for cloud and on-prem databases.
- Ability to document key design and operational practices.
- Experience with Machine Learning model creation, training, and operationalisation.
- Familiarity with data lake architectures and big data frameworks.
Skills:
- Database Administration for SQL and NoSQL environments.
- Linux Administration.
- Proficiency in data streaming platforms such as Kafka (see the consumer sketch after this list).
- Programming/scripting languages (Python, R) desirable.
- Infrastructure as Code (IaC) proficiency (Terraform, Ansible, CloudFormation).
- Database security and encryption best practices.
- Performance tuning and capacity planning.
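As a rough indication of the streaming skills listed above, a minimal Kafka consumer sketch in Python (using the kafka-python package) might look like the following; the broker address, topic name, and consumer group are hypothetical placeholders, not an actual Cellulant service.

    # Minimal sketch: consume payment events from a hypothetical topic and print them.
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "payments.transactions",                  # hypothetical topic
        bootstrap_servers=["localhost:9092"],     # placeholder broker
        group_id="data-engineering-demo",
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    for message in consumer:
        event = message.value
        # In a real pipeline this is where you would validate, transform,
        # and load the event into the warehouse.
        print(message.topic, message.partition, message.offset, event.get("transaction_id"))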
How to Apply:
Interested and qualified candidates should click ‘Apply now’ below.