
Data Engineer

Lagos, Lagos

Job Description: Data Engineer (SQL, Python, PL/SQL)


Job Summary:

We are seeking a talented and experienced Data Engineer to join our team. The Data Engineer will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support our data analytics and business intelligence needs. The ideal candidate will have a strong background in data engineering, a deep understanding of data architecture, and experience with big data technologies.


Key Responsibilities:

Data Pipeline Development: Design, develop, and maintain scalable data pipelines to process and analyze large datasets from various sources.


Data Integration: Integrate data from multiple data sources and ensure data quality, consistency, and reliability.


Data Warehousing: Build and maintain data warehouses and data lakes to support analytics and reporting needs.


Data Transformation: Implement data transformation and cleaning processes to ensure data is ready for analysis.


Performance Optimization: Optimize data processing performance and troubleshoot any issues related to data pipelines and infrastructure.


Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand their data needs and provide the necessary data infrastructure.


Documentation: Document data engineering processes, data flows, and infrastructure designs.

Security and Compliance: Ensure data security, privacy, and compliance with relevant regulations and standards.


Continuous Improvement: Stay up to date with the latest industry trends and technologies, and continuously improve data engineering practices and processes.



Qualifications:

Education: Bachelor’s degree in Computer Science, Information Technology, or a related field. Master’s degree is a plus.


Experience: 3-5 years of experience in data engineering or a similar role.

Technical Skills: Proficiency in programming languages such as Python, Java, or Scala. Strong SQL skills.

Big Data Technologies: Experience with big data technologies such as Hadoop, Spark, Kafka, and NoSQL databases.


Data Warehousing: Experience with data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake.


ETL Tools: Familiarity with ETL tools and processes.

Cloud Platforms: Experience with cloud platforms such as AWS, Azure, or Google Cloud.

Analytical Skills: Strong analytical and problem-solving skills.

Communication Skills: Excellent verbal and written communication skills.

Collaboration: Ability to work effectively in a collaborative team environment.

Certification: Relevant certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer) are a plus.

