Data Engineer
…across 20+ countries, our scale enables us to support our clients globally and locally, providing a seamless client experience across borders and service lines.

Job Description
We are seeking a skilled and detail-oriented Data Engineer to design, build, and maintain scalable data pipelines and infrastructure that support analytics, reporting, and data-driven decision-making. The successful candidate will work closely with Power BI analysts, data scientists, and business stakeholders to ensure reliable, secure, and high-quality data solutions. This role requires strong technical expertise in data architecture, ETL processes, and cloud platforms, along with a collaborative mindset and a passion for improving data systems.

Main Responsibilities
Data Pipeline Development
- Design, develop, and maintain robust ETL/ELT pipelines.
- Build scalable data ingestion processes from multiple structured and unstructured sources.
- Ensure high data quality, integrity, and availability.
Data Architecture & Infrastructure
- Develop and maintain data warehouses and data lakes.
- Optimize database performance and storage solutions.
- Implement best practices in data modelling (star/snowflake schemas).
Cloud & Platform Management
- Deploy and manage data solutions on Microsoft Fabric.
- Work with distributed data platforms, specifically Microsoft SQL Server.
- Monitor system performance and implement improvements.
Collaboration & Support
- Partner with data analysts and data scientists to support analytics initiatives.
- Translate business requirements into technical solutions.
- Provide technical documentation and knowledge transfer.
Governance & Security
- Implement data governance standards and controls.
- Ensure compliance with regulatory and security requirements.
- Manage access controls and data protection measures.
Requirements
- Strong proficiency in SQL and Python.
- Experience building ETL/ELT pipelines (e.g., dbt Cloud, Azure Data Factory, Fabric pipelines).
- Experience with relational and non-relational databases.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP).
- Familiarity with data warehousing concepts and dimensional modelling.
- Power BI (desirable, but not essential).
Tools & Technologies (Examples)
- SQL
- Python
- dbt Cloud
- Microsoft Fabric
- Fivetran
- Alteryx
Soft Skills
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management.
- Ability to work independently and in cross-functional teams.
- High attention to detail and commitment to data quality.
Qualifications
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field (or equivalent experience).
- 3+ years of experience in data engineering or a related role.
- Relevant cloud certifications are advantageous.