Data Engineer
Job description
You'd be joining the IB Technology Data Science team to design and build the engine behind our Trade Intelligence Analytics (TIA) platform, which serves our Electronic Trading Desk's clients. TIA provides cutting-edge insight into trade performance, delivering end-to-end Trade Cost Analysis to alpha-seeking clients through dashboards, data feeds and downstream machine learning products.
You'll work alongside Data Engineers, Analytics Engineers and Data Scientists who own the full stack - from Kubernetes infrastructure through to analytics and data science workloads. You will sit within the Data Science team, who have built the project to date, and work closely with the Low Touch (Berenberg Electronic Algorithmic Trading) desk to deliver the product to clients.
What will you do?
- Develop, scale and maintain data pipelines using Python/Rust, SQL and DBT
- Translate business requirements into optimised trade performance benchmark calculations
- Design and implement scalable data models within the Data Lakehouse / Data Platform; develop APIs for client interaction with our product
- Normalise data feeds to support client integration and adoption of the TIA platform
- Drive engineering best practices including test-driven development, version control and CI/CD pipelines
- Contribute to analytics frontend development and UX
Requirements
- Expert-level Python, SQL and ideally Rust, with a track record of building efficient data pipelines
- Experience with, or knowledge of, Lakehouse architectures
- Comfortable working with large-scale, high-volume datasets as source inputs for analytical pipelines
- Experience managing data pipelines using data orchestration tools (Airflow, Dagster or other)
- Expertise in DBT for data modelling and transformation workflows
- Hands-on experience with engineering best practices: Docker/Podman, Git, test-driven development, CI/CD pipelines
- Knowledge of data visualisation tools such as Tableau or Power BI
- Strong data modelling skills and familiarity with designing data architectures that support both analytics and machine learning use cases
- Experience with cloud data platforms such as Snowflake is preferred
Benefits & conditions
- Private pension plan - 10% of base salary contribution by Berenberg
- Generous 30-day holiday allowance
- Private Health Insurance
- Life Insurance scheme
- Flexible working hours
- Enhanced parental leave policies
- Employee Assistance Programme offering counselling sessions related to mental health, financial well-being and other topics