Data Engineer
Role details
Job location
Tech stack
Job description
Code Conversion: Lead the end-to-end migration of SAS code (Base SAS, Macros, DI Studio) to PySpark using automated tools (SAS2PY) and manual refactoring.
Pipeline Engineering: Design, build, and troubleshoot complex ETL/ELT workflows and data marts on AWS.
Performance Tuning: Optimise Spark workloads for execution efficiency, partitioning, and cost-effectiveness.
Quality Assurance: Implement clean coding principles, modular design, and robust unit/comparative testing to ensure data accuracy throughout the migration.
Engineering Excellence: Maintain Git-based workflows, CI/CD integration, and comprehensive technical documentation.
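As an illustration of the comparative testing mentioned above, a minimal sketch in plain Python (no PySpark dependency; the `reconcile` helper and its column names are hypothetical) shows the idea: reconcile legacy SAS output against migrated PySpark output row by row on a business key.

```python
def reconcile(legacy_rows, migrated_rows, key):
    """Compare two datasets row by row on a shared business key.

    Returns keys missing on either side and keys whose
    non-key columns differ between the two datasets.
    """
    legacy = {row[key]: row for row in legacy_rows}
    migrated = {row[key]: row for row in migrated_rows}

    return {
        # rows present in the legacy output but absent after migration
        "missing_in_migrated": sorted(legacy.keys() - migrated.keys()),
        # rows the migrated pipeline produced that legacy did not
        "missing_in_legacy": sorted(migrated.keys() - legacy.keys()),
        # shared keys whose column values no longer match
        "value_diffs": {
            k: (legacy[k], migrated[k])
            for k in legacy.keys() & migrated.keys()
            if legacy[k] != migrated[k]
        },
    }
```

In practice the same comparison would run over Spark DataFrames (e.g. an anti-join per side plus a full join on the key), but the reconciliation contract is the same.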
Requirements
PySpark (P3): 5+ years of hands-on experience writing scalable, production-grade PySpark/Spark SQL.
AWS Data Stack (P3): Strong proficiency in EMR, Glue, S3, Athena, and Glue Workflows.
SAS Knowledge (P1): Solid foundation in SAS to enable the understanding and debugging of legacy logic for conversion.
Data Modeling: Expertise in ETL/ELT, dimensions, facts, SCDs, and data mart architecture.
Engineering Quality: Experience with parameterisation, exception handling, and modular Python design.
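To illustrate the SCD expertise listed above, here is a minimal sketch of a Type 2 update in plain Python (all names — `apply_scd2`, `effective_from`, `effective_to` — are hypothetical, chosen only for the example): a changed record closes the current dimension row and opens a new version.

```python
from datetime import date

def apply_scd2(dimension, incoming, key, as_of):
    """Apply a Slowly Changing Dimension Type 2 update.

    `dimension` rows carry `effective_from`/`effective_to` columns,
    with `effective_to` set to None for the current version. For each
    incoming record whose tracked attributes changed, the current row
    is closed as of `as_of` and a new current row is opened.
    """
    tracked = [c for c in (incoming[0].keys() if incoming else []) if c != key]
    current = {r[key]: r for r in dimension if r["effective_to"] is None}

    for rec in incoming:
        cur = current.get(rec[key])
        if cur is not None and all(cur[c] == rec[c] for c in tracked):
            continue  # no attribute change, keep the current version
        if cur is not None:
            cur["effective_to"] = as_of  # close the old version
        # open a new current version effective from `as_of`
        dimension.append(dict(rec, effective_from=as_of, effective_to=None))
    return dimension
```

On the AWS stack named in this role, the equivalent logic would typically be expressed as a Spark join-and-union or a Delta/Iceberg `MERGE`, but the versioning semantics are the same.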
Additional Details
Industry: Financial Services experience is highly desirable.
Benefits & conditions
Working Pattern: Fully remote with internal team collaboration days.
Benefits: 33 days holiday entitlement (pro-rata).
Randstad Technologies is acting as an Employment Business in relation to this vacancy.