Data Cloud Engineer

Ycotek
20 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

Tech stack

API
Artificial Intelligence
Airflow
Data analysis
Azure
Cloud Computing
Cloud Computing Security
Cloud Engineering
Databases
Continuous Integration
Data as a Service
Data Governance
ETL
Data Warehousing
Software Debugging
Document-Oriented Databases
Python
Performance Tuning
Cloud Services
SQL Databases
Snowflake
Semi-structured Data
Real Time Data
Data Management
Terraform
Data Pipelines

Job description

We are looking for a skilled Senior Data Cloud Engineer to design, build, and operate scalable data pipelines and cloud-native data platforms. You will work closely with analytics, product, and engineering teams to enable reliable reporting, analytics, and AI/ML use cases. This role is hands-on and focused on building and owning production-grade data pipelines in Azure, with an emphasis on data quality, performance, and maintainability.

What You'll Do

Data Pipelines & Processing

Design, develop, and maintain robust batch and near-real-time data pipelines.
Build and optimize ETL/ELT workflows for structured and semi-structured data.
Ingest data from multiple sources (databases, APIs, files, and cloud services).
Ensure pipelines are reliable, scalable, and easy to operate in production.

Data Modeling & Warehousing

Design and maintain data models for analytics and reporting (fact/dimension, star/snowflake schemas).
Build and optimize data warehouse and analytical datasets.
Partner with analytics and BI teams to deliver clean, well-documented datasets.

Cloud & Platform Engineering

Implement and manage workflows using Azure Data Factory (ADF) and Apache Airflow.
Deploy and manage data workloads using Azure Container Apps and Azure Function Apps.
Monitor pipelines, troubleshoot failures, and continuously improve performance and cost efficiency.
Apply best practices for cloud security, scalability, and cost management.

Collaboration & Quality

Work closely with analytics, product, and engineering teams to support business and AI/ML use cases.
Implement basic data quality checks, validation, and monitoring.
Document data pipelines, data models, and operational processes.

Requirements

Strong understanding of data warehousing concepts (fact/dimension modeling, star and snowflake schemas).
Hands-on experience building and operating ETL/ELT pipelines.
Proficiency in Python and SQL (writing production-quality code and queries).
Experience working with large-scale datasets.
Solid experience with Azure data services and orchestration tools, including:
Azure Data Factory (ADF)
Apache Airflow
Azure Container Apps
Azure Function Apps
Good understanding of cloud security, performance optimization, and cost management.
Strong problem-solving, debugging, and operational mindset.

Nice to Have

Experience with CI/CD for data pipelines.
Familiarity with data governance, data quality frameworks, or monitoring tools.
Experience supporting analytics, BI, or AI/ML workloads.
Exposure to infrastructure-as-code (Terraform, ARM, or similar).

Apply for this position