Kafka Developer

Kryptos Technologies Limited
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Junior

Job location

Tech stack

Airflow
Amazon Web Services (AWS)
Apache HTTP Server
Bash
Continuous Integration
DevOps
GitHub
Python
Data Streaming
Parquet
Data Ingestion
Containerization
Data Lake
Debezium
Kafka
Terraform
Docker

Job description

Pipeline Development: Take ownership of the CDC ingestion framework utilizing Kafka connectors (Debezium, Iceberg sink, S3 sink).

Containerized Infrastructure Management: Deploy and manage Debezium and Kafka Connect workers as Docker containers orchestrated on AWS ECS (Elastic Container Service) and ECR.

Data Lake Integration: Manage data ingestion into AWS S3, utilizing Parquet and Apache Iceberg formats.

Infrastructure as Code: Use Terraform to provision and manage AWS resources supporting the data platform.

CI/CD: Build and maintain deployment pipelines using GitHub and GitHub Actions.

Operational Excellence: Monitor pipeline health, troubleshoot connectivity issues, and ensure the reliability of the Kafka ecosystem.

Optional: Support and optimize workflow orchestration using Airflow where applicable.

Tech stack summary

Streaming: Apache Kafka, Kafka Connect, Debezium
Compute/Containerization: AWS ECS, AWS ECR, Docker
Storage/Format: AWS S3, Apache Iceberg, Parquet
DevOps: Terraform, GitHub Actions
Languages: Python, Bash
Optional Orchestration: Apache Airflow
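For context on the CDC framework described above: a Debezium-based pipeline typically begins with a source connector registered against a Kafka Connect cluster. A minimal sketch of such a connector config is shown below; the connector name, database host, credentials, and topic prefix are illustrative placeholders, not details from this posting, and the example assumes a PostgreSQL source.

```json
{
  "name": "postgres-cdc-source",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "db.example.internal",
    "database.port": "5432",
    "database.user": "cdc_user",
    "database.password": "change-me",
    "database.dbname": "appdb",
    "topic.prefix": "appdb",
    "plugin.name": "pgoutput"
  }
}
```

A config like this is usually submitted to the Kafka Connect REST API (POST /connectors); Debezium then streams row-level changes into Kafka topics named after the topic prefix and table, from which sink connectors (such as the Iceberg or S3 sinks mentioned above) write to the data lake.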

Requirements

Apache Kafka & Kafka Connect: Multiple years of hands-on experience configuring, deploying, and managing Kafka Connect clusters in a production environment.

Containerization: Extensive experience with Docker is required. You must be comfortable building images and managing container lifecycles.

AWS Compute: Proven experience running containers on AWS ECS and managing images via AWS ECR.

Apply for this position