Responsibilities
- Build and maintain scalable data pipelines for batch and real-time data processing.
- Work with cloud-native data platforms to ensure reliable, cost-effective data processing solutions.
- Build and maintain REST APIs to serve processed and aggregated data to downstream applications and teams.
- Collaborate with cross-functional teams to define data architecture and infrastructure requirements.
- Monitor and improve pipeline performance, scalability, and resilience.
Expected Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related Information Technology field
- 2+ years of experience as a Data Engineer or in a similar data-focused role
- Strong software engineering skills
- Experience with the Scala or Java programming language
- Experience with the Go programming language is a plus
- Solid experience with at least one big data processing framework such as Apache Spark or Apache Flink
- Experience with cloud-native data infrastructure (GCP preferred) for building reliable, cost-effective solutions
- Familiarity with real-time data processing and streaming architectures for low-latency analytics
- Familiarity with modern data architectures including data lakes, data warehouses (BigQuery, Snowflake), and lakehouse table formats (Apache Iceberg, Delta Lake)
- Experience with workflow orchestration tools such as Apache Airflow, Prefect, or Temporal
- Strong proficiency in SQL for data manipulation, querying, and optimization
- Experience with RESTful APIs
- Experience with both SQL and NoSQL databases
- Experience with testing frameworks, including unit testing and data quality validation for data pipelines
- Experience with containerization technologies such as Docker and Kubernetes
- Knowledge of CI/CD pipelines for data engineering and infrastructure-as-code practices
- Strong problem-solving skills and the ability to work independently and collaboratively
- Strong English communication skills, written and verbal
Technology Stack
- Languages: Scala/Java (primary), Go (preferred), SQL (advanced), Python (nice to have)
- Big Data Processing: Apache Spark (primary), Apache Flink
- Streaming & Messaging: Apache Kafka
- Workflow Orchestration: Apache Airflow
- DevOps & Infrastructure: Docker, Kubernetes, GitLab CI, Terraform
- Cloud Platform: Google Cloud Platform (BigQuery, GCS, GKE, Dataproc, Cloud Functions, Composer, Monitoring, Logging)
- Version Control: Git
What We Do
We were founded in 2010 with a dynamic and agile start-up spirit. Since then, we have grown into a decacorn, backed by Alibaba, General Atlantic, SoftBank, Princeville Capital, and several sovereign wealth funds. We believe that technology is the driver; e-commerce is the outcome. Thanks to our dedicated team, we are one of the top five e-commerce companies in EMEA and one of the fastest-growing e-commerce companies in the world! We deliver more than 1.5 million packages every day across 27 countries and offer our 30 million customers a flawless shopping experience. Dreaming big is in our DNA: we're gearing up to be the leading global e-commerce platform. As a dynamic and passionate company, we are constantly growing with Trendyol Tech, one of the top R&D centres; Trendyol Express, the fastest-growing delivery network; Dolap, the largest second-hand goods platform; and Trendyol Go, our instant food and grocery delivery service. And we're not done yet! Now, we are on a journey to expand the positive impact we create into international markets. We opened our first international office in Berlin in May 2022, Amsterdam followed in October 2022, and many others are on the way.