Key Responsibilities:
1. Design & Develop Scalable Data Pipelines: Architect and build efficient data ingestion pipelines that empower the business with timely and actionable insights to drive strategy and operations.
2. Collaborate with Cross-Functional Teams: Work closely with technical and business teams to gather requirements and translate them into technical specifications and solutions.
3. Optimize Data Architecture: Continuously refine and improve the performance, reliability, and scalability of data pipelines to ensure seamless integration of data across systems.
4. Ensure Data Integrity: Perform rigorous data quality checks and uphold best practices to maintain the accuracy, consistency, and integrity of data.
5. Build ETL Solutions: Develop ETL frameworks to collect, transform, and integrate data from diverse sources such as Kafka, Scylla, PostgreSQL, MongoDB, APIs, and various file formats.
6. Adopt Best Practices: Implement best-in-class engineering practices around reporting and analysis, ensuring data integrity, testing, validation, and comprehensive documentation.
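To make responsibility 5 concrete, here is a minimal sketch of an extract-transform-load flow in Python. All source names, fields, and transforms are hypothetical illustrations, not part of the role; SQLite stands in for a relational sink such as PostgreSQL, and a plain list stands in for a Kafka or API source.

```python
import sqlite3

def extract(raw_events):
    """Simulate extraction from a source such as an API payload or Kafka topic."""
    return list(raw_events)

def transform(events):
    """Normalize field names/types and drop records failing a basic quality check."""
    cleaned = []
    for e in events:
        if e.get("user_id") is None:  # data-integrity check: require a user_id
            continue
        cleaned.append({"user_id": int(e["user_id"]),
                        "amount": round(float(e.get("amount", 0)), 2)})
    return cleaned

def load(events, conn):
    """Load cleaned records into a relational sink (SQLite as a stand-in)."""
    conn.execute("CREATE TABLE IF NOT EXISTS events (user_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO events VALUES (:user_id, :amount)", events)
    conn.commit()

# Hypothetical payload; a real pipeline would consume Kafka, Scylla, files, etc.
raw = [{"user_id": "1", "amount": "9.99"}, {"user_id": None, "amount": "3"}]
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
rows = conn.execute("SELECT user_id, amount FROM events").fetchall()
```

In practice each stage would be a separate, tested module so that new sources and quality rules can be added without touching the rest of the pipeline.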
Basic Qualifications:
1. Bachelor’s degree in Computer Science, Information Systems, or a related technical field, or equivalent work experience.
2. 3+ years of hands-on experience with cloud-based data technologies, including message queues, event grids, relational databases, NoSQL databases, data warehouses, and big data technologies (e.g., Spark).
3. Proficiency in Spark (Java, Python, SQL): Expertise in developing and optimizing Spark-based applications for large-scale data processing.
4. Advanced SQL Skills: Ability to create complex queries, manage database structures, and ensure optimal performance.
5. Experience with Docker and Kubernetes: Familiarity with deploying applications using modern containerization and orchestration tools.
6. DevOps and CI/CD: Solid understanding of modern DevOps practices, including Git, continuous integration, and continuous deployment pipelines.
7. Agile Development Experience: Comfortable working in an Agile environment, particularly using Scrum methodology.
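As one illustration of the "complex queries" mentioned in qualification 4, the sketch below ranks rows within groups using a window function. The table and column names are hypothetical; SQLite is used here only because it ships with Python, but the SQL is standard.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("a", 10.0), ("a", 20.0), ("b", 5.0)])

# Window function: rank each order within its customer's history by amount.
query = """
SELECT customer, amount,
       RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
FROM orders
ORDER BY customer, rnk
"""
rows = conn.execute(query).fetchall()
```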
Preferred Qualifications:
1. API Frameworks & OOP: Experience with API frameworks and object-oriented programming to integrate services and improve data workflows.
2. Business Acumen: Ability to collaborate closely with leadership to deliver innovative solutions tailored to evolving business needs.
3. Strong Communication Skills: Excellent verbal and written communication abilities, with a knack for explaining complex technical concepts to both technical and non-technical stakeholders.
4. LLM Application Development: Ability to leverage frameworks such as LangChain and LlamaIndex to build LLM-based applications.
What We Do
ZZAZZ transforms digital content into a real-time tradable asset through its revolutionary Large Pricing Model (LPM)—an advanced AI-driven system that dynamically assigns accurate market values based on billions of engagement signals, real-time user interactions, and live market data. With ZZAZZ, creators and publishers gain immediate clarity on their content’s true economic worth, enabling precise monetization strategies and maximizing revenue opportunities effortlessly. Forget outdated static pricing models—ZZAZZ ensures every piece of content captures its fair value, instantly responding to real-world demand shifts, making your content strategy smarter, more profitable, and future-proof.