Important Information
Location: Latin America
Work Mode: Hybrid
Job Summary
As a Senior Data Engineer (12736), you will be responsible for designing, developing, and maintaining high-quality data solutions. You will collaborate with cross-functional teams to understand business requirements and translate them into scalable and efficient data applications. Your role will involve leading technical projects, mentoring junior engineers, and continuously improving data engineering practices to ensure the delivery of robust and reliable data systems.
Responsibilities and Duties
- Data Pipeline Development: Design, build, and optimize scalable data pipelines for ingesting, processing, and transforming large datasets.
- Data Architecture Design: Develop and maintain robust data architectures, ensuring efficient storage and retrieval in systems like Hadoop, Spark, or cloud-based solutions.
- ETL Process Management: Implement and manage ETL workflows to ensure data accuracy, consistency, and reliability across systems.
- Performance Optimization: Monitor and enhance the performance of data systems, focusing on scalability and efficiency.
- Collaboration: Work closely with data analysts, scientists, and stakeholders to understand data needs and deliver actionable insights.
Qualifications and Skills
- Bachelor’s degree in computer science, software engineering, or a related field.
- Extensive experience in software development with a focus on designing and building scalable applications.
- Professional/advanced English skills.
- 5+ years of experience.
- Working with business stakeholders and our delivery team to understand high-value business problems that can be solved through data processing and analytical systems.
- Developing, expanding, and evolving our existing databases and ETL pipelines.
- Working to design, build and support a transformational Databricks cloud data platform for the business.
- Being a core and professional member of the new data engineering practice.
- Understanding business requirements and helping to refine development tasks and estimate their complexity.
- Researching, evaluating, and adopting new technologies with a right-tool-for-the-job mentality.
- Focusing on both speed of delivery and quality, with suitable pragmatism, ensuring your solutions are always appropriate and never overly complex or over-engineered.
- Progressing projects quickly from proof of concept to the post-production stage.
- Communicating and presenting ideas to colleagues across the wider data team.
- Participating in code reviews for the data engineering practice.
- Hands-on experience with modern cloud data lakehouse technologies, e.g., Databricks (preferred) or Snowflake. You must understand how cloud-native solutions are built differently from traditional ones.
- Excellent knowledge of and proven experience with at least one of our core data engineering languages: SQL or Python. The ability to rapidly adopt the other if you are experienced with just one.
- Experience with a modern DevOps approach. You should have written not just data code but also the CI/CD pipelines necessary to test and deploy that code in a professional environment.
- A robust understanding of core data engineering topics: ETL vs. ELT, structured and unstructured data, data quality, and data governance.
- Ability to contribute to all aspects of a solution – design, infrastructure, development, testing and maintenance.
- The ability to design and advocate for technical solutions to business problems.
- Experience collaborating with technical and non-technical team members in Agile Scrum ceremonies: roadmap planning, feature workshops, backlog elaboration, and code review.
- Track record of taking initiative and delivering projects end-to-end; clear evidence of being self-driven and motivated.
- Immense curiosity, high energy, and desire to go the extra mile to make a difference.
- Beyond the required skills, we are open to individuals with diverse talents. Experience with additional technologies, data science knowledge, and an insurance background are all valued. We want to know how your unique abilities can contribute to our team.
Additional Requirements
- Technologies: Spark, Hadoop, Kafka, Airflow, Snowflake.
- Competencies: Autonomy, agile work environments, effective communication, and leadership.
About Encora
Encora is a global company that offers Software and Digital Engineering solutions. Our practices include Cloud Services, Product Engineering & Application Modernization, Data & Analytics, Digital Experience & Design Services, DevSecOps, Cybersecurity, Quality Engineering, AI & LLM Engineering, among others.
At Encora, we hire professionals based solely on their skills and do not discriminate based on age, disability, religion, gender, sexual orientation, socioeconomic status, or nationality.
What We Do
Headquartered in Santa Clara, California, and backed by renowned private equity firms Advent International and Warburg Pincus, Encora is the preferred technology modernization and innovation partner to some of the world’s leading enterprise companies. It provides award-winning digital engineering services including Product Engineering & Development, Cloud Services, Quality Engineering, DevSecOps, Data & Analytics, Digital Experience, Cybersecurity, and AI & LLM Engineering. Encora's deep cluster vertical capabilities extend across diverse industries, including HiTech, Healthcare & Life Sciences, Retail & CPG, Energy & Utilities, Banking Financial Services & Insurance, Travel, Hospitality & Logistics, Telecom & Media, Automotive, and other specialized industries.
With over 9,000 associates in 47+ offices and delivery centers across the U.S., Canada, Latin America, Europe, India, and Southeast Asia, Encora delivers nearshore agility to clients anywhere in the world, coupled with expertise at scale in India. Encora's Cloud-first, Data-first, AI-first approach enables clients to create differentiated enterprise value through technology.