The Engineer, Data Platform
Engineering, partners with senior engineers to design, develop, and maintain
digital technology infrastructure that supports information technology
applications and services. With limited supervision, this person collaborates
with cross-functional teams to implement new features that meet user needs and
business goals, and to upgrade existing platforms for better performance and
functionality by writing clean, efficient, and scalable code and by
troubleshooting and debugging complex issues. This person also participates in
code reviews to maintain high code quality and share knowledge with platform
engineering team members.
· Software Development: Designs and develops high-quality software solutions by writing
clean, maintainable, and efficient code.
· Automation: Leads adoption of the internal software deployment platform,
continuous integration/continuous delivery (CI/CD) pipelines, and the twelve-factor
development methodology to automate the deployment process, ensuring smooth and
reliable releases.
· Collaboration: Partners with other engineers and architects to gather complex
requirements, break them down, and deliver solutions that meet business needs.
· Testing & Debugging: Writes and maintains complex unit and integration tests and
performs debugging to maintain software quality and performance,
applying test-driven development as needed.
· Documentation: Creates and maintains comprehensive documentation for complex
software applications, deployment processes and system configurations.
· Technical Support: Provides technical support and troubleshooting for complex issues
with deployed applications to ensure minimal downtime and fast resolution.
· Reliability & Professionalism: Delivers predictably without follow-up; estimates accurately; manages dependencies and keeps team
SLAs and commitments on track.
· Design & Solutioning: Designs and builds moderately complex data products that meet
functional and non-functional requirements. Ensures solutions are scalable,
cost-aware, and observable.
· Quick learner, flexible in moving across technology stacks to
meet the varying business needs of the DT&D organization.
· Minimum of 2 years of relevant work experience
out of 4+ total years in the industry;
typically reflects 5 or more years of IT experience.
· Proficient in Python frameworks for data processing, API integration, and working with various data formats
used in big data and streaming systems.
· Strong hands-on experience with Terraform and scripted automation. Experience building
and operating DevOps pipelines in AWS CodePipeline/Jenkins for
application deployment.
· Experience building and operating secure data platforms on AWS, leveraging core services such as
IAM, S3, Lake Formation, Glue, Lambda, RDS, and CloudWatch.
· Good command of SQL and working knowledge of relational database
administration in AWS.
Skills Required
- Minimum of 2 years of relevant work experience
- Proficient in Python frameworks
- Hands-on experience with Terraform
- Experience building and operating DevOps pipelines in AWS CodePipeline/Jenkins
- Build and operate secure data platforms on AWS
- Good command of SQL and relational database administration
- AWS Certified Solutions Architect - Associate level proficiency
What We Do
AlgoLeap specializes in AI-powered software solutions, digital product engineering, and IT consulting services, focusing on digital transformation and AI-driven innovation.