It's fun to work in a company where people truly BELIEVE in what they are doing!
We're committed to bringing passion and customer focus to the business.
Job Responsibilities:
- Be an integral part of large-scale client business development and delivery engagements, grounded in a clear understanding of the business requirements.
- Work hands-on with Dataflow/Apache Beam and real-time data streaming.
- Engineer ingestion and processing pipelines on GCP using Python, Java, BigQuery, and Composer.
- Automate repeatable tasks into a framework that can be reused across other parts of the project.
- Handle data quality, governance, and reconciliation throughout the development phases.
- Communicate with internal and external customers, with a desire to develop communication and client-facing skills.
- Understand and contribute to all agile ceremonies to ensure efficient delivery.
Qualifications & Experience:
- A bachelor's degree in Computer Science or a related field.
- Minimum 5 years of experience in software development.
- Minimum 3 years of technology experience in Data Engineering projects.
- Minimum 3 years of experience in GCP.
- Minimum 3 years of experience in Python programming.
- Minimum 3 years of experience in SQL and PL/SQL scripting.
- Minimum 3 years of experience in Data Warehouse/ETL.
- Ability to build streaming and batch solutions.
- Exposure to project management and version control tools like JIRA, Confluence, and Git.
- Ability to define, create, test, and execute operations procedures.
Must-have skills:
- Strong understanding of real-time streaming concepts.
- Strong problem-solving and analytical skills.
- Good communication skills.
- Understanding of message queues like Kafka, RabbitMQ, and Pub/Sub.
- Understanding of fast data caching systems like Redis/Memorystore.
- GCP experience (3+ years).
- Hands-on experience with Dataflow/Apache Beam, including custom templates (see the sketch after this list).
- Understanding of Composer.
- Good experience with BigQuery and Pub/Sub.
- Good hands-on experience with Python.
- Hands-on experience with modular Java code development involving design patterns such as Factory and Reflection.
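To give a concrete sense of the Dataflow/Apache Beam work described above, here is a minimal sketch of a streaming pipeline that reads JSON events from Pub/Sub and appends them to BigQuery. The project, region, bucket, subscription, table, and schema names are hypothetical placeholders for illustration only, not details of any actual engagement.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        options = PipelineOptions(
            streaming=True,           # required for an unbounded Pub/Sub source
            runner="DataflowRunner",  # swap in "DirectRunner" for local testing
            project="my-project",     # hypothetical GCP project
            region="asia-south1",
            temp_location="gs://my-bucket/tmp",
        )
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                # Read raw event bytes from a (hypothetical) Pub/Sub subscription.
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    subscription="projects/my-project/subscriptions/events-sub")
                # Decode and parse each message into a dict matching the table schema.
                | "ParseJson" >> beam.Map(lambda raw: json.loads(raw.decode("utf-8")))
                # Append rows to a (hypothetical) BigQuery table.
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "my-project:analytics.events",
                    schema="event_id:STRING,payload:STRING,ts:TIMESTAMP",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
            )

    if __name__ == "__main__":
        run()

Packaging a pipeline like this as a Dataflow template is what typically makes it parameterizable and reusable across projects, which is the kind of work the custom-templates bullet refers to.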
Good-to-have skills:
- GCP Professional Data Engineer certification is an added advantage.
- Understanding of Terraform scripting.
- Understanding of DevOps pipelines.
- Identity and access management, and authentication protocols.
- Google Drive APIs, OneDrive APIs.
Location:
- Hyderabad (client location)
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page, or create an account to set up email alerts for new job postings that match your interests!
What We Do
Fractal is one of the most prominent players in the Artificial Intelligence space. Fractal's mission is to power every human decision in the enterprise, bringing AI, engineering, and design to help the world's most admired Fortune 500® companies.
Fractal's products include Qure.ai to assist radiologists in making better diagnostic decisions, Crux Intelligence to help CEOs and senior executives make better tactical and strategic decisions, Theremin.ai to improve investment decisions, Eugenie.ai to find anomalies in high-velocity data, and Samya.ai to drive next-generation Enterprise Revenue Growth Management.
Fractal has more than 3,000 employees across 16 global locations, including the United States, UK, Ukraine, India, Singapore, and Australia. Fractal has consistently been rated one of India's best companies to work for by The Great Place to Work® Institute; featured as a leader in the Customer Analytics Service Providers Wave™ 2021, Computer Vision Consultancies Wave™ 2020, and Specialized Insights Service Providers Wave™ 2020 by Forrester Research; and recognized as an "Honorable Vendor" in the 2021 Magic Quadrant™ for data & analytics by Gartner.