- Collaborate with functional leaders, business stakeholders, IT, and project teams to create and manage data pipelines and analytics solutions using SQL, Python, and AWS services.
- Document detailed data pipeline requirements as functional specifications for all data pipelines, to be used during the progressive build cycles of the implementation.
- Work closely with IT team leads, senior business leaders, and users to determine data pipeline requirements.
- Effectively translate complex business requirements into technical requirements and assist in creating Functional Design Documents.
- Manage technical delivery resources supporting data pipeline development, testing, and deployment activities in an onshore/offshore model.
- Work with the Security and Compliance team to define and build a roles-and-privileges structure that controls data pipeline access.
- Design, develop, and maintain ongoing KPIs, metrics, data pipelines, analyses, and dashboards to drive key business decisions.
- Monitor, respond to, and resolve tickets and issues submitted by users; perform root cause analysis (RCA) on critical tickets and incidents.
- Bachelor's degree in Information Technology, Business Analytics, or a related field.
- 2+ years of experience implementing data engineering pipelines using Python and Spark (see the brief illustrative sketch after this list).
- Strong hands-on experience with AWS data engineering tools such as Glue, Redshift, Athena, and Lambda.
- Proficient in writing advanced SQL and Python scripts for data transformation, extraction, and processing.
- Skilled in performance tuning, data quality validation, and pipeline troubleshooting across distributed systems.
- Must be self-motivated and able to work independently in a fast-paced, agile team environment.
- Excellent verbal and written communication skills, strong attention to detail, and the ability to manage multiple priorities and meet deadlines.
- Solid understanding of data modeling, data flow, and architecture best practices in AWS.
- Experience working with financial or supply chain datasets and transforming data for analytics and reporting.
- Familiarity with AWS-based data governance, lineage tracking, and access control mechanisms.
- Strong coding fundamentals and experience developing modular, reusable, and scalable pipelines.
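For context only, not an additional requirement: below is a minimal sketch of the kind of Python/Spark transformation step this role involves. The S3 paths, column names, and job name are hypothetical and stand in for whatever source and target datasets a real pipeline would use.

```python
# Illustrative only: a minimal PySpark extract-transform-load step.
# Paths, columns, and the app name are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_shipments_rollup").getOrCreate()

# Extract: read raw shipment records from a (hypothetical) S3 location.
shipments = spark.read.parquet("s3://example-bucket/raw/shipments/")

# Transform: apply a basic data quality filter and build a daily aggregate per warehouse.
daily_totals = (
    shipments
    .filter(F.col("quantity") > 0)
    .groupBy("warehouse_id", F.to_date("shipped_at").alias("ship_date"))
    .agg(F.sum("quantity").alias("total_quantity"))
)

# Load: write results partitioned by date for downstream analytics and reporting.
daily_totals.write.mode("overwrite").partitionBy("ship_date").parquet(
    "s3://example-bucket/curated/daily_shipments/"
)
```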
What We Do
Driscoll’s is the global market leader for fresh strawberries, blueberries, raspberries and blackberries. With more than 100 years of farming heritage and hundreds of independent growers around the world, Driscoll’s is passionate about growing great tasting berries.
Driscoll’s exclusive patented berry varieties are developed through years of research using only natural breeding methods – that means no GMOs. A dedicated team of agronomists, breeders, sensory analysts, plant pathologists and entomologists help grow baby seedlings that are then grown on family farms. Our independent berry growers then work with Mother Nature, using experience and know-how to get the very best out of each strawberry, blueberry, blackberry and raspberry plant.
Driscoll’s is serious about yummy berries! Driscoll’s is the trusted brand for Only the Finest Berries™.