Key Responsibilities:
- Design, develop, and maintain highly reliable data pipelines to ingest, transform, and load data into Snowflake with an uncompromising focus on accuracy and quality.
- Leverage AWS services (e.g., RDS, DMS, S3, Data Firehose, Kinesis, Lambda, DynamoDB) to build scalable and fault-tolerant data workflows.
- Define and manage data infrastructure as code using AWS CDK (see the sketch after this list).
- Utilize open-source distributed data processing frameworks (e.g., Hadoop, Spark) to handle large-scale data transformations and batch processing.
- Manage schema evolution and database migrations with tools like SchemaChange or Flyway.
- Write and maintain production code in Java, Go, or Python.
- Monitor, troubleshoot, and optimize pipelines to ensure maximum uptime, performance, and data integrity, critical for banking operations.
- Collaborate with data engineers, analysts, and other cross-functional teams.
- Document pipeline designs, processes, and quality assurance measures to maintain transparency and auditability.
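To make the infrastructure-as-code responsibility concrete, below is a minimal sketch of what a CDK stack for one ingestion workflow might look like, assuming AWS CDK v2 with the Python bindings. The names used here (IngestStack, LandingBucket, TransformFn, the transform.handler entry point, and the lambda/ asset directory) are illustrative assumptions, not part of this role's actual codebase.

    # Illustrative sketch only: an S3 landing bucket plus a Lambda transform,
    # wired together with AWS CDK v2 (Python). All resource names are hypothetical.
    from aws_cdk import (
        App, Stack, Duration, RemovalPolicy,
        aws_s3 as s3,
        aws_s3_notifications as s3n,
        aws_lambda as _lambda,
    )
    from constructs import Construct


    class IngestStack(Stack):
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)

            # Landing bucket for raw files before they are loaded downstream (e.g., into Snowflake).
            landing_bucket = s3.Bucket(
                self, "LandingBucket",
                versioned=True,
                removal_policy=RemovalPolicy.RETAIN,
            )

            # Lambda that validates/transforms each object as it arrives.
            transform_fn = _lambda.Function(
                self, "TransformFn",
                runtime=_lambda.Runtime.PYTHON_3_12,
                handler="transform.handler",          # hypothetical handler in ./lambda
                code=_lambda.Code.from_asset("lambda"),
                timeout=Duration.minutes(5),
            )
            landing_bucket.grant_read(transform_fn)

            # Trigger the transform on every new object in the bucket.
            landing_bucket.add_event_notification(
                s3.EventType.OBJECT_CREATED,
                s3n.LambdaDestination(transform_fn),
            )


    app = App()
    IngestStack(app, "IngestStack")
    app.synth()

Defining the bucket, function, permissions, and event wiring in one stack keeps the pipeline's infrastructure reviewable and reproducible, which supports the auditability expectations noted above.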
Required Skills and Qualifications:
- 5-7 years of experience in data engineering, with a proven track record of building and maintaining high-quality data pipelines.
- Expertise with AWS services such as RDS (Aurora), DMS, S3, Data Firehose, Kinesis, Lambda, and DynamoDB, or equivalent services.
- Hands-on experience with AWS CDK (or an equivalent tool) for infrastructure as code.
- Experience with distributed data processing frameworks (e.g., MapReduce, Hadoop, Spark).
- Proficiency in at least one of Java, Go, or Python.
- Knowledge of data modeling (e.g., star schemas, dimensional modeling, data mesh).
- Attention to detail and a commitment to maintaining the highest standards of data accuracy and pipeline quality.
- Ability to troubleshoot complex issues and implement robust, reliable solutions.
- Strong communication skills to work with cross-functional teams.
Preferred Qualifications:
- Experience in the banking or financial services industry, with an understanding of compliance and data accuracy requirements.
- Experience with the Snowflake data platform.
- Exposure to CI/CD pipelines for deploying data infrastructure in a controlled, auditable manner.
- Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
What We Offer:
- At Lead, we design our benefits to support company culture and principles, to foster an efficient and inspiring work environment, and to create the conditions for our team to give their best in both work and life
- Competitive compensation, including opportunities for equity grants, based on experience, geographic location, and role
- Medical, Dental, Vision, Life, 401k Matching, and other wellness benefits, including FSA, HSA and HRA
- Paid parental leave
- Flexible vacation policy, including PTO and paid holidays
- A fun and challenging team environment in a dynamic industry with ample opportunities for career growth
What We Do
Lead Bank is where expertise, experience and technology unite the people behind inspired businesses. We take pride in working side by side with companies to grow bottom lines that become cornerstones of the community.
We’ve always been a bank that leads the way, not follows the herd. In 2010, we rolled out our new name along with a new suite of next-generation banking resources. From robust online banking capabilities to our remote deposit technology, these digital solutions let you bank however you want, wherever you want. While our bank has roots in Cass County, we offer full-service banking and outstanding customer service for the entire Kansas City metropolitan area.
Our relationships with our clients remain at the heart of what we do. We get to know you and your business so we can tailor financial solutions to help you achieve your goals.