ABOUT HAKKODA
Hakkoda is a modern data consultancy that empowers data-driven organizations to realize the full value of the Snowflake Data Cloud. We provide consulting and managed services in data architecture, data engineering, analytics, and data science. We are renowned for bringing our clients deep expertise, being easy to work with, and being an amazing place to work! We are looking for curious and creative individuals who want to be part of a fast-paced, dynamic environment, where everyone's input and efforts are valued. We hire outstanding individuals and give them the opportunity to thrive in a collaborative atmosphere that values learning, growth, and hard work. Our team is distributed across North America, Latin America, and Europe. If you have the desire to be a part of an exciting, challenging, and rapidly growing Snowflake consulting services company, and if you are passionate about making a difference in this world, we would love to talk to you!
Role Description
As an AWS Managed Services Architect, you will play a pivotal role in architecting and optimizing the infrastructure and operations of a complex Data Lake environment for BOT clients. You’ll leverage your strong expertise with AWS services to design, implement, and maintain scalable and secure data solutions while driving best practices.
You will work collaboratively with delivery teams across the U.S., Costa Rica, Portugal, and other regions, ensuring a robust and seamless Data Lake architecture. In addition, you’ll proactively engage with clients to support their evolving needs, oversee critical AWS infrastructure, and guide teams toward innovative and efficient solutions.
This role demands a hands-on approach, including designing solutions, troubleshooting, optimizing performance, and maintaining operational excellence.
Qualifications
- 7+ years of hands-on experience in cloud architecture and infrastructure (preferably AWS).
- 3+ years of experience specifically architecting and managing Data Lake or big data solutions on AWS.

Technical Skills:
- Expertise in AWS services such as EMR, Batch, SageMaker, Glue, Lambda, IAM, IoT TimeStream, DynamoDB, and more.
- Strong programming skills in Python for scripting and automation.
- Proficiency in SQL and performance tuning for data pipelines and queries.
- Experience with IaC tools like Terraform or CloudFormation.
- Knowledge of big data frameworks such as Apache Spark, Hadoop, or similar.

Data Governance & Security:
- Proven ability to design and implement secure solutions, with strong knowledge of IAM policies and compliance standards.

Problem-Solving:
- Analytical and problem-solving mindset to resolve complex technical challenges.

Collaboration:
- Exceptional communication skills to engage with technical and non-technical stakeholders.
- Ability to lead cross-functional teams and provide mentorship.

Education:
- Bachelor's Degree (BA/BS) in Computer Science, Information Systems, or a related field.
- AWS Certifications such as Solutions Architect Professional or Big Data Specialty.

Nice to Have
- Experience with Snowflake, Matillion, or Fivetran in hybrid cloud environments.
- Familiarity with Azure or GCP cloud platforms.
- Understanding of machine learning pipelines and workflows.
Responsibilities
- AWS Data Lake Architecture: Design, build, and support scalable, high-performance architectures for complex AWS Data Lake solutions.
- Service Expertise: Deploy and manage solutions using AWS services, including but not limited to:
  - EMR (Elastic MapReduce): Optimize and maintain EMR clusters for big data processing.
  - AWS Batch: Design workflows to execute batch processing workloads effectively.
  - SageMaker: Support data science teams with scalable model training and deployment.
  - Glue: Implement Glue jobs for ETL/ELT processes to ensure efficient data ingestion and transformation.
  - Lambda: Develop serverless solutions to automate processes and manage events.
  - IAM Policies: Define and enforce security policies to control resource access and maintain governance.
  - IoT TimeStream Database: Design solutions to handle time-series data at scale.
  - DynamoDB: Build and optimize scalable NoSQL database solutions.
- Data Governance & Security: Enforce compliance, governance, and security best practices, ensuring data protection and privacy throughout the architecture.
- Performance Optimization: Monitor and fine-tune performance across AWS resources to ensure cost-effective and efficient operations.
- Automation: Develop Infrastructure as Code (IaC) solutions using tools like AWS CloudFormation, Terraform, or similar.
- Client Collaboration: Work closely with clients to understand their business goals and ensure the architecture aligns with their needs.
- Team Leadership: Act as a technical mentor for delivery teams and provide support in troubleshooting, design reviews, and strategy discussions.
- Innovation: Stay updated on AWS advancements, best practices, and emerging tools to incorporate into solutions.
- Documentation: Develop and maintain architecture diagrams, SOPs, and knowledge-sharing materials for internal and client-facing purposes.
Hakkoda is an exciting, high-growth company, and we're scaling our team. We are looking for exceptional people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Hakkoda. We are a collaborative team of high achievers. We love to explore, challenge, and have a lot of fun along the way.
What We Do
We've seen the power of data. We know that the speed and accessibility of data determine the agility and innovation of a business.
Hakkoda was founded to help Snowflake customers realize the true value of their data. Whether you need help future-proofing your data structure, migrating your data to the cloud as efficiently and securely as possible, or using your data to build and monetize data-driven services, we are a partner you can trust.