Sr. Data Architect

Argentina | Remote

Yalo

Hi! This is Yalo! We are on a mission to bring conversational commerce to the world...

Remember how it used to be to interact with businesses that knew and understood you, that could recommend exactly what you needed, and that with a simple message could get you what you wanted? Yep... neither do we. That is why at Yalo we are marrying the scale of digital commerce with the personalization and simplicity of conversations to help companies delight their users.

We know that traditional SaaS companies focus on first-world problems... we don't! Having started in Latin America, our roots are in emerging markets, and therefore we care about bringing amazing experiences to a population that has traditionally been underserved, such as the small shop owner in Brazil who is ordering online for the first time.


Your mission 🚀

We are seeking a skilled Senior Data Architect who will be responsible for designing and maintaining the organization's data architecture, ensuring it aligns with business goals and requirements.


What are the responsibilities for this role?

  • Design, build, and maintain batch and real-time data pipelines in production.
  • Maintain and optimize the data infrastructure required for accurate extraction, transformation, and loading of data from a wide variety of data sources.
  • Build and maintain Kafka and Snowplow pipelines.
  • Develop ETL (extract, transform, load) processes to help extract and manipulate data from multiple sources. 
  • Help to design and maintain a semantic layer. 
  • Automate data workflows such as data ingestion, aggregation, and ETL processing. 
  • Transform raw data in data warehouses into consumable datasets for both technical and non-technical stakeholders.
  • Establish and enforce data governance policies and procedures to ensure data quality, integrity, and security.
  • Partner with data scientists and data analysts to deploy machine learning and data models in production. 
  • Build, maintain, and deploy data products for analytics and data science teams on the GCP platform.
  • Ensure data accuracy, integrity, privacy, security, and compliance through quality control procedures.
  • Monitor data systems performance and implement optimization strategies.
  • Leverage data controls to maintain data privacy, security, compliance, and quality for allocated areas of ownership. 
  • Collaboration: Work closely with cross-functional teams, product managers, and stakeholders to ensure the delivery of high-quality software.
  • Continuous Learning: Stay updated with the latest trends and technologies in data systems, ensuring that our systems remain state-of-the-art.


Job Requirements (Must have)

  • Bachelor’s/Master’s degree in Computer Science, Information Systems, or a related field.
  • Minimum of 8 years of data engineering experience, ideally in cloud environments, with a good understanding of microservices and APIs.
  • Ability to analyze complex data requirements and design appropriate data solutions.
  • Excellent communication and interpersonal skills to effectively communicate with technical and non-technical stakeholders.
  • Strong problem-solving abilities to identify and resolve data architecture challenges and issues.
  • Demonstrated leadership skills to lead and mentor junior team members and drive data architecture initiatives forward.
  • Adaptability: Ability to adapt to evolving technologies, tools, and business requirements in the data architecture space.
  • Business Acumen: Understanding of business processes, objectives, and strategies to align data architecture efforts with business goals.
  • Working knowledge of Kafka pipelines.
  • Strong experience designing and building ETL models and data workflows (dbt and Great Expectations).
  • Working knowledge of designing and implementing a BI semantic layer.
  • Strong foundation in data structures, algorithms, and software design.
  • Advanced SQL skills and experience with relational databases and database design.
  • Experience with the BigQuery cloud data warehouse and other solutions such as Snowflake and Databricks.
  • Working knowledge of programming languages (e.g., Python).
  • Strong proficiency in data pipeline and workflow management tools (e.g., Airflow). 
  • Strong project management and organizational skills. 
  • Excellent problem-solving, communication, and organizational skills. 
  • Proven ability to work independently and with a team.


Nice to have:

  • Expertise in open table formats such as Hudi, Iceberg, and Delta.
  • Expertise with Snowplow pipelines.
  • Expertise in databases like Druid, Pinot, and Elasticsearch.
  • Collaborative project experience in Data Governance.


What do we offer? 

  • Unlimited PTO policy
  • Competitive rewards within the market range
  • Remote working is available (±3 hours CT)
  • Flexible time (driven by results)
  • Start-up environment
  • International teamwork
  • You, and nothing else, limit your career here


We care,

We keep it simple,

We make it happen,

We strive for excellence. 

At Yalo, we are dedicated to creating a workplace that embodies our core values: caring, initiative, excellence, and simplicity. We believe in the power of diversity and inclusivity, where everyone's unique perspectives, experiences, and talents contribute to our collective success. As we embrace and respect our differences, we strive to create something extraordinary for the benefit of all.
We are proud to be an Equal Opportunity Employer, providing equal opportunities to individuals regardless of race, color, religion, national or ethnic origin, gender, sexual orientation, gender identity or expression, age, disability, protected veteran status, or any other legally protected characteristic. Our commitment to fairness and equality is a fundamental pillar of our company.


At Yalo, we uphold a culture of excellence. We constantly challenge ourselves to go above and beyond, delivering remarkable results and driving innovation. We encourage each team member to take initiative and make things happen, empowering them to bring their best ideas forward and contribute to our shared goals.

More Information on Yalo
Yalo operates in the machine learning industry. The company has offices in San Francisco, California; Mumbai, Maharashtra; and Bogotá, Cundinamarca. Yalo was founded in 2015 and has 288 employees, with 12 positions currently open.