Company Description
Do you want to make the internet work better for millions of people?
IFTTT helps everything work better together. With over 25M users, 160k Pro customers, and 1,000+ supported services, we are the established no-code standard for connecting anything in our growing digital world. We believe IFTTT can become the platform of choice for Digital Creators and DIYers looking to automate their businesses, grow their communities, and connect their homes.
We’re looking for a mid-level Data Engineer to help us scale our data infrastructure, optimize data workflows, and collaborate closely with cross-functional teams to enable robust data-driven insights.
This role is perfect for someone who enjoys developing scalable data solutions that power meaningful analytics and insights. If you are excited about working with a high-traffic platform and designing efficient data solutions, we’d love for you to join us! This position is fully remote.
Job Description
You will:
- Design and maintain scalable data pipelines to ensure seamless data flow.
- Optimize existing data systems for performance and reliability.
- Implement best practices for data security, governance, and quality.
- Troubleshoot and resolve data pipeline issues in a timely manner.
You are:
- Passionate about building reliable and scalable data infrastructure.
- Eager to improve data systems and identify opportunities for optimization.
- Detail-oriented with a focus on data quality and best practices.
- A strong communicator who can work effectively with various teams.
- Enthusiastic about learning new technologies and staying updated in the field.
Qualifications
Must-haves:
- 5+ years of experience in data engineering or a related field.
- Proficiency in building and managing data pipelines with tools such as Apache Airflow or Apache Spark.
- Strong SQL skills and experience working with relational and non-relational databases.
- Experience with cloud services (e.g., AWS, GCP, Azure) for data storage and processing.
- Familiarity with data modeling and data warehousing concepts.
- Proven ability to debug and optimize complex data workflows.
Nice-to-haves:
- Knowledge of Python or other programming languages used for data processing.
- Experience with container technologies (e.g., Docker, Kubernetes).
- Familiarity with stream-processing technologies such as Apache Kafka or Amazon Kinesis.
- Exposure to data governance and data security practices.
- Experience working with big data tools (e.g., Hadoop, Hive).
- Understanding of ETL/ELT best practices and tools.
- Familiarity with Ruby on Rails.
Additional Information
Benefits:
- The annual salary for this position is $100,000 to $160,000 + equity and benefits. The starting pay for the successful applicant will depend on various job-related factors, which may include skills, education, training, experience, or location.
- Fast-paced and collaborative remote environment where you will regularly engage with our senior leadership
- Competitive compensation
- Medical, dental, vision, and life insurance
- Transit, FSA, HSA, & 401(k) benefits
- Meaningful Equity
- Flexible PTO policy
- Generous paid holidays and company-wide days off each year
- New Hire Stipend (remote workstation)
What We Do
IFTTT is the world’s leading connectivity platform. We help over 600 global enterprises accelerate the digital transformation of their products into integrated services, dramatically reducing their development costs while extending compatibility and lifetime value. IFTTT is the connectivity standard and low-code alternative to building your own integrations in-house.
Our products have attracted 18 million consumers across 140 countries and served over 90 million activated connections. Enterprises like Domino’s, Amazon, Bosch, ING, and Samsung trust IFTTT for their connectivity solutions.
Created in San Francisco, IFTTT is backed by Andreessen Horowitz, IBM, Norwest Venture Partners, and Salesforce Ventures.