The Role
Responsible for maintaining and enhancing data warehouses and pipelines, developing scalable ETL processes, and delivering data analysis and reporting through visualization tools like Power BI.
Job Overview
We are seeking a skilled Data Engineer to join our team and drive our data infrastructure forward. In this role, you will primarily focus on maintaining and enhancing our data warehouse and pipelines (80%) while also contributing to data analysis and reporting initiatives (20%). You'll work closely with cross-functional stakeholders to build robust data solutions and create actionable insights through compelling visualizations.
Key Responsibilities
Data Engineering
- Infrastructure Management: Maintain, enhance, and optimize existing data warehouse architecture and ETL pipelines.
- Pipeline Development: Design and implement scalable ETL/ELT processes ensuring data quality, integrity, and timeliness.
- Performance Optimization: Monitor and improve pipeline performance, troubleshoot issues, and implement best practices.
- Documentation: Create and maintain comprehensive documentation for data engineering processes, architecture, and configurations.
Data Analysis & Reporting
- Stakeholder Collaboration: Partner with business teams to gather requirements and translate them into technical solutions.
- Report Development: Build and maintain Power BI dashboards and reports that drive business decisions.
- Data Modeling: Develop new data models and enhance existing ones to support advanced analytics.
- Insight Communication: Transform complex data findings into clear, actionable insights for various departments.
Required Qualifications
Technical Skills
- Programming & Query Languages: Strong proficiency in Python, SQL, and PySpark.
- Big Data Platforms: Experience with cloud data platforms such as Snowflake, BigQuery, and Databricks (Databricks experience highly preferred).
- Orchestration Tools: Proven experience with workflow orchestration tools (Airflow preferred).
- Cloud Platforms: Experience with AWS (preferred), Azure, or Google Cloud Platform.
- Data Visualization: Proficiency in Power BI (preferred) or Tableau.
- Database Systems: Familiarity with relational database management systems (RDBMS).
Development Practices
- Version Control: Proficient with Git for code management and collaboration.
- CI/CD: Hands-on experience implementing and maintaining continuous integration/deployment pipelines.
- Documentation: Strong ability to create clear technical documentation.
Experience & Communication
- Professional Experience: 3+ years in data engineering or closely related roles.
- Language Requirements: Fluent English communication skills for effective collaboration with U.S.-based team members.
- Pipeline Expertise: Demonstrated experience building and maintaining production data pipelines.
Top Skills
AWS
Azure
BigQuery
Databricks
Git
Google Cloud Platform
Power BI
PySpark
Python
Snowflake
SQL
The Company
What We Do
MileIQ provides automatic mileage tracking for deductions and expenses. Whether you're a self-employed worker looking for a mileage deduction or a large company that wants to get a handle on mileage expense reports, MileIQ is automatic, easy, and accurate.
MileIQ is mileage tracking the way you need it to work.
We’d love to hear from you:
• Website: http://www.mileiq.com
• Twitter: twitter.com/mileiq
• Facebook: www.facebook.com/mileiq
• Email: [email protected]