Data Engineer

Posted 4 Days Ago
Makati, Fourth District NCR, National Capital Region, PHL
Hybrid
Senior level
Cloud • Information Technology • Consulting
The Role
Design, implement, and maintain scalable data infrastructure for analytics and operational decision-making, optimizing data pipelines and ensuring data quality.

Makati City (Robinsons Summit Center), Manila, PH

Hybrid 

8 am - 5 pm PST 

Rocket Partners employees are passionate about solving complex technology challenges for our clients and harnessing the power of data and the cloud to accelerate creative solutions. We are looking for a Data Engineer to join our growing team of experts: builders and leaders who can design, build, and deliver net-new solutions from scratch while guiding our growing organization and client base to continued scalable, sustainable success. Rocket Partners embodies the values of curiosity, initiative, follow-through, and growth.

Founded in 2016, Rocket Partners has grown remarkably. Adding to that success, our company is not backed by a VC or PE firm: we are continually profitable on our own. As the business continues to grow, we are looking for employees who share our passion for driving the success of our clients.

We are seeking an experienced Data Engineer to join our team. In this role, you will design, build, and maintain a production-grade Microsoft Fabric data lakehouse using Medallion Architecture (Bronze/Silver/Gold layers). The platform integrates data from multiple source systems via custom API connectors and Fabric Data Pipelines, serving Power BI semantic models with row-level security for 40 users. This is a hands-on engineering role where you will write Python/PySpark notebooks, build incremental ingestion pipelines, implement CI/CD via Fabric Deployment Pipelines, and own data quality across all layers.
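The Bronze-to-Silver step of the Medallion flow described above (cleanse, type, and deduplicate raw landed records) can be sketched in plain Python; the production version would run as a PySpark notebook against Delta tables in Fabric, and the field names here (`order_id`, `customer`, `amount`, `updated_at`) are purely illustrative:

```python
from datetime import datetime

def bronze_to_silver(bronze_rows):
    """Cleanse, type, and deduplicate raw Bronze records into Silver.

    Each Bronze row is a raw dict as landed from a source system;
    Silver keeps only the latest record per business key, with
    fields typed and whitespace trimmed.
    """
    latest = {}
    for row in bronze_rows:
        key = row["order_id"]
        cleaned = {
            "order_id": int(key),
            "customer": row["customer"].strip(),
            "amount": float(row["amount"]),
            "updated_at": datetime.fromisoformat(row["updated_at"]),
        }
        # Deduplicate: keep the most recently updated record per key
        if key not in latest or cleaned["updated_at"] > latest[key]["updated_at"]:
            latest[key] = cleaned
    return list(latest.values())

bronze = [
    {"order_id": "1", "customer": " Acme ", "amount": "9.50",
     "updated_at": "2024-01-01T10:00:00"},
    {"order_id": "1", "customer": "Acme", "amount": "12.00",
     "updated_at": "2024-01-02T10:00:00"},  # later duplicate wins
]
silver = bronze_to_silver(bronze)
```

In a real pipeline the same logic would typically be expressed as a window-function dedup or a Delta `MERGE` rather than an in-memory dict.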

Responsibilities

  • Build and maintain ETL/ELT pipelines in Microsoft Fabric Notebooks (Python/PySpark) ingesting data from multiple source systems via REST and SOAP APIs

  • Implement Bronze layer landing (raw data, no transformation), Silver layer cleansing/typing/deduplication, and Gold layer aggregation for business analytics

  • Design and build custom API connectors with OAuth 2.0/Bearer token authentication, incremental sync, pagination handling, rate-limit/retry logic, and error recovery

  • Configure and manage Fabric Workspaces across Dev/Test/Production environments using Fabric Deployment Pipelines and Git integration for CI/CD 

  • Build and maintain Power BI semantic models (star/snowflake schemas) supporting operational reporting and analytics dashboards 

  • Implement row-level security (RLS) using Azure AD RBAC, warehouse-level DAX filters, and dynamic RLS patterns

  • Set up monitoring, alerting, and telemetry using Azure Monitor, Log Analytics, and Application Insights to track pipeline health and data freshness

  • Manage API credentials and secrets via Azure Key Vault with automated rotation policies 

  • Collaborate with the client’s BI team through knowledge transfer sessions, pair development, and documentation handover 

  • Contribute to data governance: data dictionary creation, lineage tracking, and documentation of transformation logic across all pipeline stages
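The connector responsibilities above (incremental sync, pagination, rate-limit/retry) follow a common pattern that can be sketched with a stubbed transport standing in for a real REST client; the function names, the cursor-based paging scheme, and the watermark parameter are assumptions for illustration, not a specific source system's API:

```python
import time

def fetch_all(fetch_page, since, max_retries=3, backoff=0.01):
    """Incrementally pull all pages changed since a watermark.

    fetch_page(since, cursor) returns (records, next_cursor) and may
    raise RuntimeError on a transient failure (e.g. HTTP 429); we
    retry with exponential backoff, then give up.
    """
    records, cursor = [], None
    while True:
        for attempt in range(max_retries):
            try:
                page, cursor = fetch_page(since, cursor)
                break
            except RuntimeError:
                if attempt == max_retries - 1:
                    raise
                time.sleep(backoff * 2 ** attempt)  # exponential backoff
        records.extend(page)
        if cursor is None:  # no more pages
            return records

# Stubbed transport: two pages, first call fails once to exercise retry
calls = {"n": 0}
def fake_fetch(since, cursor):
    calls["n"] += 1
    if calls["n"] == 1:
        raise RuntimeError("429 Too Many Requests")
    if cursor is None:
        return [{"id": 1}], "page2"
    return [{"id": 2}], None

rows = fetch_all(fake_fetch, since="2024-01-01")
```

A production connector would add token refresh (OAuth 2.0), honor `Retry-After` headers, and persist the watermark between runs.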

Requirements

  • 5+ years of experience in data engineering or a similar role in a commercial environment

  • Hands-on experience with Microsoft Fabric (Lakehouses, Notebooks, Data Pipelines, OneLake, SQL Analytics Endpoints) or equivalent depth in Azure Synapse Analytics

  • Advanced SQL for data transformation, performance tuning, and Delta Lake table management

  • Proficient in Python/PySpark for data processing, API integration, and pipeline automation

  • Proven experience building custom API connectors (REST, SOAP/XML) with OAuth 2.0, pagination, rate limiting, and incremental sync patterns

  • Strong understanding of Medallion Architecture, data lakehouse concepts, and Delta Lake (merge, upsert, soft-delete handling, time travel)

  • Experience with dimensional modelling (star/snowflake schemas) and Power BI semantic model development 

  • Working knowledge of Azure services: Key Vault, Azure Monitor, Log Analytics, Azure AD/Entra ID

  • Experience implementing row-level security in Power BI (DAX-based RLS, dynamic security models)

  • Familiarity with CI/CD in a Fabric context: Deployment Pipelines, Git integration, environment promotion workflows

  • Experience with semi-structured data formats (JSON, XML) and handling schema evolution across pipeline layers
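The Delta Lake merge/upsert and soft-delete handling listed above can be illustrated with a pure-Python stand-in for `MERGE INTO` semantics; in Fabric this would be a Delta table `merge` call in PySpark, and the key name and `deleted`/`is_deleted` flags here are assumptions for the sketch:

```python
def merge_upsert(target, updates, key="id"):
    """Apply Delta-style MERGE semantics to an in-memory 'table'.

    - matched and update flags 'deleted' -> soft-delete (flag, keep row)
    - matched otherwise                  -> update in place
    - not matched                        -> insert
    target and the return value map key -> row dict.
    """
    merged = {k: dict(v) for k, v in target.items()}
    for row in updates:
        k = row[key]
        if k in merged and row.get("deleted"):
            merged[k]["is_deleted"] = True   # soft delete: keep history
        elif k in merged:
            merged[k].update(row)            # update matched row
        else:
            merged[k] = dict(row)            # insert new row
    return merged

target = {1: {"id": 1, "qty": 5}}
updates = [
    {"id": 1, "qty": 7},          # update
    {"id": 2, "qty": 3},          # insert
    {"id": 1, "deleted": True},   # soft delete
]
merged = merge_upsert(target, updates)
```

Soft deletes keep the row (and its history, via Delta time travel) rather than physically removing it, which is what makes downstream incremental consumers reliable.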

Preferred

  • Prior work in retail, wholesale, distribution, or fresh produce / supply chain data environments

  • Experience with ERP data extraction and replication tools in a Fabric or Azure context

  • Knowledge of dbt, Power Query, or similar transformation frameworks

  • Relevant Azure or Microsoft certifications (DP-600, DP-203, PL-300)

  • Openness to AI coding assistants (GitHub Copilot, Cursor, Claude) as part of your development workflow

What We Offer

  • Competitive salary with mentorship and career growth opportunities

  • Hands-on work with cloud-first technologies (Microsoft Fabric, Azure, Power BI)

  • Direct client-facing engagement with an international team (Atlanta, Madrid, Manila) 

  • Fast-paced, innovative environment where AI-augmented development is the norm


The Company
Atlanta, Georgia
40 Employees
Year Founded: 2016

What We Do

At Rocket Partners, we understand the challenges faced by enterprises dealing with large, cumbersome software solutions, often with limited technology resources and a need for tailored solutions that drive revenue growth, enhance operational efficiency, and foster innovation. Our team of specialized experts is here to overcome the innovation barriers holding you back: with our talent, experience, and dedication, we ensure that your existing solutions are enhanced to their fullest potential, empowering your business for future success.

We specialize in providing managed services, custom solutions, and program management expertise to large corporations across various industries. Our mission is simple: to be your trusted partner in navigating the complex world of technology.

We're a team of over 50 high-performing cloud experts who build next-generation custom software solutions from the ground up for our clients and partners, and we are passionate about delivering end-to-end excellence. "Unlock the true potential of your enterprise solutions."
