C2 SMART Integration Engineer (Data Acquisition)

Posted Yesterday
Hiring Remotely in USA
Remote
Senior level
Software
The Role
The Integration Engineer designs and implements data integration solutions to enable real-time insights, ensuring data quality and compliance while managing complex data architectures across domains.
Summary Generated by Built In

About the Role

Are you ready to build the data backbone that enables trusted, real-time insight across complex mission environments? As an Integration Engineer, you will design and implement data integration solutions that move data securely and reliably across diverse domains while ensuring data consistency, quality, and governance. You'll build streaming and batch pipelines to support operational intelligence, cross-domain analytics, and enterprise data products, often in environments with strict security, segmentation, and governance requirements. Working closely with domain experts and platform teams, you'll help ensure data is consistent, high-quality, and available where decisions are made, even across distributed and constrained architectures.


Responsibilities

  • Design and implement integration patterns that enable seamless data flow across multiple business domains.
  • Navigate domain-specific security models, network segmentation, and data sovereignty requirements.
  • Implement cross-domain service architectures using APIs, event streaming, and data virtualization to decouple source and consumer domains.
  • Collaborate with domain data stewards to define service-level agreements (SLAs), data contracts, and handshake protocols between domains.
  • Build scalable data acquisition pipelines from diverse sources.
  • Implement change data capture (CDC) using Debezium, AWS DMS, or similar tools for database sources.
  • Develop resilient ingestion frameworks that handle variable data volumes, network latency, and source system unreliability.
  • Architect, deploy, and manage Apache Kafka clusters across multiple domains or environments (on-premise, cloud, hybrid).
  • Implement Kafka Streams or ksqlDB for real-time data enrichment and transformation.
  • Design canonical data models that serve as the lingua franca for cross-domain data exchange.
  • Collaborate with domain experts to align business definitions, hierarchies, and metrics across functions.
  • Implement cross-domain security controls including encryption in transit, encryption at rest, and fine-grained access controls (RBAC, ABAC).
  • Ensure compliance with regulatory requirements (GDPR, CCPA, SOX, etc.) across domain boundaries.
  • Set up alerting for pipeline failures, data latency, schema drift, and cross-domain connectivity issues.
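For illustration, the change data capture work described above is typically set up by registering a connector with Kafka Connect. The sketch below is a minimal Debezium MySQL connector configuration, assuming a recent Debezium release; the hostnames, credentials, and table names are placeholders, and exact property names vary somewhat between Debezium versions.

```json
{
  "name": "inventory-cdc-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "mysql.example.internal",
    "database.port": "3306",
    "database.user": "cdc_user",
    "database.password": "********",
    "database.server.id": "184054",
    "topic.prefix": "inventory",
    "table.include.list": "inventory.orders,inventory.customers",
    "schema.history.internal.kafka.bootstrap.servers": "kafka.example.internal:9092",
    "schema.history.internal.kafka.topic": "schema-changes.inventory"
  }
}
```

Posted to the Kafka Connect REST API, a configuration along these lines streams row-level changes from the listed tables into Kafka topics under the `inventory` prefix.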

Required Qualifications:

  • 5+ years of experience in data engineering, data integration, or software engineering with a focus on enterprise-scale environments.
  • Proven experience designing and operating cross-domain data integration architectures in large enterprises.
  • Experience navigating network segmentation, firewall policies, and security zones in hybrid or multi-cloud environments.
  • Production experience with Apache Kafka, including Kafka cluster administration (brokers, topics, partitions, replication, consumer groups).
  • Experience with managed Kafka services: Confluent Cloud, Amazon MSK, Azure Event Hubs, or similar.
  • Experience with cross-cluster replication, disaster recovery, and multi-region Kafka architectures.
  • Proven experience acquiring data from:
      ◦ Enterprise applications (SAP, Oracle EBS, JD Edwards, Salesforce)
      ◦ APIs (REST, GraphQL, SOAP) with advanced handling of rate limits, pagination, and authentication
      ◦ Databases via CDC (Debezium, Oracle GoldenGate, AWS DMS)
  • Experience with edge data acquisition and IoT platforms (AWS IoT Core, Azure IoT Hub).
  • Deep experience with enterprise data modeling across multiple domains.
  • Proficiency with data modeling tools (ERwin, ER/Studio, SAP PowerDesigner, or open-source alternatives).
  • Advanced programming proficiency for custom integration development, Kafka producers/consumers, and automation.
  • Experience with Kafka client libraries and stream processing applications.
  • Expert-level skills in data validation, reconciliation, and complex transformations.
  • Deep experience with AWS (MSK, ECS, Lambda, S3, IAM, VPC) or Azure (Event Hubs, Data Factory, Synapse, Databricks).
  • Docker, Kubernetes, Helm for deploying streaming applications.
  • Experience with cloud data warehouses (Snowflake, BigQuery, Redshift).
  • Git and CI/CD pipelines (GitHub Actions, GitLab CI, Jenkins).
  • Terraform, AWS CloudFormation, or Azure Resource Manager.
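As a rough sketch of the API acquisition skills listed above (rate limits, pagination), the helper below walks a paginated endpoint and retries with exponential backoff when the source signals a rate limit. The page shape (`{"items": ..., "next_page": ...}`) and the use of `RuntimeError` as the rate-limit signal are hypothetical conventions for the example, not tied to any specific vendor API.

```python
import time
from typing import Callable, Iterator

def fetch_all_pages(
    fetch_page: Callable[[int], dict],
    max_retries: int = 3,
    backoff_seconds: float = 1.0,
) -> Iterator[object]:
    """Yield all records from a paginated source, retrying rate-limited pages.

    `fetch_page(page)` is assumed to return a dict shaped like
    {"items": [...], "next_page": int | None} and to raise RuntimeError
    when the source rate-limits the request (a stand-in for HTTP 429).
    """
    page = 1
    while page is not None:
        for attempt in range(max_retries):
            try:
                body = fetch_page(page)
                break
            except RuntimeError:
                # Exponential backoff: 1x, 2x, 4x, ... the base delay.
                time.sleep(backoff_seconds * (2 ** attempt))
        else:
            raise RuntimeError(f"page {page} failed after {max_retries} retries")
        yield from body["items"]
        page = body.get("next_page")  # None ends the walk
```

For example, with `fetch_page = lambda p: pages[p]` over a dict of fake pages, iterating `fetch_all_pages` yields every item in page order; a production version would plug in a real HTTP client and authentication.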

Top Skills

Apache Kafka
AWS CloudFormation
AWS DMS
AWS IoT Core
Azure Event Hubs
Azure Resource Manager
BigQuery
Debezium
Docker
Git
Kubernetes
Redshift
Snowflake
Terraform

The Company
Millersville, MD
61 Employees
Year Founded: 2002

What We Do

Ready to advance your career as an agent of change? View our available positions at i4dm.com/resourcing/careers or forward your resume to [email protected].

i4DM is a full-service information technology firm that believes in the versatility of IT. i4DM was founded in 2002 by Michael Peart and partner Ben Hannon. Forged together by Michael's military background and Ben's passion for technology, they created a company grounded in military values, dedicated to serving clients through innovation and strategy. With a client-first approach, the team is equipped with the necessary certifications and skill sets to serve all industries.

Through market expansion, joint ventures, and new locations, i4DM has grown into an industry leader that revolutionizes the way information technology is leveraged by clients to accomplish their missions. i4DM is passionate about empowering clients' information technology to incite change, increase productivity, and keep them one step ahead in a dynamic market. Aiming for excellence and delivering innovation, they go beyond the routine and create entirely customized solutions. They believe in the spirit of collaboration, exploring the line of the unknown, and pushing the boundaries of what's possible with technology solutions.
