Software Engineer SME (TS/SCI with Poly Required)

Sorry, this job was removed at 06:13 p.m. (CST) on Wednesday, May 13, 2026
Chantilly, VA, USA
In-Office
Information Technology • Software • Analytics • Cybersecurity
The Role

GCI embodies excellence, integrity and professionalism. The employees supporting our customers deliver unique, high-value mission solutions while effectively leveraging the technological expertise of our valued workforce to meet critical mission requirements in the areas of Data Analytics and Software Development, Engineering, Targeting and Analysis, Operations, Training, and Cyber Operations. We maximize opportunities for success by building and maintaining trusted and reliable partnerships with our customers and industry.

At GCI, we solve the hard problems. As a Software Engineer, a typical day will include the following duties:


  • Conduct comprehensive assessments of existing data pipelines, infrastructure, and data flows including integrations with operational systems like ServiceNow, network management platforms, and business applications to identify technical debt, bottlenecks, and reliability issues.
  • Evaluate current data architecture against industry best practices and organizational needs; develop technical recommendations and roadmaps for data infrastructure improvements.
  • Design, build, and maintain production-grade data pipelines using orchestration tools such as Airflow or Prefect.
  • Develop robust ETL (Extract-Transform-Load) and ELT (Extract-Load-Transform) processes from diverse sources: SaaS platforms, network management systems, databases, APIs, files, and streams.
  • Build API integrations handling authentication (OAuth, API keys, and Single Sign-On (SSO)), rate limiting, pagination, retry logic, and error handling.
  • Extract data from systems not designed for export; reverse-engineer undocumented data structures and relationships.
  • Handle semi-structured data (JSON and XML) and transform it into structured datasets with consistent schemas.
  • Design dimensional models, data warehouses, and data marts following industry methodologies.
  • Create conceptual, logical, and physical data models optimized for query performance and storage efficiency.
  • Implement slowly changing dimensions and other data warehousing patterns.
  • Establish naming conventions, data standards, and modeling best practices.
  • Implement comprehensive data quality checks, validation rules, and automated monitoring with alerting.
  • Build error handling, failure recovery, logging, and observability into all processes.
  • Optimize pipelines for performance, cost, and resource utilization.
  • Develop reusable components and frameworks; refactor legacy pipelines for reliability.
  • Build and maintain data infrastructure on cloud platforms (Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP)) using infrastructure-as-code tools such as Terraform and CloudFormation.
  • Implement CI/CD pipelines, version control (Git), and automated testing frameworks.
  • Manage database performance tuning, indexing, partitioning, and capacity planning.
  • Establish backup, recovery, security controls, access controls, and compliance measures.
  • Partner with analysts, software developers, and business stakeholders to translate requirements into technical solutions.
  • Create comprehensive documentation for systems, processes, and integrations.
  • Provide technical guidance on data availability and proper usage; enable self-service access.
  • Troubleshoot pipeline failures, performance issues, and data discrepancies; perform root cause analysis.
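The API-integration duties above (authentication, pagination, retry logic, error handling) can be sketched in a few lines of Python. This is an illustrative sketch only, not part of the posting: `fake_fetch` is a hypothetical stand-in for a real HTTP client call, and the cursor-based page shape is an assumption.

```python
import time

class TransientError(Exception):
    """Stands in for a retryable failure such as HTTP 429/503."""

def with_retries(fn, attempts=3, backoff=0.01):
    """Call fn(), retrying transient failures with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except TransientError:
            if attempt == attempts - 1:
                raise
            time.sleep(backoff * 2 ** attempt)

def fetch_all(fetch_page):
    """Walk a cursor-paginated API until no next cursor is returned."""
    records, cursor = [], None
    while True:
        page = with_retries(lambda: fetch_page(cursor))
        records.extend(page["items"])
        cursor = page.get("next")
        if cursor is None:
            return records

# Simulated API: two pages, failing once to exercise the retry path.
calls = {"n": 0}
def fake_fetch(cursor):
    calls["n"] += 1
    if calls["n"] == 1:
        raise TransientError()          # first call hits a rate limit
    if cursor is None:
        return {"items": [1, 2], "next": "p2"}
    return {"items": [3]}               # final page: no "next" key

print(fetch_all(fake_fetch))  # [1, 2, 3]
```

In practice the retry wrapper would also honor `Retry-After` headers and distinguish retryable from fatal status codes, but the loop structure is the same.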

REQUIRED SKILLS AND DEMONSTRATED EXPERIENCE

  • Demonstrated experience designing, building, and maintaining production data pipelines using orchestration tools such as Apache Airflow or similar.
  • Demonstrated experience with SQL skills including complex queries, optimization, and performance tuning across multiple database platforms.
  • Demonstrated experience integrating data from customer SaaS platforms and operational systems via APIs, including handling authentication, pagination, and rate limiting.
  • Demonstrated experience working with semi-structured data (JSON and XML) from API responses and transforming into structured datasets.
  • Demonstrated experience developing robust API integrations with proper error handling and retry logic.
  • Demonstrated experience working with systems that have limited documentation or vendor-specific data models.
  • Demonstrated experience with dimensional modeling and data warehouse design patterns.
  • Demonstrated proficiency in Python for data engineering including working with data processing libraries.
  • Demonstrated experience with cloud data platforms such as AWS, Azure, or GCP, including data services and infrastructure.
  • Demonstrated experience implementing ETL/ELT processes from diverse data sources.
  • Demonstrated experience with version control (Git) and software engineering best practices.
  • Demonstrated strong problem-solving and troubleshooting skills for complex data pipeline issues.
  • Demonstrated experience implementing data quality checks and validation frameworks.
  • Demonstrated experience translating business requirements into technical data solutions.
  • Demonstrated track record of delivering reliable, scalable data infrastructure.
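The semi-structured-data requirement above (transforming JSON from API responses into structured datasets with a consistent schema) might look like the following sketch; the field names and payload are illustrative, not from any particular system:

```python
import json

def flatten(record, prefix=""):
    """Recursively flatten nested dicts into dotted column names."""
    row = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, prefix=name + "."))
        else:
            row[name] = value
    return row

def to_rows(payload, columns):
    """Project flattened records onto a fixed schema (missing -> None)."""
    flat = [flatten(r) for r in payload]
    return [{c: r.get(c) for c in columns} for r in flat]

# Illustrative API response with inconsistent nesting between records.
payload = json.loads("""[
  {"id": 1, "owner": {"name": "alice", "org": "net-ops"}},
  {"id": 2, "owner": {"name": "bob"}}
]""")

rows = to_rows(payload, columns=["id", "owner.name", "owner.org"])
print(rows)
# [{'id': 1, 'owner.name': 'alice', 'owner.org': 'net-ops'},
#  {'id': 2, 'owner.name': 'bob', 'owner.org': None}]
```

Fixing the column list up front is what gives downstream consumers a consistent schema even when individual API records omit fields.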

HIGHLY DESIRED SKILLS AND DEMONSTRATED EXPERIENCE

  • Demonstrated experience with ServiceNow APIs, data models, and integration patterns.
  • Demonstrated experience with network management or IT operations systems data extraction.
  • Demonstrated experience with Forward Networks, NetM, SolarWinds, or similar network management platforms.
  • Demonstrated experience and knowledge of ITSM (Information Technology Service Management), ITOM (Information Technology Operations Management), and CMDB (Configuration Management Database) data structures and relationships.
  • Demonstrated experience with API gateway platforms and API management tools.
  • Demonstrated experience with Apache Spark, particularly PySpark, for distributed data processing.
  • Demonstrated experience with DBT (data build tool) for transformation workflows.
  • Demonstrated experience with infrastructure-as-code tools such as Terraform or CloudFormation.
  • Demonstrated experience implementing CI/CD (Continuous Integration / Continuous Delivery) pipelines for data engineering code.
  • Demonstrated experience and knowledge of streaming data technologies such as Kafka, Kinesis, or similar platforms.
  • Demonstrated experience with data quality platforms such as Great Expectations, Soda, or Monte Carlo.
  • Demonstrated experience implementing data observability and monitoring solutions.
  • Demonstrated experience and knowledge of Data Vault or other advanced modeling methodologies.
  • Demonstrated experience with containerization (Docker) and orchestration (Kubernetes) for data workloads.
  • Demonstrated experience with reverse ETL and operational analytics patterns.
  • Demonstrated experience with data governance platforms and metadata management tools.
  • Demonstrated experience with multiple cloud platforms and multi-cloud architectures.
  • Demonstrated experience mentoring or leading data engineering initiatives.

Candidates must be US Citizens and hold an active/current TS/SCI with Polygraph clearance.

Equal Opportunity Employer / Individuals with Disabilities / Protected Veterans

Qualifications
Education: BA/BS or better preferred.
Experience: Demonstrated work-related experience required.
This employer is required to notify all applicants of their rights pursuant to federal employment laws. For further information, please review the Know Your Rights notice from the Department of Labor.

The Company
HQ: Reston, Virginia
180 Employees
Year Founded: 1989

What We Do

GCI is an Engineering and IT Services company focusing on Data Analytics, Engineering, Cyber Operations, Targeting and Analysis, Operations Solutions and Training. We help our customers solve their greatest challenges by providing exceptional consulting and mission solutions.
