Data Analyst

Posted 9 Days Ago
Leesburg, VA, USA
In-Office
Mid level
Cloud • Information Technology • Analytics
The Role
The Data Analyst will design and maintain data pipelines, perform analysis of datasets, and provide insights to support decision-making for federal programs.
Anika Systems is an outcome-driven technology solutions firm that guides federal agencies in solving complex business challenges and preparing for the future. Our services span AI Strategy, Data Intelligence, AI & Machine Learning, Intelligent Automation, Enterprise Platforms and Engineering, with a specialized focus on National Security and Federal Financial programs. We are dedicated to delivering forward-thinking solutions that accelerate the critical missions of our government clients.  This position is 100% remote.
Position Summary
We are seeking a highly collaborative and experienced Data Analyst to support the Office of Chief Data Officer (OCDO) and the Office of Performance Quality (OPQ) on a federal government contract. In this role, you will design and maintain robust data pipelines, perform in-depth analysis of large-scale datasets, and deliver actionable insights that drive mission decisions. You will work within a Databricks environment, leveraging SQL, PySpark, and Python to transform raw agency data into reliable, governed, analytics-ready assets. The ideal candidate combines strong engineering fundamentals with analytical acumen and is comfortable operating in complex federal data environments.
Candidates must be U.S. citizens able to obtain and maintain a government suitability clearance.
Key Responsibilities
Data Engineering & Pipeline Development
  • Design, build, and maintain scalable ETL/ELT data pipelines using PySpark and Python within Databricks environments.
  • Develop and optimize SQL queries and data models to support analytical and reporting workloads.
  • Automate data ingestion workflows from disparate agency sources including APIs, flat files, relational databases, and streaming feeds.
  • Monitor pipeline health, resolve data quality issues, and implement alerting and logging to ensure reliability of data products.
  • Collaborate with data architects to design and enforce data schemas, partitioning strategies, and performance optimization practices.
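As a hypothetical illustration of the pipeline work described above, here is a minimal ingest-and-validate step in plain Python (standing in for PySpark, since the actual stack is Databricks; the record fields and quality rules are invented):

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def ingest_and_validate(raw_rows):
    """Toy ETL step: keep rows with a non-empty id and a numeric amount,
    logging how many records survived (basic data-quality monitoring)."""
    clean = []
    for row in raw_rows:
        if row.get("id") and isinstance(row.get("amount"), (int, float)):
            clean.append(row)
    log.info("ingested %d rows, kept %d", len(raw_rows), len(clean))
    return clean

rows = [
    {"id": "a1", "amount": 10.0},
    {"id": "", "amount": 5.0},      # fails quality check: empty id
    {"id": "a2", "amount": "oops"}, # fails quality check: non-numeric amount
]
print(len(ingest_and_validate(rows)))  # 1 valid row survives
```

In a real Databricks pipeline the same shape appears as a PySpark transformation with filters and row-count metrics, but the pattern of validate, log, and pass downstream is the same.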
Data Analysis & Reporting
  • Conduct exploratory data analysis to identify trends, anomalies, and opportunities for improvement.
  • Develop self-service analytics dashboards and reports using Databricks SQL, Tableau, or Power BI.
  • Write complex, performant SQL queries against large datasets to answer ad hoc analytical requests from program managers and leadership.
  • Translate business questions into clearly scoped analytical tasks and deliver findings as data visualizations, written summaries, or briefings.
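A sketch of the kind of ad hoc SQL request described above, using Python's built-in sqlite3 so it runs anywhere (the table, columns, and "latest status per program" question are all invented for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE status_reports (program TEXT, reported_on TEXT, status TEXT);
INSERT INTO status_reports VALUES
  ('alpha', '2024-01-01', 'green'),
  ('alpha', '2024-02-01', 'yellow'),
  ('beta',  '2024-01-15', 'green');
""")

# CTE plus a window function: rank each program's reports newest-first,
# then keep only the most recent row per program.
query = """
WITH ranked AS (
  SELECT program, status,
         ROW_NUMBER() OVER (
           PARTITION BY program ORDER BY reported_on DESC
         ) AS rn
  FROM status_reports
)
SELECT program, status FROM ranked WHERE rn = 1 ORDER BY program;
"""
print(con.execute(query).fetchall())  # [('alpha', 'yellow'), ('beta', 'green')]
```

The same CTE-plus-window-function pattern carries over directly to Databricks SQL against much larger tables.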
Collaboration & Stakeholder Support
  • Work closely with data scientists, program analysts, IT engineers, and agency stakeholders to understand data needs and deliver tailored solutions.
  • Document pipelines, data models, and analytical notebooks to support knowledge transfer, peer review, and audit readiness.
  • Participate in Agile sprint ceremonies, contribute to backlog grooming, and deliver iterative data products aligned with program priorities.
Required Qualifications
  • Bachelor's degree in Computer Science, Information Systems, Data Science, Engineering, Mathematics, or a related technical field.
  • 3+ years of experience in data engineering, data analytics, or a closely related discipline.
  • Demonstrated experience on federal government programs or supporting a federal agency data environment.
  • Strong proficiency in SQL — including complex joins, window functions, CTEs, and query performance tuning against large datasets.
  • Hands-on experience with PySpark for distributed data processing, transformations, and optimization techniques.
  • Proficiency in Python for scripting, data manipulation, and automation.
  • Direct experience working within Databricks, including notebooks, jobs, clusters, and Unity Catalog.
  • Familiarity with data lakehouse concepts, including Delta Lake and the bronze/silver/gold (medallion) architecture.
  • Experience with version control systems (Git/GitHub/GitLab) and collaborative development workflows.
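The bronze/silver/gold (medallion) pattern mentioned above can be sketched with plain Python lists standing in for Delta tables (the agency names and fields are invented): bronze holds raw as-received records, silver holds validated and typed rows, and gold holds aggregated, analytics-ready results.

```python
bronze = [  # raw landing zone: as-received, possibly dirty
    {"agency": "OCDO", "spend": "100"},
    {"agency": "OCDO", "spend": "250"},
    {"agency": "OPQ",  "spend": "bad"},
]

def to_silver(rows):
    """Silver layer: validated, typed records; unparseable rows are dropped."""
    out = []
    for r in rows:
        try:
            out.append({"agency": r["agency"], "spend": float(r["spend"])})
        except ValueError:
            pass
    return out

def to_gold(rows):
    """Gold layer: aggregated totals per agency, ready for reporting."""
    totals = {}
    for r in rows:
        totals[r["agency"]] = totals.get(r["agency"], 0.0) + r["spend"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'OCDO': 350.0}
```

Each layer only ever reads from the one before it, which is the core discipline the medallion design enforces in Delta Lake.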
Preferred Qualifications
  • Databricks Certified Associate Developer for Apache Spark or Databricks Certified Data Engineer Associate/Professional.
  • Experience with cloud platforms such as AWS GovCloud, Microsoft Azure Government, or Google Cloud for Government.
  • Familiarity with CI/CD practices for data pipelines, including automated testing and deployment using tools like Azure DevOps or GitHub Actions.
  • Working knowledge of data visualization platforms (Tableau, Power BI) and experience connecting them to Databricks SQL endpoints.
  • Familiarity with Unity Catalog for data access control, lineage, and governance within Databricks.

Top Skills

AWS GovCloud
Databricks
Google Cloud for Government
Microsoft Azure Government
Power BI
PySpark
Python
SQL
Tableau

The Company
HQ: Leesburg, VA
71 Employees
Year Founded: 2005

What We Do

Anika Systems is an SBA-certified 8(a) and EDWOSB firm. Given the pace at which the government is changing, there is a need for technology consulting companies that rise to the challenge: taking a fresh approach to problems, delivering solutions that get to market faster, offering service that exceeds customers' expectations, and disrupting the status quo. Anika Systems is an outcome-driven technology consulting firm that helps federal agencies solve business problems and prepare for the future, with services and solutions spanning Data and Analytics, Intelligent Automation, IT Modernization, Application Development, and Cloud Engineering.

We're a team of thinkers, lifelong learners, makers, and doers who deeply understand our federal government customers' missions and goals. Our teams are closely connected and bring their shared experiences and insights to every engagement. With a "show me over tell me" philosophy embedded in our corporate DNA, we produce Minimum Viable Products (MVPs) and delight our customers with working solutions, not slide decks. We build these MVPs in our poly-cloud Virtual Innovation Transformation Acceleration Lab (VITAL), where we synthesize ideas into business concepts (intake), select ideas (assess, evaluate, decide), and implement them with the appropriate technology (fulfillment). We specialize in building agency-wide Centers of Excellence for Data and Analytics, Intelligent Automation, and Cloud Management.

