Data Quality Engineering Lead

Posted 14 Days Ago
Dearborn, MI
In-Office
Mid level
Information Technology • Software • Analytics
The Role
Lead data quality testing for Insight products: create and execute manual and automated tests, validate dashboards and ETL pipelines in Databricks/Azure Data Factory, test AI prompting tools, write SQL/Python/PySpark tests, log defects in Jira, build PyTest automation, and coordinate with stakeholders and offshore teams.


Job Summary: This position is for an Onshore Test Lead to support the client's Insight suite of products. The Test Lead will be responsible for leading testing efforts for Insight products, including gathering requirements, creating test cases, executing tests, logging and retesting defects, and coordinating with various stakeholders: technical teams, business teams, product owners, project managers, external vendors, and offshore teams.

This is a data-centric product, and we are looking for a candidate who is a data enthusiast with a zeal for working with data and for analyzing and understanding the various metrics derived from it. Experience with testing AI prompting tools is a must.

Essential Job Functions:

  • Test various dashboards and certify that the metrics on these dashboards are correct after comparing them with the underlying backend data.
  • Test prompt-based AI tools to make sure the prompts return the correct values to the UI.
  • Verify the accuracy of data as various business rules are applied.
  • Understand the data flow, validate the content on UI screens, and understand and test the business rules involved in data transformation and data aggregation.
  • Create and execute scenarios to test various APIs, preparing request blocks and analyzing the responses in JSON/XML formats.
  • Validate the flow of data from disparate sources ingested into multiple databases inside Databricks, after which the data is transformed by pipelines and workflows built within Azure Databricks and Azure Data Factory (the ETL process).
  • Thoroughly test the ETL rules built for data transformation and the complex business rules built for data aggregation.
  • Demonstrate strong SQL skills, including the ability to understand and write complex queries.
  • Execute tests using SQL, Python, or PySpark, per the user stories, to validate the data inside various databases within the Databricks environment.
  • Test the different source and target tables in Azure Databricks whose data is sourced, cleansed, transformed, joined, and aggregated before the final data is sent to downstream applications.
  • Automate recurring QA processes using languages such as Python, PySpark, or Java as needed.
  • Design and build out an automation framework using PyTest to validate different scenarios and their data. This includes both automating new tests and updating existing scripts.
  • Have prior exposure to code repository tools: creating branches, opening pull requests, and performing code merges.
  • Have prior exposure to SonarQube: maintaining code quality, fixing code smells, etc.
  • Create and execute detailed manual test cases as needed, using functional requirements and technical specifications within Jira, to ensure quality and accuracy.
  • Log appropriate defects within Jira when the product does not conform to specifications.
  • Participate in daily stand-ups with the project team as part of the agile methodology.
  • Coordinate with development team members on defect validation and assist them in re-creating defects.
  • Create appropriate test cases within TestRail Test Management tool.
  • Update task information in Jira as appropriate to communicate progress to the onshore test lead.
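The source-to-target validation work described above can be illustrated with a small sketch. This is a hypothetical example, not code from the actual Insight products: it uses an in-memory SQLite database to stand in for Databricks source and target tables, and a PyTest-style test that recomputes the aggregation from the source and compares it with what the (assumed) ETL pipeline produced.

```python
# Hypothetical sketch of a source-to-target reconciliation test.
# Table names, columns, and the aggregation rule are illustrative;
# in practice the queries would run against Databricks via SQL/PySpark.
import sqlite3

def setup_demo_db():
    """Create an in-memory DB standing in for source and target tables."""
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src_orders (region TEXT, amount REAL);
        INSERT INTO src_orders VALUES ('east', 10.0), ('east', 5.0), ('west', 7.5);
        -- Target table as built by the (hypothetical) ETL pipeline:
        CREATE TABLE tgt_region_totals (region TEXT, total REAL);
        INSERT INTO tgt_region_totals VALUES ('east', 15.0), ('west', 7.5);
    """)
    return conn

def test_region_totals_match_source():
    """Recompute the aggregation from source and compare with the target."""
    conn = setup_demo_db()
    expected = dict(conn.execute(
        "SELECT region, SUM(amount) FROM src_orders GROUP BY region"))
    actual = dict(conn.execute("SELECT region, total FROM tgt_region_totals"))
    assert actual == expected, f"mismatch: {actual} != {expected}"
```

A test like this would live in a PyTest suite and be discovered automatically by its `test_` prefix; parametrizing it over table pairs is one way to cover many source/target mappings with one function.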
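The API-testing bullet above (preparing request blocks and analyzing JSON responses) can likewise be sketched. The field names and the response shape here are assumptions for illustration only; a real test would check the fields defined in the product's API contract.

```python
# Hypothetical sketch: checking a JSON API response against expected fields.
# The field names ("metric", "value", "as_of") are illustrative assumptions.
import json

def validate_metric_response(raw: str) -> list[str]:
    """Return a list of problems found in a metrics API response."""
    problems = []
    payload = json.loads(raw)
    for field in ("metric", "value", "as_of"):
        if field not in payload:
            problems.append(f"missing field: {field}")
    if "value" in payload and not isinstance(payload["value"], (int, float)):
        problems.append("value is not numeric")
    return problems

sample = '{"metric": "daily_orders", "value": 1532, "as_of": "2024-06-01"}'
print(validate_metric_response(sample))  # → []
```

In a PyTest suite, each assertion on the response (status code, schema, field values) would typically be its own test so a single failure pinpoints the defect to log in Jira.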

Minimum Qualifications and Job Requirements:

  • 3+ years of strong experience writing complex SQL queries.
  • 3+ years of experience building test automation for data processing within data-intensive projects.
  • 3+ years of experience in Python data-management programming or PySpark (a must).
  • 2+ years working with Apache Delta Lake or Databricks; Azure Databricks preferred.
  • 1+ year of experience testing AI chat/prompting tools.
  • Experience with code repository tools: creating branches, opening pull requests, and performing code merges.
  • Good understanding of file formats, including JSON, Parquet, Avro, and others.
  • Ability to learn new technologies quickly
  • Excellent problem-solving skills
  • Working understanding of clean code software development principles.
  • Knowledge of Jira
  • Ability to handle multiple tasks/projects concurrently and meet deadlines.
  • Ability to work in a fast-paced team environment. Expectations include a high level of initiative and a strong commitment to job knowledge, productivity, and attention to detail.
  • Solid software engineering skills, including participation in full-lifecycle development on large projects.

Other Responsibilities:

  • Maintain technology expertise, keeping current with evolving testing tools, techniques, and strategies to improve the overall testing efficiency, processes, and best practices.
  • Maintain a focus on customer-service, efficiency, quality, and growth.
  • Safeguard the company's assets.
  • Adhere to the company's compliance program.
  • Maintain comprehensive knowledge of industry standards, methodologies, processes, and best practices.

Top Skills

SQL, Python, PySpark, Java, Databricks, Azure Databricks, Delta Lake, Azure Data Factory, PyTest, SonarQube, Jira, TestRail, Git, JSON, XML, Parquet, Avro, AI Prompting Tools

The Company
HQ: Reston, Virginia
28 Employees
Year Founded: 2003

What We Do

DATAMAXIS takes pride in delivering a wide range of business IT modernization, data analytics, and technology management services. With command of the cutting-edge developments in these fields, our team and consultants are ready to provide you with a robust technology-modernization experience that results in a big boost in performance capability and operational efficiency.
