We are looking for a Data Quality Analyst who will help us improve and optimize our data annotation processes. In this role, you will work closely with lidar-based point cloud data, develop quality metrics, and ensure consistent dataset quality across multiple annotation workflows.
You will be deeply involved in analyzing annotation outputs, identifying opportunities for process improvements, and implementing data-driven solutions. A key part of the role includes monitoring data quality, building validation mechanisms, and supporting the team with insights that help improve both the annotation pipeline and the final dataset.
This position is ideal for someone who is detail-oriented, comfortable working with complex datasets, and motivated to enhance data quality and annotation efficiency through analytical and technical approaches.
What You'll Do
Quality Analytics
- Design and automate metrics to evaluate annotation quality;
- Monitor these metrics on a regular basis and investigate anomalies;
- Analyze dataset balance and completeness across locations, scenarios, and labels (see the sketch after this list);
- Assess how new scenes contribute not just more data, but better data.
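As a purely illustrative sketch of the kind of balance check referred to above (the DataFrame schema, column names, and values below are hypothetical, not a description of our actual data model), a few lines of Pandas can surface label imbalance across locations:

```python
import pandas as pd

# Hypothetical annotation records; in practice these would be pulled from the
# analytics stack rather than constructed inline.
records = pd.DataFrame({
    "location": ["austin", "austin", "dallas", "dallas", "dallas"],
    "scenario": ["night", "rain", "night", "day", "day"],
    "label":    ["pedestrian", "cyclist", "pedestrian", "car", "car"],
})

# Count annotations per location/label pair, then normalise each row so the
# shares are comparable across locations of different sizes.
counts = records.groupby(["location", "label"]).size().unstack(fill_value=0)
shares = counts.div(counts.sum(axis=1), axis=0)
print(shares.round(2))  # labels missing or overrepresented in a location stand out
```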
Annotation & Processes
- Work closely with lidar-based 3D annotation outputs: review samples, spot issues, and provide structured feedback;
- Contribute to the design of 3D annotation pipelines;
- Perform recurring checks on edge cases and tricky scenarios to ensure consistency;
- Document edge cases and best practices for working with 3D data so that others can follow them.
Operational Routine & Controls
- Regularly validate annotation batches (including repetitive checks) to make sure they meet quality and coverage expectations;
- Maintain simple control dashboards and reports, update them on a weekly basis, and follow up on issues;
- Take care of many small but important operational tasks that keep the annotation process stable and predictable.
Support & Automation
- Use Python and ClickHouse for analysis, monitoring, and process support (a brief illustrative sketch follows this list);
- Work closely with product and engineering teams to implement improvements;
- Where possible, build scripts or LLM-based helpers to automate repetitive tasks — but be ready to handle the parts that still require manual attention.
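To make this concrete, here is a minimal sketch of a weekly monitoring script, assuming the clickhouse-connect client and a hypothetical annotation_reviews table (the table, columns, connection details, and 5% threshold are all assumptions for illustration, not our actual pipeline):

```python
import clickhouse_connect  # pip install clickhouse-connect

# Placeholder connection; real credentials would come from configuration.
client = clickhouse_connect.get_client(host="localhost", username="default")

# Hypothetical per-batch review outcomes from the last 7 days.
query = """
    SELECT
        batch_id,
        countIf(review_status = 'rejected') / count() AS rejection_rate
    FROM annotation_reviews
    WHERE review_date >= today() - 7
    GROUP BY batch_id
"""
weekly = client.query_df(query)  # returns a Pandas DataFrame

# Flag batches whose rejection rate drifts above an assumed 5% threshold
# so they can be routed for manual re-review.
flagged = weekly[weekly["rejection_rate"] > 0.05]
for _, row in flagged.iterrows():
    print(f"Batch {row['batch_id']}: rejection rate {row['rejection_rate']:.1%}")
```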
What We Expect
- A degree in a relevant field (Computer Science, Data Analytics, Engineering, or another technical discipline);
- Strong analytical thinking and attention to detail — you’ll frequently review data, spot anomalies, and work through repetitive validation tasks;
- Practical Python skills for data processing and analysis (Pandas);
- Ability to query databases and work with analytical data stacks;
- Readiness to handle routine and sometimes monotonous work — dataset checks, manual validations, and weekly quality reviews are a core part of the role;
- A process-oriented mindset: ability to follow existing workflows, maintain consistency, and keep documentation and reports up to date.
Nice to Have
- Experience with ClickHouse;
- Strong systems thinking, attention to quality, and a proactive mindset;
- Experience using semi-automated labeling tools or active learning methods.
Candidates are required to be authorized to work in the U.S. The employer is not offering relocation sponsorship, and remote work options are not available.
What We Do
Avride is a leading developer in the autonomous vehicle and delivery robot industry.
Our dynamic team, composed of a few hundred engineers, develops and operates autonomous cars and delivery robots across the globe, shaping the future of mobility and logistics.
At Avride, we are committed to making the roads safer and more accessible for everyone. At the core of our philosophy is the belief in the transformative power of technology. Every product we develop, every test we conduct, and every service we launch is anchored in our vision of creating a safer and more sustainable world with the help of cutting-edge technologies and breakthrough solutions.
