Proactively Manage Your AI Adoption With This DevEx Survey

If you want to build a bottom-up culture around AI that will stick, this survey is a key first step.

Written by Bushra Anjum
Published on Sep. 08, 2025
Reviewed by Seth Wilson | Sep 05, 2025
Summary: Startups often adopt AI tools in software development through bottom-up exploration, creating security and quality challenges. A proactive DevEx survey approach — tracking tool use, impact, concerns and trust — helps leaders identify team profiles and craft targeted AI strategies.

Many startups and small companies aren’t adopting AI tools in software development through a top-down strategic plan. Instead, the implementation is mostly organic, developer-led exploration, with individual contributors using tools to boost their personal productivity. Although this bottom-up innovation is powerful, it can place leadership in a difficult situation in which it reacts to new security risks, code quality questions and IP concerns instead of getting ahead of them.

Engineering leaders often use developer experience (DevEx) surveys to move from a reactive stance to a proactive one. To manage AI adoption effectively, we need to apply the same principle. Simply adding a question that asks people to list which AI tools they use isn’t enough, however. To truly understand the landscape, your goal is to diagnose your software development team’s specific “AI personality.” Are they bold, risk-taking enthusiasts? Cautious skeptics? Or a fragmented group of occasional explorers?

Based on industry research, conversations with engineering leaders and analysis of other DevEx surveys, we propose four questions to add to your regular DevEx survey. They’re designed to provide the data you need to identify your team’s profile so you can craft a targeted AI strategy that aligns with their actual needs, work habits and perspectives.

4 Questions to Ask on a DevEx Survey About AI Usage

  1. Which of the following AI-powered tools have you used for your work-related tasks in the last three months? Please check all that apply and indicate your frequency of use.
  2. When using AI tools for software development, how do you perceive their impact on the following aspects of your work?
  3. What are your primary concerns or blockers when using AI tools for software development? Please select up to three that are most significant to you. Additionally, how likely would you be to delegate the following tasks to an AI assistant you trust?
  4. Is there anything else you would like to share about your experience with AI in software development? This could be a specific tool you love, a particular challenge you have faced or an idea for how we could use AI to our advantage.


 

The Diagnostic Toolkit: 4 Questions to Ask Now

These questions are specifically designed to address the challenges and opportunities of a small- to medium-sized startup where individual contributors are organically exploring AI tools. They’re engineered to provide a snapshot of the current state, perceived impact, primary concerns and desired support related to AI in your software development process. For each question, we have also included a quick note on why we’re asking it and how the answers will help us take meaningful action.

 

1. Which Tools Do You Use and How Often?

The Question

Which of the following AI-powered tools have you used for your work-related tasks in the last three months? Please check all that apply and indicate your frequency of use.

Image: Screenshot of the survey question by the author.

Objective

We want to map current tool usage and frequency. The goal is to directly address the unstructured, exploratory environment where engineers try out new tools without any official guidance or guardrails. The results will reveal the de facto standard tools within the team, and the frequency of use will show the existing depth of integration, giving leadership clear signals about where to focus policy and investment. The “Other” field is a valuable discovery mechanism for emerging or niche tools that may be gaining traction. This foundational data is essential for any subsequent decisions about standardization or licensing.
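To make that actionable, here is a minimal Python sketch of how the exported responses to this question might be tallied. The tool names, frequency labels and data layout are illustrative assumptions, not the survey’s actual answer options.

```python
from collections import Counter, defaultdict

# Hypothetical export: one record per respondent, mapping each selected tool
# to the frequency label that respondent chose.
responses = [
    {"GitHub Copilot": "Daily", "ChatGPT": "Weekly"},
    {"ChatGPT": "Daily", "Other: Cursor": "Monthly"},
    {"GitHub Copilot": "Daily", "ChatGPT": "Daily"},
]

usage_counts = Counter()                    # how many respondents use each tool at all
frequency_breakdown = defaultdict(Counter)  # per-tool distribution of frequency labels

for record in responses:
    for tool, freq in record.items():
        usage_counts[tool] += 1
        frequency_breakdown[tool][freq] += 1

# The most-used tools are the team's de facto standards; the frequency breakdown
# hints at depth of integration (daily reliance vs. occasional experiments).
for tool, count in usage_counts.most_common():
    print(f"{tool}: {count} users, {dict(frequency_breakdown[tool])}")
```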

 

2. How Do AI Tools Help Your Work?

The Question

When using AI tools for software development, how do you perceive their impact on the following aspects of your work?

Image: Screenshot of the survey question by the author.

Objective

This question aims to quantify the tools’ perceived impact on key tasks. It deconstructs the generic term “productivity” to reveal the specific trade-offs of AI adoption (e.g., are we seeing a significantly positive impact on speed but a slightly negative impact on quality and time spent on debugging?). The results allow leadership to move beyond a simple “AI is good/bad” debate and into a sophisticated discussion about where and how to apply AI for maximum benefit and minimum harm.
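As an illustration, here is a minimal Python sketch of averaging the Likert responses per aspect, assuming the scale is coded from -2 (significantly negative) to +2 (significantly positive). The aspect names and sample values are placeholders, not the survey’s actual options.

```python
from statistics import mean

# Hypothetical export: one record per respondent, impact coded from -2 to +2.
responses = [
    {"Speed of delivery": 2, "Code quality": -1, "Time spent debugging": -1, "Learning new skills": 1},
    {"Speed of delivery": 1, "Code quality": 0, "Time spent debugging": -2, "Learning new skills": 2},
    {"Speed of delivery": 2, "Code quality": 1, "Time spent debugging": 0, "Learning new skills": 1},
]

# A positive average suggests a perceived benefit; a negative one, a perceived cost.
for aspect in responses[0]:
    avg = mean(record[aspect] for record in responses)
    print(f"{aspect}: {avg:+.2f}")
```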


 

3. Concerns and Problems With AI Tools

The Question

What are your primary concerns or blockers when using AI tools for software development? Please select up to three that are most significant to you. Additionally, how likely would you be to delegate the following tasks to an AI assistant you trust?

Image: Screenshots of the two survey questions by the author.

Objective

These two questions measure the team’s level of trust in AI, whether a deficit or an abundance. The first part identifies and prioritizes developer concerns and blockers, and asking people to pick only their top three creates a forced-rank priority list for leadership. The second part assesses the level of trust in AI for more critical engineering tasks, and the responses can help leadership decide which AI usage policies to implement and how strictly to enforce them. The top concerns, combined with willingness to delegate, should inform the company’s long-term acceptable use policy and AI strategy.
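Here is a minimal Python sketch of how both parts might be summarized: a forced-rank list of concerns and a mean delegation likelihood per task. The concern labels, task names and 1-to-5 scale are assumptions for illustration only.

```python
from collections import Counter
from statistics import mean

# Hypothetical export: each respondent picked up to three concerns...
concern_responses = [
    ["Code quality and correctness", "IP and licensing", "Security of our code"],
    ["Security of our code", "Code quality and correctness"],
    ["Hallucinated APIs", "Security of our code", "Job security"],
]

# ...and rated delegation likelihood from 1 (very unlikely) to 5 (very likely).
delegation_responses = [
    {"Writing unit tests": 4, "Refactoring a core module": 2, "Writing production code": 3},
    {"Writing unit tests": 5, "Refactoring a core module": 1, "Writing production code": 2},
    {"Writing unit tests": 4, "Refactoring a core module": 3, "Writing production code": 3},
]

print("Top concerns (forced rank):")
for concern, votes in Counter(c for picks in concern_responses for c in picks).most_common():
    print(f"  {concern}: {votes}")

print("Willingness to delegate (mean, 1-5):")
for task in delegation_responses[0]:
    print(f"  {task}: {mean(r[task] for r in delegation_responses):.1f}")
```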

 

4. Open-Ended Response

The Question

Is there anything else you would like to share about your experience with AI in software development? This could be a specific tool you love, a particular challenge you have faced or an idea for how we could use AI to our advantage.

Graphic showing text input box
Image: Screenshot by the author.

Objective

This question captures the unknown unknowns and any related qualitative context. Qualitative data provides the “why” behind the quantitative “what.” It also gives your developers a voice, demonstrating that you value their individual perspectives and contributing to a culture of psychological safety. Used sparingly, open-ended questions produce rich context, surface unexpected issues and let developers share nuanced feedback that the structured questions cannot capture.
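If the volume of free-text answers is large, a rough first pass can be automated before reading everything in full. Here is a minimal sketch of keyword-based theme tagging; the theme names and keywords are illustrative assumptions, and the final qualitative coding should still be done by a person.

```python
from collections import Counter

# Illustrative themes and keywords; adjust to whatever actually shows up in your answers.
themes = {
    "tooling": ["copilot", "cursor", "chatgpt"],
    "quality": ["bug", "quality", "review", "hallucinat"],
    "security": ["security", "secret", "leak", "ip"],
    "process": ["policy", "guideline", "training", "license"],
}

free_text_answers = [
    "Copilot is great for tests, but I worry about leaking secrets in prompts.",
    "We need clear guidelines before I use ChatGPT on production code.",
]

theme_counts = Counter()
for answer in free_text_answers:
    lowered = answer.lower()
    for theme, keywords in themes.items():
        if any(keyword in lowered for keyword in keywords):
            theme_counts[theme] += 1

print(theme_counts.most_common())
```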

Once you’ve collected the responses, it’s time to connect the dots. The answers to these four questions (what tools are used, their perceived impact, the developers’ primary concerns and their willingness to delegate), when viewed together, will paint a clear picture and likely point toward one of three common team profiles. The profiles are listed in the next section, each requiring a different strategic approach for AI enablement.

 

From Data to Diagnosis: Identifying the Team’s AI Profile

Your survey data will likely point toward one of the three common profiles. Here is how to interpret each one and take action.

Profile 1: The Exposed Enthusiasts

For this profile, the survey response will likely show high usage of various tools, high perceived impact on speed, but also high concerns about code quality and security. The team is moving quickly, but potentially racking up security and technical debt along the way.

Immediate Actions

Establish clear guidelines immediately. Focus on best practices for prompting, code validation and security. Mandate peer reviews for all significant AI-generated code.

Long-Term Strategy

Invest in tools that provide enterprise-grade security and code analysis. Develop formal training programs focused not just on using the tools, but on using them critically and safely.

Profile 2: The Cautious Skeptics

For this profile, the survey responses will likely show low tool usage, low trust scores and significant concerns about job security and the reliability of AI suggestions. The team is hesitant and held back by concerns about reliability and impact.

Immediate Actions

Start with education. Run workshops that showcase safe and effective use cases. Create a psychologically safe environment for experimentation. Launch a pilot program with a small group of willing volunteers.

Long-Term Strategy

Focus on building trust. Share success stories from the pilot program. Use the data to show, not just tell, how AI can serve as a powerful assistant rather than a replacement. Provide sanctioned, secure tools to alleviate security concerns.

Profile 3: The Occasional Explorers

For this profile, the survey response will likely show pockets of high usage and enthusiasm mixed with areas of complete indifference. There are no common tools or practices; everyone is doing their own thing.

Immediate Actions

Use the survey data to identify the tools that are providing the most value and standardize around them. Consolidate your efforts to provide licenses and training for a limited set of vetted tools.

Long-Term Strategy

Build a community of practice. Empower your AI enthusiasts to become internal champions who can share their knowledge and best practices with the rest of the team.
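To show how the four signals might be combined, here is a minimal heuristic sketch that maps aggregated survey results onto the three profiles described above. The metric names and threshold values are assumptions; tune them to your own scales and team size rather than treating them as fixed rules.

```python
from dataclasses import dataclass

@dataclass
class SurveySignals:
    adoption_rate: float      # share of respondents using any AI tool weekly or more (0-1)
    mean_speed_impact: float  # average impact on speed from question 2 (-2 to +2)
    concern_rate: float       # share citing quality or security concerns in question 3 (0-1)
    mean_delegation: float    # average willingness to delegate from question 3 (1-5)

def diagnose(s: SurveySignals) -> str:
    # High usage, clear speed gains, but widespread quality and security worries.
    if s.adoption_rate >= 0.6 and s.mean_speed_impact > 0.5 and s.concern_rate >= 0.5:
        return "Exposed Enthusiasts"
    # Low usage and low willingness to hand over real work.
    if s.adoption_rate <= 0.3 and s.mean_delegation < 2.5:
        return "Cautious Skeptics"
    # Everything else: pockets of enthusiasm with no shared practice.
    return "Occasional Explorers"

print(diagnose(SurveySignals(0.75, 1.4, 0.6, 3.2)))  # -> Exposed Enthusiasts
```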


 

DevEx Is Just the Beginning

Understanding your software development team’s evolving relationship with AI is not a one-time task. By embedding a thoughtful, structured approach into your existing DevEx surveys, you’re not just gathering data; you’re building the foundation for an AI strategy that is grounded in real team dynamics. Whether your team is racing ahead, cautiously holding back or scattered in exploration, the insights from the diagnostic approach detailed in this article will allow you to lead innovation with responsibility and empower your developers while safeguarding your company’s long-term interests.
