Where Is Your Firm Today? The Three-Dimensional AI Readiness Assessment

Written by Joy Youell | Dec 8, 2025 12:00:00 PM

Every AI transformation conversation starts with the same question: "Where should we begin?"

My answer is always the same: "I don't know yet. Tell me where you are today."

Most firms can't answer that question accurately.

They know they "should" be doing AI. They know competitors are moving. They know clients are asking about it. But they don't actually know where their firm stands on the three dimensions that determine whether AI transformation will succeed or fail.

They confuse aspiration with reality. They mistake a few early adopters for firm-wide readiness. They assume that because the technology is available, adoption is inevitable.

Then they launch an AI initiative from an imaginary starting point and wonder why reality doesn't cooperate.

You cannot navigate to your destination without an honest assessment of where you actually are today.

Let me show you the three dimensions you need to map.

The Three Dimensions That Matter

Most firms assess one dimension: technology adoption. "What tools do we have? Who's using them?"

That's necessary. It's also insufficient.

AI readiness exists in three distinct dimensions:

Dimension 1: Technology Adoption - What's actually happening with tools and systems
Dimension 2: People Readiness - Where your people are in terms of skills, mindset, and engagement
Dimension 3: Cultural Foundation - Whether your culture enables or prevents AI adoption

You need an accurate assessment of all three. Missing any dimension means you're navigating with partial information, which is often worse than no information at all: partial information breeds confident decisions built on a false picture.

Dimension 1: Technology Adoption Current State

This is the dimension most firms think they understand. Usually, they don't.

What AI Tools Are Currently in Use?

Not "what tools have we purchased" or "what tools are available." What tools are people actually using, and for what?

The official answer: "We have Claude, ChatGPT, and Copilot licenses for all professional staff."

The real answer: 12% of staff use AI weekly. 40% tried it once and stopped. 48% haven't touched it. Of those using it, 80% are using it for email drafting and nothing else.

See the difference? Official availability doesn't equal actual adoption.

What you need to know:

  • Which specific tools are being used by which roles?
  • What are they actually being used for? (not what you hoped, what's real)
  • How frequently? (daily, weekly, monthly, never)
  • Are people using approved tools or finding their own workarounds?
  • What's working well and what's been abandoned?
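If your AI tools offer usage logs or an admin export, most of these questions can be answered with a short script instead of a hunch. Here's a minimal sketch in Python, assuming a hypothetical CSV export (ai_usage_export.csv) with one row per interaction and user, use_case, and date columns; the file name, column names, and headcount are illustrative, not from any particular vendor:

```python
import csv
from collections import defaultdict
from datetime import date, timedelta

# Hypothetical export: one row per AI interaction, with columns
# user, use_case, date (YYYY-MM-DD). All names are assumptions.
ROSTER_SIZE = 250  # total professional staff (assumed headcount)
cutoff = date.today() - timedelta(days=7)

weekly_users = set()
use_cases = defaultdict(set)  # user -> distinct use cases tried

with open("ai_usage_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        use_cases[row["user"]].add(row["use_case"])
        if date.fromisoformat(row["date"]) >= cutoff:
            weekly_users.add(row["user"])

ever_used = len(use_cases)
print(f"Weekly active:         {len(weekly_users) / ROSTER_SIZE:.0%}")
print(f"Tried at least once:   {ever_used / ROSTER_SIZE:.0%}")
print(f"Never touched it:      {(ROSTER_SIZE - ever_used) / ROSTER_SIZE:.0%}")
single = sum(1 for cases in use_cases.values() if len(cases) == 1)
print(f"Active but single-use: {single / max(ever_used, 1):.0%}")
```

Even this crude cut separates "we have licenses" from "people actually use this," which is exactly the distinction the official answer hides.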

How Systematic Is Implementation?

Are AI tools being adopted systematically or haphazardly?

Haphazard looks like: Individual staff members experimenting independently. No shared knowledge. Everyone reinventing solutions. No documentation. No standards. Success depends entirely on individual initiative.

Systematic looks like: Defined use cases. Documented workflows. Shared prompt libraries. Training tied to specific applications. Success metrics tracked. Knowledge shared across teams.

Most firms think they're systematic when they're actually haphazard. They provided training and assumed that training alone created systematic adoption. It didn't.

What you need to know:

  • Do you have documented AI-enhanced workflows for key processes?
  • Are there standards for which tools to use for which purposes?
  • Is knowledge being captured and shared or trapped with individuals?
  • Are there feedback loops to improve based on what's learned?

What's the Adoption Rate Across Teams?

Averages lie. "30% adoption" could mean everyone uses it sometimes, or it could mean 30% use it constantly and 70% never touch it.

What you need to know:

  • Which teams/departments have high adoption and which have none?
  • Are early adopters spreading knowledge or working in isolation?
  • What patterns explain high vs. low adoption? (leadership buy-in? workflow fit? team culture?)
  • Are adoption rates increasing, plateauing, or declining over time?

Red flag: If your adoption is driven entirely by 3-5 champions and nobody else is picking it up, you don't have adoption—you have enthusiasts working in isolation.
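To see past the firm-wide average, break the same hypothetical export out by team and check how concentrated usage is. A sketch, assuming the export also carries a team column:

```python
import csv
from collections import Counter, defaultdict

# Same hypothetical export as above, now with a team column.
events_per_user = Counter()
active_by_team = defaultdict(set)

with open("ai_usage_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        events_per_user[row["user"]] += 1
        active_by_team[row["team"]].add(row["user"])

for team, users in sorted(active_by_team.items()):
    print(f"{team}: {len(users)} active users")

total = sum(events_per_user.values())
top5 = sum(count for _, count in events_per_user.most_common(5))
print(f"Top 5 users account for {top5 / total:.0%} of all usage")
# A very high share here is the red flag above:
# enthusiasts working in isolation, not adoption.
```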

Dimension 2: People Readiness

This is the dimension most firms underestimate. They assume training equals readiness. It doesn't.

Skill Level Distribution Across Staff

People exist at vastly different skill levels, even after the same training.

The five typical levels:

Level 1 - Non-users (not using at all): May not know tools exist, or tried once and gave up immediately. Completely dependent on others.

Level 2 - Basic users (occasional, single use case): Use AI for one thing (usually email or basic summaries). Don't understand how to refine prompts. Accept first outputs without iteration.

Level 3 - Competent users (regular, multiple use cases): Use AI for several applications. Understand basic prompting. Iterate to improve outputs. Limited to familiar use cases.

Level 4 - Advanced users (daily, discovering new applications): Use AI across their workflow. Sophisticated prompting. Share knowledge. Actively exploring new possibilities.

Level 5 - Champions (evangelists, teaching others): Not just using AI but helping others adopt it. Creating resources. Developing firm-specific applications.

What you need to know:

  • What percentage of your staff is at each level?
  • Are people moving up levels or staying static?
  • Where are the gaps most concerning? (if all your partners are Level 1, that's a problem)
  • Who are your Level 4s and 5s? (these are your most valuable assets for scaling adoption)

Reality check: Most firms discover they have 5-10% at Level 4-5, 15-20% at Level 3, 25-30% at Level 2, and 40-50% at Level 1. That's normal. It's also insufficient if you want transformation.
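Getting the distribution is the easy part once you ask for the data. A toy tabulation, using hypothetical self-assessments that happen to land near the reality-check numbers above:

```python
from collections import Counter

# Hypothetical self-assessed skill levels (1-5) from a survey.
levels = [1, 1, 2, 1, 3, 2, 4, 1, 2, 5, 1, 3, 2, 1, 2, 3, 1, 1, 1, 2]

dist = Counter(levels)
for level in range(1, 6):
    print(f"Level {level}: {dist[level] / len(levels):.0%}")
```

The hard part isn't the arithmetic. It's getting honest self-assessments and re-running the survey often enough to see whether people are moving up levels.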

Resistance vs. Enthusiasm Patterns

Not everyone who isn't using AI is resistant. And not everyone using it is enthusiastic.

The four groups you actually have:

Enthusiastic adopters: Excited about AI. Experimenting actively. Sharing discoveries. These are your champions.

Pragmatic adopters: Not excited, but willing to use tools that clearly save time. Show them ROI and they'll engage.

Skeptical but persuadable: Doubtful about value but not actively resistant. Need proof points from trusted colleagues, not from leadership.

Active resisters: Believe AI threatens their value, professional identity, or job security. Or they've tried it and decided it doesn't work.

What you need to know:

  • What percentage falls into each group?
  • Are skeptics becoming pragmatists, or are pragmatists becoming skeptics?
  • What are the common themes in resistance? (fear, competence anxiety, values conflicts, or justified concerns?)
  • Who influences whom? (if your most respected senior person is a resister, that's contagious)

Critical insight: The middle two groups—pragmatic and skeptical—determine your success. They're persuadable, but only if your approach addresses their actual concerns.

Learning Velocity and Engagement

Are people learning faster over time or hitting a plateau?

High learning velocity looks like:

  • People asking increasingly sophisticated questions
  • Use cases expanding beyond initial training
  • Staff sharing discoveries with colleagues
  • Requests for advanced training or specialized applications
  • Early adopters helping train others

Stalled learning looks like:

  • Same basic questions six months after initial training
  • No expansion beyond the one use case covered in training
  • No peer-to-peer knowledge sharing
  • Declining attendance at optional learning sessions
  • Champions getting frustrated with lack of broader engagement

What you need to know:

  • Are people asking better questions over time?
  • Is the range of use cases expanding or static?
  • Are learning resources being used or ignored?
  • Is momentum building or fading?

Dimension 3: Cultural Foundation

This is the dimension almost nobody assesses honestly. It's also the dimension that determines everything else.

Do You Reward Experimentation?

Not "do you say you value innovation?" but "what actually gets rewarded?"

Assessment questions:

  • When someone experiments with AI and it doesn't work, what happens to them? (nothing? criticism? celebration of the learning?)
  • Who got promoted last year? People who experimented or people who executed flawlessly using traditional methods?
  • Are people who share failed experiments seen as transparent or incompetent?
  • Is time spent learning AI counted in utilization/productivity metrics or treated as "unbillable overhead"?

The honest answer reveals your real culture, not your aspirational one.

How Would You Handle AI-Related Trial and Error?

Not "how do you handle it" (you probably haven't faced it yet) but "how would you handle it?"

Scenario: A manager spends six hours learning to use AI for a task that normally takes three hours. The output needs significant refinement. They learn valuable lessons but look less productive than colleagues using traditional methods.

How does your firm respond?

Option A: "Why did you waste six hours? Just do it the normal way."
Result: Nobody experiments again.

Option B: "That's fine as a one-time learning experience, but you need to be more efficient going forward."
Result: People learn the absolute minimum and stop experimenting.

Option C: "What did you learn? Will those six hours make you more efficient on the next 20 similar tasks? Can you share what you discovered with the team?"
Result: Learning compounds. Knowledge spreads.

Which option describes your firm's likely response?

Is Curiosity Embedded or Aspirational?

Embedded curiosity shows up in systems and behaviors. Aspirational curiosity shows up in statements and values.

Embedded curiosity:

  • Protected time for learning (actually in schedules, not theoretical)
  • Questions celebrated in meetings, not just answers
  • "I don't know" treated as honest, not incompetent
  • Failed experiments discussed openly without career impact
  • Performance reviews that reward learning, not just execution

Aspirational curiosity:

  • "Innovation" listed as a firm value
  • Leadership says "we encourage experimentation"
  • Annual innovation awards with no supporting infrastructure
  • Learning happens "when you have time" (i.e., never)
  • Performance reviews measure only traditional productivity

Be honest: which describes your firm?

The Patterns That Predict Success or Failure

Once you've assessed all three dimensions honestly, patterns emerge:

Pattern 1: High tech adoption, low people readiness, weak culture

You have tools but nobody's using them well. Champions are frustrated. This predicts stalled transformation.

Pattern 2: Low tech adoption, high people readiness, strong culture

People are eager but lack tools or permission. This predicts rapid acceleration once you provide resources.

Pattern 3: Uneven adoption across dimensions

Some departments are thriving while others are failing, or one dimension is far ahead of the others. This predicts fragmented transformation that never scales.

Pattern 4: Strong in all dimensions

You're rare. You're ready for accelerated transformation.

Pattern 5: Weak in all dimensions

You're further behind than you thought. You need foundational work before AI transformation makes sense.

Most firms are Pattern 1 or 3. They invested in technology but underinvested in people and culture. The assessment reveals that the real work is ahead of them, not behind them.
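If you score each dimension, the pattern matching can be made explicit. A toy sketch, assuming a 1-5 score per dimension with 3 as the "strong" cutoff; both the scale and the cutoff are illustrative choices, not part of any standard instrument:

```python
def readiness_pattern(tech: float, people: float, culture: float,
                      strong: float = 3.0) -> str:
    """Map three dimension scores (1-5, illustrative) to the patterns above."""
    t, p, c = (score >= strong for score in (tech, people, culture))
    if t and p and c:
        return "Pattern 4: strong in all dimensions"
    if not (t or p or c):
        return "Pattern 5: weak in all dimensions"
    if t and not p and not c:
        return "Pattern 1: tools without people or culture"
    if not t and p and c:
        return "Pattern 2: ready people awaiting tools"
    return "Pattern 3: uneven across dimensions"

print(readiness_pattern(tech=4.2, people=2.1, culture=1.8))  # Pattern 1
```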

What To Do With Your Assessment

An honest assessment does three things:

First, it destroys comfortable illusions. "We're doing pretty well with AI" becomes "We have tools that 12% of people actually use."

Second, it reveals the real starting point. You can't navigate from where you wish you were. You can only navigate from where you actually are.

Third, it focuses your investments. If the gap is technology, buy tools. If the gap is people, invest in learning infrastructure. If the gap is culture, work on psychological safety and reward systems.

Most firms invest in technology when the real gap is people or culture. The assessment prevents that expensive mistake.

Run Your Own Three-Dimensional Assessment

You need honest data, not hopeful estimates.

For Technology Adoption: Anonymous survey + usage analytics from actual tools. Don't trust what people say they do. Look at what they actually do.

For People Readiness: Skills assessment + interviews with different levels. Talk to enthusiasts and resisters. Ask what would help, not what's working.

For Cultural Foundation: Leadership interviews + staff focus groups asking about real experiences with experimentation, failure, and learning. What actually happened last time someone tried something new?
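If your survey items use a 1-5 scale, rolling responses up into the three dimension scores takes only a few lines, and those scores can feed a pattern check like the sketch earlier. Hypothetical responses, with the same illustrative cutoff:

```python
from statistics import mean

# Hypothetical 1-5 survey responses, one dict per respondent,
# keyed by the dimension each question block probes.
responses = [
    {"tech": 4, "people": 2, "culture": 2},
    {"tech": 3, "people": 2, "culture": 1},
    {"tech": 5, "people": 3, "culture": 2},
    {"tech": 4, "people": 1, "culture": 2},
]

for dim in ("tech", "people", "culture"):
    score = mean(r[dim] for r in responses)
    label = "strong" if score >= 3 else "weak"  # illustrative cutoff
    print(f"{dim}: {score:.1f} ({label})")
```

In this made-up data the firm scores strong on technology and weak on people and culture: Pattern 1, the most common result.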

The assessment will probably reveal you're not where you thought you were. That's uncomfortable. It's also necessary.

Because you can't get where you're going if you don't know where you are.

Ready to assess where your firm actually stands? Winsome's consulting practice conducts comprehensive three-dimensional AI readiness assessments that reveal the truth about your technology adoption, people readiness, and cultural foundation. We don't just measure—we help you understand what the assessment means and what to do next. Let's find out where you really are today.