Understanding your PVX Score

Last updated 26 days ago

What is PVX?

PVX (Procurement Value Experience) is your procurement team's overall performance score, measured on a scale of 0 to 10.

Think of it like a Net Promoter Score but specifically designed for procurement team effectiveness.


The Scale

| Score Range | Rating | Meaning |
| --- | --- | --- |
| 9.0 - 10.0 | 🌟 Exceptional | World-class procurement experience |
| 7.5 - 8.9 | ✅ Strong | High-performing team, minor improvements needed |
| 6.0 - 7.4 | 📊 Good | Solid performance, clear improvement opportunities |
| 4.5 - 5.9 | ⚠️ Needs Improvement | Significant gaps, action required |
| 0.0 - 4.4 | 🚨 Critical | Major issues, immediate attention needed |

πŸ’‘ Typical scores: Most procurement teams score between 6.0 - 7.5 on their first survey.


How PVX is Calculated

Not Just an Average

PVX uses quality-weighted scoring, meaning responses with detailed feedback count more than one-word answers.

Why Quality Weighting?

Problem: Simple averages treat all responses equally.

  • Quick "5/10" with no explanation = same weight as detailed feedback

  • Encourages gaming the system with minimal effort

Solution: Weight by response quality.

  • Detailed responses (50+ words) get full weight (1.0x)

  • Brief responses (10-49 words) get reduced weight (0.9x)

  • Minimal responses (< 10 words) get lowest weight (0.8x)

Result: Your score reflects thoughtful, engaged feedback more accurately.
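As a rough sketch of how the weighting works, assuming the overall score is a weight-normalised mean of individual ratings (the exact production formula may differ), the scheme above could be expressed as:

```python
def quality_weight(word_count: int) -> float:
    """Weights as described above: detailed 1.0x, brief 0.9x, minimal 0.8x."""
    if word_count >= 50:
        return 1.0
    if word_count >= 10:
        return 0.9
    return 0.8

def weighted_pvx(responses):
    """responses: list of (rating 0-10, feedback word count) tuples.

    Illustrative only: a weight-normalised mean, so detailed
    responses pull the score more than one-word answers.
    """
    total = sum(quality_weight(words) * rating for rating, words in responses)
    weights = sum(quality_weight(words) for _, words in responses)
    return round(total / weights, 1)

# A terse "5" counts slightly less than a detailed "8"
print(weighted_pvx([(5, 5), (8, 60)]))
```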


The Four Dimensions

Your PVX score is built from four key dimensions:

Q1: Communication & Responsiveness

"How well does procurement communicate and respond to your needs?"

Measures:

  • Response time to requests

  • Clarity of communication

  • Accessibility of team

  • Proactive updates

Common feedback themes:

  • βœ… "Always quick to respond"

  • ❌ "Hard to reach when urgent"


Q2: Process & Efficiency

"How would you rate procurement processes and efficiency?"

Measures:

  • Process simplicity

  • Speed of approvals

  • System usability

  • Administrative burden

Common feedback themes:

  • βœ… "Streamlined and easy"

  • ❌ "Too many approval steps"


Q3: Value & Strategic Impact

"How effectively does procurement deliver value beyond cost savings?"

Measures:

  • Strategic thinking

  • Innovation enablement

  • Risk management

  • Supplier quality

Common feedback themes:

  • βœ… "Real business partner"

  • ❌ "Only focus on price"


Q4: Relationship & Trust

"How would you describe your relationship with the procurement team?"

Measures:

  • Trust and credibility

  • Collaboration quality

  • Business understanding

  • Partnership approach

Common feedback themes:

  • βœ… "True collaboration"

  • ❌ "Us vs. them mentality"


Reading Your Dashboard

Overall PVX Score

The big number at the top of your dashboard.

Example:

PVX Score: 7.2/10 πŸ“ˆ +0.4 vs. last quarter

This is your headline number: share it with leadership and track it over time.


Dimensional Breakdown

See performance across all four dimensions:

| Dimension | Score | Trend |
| --- | --- | --- |
| Communication | 7.8 | ↑ +0.3 |
| Process | 6.9 | → 0.0 |
| Value | 7.0 | ↑ +0.5 |
| Relationship | 7.5 | ↓ -0.2 |

What to look for:

  • βœ… Strengths: Scores > 7.5

  • ⚠️ Opportunities: Scores < 6.5

  • πŸ“Š Trends: Changes vs. previous campaign


Response Distribution

See how respondents rated you:

Promoters (9-10):  ████████░░ 40%
Passives (7-8):    ████████████ 45%
Detractors (0-6):  ███░░░░░░░ 15%

NPS-style interpretation:

  • Promoters: Your advocates

  • Passives: Satisfied but not enthusiastic

  • Detractors: At-risk relationships

Sentiment Score = Promoters% - Detractors% = +25
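The sentiment calculation follows the standard NPS formula: the percentage of promoters minus the percentage of detractors (passives count toward the total but cancel out). A quick sketch:

```python
def sentiment_score(ratings):
    """NPS-style sentiment: % promoters (9-10) minus % detractors (0-6)."""
    n = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / n)

# 40% promoters, 45% passives, 15% detractors -> +25, as in the example above
ratings = [9] * 8 + [7] * 9 + [5] * 3
print(sentiment_score(ratings))  # → 25
```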


Interpreting AI Insights

After 5+ responses, AI analyzes feedback and generates insights in four categories:

🚨 Urgent Actions

Critical issues requiring immediate attention

Example:

"Communication delays flagged by 4 respondents. Average response time perceived as 3+ days. Risk: Project delays and stakeholder frustration."

When to act: Within 1-2 weeks


⚑ Quick Wins

Easy improvements with high impact

Example:

"3 respondents request procurement portal tutorial. Low effort fix with significant satisfaction improvement."

When to act: Within 1 month


πŸ’ͺ Strengths

What's working well - keep doing it!

Example:

"Relationship building praised by 5 respondents. Continue quarterly business reviews and proactive check-ins."

When to act: Reinforce these behaviors


πŸ“š Training Needs

Skill gaps to address

Example:

"Category expertise gap noted in IT procurement. Consider upskilling team or hiring specialist."

When to act: Plan for next quarter


Benchmarking

Internal Benchmarks

Compare your scores:

  • πŸ“Š Stakeholder vs. Supplier: Are internal/external scores aligned?

  • πŸ“ˆ Trend Over Time: Improving or declining?

  • 🎯 Dimension Variance: Are all areas balanced?

Industry Benchmarks (Coming Soon)

We're collecting anonymized data to provide:

  • Industry averages by sector

  • Percentile rankings

  • Best-practice comparisons

Note: Benchmarking requires 100+ organizations to ensure anonymity.


What's a "Good" Score?

It Depends on Context

First-time scores:

  • 6.0+ = Good starting point

  • 7.0+ = Strong baseline

  • 8.0+ = Exceptional (rare on first survey)

After improvement efforts:

  • +0.5 improvement = Meaningful progress

  • +1.0 improvement = Significant transformation

  • +2.0 improvement = Exceptional turnaround

Don't Obsess Over the Number

Remember:

  1. Trends matter more than absolutes

  2. Insights > Score - the "why" is more valuable than the number

  3. Action matters most - a 6.5 with action beats a 7.2 with inaction


Common Questions

"Our score dropped from last quarter. What happened?"

Possible reasons:

  • New challenges or changes

  • Higher expectations (paradoxically, a sign of trust)

  • Different respondent mix

  • Honest feedback emerging (initial scores often artificially high)

What to do:

  1. Read the AI insights for specific issues

  2. Review dimensional breakdown for problem areas

  3. Compare feedback themes to previous campaign

  4. Act on the insights


"We got an 8.5 - is that sustainable?"

Yes and no.

High scores are great but:

  • ⚠️ Can breed complacency

  • ⚠️ May mask emerging issues

  • ⚠️ Hard to improve further (ceiling effect)

Recommendations:

  • Keep doing what works

  • Watch for early warning signs

  • Maintain quarterly measurement

  • Don't rest on laurels


"Can we share this score publicly?"

Yes! PVX scores are designed to be shareable:

βœ… Good places to share:

  • Executive leadership meetings

  • All-hands presentations

  • Annual reports

  • Team dashboards

❌ Don't share:

  • Individual feedback quotes (breaks anonymity)

  • Dimensional scores without context

  • Comparisons between teams (can create unnecessary competition)


Taking Action on Your Score

1. Celebrate Strengths

Share positive feedback with your team. Recognition drives engagement.

2. Prioritize Based on Impact

Focus on:

  • 🚨 Urgent actions first

  • ⚑ Quick wins for momentum

  • πŸ“ˆ Long-term capability building

3. Communicate Plans

Let stakeholders know:

  • You heard their feedback

  • What you're doing about it

  • When they'll see changes

4. Measure Progress

Run campaigns quarterly to track improvement.