Understanding your PVX Score
Last updated 26 days ago
What is PVX?
PVX (Procurement Value Experience) is your procurement team's overall performance score, measured on a scale of 0 to 10.
Think of it as a Net Promoter Score (NPS) designed specifically for procurement team effectiveness.
The Scale
💡 Typical scores: Most procurement teams score between 6.0 and 7.5 on their first survey.
How PVX is Calculated
Not Just an Average
PVX uses quality-weighted scoring, meaning responses with detailed feedback count more than one-word answers.
Why Quality Weighting?
Problem: Simple averages treat all responses equally.
A quick "5/10" with no explanation carries the same weight as detailed feedback
This rewards minimal effort and invites gaming the system
Solution: Weight by response quality.
Detailed responses (50+ words) get full weight (1.0x)
Brief responses (10-49 words) get reduced weight (0.9x)
Minimal responses (< 10 words) get lowest weight (0.8x)
Result: Your score reflects thoughtful, engaged feedback more accurately.
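As a concrete illustration, the weighting tiers above can be sketched in Python (function names here are illustrative, not part of the product):

```python
def response_weight(word_count: int) -> float:
    """Quality weight for a response, per the tiers above."""
    if word_count >= 50:
        return 1.0   # detailed response: full weight
    if word_count >= 10:
        return 0.9   # brief response: reduced weight
    return 0.8       # minimal response: lowest weight


def quality_weighted_score(responses: list[tuple[float, str]]) -> float:
    """Weighted average of (rating, comment) pairs, rounded to one decimal."""
    weights = [response_weight(len(comment.split())) for _, comment in responses]
    total = sum(w * rating for w, (rating, _) in zip(weights, responses))
    return round(total / sum(weights), 1)


# A terse "5" now counts for slightly less than a detailed 8/10 review:
responses = [
    (5, "fine"),                   # minimal -> 0.8x weight
    (8, " ".join(["word"] * 60)),  # detailed -> 1.0x weight
]
print(quality_weighted_score(responses))  # 6.7 (a simple average would give 6.5)
```

Note how the minimal response still counts, just slightly less, so the score rewards engaged feedback without discarding anyone's rating.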
The Four Dimensions
Your PVX score is built from four key dimensions:
Q1: Communication & Responsiveness
"How well does procurement communicate and respond to your needs?"
Measures:
Response time to requests
Clarity of communication
Accessibility of team
Proactive updates
Common feedback themes:
✅ "Always quick to respond"
❌ "Hard to reach when urgent"
Q2: Process & Efficiency
"How would you rate procurement processes and efficiency?"
Measures:
Process simplicity
Speed of approvals
System usability
Administrative burden
Common feedback themes:
✅ "Streamlined and easy"
❌ "Too many approval steps"
Q3: Value & Strategic Impact
"How effectively does procurement deliver value beyond cost savings?"
Measures:
Strategic thinking
Innovation enablement
Risk management
Supplier quality
Common feedback themes:
✅ "Real business partner"
❌ "Only focus on price"
Q4: Relationship & Trust
"How would you describe your relationship with the procurement team?"
Measures:
Trust and credibility
Collaboration quality
Business understanding
Partnership approach
Common feedback themes:
✅ "True collaboration"
❌ "Us vs. them mentality"
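The article doesn't state how the four dimension scores roll up into the overall PVX. As a hedged sketch only, assuming a simple equal-weight mean (an assumption, not the documented formula):

```python
# Assumption: the overall PVX is the plain mean of the four dimension
# scores. The actual platform may weight dimensions differently.
DIMENSIONS = ("communication", "process", "value", "relationship")


def overall_pvx(dimension_scores: dict[str, float]) -> float:
    """Equal-weight rollup of the four dimension scores (hypothetical)."""
    missing = [d for d in DIMENSIONS if d not in dimension_scores]
    if missing:
        raise ValueError(f"missing dimension scores: {missing}")
    return round(sum(dimension_scores[d] for d in DIMENSIONS) / len(DIMENSIONS), 1)


# Hypothetical dimension scores that happen to average to 7.2:
print(overall_pvx({
    "communication": 7.8,
    "process": 6.4,
    "value": 7.0,
    "relationship": 7.6,
}))  # 7.2
```

Even under this simplified model, the spread between dimensions (6.4 vs. 7.8 here) is often more actionable than the headline number itself.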
Reading Your Dashboard
Overall PVX Score
The big number at the top of your dashboard.
Example:
PVX Score: 7.2/10 📈 +0.4 vs. last quarter
This is your headline number: share it with leadership and track it over time.
Dimensional Breakdown
See performance across all four dimensions:
What to look for:
✅ Strengths: Scores > 7.5
⚠️ Opportunities: Scores < 6.5
📈 Trends: Changes vs. previous campaign
Response Distribution
See how respondents rated you:
Promoters (9-10):  ████████ 40%
Passives (7-8):    █████████ 45%
Detractors (0-6):  ███ 15%
NPS-style interpretation:
Promoters: Your advocates
Passives: Satisfied but not enthusiastic
Detractors: At-risk relationships
Sentiment Score = Promoters% - Detractors% = +25
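The promoter/passive/detractor bands and the sentiment formula above translate directly into code (a minimal sketch; the function name is illustrative):

```python
def sentiment_score(ratings: list[int]) -> int:
    """NPS-style sentiment: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))


# 20 responses: 40% promoters, 45% passives, 15% detractors,
# matching the distribution above:
ratings = [9] * 8 + [7] * 9 + [5] * 3
print(sentiment_score(ratings))  # 25
```

Passives don't appear in the formula at all, which is why moving a 7 to a 9 raises your sentiment score but moving a 0 to a 6 does not.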
Interpreting AI Insights
After 5+ responses, AI analyzes feedback and generates insights in four categories:
🚨 Urgent Actions
Critical issues requiring immediate attention
Example:
"Communication delays flagged by 4 respondents. Average response time perceived as 3+ days. Risk: Project delays and stakeholder frustration."
When to act: Within 1-2 weeks
⚡ Quick Wins
Easy improvements with high impact
Example:
"3 respondents request procurement portal tutorial. Low effort fix with significant satisfaction improvement."
When to act: Within 1 month
💪 Strengths
What's working well - keep doing it!
Example:
"Relationship building praised by 5 respondents. Continue quarterly business reviews and proactive check-ins."
When to act: Reinforce these behaviors
📚 Training Needs
Skill gaps to address
Example:
"Category expertise gap noted in IT procurement. Consider upskilling team or hiring specialist."
When to act: Plan for next quarter
Benchmarking
Internal Benchmarks
Compare your scores:
📊 Stakeholder vs. Supplier: Are internal/external scores aligned?
📈 Trend Over Time: Improving or declining?
🎯 Dimension Variance: Are all areas balanced?
Industry Benchmarks (Coming Soon)
We're collecting anonymized data to provide:
Industry averages by sector
Percentile rankings
Best-practice comparisons
Note: Benchmarking requires 100+ organizations to ensure anonymity.
What's a "Good" Score?
It Depends on Context
First-time scores:
6.0+ = Good starting point
7.0+ = Strong baseline
8.0+ = Exceptional (rare on first survey)
After improvement efforts:
+0.5 improvement = Meaningful progress
+1.0 improvement = Significant transformation
+2.0 improvement = Exceptional turnaround
Don't Obsess Over the Number
Remember:
Trends matter more than absolutes
Insights > Score - the "why" is more valuable than the number
Action matters most - a 6.5 with action beats a 7.2 with inaction
Common Questions
"Our score dropped from last quarter. What happened?"
Possible reasons:
New challenges or changes
Higher expectations (paradoxically, a sign of trust)
Different respondent mix
Honest feedback emerging (initial scores often artificially high)
What to do:
Read the AI insights for specific issues
Review dimensional breakdown for problem areas
Compare feedback themes to previous campaign
Act on the insights
"We got an 8.5 - is that sustainable?"
Yes and no.
High scores are great but:
⚠️ Can breed complacency
⚠️ May mask emerging issues
⚠️ Hard to improve further (ceiling effect)
Recommendations:
Keep doing what works
Watch for early warning signs
Maintain quarterly measurement
Don't rest on laurels
"Can we share this score publicly?"
Yes! PVX scores are designed to be shareable:
✅ Good places to share:
Executive leadership meetings
All-hands presentations
Annual reports
Team dashboards
❌ Don't share:
Individual feedback quotes (breaks anonymity)
Dimensional scores without context
Comparisons between teams (can create unnecessary competition)
Taking Action on Your Score
1. Celebrate Strengths
Share positive feedback with your team. Recognition drives engagement.
2. Prioritize Based on Impact
Focus on:
🚨 Urgent actions first
⚡ Quick wins for momentum
📚 Long-term capability building
3. Communicate Plans
Let stakeholders know:
You heard their feedback
What you're doing about it
When they'll see changes
4. Measure Progress
Run campaigns quarterly to track improvement.