Performance Reviews Aren’t Real
Once a year, in offices across the world, something very serious and comically theatrical happens. Two adults sit across from each other at a desk or on a video call and pretend they can summarize an entire year of someone’s professional life using a form with numbered boxes.
There are categories:
Communication.
Initiative.
Leadership potential.
Strategic thinking.
Collaboration.
Possibly something vague like “impact,” which sounds impressive until someone asks what it actually measures.
Then you start assigning numbers. Three out of five. Four out of five. Occasionally a five out of five, which is so uncommon it feels like spotting a rare bird in the wild.
This process is called a performance review.
But performance reviews aren’t real, at least not in the way we pretend they are. People solve problems, make mistakes, learn things, improve at some tasks, struggle with others, and gradually develop skills through experience and repetition. But the idea that all of that complexity can be neatly captured in a once-a-year meeting and translated into a few numerical ratings is one of the most ambitious acts of workplace storytelling ever attempted.
The form isn’t measuring reality; it’s creating a narrative.
The Spreadsheet Version of a Human
The modern performance review comes from a simple idea: if you can measure something, you can improve it. That logic works beautifully for machines. If a factory produces 10,000 units per day and you change a variable that increases production to 12,000 units, you have learned something useful. The numbers tell a clear story, and the improvement is visible. However, as much as corporate workforce management departments want you to think so, humans are not machines. Humans work in bursts of creativity, long stretches of confusion, occasional brilliance, and unpredictable collaboration with other humans who are also improvising their way through the week.
Trying to measure that process with tidy numerical categories produces something a little disturbing: the spreadsheet version of a human being. Suddenly your entire professional existence becomes a set of ratings.
Communication: 4
Teamwork: 3
Leadership: 4
Strategic thinking: 3
Congratulations. Twelve months of your life have been converted into a slightly above-average collection of numbers.
The numbers feel official because they look mathematical. But the process that created them is often deeply subjective: your manager remembers the project that went well, or the meeting where you choked on that hot dog, or the email thread that caused confusion three months ago because you thought you were ordering pizza with an emoji. Human memory isn’t a spreadsheet; it’s a highlight reel with occasional bloopers.
Then the form asks you to compress that memory into a rating. The result isn’t data; it’s an interpretation.
The Annual Theater of Professional Growth
Performance reviews aren’t just about measurement. They’re also about ritual. Once a year, the workplace pauses to perform a ceremony called reflection. Everyone gathers their accomplishments. Managers gather their observations. HR departments gather documents that look reassuringly organized.
And then the conversation begins and you discuss your goals from last year. Some were achieved, some were quietly forgotten around March, and others evolved into something slightly different that nobody mentioned until this meeting.
Then the conversation shifts to development.
Managers use phrases like “areas for growth” and “opportunities for improvement,” which are professional ways of saying, “There are things we should probably work on but nobody wants this meeting to become uncomfortable.” Employees nod thoughtfully, explaining lessons learned, and promise to focus on strategic priorities. Everyone agrees that the next year will include meaningful progress.
Then the meeting ends, and nothing really changes.
This doesn’t mean the conversation was useless. Reflection can be valuable. Feedback can be helpful. But the formal structure of the performance review often turns the process into a scripted exchange.
A performance review is basically a polite meeting where both people pretend the next twelve months will go exactly according to plan.
The Memory Problem
Another reason performance reviews struggle to capture reality is timing. Work happens continuously, but performance reviews happen once a year. Imagine trying to evaluate an entire movie based only on the scenes you remember a year later. You might recall the dramatic moments, the funny parts, the one confusing scene that made no sense. But the quiet details that actually built the story might disappear from memory entirely.
The same thing happens in the workplace.
Your manager remembers the big projects, the visible wins, and the occasional mistake that happened during a stressful week. But most of your work lives in the middle. The emails answered quickly. The problems solved quietly. The small improvements that made processes smoother but never became headline moments. These things rarely appear in the review conversation because they blend into the background. The annual review compresses a complex year into a handful of memorable moments, which means the narrative often depends on what happened recently. If your best work occurred in June but the meeting happens in April, your brain (and your manager’s brain) might give more weight to whatever happened last month.
It’s not malicious; it’s just how memory works. But it means the performance review becomes less about the full year and more about the story that survived in people’s minds.
The Research Problem
One of the quiet ironies of performance reviews is that the system designed to measure performance has been studied extensively, and the results are not particularly flattering. Over the past two decades, organizational psychologists and management researchers have examined how annual performance reviews actually function inside companies. The conclusion that keeps appearing is surprisingly consistent: traditional performance reviews rarely improve performance in the way organizations expect.
Several large companies discovered this the hard way. In the 2010s, firms like Adobe, Microsoft, and General Electric began dismantling or redesigning their traditional review systems after internal studies showed the process was creating more stress than progress. Managers spent enormous amounts of time filling out forms and calibrating scores, while employees often left the meetings feeling confused, defensive, or demotivated.
Researchers studying workplace evaluation found a few recurring problems:
First, annual reviews rely heavily on human memory. Managers tend to remember recent events much more clearly than events from earlier in the year, which means ratings are often shaped by what happened in the last few months rather than the entire performance period.
Second, rating scales introduce bias. Studies consistently show that different managers interpret the same performance differently. One manager’s “exceptional” may be another manager’s “meets expectations.” The numbers look objective on paper, but they are often reflections of individual perception rather than measurable reality.
Third, the feedback itself arrives too late to be useful. By the time an annual review happens, the work being evaluated may have taken place six or nine months earlier. Even if the feedback is accurate, the moment where it could have improved the outcome has already passed. In other words, the system is evaluating history instead of shaping the future.
This is why many organizations are quietly shifting toward more frequent feedback conversations instead of formal yearly reviews. Shorter, ongoing discussions about projects and goals allow employees to adjust in real time rather than receiving a summary months later. The research does not say that feedback is useless; feedback is incredibly valuable when it’s timely, specific, and connected to real work. What the research suggests is that the ritual of the annual performance review (the forms, the numbers, the ceremonial meeting) is often less about improving performance and more about maintaining a structure that feels official.
Which leads to a strange realization. The system built to measure work might be one of the least efficient ways to actually improve it. It turns out the best way to improve performance might be talking to people during the year instead of surprising them with a spreadsheet about last January.