InSights - What you didn't know you didn't know

Kryptonite & Criteria


The Fallacy of Setting Unrealistic Productivity Improvement Expectations

Kryptonite is a term often used synonymously with the Achilles’ heel, referring to the one weakness of an otherwise invulnerable hero or heroine – in its original case, Superman. Superman flourishes and displays superhuman strength and ability in the absence of Kryptonite. In its presence, however, his otherwise powerful self is reduced to a mere mortal. In a similar vein, mediocre data used in an applied research project, such as a criterion validation study, mean death to an otherwise strong process.

A validation study is essentially a look at the relationship between two things. A good real-world example is examining the relationship between smoking and lung cancer. To do that, you would need a sample that included both smokers and non-smokers, and within it, people who developed lung cancer and people who did not. Ultimately, we would expect the results to demonstrate that people who smoked had higher rates of lung cancer than those who did not.

A validation study for the workplace is a similar process, in which a test or assessment is used to evaluate the relationship between scores on the instrument and performance on the job. For this, we need to assess those who are currently performing the job and get a clear understanding of how those people are actually performing. Seems pretty simple, right? Not so fast. Believe it or not, just getting the right people assessed is often incredibly challenging. Worse still, many companies haven’t a clue how their own employees are performing! Much like Superman, even the strongest test or assessment means little in the absence of good criteria, and can be rendered totally worthless by “Kryptonite criteria.”
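At its core, the statistic behind a criterion validation study is a simple correlation between assessment scores and a job performance criterion. The sketch below shows that computation with invented numbers (both the scores and the sales figures are hypothetical, purely for illustration):

```python
# A minimal sketch of the core computation in a criterion validation
# study: correlating assessment scores with a performance criterion.
# All numbers here are invented for illustration only.

import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical incumbents: assessment scores and annual sales (in $k)
scores = [62, 71, 55, 80, 68, 90, 47, 74]
sales  = [310, 340, 290, 400, 335, 460, 250, 355]

r = pearson_r(scores, sales)
print(f"criterion validity coefficient r = {r:.2f}")
```

The validity coefficient is only as meaningful as the criterion on the right-hand side – which is exactly where the Kryptonite tends to show up.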

The Faces of Kryptonite: GIGO!

To help showcase some of the challenges applied researchers face and must overcome, here are a few examples.

So, what do you do for a living? A recent study for a multinational corporation revealed that the client had no means to identify whether many of their own employees were Key Account Managers, National Account Managers, or Business Development Managers. In fact, one group listed as Key Account Managers were really Distributor Account Managers – not even direct employees, since their business was all indirect rather than direct sales. Research has long shown that different sales roles require different selling skills, and this is a great example of how critical it is to focus on the right comparison group. Evaluating people against the wrong job would have wasted our client’s time and money.


Lawyers and HR and Works Councils, Oh My! Legal uncertainty is rampant in the pre-employment testing arena – and rightfully so. There are roughly 2,600 assessment companies in the United States alone, in part because there is virtually no barrier to entry: any ex-baseball player or air-conditioning salesperson can start an assessment company. Unfortunately, the poor practices of many of these companies make lawyers concerned, which instantly makes Human Resources people nervous. And if you are dealing with unions, or with multinational projects involving foreign Works Councils, then processes must be strictly managed, roughly doubling the time it normally takes to complete a research project.

Once this kind of bureaucracy and politics arises, transparency and constant communication become critical. In fact, while a wishful-thinking corporate function may want the deliverable in no more than six weeks, it will often take 12 weeks just to manage local Works Councils.

Hit this moving target! Often a client will ask us to verify that new hires are generating greater % profit than longer-tenured salespeople. But equally often, too many things have changed or simply weren’t tracked. For example, salespeople were reassigned to different territories, given new products to sell, or even transferred to different divisions in a reorganization. So we really couldn’t tell who was old, who was new, or even whether % profit was a metric we could match to incumbent assessment data.

We cannot provide you with any data…now tell us if this test works, will you? Customers routinely ask providers to demonstrate ROI to prove that an assessment tool is working. ROI is great: (a) it helps retain a customer, and (b) it helps create stories and marketing opportunities. Quality providers can prove their offering is not snake oil, but rather a solid product or service.
But, Mr. and Ms. Client, you have to give some to get some back! One cannot assess retention without hire and termination data. One cannot assess productivity without production numbers or quality sales data. Typical problems include:

  • Country or division differences in types of sales roles, or
  • Different ERP systems such as SAP, versus individual home-grown systems, or
  • Focus on different data in each market (e.g., some salespeople were bonused on total team performance, while others were strictly individual), or
  • Manager ratings where everyone is rated at 80% or higher
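That last bullet is a classic range-restriction problem: when nearly everyone receives the same high rating, the criterion carries almost no information, and even a genuinely predictive assessment will show a collapsed correlation. A small sketch with invented numbers makes the point:

```python
# Illustration (invented numbers) of why compressed manager ratings
# are "Kryptonite criteria": when the criterion barely varies, noise
# dominates and the observed validity coefficient collapses.

import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

scores = [50, 60, 70, 80, 90, 100]      # hypothetical assessment scores

# Ratings that use the full scale track the scores closely...
full_range = [55, 62, 71, 79, 88, 97]

# ...but a lenient manager rates everyone between 80 and 88,
# so the tiny remaining variation is mostly noise.
compressed = [85, 82, 88, 80, 84, 83]

print(f"full-range criterion: r = {pearson_r(scores, full_range):.2f}")
print(f"compressed criterion: r = {pearson_r(scores, compressed):.2f}")
```

The assessment hasn’t changed between the two runs – only the quality of the criterion has, and the compressed ratings make a good test look useless.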

Conclusions, Recommendations and Best Practices

So, now that we have had some fun, let’s get more pragmatic. A quality applied research project requires quality planning, commitment, resources and time. Here are some suggestions to smooth the process.

  • Work with other leaders in your company to determine what you are really trying to accomplish and what you need to focus on in order to get there (e.g., is it selection, training design, reducing turnover?).
  • Obtain buy-in from multiple stakeholders to secure commitment. This keeps the focus on what is really important to the organization, rather than what is important to just one person. This helps immensely when a new player (e.g., Sales VP) comes in and wants to change things up.
  • Notify Human Resources and Legal (and Works Councils/Unions if necessary) during the vendor vetting process to help speed things up once the project is kicked off. There is nothing more frustrating than starting a process and then being told by the HR team to stop.
  • Work with Human Resources and IT early in the process to determine how accurate available performance metrics are, how fast you can get them and how easy it is to get them exported on a routine basis. If you are running SAP or Taleo or some other applicant tracking system, confirm that everyone you want to assess and measure is actually in that system. If they are not, work with your vendor to devise the best solution.
  • Put together a good, clear communication strategy to ensure that managers – and their managers – know about the project before it is communicated to the masses. This helps with buy-in. Be honest about the program/project and why it is being used. Communicate seven different times and in seven different ways to ensure coverage.
  • Be patient. Real results take time. A few cycles of hiring and termination must occur before you can see whether the performance distribution has shifted upward. Work with your vendor to share data. Validation should be viewed as a process – not a single event in time. If the assessment is doing its job, then adjustments or calibrations should be made routinely to continue raising the bar for your workforce. But as with most investments, it is difficult to see ROI over too short a period of time.
  • Conduct a periodic Talent Audit to help focus on areas of strength and developmental opportunities for your team.