Making evidence-based practice in HR/work psychology a two-way street

Rich Littledale
6 min read · Apr 25, 2018


Evidence-based practice

Recently I went to a great talk by Rob Briner¹ (@Rob_Briner on Twitter) on evidence-based practice in organisational psychology and HR. It was an informative and funny session on what evidence-based practice is (and isn’t), why it’s important, and why it’s difficult. If this is of interest, you can find loads of great resources at the Center for Evidence-Based Management website (https://www.cebma.org/).

Practice-based evidence?

At the end of the session I asked a question: “We’ve just heard some really great tips about being effective consumers of evidence in the way we practise, but how can we as practitioners contribute?”

Rob’s response was straightforward and honest: “It’s a good question, but collecting practitioner-based evidence is difficult, and I don’t have the answer — what do you think?”

I had two reasons for asking this question. One is quite personal, and it relates to a growing unease I have had as my career has gone on (17 years post-MSc and counting), best expressed as a question: to what extent am I contributing to a body of knowledge that helps people and companies do better? It’s probably part of my own mid-life crisis, but I do want to be doing work that moves us forward as a society, and that others can build on.

The second was more intellectual. If practitioners — those out in the world — are exclusively users of evidence and not generators of it, how does the profession stay in tune with the way the world of work is changing? Now, I agree that many claims about topics such as the pace of change in a “VUCA” world, or about generational differences, are vastly overblown. And I agree that being suspicious of novelty for novelty’s sake is healthy. But at the same time the world does change, and those of us working hand in hand with organisations will be the first to see those changes. Without using that privileged position there is a danger that the evidence will remain two paces behind the context.

The answer I gave on the spot was that I collect my thoughts in blogs. That was a bit rubbish, so I wanted to give it a bit more thought.

Why is practice-based evidence difficult?

  • Measurement in the field is difficult — for many of the activities that HR or organisational psychologists get involved in, it is very difficult to point to direct causality between intervention and effect. In some cases (e.g. graduate hiring) it may be years before you have any data on your success rate. And the causes of real-world outcomes (e.g. company or team performance) are so complex that the work we do could only ever explain a small part. I’ve written about this in more detail elsewhere (https://medium.com/@richlittledale/agile-methods-the-saviour-of-hr-8f527d417cc0).
  • But we are also a bit crap at measurement as a profession — it is hard, but we could do better. Clients often don’t have the time or money to spend on measurement. Consultants more often than not err towards pragmatism, and fail to communicate the benefits of measurement effectively. I also think the sunk-cost fallacy is a blocker: we’ve all been doing this stuff for so long, do we really want to know whether it works or not?
  • Which means there is an abundance of anecdotal evidence — as practitioners we all have the evidence of our own eyes. And I, like many others, do sometimes share this in the form of blogs or articles (like this one). But the irony of writing these articles is that when I read similar things written by others I don’t trust them.
  • Delivery culture — HR teams, and the people who seek to advise and support them, operate within a culture that rewards activity more than it does impact. “Delivery” is king. Using and collecting evidence puts the brakes on in a way that can be deeply uncomfortable in that kind of culture, even if the results end up being better. Lots of times in my career as a consultant I have discussed with clients the reasons why what they want to do may or may not be a good idea (it wasn’t a good idea!). A few times I’ve heard back a phrase that tells me clearly what to do next: “let’s explore the art of the possible”. Let’s explore the art of the possible = JFDI.

So what is the art of the possible for practice-based evidence?

So given that it is difficult, what can be done? Here are the things I am going to commit to in order to try to bridge the gap. I should say that I am not claiming anything here is “the answer”. The problem is far too big for quick, easy fixes. For instance, some of my answers relate to better links with academia, but there is plenty wrong with the way academic research currently works (see https://retractionwatch.com/2017/02/21/got-significosis-five-diseases-academic-publishing/). However, these are the things that I plan to do in order to a) make myself feel like I can make an impact and b) start to understand the problem better and share any insight from that with others.

  • I’m going to keep blogging — it may not be the best source of evidence, but the things we practitioners experience, and the insights these give us, are a great source of hypotheses. As practitioners we should be generating questions and theorising as to what the answers might be. We then need to be better integrated with researchers to test these questions systematically. So I’m going to keep blogging.
  • I’m going to try to make links to academia — I’m going to look for partners who I can help and who can help me. One way of doing that is through connections with MSc courses, as MSc students are always seeking data and interesting experiences for their dissertations. In 2001 I had access to a large development centre dataset, having won a competition to work with one of the country’s biggest insurers. That was great, and I was really grateful, but I didn’t hear anything from the client about what they wanted to know, what they were interested in, or what they thought was happening. We should give access to MSc students and other researchers, but not leave it at access. Within two years I would like to have contributed to an academic paper.
  • I’m going to collaborate with other practitioners — having left the shelter of my consulting job two months ago to start my own business, I know this one will be hard. It is so much easier to be idealistic when you are not worried about paying the bills, and intellectual property is valuable. Part of my commitment to this is sharing stuff that I am working on (https://www.peopleuphq.com/opensource/), but I also want to meet like-minded practitioners who are willing to let idealism win from time to time. If this is you, give me a shout and let’s meet.
    Other related fields — Clinical Psychology for instance — advocate the use of Practice Research Networks (PRNs) or formal Communities of Interest. As it stands I don’t know if any such networks exist for Occupational/IO Psychology, but I’m going to find out and update this blog.

¹ Thanks very much to Rob for taking the time to provide some comments on this piece — although views here are mine — and pointing me in the direction of some great additional resources.



Written by Rich Littledale

Psychologist in startup land, exploring the people side of technology and technology businesses. Consulting at www.peopleuphq.com, co-founder at www.supc.co.uk
