Information, confirmation, and influencing advice

23 June 2011

In the world of evidence-based policy-making, we often struggle with its evil twin: policy-based evidence-making.

Proponents of the former dream of a world in which a problem is identified, those in charge commission rigorous research, several options are presented, the pros and cons of each are weighed, and the best choice is enshrined in policy (with the assumption that said policy is then implemented, monitored and evaluated so that it can be tweaked and improved over time).

Anyone familiar with policy-making – whether within governments, companies, organisations or social institutions – will know that such an ideal policy cycle is far from reality. More often we see someone with a value or interest at stake in a particular topic looking to build a case for a particular solution, which may become policy if the case is compelling enough. These issue ‘champions’ may be nefarious schemers looking after their own best interests, but case building is equally prevalent among advocacy organisations and politicians working for the greater good.

Case building is a conscious, active, purposive and biased process. Researchers like to pretend, of course, that bias is beneath them and that the scientific method and peer review guard against it. But research into human psychology indicates otherwise. Even when trying our best to be analytically objective, the human subconscious falls prey to something known as ‘confirmation bias’.

Confirmation bias expresses itself in different ways, but it effectively boils down to a subconscious predisposition to find, or give more weight to, evidence, data or information that supports a pre-existing belief or prior knowledge. In research this may mean seeking out data that support a given hypothesis (or, conversely, ignoring evidence that doesn’t). Don’t believe me? Try this little activity to see confirmation bias in action.
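To make the ‘give more weight’ part concrete, here is a toy simulation – my own illustration, with arbitrary step sizes and starting belief, not something drawn from the psychology literature. An agent updates a belief on a stream of perfectly neutral evidence, but takes bigger steps when the evidence agrees with it than when it disagrees.

```python
import random

# Toy model of confirmation bias as asymmetric evidence weighting.
# The step sizes and starting belief are arbitrary, chosen for illustration.
CONFIRM_WEIGHT = 0.10     # step toward certainty on agreeing evidence
DISCONFIRM_WEIGHT = 0.03  # much smaller step on disagreeing evidence

def update(belief, evidence_supports):
    """Nudge belief toward 1 on confirming evidence, toward 0 otherwise,
    with asymmetric step sizes standing in for the bias."""
    if evidence_supports:
        return belief + CONFIRM_WEIGHT * (1.0 - belief)
    return belief - DISCONFIRM_WEIGHT * belief

belief = 0.6  # a mild pre-existing leaning
for _ in range(200):
    belief = update(belief, random.random() < 0.5)  # 50/50 coin-flip 'evidence'
print(f"belief after 200 neutral 'studies': {belief:.2f}")
```

Even though the evidence stream is pure noise, the asymmetry alone pulls the belief upward (to roughly 0.7–0.85 on most runs) and keeps it there.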

In the real world, this means that we often try to support our beliefs by going to information sources that already support our view. This is what makes political punditry of the likes of Fox News (on the right) and MSNBC (on the left) in the U.S. so compelling. They do the heavy lifting of interpreting reality to fit in with pre-conceived worldviews so viewers don’t have to.

Indeed, it turns out that we tend to seek not only information that already agrees with us, but also information sources (or knowledge brokers) that we can trust to do the same. And as Eli Pariser points out, the increasing use of advanced web technologies that use algorithms to filter out ‘irrelevant’ information means that people no longer even have to make conscious decisions to seek out trusted information – and it might make things even more difficult for people actively trying to challenge their existing beliefs.
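To see why that matters, here is a deliberately crude sketch of a personalising filter – my own toy, not a description of any real engine, with invented stories and tags. It ranks stories by overlap with what a user has clicked before, and only the top few survive.

```python
from collections import Counter

def score(story_tags, click_history):
    """Relevance = overlap between a story's tags and the user's past clicks."""
    return sum(click_history[tag] for tag in story_tags)

def personalise(stories, click_history, k=3):
    """Return the k stories most similar to the user's click history."""
    return sorted(stories, key=lambda s: score(s["tags"], click_history),
                  reverse=True)[:k]

stories = [
    {"title": "Tax cuts work",  "tags": {"economy", "right"}},
    {"title": "Tax cuts fail",  "tags": {"economy", "left"}},
    {"title": "Stimulus works", "tags": {"spending", "left"}},
    {"title": "Stimulus fails", "tags": {"spending", "right"}},
]

history = Counter({"right": 5, "economy": 2})  # past clicks lean one way
for s in personalise(stories, history):
    print(s["title"])  # the least-aligned story quietly disappears
```

Nothing here required the user to ask for a slanted feed – the click history did it for them, which is exactly Pariser’s point.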

This poses a serious problem for those of us seeking to influence others to change attitudes and beliefs: how do we get inside a target stakeholder’s filter bubble? If that was difficult to do in real life, it’s that much more so when we’re trying to game an algorithm.

I suggest several strategies:

  • Work through existing/trusted channels: I wrote previously on this blog about how new government regulations on communications (or policy influence and research uptake) emphasise using existing channels to communicate research – this post on confirmation bias should further support that view. If one suspects that target stakeholders are already going to certain news outlets, websites or information sources (online or off), use them – even if that means working with the ‘enemy’.
  • Go with the grain: Given what we know about confirmation bias, it is probably unreasonable to expect that confronting people with facts and figures will make much difference (indeed, it might make them more obstinate). To change opinions, we must work with, not against, pre-held beliefs and those trusted opinion shapers to subtly shift the understanding of an issue. Instead of creating a fully formed argument to change opinion, try messaging and channels that first lay the groundwork for openness to a message. This may even encourage information-seeking behaviour – a close cousin of confirmation bias is the ‘Baader-Meinhof phenomenon’ (the frequency illusion), in which the mind fixes on a particular piece of information and then begins to find it everywhere.
  • Sow wild oats: Getting past filter bubbles will likely mean trying a few different strategies. Don’t just put a study on a project website and call it good. Try to plant information in a number of different guises (e.g. on different sides of the political spectrum) and formats (e.g. not just on blogs or a website, but also in the news, and even offline).
  • S-E-OH?: The first rule of the web today is to make sure content is search engine optimised (SEO) so that the Googles and Bings of the world can find what you have to say. This is done through a bit of magic usually involving page titles, page headings and what links back to your site. When publishing your content online, make sure that the first two items play well in more than one bubble. Or, if you’re feeling particularly clever, why not start two different blogs with similar content framed for different audiences? (A rough sketch of that idea follows below.)
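For that last suggestion, here is a hypothetical sketch of ‘one finding, two framings’. Everything in it – the audience labels, headline templates and topic – is invented for illustration; the only real point is that the page title and heading (two of the SEO signals mentioned above) can vary per audience while the underlying summary stays the same.

```python
# Hypothetical 'one finding, two framings' generator. Audience names,
# templates and the topic are all invented for illustration.
FRAMES = {
    "fiscal-conservative": "What the {topic} study means for taxpayers",
    "social-progressive": "What the {topic} study means for communities",
}

PAGE = """<!DOCTYPE html>
<html>
<head><title>{headline}</title></head>
<body>
  <h1>{headline}</h1>  <!-- title and h1: two of the signals search engines weigh -->
  <p>{summary}</p>
</body>
</html>"""

def render(audience, topic, summary):
    """Fill the page template with an audience-specific headline."""
    headline = FRAMES[audience].format(topic=topic)
    return PAGE.format(headline=headline, summary=summary)

for audience in FRAMES:
    print(render(audience, "school funding",
                 "The same evidence, framed to pass through this bubble."))
```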