Policy and Real World Evidence

The field epidemiologist aims to provide real-world evidence (RWE) that is as reliable as possible to support policy decisions.

Policy and decision-making processes in health are usually complex. During crises in particular, these processes can be subject to various overt and covert pressures, leading governments to pursue policies that disregard real-world evidence or rely on false evidence. This phenomenon can be observed in many contexts, not only during crises such as the COVID-19 pandemic, and it can stem from a variety of causes, including cognitive biases, political interests, and a lack of quality information.

Possible explanations for this phenomenon:

  1. Cognitive biases: Decision-makers, like all humans, are influenced by various cognitive biases. They may overemphasize evidence that supports their existing beliefs and disregard evidence that contradicts them, a phenomenon known as confirmation bias (Nickerson, 1998). Similarly, they may be affected by availability bias, where they give undue weight to immediate and memorable examples over statistically significant evidence (Tversky & Kahneman, 1973).
  2. Political interests and populism: Governments sometimes prioritize political interests over evidence-based policy. They might disregard evidence if it contradicts their political agenda or the preferences of their supporters (Cairney, 2016). Populist movements can exacerbate this tendency by appealing to emotions and popular beliefs over expert evidence (Mudde, 2004).
  3. Misinformation and false evidence: The spread of misinformation and fake news can also lead governments to rely on false evidence, especially if they lack the resources to properly vet information (Lewandowsky et al., 2017). This problem has been particularly pronounced during the COVID-19 pandemic, due to the rapid spread of misinformation on social media (Pennycook et al., 2020).
  4. Scientific uncertainty: Especially in a rapidly evolving situation such as the COVID-19 pandemic, scientific evidence is often uncertain and can change over time. This can make it difficult for governments to determine what the "real" evidence is, and it can lead to policy reversals as new evidence emerges (Sarewitz, 2004).
  5. Short-term focus: Governments can be influenced by short-term interests, such as upcoming elections, rather than longer-term outcomes. This can lead them to implement policies that have immediate, visible benefits rather than those best supported by the evidence (Jacobs, 2011).
  6. Influence of lobbying: Policy decisions can also be heavily influenced by lobbying from various interest groups. If these groups promote policies that are not based on solid evidence, this can lead to a disconnect between policy and evidence (Grossmann, 2012).
  7. Lack of expertise and resources: Governments may not always have sufficient access to expertise or resources to fully evaluate the evidence. This is particularly likely in less developed countries, but it can also be a problem in developed countries where government capacity has been eroded (Peters et al., 2019).

References

  • Cairney, P. (2016). The Politics of Evidence-Based Policy Making. Palgrave Macmillan.
  • Grossmann, M. (2012). Interest group influence on US policy change: An assessment based on policy history. Interest Groups & Advocacy, 1(2), 171–192.
  • Jacobs, L. R. (2011). Governing for the long term: Democracy and the politics of investment. Cambridge University Press.
  • Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369.
  • Mudde, C. (2004). The populist zeitgeist. Government and Opposition, 39(4), 541–563.
  • Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.
  • Pennycook, G., McPhetres, J., Zhang, Y., Lu, J. G., & Rand, D. G. (2020). Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy nudge intervention. Psychological Science, 31(7), 770–780.
  • Peters, B. G., Jordan, A., & Tosun, J. (2019). Over-reaction and under-reaction in climate policy: An institutional analysis. Journal of Environmental Policy & Planning, 21(5), 585–598.
  • Sarewitz, D. (2004). How science makes environmental controversies worse. Environmental Science & Policy, 7(5), 385–403.
  • Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232.
