Why Stubborn Myths Like ‘Learning Styles’ Persist

Opinion | Learning Research

By Dan Belenky     Aug 15, 2017

“Fool me once, shame on you. Fool me twice, shame on me.”

We should learn from experiences, particularly if those experiences show our previous beliefs to be untrue. So why are people so easy to fool when it comes to beliefs about learning?

For years, a stream of articles has tried to dispel pervasive but wrong ideas about how people learn, yet those ideas still linger. For example, there is no evidence that matching instructional materials to a student’s preferred “learning style” helps learning, nor that there are “right-brain” and “left-brain” learners. The idea that younger people are “digital natives” who use technology more effectively and can multitask is also not supported by scientific research.

These ideas have been around a long time, and so have articles attempting to dispel them. Yet the beliefs persist. A 2012 study surveyed teachers in the U.K. and found that, on average, they believed half of the neuromyths they were presented with.

Perhaps we should stop focusing as much on discussing what the common neuromyths are and instead focus on understanding why these ideas persist.

To explore this, let’s connect neuromyths to other phenomena that have been observed in social and behavioral science. Specifically, social psychology and behavioral economics offer useful clues when thinking about why these ideas persist, as well as what can be done to more effectively change people’s beliefs.

In this vein, Benjamin Riley and Michael Pershan, from the nonprofit Deans for Impact, recently published an article describing cultural cognition. They write:

“Cultural cognition describes how we interpret certain facts and evidence through the lens of our existing values. Usually, we accept scientific claims as true because, overall, most of us trust science and scientists. But sometimes – in rare but notable cases – our stance on a scientific matter comes to take on a larger, much more personal meaning. Beliefs about science can become entangled with our self-identities, even if they didn’t start out that way.”

They hypothesize that researchers’ insistence on abandoning neuromyths may conflict with the value teachers place on professional autonomy, for example. This explanation is certainly plausible. But there are other human tendencies that may both contribute to these beliefs and point to ways to address the issue.

Cognitive Ease (or why we like things that are easy to process)

Humans have evolved to use decision-making shortcuts (called heuristics) that make it much easier to get through most circumstances we encounter in everyday life. Usually, these heuristics do their jobs, letting us rapidly make decisions that generally work out well. But under certain conditions, heuristics can lead us to accept false conclusions or choose suboptimal behaviors.

One of these heuristics helps us decide how much we like some new experience or piece of information. As researchers Rolf Reber, Norbert Schwarz and Piotr Winkielman put it, beauty is in “the perceiver’s processing experience.”

Put simply, their idea is that things we can process easily are perceived as more pleasing and preferable. People generally avoid spending extra mental effort to process new information. If a new idea can be grasped easily, it is more likely to be perceived positively and, ultimately, adopted.

One clear implication is that the (cognitively) easier we make a behavior, the more likely it is to be adopted; if the “costs” are high (in processing, effort or time), then it will be more difficult to implement. One way of lowering this cost is clearer communication. The easier an idea is to process and understand, the more likely it is to be retained and believed. (It is no wonder an idea like “learning styles” can take off; the two words together connote the whole idea!)

Evidence-based practices are frequently referred to by jargon-filled names pulled from academic research, like “retrieval practice,” “elaborative self-explanation,” or “managing cognitive load.” You’ll notice, for example, that I labeled this section “Cognitive Ease” rather than “Processing Fluency” (the term used in scientific research) in an effort to make it easier to grasp. Even well-designed communications, like the one put together by Deans for Impact on effective instructional principles, are still complex. Simpler communications with more direct application to instructional practice are necessary.

In general, reducing the effort needed to understand and adopt research-based practices would go a long way towards helping those ideas be implemented.

Confirmation Bias

Once an idea has been accepted, it can be hard for a person to change their mind. This isn’t just stubbornness; humans share an evolved bias toward information that confirms their existing beliefs.

This confirmation bias has been observed in many forms across decades of research. In one study from 1967, participants listened to speeches supporting or challenging ideas they already believed. (For example, smokers listened to a speech about the connection between smoking and cancer, or to a speech arguing against that link.) The recordings were masked with static that made them hard to hear, and participants were given a button that, for short periods, removed the static so the speech was easier to follow. People used the button more for speeches that aligned with their existing beliefs and behaviors.

With the echo chambers built up through the internet, and social media in particular, we have plenty of first-hand experience of people posting, reading, commenting on, and sharing information that confirms their existing beliefs (and not being persuaded when acquaintances post conflicting information in response!).

There is no “cure” for confirmation bias—it is a tendency we all share. But that doesn’t mean we should accept it as a default state of mind. With effort, we can choose to engage in a different kind of reasoning.

  1. Simply being aware of this tendency is the first step toward monitoring for it. This is easier said than done, but it is important to try to be mindful of the ways in which you seek out and process information that confirms or disconfirms what you already believe.
  2. Find opportunities to discuss ideas with people who disagree with us. Warren Buffett, for example, has invited people who disagree with his investment approach to ask questions at shareholder meetings.
  3. In schools, it may be useful to structure opportunities for teachers from different backgrounds, with different training and different pedagogical ideas, to meet and discuss. Don’t expect dramatic “conversions” in which people change their minds completely and all at once, but these kinds of interactions help people recognize the assumptions in their own thinking and evaluate new information they encounter more critically.

There are many reasons incorrect ideas like neuromyths spread and persist. The few areas of psychology covered here are by no means an exhaustive list. But these few concepts already point to potential solutions worth exploring. If we can help people recognize how their pre-existing beliefs influence their evaluation of new ideas, as well as make it easier to adopt best practices, we are more likely to succeed in ending pervasive “neuromyths” and improving learner outcomes.
