NOTE: This post is Part 4 in my month-long deep dive into exploring "expertise and authority." Get the whole series (as it progresses) here.
"Meat eaters are more selfish than vegetarians."
This announcement really struck a nerve with me—a keen lover of meat. Actually, "love" isn't quite strong enough to describe the passion I have for a good steak. I have been known to weep openly and joyfully in a restaurant, fork and knife in hand and napkin tucked into my shirt collar.
The claim wasn't just rhetoric, either. It was ultimately coming from a bona fide and trusted source—a paper written by Professor Diederik Stapel of Tilburg University.
Stapel's study hadn't actually been published in a scientific journal. It had crept into cultural knowledge and reference via a press release. But numerous other studies bearing Stapel's name were published over the course of his career, all backed by his credentials and position at Tilburg University. Stapel was a respected expert in social psychology.
But in 2011, despite accolades, honors, and the prestige of being a published and respected authority, Stapel's work was called into question. A committee was formed to investigate the integrity of his scientific research, alleging and later proving that Stapel had faked his data.
The aftermath of the committee's investigation was swift and destructive. Numerous papers were retracted. Journals published public statements denouncing him and his work. Stapel himself even surrendered his PhD on the grounds that his career and his work had been "inconsistent with the duties associated with the doctorate." Stapel's career as a renowned researcher was finished.
So ... was eating meat still a selfish act?
Still selfish after all these years
Apparently so, because despite the fact that the paper was never actually published, and that in 2011 the author of the paper was completely discredited and much of his work retracted, I was learning about how selfish I was as a meat eater a full year later—in 2012.
And in 2013. And in 2014. And in 2015.
The study, despite never having been published in any official capacity, and despite having been thoroughly discredited for its fraudulent data, is still cited to this day as proof that being a vegetarian is not only a healthy lifestyle choice, it's the correct moral choice as well.
And while that's not really much of a problem—people are always looking for justification for or affirmation of their own philosophy—it's a really good example of both the power and responsibility of expertise.
The fact is, facts change
This isn't the only "blunder of expertise" in history, of course. There was a time when the greatest authorities of humanity knew that the Earth was flat, that the atom was the smallest particle in existence, that eating eggs was bad for you/good for you/bad for you ... where are we again on the eggs?
The fact is, facts change.
What we know now is just what we're able to observe now, and what we're able to make of what we already know (or think we know) about the past. We can build on past knowledge, and we can make some fairly well educated guesses about the future. But we can never know anything with absolute certainty, because there will always be some undiscovered factor we haven't accounted for.
And that's one of the reasons experts are so highly valued. Because they, unlike us lowly laymen, know their stuff. They've taken the time to study something so thoroughly that they know it and understand it better than the rest of us. We know, from personal experience, that there's a whole lot we don't know. So we look to experts to either fill in those gaps, or tell us it's ok to ignore them. We trust them, sometimes even if what they tell us makes no real sense, or if their advice forces us to change our own viewpoint of the world.
And that's weird.
Because we humans all have at least one trait in common: We tend to distrust things that we don't understand.
We keep away from complex machinery if we don't have any knowledge of how it works. We avoid animals we can't identify. We refuse to eat food we don't recognize. It's our nature and our habit. We may overcome it, through choice and will, but our first instinct isn't to trust it.
But with experts, that tendency to distrust goes right out the window. If we recognize someone as an authority on a topic, our natural inclination is to roll with what they say.
That isn't universal, of course. If an expert is saying something completely counter to what we know (or think we know) of reality, then we'll likely have some doubt. We'll maybe do a little digging ourselves, Google a few things, ask a few other experts. But if we already trust this expert, or if another expert we do trust can somehow vouch for them or agree with what they say, then we may just shrug, adjust our worldview accordingly, and move on. No need to dig further. The experts say it is so.
That's interesting (and maybe even a little alarming). We're turning over some of our decision making to someone, even if what they're telling us is counter to what we already knew or believed. But even more interesting, we apparently have a tendency to trust an expert by default if they're advising us about something we don't fully understand.
In a 2009 paper titled Expert Financial Advice Neurobiologically "Offloads" Financial Decision-Making under Risk, the authors studied the effect of expert advice on the decisions made by lay investors.
"Our behavioral results indicated that the expert's advice significantly influenced behavior," the study reported. "We obtained behavioral evidence demonstrating that the presence of the expert's advice led to a significant increase in [probability weight] in the direction of the advice, such that participants overweighted low probabilities and underweighted high probabilities more after receiving the advice." [SOURCE: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0004957#s2]
In other words, the subjects gave more consideration to the expert's advice than to their own estimate and knowledge of the probabilities. They considered the expert's advice to have more weight, more validity, if the advice was about something they didn't fully understand themselves.
On our own, we might not buy a stock that seemed to have a low probability of paying off, especially if we didn't understand the industry or business behind it. But there's a good chance we would buy that stock anyway if we were told to by someone we recognized as an expert.
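The "overweight low probabilities, underweight high ones" pattern the study describes is the classic probability-weighting curve from prospect theory. As a rough sketch (this particular function and its parameter are a common textbook model, not something pulled from the paper itself), the Tversky-Kahneman weighting function looks like this:

```python
def weight(p: float, gamma: float = 0.61) -> float:
    """Tversky-Kahneman probability weighting: how heavy a
    probability p *feels*, as opposed to what it objectively is."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# A 5% chance feels bigger than it really is...
print(f"w(0.05) = {weight(0.05):.2f}")  # roughly 0.13
# ...while a 95% chance feels smaller than it really is.
print(f"w(0.95) = {weight(0.95):.2f}")  # roughly 0.79
```

The study's finding, loosely put, is that an expert's advice bends this already-distorted curve even further in the direction the expert points.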
The key is "granted authority"—a term I just made up, but one that's probably pretty accurate. Trust me, I'm an authority on words.
We're the ones who grant authority, after all. We choose to recognize someone as an expert or authority, or we choose not to. We choose to heed advice, even in spite of our own knowledge or convictions, based entirely on our assessment of the expertise of the individual giving the advice. Our choice. Sometimes our blunder. But more often than not, trusting the expert pays off, so we keep doing it. We really like being told what to do by someone who knows more than we do.
The reason for this is that we humans really aren't that keen on owning responsibility for our decisions. Because if we know anything at all in this great big universe, it's that we are seriously flawed and prone to mistakes.
We know ourselves better than we know anyone else, after all. We are experts in "us." We know how many times we've flubbed it when the chips were down, and how many times we've fudged it when the facts weren't in. So we try to outsource our decision making any way we can. We don't have time for extensive research outside of our own personal field of expertise, even if we really need that information. So we look for shortcuts, to offset our ignorance. We look to outsiders who, by all accounts, know more than we do.
That gives experts a lot of power. And as my boyhood hero and role model, Spider-Man, is famous for saying, "With great power comes great responsibility."
So what can we learn from Diederik Stapel and his data downfall?
Here was a certified and verified expert in his field who was telling us things about ourselves that we didn't know, but might have suspected (or, in some cases, wanted to be true). A lot of what he was reporting seemed wrong to us meat eaters, maybe. Or maybe it supported some personal belief we were holding about the right and wrong of eating meat. Either way, many of us gave greater weight to his findings, regardless of how probable those findings actually were, because he had a PhD and a research staff and the backing of a university. He had all the right boxes checked. He should have been trustworthy. But he failed to be responsible.
A lot of the people I work with are experts and authorities in their own fields. They have reputations, upon which they've built their careers and their lives. They work with their own clients, or write their own books and papers, or produce their own podcasts and video blogs, and all of it is predicated on the idea that they are, in fact, experts—that they are relaying what they know. But more important than their knowledge, our trust in them is built on the belief that they are, above all, honest.
People respect experts. They honor them. Doctors and scientists. Lawyers and judges. Consultants and personal coaches. Each of these is only able to do their work as long as their audience trusts them and their methods. It's a social contract, in which the layman trusts the expert to actually be an expert, and to provide honest and reliable information to the best of their ability.
Break that social contract, and the whole system of trust and credibility crumbles.
Not only do careers go down in flames, sometimes the people we're trying to reach get hurt, too. Bad advice from a financial expert can cause a client to go into bankruptcy. Bad advice from a personal coach can cause a client to make decisions that aren't in keeping with their own principles and needs. A bad diagnosis from a doctor can cause a patient to get worse instead of better—maybe even die.
Things can get pretty bad when an expert isn't taking responsibility for his or her power.
Does this change what it means to be an expert?
No, not really. Not at all, actually.
This doesn't change our working definitions of expertise and authority. It's still as subjective and relative as it always was. But what this does tell us is that the titles come with more than just prestige and power—they come with a burden of responsibility. It's on us, as experts, to be responsible for what we are and what we know and what we do with our knowledge. We have an obligation to be honest and thorough.
As I keep digging into these two terms and their deeper meaning, I'm finding that expertise is a little like the game of Othello. "A minute to learn, a lifetime to master." We can become experts and authorities quickly and easily—it isn't hard. We just have to know more than the person we're talking to, and we have to be able to demonstrate that knowledge in a way that is recognized as valid and socially valuable.
That's the easy part.
The hard part is what happens after. Because as my friend Paul Campbell, host of the Real United States video blog, told me, "Expertise is not a binary state, but one that is gradient, some experts being more expert than others."
We aren't "finished" when we become an expert. The work isn't done. If we want to continue in that recognition, we have to keep working for it.
The responsibility of expertise isn't just honesty—it's diligence. If we want to keep holding the title, we have to keep stepping back in the ring, keep throwing the right punches and dodging any incoming blows, and keep working our sweet, sweet science until the current match is won, then move on to training and preparing for the next.
That's what power looks like. That's what it means to be an expert and an authority. It means taking ownership of our expertise, treating it with respect, and committing to being responsible for what we produce in the world.