By Scott Berkun, August 9, 2006
Everyone lies: it’s just a question of how, when and why. From the relationship saving “yes, you do look thin in those pants” to the improbable “your table will be ready in 5 minutes”, manipulating the truth is part of the human condition. Accept it now.
I’m positive that given our irrational nature and difficulty accepting tough truths, we’re collectively better off with some of our deceptions. They buffer us from each other (and from ourselves), avoid unnecessary conflicts, and keep the wonderful confusion of our psychologies tucked away from those who don’t care. White lies are the spackle of civilization, tucked into the dirty corners and crevices that our necessary, but pretentiously inflexible, idealisms create. Small lies prop up and support our powerful truths, holding together the insanely half honest, half false chaos that spins the world.
But lies, serious lies, should not be encouraged as they destroy trust, the binding force in all relationships. One particularly troublesome kind of lie is known as Bullshit (BS). These are unnecessary deceptions, committed in the gray area between polite white lies and complete malicious fabrications. BS is usually defined as inventions made in ignorance of the facts, where the primary goal is to protect oneself. The aim of BS isn’t to harm another person, although that often happens collaterally. For a variety of reasons BS can be hard to detect, which is why I’m offering this missive as a crash B.S. in BS detection. But be warned: to keep you on your toes there are several bits of BS tucked inside this essay which you will have to find for yourself.
Why people BS: a primer
The first lie in the Western canon comes from the same joyful tome as the first murders, wars and plagues: the Old Testament. Despite my distaste for trips into religious texts, this one has supreme tragicomic value.
To recap from the book of Genesis, God tells Adam and Eve not to eat fruit from the tree of knowledge, as pretty as it is, for they’ll die. He wanders off to do some unexplained godlike things, as gods are prone to do, leaving the very tempting tree, protected by neither pit-bull nor electrified fence, out for all to see. Meanwhile Satan slinks by and convinces Eve apples are good: so she and Adam have an apple snack (the bible refers to them only as fruit; apples are a Western addition). God instantly returns, scolds Adam, who blames Eve; resulting in everyone, snakes, people and all, getting thrown out of Eden forever.
Please note that in this tale nearly everyone lied. God lied, or was deceptively ambiguous, about the apples (they weren’t fatal in any modern sense; if I told you reading this essay will kill you, you’d expect I meant sometime today – please read the footnote if you’re angered by critiques of scripture). Satan misrepresents the apple’s power, and Adam approximates a lie in his wimpy finger pointing at Eve. It’s a litany of deception and a cautionary tale: in any book that makes everyone look bad in just a few pages, is it really a surprise how the rest plays out?
People lie for three reasons; the first is to protect themselves. They may wish to protect something they want or need, a concept they cherish, or to prevent something they fear, like confrontation. There is often a clear psychological need motivating every lie.
A well known fib, “the dog ate my homework”, fits the BS model. In the desperate, fear driven attempt not to be caught, children’s imaginations conceive amazing improbabilities. Fires, plagues, revolutions, curses, illnesses and absurd reinventions of the laws of physics and space-time have all been summoned by children around the world on the fateful mornings when they find themselves at school, sans homework. It’s an emotional experience, this need to BS: since, logically speaking, the stress of inventing and maintaining a lie is rarely less than the pain of accepting the consequences of the truth.
Which leads to the second reason people lie: sometimes it works. It’s a gamble, but when it works, wow. Did you lie to your parents about girls, boys, fireworks, drugs, grades, or where you were till 2am on a school night? I sure did and still do. My parents still think I’m a famous painter / doctor / professor in London (shhh), and my best friend still believes his high school girlfriend and I didn’t get it on every time I borrowed his car. Even my ever faithful dog Butch used to lie, in his way, by liberating trash from a house-worth of garbage cans, then hiding in his bed, hoping his lack of proximity to the Jackson Pollock of refuse that was formerly my kitchen would be indistinguishable from innocence.
Which gives us the third reason people lie, a truth saints and sinners have known for ages: we want to be seen as better than we see ourselves. Sadly, comically, we also believe we’re alone in both having this temptation, as well as the shame it brings with it (e.g. “We’re not alone in feeling alone“). The secret truth is everyone has moments of weakness: times when fear and greed melt our brains and we’re tempted to say the lies we wish were true. And for that reason the deepest honesty is found in people willing to admit to their lies, or their barely resisted temptations, and own the consequences. Not the pretense of the saints, who pretend, incomprehensibly, inhumanly, to never even have those urges at all.
Ok, enough philosophy: let’s get to detection.
The first rule of BS is to expect it. Fire detectors are designed to expect a fire at any moment: they’re not optimists. They fixate on the possibility of fires and that’s why they save lives. If you want to detect BS you have to swallow some cynicism, and add some internal doubt to everything you hear. Socrates, the father of western wisdom, based his philosophy around the recognition, and expectation, of ignorance. It’s far more dangerous to assume people know what they’re talking about than it is to assume they don’t and let them prove you wrong. Be like Socrates: assume people are unaware of their own ignorance (including yourself) and politely, warmly, probe to sort out the difference.
The first detection tool is a question: How do you know what you know?
Throw this phrase down when someone force feeds you an idea, an argument, a reference to a study or over-confidently suggests a course of action. People so rarely have their claims challenged, that asking someone to explain how they know something sheds light on whatever ignorance they’re hiding. It instantly diminishes the force of a BS driven opinion. It works well in response to the following examples:
- “The project will take 5 weeks“. How do you know this? What might go wrong that you haven’t accounted for? Would you bet $10k on this claim? $100k?
- “Our design is groundbreaking.” Really? What ground is that? And who, besides the designers/investors, has this opinion?
- “Studies show that liars’ pants are flame resistant.” What studies? Who ran them and why? Did you actually read the study or a two sentence press clipping (poorly) explaining the results? Are there any studies that claim the opposite?
When you ask a flavor of “how do you know what you know?”, often people can’t answer quickly. Even credible thinkers need time to sort through their logic, separating assumptions from facts: an exercise that works in everyone’s favor (the fancy word for this is epistemology. I wish there were a word like epistemologize, to describe challenging someone’s epistemological basis for a claim).
Of course it’s fine to hear: “This is purely my opinion” or “It’s a guess, as we have no data”, but those are far weaker claims than the ones most people, especially if they’re making stuff up, typically make. Identifying someone’s opinion as speculation, rather than fact, disarms the threat of most kinds of BS.
The second tool is also a question: What is the counter argument?
Anyone who has seriously considered something will have seen enough facts to fit their current argument as well as an alternative position: ask for them. It’s a grade school assignment, intended to show there are many reasonable ways to interpret the same set of facts. However, someone who is bullshitting you won’t have researched or thought through anything: they’re making things up. Asking for the counter argument will force them to either back up their position, or to end the discussion until they’ve done due diligence. (If they claim there is no counter argument, end the discussion. They are not only BS’ing you, they think you’re a moron).
Similarly useful questions include: Who besides you shares this opinion? What are your biggest concerns, and what will you do to address them? What would need to change for you to have a different (opposite) opinion?
Time & Pressure
A good thought holds together. Its solid conceptual mass maintains its shape no matter how much you poke, probe, test and examine. But BS is all surface. Like a magician’s bouquet of flowers, it’s pretty as it flashes past your eyes, but its absence of integrity becomes obvious when you hold it in your hands. Anyone creating BS knows this, and will tend towards urgency. They’ll resist reviews, breaks, consultations or the suggestion of sleeping on decisions before they’re made.
Use time & pressure, the third tool of BS detection, in your favor: never allow big decisions to be mismanaged to the point where they must be made urgently. Ask to withhold judgment for a day, and watch the response. Invite people with expertise you need but don’t have to participate in decisions, to add intellectual and domain pressure (hire them if necessary: the $500 you pay a lawyer, accountant or consultant to review something effectively becomes a well spent BS insurance fee).
Be a leader in creating an environment unpleasant for BS. If everyone knows the gauntlet of friendly, but rigorous, intellectual curiosity that claims must run through, BS will be discouraged while still in the minds of the tempted.
Confidence in reduction
Especially in business and technology, jargon and obfuscation hide huge quantities of BS. Inflated language is a technique of intimidation. The bet is that if you don’t understand what they’re talking about, you’ll feel stupid, or distracted, and give in to the appearance of their superior knowledge. This is, of course, entirely bullshit. To withstand BS you have to have an inner core of self-reliance, holding on to your doubts longer than the BS’er holds onto their charade.
“Our dynamic flow capacity matrix has unprecedented downtime resistance protocols.”
If you don’t understand what the hell this means, err on your own side. Don’t assume you’re missing something: assume they are. They’re either hiding something, communicating poorly, or don’t themselves understand what they’re talking about. BS deflating responses include:
- I refuse to accept this proposal until I, or someone I trust, fully understands it.
- Explain this in simpler terms I can understand (repeat if necessary).
- Break this into pieces you can verify, prove, compare, or demonstrate for me.
- Are you trying to say “our network server has a backup power supply?” If so, can you speak plainly next time?
Assignment of trust
The fourth tool of BS detection (derived from the rule of expecting BS) is careful assignment of your trust. Never agree to more than your trust allows. Who cares how confident they are: the question is how confident are you in them? It’s rare that there isn’t time for trust to be earned. Divide requests, projects or commitments into pieces. It’s not offensive to refuse to take someone’s word if they have no history of living up to it before (especially if they’re trying to sell you something).
And trust can be delegated. I don’t need to trust you, if you’ve earned the trust of people I trust. Anyone skilled in the BS arts has obtained that skill through practice, diminishing the odds that many BS-proof people have been successfully deceived by them in the past. Nothing defuses BS faster than a collective of people that help each other detect and eliminate BS. If a team of people witnesses the complete evisceration of someone’s BS few will attempt it again: they’ll know your world is a BS free zone. Great teams and families help each other detect bullshit, both in others and themselves, as sometimes the real BS we need to fear is our own.
One popular interpretation of Genesis 2:17 is that God meant “you will be mortal” when he said “you will surely die”, so it’s not a lie – this is in line with the many who believe in the omnibenevolence of God or the perfect nature of the bible. While I question these positions, they are popular views and deserve mention. More to my point, in the context of Genesis, there is no reason Adam could have known, when told by God he’d surely die, any of these modern interpretations of God’s words, or the symbolic meaning we now attach to these events.
 This is of course, complete bullshit. I have never lied to anyone ever.
The link about apples not being in Genesis was added on 3/6/2012. I’d known about this common erroneous assumption, but didn’t see the need to call it out until now.
The phrase, “or was deceptively ambiguous”, was added 9/25/2006.
The phrase “..they weren’t fatal in any modern sense… ” was added 6/30/2010
- This essay was written long before I became aware of Cognitive Bias, which provides a great foundation for understanding the role our brains play in creating, recognizing or falling victim to BS.
- On Bullshit by Harry G. Frankfurt is a popular book on the subject. It’s a short read, barely 70 pages, but is sadly toned more like a philosophy textbook than the entertaining romp it could have been.
- Carl Sagan’s Baloney Detection Kit
- Why we lie – Short essay summarizing some basic research into the psychology of lies (LiveScience).
- Web economy and Dilbert’s mission statement generators – If the output of these seems all too familiar, run.
- Why smart people defend bad ideas – The older, twisted sister to this essay.
- Bullshit entry at wikipedia.