I was having a brief conversation with @anodyne2art on Twitter about my post on trust, and while it’s true that the buzzwords are authenticity and honesty, I think these are only tangentially related. Trust probably has more to do with transparency, but it’s not quite that either.
Really, trust is about predictability. We trust that the sun will rise again tomorrow. We trust that Apple will come out with some new sparkly thing every so often. But we also trust that Microsoft will continue to “embrace and extend.” We trust that TV will always gravitate to whatever is sensational. We trust that Fox News is overtly biased and is basically a propaganda mouthpiece for the Republican Party.
Hell, we’re beginning to trust the fact that Twitter will invariably go down every so often.
I guess a synonym would be reliability.
But we’re not robots, stuck following some monotonous program. Evolution has tuned us to appreciate a great deal of reliability, but with novelty mixed in every so often. Neuroscientists have a name for it: the dopamine reward circuit. This lights up every time something novel comes your way. It is also the circuit responsible for addiction, so it’s no surprise that while some people get addicted to cocaine, others get addicted to novelty itself. This is probably the underlying pathophysiology of at least some subtypes of ADHD, but that’s for another blog post.
So when we come to trust a brand, it’s not that we expect it to do exactly X every time; rather, we expect a reliable kind of experience.
Evolution has also shaped how we handle trust. Particularly since human beings are social creatures, trust has always been a type of currency that we’ve traded in, probably as far back as when we first came down from the treetops to roam the savannah. Game theory tells us that co-operation is more efficient than most forms of competition. Definitely more efficient than winner-takes-all competition. This has been well studied, and one of the most robust strategies for sustaining co-operation is tit-for-tat. You scratch my back, I’ll scratch yours. So in order to co-operate, we have to be trusting.
However, with the advent of trust, deceit becomes a worthwhile algorithm. Go ahead and scratch my back. I ain’t gonna scratch yours. This rapidly becomes very costly to the trusting subject. Consequently, we are programmed to rescind trust quite harshly. You betray me, and you lose. Again, tit-for-tat.
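Tit-for-tat, and why defection stops paying once it gets punished, is simple enough to sketch in a few lines of Python. The payoff numbers below are just the standard illustrative prisoner’s-dilemma values, not anything from the post itself:

```python
# Toy iterated prisoner's dilemma. C = cooperate, D = defect.
# Payoff values are the conventional illustrative ones (an assumption).
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    # Trust first; afterwards, mirror whatever the opponent did last.
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    # The "go ahead and scratch my back" algorithm.
    return "D"

def play(strat_a, strat_b, rounds=10):
    hist_a, hist_b = [], []  # each side's record of the opponent's moves
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strat_a(hist_a), strat_b(hist_b)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        hist_a.append(b)
        hist_b.append(a)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # mutual trust: (30, 30)
print(play(tit_for_tat, always_defect))  # betrayal punished: (9, 14)
```

Two trusting players rack up 30 points each; against a defector, tit-for-tat gets burned exactly once and then rescinds trust, so the cheater’s one-time windfall never compounds.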
Anyone who has ever betrayed another person knows for a fact that trust, once lost, is excruciatingly difficult to reacquire. In some cases, it becomes totally impossible.
With the advent of the Internet, it’s a lot easier to expose deceit. Consequently, deceit becomes a less viable strategy. Watch the Republican party continue to implode, for example.
In fact, it has become increasingly difficult to leverage pathological lying into the selective advantage that it once was. If anything you ever said or wrote is etched in magnetic fields and photons somewhere on the Internet, and cached by Google, it becomes part of the collective memory of Western Civilization. Witness how much difficulty the pathological liar known as Hope Ballentyne is having now that people are recording their experiences with her on Craigslist. If you can’t keep your story straight, you’re going to get your ass kicked. Look what happened to HRC when she lied about coming under sniper fire in Bosnia. The bloggers were all over this. They pulled quotes out from the nether recesses of Google. And it was exposed as the bullshit that it was.
This is a lot of the reason why the neocons’ fascistic bid for power failed miserably. One, because a lot of their operatives were quite intellectually challenged. Two, because it became increasingly difficult to hide the corruption. It didn’t matter if it was posted by a real journalist or a blogger. Once it hit the Googlestream, it was out there, and it would spread like wildfire. Sometimes the traditional media would even end up covering the sheer speed at which the story had spread.
Traditionally, it is believed that lies spread a lot faster than the truth, but the Internet restores the balance. Eventually you’re going to have to deal with facts, and Google seems to preferentially index facts over bullshit. This is really a matter of semantics. The transmission of facts tends to be quite minimalistic, light on syntactic sugar and unnecessary detail. Most people will just write what happened, and that’ll be it. Get enough sources, and everyone who records an event will eventually converge to some sort of consensus. It may not be fact in the sense that it’s actually independently verifiable and reproducible (which is the gold standard for scientific evidence), but it’ll be at least a social observation. You can at least conclude that a significant subset of the population believes such a thing.
In contrast, bullshit tends to be highly individualized and idiosyncratic. The more bullshit you collect, the wider the disparity between the different stories. Because bullshit tends to be unreliable. You copy unreliability over several generations, and you get even more unreliability. Eventually bullshit dissolves into noise, because it’s almost impossible to keep a unified narrative. You’ll notice that the best way to flush out a lie is to keep the liar talking. Eventually they’ll say something that makes everything fall apart.
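You can watch that generational decay in a toy telephone-game simulation. The error rate, the seed, and the story itself are all arbitrary choices of mine, purely for illustration:

```python
import random

random.seed(1)  # fixed seed so the toy run is reproducible

def retell(story, error_rate=0.05):
    # Each retelling randomly corrupts some characters -- copying unreliability.
    alphabet = "abcdefghijklmnopqrstuvwxyz "
    return "".join(random.choice(alphabet) if random.random() < error_rate else ch
                   for ch in story)

def generations(story, n):
    # Pass the story through n unreliable retellings.
    for _ in range(n):
        story = retell(story)
    return story

original = "the quick brown fox jumped over the lazy dog"
copy = generations(original, 20)
drift = sum(a != b for a, b in zip(original, copy))
print(f"characters changed after 20 retellings: {drift}/{len(original)}")
```

A 5% per-copy error rate sounds small, but compounded over twenty retellings it shreds most of the message, which is exactly why a liar who keeps talking eventually contradicts himself.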
The way that Google magnifies and amplifies consensus belief (I’ll step away from using the word “fact”, which I’ll reserve for things that are independently verifiable and reproducible) reminds me of how the polymerase chain reaction (PCR) works. PCR is one of the landmark technologies that launched the bioinformatic revolution. What you do with PCR is generate oligonucleotide primers that flank your sequence of interest, then run the mixture through repeated cycles of DNA replication. Eventually, even if you have contaminants, your targets of interest will come to predominate in the solution. You can’t get rid of the contaminants, but they’ll be such a sparse contribution to the solution that it probably won’t matter.
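The amplification dynamic is easy to see with made-up numbers: only the sequence the primers flank gets copied each cycle, so even a vastly outnumbered target swamps the contaminant after enough cycles.

```python
# Toy model of PCR amplification (all quantities are illustrative assumptions):
# only the primer-flanked target doubles each cycle; contaminants don't amplify.
target, contaminant = 10, 1_000_000  # start with far more contaminant

for cycle in range(30):
    target *= 2  # the primers bind the target, so it is copied every cycle

total = target + contaminant
print(f"target fraction after 30 cycles: {target / total:.6f}")
```

Ten starting copies against a million contaminant molecules still ends up as more than 99.9% of the solution after thirty doublings, which is the same shape as a consensus story crowding out the one-off fabrications.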
I’m not saying that Google is the ultimate lie detector. All that has to happen is for Rupert Murdoch to open his big, fat wallet and say “How much you want?” and then we’ll be back to the Dark Ages. Google only works the way it does because the guys who run the show stay true to a hacker ethos rooted in ’60s culture. Some people accuse Sergey and Larry and friends of gaming the system. (Usually, however, such accusers tend to be spammers and black-hat SEO guys.) Even if this accusation were true, the execution is so subtle that it’s almost as if they were letting the truth run unfiltered.
But back to my point: evolution has tuned us to detect deceit, simply because falling for it is a big time loss. In lolcat/leetspeak, believing a lie = FAIL! Even the most feebleminded of us have at least some rudimentary lie-detection circuitry in that mush sitting in our skulls. And, as Abraham Lincoln said, you can fool some of the people all of the time, and all of the people some of the time, but you can never fool all of the people all of the time. Google just magnifies this principle by several orders of magnitude, and if you follow the math, you’ll find that what this means is that the truth will out eventually. Bullshit is a temporizing measure at best.