trust
Throwbacks stuck in the ’80s seem to have a hard time accepting the Brave New World™ we find ourselves in. I’m not preaching some magical transformation of human nature. It’s just that the game has changed. There’s a transition under way, and we are slowly weaning ourselves from the past.
The ’80s was all about greed and self-interest. It was a simpler time. We had an archetypal enemy—Soviet Russia—and defeat meant succumbing to totalitarianism or being immolated in a nuclear holocaust. The rule of the day was every man and woman for themselves, and you either got rich, or you ended up homeless. And as if we were trapped in some Calvinistic time warp, the former were assumed to be virtuous, the latter were assumed to have some flaw in their character. Alex P. Keaton and Gordon Gekko were the heroes of the decade. Ronald Reagan was their God.
But you can’t really sustain prosperity when only 2% of the population has almost all the wealth. You only need so many washing machines, you know? And when the economy goes to hell, whoever is at the helm takes the blame. It doesn’t help that trickle-down economics is a mad fantasy, and that its natural trajectory is into the shitter. So Bill Clinton took up the mantle, ushering in a new era.
I’m not entirely convinced that it was complete coincidence that the Internet happened to take off during the Pax Clintonia. 1994 was a seminal year, and the seeds for the change were planted deep. A disjunction was inevitable.
You could argue that the Information Revolution really started in the early ’80s, when the personal computer came to the fore. But if you examine the microcomputer culture, it really was a relic of the 1960s UNIX hacker age, replete with communal tendencies. We spent hours distributing copies of software to each other, legal or not. While we never laid down the legalese like Stallman did, we accepted a culture of share and share alike. Code was exchanged freely. We chatted on rudimentary proprietary social networks and on the lonely frontier of the BBS.
It was a wonderfully chaotic age, when multiple brands of computers each had part of the market, and developers dutifully released their games to each of them. Commodore, Atari, Apple, Tandy/Radio Shack, IBM. Those were the giants of the era.
Ironically, it wasn’t until one of the first technological disruptions came to the fore that Bill Gates decided to try to lock down the OS market. The first IBM clones were released in the early ’80s, allowing anyone with a few million dollars to contract with Intel and create their own computer that ran the vast library of PC-DOS/MS-DOS software. The proprietary all-in-one computer was essentially dead (although I kept using my C64 until the early ’90s) and Apple (with the Macintosh) and Commodore (with the Amiga) clung tenuously to their scant market share, not yet making the transition from 16-bit to 32-bit. Intel’s 80386 secured the supremacy of the IBM-compatible platform, bringing 32-bit computing to the commodity PC.
The sad thing was that IBM architecture was the least advanced when it came to multimedia. Even with the C64, you already had three-voice polyphonic sound. The Macs and the Amigas of that era would not be surpassed by IBM-compatibles until well into the ’90s, when multimedia became the Next Big Thing™. At the time, IBM-compatibles had EGA displays and a pathetic speaker. Oh, VGA was within reach, and you could always get an AdLib sound card, but these were extras, whereas on the Mac and the Amiga, advanced display capabilities and sound were built in.
We entered the world of the GUI wholesale with the release of Windows 3.0, finally (and barely) catching up with the interfaces of the Mac and the Amiga. But, hell, even the C64 could run a GUI in those days.
The world of software was no longer the unregulated free-for-all of the early ’80s. EULAs and copyrights were the rule of the day. Businesses were audited by the Software Publishers Association, and pirates were hung out to dry. IBM-compatibles shipped with an antiquated version of BASIC, and that was the best you were going to get for free. While BASIC was good enough for the C64 and the Apple IIc, an entire decade had already elapsed, and the thought of running an interpreted language on a 33 MHz machine was laughable. It would be a little while before gcc was dutifully ported first to MS-DOS, then to Win32. Eventually, Microsoft shipped QBasic, a crippled, interpreted version of QuickBASIC, with the newest versions of MS-DOS, but that was even clunkier, and you certainly couldn’t write Windows apps that way.
Then 1994 happened. The sensational news was the release of NCSA Mosaic, but deep underground, the Open Source movement was gaining momentum.
A lot of people learned the wrong lessons from the rupture of the first tech bubble. The conventional wisdom was that we were going to revert to old business practices, and that an Internet-based economy was a pipe dream. Fast-forward 15 years, and boy, was conventional wisdom ever wrong. In spite of the bubble popping, or perhaps precisely because of it, the Open Source movement came to the foreground, led at the time by Red Hat. By 1998, Linux was already a viable replacement for Windows 98, and it crashed far less.
We entered a new culture of development, one that could not rely on top-down pronouncements from on high. It became a messy, consensus-seeking dramafest, with lots of shouting, and lots of forking.
Democracy in action.
Or as ESR put it, the bazaar became pre-eminent.
Except this wasn’t a new culture, but the old culture from the ’60s hacker days writ large. Fame and importance were measured by your intelligence and by how much you contributed. An imperfect meritocracy developed. Sociologists of the day liked to use the term “gift culture”, and that was certainly part of it, but they failed utterly to document what was really going on: trust became the new currency.
If something came from RMS or ESR or Linus Torvalds or Alan Cox, you knew it was going to be good. Names became saleable commodities. Branding had always been important, but now branding was the important thing. The customer’s trust was the most important thing you could gain, and if you could manage to keep their trust, they were yours for life.
This is the path Google took. Why did Google end up entering the Modern English language as a verb? Because we learned to trust its search results. That trust translates into revenue to the tune of hundreds of millions of dollars. The moment we stop trusting those search results will mark the beginning of the end.
When Steve came back to town and took the helm of Apple again, he rode this lesson to its logical conclusion. Why do Mac users keep coming back, even as they complain about crashes and clunkiness? Because they know what they’re going to get from Steve. Everybody thought that the iPod was a futile gesture, a Quixotic tilting at windmills. Instead, the iPod is the new Walkman. Even as micro-hard drives fail and iPods crash and become unusable, we still keep coming back.
The flip side of trust is that we also all learned that Windows sucks. Reboot, reformat, and reinstall was the reliable mantra. Not crashing was a miracle. We learned that trying to install anything that might improve the usability of your Windows box was liable to make it unbootable, so we stayed stuck in this awful limbo of crap functionality and inability to upgrade said functionality without Microsoft holding our hands and pushing out a new service pack. Windows suckitude had become so reliable that when XP finally came out, we were shocked that it wasn’t half-bad. Oh, sure, it could be taken over by malware writers within a few minutes of connecting to the Net, but the fact that it could even connect to the Net without multiple reboots and animal sacrifice was a minor miracle. Even now, a lot of people distrust the notion that XP is not utter crap.
Trust also comes to the fore with the atrocious phenomenon of spam. Now, some people are just too trusting, and probably shouldn’t be trusted with complete freedom, but I’m not the boss of other people, so I have to live with morons actually buying crap from spammers.
And even though spammers and the shady companies that they work for make a ton of money with their unethical business practices, it is conventional wisdom that spammers are the scum of the earth, on the level of crack dealers and pedophiles. We wouldn’t want to interact with a spammer in meatspace, that’s for sure. So anyone who is a spammer has to keep their identity hidden, like all those high-end prostitutes and CIA spooks. Hell, we’ve even managed to criminalize spam. Who’d’ve thunk it?
A spammer is pretty much at the bottom of the trust pile.
So we’re in an age where branding and maintaining trust are critical. Gone are the days when you would be trusted just because you were an authority. People demand evidence and corroboration. Hence, smart scientists are publishing their results not just in traditional journals, but in open-access ones too. We have websites that allow us to rate everything from the last movie we watched, to the grocery store we shopped at, to what we think of our primary care physician. We search Google carefully before we buy big-ticket items.
I think it’s harder to scam people or to bully people these days. You have to earn trust, and coercive behavior only makes you lose trust.
Being a selfish asshole is not the selective advantage it used to be. While we haven’t magically turned into ideal communists singing “Kumbaya” and “Imagine” by the campfire, cooperation earns you more trust than cut-throat, back-stabby competition. Being able to navigate reasonably well in a social situation—either in meatspace or in metaspace—without appearing like a pervert or a serial killer is part and parcel of earning trust.
Fucktards get exposed on the Internet. And Google remembers everything. You can’t hide under the cloak of authority any more. While I’m not so high as to think we have complete transparency, I believe it is now easier for the common person to make sure that someone isn’t trying to pull a fast one. The perhaps unearned authoritative voice of traditional media has been diluted considerably. And I think the Internet deserves a lot of the credit for preventing the neocons from turning the U.S. entirely into some Orwellian nightmare. They were close, but we dodged a bullet. Now even Google knows that W is a miserable failure.
I’m not so drunk on the Kool-Aid as to believe that the Internet is going to solve all the considerable problems we’re facing. After all, the economy is in the shitter once again. Rogue states like North Korea have nuclear weapons. Climate change is going to give new, tragic meaning to the phrase “water wars.” Trust in American Democracy took huge hits in the last two elections, thanks to the Supreme Court and Diebold, respectively. W is the first president since James Madison to have an entire American city destroyed on his watch. We all know what’s still going on in New Orleans, no matter what sort of bullshit the traditional media is trying to feed us.
But for once, I’m hopeful. Today, the Republican Party that Nixon and Reagan built is pretty much in disarray. Its brand is tarnished beyond all recognition, and about the only things we trust it to produce are unhinged religious fundamentalists who want to abolish science, homosexual men in denial (such as Larry Craig), and all manner of obscene corruption (such as Enron and Duke Cunningham). Only a completely deranged ass-monkey would trust a Republican farther than they could kick them.
And remember this: trust is the basis of hope. If you can’t trust anything, there’s no way you can be hopeful, and without hope, you’re doomed to stagnation and eventually death. Hope is a lot more than an empty platitude that politicians like to throw around. Hope is pretty much the operational basis of frontal lobe function. One of the major differences between humans and non-verbal animals is the fact that we are able to imagine what isn’t. It isn’t just a random amalgamation of buzzwords when people speak about vision and mission statements. Without vision, there’s no way to direct your movement. If you’re not forward-thinking, then you’re barely functioning beyond the level of a non-verbal animal. We are motivated and driven by the hope of turning what does not exist into actual reality. This concept is more commonly termed creativity.