After 2016: Planning for a ‘post-fact-checking’ era

The 2016 presidential race has, at long last, illustrated the clear and urgent need for journalists to be more than stenographers or writers of milquetoast, “he-said-she-said” missives that won’t satisfy the true believers of Fox News’ “fair-and-balanced” fiction anyway. The election suddenly elevated the importance of fact-checking and even, for the still sadly small number of journalism’s boldest, the willingness to call a lie a lie.

Unfortunately, it required the unrelenting and nightmarish ugliness of an orange-faced, hate-filled, grotesquely unqualified GOP nominee to get to that point, when journalism ideally should have started from that position in the first place.

Yes, hack reporters and yellow journalism have been with us throughout the modern era. But the gold standard for reporting — truth-telling, shining a light on the darkness, comforting the afflicted and afflicting the comfortable — has been clear from the start nonetheless.

And yet even some at the very top have seemed to need reminding. Why, for example, did it take Trump to refresh The New York Times’ memory on “how to write the paragraph that said, ‘This is just false,’” as executive editor Dean Baquet put it in a recent interview? Why should that ever have been a struggle?

Short answer: It shouldn’t have been.

Anyway, here we are, with “fact-checking” suddenly in vogue, at least in some circles. But you know what? Fact-checking, while important, isn’t enough and won’t be enough going forward into what looks to be a difficult future. Because it’s always been the case for some people that the facts don’t matter.

“Maybe we didn’t know the details,” said Burt Lancaster’s character Ernst Janning in the 1961 film, “Judgment at Nuremberg.” “But if we didn’t know, it was because we didn’t want to know.”

And therein lies the problem with an emphasis on fact-checking alone: for a variety of reasons, the facts — no matter how many you assemble and muster in your defense — simply won’t move the needle of opinion for some (perhaps even most) people. The underlying cause of this phenomenon is something called “motivated reasoning.”

Here’s how Yale law and psychology professor Dan Kahan described it in 2011, using the related term, “motivated cognition”:

“[M]otivated cognition refers to the unconscious tendency of individuals to fit their processing of information to conclusions that suit some end or goal. Consider a classic example. In the 1950s, psychologists asked experimental subjects, students from two Ivy League colleges, to watch a film that featured a set of controversial officiating calls made during a football game between teams from their respective schools. The students from each school were more likely to see the referees’ calls as correct when it favored their school than when it favored their rival. The researchers concluded that the emotional stake the students had in affirming their loyalty to their respective institutions shaped what they saw on the tape.”

Scientists who study the way the human mind works understand that we are not fundamentally rational thinkers, however much we might believe we are. So while fact-checking is a good feature for any journalistic outlet — and I hope it continues — going forward after the election, fact-checking alone won’t be enough. Considering the brink we’ve been pushed to as a society during this election, we have a responsibility to do more.

Some writers have no idea what they’re trying to say

Is it just me, or has the dramatic shift from reading on paper to reading on a screen — and, even more dramatically, to reading on a mobile screen — had a detrimental effect on written logic, reasoning and rhetorical skills? I’ve been thinking so more and more often, especially with every online feature or thinkpiece I either give up on or slog all the way through, only to walk away wondering, “What the heck was that all about? What exactly was the writer trying to say?”

Of course, this happens regularly on social media like Twitter, where a large number of comments are offered up for consumption with little or no context, leaving the Tweeter’s intent a mystery.

But I’ve been noticing it more and more in online news and magazine features: a whole lot of words and time are devoted to making an argument… I guess. But what that argument is, I have no idea.

The toxic and alluring mix of truth and falsehood

Ask pretty much anyone who’s been around long enough to see a few U.S. election cycles in action, and they’ll probably agree with the statement, “Politicians lie.” And there’s plenty of evidence from the past few months alone to prove that’s true.

Lies can be useful to a would-be leader, especially one with Machiavellian tendencies. Truth-telling, on the other hand, is more generally respected — in the abstract, anyway — as an admirable quality. In reality, though, truths are often treated as “inconvenient,” politically incorrect or downright unwelcome.

One example of this that’s recently gotten much (belated) attention is the reception the powers that be gave to It’s Even Worse than It Looks: How the American Constitutional System Collided with the New Politics of Extremism, a book by the Brookings Institution’s Thomas E. Mann and the American Enterprise Institute’s Norman J. Ornstein. The 2012 book made a strong case that, among other things, the mainstream political “wisdom” that the Republican and Democratic parties were equally dysfunctional was a fiction. Of the two key sources of U.S. leadership woes, Mann and Ornstein wrote, one was the fact that “one of the two major parties, the Republican Party, has become an insurgent outlier — ideologically extreme; contemptuous of the inherited social and economic policy regime; scornful of compromise; unpersuaded by conventional understanding of facts, evidence, and science; and dismissive of the legitimacy of its political opposition.”

In other words, Mann and Ornstein argued, both parties were not equally to blame for the mess in Washington: the Republican Party alone had jumped the shark and gone mad.

The reception to this premise in the political and media halls of power: nothing but the chirp of crickets and a collective turning of heads to look anywhere but at the truth unmasked. The top political shows aired every Sunday suddenly appeared to “lose” Mann and Ornstein’s phone numbers.

The temptation of easy vs. dangerous complexity

To anyone (like me) who remembers what daily life was like in pre-Internet/pre-mobile phone days, the technological innovations of the past 20 years or so appear — no, are — mind-blowingly wonderful. And new developments just on the horizon — artificial intelligence in particular — sound almost magical in their potential capabilities.

Indeed, to the vast majority of us who don’t really understand how the cellphones in our pockets, the tablets on our laps or the apps on our mobile devices work, these technologies are practically magic in action. For the most part, we don’t question what complex inner workings enable them to let us do the things we do … we’re just glad we can do them.

But for all the benefits our tools and gadgets bring us today, their built-in and opaque-to-most-of-us complexity also brings danger. Our growing ignorance about how the things we rely on work makes us vulnerable to being misled by people who actually do understand their complexity … or to becoming marks for people who exploit that ignorance.

You don’t have to look hard to find examples of that danger today. Look, for example, at the battle now coming to a boil between Apple and the FBI.

Apple has made customer security via end-to-end encryption a cornerstone of its business model. The company’s message since iOS 8 was released has been: “Your iPhone’s data will be safe from hackers, thieves and even us, because you and only you hold the password to unlock it.” Implicit in that message is the underlying idea that you don’t have to understand HOW iPhone security works, just THAT it works.
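To make the “you and only you” idea concrete, here is a minimal, hypothetical sketch of how an encryption key can be derived from a passcode. It is not Apple’s actual implementation (which, among other things, entangles the passcode with keys fused into the device’s hardware); it simply illustrates the principle that the unlocking key comes from a secret only the user knows.

```python
# A minimal, hypothetical sketch (standard library only) of deriving an
# encryption key from a user's passcode. Anyone who does not know the
# passcode (including the device maker) cannot re-derive the key.
import hashlib
import os

def derive_key(passcode: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Stretch a short passcode into a 256-bit key via PBKDF2-HMAC-SHA256.

    The high iteration count makes every guess expensive, which is what
    turns a short numeric passcode into a meaningful brute-force barrier.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

salt = os.urandom(16)             # stored on the device; random but not secret
key = derive_key("123456", salt)  # only the passcode holder can reproduce this
print(key.hex())
```

The real system also adds escalating delays, and an optional auto-erase, after failed guesses, and it is precisely that protective machinery the FBI’s court order asks Apple to help bypass.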

The FBI’s experts, on the other hand, understand both HOW it works and THAT it works, and now — through a court order compelling Apple to break the security of one iPhone, the one the FBI is holding that belonged to San Bernardino terrorist Syed Rizwan Farook — they want Apple to show them how to make it NOT work.

Easy, right? People like Donald Trump want you to believe so. But, as with almost everything, the reality of what the FBI is asking here is far more complex than many want you to think. Digital forensics expert Jonathan Zdziarski explains the whys of that complexity in this sober and well-reasoned blog post better than almost anybody else whose analysis on this case I’ve read.

Consider too that U.S. Congressman Ted Lieu (D-California), who is an actual expert on computer science (one of only four in Congress), has come out on Apple’s side in this case, arguing that “weakening our cyber security is not the answer.” As a computer expert, Lieu understands that forcing Apple to break the security of this one iPhone threatens the potential security of all of our devices … and not just through snooping by the FBI alone. This move lets a genie out of the bottle that can’t be put back in, and there are many other actors — repressive governments, hackers, thieves, terrorists and so on — who will be eager to call on that genie for their own purposes.

Asserting that this isn’t so, and suggesting that this is an “easy” and one-time-only request for Apple, sounds sensible if you don’t understand — or choose to ignore — the complex nature of today’s technological systems, software, encryption methods and security applications. And, because things ARE so complex, most people don’t fully understand. Even people who think they do because they understand a little bit of technology (this Salon Q-and-A is a good example of that) don’t get it in the way that the technology experts get it.

The problem is that the people who don’t really get it also include legislators who make laws on such issues, judges who rule on such issues and politicians who see opportunity in exploiting those issues.

What’s the solution? Better education is definitely one answer: if more people learned even basic science and coding, they’d be less susceptible to junk science and junk tech talk. Another answer lies with better communication by the people who best understand the ins and outs of our complex technologies; we need more voices who can speak clearly and compellingly to the public about tech, the way Carl Sagan did and Neil deGrasse Tyson does today for science.

It’s important to remember that, while “easy” is tempting, dealing with complexity — in technology, in science, in medicine, in life — is hard. We need to embrace that difficulty, not look for easy fixes where none exists.

To be human is to migrate

To be human is to migrate. More than any other species, we are migratory creatures.

Since Homo sapiens first appeared on the ancient savannas of Africa, we have wandered, spreading in fits, starts and great migratory waves across and into every continent on the planet. Even Antarctica is today stamped indelibly with the evidence of our existence.

The very first migrations were unhampered by competing cultures or civilizations: the only obstacles, and there were plenty, were other species — dire wolves and smilodons and cave bears and rampaging mastodons — and the vicissitudes of water, wind and weather. But as our species ranged the Earth, settling in hospitable corners wherever we could find them, the chances of one wandering group encountering another already settled grew. It’s easy enough to imagine that the encounters did not always end well, and archaeology has confirmed such imaginings.

Mass graves in Europe from the Neolithic period tell grisly tales of what happened when one band of humans came into another band’s space: broken legs, smashed skulls, whole villages wiped out. When every day’s survival was so tenuous, when every precious morsel of food was so hard to come by, people — like any other species, really — were not inclined to be hospitable to perceived competitors.

Today, so many people in so many places are rallying for their borders to be shut, for fences and walls to be erected, for their kind to put a stop to the “hordes” and “swarms” of others who are not their kind. In the U.S., some counter these arguments by pointing out that — unless you are a Native American — you too are descended from people who came from somewhere else. Go back far enough in time, though, and even the Native Americans “came from somewhere else,” walking to a new land from Siberia across a land bridge that no longer exists. And those ancient Siberians, likewise, came from other parts and — like all of us — can trace their origins back to those first Homo sapiens on the savanna.

Anywhere beyond the boundaries of that cradle of humanity, we are all — if we’re honest with ourselves — refugees.

The truth matters… because it always wins

Economists and laptop philosophers still argue over their varying interpretations of John Maynard Keynes’ 1923 comment in “A Tract on Monetary Reform”: “In the long run we are all dead.” But the non-economic-theory, real-world validity of his statement is indisputable: No one here — as Jim Morrison observed — gets out alive.

Reality, on the other hand, survives everything. Call it what you will — the laws of nature, cold hard facts, the truth — there are certain things on this planet and in this universe that prevail because that’s the way things are. No matter what “narratives” or “spin” we try to impose on such reality, no matter what religious, political, ideological or philosophical lens we try to understand it with, actual reality is impervious to our efforts. The best we can aim for is to understand that reality for what it in fact is, and to find reality-based ways to deal with it to make our existence better rather than worse.

Nobody benefited, for example, from labeling AIDS (or, for that matter, Ebola or the plague or leprosy or ergot poisoning) as a “punishment” or a “judgment” by God for some perceived sin or moral failure by the person suffering from it. Our collective existence improved only through our efforts to understand those maladies for what they in reality were … and then working to find real solutions to treat them.

It’s arrogance to believe anything else.

Yes, our experience of reality is colored by each of our unique perceptions and beliefs. And that understanding can drive us to do many things that either help or harm ourselves and others. But we do not, as Karl Rove reportedly told New York Times Magazine writer Ron Suskind in 2004, “create our own reality.” That type of so-called reality exists only in our own heads (and, oh boy, can that mental reality mess us up — see, for example, “mass psychogenic illness”). The reality of the real world — sooner or later — always comes crashing down on that mental simulacrum.

That’s why we would be wise to, for instance, approach the “existential” threat posed by terrorist movements today with a sober assessment of what solutions have been shown to work best with similar threats in the past. (And, yes, there have always been similar threats in the past: while the capabilities for mass destruction and obscene amounts of human carnage have advanced, tragically, along with our advancing technologies, the fundamental motivations driving such movements haven’t changed much over the centuries.) Studies of the most effective real-world solutions — such as a 2008 RAND analysis of the eventual fate of 648 terrorist groups between 1968 and 2006 — have found that police work, intelligence and even political talks have had much more success than going to “war.”

“All terrorist groups eventually end,” the RAND study stated. “The evidence since 1968 indicates that most groups have ended because (1) they joined the political process or (2) local police and intelligence agencies arrested or killed key members. Military force has rarely been the primary reason for the end of terrorist groups, and few groups within this time frame achieved victory.”

In that light, it should be clear that actions based on emotion, hidden agendas and propaganda are not the best way forward. We should all try to remember that the next time some leader or would-be leader tries to get the public fired up for some costly and painful battle of “good” versus “evil.”

Similarly, we as a species would do ourselves — and many other species with which we share this planet — a favor by acknowledging there are other truly existential threats that can’t be wished or spun or shouted away just because some of us don’t like the societal implications of cold hard facts.

Fact: global atmospheric concentrations of carbon dioxide have been rising steadily since the start of the Industrial Age.

Fact: that extra carbon dioxide has come largely from human activities, ranging from coal-fired power and the internal combustion engine to the skyrocketing numbers of animals we raise and kill for meat.

Fact: carbon dioxide — like other increasingly abundant, human-generated gases such as methane and nitrous oxide — contributes to a planetary greenhouse effect whose mechanics have been well understood by scientists for more than a century (see the formula just after these facts).

Fact: the global average temperature has been rising for well over 100 years now.
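One grounding illustration of just how old and settled that physics is (my addition, not part of the original list of facts): the extra heating push, or radiative forcing, from added carbon dioxide grows roughly with the logarithm of its concentration, a relationship Svante Arrhenius worked out back in 1896. A commonly used modern approximation is:

```latex
% Approximate radiative forcing from a change in CO2 concentration,
% where C is the current concentration and C_0 a pre-industrial reference.
\Delta F \;\approx\; 5.35 \, \ln\!\left(\frac{C}{C_0}\right) \ \mathrm{W\,m^{-2}}
% Worked example: C = 400 ppm, C_0 = 280 ppm
% gives \Delta F \approx 5.35 \times \ln(400/280) \approx 1.9 \ \mathrm{W\,m^{-2}}
```

Roughly two extra watts over every square meter of the planet’s surface, day in and day out, is the physical reality behind the temperature trend in the last fact above.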

Whether we accept those facts wholeheartedly — or try tooth-and-nail to deny them — doesn’t matter to the facts. The reality is that those facts, unless we choose to act on them in a reality-based way, will have — already are having — many impacts on our planetary system that will radically change the relatively stable climate that humanity evolved in. We have a choice in what the future reality will be, but — whatever we choose to do — that reality will come crashing down on us all the same.

Reality has a way of doing that.

The next worldview revolution

In one way, it’s a bit unfair to harshly blame a company like Exxon for doing what investigative reports say it did: work hard for years to promote climate change denialism even as its own scientists had concluded that climate change was both real and dangerous.

Exxon and others that have so vigorously muddied the waters of public sentiment on the need for climate action recognized the science for what it was: a worldview-shattering truth. And such truths have always tended to be welcomed at first with an impulse to kill the messenger.

They also have tended to need time — lots of it — to percolate through society’s belief defenses and eventually be accepted by a critical mass of people. Think about the Copernican view of the solar system, or the ideas about individual agency and human governance that emerged during the Enlightenment. These worldviews took time to win over a meaningful proportion of society, in particular, society’s leaders and decision-makers.

Or look at an idea like the Big Bang, which even now is questioned by some supposedly smart and well-educated people like Republican presidential wanna-be Ben Carson.

The problem with climate change, though, as opposed to so many of those other worldview revolutions of the past, is that we probably don’t have the amount of time other shattering ideas needed to win acceptance. The implication of the science of climate change, as Exxon and other active deniers have intuitively recognized, is that the system of incentives and rewards that has worked so well for us up until now — that is, capitalism — is now revealed as inadequate to the challenges of today. That means it needs to be replaced by a different system… and a lot of people really, really don’t want that.

Imagining a better internet

“O what a tangled web we weave, when first we practice to deceive.”

I was a voracious reader and super-curious kid all through school: I would have been over the moon if I had woken up one morning as a 12-year-old to discover a magic typewriter on my desk that could help me find the answer to almost any question imaginable, show me photos and moving pictures of distant lands and planets, let me listen to spoken languages from anywhere in the world and almost instantly learn about news events anywhere on the globe.

That vision was the promise of the internet at its best. While we do have all those good things today, we have more besides… much of it vile or cynical, manipulative or vicious, intrusive or inflammatory, wicked or deceitful. In a way, it’s a digital near-simulacrum of the physical-world human condition… as if every person, corporation and other institution on the planet had a three-dub doppelganger.

But it’s not, really. The Web remains a playground, library, town commons or battlefield for only a select portion of humanity… many still have no voice on it — not yet, at least — and even most of those who do hold little sway and command little interest.

Big-data hubris makes us lose sight of real people

It was bad enough when workers were first labeled “human resources.” It became even more disturbing with the more recent term, “human capital” … as if the many individuals who work for any one company amount to little more than numerical entries in a vast, digital ledger somewhere.

Now, the dizzying advances in big data and analytics are making it easier every day to categorize human beings — their online browsing histories, purchasing histories, search habits and responses to carefully calculated web-based stimuli, positive or negative — as little more than data points on a vast spreadsheet. Minuscule blips in a database. A rounding error, even. And when that starts becoming habitual, strange and worrisome things can happen.

Things like militarized “social science” that, with the vast amounts of data now at its disposal, can comfortably characterize social movements and public-led demands for change as “social contagions.”

Things like Facebook’s — and academia’s — recent study of the impacts of positive or negative “emotional contagion,” conducted on nearly 700,000 online subjects without their knowledge or consent.

And things like Facebook data scientist Andrew Ledvina’s description of “all of this hubbub over 700K users like it is a large number of people.”

In the real world where people still daily participate in one-on-one, face-to-face personal encounters and expect to be treated as unique individuals, that comment comes off as extraordinary hubris and conceit. In the real world, 700K (700,000) people is a mind-bogglingly large figure. It’s thousands of times greater than Dunbar’s number, the theoretical limit (estimated at anywhere from 100 to 250) on the number of people with whom any one individual can maintain stable social relationships.
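A quick back-of-the-envelope check of that “thousands of times” claim, using the essay’s own range of estimates for Dunbar’s number (the arithmetic is mine):

```latex
700{,}000 / 250 \approx 2{,}800
\qquad\text{and}\qquad
700{,}000 / 100 = 7{,}000
```

At either end of the estimate, 700,000 people really is thousands of times more people than any one of us can meaningfully know.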

In a less abstract realm, it’s a far greater number of customers than many small-business owners could ever hope to serve. And it’s way more than the greatest number of students even the most dedicated and hard-working teacher could expect to personally work with in a real-world, physical classroom over a lifetime.

It’s only in the rarefied air of today’s massive online social networks and equally massive global corporations that anything that affects 700,000 people could be described as a tempest in a teapot.

When a scientific research project messes with the head of even one person unknowingly, and without his or her approval, it is a very big deal. And when 70 or 7,000 or 700,000 people speak out publicly, rally in groups or sign petitions in an effort to change things they believe should be changed, it’s not a “contagion.”

The day you lose sight of the fact that those 70 or 7,000 or 700,000 data points represent living, breathing, individual human beings who could be someone’s sister, son or neighbor — that is, someone with rights and responsibilities equal to your own — is the day you start down a very slippery and dangerous slope.

Digital journalism: A cautionary tale

The brave new world of digital journalism can be more dystopian than utopian, as author Tony Horwitz recently discovered. In his op-ed in The New York Times, Horwitz recounts the sorry tale of months of hard, unpaid work that all but disappeared down the drain.

“As recently as the 1980s and ’90s, writers like me could reasonably aspire to a career and a living wage,” he writes. “I was dispatched to costly and difficult places like Iraq, to work for months on a single story. Later, as a full-time book author, I received advances large enough to fund years of research. How many young writers can realistically dream of that now?”

Book-writing and book-publishing have always been challenging enterprises, with thousands of authors languishing with low pay and low sales — if they were lucky enough to be published at all — for every big-name best-seller who made millions. If there’s any lesson to be learned from Horwitz’s experience, it’s that digital publishing and the potential of quickie e-books haven’t necessarily changed that reality for most writers.