Recently, I’ve seen far too many attempts to distinguish “good” news sources from “fake” and biased ones through the use of blacklists, usually long lists of news sources identified as suspect with no explanation or evidence provided, and often in a suspiciously even mix of left-wing and right-wing sites (i.e., “both-siderism” clearly at work).
This is altogether the wrong way to help people determine which news reports are most reliable. For one, as seen with Twitter trolls, identifying one by name might lead to that account being suspended, but the same person can quickly and easily set up another new account with a completely different name and start dishing out the same bile again with little interruption. Distinguishing good from bad with static lists is, to mix metaphors, a Sisyphean task of whack-a-mole.
Another problem: “good” news sources can — and have — disseminated “bad” and “fake” news: See, for example, the rush to drum up the case for the 2003 U.S./coalition invasion of Iraq. Many so-called mainstream and respectable news outlets — most of them, actually — swallowed the Bush administration’s arguments hook, line and sinker… and repeated them with few questions. Their reporters and editors kept their jobs. On the other hand, TV host Phil Donahue, who expressed his opposition to and doubts about the case for war, was fired from MSNBC. Jingoistic and unskeptical reporting back then is to blame for the many who still today believe Iraq really had weapons of mass destruction before the invasion… It didn’t.
No, what’s needed instead of lists is a method for helping readers (and media members themselves) better judge the quality of the information they find on any site. In other words, we need an evidence-supported means of establishing provenance for each purported fact reported in an article, wherever it is. Call it the scientific method for journalism.
What would that look like? That’s something I intend to explore on these pages in days and weeks to come. Stay tuned.
Funny thing about “fake news” and propaganda: no matter who spreads the misinformation or how widely it’s believed, the “reality-based world” and rules of nature still ultimately prevail.
No matter how many people wanted to blame medieval Europe’s Black Death on Jews, Romani or lepers, scientists today understand that the pandemic was caused by the bites of infected fleas carried by rats.
No matter how many government leaders in the U.S. and elsewhere chose to believe that reports about concentration camps and gas chambers were just war propaganda, millions died just the same.
No matter how much the Westboro Baptist Church says AIDS is a divine punishment for homosexuality, the disease also afflicts straight people — not to mention children — without discrimination.
And no matter how desperately climate-change deniers and those who profit from fossil fuels want to muzzle research linking higher carbon dioxide levels to rising average global temperatures, ocean acidification and a raft of other harmful consequences, CO2 will keep accelerating the greenhouse effect the more of it we pump into the atmosphere. Because science.
The problem is, as with so many other examples from history where reality-averse, ignorant or Machiavellian humans tried to persuade their followers that the sky was not up and the ground wasn’t down, hostility to facts can leave a lot of damage — and a lot of bodies — in its wake.
This is the reality-based world: The greater the concentrations of carbon dioxide and methane in the atmosphere, the more of these molecules there are to absorb and re-emit the infrared radiation rising from Earth’s surface. The more carbon dioxide taken up by the oceans, the lower seawater’s pH falls, and the more susceptible corals, shellfish and other calcium-carbonate-producing lifeforms become to dissolving.
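The ocean-chemistry point is simple logarithmic arithmetic: pH is the negative base-10 logarithm of hydrogen-ion concentration, so even a small drop in pH means a large rise in acidity. A rough sketch (the ~0.1-unit decline since pre-industrial times is an approximate, widely cited figure):

```python
# pH = -log10([H+]), so a pH drop of d units multiplies the
# hydrogen-ion concentration by 10**d.
ph_drop = 0.1  # approximate decline in ocean surface pH since pre-industrial times
acidity_increase = 10 ** ph_drop - 1
print(f"{acidity_increase:.0%} more hydrogen ions")  # roughly a 26% increase
```

A tenth of a pH unit sounds trivial until the logarithm is unwound; that is exactly the kind of fact that does not care whether we find it inconvenient.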
The 11th-century Scandinavian King Canute, the old story goes, understood the powers of nature and reality when he demonstrated to his courtiers that even he — a powerful ruler at the time — couldn’t order the ocean tide to stop getting his feet wet. In historian Henry of Huntingdon’s telling of the tale a century later, Canute was reported to have said, “Let all men know how empty and worthless is the power of kings, for there is none worthy of the name, but He whom heaven, earth, and sea obey by eternal laws.”
Yet here we are, a millennium later, hearing new kings — would-be, imagined and otherwise — asserting the exact opposite: that they are not bound in any way by the laws of heaven, earth and sea… or reality in general.
How do scientists, fact-checkers and others with deep expertise deal with this, the election of a profoundly incurious president with little respect for the truth and aggressive hostility to views that differ from his? What, now, do the real experts do to try to preserve society’s grasp on reality?
Finding answers that work is critical now more than ever. Because our world is an inordinately complicated puzzle of countless pieces: seven billion-plus humans, anywhere from 20 to 75 billion Internet-connected devices and sensors (everything from your computer and smartphone to your smart TV and the security surveillance cameras in your stores, workplaces and other locations) and an uncountable number of other living plants and creatures that share this planet with us. And the world is growing only more complicated with every passing day.
It’s not just the threats to polar bears and indigenous villages in the Arctic, as if those are the only impacts climate change could have. (Hint: Think impacts on hunting and fishing, “sunny-day flooding”, drought and wildfires, crop failures and more regional conflicts such as the years-long bloodbath we’ve been seeing in Syria).
It’s also massive hacks that can take down major websites like Amazon, Reddit and Netflix and are now possible through unregulated vulnerabilities in the fast-expanding Internet of Things.
It’s artificial intelligence and machine learning that are advancing so quickly even the experts have a hard time keeping up, much less the public officials who are tasked with updating regulations on information technologies.
It’s the growing risk of antibiotic resistance, which one study predicts could cause 10 million deaths every year by mid-century… more than cancer causes. Meanwhile, other diseases that could spread widely in the right (or wrong) conditions — Ebola, Zika, chikungunya, etc. — remain extremely difficult to treat, and leave lasting and devastating impacts on survivors and/or their babies.
Candidate Donald Trump asserted, all evidence to the contrary, that he knows “more about ISIS than the generals do”. Pre-Brexit, Britain’s justice secretary Michael Gove pooh-poohed the vast number of economists warning about the potential fallout of an EU exit, by declaring that citizens “have had enough of experts”.
But it’s certain that, were Trump or Gove or anyone else who believes such things to become seriously ill, each of them would consult an expert — a trained medical specialist — to seek relief and cures (though, admittedly, Trump has made some unusual choices in this realm). We know they depend upon trained technicians to maintain and update their websites and social media presence. They do not tap random strangers on the shoulder on the tarmac to pilot their airplanes to their next destination. In so many things, like the rest of us do, they depend upon the knowledge and expertise of experts.
But not now? To help guide the most momentous decisions possible that will affect millions and billions of other human lives? For these decisions, Trump has already made it clear he prefers a rogues’ gallery of “leaders” who have failed at the basic tasks of leadership. Throughout his campaign and long before, he also made it clear that he has little understanding of the realities of climate change, the value of vaccinations, the true challenges of cybersecurity (despite the fact he apparently benefited greatly from aggressive Russian cyber-shenanigans), etc.
The system is broken. This is indisputably true. But no one thing or person broke it.
Yes, the Republican party has been sowing the seeds for this day for a very long time. Nixon’s Southern Strategy helped. So did the Powell Memo. And Reagan’s 1980 “states’ rights” campaign kickoff speech outside the Mississippi town of Philadelphia, where three young civil rights workers were murdered in 1964. Dog whistles and outright appeals to racist thinking set the stage for this moment.
But so did a string of damaging-to-democracy Supreme Court rulings, from the 2010 decision in Citizens United v. FEC to Chief Justice John Roberts’ decision to join the 2013 5-4 ruling in Shelby County v. Holder, which set loose a line of tumbling-domino laws across numerous U.S. states that aggressively reduced the ability of some citizens (of color, typically) to vote. (Esquire’s ever-brilliant Charles P. Pierce traces the origins even further back, to the 6-3 2008 decision in Crawford v. Marion County Election Board et al. that enabled Indiana to begin clamping down on the franchise.)
There was also the decision in 1987 — by a Reagan-appointee-dominated Federal Communications Commission — to abolish the Fairness Doctrine, which required broadcasters to adequately cover issues of interest to the public. And, of course, the 1988 withdrawal of the non-partisan League of Women Voters as sponsors of the presidential debate, following collusion between both candidates (George H.W. Bush and Michael Dukakis) to take control over the choice of debate questioners, press access, audience composition and more.
The rise of, first, cable TV and then the Internet played a part as well. Both helped simultaneously to suck newspapers dry of the ad revenues that had sustained print journalism for decades, and to continually speed up and cheapen the quest for “news,” turning it from a recognized public good into a commodity that could be sliced, diced and sold to the highest bidder. These developments led eventually, inexorably, to the rise of things like large numbers of Macedonian spammers creating Trump-friendly “news” sites to rake in click-generated bucks through Facebook ads. Of course, Rupert Murdoch and Fox News also have much to answer for. The late and much-missed Mike Royko saw that one coming many, many years ago.
And then there were the current election cycle’s Neville Chamberlains — so, so many of them — eager to appease a click-generating, eyeball-grabbing candidate for any number of reasons, just to ensure the clicks and the eyeballs kept coming, or to secure themselves a place at the trough of power down the road. It’s clear why the Chris Christies, Rudolph Giulianis, Newt Gingriches and others like them did it. But how might it pay off (or not) for the likes of Paul Ryan, John McCain, John Thune, Deb Fischer and so many more who unendorsed and then cravenly endorsed again?
There were craven, ratings-obsessed entertainers too, like the playfully hair-ruffling Jimmy Fallon.
For some in the media, meanwhile, it was the labor-affirming desire to keep the 2016 contest a “horserace,” no matter how unequal the two candidates’ qualifications actually were. For others, it was journalism’s perverse faith in what Jay Rosen has called the “view from nowhere,” which held that reporters need to stay impartial by not taking sides, even if one source claims the Sun revolves around the Earth (demonstrably false) while the other states the scientifically established fact that the Earth revolves around the Sun (see: pretty much every mainstream news article on the science of climate change). There was also the press’ desire for “refuge,” along with the all-too-frequently observed sin of false equivalence, which said if Candidate A’s supporters yelled “Boo, hiss!” at Candidate B, that was just as bad as Candidate B’s supporters yelling the Nazi slur “Lügenpresse” at reporters. Sadly, I count the usually excellent Matt Taibbi among the recently guilty here.
The mainstream media’s late-to-the-party dedication to fact-checking was welcome, but — by then — pathetically tardy and inadequate for the task at hand. By the time the New York Times’ Dean Baquet, among others, realized they really needed to point out to readers when something was “just false,” the U.K.’s pro-Brexit team had already helped win the day in Britain with the help of declarations from the likes of Michael Gove that the “people in this country have had enough of experts.”
A large number of people in the U.S. have now apparently agreed with this, throwing into jeopardy everything from meaningful action on the well-established science of climate change, a concept barely mentioned in mainstream election coverage, to fact-based strategies for securing cyber space and the ever-expanding Internet of Things.
So, yes, examining the 2016 election results in the light of day clearly identifies plenty of people and practices to blame beyond those in the Trump camp alone. The question now is, where do we go from here? To once again begin assigning a fair value to facts, truth, equity and, for goodness’ sake, fairness, we need to dig deeper and think bigger. Building an atmosphere in which most of us can once again agree that there are basic facts and fundamental descriptions of reality will take nothing short of a large-scale truth-and-reconciliation campaign across the country. No side in this debacle of an election can afford to give up on this goal. From equal rights and human rights to digital security and a livable climate, the stakes are too high to surrender now.
The 2016 presidential race has, at long last, illustrated the clear and urgent need for journalists to be more than stenographers or writers of milquetoast, “he-said-she-said” missives that regardless won’t satisfy the true believers of Fox News’ “fair-and-balanced” fiction. The election suddenly elevated the importance of fact checking and, even, for the still-sadly small number of journalism’s boldest, to call a lie a lie.
Unfortunately, it required the unrelenting and nightmarish ugliness of an orange-faced, hate-filled, grotesquely unqualified GOP nominee to get to that point, when, ideally, journalism was supposed to have started from that position in the first place.
Yes, hack reporters and yellow journalism have been with us throughout the modern era. But the gold standard for reporting — truth-telling, shining a light on the darkness, comforting the afflicted and afflicting the comfortable — has been clear from the start nonetheless.
And yet still, even some at the very top have seemed to need reminding. Why, for example, did it take Trump to refresh The New York Times’ memory on “how to write the paragraph that said, ‘This is just false,’” as executive editor Dean Baquet put it recently in an interview? Why should that ever have been a struggle?
Short answer: It shouldn’t have been.
Anyway, here we are, with “fact checking” suddenly in vogue, at least in some circles. But you know what? Fact checking, while important, isn’t enough and won’t be enough going forward into what looks to be a difficult future. Because it’s always been the case for some people that the facts don’t matter.
“Maybe we didn’t know the details,” said Burt Lancaster’s character Ernst Janning in the 1961 film, “Judgment at Nuremberg.” “But if we didn’t know, it was because we didn’t want to know.”
And therein lies the problem with an emphasis on fact checking alone: For various other reasons, the facts — no matter how many you might assemble and muster to your defense — simply won’t move the needle of opinion among some (perhaps even most) people. The underlying cause of this phenomenon is something called “motivated reasoning.”
Here’s how Yale law and psychology professor Dan Kahan described it in 2011, using the related term, “motivated cognition”:
“[M]otivated cognition refers to the unconscious tendency of individuals to fit their processing of information to conclusions that suit some end or goal. Consider a classic example. In the 1950s, psychologists asked experimental subjects, students from two Ivy League colleges, to watch a film that featured a set of controversial officiating calls made during a football game between teams from their respective schools. The students from each school were more likely to see the referees’ calls as correct when it favored their school than when it favored their rival. The researchers concluded that the emotional stake the students had in affirming their loyalty to their respective institutions shaped what they saw on the tape.”
Scientists who study the way the human mind works understand that we are not fundamentally rational thinkers, however much we might believe we are. So while fact-checking is a good feature for any journalistic outlet — and I hope it continues — going forward after the election, fact-checking alone won’t be enough. Considering the brink we’ve been pushed to as a society during this election, we have a responsibility to do more.
Is it just me, or has the dramatic shift from reading on paper to reading on a screen — and, even more dramatically, to reading on a mobile screen — had a detrimental effect on written logic, reasoning and rhetorical skills? I’ve increasingly come to think so, especially with every online feature or thinkpiece I give up on or, after slogging all the way through, walk away from wondering, “What the heck was that all about? What exactly was the writer trying to say?”
Of course, this happens regularly on social media like Twitter, where a large number of comments are offered up for consumption with little or no context, leaving the Tweeter’s intent a mystery.
But I’ve been noticing it more and more on online news and magazine features: there are a whole lot of words and time devoted to making an argument… I guess. But what that argument is, I have no idea.
Ask pretty much anyone who’s been around long enough to see a few U.S. election cycles in action, and they’ll probably agree with the statement, “Politicians lie.” And there’s plenty of evidence from the past few months alone to prove that’s true.
Lies can be useful to a would-be leader, especially one with Machiavellian tendencies. Truth-telling, on the other hand, is more generally respected — in the abstract, anyway — as an admirable quality. In reality, though, truths are often treated as “inconvenient,” politically incorrect or downright unwelcome.
One example of this that’s recently gotten much (belated) attention is the powers-that-be reception given to It’s Even Worse than It Looks: How the American Constitutional System Collided with the New Politics of Extremism, a book by the Brookings Institution’s Thomas E. Mann and the American Enterprise Institute’s Norman J. Ornstein. The 2012 book made a strong case that, among other things, the mainstream political “wisdom” that the Republican and Democratic parties were equally dysfunctional, was a fiction. Of the two key sources of U.S. leadership woes, Mann and Ornstein wrote, one was the fact that “one of the two major parties, the Republican Party, has become an insurgent outlier — ideologically extreme; contemptuous of the inherited social and economic policy regime; scornful of compromise; unpersuaded by conventional understanding of facts, evidence, and science; and dismissive of the legitimacy of its political opposition.”
In other words, Mann and Ornstein argued, both parties were not equally to blame for the mess in Washington: the Republican party had alone jumped the shark and gone mad.
The reception to this premise in the political and media halls of power: nothing but the chirp of crickets and a collective turning of heads to look anywhere but at the truth unmasked. The top political shows aired every Sunday suddenly appeared to “lose” Mann and Ornstein’s phone numbers.
To anyone (like me) who remembers what daily life was like in pre-Internet/pre-mobile phone days, the technological innovations of the past 20 years or so appear — no, are — mind-blowingly wonderful. And new developments just on the horizon — artificial intelligence in particular — sound almost magical in their potential capabilities.
Indeed, to the vast majority of us who don’t really understand how the cellphones in our pockets, the tablets on our laps or the apps on our mobile devices work, these technologies are practically magic in action. For the most part, we don’t question what complex inner workings enable them to let us do the things we do … we’re just glad we can do them.
But for all the benefits our tools and gadgets bring us today, their built-in and opaque-to-most-of-us complexity also brings danger. Our growing ignorance about how the things we rely on work makes us vulnerable to being misled by people who actually do understand their complexity … or to becoming marks for people who exploit that ignorance.
You don’t have to look hard to find examples of that danger today. Look, for example, at the battle now coming to a boil between Apple and the FBI.
Apple has made customer security via end-to-end encryption a cornerstone of its business model. The company’s message, since iOS 8 was released, has been: “Your iPhone’s data will be safe from hackers, thieves and even us, because you and only you hold the password to unlock it.” Implicit in that message is the underlying idea that “you don’t have to understand HOW iPhone security works, just THAT it works.”
The FBI’s experts, on the other hand, understand both HOW it works and THAT it works, and now — through a court order compelling Apple to break the security of one iPhone, the one the FBI is holding that belonged to San Bernardino terrorist Syed Rizwan Farook — they want Apple to show them how to make it NOT work.
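The security model at stake here, in which only the holder of the password can ever recover the data, can be sketched in miniature. To be clear, this is a toy illustration and not Apple’s actual implementation: the key-stretching parameters are arbitrary, and the XOR “cipher” merely stands in for real encryption such as AES.

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes, length: int = 32) -> bytes:
    """Stretch a passcode into an encryption key. Without the passcode,
    the key cannot be reproduced. (Toy parameters, for illustration.)"""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000, dklen=length)

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'encryption' -- NOT secure; stands in for a real cipher."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

salt = os.urandom(16)
secret = b"contacts, photos, messages"
key = derive_key("user-passcode", salt)
ciphertext = xor_cipher(secret, key)

# Only the correct passcode regenerates the key that unlocks the data:
assert xor_cipher(ciphertext, derive_key("user-passcode", salt)) == secret
# A wrong guess produces a different key, and therefore garbage:
assert xor_cipher(ciphertext, derive_key("wrong-guess", salt)) != secret
```

The point of the sketch is that there is no side door in the math itself: weakening the scheme for one investigator means weakening it for everyone, because the same design protects every device.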
Easy, right? People like Donald Trump want you to believe so. But, as with almost everything, the reality of what the FBI is asking here is far more complex than many want you to think. Digital forensics expert Jonathan Zdziarski explains the whys of that complexity in this sober and well-reasoned blog post better than almost anybody else whose analysis on this case I’ve read.
Consider too that U.S. Congressman Ted Lieu (D, California), who is an actual expert on computer science (one of only four in Congress), has come out on Apple’s side in this case, arguing that “weakening our cyber security is not the answer.” As a computer expert, Lieu understands that forcing Apple to break the security of this one iPhone threatens the potential security of all of our devices … and not just through snooping by the FBI alone. This move lets a genie out of the bottle that can’t be put back in, and there are many other actors — repressive governments, hackers, thieves, terrorists and so on — who will be eager to call on that genie for their own purposes.
Asserting that this isn’t so, and suggesting that this is an “easy” and one-time-only request for Apple, sounds sensible if you don’t understand — or choose to ignore — the complex nature of today’s technological systems, software, encryption methods and security applications. And, because things ARE so complex, most people don’t fully understand. Even people who think they do because they understand a little bit of technology (this Salon Q-and-A is a good example of that) don’t get it in the way that the technology experts get it.
The problem is that the people who don’t really get it also include legislators who make laws on such issues, judges who rule on such issues and politicians who see opportunity in exploiting those issues.
What’s the solution? Better education is definitely one answer: if more people learned even basic science and coding, they’d be less susceptible to junk science and junk tech talk. Another answer lies with better communication by the people who understand best the ins and outs of our complex technologies; we need more voices who can speak clearly and compellingly to the public about tech, as Carl Sagan did and Neil deGrasse Tyson does today for science.
It’s important to remember that, while “easy” is tempting, dealing with complexity — in technology, in science, in medicine, in life — is hard. We need to embrace that difficulty, not look for easy fixes where none exists.
To be human is to migrate. More than any other species, we are a uniquely migratory creature.
Since Homo sapiens first appeared on the ancient savannas of Africa, we have wandered, spreading in fits, starts and great migratory waves across and into every continent on the planet. Even Antarctica is today stamped indelibly with the evidence of our existence.
The very first migrations were unhampered by competing cultures or civilizations: the only obstacles, and there were plenty, were other species — dire wolves and smilodons and cave bears and rampaging mastodons — and the vicissitudes of water, wind and weather. But as our species ranged the Earth, settling in hospitable corners wherever we could find them, the chances of one wandering group encountering another already settled grew. It’s easy enough to imagine the encounters did not always end well, and archaeology has confirmed such imaginings.
Mass graves in Europe from the Neolithic period tell grisly tales of what happened when one band of humans came into another band’s space: broken legs, smashed skulls, whole villages wiped out. When every day’s survival was so tenuous, when every precious morsel of food was so hard to come by, people — like any other species, really — were not inclined to be hospitable to perceived competitors.
Today, so many people in so many places are rallying for their borders to be shut, for fences and walls to be erected, for their kind to put a stop to the “hordes” and “swarms” of others that are not their kind. In the U.S., some counter these arguments by pointing out that — unless you are a Native American — you too are descended from people who came from somewhere else. Go back far enough in time, though, and even the Native Americans “came from somewhere else,” walking to a new land from Siberia across a narrow land bridge that no longer exists. And those ancient Siberians, likewise, came from other parts and — like all of us — can trace their origins back to those first Homo sapiens on the savanna.
Anywhere beyond the boundaries of that cradle of humanity, we are all — if we’re honest with ourselves — refugees.
Economists and laptop philosophers still argue over their varying interpretations of John Maynard Keynes’ 1923 comment in “A Tract on Monetary Reform”: “In the long run we are all dead.” But the non-economic-theory, real-world validity of his statement is indisputable: No one here — as Jim Morrison observed — gets out alive.
Reality, on the other hand, survives everything. Call it what you will — the laws of nature, cold hard facts, the truth — there are certain things on this planet and in this universe that prevail because that’s the way things are. No matter what “narratives” or “spin” we try to impose on such reality, no matter what religious, political, ideological or philosophical lens we try to understand it with, actual reality is impervious to our efforts. The best we can aim for is to understand that reality for what it in fact is, and to find reality-based ways to deal with it to make our existence better rather than worse.
Nobody benefited, for example, from labeling AIDS (or, for that matter, Ebola or the plague or leprosy or ergot poisoning) as a “punishment” or a “judgment” by God for some perceived sin or moral failure by the person suffering from it. Our collective existence improved only through our efforts to understand those maladies for what they in reality were … and then working to find real solutions to treat them.
It’s arrogance to believe anything else.
Yes, our experience of reality is colored by each of our unique perceptions and beliefs. And that understanding can drive us to do many things that either help or harm ourselves and others. But we do not, as Karl Rove reportedly told New York Times Magazine writer Ron Suskind in 2004, “create our own reality.” That type of so-called reality exists only in our own heads (and, oh boy, can that mental reality mess us up — see, for example, “mass psychogenic illness”). The reality of the real world — sooner or later — always comes crashing down on that mental simulacrum.
That’s why we would be wise to, for instance, approach the “existential” threat posed by terrorist movements today with a sober assessment of what solutions have been shown to work best with similar threats in the past. (And, yes, there have always been similar threats in the past: while the capabilities for mass destruction and obscene amounts of human carnage have advanced, tragically, along with our advancing technologies, the fundamental motivations driving such movements haven’t changed much over the centuries.) Studies of the most effective real-world solutions — such as a 2008 RAND analysis of the eventual fate of 648 terrorist groups between 1968 and 2006 — have found that police work, intelligence and even political talks have had much more success than going to “war.”
“All terrorist groups eventually end,” the RAND study stated. “The evidence since 1968 indicates that most groups have ended because (1) they joined the political process or (2) local police and intelligence agencies arrested or killed key members. Military force has rarely been the primary reason for the end of terrorist groups, and few groups within this time frame achieved victory.”
In that light, it should be clear that actions based on emotion, hidden agendas and propaganda are not the best way forward. We should all try to remember that the next time some leader or would-be leader tries to get the public fired up for some costly and painful battle of “good” versus “evil.”
Similarly, we as a species would do ourselves — and many other species with which we share this planet — a favor by acknowledging there are other truly existential threats that can’t be wished or spun or shouted away just because some of us don’t like the societal implications of cold hard facts.
Fact: global atmospheric concentrations of carbon dioxide have been rising steadily since the start of the Industrial Age.
Fact: that extra carbon dioxide has come largely from human activities ranging from coal-fired power and the internal combustion engine to the skyrocketing amounts of animals we raise and kill for meat.
Fact: carbon dioxide — like other increasingly abundant, human-generated gases such as methane and nitrous oxide — contributes to a planetary greenhouse effect whose mechanics have been well understood by scientists for more than a century.
Fact: the global average temperature has been rising for well over 100 years now.
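Those facts can even be tied together with a back-of-the-envelope number. A widely used simplified expression for the extra radiative forcing from rising CO2 is ΔF = 5.35 × ln(C/C0) watts per square meter; the concentrations below are illustrative round figures, not precise measurements:

```python
import math

def co2_forcing(c_ppm: float, c_ref_ppm: float) -> float:
    """Simplified extra radiative forcing from CO2, in W/m^2.

    Uses the widely cited approximation dF = 5.35 * ln(C / C0);
    good enough for a back-of-the-envelope estimate, nothing more.
    """
    return 5.35 * math.log(c_ppm / c_ref_ppm)

# Pre-industrial (~280 ppm) vs. a recent concentration (~415 ppm):
extra = co2_forcing(415, 280)
print(round(extra, 2))  # about 2.1 W/m^2 of added warming influence
```

Roughly two extra watts over every square meter of the planet, around the clock: that is the arithmetic that no amount of denial can repeal.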
Whether we accept those facts wholeheartedly — or try tooth-and-nail to deny them — doesn’t matter to the facts. The reality is that those facts, unless we choose to act on them in a reality-based way, will have — already are having — many impacts on our planetary system that will radically change the relatively stable climate that humanity evolved in. We have a choice in what the future reality will be, but — whatever we choose to do — that reality will come crashing down on us all the same.
Reality has a way of doing that.