October 4, 2017

Conspiracy theories, fake news, and other delusions in American history

Fantasyland: How America Went Haywire, by Kurt Andersen

@@@@@ (5 out of 5)

You will be amazed. In Kurt Andersen’s shocking 500-year survey of US history, Fantasyland, you’ll learn just how truly exceptional America is—and not in a good way.

Who is responsible for “fake news”?

If you think only Donald Trump, Fox News, anonymous online pundits, and Russian hackers have a monopoly on “fake news,” guess again. Andersen relates countless purportedly true accounts of satanic cults, multiple personality disorder, recovered memory, vaccines causing autism, and other once-pervasive delusions aired over the years by ABC and NBC News and other mainstream media. Even that paragon of accurate journalism, The New York Times, has fallen prey to such nonsense from time to time. Is it any wonder, then, that ludicrous conspiracy theories should multiply on the World Wide Web, where any nut can say anything anonymously without fear of contradiction?

Who spreads conspiracy theories?

Equating The New York Times with Breitbart and Russian hackers as purveyors of fake news would be highly misleading. Andersen doesn’t do that. As he notes in another context, “There are different degrees of egregious.” However, he is clear that “fake news” and conspiracy theories are by no means limited to the so-called “Trump voters” pilloried by professional journalists and commentators.

Huge numbers of other Americans have left the realm of rationalism for Fantasyland. Consider Scientology, the antivaccine movement, hysteria about GMO food, alien abductions, homeopathy, and the national missing-children panic of the early 1980s. None of these delusions and conspiracy theories are solely identified with any class, region, or race. And popular New Age gurus such as Marianne Williamson, Deepak Chopra, and Eckhart Tolle, all of whom sometimes spout nonsense, have not attracted notably large followings among the creationist set. Similarly, Oprah, Dr. Oz, Bill Maher, and other popular show business celebrities have promoted delusional beliefs. As the late Senator Daniel Patrick Moynihan famously noted, “You are entitled to your own opinion, but you are not entitled to your own facts.” Unfortunately, as Andersen makes abundantly clear, far too few Americans take that sentiment seriously—and, in that respect, the United States stands out clearly in comparison with all other developed nations.

The religious roots of America’s infatuation with fantasy

Andersen’s account begins early in the seventeenth century with the establishment of English colonies in present-day Virginia and Massachusetts. In both cases, conventional wisdom has it that the search for religious freedom drove early colonists to American shores. That’s only partly true, and only in the case of New England. Andersen explains that the primary motivations for all the earliest European expeditions were visions of gold and the Northwest Passage. And the Puritans—they only later called themselves Pilgrims—who landed south of Boston were in no way motivated by religious “freedom.” They had set out to establish a theocracy intolerant of any religious practices that departed even slightly from the rigid prescriptions of their faith.

However, in Protestantism, with its view that “every man [is] his own priest,” there lurked a fatal flaw in its commitment to conformity: if every man was his own priest, what was to stop believers from inventing their own religions? In fact, as American history clearly shows, that is precisely what has happened over the four centuries since Jamestown and Plymouth Rock. Beginning not long after the landing at Plymouth Rock with Roger Williams and Anne Hutchinson, Americans have demonstrated unending creativity in devising variations, often radical variations, on Christianity, from Mormonism, Christian Science, Seventh-day Adventism, and the Jehovah’s Witnesses to the numberless evangelical Protestant denominations.

In every case, these new belief systems rested on fantasy. And there, Andersen argues, lies the rub. Most Americans seem willing to suspend disbelief to worship on the basis of precepts any self-respecting science fiction writer would reject as improbable. (If you think I’m exaggerating, read The Book of Mormon as written by Joseph Smith, or Going Clear: Scientology, Hollywood, and the Prison of Belief by Lawrence Wright.) Interestingly, Andersen cites studies by scholars at Yale and the University of Chicago that found “the single strongest driver of conspiracy belief [is] belief in end-time prophecies.”

Andersen frequently cites findings from public opinion surveys to telling effect. “Nearly all American Christians believe that Heaven (85 percent) and Hell (70 percent) are actual places,” he writes. Focusing on “the solid majority of Protestants,” he adds that “at least a quarter of Americans . . . are sure ‘the Bible is the actual word of God . . . to be taken literally, word for word.'” And “more than a third of all Americans . . . believe that God regularly grants them and their fellow charismatics magical powers—to speak in tongues, heal the sick, cast out demons, and so on.” Elsewhere, Andersen notes, “According to Pew, 58 percent of evangelicals believe that Jesus will return no later than the year 2050. (And only 17 percent of all Americans said they thought He definitely wasn’t coming back during the next thirty-three years.)” With such beliefs so widely held, fake news and “alternative facts” can be no surprise.

The problem is far broader than fanciful religious beliefs

Fantasyland is far from limited to the religious sources of Americans’ predisposition to fantasy. Andersen regards shopping malls, planned communities, Civil War reenactments and Renaissance Faires, fantasy sports, theme restaurants, People magazine, cosmetic surgery, pro wrestling, computer games, reality TV, and Disney theme parks as other signposts of our infatuation with the unreal and the impossible. It’s difficult to argue with this on a strictly logical basis, and Andersen makes the case well. Yet I find it a stretch too far to imply that such phenomena are in any way equivalent to fantasies such as widespread voter fraud, hysteria about vaccines, and the pernicious practices of Scientology, all of which have real-world consequences and sometimes lead to physical harm and even death. However, Andersen implies that conditioning by these seemingly inconsequential diversions has made Americans peculiarly susceptible to dangerous conspiracy theories.

Another author examined America’s religious history in an excellent recent book, One Nation Under Gods by Peter Manseau, which I reviewed at America’s surprising religious history in a highly readable book. Earlier, I had reviewed two other books with insight about American history: Corruption in America by Zephyr Teachout (Citizens United, bribery, and corruption in America) and Republican Gomorrah by Max Blumenthal (When religion dominated the views of American conservatives).


September 20, 2017

America’s surprising religious history in a highly readable book

One Nation Under Gods: A New American History, by Peter Manseau

@@@@@ (5 out of 5)

If you’ve wondered how anyone could insist that the United States is a “Christian nation” when so many other faiths are practiced within our borders—and so many Americans shun religion entirely—you should enjoy One Nation Under Gods: A New American History by Peter Manseau. In fact, if you yourself believe that claim, it’s even more important that you read this surprising and revealing book. One Nation Under Gods is an ideal companion volume to Howard Zinn’s classic secular history, A People’s History of the United States. Together, the two books provide a well-rounded picture of American history as it really happened, not as we were taught it in high school.

And the perspective advanced in One Nation Under Gods is mirrored in the news: “Last week, in a report entitled America’s Changing Religious Identity, the nonpartisan research organization Public Religion Research Institute (PRRI) concluded that white Christians were now a minority in the US population. Soon, white people as a whole will be, too.”

Surprises galore in America’s religious history

In One Nation Under Gods, Manseau surveys the country’s religious history from the arrival of the Conquistadores and the Puritans to Scientology, the Right-Wing evangelicals, and New Age cults of recent years. To illustrate each of the major belief systems that have flourished (and sometimes faded away) over the centuries, Manseau offers a lively sketch of each faith’s earliest practitioner, or in cases such as the Mormon Church, the founder.

One Nation Under Gods is full of surprises.

  • For instance, did you know that a North African Muslim explored large swaths of what is today the United States nearly a century before the fabled landing at Plymouth Rock—and did so far more extensively than his Spanish Christian masters?
  • That twenty percent of the African slaves brought to America were Muslims?
  • That the theology espoused by Joseph Smith, founder of the Church of Latter-Day Saints, was derived in significant part from the oracular vision of an Iroquois leader named Handsome Lake?
  • That many of America’s leading intellectuals of the nineteenth century—Emerson, Longfellow, Thoreau, Melville, Whitman—were heavily influenced by a traveling Hindu lecturer and the sacred writing of the Hindu faith?
  • Or that the first woman prosecuted as a witch in Salem was a Caribbean Indian woman named Tituba whose practice of her traditional religion induced two young girls in a preacher’s household to imitate her “heathen” practices? (Tituba escaped the gallows. The two girls didn’t.)

I certainly knew none of these things. And these examples merely scratch the surface of the revelations in Manseau’s delightful book. I suspect you’ll learn a lot, too.

Religious intolerance in American history

The central theme in One Nation Under Gods is the extraordinary diversity of religious beliefs that have captivated Americans over the centuries. Manseau provides abundant evidence that religious diversity was a reality from the earliest days of English settlement in America, giving the lie to the rigid conformity demanded by Puritan preachers in seventeenth-century New England. Names familiar to any religious scholar—Anne Hutchinson and Roger Williams, for example—represented only an early stage in the splintering of the Puritan faith. As settlement moved further into the wilderness, more and more distant from the churches of the towns, the lure of Christian faith grew ever dimmer.

However, it might just as well be said that the book is a study of religious intolerance in our history. From John Winthrop’s “city on a hill” to the Chinese Exclusion Act of 1882 and the internment of Japanese-Americans during World War II, the leaders of whichever version of the Christian faith held sway at any one time went to great lengths to stamp out competing belief systems. (Manseau demonstrates that anti-Chinese and anti-Japanese hostility alike were rooted at least in large part in intolerance of the Buddhist religion.) The dogmatism of today’s Right-Wing evangelicals who demand “prayer in the schools” is only one manifestation of this history.

About the author

Peter Manseau is the religion curator at the Smithsonian Institution’s National Museum of American History in Washington, DC. One Nation Under Gods is his fourth book.

You might also enjoy my reviews of two books about Scientology. One is at Scientology revealed in a definitive investigative report. The other: Inside Scientology: set up your own religion, and make a billion dollars. I also reviewed another excellent book of religious history in America: Joseph Smith: the remarkable man who founded the Mormon Church.

September 12, 2017

Historical perspective on “the vast Right-Wing conspiracy”

Democracy in Chains: The Deep History of the Radical Right’s Stealth Plan for America, by Nancy MacLean

@@@@@ (5 out of 5)

In Dark Money, New Yorker staff writer Jane Mayer exposed the dominant role of what Bernie Sanders calls “the billionaire class” behind the rise of the Radical Right in America—what then-First Lady Hillary Clinton famously called “the vast Right-Wing conspiracy.” In Democracy in Chains, Duke University professor Nancy MacLean probes the historical roots of the radical libertarian ideology they profess. Together, the two books deepen our understanding of the misleadingly named “conservative” movement that has come to dominate American politics in the second decade of the 21st century.

The historical roots of today’s “conservative” movement

MacLean’s argument is essentially simple. Dig down to the intellectual roots of today’s Radical Right, she asserts, and you’ll find John C. Calhoun’s spirited defense of slavery in the 19th century and Harry Byrd’s campaign of massive resistance to desegregation in the 1950s and 60s. Contemporary “conservatives” don’t acknowledge the racist roots of their ideology in the “states’ rights” arguments of the past. Instead, they advocate “economic liberty” grounded in a “free market,” citing the work of Right-Wing economists Ludwig von Mises, Friedrich A. Hayek, and Milton Friedman. While acknowledging the influence of these and other intellectuals on what has come to be called the conservative movement, MacLean persuasively argues that the principal figure in the evolution of the ideas at the core of today’s Radical Right is instead a lesser-known Nobel Prize-winning economist, James M. Buchanan.

“The vast Right-Wing conspiracy”

In a detailed exploration of Buchanan’s work at a succession of Right-Wing campuses, chiefly the University of Virginia and George Mason University, MacLean points to his decades-long partnership with Charles Koch and other ultra-wealthy donors as the central thread in the ascendancy of the Right. “In the eventual merger of Koch’s money and managerial talent and the Buchanan team’s decades of work monomaniacally identifying how the populace became more powerful than the propertied,” she writes, “a fifth column movement would come into being, the likes of which no nation had ever seen.”

She characterizes her account as “the utterly chilling story of the ideological origins of the single most powerful and least understood threat to democracy today: the attempt by the billionaire-backed radical right to undo democratic governance . . . a stealth bid to reverse-engineer all of America, at both the state and national levels, back to the political economy and oligarchic governance of midcentury Virginia, minus the segregation.”

This movement’s hidden agenda—and it has been consciously hidden for many years—is to end public education, abolish Social Security and Medicare, close down the U.S. Postal Service, repeal minimum wage laws and prohibitions against child labor, eliminate foreign aid, close the Environmental Protection Agency, and eventually end taxes and government regulation of any kind.

In other words, given the stealth nature of this radical libertarian movement, Hillary Clinton was right on-target when she called it a “vast Right-Wing conspiracy.” But why has so much time and money gone into this effort? Buchanan had a simple answer to that question: “‘Why must the rich be made to suffer?'”

Criticism of Democracy in Chains

Given MacLean’s obviously negative perspective on these developments, it’s no surprise that her book has been bitterly criticized by commentators on the Right. For example, one critic, writing in the Washington Post, referred to “dubious claims” in MacLean’s account, challenging the importance she ascribes to Buchanan’s work and the relevance of the resistance to desegregation in 1950s Virginia. Others have questioned her scholarship. Judging from what I’ve seen, I’m not convinced that these critics have actually read MacLean’s book.

What “economic liberty” really means

Buchanan espoused “public choice theory.” In his view, which won him the Nobel, democracy inevitably leads to overspending because the majority continually forces politicians to fund new government services. The taxes required to fund these services constrain the “economic liberty” of wealthy people and corporations. Buchanan’s intellectual descendants call these privileged people and their business enterprises “makers,” as opposed to the rest of us, who are “takers.”

MacLean observes in her conclusion that “There is another, biting irony to note: the goal of this cause is not, in the end, to shrink big government, as its rhetoric implies. Quite the contrary: the interpretation of the Constitution [they seek] to impose would give federal courts vast new powers to strike down measures desired by voters and passed by their duly elected representatives at all levels—and would require greatly expanded police powers to control the resultant popular anger.”

Reviews of related books

My review of Jane Mayer’s Dark Money is here: How the Koch brothers are revolutionizing American politics. You might also be interested in Robert Reich explains how to make capitalism work for the middle class. And, for another take on the origins of today’s Right-Wing movement, see my review of Right Out of California: The 1930s and the Big Business Roots of Modern Conservatism, by Kathryn S. Olmsted. It’s at How today’s conservatism grew in the cotton fields of California.

August 29, 2017

Richard A. Clarke asks, can we avoid a dystopian future?

Warnings: Finding Cassandras to Stop Catastrophes, by Richard A. Clarke and R. P. Eddy

@@@@@ (5 out of 5)

There is no lack of dire predictions about the future. Hundreds of dystopian novels, especially the flood of books in that genre for young adults, have portrayed innumerable variations on future catastrophes. I became so intrigued by all this attention to a possible dystopian future that I wrote a book about it. It’s called Hell on Earth: What we can learn from dystopian fiction. Now I’ve found someone far better positioned to assess the likelihood that some of those dystopian scenarios might come to pass: Richard A. Clarke. In collaboration with his colleague R. P. Eddy, the former U.S. counterterrorism czar under three presidents has written Warnings: Finding Cassandras to Stop Catastrophes. This is a deadly serious inquiry into the reality underlying predictions of a killer pandemic, sudden massive sea rise, a devastating meteor strike, runaway artificial intelligence, and other chilling possibilities.

Accurate predictions of a dystopian future

In Warnings, Clarke and Eddy dive deeply into the expert predictions of scientists, engineers, and journalists who have stuck their necks out, often against enormous resistance, to warn the U.S. about seemingly unthinkable possibilities. They call these stubborn and courageous individuals Cassandras (after the princess of Troy in Greek mythology whose accurate predictions of disaster were forever doomed to be ignored). However, in every case, Clarke and Eddy’s Cassandras have been anything but ignored—although some have labored for decades to be heard.

Warnings is not simply a study of the brave people who have risked their careers to make exceedingly unpopular predictions based on their expertise. The authors have undertaken to analyze the factors common to most Cassandras, deriving a “Cassandra Coefficient” based on four critical components: the character of the threat or risk itself and how it is received; the expertise and personality of the would-be Cassandras; the extent and character of resistance from the Cassandra’s critics; and the receptiveness of the decision makers they hope to influence.

Eight “Cassandras” who were ignored

In the book’s first part, “Missed Warnings,” Clarke and Eddy relate the stories of eight Cassandras whose predictions were ignored. Included are the military analyst who predicted Saddam Hussein’s invasion of Kuwait in time for it to have been prevented; the meteorologist who warned about the certainty of massive hurricane damage to New Orleans before Katrina; the seismologist who is even today pleading with authorities to mitigate the damage of the catastrophic earthquake that is certain to strike the U.S. Northwest; and others. It’s a sobering account.

Will these later Cassandras be ignored, too?

The second part of the book, “Current Warnings,” portrays the efforts of seven people who today are clamoring to be heard about the danger of such potential catastrophes as a massive meteor strike, a devastating pandemic, and runaway genetic engineering, among others. Each is a grim cautionary tale. In each chapter, the authors report on their interviews with the experts they portray as Cassandras. If you’re prone to worry, these accounts may keep you up at night. Every one of the threats related in these chapters has the potential to yield a dystopian future.

Richard A. Clarke served as counterterrorism czar under Presidents George H. W. Bush, Bill Clinton, and George W. Bush. Following his departure from the White House in 2003, he gained widespread national attention with his harsh criticism of the Bush Administration’s response to 9/11. He is the author of five nonfiction books and four novels. I reviewed another of his books at An authoritative insider’s take on the threat of cyber war.

You might also be interested in my list of 24 compelling dystopian novels, with links to my review of each one.

August 8, 2017

An eye-opening book about the air we breathe

Caesar’s Last Breath: Decoding the Secrets of the Air Around Us, by Sam Kean

@@@@@ (5 out of 5)

Ever heard of dichlorodifluoromethane? (That’s CCl2F2 to you chemistry students.) Well, guess what? You inhale seven trillion molecules of the stuff every time you breathe. Yes, it’s in the air we breathe. That’s just one of the lesser revelations in Sam Kean’s eye-opening and thoroughly enjoyable new survey, Caesar’s Last Breath: Decoding the Secrets of the Air Around Us.

As the title will suggest to the careful reader, the central conceit in Kean’s book is that “roughly one particle of [the last breath Julius Caesar took after he was stabbed] will appear in your next breath.” Apparently, this “how-many-molecules-in-X’s-last-breath exercise has become a classic thought experiment in physics and chemistry courses.” Not in mine, though. I don’t remember much about those courses, but I’m sure I would’ve remembered that.
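The arithmetic behind that conceit is a classic Fermi estimate, and it’s easy to reproduce. Here’s a rough sketch in Python, using standard textbook values for breath volume and the mass of the atmosphere (these are my numbers, not Kean’s), and assuming Caesar’s final breath has mixed uniformly through all the world’s air:

```python
# Fermi estimate: how many molecules of Caesar's last breath
# appear in one of your breaths, assuming his breath has mixed
# uniformly through the entire atmosphere?

AVOGADRO = 6.022e23            # molecules per mole
ATMOSPHERE_MASS_KG = 5.15e18   # total mass of Earth's atmosphere
MOLAR_MASS_AIR_KG = 0.029      # mean molar mass of air, kg/mol
BREATH_LITERS = 0.5            # a typical resting breath
MOLAR_VOLUME_L = 22.4          # liters per mole of gas at STP

molecules_in_atmosphere = ATMOSPHERE_MASS_KG / MOLAR_MASS_AIR_KG * AVOGADRO
molecules_per_breath = BREATH_LITERS / MOLAR_VOLUME_L * AVOGADRO

# Caesar's breath is a tiny fraction of all air; multiply that
# fraction by the number of molecules you inhale in one breath.
expected_overlap = molecules_per_breath ** 2 / molecules_in_atmosphere

print(f"{expected_overlap:.1f}")  # prints 1.7
```

With these inputs the expected overlap works out to roughly 1.7 molecules, which is where the “roughly one particle” of the title conceit comes from. Since the answer scales with the square of breath volume, even generous changes to the assumptions keep it within an order of magnitude or so of one.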

In Caesar’s Last Breath, Kean will take you on a fast ride through the 4.5-billion-year history of Earth’s atmosphere and then through the more than one hundred different gases that comprise the atmosphere today. Yes, more than one hundred. Individual chapters—and “interludes” placed between them—tell tales about each of the major substances. Everybody knows about nitrogen, oxygen, and carbon dioxide. But there’s also carbon monoxide (CO), nitrous oxide (N2O, known as laughing gas), methanethiol (CH3SH), and all manner of others. However, this is no mind-numbing laundry list of unfamiliar substances. Kean uses each one as a lever into the history of atmospheric science. And along the way he strays—delightfully—into topics that may be only tangentially related to the air we breathe.

In fact, Caesar’s Last Breath is as much about the scientists, famous and not, whose discoveries over the centuries have helped us understand the nature and the effects of each of the major gases in our atmosphere as it is about the gases themselves. If you’re at all familiar with the history of science, you’ll recognize the names Fritz Haber, Joseph Priestley, Antoine-Laurent Lavoisier, Robert Boyle, Henry Cavendish, Humphry Davy, and so many others who have made the world around us easier to understand. Don’t think for a minute, though, that Kean simply offers up the usual dry recitation of each scientist’s discovery and how he made it. No. Instead, the author tells us things we never knew, or at least things that I never knew, about these fascinating people.

For example, Henry Cavendish, the man who discovered hydrogen, was autistic and “communicated with his domestic staff via notes.” He was also filthy rich. “During his lifetime, Cavendish had more money in the Bank of England than any other British subject.” Joseph Priestley, the co-discoverer of oxygen, was a Protestant minister whose investigations prompted a mob in Birmingham to burn down his church and his home, hoping (without success) to see him burn inside it. Antoine-Laurent Lavoisier, who tried to take sole credit for the discovery of oxygen, was an aristocrat who went to his death on the guillotine in the French Revolution. Lavoisier had been a rapacious tax-collector, and if anybody deserved such a fate, he probably did. And Albert Einstein teamed up with fellow nuclear physicist Leo Szilard to invent a better refrigerator. (They actually invented several and made a pile of money from them.) You’ll also meet people whose names you’re highly unlikely to know but will probably never forget, including the man who proved why the sky is blue, “the worst poet who ever lived,” and Le Pétomaine (the Fartomaniac), who became the highest-paid performer in France for his wildly popular act in which he sang and did impressions by passing gas.

Caesar’s Last Breath is full of fascinating and sometimes hilarious sidelights such as these. For example, did you know that “[B]efore 1850 people routinely committed suicide rather than face surgery?” (I didn’t, and I’ve read a bit about the history of medicine.) It was only in the mid-19th century that anesthetics—first nitrous oxide, then ether, and finally chloroform—finally started coming into use.

Sam Kean’s writing style is informal, to say the least. (I don’t think I’ve ever seen the word “catookus” in print anywhere else.) You can easily imagine him talking to you and laughing pretty much half the time. Given the aridity of so many typical science histories, Caesar’s Last Breath is a delight.

For other books about science that I’ve enjoyed, go to My three favorite books about science.


July 25, 2017

How Steve Bannon sold the alt-right to Donald Trump and made history

Devil’s Bargain: Steve Bannon, Donald Trump, and the Storming of the Presidency, by Joshua Green

@@@@ (4 out of 5)

Donald Trump has been in the White House for six months as I write. His approval rating today (July 25, 2017) stands at 38.9%, according to an average of national polls on Nate Silver’s widely read blog, FiveThirtyEight. His disapproval rating is nearly 20 points higher. These numbers establish him as the most unpopular president since World War II as measured six months into his term (although Gerald Ford came close because of his pardon of Richard Nixon). Meanwhile, Trump’s major initiatives—the ban on Islamic refugees, the plan to build a border wall and make Mexico pay for it, the pledge to spend $1 trillion on rebuilding infrastructure, and the effort to repeal and replace the Affordable Care Act—have all failed to gain traction so far. Given all that, combined with his proclivity to lash out in fury at friend and foe alike, Donald Trump has become the Republican Party’s worst nightmare (not to mention the rest of the country’s).

The new book Devil’s Bargain: Steve Bannon, Donald Trump, and the Storming of the Presidency by Bloomberg Businessweek senior national correspondent Joshua Green attempts to explain both why the notoriously self-promoting developer and reality-TV star came to be in the White House—and why his presidency is failing. The case Green makes, singling out alt-right provocateur Steve Bannon for a large measure of the responsibility, is persuasive if not entirely convincing. He writes, “Trump needed him. Practically alone among his advisers, Bannon had had an unshakable faith that the billionaire reality-TV star could prevail—and a plan to get him there.” At another point, Green notes that “Trump wouldn’t be president if it weren’t for Bannon.” However, whenever an analyst looks for a single explanation for a complex historical event, it’s always best to view it with a skeptical eye. History is rarely so easily explained. If there is anything close to a single explanation for the chaos, incompetence, and amoral behavior in the White House, it lies in the singular personality of Donald Trump himself.

In Devil’s Bargain, Green weaves together three stories: the unfolding of Trump’s presidential campaign before Steve Bannon took charge during its last two months; how Bannon came to hold the extremist beliefs he professes; and how the relationship between Trump and Bannon grew, matured, and eventually cooled. The focus is on the election campaign, with an insider’s description of the Trump campaign on election night, November 8, 2016, opening and closing the book. Green writes well, star journalist that he is. Though a good deal of what he writes has come to light in press reports during the past year and a half, Green tells the tale well. It’s full of personal observations that could only have come from the players themselves.

In a recent post, Why Hillary Clinton lost the 2016 election, I reviewed Shattered, two other journalists’ take on the election, but from the perspective of the Clinton campaign. Jonathan Allen and Amie Parnes found the explanation for Clinton’s loss in the character and the conduct of the candidate herself. By contrast, Green leans toward viewing Steve Bannon as primarily responsible for Trump’s victory. Just as Allen and Parnes overlooked the four decades of the country’s steady drift rightward, Green seems to underestimate the role Donald Trump himself played in winning the presidency. It’s clear to me that Trump possesses limited intelligence, but he is unquestionably shrewd and manifests unusually sharp political instincts (even if he doesn’t often follow them as president).

In Green’s view, Bannon believes most of what he professes. He seems to think that Trump believes very little of what he says. For example, Green observes that when Trump met Bannon, “his views changed. Trump took up Bannon’s populist nationalism.” At another point he notes, “Trump ran against the Republican Party, Wall Street, and Paul Ryan, but then took up their agenda.” Certainly, Trump’s views have flip-flopped again and again, and Green documents many specific instances in which Trump clearly took a stand on an issue strictly because it helped advance his campaign. But Green overlooks the pattern of Trump’s virtually nonstop lying. It’s difficult to tell whether the erstwhile reality-TV star believes in anything at all other than his own importance.

Devil’s Bargain may not be the last word on one of the most important presidential elections in American history. But in the absence of anything even more authoritative, it’s an excellent beginning.

Given my interest in US politics, it will be no surprise that Devil’s Bargain and Shattered are not the only books I’ve read about the contemporary political scene. For my review of another, John Judis’ The Populist Explosion, see Donald Trump: populism, or fascism? Two others are at How the Koch brothers are revolutionizing American politics and Al Franken’s memoir is revealing, insightful—and funny.


July 18, 2017

DARPA: inventors of Agent Orange, the M16, GPS, and the Internet

The Imagineers of War: The Untold Story of DARPA, the Pentagon Agency That Changed the World, by Sharon Weinberger

@@@@@ (5 out of 5)

Many of the products of the Pentagon’s in-house research facility, DARPA, are widely known. The Internet. GPS. The M16 rifle. Agent Orange. Stealth aircraft. What is less widely known and understood is the story of the scores of scientists, engineers, and bureaucrats who sired these and many other innovations over the nearly 60-year history of the agency. Now, journalist Sharon Weinberger has brought that history to light in a captivating account, The Imagineers of War: The Untold Story of DARPA, the Pentagon Agency That Changed the World, her third book about America’s defense establishment.  

What is most distinctive about Weinberger’s study of DARPA is the wealth of information and insight she gained from interviews with dozens of current and former employees of the agency as well as with those who observed it in action over the years. Prominent among her interviewees were many of the men (and a couple of women) who served as DARPA’s directors. In the process, and in extensive archival research, Weinberger turned up a great deal of information about the agency in the 1950s, 60s, and 70s that has been ignored or even suppressed for many years.

DARPA’s shifting mission

DARPA’s mission has shifted sharply over the years. At its inception in 1958 and for a short while afterward, DARPA was the nation’s first space agency. Its focus quickly shifted to missile defense. “By 1961,” Weinberger writes, “ARPA was spending about $100 million per year, or half of its entire budget, on missile defense.” The Cuban Missile Crisis and President Kennedy’s subsequent emphasis on achieving a nuclear test ban accelerated the process. Along the way, the agency’s effort to detect underground nuclear tests “modernized the field of seismology.”

Around the same time, the agency became involved in counterinsurgency in Vietnam (and later in many other countries). The counterinsurgency work involved social science research as well as the development of new weapons such as the M16. DARPA’s most famous product, the Internet (originally ARPANET), was an easily ignored, low-priority project in the face of the billions spent on the war.

During the 1970s, the agency turned its attention to what the Pentagon and the White House deemed the country’s gravest threat: a Soviet invasion of West Germany with a massive tank attack that could not be stopped with nuclear weapons alone. Within less than two decades, that threat evaporated. The fall of the Berlin Wall and the collapse of the Soviet Union shocked DARPA’s leadership, as it did everyone else in the US government. The agency only gradually found its way forward with a primary focus on precision weaponry and the electronic battlefield. In Weinberger’s opinion, DARPA today tackles much smaller problems than it did in the past, solving specific problems posed by Pentagon brass rather than pursuing the genuine scientific research that marked its earlier decades.

“Today,” Weinberger writes in her conclusion, “the agency’s past investments populate the battlefield: The Predator . . . Stealth aircraft . . . Networked computers . . . precision weapons . . .” But it’s unlikely that anything as disruptive as the Internet will ever again come from DARPA.

Revealing DARPA’s many huge failures

The history of DARPA in its early years in The Imagineers of War is especially strong. By burrowing into obscure declassified documents and interviewing many of those who were active in the agency’s first years, Weinberger uncovered the seminal role of William Godel. It was Godel who “managed to use the power vacuum at ARPA [following the loss of space programs to NASA] to carve out a new role for the agency in Vietnam.” Following the lead of the British in Malaya, where many of the tools of counterinsurgency were first developed, Godel built what ultimately became a multi-billion-dollar program in Vietnam. His aim was to make it unnecessary for the US to commit troops to the war, and in that he obviously failed miserably. It was Godel who promoted the notorious strategic hamlets and introduced Agent Orange and other defoliants as well as the combat rifle that came to be known as the M16. Because much of his work was clandestine and involved cash payments to undercover agents, Godel became enmeshed in an investigation into his program’s financial reporting and later spent several years in prison as a result of a colleague’s misappropriation of funds. Probably because of this intensely embarrassing chapter in DARPA’s history, and his later turn to gunrunning in Southeast Asia, Godel’s role has been deeply buried. There is not even a Wikipedia page for him.

William Godel was by no means the only DARPA executive to darken the agency’s history with outsize failures. Others squandered billions of dollars in sometimes lunatic schemes, such as a plan to explode nuclear weapons in the Van Allen belt in the upper atmosphere in hopes of destroying intercontinental missiles launched from Russia. The agency spent almost $2 billion in a failed effort to develop a prototype of a “space plane” that would travel at Mach 25 (“one of DARPA’s costliest failures”). Another embarrassing episode involved extensive research into mind control. An even bigger embarrassment loomed as a possibility in 1983 when Ronald Reagan announced his plan for the Strategic Defense Initiative (“Star Wars”). Luckily for the agency, the work was shifted to a new Pentagon department that eventually blew a total of $30 billion on an effort that scientists had almost universally said was impossible with contemporary or foreseeable technology.

An earlier history of DARPA

In December 2015, I reviewed a then-new book, The Pentagon’s Brain: An Uncensored History of DARPA, America’s Top-Secret Military Research Agency, by Annie Jacobsen. Jacobsen is the author of three other studies of the Pentagon: Operation Paperclip: The Secret Intelligence Program that Brought Nazi Scientists to America, Area 51: An Uncensored History of America’s Top Secret Military Base, and, most recently, Phenomena: The Secret History of the U.S. Government’s Investigations into Extrasensory Perception and Psychokinesis. All four books were published by prestigious mainstream firms. I’ve cited all these titles to convey a clear sense that Annie Jacobsen is an accomplished and trustworthy source of information about the Pentagon. She has spent years researching the American military, with a focus on its activities in research and development. But it’s clear to me that Sharon Weinberger’s more recent study, The Imagineers of War, does an even better job of laying bare the truth about DARPA’s checkered history.

For a humorous take on a closely related subject by Mary Roach, go to A journalist looks at military science.

July 4, 2017

The iPhone: the world’s most profitable product?

The One Device: The Secret History of the iPhone, by Brian Merchant

@@@@@ (5 out of 5)

Other Silicon Valley observers have written about the development of the iPhone—but it’s unlikely that anyone else has delved as deeply into the subject as Brian Merchant . . . or will ever do so in the future, for that matter. Merchant’s brilliant new book, The One Device: The Secret History of the iPhone, tells the tale from the mining of the minerals from which the phone is crafted to the oppressive working conditions in Apple’s Chinese manufacturing plants and the scavengers at Third World dumps where discarded iPhones are sometimes now found. Those topics bookend the story, which largely consists of interviews with some of the hundreds of people who played a hand in the phone’s development.

Steve Jobs didn’t invent the iPhone

If you have the impression that Steve Jobs invented the iPhone and is largely responsible for its success, The One Device will quickly disabuse you of that misconception. Without question, Jobs was hugely influential in the project: his obsessive attention to detail, his passion for secrecy, and his genius at marketing all contributed in major ways to the ultimate runaway success of the product. However, not only was the iPhone not Jobs’ idea—he actively resisted pursuing the project for several years. (A team of key staff members worked in secret in defiance of his refusal to authorize the work. Their meetings began before the turn of the century. The first iPhone was released in June 2007.)

Jobs’ insistence on secrecy contributed to the buzz that surrounded the phone in the months leading up to its release, but during the many years that Apple devoted to designing the iPhone, that same paranoid obsession with secrecy impeded the project’s progress by compartmentalizing the staff. “Of all the complaints about working at Apple . . .,” Merchant writes, “its secrecy was at the top of the list—engineers and designers found it set up unnecessary divisions between employees who might otherwise have collaborated.” And Jobs’ notoriously volcanic temper and his sometimes abusive treatment of employees may have forced many of them to work longer and harder on the phone than otherwise would have been the case. But it’s difficult to believe that morale wouldn’t have suffered as a result—and I know from decades of experience as an employer that low morale takes a toll on productivity.

As Merchant makes clear, “The story of the iPhone starts . . . not with Steve Jobs or a grand plan to revolutionize phones, but with a misfit crew of software designers and hardware hackers tinkering with the next evolutionary step in human-computer symbiosis.” And a truly fascinating tale it is. Ultimately, hundreds of people, not just at Apple but at key suppliers such as Corning and Samsung as well, made key contributions to the success of the iPhone. Merchant does his best to identify them by name and interview them.

A century of antecedents

One of the strengths of Merchant’s account is the thoroughness with which he studied the history of technology. In doing so, for example, he learned that “[v]isions of iPhone-like devices can be traced back to the late 1800s.” A Finnish inventor “successfully file[d] a patent for what appears to be the first truly mobile phone”—in 1917. And “[b]y 1994, Frank Canova had helped IBM not just invent but bring to market a smart-phone that anticipated most of the core functions of the iPhone.” Thirteen years before Apple’s product announcement!

The world’s most profitable product?

Merchant frequently refers to the iPhone as “the world’s most profitable product.” That claim deserves scrutiny. For one thing, the phone didn’t start out that way. Initial sales were disappointing. Jobs had steadfastly refused to let outside developers supply apps to run on the iPhone. Only when he relented at last and allowed the opening of the App Store did sales explode upwards—and explode they did. Certainly, the profits Apple realizes from the phone are now massive, and it accounts for two-thirds of the company’s revenue. Where else might Apple’s cash hoard of more than $250 billion have come from? But is it the world’s most profitable product? That strikes me as hyperbole. Like other journalists, Merchant clearly fell prey to the fallacy that only huge corporations matter. Although Time places the iPhone at #1, the magazine qualifies that claim with the statement that it is “one of the world’s most profitable products.” And Merchant’s extravagant use of language doesn’t stop with his assertion about the phone’s profitability. For example, “The iPhone might actually be the pinnacle product of all of capitalism to this point.” Later, he adds, “The iPhone isn’t just a tool; it’s the foundational instrument of modern life.” Really? As of last year, cumulative iPhone sales passed one billion units. But there are more than 7.4 billion people on the planet.

 

June 27, 2017

An authoritative look at technology’s potential

The Driver in the Driverless Car: How Our Technology Choices Will Create the Future, by Vivek Wadhwa and Alex Salkever

@@@@@ (5 out of 5)

“Not long ago, I was very pessimistic about the future. . . Today, I talk about this being the greatest period in history, when we will solve the grand challenges of humanity and enter an era of enlightenment and exploration such as we saw in my favorite TV series, Star Trek.” Thus begins The Driver in the Driverless Car: How Our Technology Choices Will Create the Future, by Vivek Wadhwa and Alex Salkever.

An authoritative look at technology’s potential

In this fascinating and authoritative look at the potential of technology, both positive and negative, Wadhwa demonstrates intimate knowledge of the latest developments in such diverse fields as biomedicine, robotics, education, the Internet of Things, and prosthetics. Unlike the unreservedly optimistic scenarios presented by Ray Kurzweil and Peter Diamandis, Wadhwa paints an almost symmetrical portrait of technology’s future, extolling its promise but vividly describing its potential to harm us. (I previously reviewed Abundance: The Future Is Better Than You Think by Diamandis.) Ray Kurzweil famously speaks about the exponential rate at which technology advances. Wadhwa bases his argument on the same formula but reaches different conclusions. “You will see that there is no black and white,” he writes; “the same technologies that can be used for good can be used for evil in a continuum limited only by the choices we make jointly.”

It’s a cliché to remark on the speed of technological change, but the reality is nonetheless staggering. As Wadhwa notes, “the amount of information buzzing over the Internet is doubling roughly every 1.25 years. . . We are now creating more information content in a single day than we created in decades or even centuries in the pre-digital era.” He adds, “the iPhone 11 or 12 will have greater computing power than our brains do.”

Three questions to ask about any new technology

The Driver in the Driverless Car is organized around three broad questions, which Wadhwa poses in connection with each of the technologies he discusses: “1. Does the technology have the potential to benefit everyone equally? 2. What are the risks and the rewards? 3. Does the technology more strongly promote autonomy or dependence?” He is merciless in responding to these questions. Only two of the many technologies treated in this book emerge with unreservedly positive reviews: driverless cars and trucks, and solar power. Everything else comes up short, from the biomedical miracles emerging from laboratories on a daily basis to the Internet of Things. In a great many cases, the new technologies render us susceptible to identity theft or worse. For example, Wadhwa fears the loss of privacy that will come from having all our appliances connected to the Internet and to each other: “I am not looking forward to having my bathroom scale tell my refrigerator not to order any more cheesecake.”

A sometimes fantastic vision of the future

Disputing Wadhwa’s sometimes fantastic vision of technology’s future may be a fool’s errand. However, it’s difficult not to remain skeptical about some of his predictions. For example, he envisions 200-mile-per-hour driverless cars guided by a web of sensors on the roadways. Despite the minuscule cost of individual sensors, it’s hard to see where the money might come from to implement such a system. Can you imagine how much it might cost to embed sensors along a 200-mile stretch of highway, much less the full 381 miles from San Francisco to Los Angeles? Similarly, the author envisions a sea change in our transportation system within the foreseeable future, with driverless electric cars available on command everywhere, private vehicles and stop lights eliminated, and parking lots a thing of the past. Perhaps, eventually, all this might come to pass. But is it realistic to expect that politicians will resist the screaming complaints from auto manufacturers, oil companies, service station and parking lot owners, and individual citizens by the millions?

Wadhwa emphasizes throughout The Driver in the Driverless Car that only grassroots citizen pressure can force politicians to enact the legislation necessary to permit the widespread use of some of these technologies. For instance, FDA approval may be necessary for the acceptance of many of the biomedical innovations Wadhwa describes. And state governments everywhere will be required to allow driverless vehicles to travel on their roads, a prospect that does not seem imminent. The future Wadhwa envisions may eventually come to pass. But we would be naive to expect no bumps, twists, and turns along the way.

About the authors

Vivek Wadhwa has an extraordinary résumé. An Indian-born American futurist, he lives in Silicon Valley and researches technology developments there. Wadhwa holds distinguished positions at Carnegie Mellon and Duke and is a globally syndicated columnist for the Washington Post. In 2012, Foreign Policy magazine named him one of the world’s Top 100 Global Thinkers. Wadhwa calls his co-author, Alex Salkever, VP of Marketing Communications at Mozilla, his “writing guru.” The two also co-authored The Immigrant Exodus: Why America Is Losing the Global Race to Capture Entrepreneurial Talent, which the Economist named a Book of the Year in 2012.

 

June 20, 2017

A very funny book about words, grammar, and dictionaries

Word by Word: The Secret Life of Dictionaries, by Kory Stamper

@@@@@ (5 out of 5)

When you think of dictionaries, chances are good the ones that would come to mind are the Merriam-Webster Collegiate and the Oxford English Dictionary (as well as whatever comes up online). Did I get that right? Certainly, those are the two most commonly consulted by educated American readers. If you’re a curious sort, you might wonder how all the words and definitions find their way into the pages of those dictionaries. Well, wonder no more! The lexicographer Kory Stamper of Merriam-Webster, Inc., has written Word by Word, a delightfully profane and often hilarious account of how she and her colleagues work to update their dictionaries, not just the Collegiate but the online Merriam-Webster Unabridged as well (the successor to the old Webster’s Third New International Dictionary).

Stamper is passionate about her work. “The more I learned,” she writes, “the more I fell in love with this wild, vibrant whore of a language.” Her book abounds with charming examples of the intensity she and other Merriam-Webster editors bring to their jobs. And no wonder: it’s clearly hard work.

Unless you’re already familiar with the ways and means of lexicography, you’ll be amazed at the extraordinary pains the Merriam-Webster staff sometimes takes simply to define a single word. “By the time a word is put in print either on the page or online, it’s generally been seen by a minimum of ten editors.” Stamper describes the process, step by step, in language so lively you’ll never think of the world of dictionaries as stuffy again. “What appears to be a straightforward word ends up being a linguistic fun house of doors that open into air and staircases that lead nowhere,” she writes. For example, at one point Stamper’s job was to revise the definition of “take.” That seemingly simple word, it turns out, means twenty different things. Sorting through all the citations set aside to illustrate those different definitions was a Herculean task. It required “a month of nonstop editorial work.” But when Stamper bragged (or complained) to a tableful of editors at a dinner about the length of time she’d invested in a single word, a lexicographer from the Oxford English Dictionary was amused: “‘I revised “run,”’ he said quietly, then smiled. ‘It took me nine months.’” Stamper explains: “Of course it [took nine months]. In the OED, “run” has over six hundred separate senses [definitions] . . .”

And yet language, especially English, changes far more quickly than lexicographers could ever possibly keep up, Stamper explains. “A dictionary is out of date the minute that it’s done.”

In an extended discussion of English grammar, Stamper will disabuse you of any lingering notion that ours is a tidy and rational language. With example after example, she demonstrates the sheer illogic of the rules of grammar. “[W]here do these rules come from, if not from actual use?” she asks. “Most of them are the personal peeves, codified into law, of dead white men of yore . . . Standard English as it is presented by grammarians and pedants is a dialect that is based on a mostly fictional, static, and Platonic ideal of usage.” (The italics are Stamper’s.)

Throughout her book, Stamper is free with profanity; she drops the “f-bomb” 17 times, for example. At one point she jokes that the profanity is meant to make her come across as cooler than she is.

There are plenty of surprises in Word by Word. “As you go through the written record, you’ll find that Shakespeare used double negatives and Jane Austen used ‘ain’t.’ You’ll find that new and disputed coinages have come in and have not taken away from the language as it was used, but added to it; that words previously considered horrendous or ugly—words like ‘can’t’—are now unremarkable.”
