Astronomy. Epidemiology. Lexicography. Microbiology. These are among the thirty different scientific fields discussed and explained in the thirty-three excellent books about science that I’ve read and reviewed. I’m listing them here in alphabetical order by the fields’ names. Each is linked to my review. (If a link comes up short, just go to www.malwarwickonbooks.com and search for the title.)
Animal Husbandry: Eating Animals, by Jonathan Safran Foer
Astronomy: Beyond: Our Future in Space, by Chris Impey
Atmospheric Science: Caesar’s Last Breath: Decoding the Secrets of the Air Around Us, by Sam Kean
Gastroenterology: Gulp: Adventures on the Alimentary Canal, by Mary Roach
General Science: A Short History of Nearly Everything, by Bill Bryson
Medical Research: The Immortal Life of Henrietta Lacks, by Rebecca Skloot
Military Science: Grunt: The Curious Science of Humans at War, by Mary Roach
Personality Psychology: Quiet: The Power of Introverts in a World That Can’t Stop Talking, by Susan Cain
@@@@ (4 out of 5)
The concluding chapter in The Economist‘s new book, Megatech: Technology in 2050, highlights “the central role of capitalism” in driving the demand for new technology. The preceding 19 chapters justify that reading, for the most part indirectly. That should be no surprise in a product of The Economist, a London-based weekly “50% owned by the English branch of the Rothschild family and by the Agnelli family.” No Marxists to be found in those precincts!
Like any anthology, Megatech is uneven. Chapter 1, “A toolkit for predicting the future,” sets the scene exceptionally well. Chapters on biotechnology, health care, agriculture, materials science, military technology, and personal technology are all informative and engaging. So is “Visiting Hours,” one of the two science fiction short stories included in the book. Chapters on computer science, energy, the “Physical foundations of future technology,” and the history of innovation are less valuable. In fact, I found the essay on the principles of physics to be impenetrable. It does not appear to have been written for a general audience. By contrast, the essay on materials science includes a fascinating account of how the BMW i3 is manufactured. The “car begins life in a Japanese rayon factory as a spool of polyacrylonitrile,” which is “shipped to the US, where it is baked into carbonised strings only 7 micrometers (millionths of a metre) in diameter,” and, finally, “woven on carpet-like sheets on what appears to be a giant knitting machine” in a factory near Munich.
Virtually any subject is open to debate. The future of technology is particularly susceptible to disagreement, and some of that surfaces in Megatech. For example, a chapter dedicated to “The great innovation debate” views the current obsession with runaway technological change with a jaundiced eye. “Decades of advance in information technology,” the author writes, “have not generated anything like the soaring growth in output per person, adjusting for inflation, that industrialised countries enjoyed in the mid 20th century.” A later chapter strenuously disagrees, arguing that “the pessimists misread the nature of technological change . . .” by underestimating “the cumulative effect of exponential improvement in computing power,” by mistakenly assuming that Moore’s law represents a constraint on further advances in computing, and by failing to understand that “it takes time to learn how to apply powerful new technologies.”
Unfortunately, Megatech doesn’t quite deliver the promise in its subtitle, Technology in 2050. Most of the essays included in the book examine current trends rather than looking 30 years ahead. Some don’t venture far at all into the future.
Unlike most publications, The Economist maintains a large Intelligence Unit that provides forecasting and risk analysis services for corporations and governments. Though the connections don’t appear in print anywhere in the book, it seems obvious that most of the 20 authors whose essays appear in this illuminating anthology are either editors of the magazine or among the “130 full time specialists and economists” employed by the Unit. (One obvious exception is Melinda Gates, who is nobody’s idea of an employee.) The book is edited by Daniel Franklin, Executive Editor of The Economist.
@@@@@ (5 out of 5)
Driverless cars, for sure. But pilotless airplanes? Machines that will replace doctors and corporate managers? And robots that can out-think the most brilliant human?
The popular term “robots”—first used in a Czech science fiction play staged in 1920—refers to machines that embody what scientists call artificial intelligence (AI). In an outstanding survey of the field, British science journalist Luke Dormehl delves deeply into the past, present, and future of humankind’s attempts to create machines capable of learning and decision-making on their own. His book, Thinking Machines: The Quest for Artificial Intelligence and Where It’s Taking Us Next, serves up the background readers need to understand why such luminaries as Stephen Hawking and Elon Musk have warned us that AI poses a grave threat to our future as a species—while others including Ray Kurzweil, a pioneer in the field, predict a new Golden Age.
Hawking fears that the evolution of artificial intelligence will make the human race irrelevant. “It would take off on its own, and re-design itself at an ever increasing rate,” he said. “Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”
With a somewhat different take on the subject, Musk asserts that human and machine will inevitably merge, with devices such as brain implants to increase our intelligence. However, he is fearful of Artificial General Intelligence: AI that is “smarter than the smartest human on earth,” which would present a “dangerous situation.”
Kurzweil famously speaks about the “singularity,” the time (which he puts at 2045) when robots will surpass the intellectual capacity of the most brilliant human being and usher in boundless new possibilities. Hardly fearful of the threat perceived by Hawking and Musk, Kurzweil believes the problem will be solvable. “If AI becomes an existential threat,” he wrote in TIME magazine, “it won’t be the first one. Humanity was introduced to existential risk when I was a child sitting under my desk during the civil defense drills of the 1950s. Since then we have encountered comparable specters, like the possibility of a bioterrorist creating a new virus for which humankind has no defense.” Not to mention climate change and the threat of an asteroid collision or an unstoppable pandemic. “Technology,” Kurzweil adds, “has always been a double edged sword, since fire kept us warm but also burned down our villages.”
Another, much different but equally reassuring view comes from New York University psychology professor Gary Marcus, who characterizes “deep learning,” which many enthusiasts regard as the route to Artificial General Intelligence, as “the hammer that’s making all problems look like a nail.” In a 2012 New Yorker article, he argued in reference to AI that “Just because you’ve built a better ladder doesn’t mean you’ve gotten to the moon.” Recently, at TEDx CERN, he asserted that perception is “a tiny slice of the pie. It’s an important slice of the pie, but there’s lots of other things that go into human intelligence, like our ability to attend to the right things at the same time, to reason about them to build models of what’s going on in order to anticipate what might happen next and so forth. And perception is just a piece of it. And deep learning is really just helping with that piece.”
Futuristic predictions notwithstanding, virtually all observers fear the near-term impact of AI, and that’s nothing new. In fact, beginning in the late 1940s, public concern mounted about the potential of automation to displace humans from their jobs. The topic was widely discussed in the 1950s and beyond; today’s preoccupation with robots in manufacturing, driverless cars, and computer algorithms that are putting lawyers out of work is merely the latest iteration of the problem. The trend has simply accelerated in recent decades. There’s no ignoring it now.
Dormehl traces the emergence of this trend through the work of individual scientists, many of whose names will be familiar to anyone with knowledge of the history of the computer industry—John von Neumann, Alan Turing, Claude Shannon, and Marvin Minsky, among others. The story as the author tells it is highly engaging, tracing the development of AI from the 19th-century work of Ada Lovelace (Lord Byron’s daughter) and Charles Babbage on the latter’s proposed Analytical Engine, which was to be the world’s first general-purpose computer. (It was never built.) Along the way, Dormehl explores the unsteady evolution of AI both in theory and in practice and describes many of the field’s current applications, some of them surprising: for example, robots displacing research scientists in discovering likely prospects for life-saving drugs, and IBM’s Watson, the Jeopardy! champion, turning chef. He turns a skeptical eye toward some of the more ambitious goals of the field, such as the use of computers to mimic the structure and functions of the human brain. This is an amazing story, and an important one.
In the final analysis, Dormehl writes that “speculating about where Artificial General Intelligence could potentially take us [as Hawking, Musk, and Kurzweil have done] is interesting, but ultimately the stuff of science fiction for now.” Clearly, it makes more sense to concern ourselves with how to provide meaningful employment to all the millions of people displaced by AI from their jobs in the years ahead.
Less than three decades ago, an American historian named David Christian, then teaching at an Australian university, launched a new approach to world history. His unique take on the subject took the discipline far beyond the limits of the written word. Calling it Big History, Christian started his new course at the beginning of time itself: the Big Bang.
Christian enlisted guest lecturers from the fields of astrophysics, chemistry, geology, paleontology, biology, and other scientific fields. Incorporating their specialized knowledge into his comprehensive survey of Big History, Christian summed up what is known about the birth of the universe, the emergence of stars, the formation of the Earth, the turbulent formation and shifting of the continents, and the painfully slow advent of the most primitive, single-celled life. From this perspective, the several million years since humans first emerged, much less the 5,000 years of recorded history, must be seen as only the latest and briefest chapter in a story that will continue for billions of years longer.
Since Christian’s inspired initiative, others have flocked to the new discipline. A body of Big History literature has begun to emerge. The best-known contribution to the new discipline is Jared Diamond’s bestselling book, Guns, Germs, and Steel. But others have made notable contributions as well, adding insight and perspective to our understanding of our place in the universe.
Below I’ve listed eight books I’ve read and reviewed in my own venture into Big History. Not all span the life of the universe. But they all survey world history with the broad strokes that characterize this fresh approach to understanding how the past affects today’s world. They’re listed in alphabetical order by the authors’ last names, and all are linked to my reviews. I recommend them all. I also recommend the 48-lecture course David Christian recorded for The Great Courses. It’s titled Big History: The Big Bang, Life on Earth, and the Rise of Humanity. This is world history as it should be taught.
The geologist who explained to us how the dinosaurs went extinct ventures outside his academic bailiwick to track the story of the Earth from its earliest antecedents in the Big Bang to the emergence of Homo sapiens as the dominant form of life on the planet. Emphasizing geological events throughout, he illustrates how radical changes in the natural environment have shaped the course of human events—and the very nature of our bodies themselves.
While David Christian leaned on colleagues in the sciences to carry the story for its first 13.8 billion years, Cynthia Stokes Brown took it all on herself. With a good deal of simplification but relatively few obvious errors, she surveys the prehistorical past with great skill. For anyone who thinks history is the story of wars and generals and presidents, Big History is a worthy remedy.
It’s easy to get the impression that science has answered all the big questions and is spending more and more time and money focusing on the little ones. Read Bill Bryson’s A Short History of Nearly Everything, and you will quickly be disabused of that illusion. Truth to tell, the human race is still abysmally ignorant of some of the most fundamental matters that determine how, why, and where we live.
Published 20 years ago, Diamond’s thesis is the only persuasive argument I’ve ever encountered for the huge wealth gap between the “West” and the “developing” nations of the Global South. He finds the roots of the problem in the history of the last 13,000 years. This is one of the most important books of the last half-century.
Harari sees history as divided by three broad-brush “revolutions”: the Cognitive Revolution, about 70,000 years ago, when Homo sapiens acquired the gift of speech and began to walk out of Africa; the Agricultural Revolution, which began around 10,000 years ago and ushered in a new world of towns, cities, empires, and a fast-growing human population; and the Scientific Revolution, only about 500 years old, which has shaped the world as we know it today. Big History, indeed.
Forget just about everything you learned in school about the peoples who lived in the Western Hemisphere before 1492—and about the land, too. It turns out that yesterday’s historians, anthropologists, paleontologists, and ecologists got it pretty much all wrong. Latter-day investigations in all these fields have turned up persuasive evidence that the Americas before Columbus were far more heavily populated, the leading civilizations far more sophisticated, and their origins far further back in time than earlier generations of scholars had suspected.
Chances are, you’re aware that the potato originated in Peru and smallpox in Africa, and that both species crossed the Atlantic shortly after Columbus. You probably know, too, that the potato later became a staple in many European countries and that smallpox decimated the native population of the Americas. However, what you may not know is how profoundly the much more extensive exchange of animals, plants, minerals, and microorganisms between the Old World and the New shaped the course of history. Historians call this phenomenon the Columbian Exchange. From the perspective of Big History, this event was one of the most significant phenomena of the last 13,000 years.
Five years after Jared Diamond’s path-breaking book, Guns, Germs, and Steel, historian and archeologist Ian Morris laid out his own, more comprehensive view of the course of human history, reaching back 15,000 years and venturing into the 22nd century. While many historians were still engaged in the stale debate about whether “Great Men” or social forces are dominant in world history, Diamond and Morris convincingly laid out the case for the greater influence of the larger context in which human history takes place, delving into biology, sociology, and archaeology as well as history itself.
@@@@ (4 out of 5)
If you’re a physician or a nutritionist, or if you’ve studied biology, you’re probably aware that our bodies contain an immense number of microbes. Most of the rest of us find that surprising. Though I knew about the bugs that inhabit my digestive system, British science journalist Ed Yong helped me understand just how numerous and widely dispersed those microbes are on my body—and yours. Try 39 trillion. That number’s greater than the estimated number of human cells in our bodies. And it’s 100 times as great as the largest estimate of the number of stars in the Milky Way galaxy. In other words, it’s an understatement to say that the number of microbes we contain is astronomical.
Now, before you panic about the many diseases you’ll contract from all these bugs, rest assured. “There are fewer than 100 species of bacteria that cause infectious diseases in humans,” as Yong explains in I Contain Multitudes. “By contrast, the thousands of species in our guts are mostly harmless.” And a great many of them play indispensable roles in keeping us alive and disease-free. They bolster our immune system, digest our food, and even help make us who we are.
How did this come about? “Remember that animals emerged in a world that had already been teeming with microbes for billions of years,” Yong notes. “They were the rulers of the planet long before we arrived. And when we did arrive, of course we evolved ways of interacting with the microbes around us.”
The collection of microbes in each of us is generally called the microbiome. Yong explains its functions and its fathomless diversity in conversational English with occasional flashes of humor. “Every one of us is a zoo in our own right,” he writes. “A colony enclosed within a single body. A multi-species collective. An entire world. . . Some species are common, but none is everywhere.” And that collection of species is unique to each of us. The microbes in my gut and on my skin are very different from those in and on you. Not only that but “your right hand shares just a sixth of its microbial species with your left hand.”
The theme of I Contain Multitudes is symbiosis. In common parlance, symbiosis suggests mutual benefit. Technically, though, the meaning is broader. “If one partner benefited at the expense of the other, it was a parasite (or a pathogen if it caused disease). If it benefited without affecting its host, it was a commensal. If it benefited its host, it was a mutualist. All these styles of coexistence fell under the rubric of symbiosis.” Yong’s book includes a huge number of examples, not just within our own bodies but in animals and plants as well. If you have an affinity for science, you’ll love this book.
By the way, “there is no such thing as a ‘good microbe’ or a ‘bad microbe’ . . . In reality, bacteria exist along a continuum of lifestyles, between ‘bad’ parasites and ‘good’ mutualists. Some microbes . . . slide from one end of the parasite-mutualist spectrum to the other, depending on the strain, and on the host they find themselves in.” The possibilities of these combinations are endless.
Yong will take you on a tour of the laboratories where scientists study symbiosis and its effects on its many hosts in other living things. You’ll meet the pioneers in the field—a fascinating lot, few of whom match the stereotype of the frizzy-haired, absent-minded professor. You’ll learn about the emerging fields of biogeography, metagenomics, and synthetic biology. And you’ll see how the discoveries in these fields are benefiting medical science—and portend more advances in the future.
The title of this engrossing book is from Walt Whitman: “I am large, I contain multitudes.”
Born Edmund Soon-Weng Yong in 1981, Ed Yong is a staff member of The Atlantic. He has degrees in zoology and biochemistry from England’s most celebrated centers of scientific teaching and research, the University of Cambridge and University College London. His popular blog has been incorporated into National Geographic’s website.
@@@@ (4 out of 5)
Funny thing. Michael Lewis’ newest book, The Undoing Project, tells the story of two surpassingly brilliant Israeli psychologists whose work earned a Nobel prize in economics and, according to the subtitle, “Changed Our Minds.” As always, Lewis writes well, and he succeeds in blending biography and intellectual history with his usual skill. I greatly enjoyed the biographical details about the two Israelis and their eccentric, on-again, off-again relationship that Lewis likens to a marriage. They’re both endlessly fascinating characters. But I still don’t quite understand why their work gained a Nobel prize, much less how they changed my mind.
OK, I exaggerate. I gather that they — actually just Daniel Kahneman and not his late partner in crime, Amos Tversky — won the Nobel because they seem to have persuaded at least some economists that we humans are not rational animals. Ever since Adam Smith, most economists have insisted that our economic choices are rigorously determined by self-interest; so far as I know, most still do. For example, in a typical paper cited by Lewis, one economist wrote “‘The agent of economic theory is rational, selfish, and his tastes do not change.'” Which, of course, is sheer nonsense, as any sensible and halfway intelligent non-economist could have pointed out.
Kahneman and Tversky proved in a dozen different ways that people make irrational decisions much of the time, and not just in the field of economics. Working with economists, though, they demonstrated clearly that economic choices in particular are not necessarily logical. Their work led to the development of a new field of inquiry, behavioral economics. I suppose this is good (though I’ve long had my doubts about the wisdom of proliferating sub-specialties). But I see no evidence that economic policymakers today are making any better decisions than they did two or three decades ago. Much of Kahneman and Tversky’s seminal work was carried out in the 1970s and 80s; Kahneman took home the Nobel in 2002.
Now about changing our minds. Throughout The Undoing Project, Lewis describes the evolution of Kahneman and Tversky’s thinking — first separately, and later in tandem, after they forged a symbiotic intellectual partnership that lasted for decades with varying degrees of intensity. I found much of this hard to follow. I’ve never studied either psychology or philosophy, or found myself attracted to either field, for that matter. It’s obvious that both men were brilliant almost beyond compare. Perhaps Michael Lewis is, too. I’m not. Despite Lewis’ valiant attempt to translate their thinking into simple English, I was frequently forced to re-read whole paragraphs, and even then often to little effect. On the other hand, many of their findings appeared to me to be simple common sense. And what did come through to me very clearly was the idea encapsulated in the title. Tversky and Kahneman focused on “undoing the mistakes of others.” My problem is that so many of those mistakes simply reflected lack of intelligence, stubbornness, poor judgment, sheer perversity, misplaced professional pride, or some combination of these failings.
To cite one common-sense example: “Everywhere one turned, one found idiocies that were commonly accepted as truths only because they were embedded in a theory to which the scientists had yoked their careers.” OK. This is news? Scientists, including doctors, have been showing themselves to be poor at decision-making for centuries for just this reason. Another common-sense example: Kahneman and Tversky’s paper, “‘Belief in the Law of Small Numbers,’ teased out the implications of a single mental error that people commonly made — even when those people were trained statisticians.” In other words, people tend to draw conclusions from samples that are far too small. Sorry, but I didn’t need two psychologists to tell me that. I’ve been seeing examples all around me since I was a child.
Admittedly, Kahneman and Tversky — or, as was the case for decades before Tversky died, Tversky and Kahneman — had considerable impact on several fields. Lewis points to examples in medicine, professional basketball, federal social policy, and history as well as economics. Undoubtedly, these two men have made important contributions. They laid the intellectual foundation for the likes of the Oakland A’s Billy Beane as Lewis depicted him in Moneyball. I just wish I could understand how they changed my mind. Reading passages like the following doesn’t help: “When they made decisions, people did not seek to maximize utility. They sought to minimize regret.” Supposedly, then, when we make a decision, we’re trying to avoid feeling bad instead of gaining some advantage. Really?
I must admit that one of the pair’s discoveries impressed me greatly. Apparently, they invented the concept of framing. This I understand. It’s a big deal in cognitive science. Republicans have understood it for a very long time. Democrats are just starting to get the point. Too late, it seems.
I’d already written up my list of the 10 best books of the year when the editors of Berkeleyside asked me to supply them with a list of my five top picks. (I’ll post the longer list next week.) Picking just five is a tough assignment, to put it mildly. But here goes, gritting my teeth all the way. All these books were published in 2016 or late in 2015.
In a searing exploration of the history of slavery, an African-born American woman traces the story of a Ghanaian family over more than two centuries through the lives of two branches of its descendants, one in Ghana, the other in the United States.
A British historian’s revisionist view of military intelligence in World War II, debunking the many myths that have inspired dozens of books and taking their exaggerations down a peg with a long-overdue sense of perspective. In short, Hastings demonstrates that virtually all human intelligence (“humint”) was useless.
A Vietnamese-American won the 2016 Pulitzer Prize for Fiction with this complex novel of the Vietnam War, which views the conflict through the eyes of participants from both the South and the North. It’s a perspective unfamiliar to most of us, one that could only have been rendered by a Vietnamese-American. The book is crammed with insight, and it’s beautifully written.
A science journalist traces the history of autism throughout the twentieth century, when it first became the subject of close study. It’s a fascinating story of myths and misunderstandings long held both among psychiatrists and the public. The psychiatric profession does not come off well in this telling.
A veteran investigative journalist explores the time in the 1950s and 60s when the CIA ran amok, assassinating foreign leaders and intervening in the affairs of other countries in the belief that the USSR was bent on world domination. The focus is on the legendary CIA Director, Allen Dulles. You won’t think more highly of him if you read this book.
@@@@ (4 out of 5)
UC Berkeley professor Walter Alvarez tackles the emerging field of Big History from his perspective as a geologist, viewing himself as “a historian of the Earth.” In A Most Improbable Journey, he writes about the universal context in which human life has emerged.
Beginning with the Big Bang and rushing through the intervening 13.8 billion years at top speed, he focuses on the geological processes through which the Earth was formed and progressively re-formed in ways that have determined the course of human events to this day.
“The topography and climate of continents,” he writes, “have controlled the pattern of settlement and the lines of communication throughout history; resources are distributed in an irregular way across the continents; and land warfare is carried out on a geographical chessboard. The geography of the oceans has determined routes of exploration, trade, and migration and has set the stage for naval warfare.”
And all this, he emphasizes, is the result of the particular configuration of the continents at this moment in geological history. Because of continental drift, the shape and distribution of both land and sea have radically changed numerous times since the Earth was created 4.5 billion years ago. To cite just two minor examples of the Earth’s changeability, he notes that “California is further away from Utah than it used to be.” And the coast of Northern California once extended to what we know today as the Farallon Islands. If your taste runs to nonfiction, you may well find this book as enjoyable as the best thriller.
The discipline of Big History is less than three decades old. Founded by David Christian, an American historian then teaching in Australia, its mission is to transcend the boundaries of written history and help us see ourselves in the context of an inconceivably vast and complex universe. Instead of focusing on the mere 5,000 years of recorded history, Big Historians typically direct our attention far backwards to the beginning of time itself. However, in most treatments, Big History explores the astronomical, physical, chemical, and geological realities of our past only as prologue to an abbreviated world history. Walter Alvarez takes a different approach in A Most Improbable Journey. Though he frequently dips into other scientific disciplines, his focus throughout is on the ways in which geological science can help us understand the shape of our lives and the character of the planet we share.
In his short and highly readable book, Alvarez frames the story of the ascension of the human species as an accident. “At innumerable moments . . .,” he writes, “history could have taken different paths than the one our world actually did take, resulting in a human situation different from the one we have today—or possibly no human situation at all!” He emphasizes that “the human situation is balanced on a knife edge of improbability.” This is the principal theme of his book. Again and again, Alvarez returns to this point. Writing about the improbable evolution of our bodies, he asks, “What if bilateral symmetry had never appeared? What if the movable jaw had not evolved? What if the dinosaurs had not been killed off? What if other biological inventions we can barely imagine had shaped the path of evolution? As with so much else in Big History, it was a very particular and unlikely sequence of events that gave us the characteristics of our human bodies.”
Alvarez bookends his account with references to the theory that has put him on the map, so to speak: the hypothesis that the crash-landing of a meteor or comet in the Yucatan Peninsula caused the extinction of the dinosaurs and the rise of mammals to supremacy on the Earth. The improbability of this event—unlike anything in the previous half-billion years—reinforces his thesis that the emergence of our species is due to a long sequence of highly unlikely occurrences. Although Alvarez dips into geological jargon from time to time and offers more about the history of geological science than any lay reader might wish to know, A Most Improbable Journey is nonetheless entertaining. No doubt the book closely parallels the popular course in Big History he teaches at UC Berkeley. My only complaint is Alvarez’ unaccountable love for unnecessary emphasis. Surely, it’s not necessary to punctuate nearly every interesting observation with an exclamation mark! The frequency of this aberrant punctuation is annoying.
Walter Alvarez is a professor in the Earth and Planetary Sciences Department at the University of California, Berkeley. A geologist, he is best known for the hypothesis, developed with his father, that a meteor impact on the coast of Yucatan 66 million years ago led to the extinction of the dinosaurs. His father, Luis Walter Alvarez, was an experimental physicist who paved the way for the discovery of whole new families of subatomic particles, work for which he won the Nobel Prize in Physics.
One of the very best ways to gain insight into history and the ways of the world around us is to read biographies. Which explains why I read them so frequently. Over the more than six years since I began writing this blog, I’ve read dozens. Here I’m listing 27 that stand out in my mind.
The 27 books below are arranged in no particular order. You’ll see, too, that they cover a lot of territory. However, apart from Stacy Schiff’s biography of Queen Cleopatra and Robert Massie’s celebrated work on Catherine the Great, they’re all set in the 19th and 20th centuries. I occasionally read history set far in the past, but I’m far more interested in the modern era that began with the Industrial Revolution.
T. J. Stiles won both the Pulitzer Prize and the National Book Award for this outstanding biography of one of the seminal figures in American economic history. Cornelius Vanderbilt was the model for the generation of capitalists who came to be known as Robber Barons.
The amazing story of a 19th century superstar, little remembered today, who was regarded as a genius by Charles Darwin, Thomas Jefferson, and other leaders of Western civilization during and after his lifetime. This is the man who first laid down the principles of ecology — more than 200 years ago.
If any one person was most responsible for today’s divisive politics in America — and for the rise of the Tea Party and Donald Trump — it’s Roger Ailes. As the longtime chairman of Fox News, Ailes steadily made Right-Wing extremism ever more respectable. We’re all paying the price for that now and will probably continue to do so for a long time to come.
Few Americans today can imagine the abject fear that stalked summertime America when polio epidemics were an annual occurrence. Jonas Salk solved the problem. Often shunned by his fellow scientists, Salk was a true pioneer: pushing past the limits of medical science as it was understood in his day, he fashioned the trials that gave us the first (and safest) polio vaccine.
Hollywood’s portrayals of Queen Cleopatra bore little resemblance to the reality, as Stacy Schiff makes clear in this extraordinarily original biography. More historiography than simple biography, the book examines how the legend of Cleopatra grew over the centuries — and was steadily distorted in the process.
John F. Kennedy’s younger brother was showing the potential to eclipse him when an assassin’s bullet cut him down on his path to the White House. Apart from the Cuban Missile Crisis, in which Bobby Kennedy played a significant role at his brother’s side, and the goal JFK set of landing a man on the moon, it’s difficult to point to much in JFK’s presidency that history will regard as truly significant. Bobby seemed prepared to do much more.
Social change movements don’t start by themselves. Someone leads them. And often that person is what today we call a community organizer. Cesar Chavez was one such man, and this excellent biography is about the gifted teacher who taught him the tricks of the trade.
This surprising biography of the Civil War hero and famous failure won the 2016 Pulitzer Prize for History. As Stiles makes clear, the jealousy of Custer’s fellow officers was probably in large part responsible for the general’s defeat at the Little Big Horn.
Though mainstream society shunned him as a criminal, most African-Americans in his time looked on Malcolm X as a hero. Along with Rev. Martin Luther King, Jr., Malcolm must be considered one of the most significant figures in recent American history.
The real-life Karl Marx was very different from the caricature created by Lenin, Stalin, and their minions. He was, in fact, a man of his time and not really a revolutionary in the manner of Lenin, much less Stalin or Mao.
Today we take for granted that scientific advancement comes from huge, well-funded teams, not solitary individuals laboring away in white coats. This biography of the remarkable atomic physicist Ernest Lawrence tells the story of how Big Science came to be — and how he was a central figure in its creation.
Few of us know any more about the Wright Brothers than the image lingering in our minds of that flimsy biplane lifting off the dunes at Kitty Hawk. Here, the prize-winning biographer David McCullough tells their remarkable story. What’s especially interesting are the years after Kitty Hawk, when the brothers became world famous.
Walter Isaacson’s intimate biography of Steve Jobs grabbed the headlines, and it was beautifully done. But this later entry from two journalists who followed Jobs closely for many years gives a far more accurate and balanced picture of the man and his life. He was even more complex than we knew.
In his time, Joe Kennedy was considered by some (especially himself) a possible contender for the Presidency. When his hopes were frustrated, he transferred his ambitions to his sons. This is the insightful story of a remarkable man who established one of the most important families of 20th-century America.
In his own time, Clarence Darrow was one of the most famous men in America. As an attorney — the country’s leading attorney — for unpopular people and causes, he was probably loathed at least as widely as he was loved. But no one would ever have dreamed of dismissing him as inconsequential.
Among the Tsars of Russia, only Peter the Great can be considered a peer of the Prussian-born woman who married the heir to the throne, deposed her husband a few years later, and came to be called Catherine the Great. Catherine ruled over the country for 34 years, expanding its borders and modernizing its institutions along Western European lines.
Espionage is, of course, a risky business. Few spies manage to operate undiscovered for more than a few years. Those who gain access to secrets at the highest level tend to be in even greater jeopardy. Kim Philby was a rare exception. For three decades, he worked undercover in the UK as a spy for the Soviet Union inside the British intelligence establishment. Even after his English colleagues became convinced he was a spy and isolated him from access to sensitive information, the CIA continued to defend him.
From its earliest days into the present, the Church of Latter-Day Saints has been one of the world’s fastest-growing religions. For decades, the religion founded early in the 19th century by an uneducated young man in Upstate New York defended the practice of polygamy, a practice the founder himself indulged in to an extreme degree. Eventually, the Mormon church abandoned its defense of plural marriage, but the mystifying fantasy at the heart of the beliefs expounded by Joseph Smith nearly two centuries ago lives on.
Much of what the public knows about poverty in the Global South comes from the work of an American economist who gained fame at an early age working a “miracle” in Bolivia. Unfortunately, there were no miracles to follow in any of his work over the following three decades. As Nina Munk makes clear through diligent research, Jeffrey Sachs is no miracle-maker, and the path he described out of poverty is a dead end.
A Russian-American journalist unmasks the former KGB agent who has set out to reconstruct the Soviet empire and is now aggressively taking on the world. His intervention in Syria and his meddling in the 2016 American elections are just two of Vladimir Putin’s efforts to impose his will on the world. And, by the way, he’s stolen enough to amass a personal fortune of $40 billion. While Putin and his cronies have become absurdly rich, the Russian economy is in a shambles.
Robert Caro is one of America’s most celebrated political biographers. Though not without its critics, his multi-volume portrait of Lyndon Johnson is widely regarded as one of the best presidential biographies ever written — and it’s yet to be finished. The Passage of Power is the fourth volume, and it brings Johnson’s story only up to 1964, when he won the White House in his own right.
Like so many clowns, Kurt Vonnegut lived a sad life. His satirical take-downs of war, corporations, and life in mid-century America in his books were sometimes hilarious. But it doesn’t appear that the man laughed a lot. And even though for many years Vonnegut was regarded as one of America’s most important writers, it remains to be seen whether that reputation can endure much longer.
Any educated person in America today is likely to be familiar with two of Stanley Milgram’s experiments in social psychology. One was the “obedience” experiment, in which he showed that ordinary volunteers in his Yale laboratory could be persuaded to inflict extreme, and even life-threatening, pain on others simply because they were told to do so. The other was the “small world” experiment, in which Milgram found that we are separated from one another by no more than “six degrees of separation.”
The Office of Strategic Services (OSS), the wartime precursor to the CIA, was responsible for much of the partisan activity behind Nazi lines in Europe. Though later evidence suggests the OSS was only marginally helpful to the war effort, its chief, William “Wild Bill” Donovan, enjoyed FDR’s confidence and became world famous.
The historical record is shocking enough: the future Secretary of State and the future CIA Director helped steer Wall Street capital and American business to Nazi Germany throughout the 1930s. The older brother, Foster, brought the world to the brink of nuclear war with his diplomatic brinksmanship. The younger, Allen, first helped Nazi war criminals escape to the US and South America after World War II, sometimes with the fortunes they plundered. Later, he led US efforts to assassinate heads of government in Iran, Guatemala, Indonesia, Cuba, the Congo, and probably elsewhere. Yet, as David Talbot showed in his later book (listed just below), even worse was to come.
Digging much more deeply into the historical record, including interviews with contemporaries of Dulles and recently opened secret files, San Francisco investigative journalist David Talbot paints a much darker and more credible picture of Allen Dulles than Kinzer did in The Brothers. Even after JFK fired him as CIA Director, Dulles continued to meddle in political affairs at the highest level — with catastrophic consequences.
The astounding-but-true tale of how a penniless Eastern European immigrant founded the United Fruit Company, helped engineer the murder of the President of Guatemala, and became one of the richest men in the world. It was Samuel Zemurray whose efforts shaped the history of several of the countries that came to be called “banana republics.”
@@@@@ (5 out of 5)
There’s something seriously wrong with Mary Roach. Conjure up any vile, disgusting, or taboo subject that anyone in her right mind would shun — and you’ll find Mary Roach has written a book about it (or is probably planning to do so). Her books to date have dealt with cadavers, the afterlife, surviving interplanetary flight, the physics and chemistry of sex, and the human digestive (and excretive) tract. And she is not inclined to use circumlocutions. She calls crap crap. I love this woman!
Now, in Grunt, her latest book, Roach has turned her attention to military science, specifically the science-based efforts by the U.S. defense establishment to clothe, train, armor, and heal our soldiers, sailors, and airmen and protect them from every manner of wound and loss of function. Unfamiliar and uncomfortable topics such as battlefield hearing loss, shark repellent, bird strikes on airplanes, diarrhea, and penis implantation figure in the story. (The only major topic she avoids is PTSD, because, she writes, “it has had so much [coverage], and so much of it is so very good.”) Roach tells her tale with brutal honesty — and leavens it with an abundance of humor. Some passages are laugh-out-loud funny. Maybe what’s wrong with Mary Roach is that her sense of humor is so much better developed than it is in the rest of us. In any case, I love what she writes.
Unless you are remarkably knowledgeable about the U.S. military, you’re likely to learn a great deal about how it actually works by reading Grunt.
You get the point. Military science can be fun — at least, reading about it can.
Mary Roach has written eight books of science journalism. Her work has garnered several awards and been shortlisted for many more. Her books have been bestsellers from the start, beginning with her first effort, Stiff: The Curious Lives of Human Cadavers.