Cautionary Tale
Michael Crichton is best known for his technological cautionary tales. Through books like The Terminal Man, Jurassic Park, The Lost World, Timeline, and a whole bunch of others, he explored the promise of scientific and technological innovation – and all the ways it could be abused, often by greedy and unscrupulous corporations. I've seen him described as an anti-science writer (indeed, his views on climate change were... not great), but I don't think that's entirely accurate: I look back on his body of work as deeply skeptical of people and organizations and the ways they misbehave, a skepticism that remains fresh and relevant decades later.
Jurassic Park is a fantastic read. It's got a killer, high-concept premise (what if a company brought dinosaurs back to life by cloning them?) and a snappy, gripping cadence that hooks me in exactly the same way it did when I first read it in high school. While Steven Spielberg's film captures most of our nostalgic memories of the story and its meaning, I've always been partial to the book. In a lot of ways, the dinosaurs are window dressing for what Crichton was far more interested in exploring: how corporations are happy to throw scientific ethics out the window when it suits them.
In the novel, John Hammond is the founder of a genetic engineering corporation called InGen, and he's set up Jurassic Park with the intention of earning billions and billions of dollars from an adoring public that wants to see real, live dinosaurs in person. The park itself is technologically advanced: he brags that it can be run by just a handful of personnel, and that its computer systems can do everything from tracking the dinosaurs in the field to managing personnel and guest safety.
He also highlights the extensive precautions they've taken in bringing extinct and dangerous animals back into the world: the dinosaurs they hatch are all female and can't breed, they're engineered to depend on dietary supplements they can't survive without, and they're confined to an island more than a hundred miles off the coast of Costa Rica.
A question runs through the center of the novel: how do you set up a complicated system, and how do you prepare for contingencies when problems arise? This plays out through Hammond's conversations with mathematician Ian Malcolm, who specializes in chaos theory and predicts that the park is highly likely to experience some sort of catastrophe; that it's simply impossible to control every variable that could spiral out of control.
That's exactly what happens. At the beginning of the novel, a child is attacked on the Costa Rican coast by a mysterious lizard (which turns out to be an escaped Procompsognathus), and paleontologist Alan Grant, Malcolm, and others quickly realize that those precautions have failed: not only have the dinosaurs begun to breed (thanks to frog DNA spliced into their resurrected genomes), but an insider working to steal InGen's intellectual property wreaks havoc across the park when he shuts off the security systems, allowing some of the more dangerous creatures to escape.
As it turns out, extremely complicated systems are very, very difficult to control.
There's a section that jumped out at me during a recent reread:
"You wanted to fit them with radio collars," Hammond said. "And I agreed.
"Yes. And they promptly chewed the collars off. But even if the raptors never get free," Arnold said, "I think we have to accept that Jurassic Park is inherently dangerous."
Oh balls," Hammond said. "Whose side are you on anyway?"
A couple of pages later:
In the control room, Hammond said, "Damn, those people. They are so negative."
I'm reminded of my time covering the technology world, where I've seen plenty of smart people make counterproductive or even disastrous decisions: people so enamored of the products or companies they were building that they couldn't – or wouldn't – see the potential problems arising from their creations.
Hammond is a model technology CEO: I could see him placed at the head of anything from Facebook to Tesla to Uber without changing a thing. He's proud of his work and recognizes the potential it has to change the world, but he's also vain: he complains loudly when he realizes that people might not approach his creation with the outright awe his park is intended to elicit.
This is a familiar attitude from a range of tech figures. Elon Musk is a prime example here: between Tesla and SpaceX, he's pointed to each company's goals to reduce traffic fatalities or to turn humanity into an interplanetary species, only to push back on criticism about their respective safety records. Tesla has been involved in several high-profile crashes in which its Full Self-Driving system has been implicated, something that Musk has countered by flipping the argument around: "Human driving is not perfect." When it comes to SpaceX, Musk often deflects criticism by pointing to the data that his engineers will be able to glean from an explosion (sorry, not an explosion, a "rapid unscheduled disassembly") or to the company's mission as a net benefit for humanity.
He's not wrong on either count: humans do suck at driving, and you can get a lot of data from a failed launch. And Hammond isn't wrong when he points out that there are engineering solutions to the problems the park will eventually face.
But these individuals aren't really working in humanity's best interest, are they? Their arguments about making the world a better place are just PR window dressing that deflects attention from their enormous profitability. Indeed, in a conversation with geneticist Henry Wu, Hammond points out that helping humanity takes a back seat to getting rich, precisely because of the regulations and red tape that governments would throw in the way:
"If you were going to start a bioengineering company, Henry, what would you do? Would you make products to help mankind, to fight illness and disease? Dear me, no. That's a terrible idea..."
"Suppose you make a miracle drug for cancer or heart disease–as Genentech did. Suppose you now want to charge a thousand dollars or two thousand dollars a dose. You might imagine your privilege. After all, you invented the drug, you paid to develop and test it; you should be able to charge whatever you wish..."
"From a business standpoint, that makes helping mankind a very risky business. Personally, I would never help mankind."
Hammond's ability to skirt around problems and bury his head in the sand demonstrates exactly why his park will fail: in his pursuit of profitability and his eagerness to get investors and regulators off his back, he's willing to ignore the real and pressing challenges in front of him. When his chief park engineer John Arnold begins listing off the bugs and problems the park is experiencing, Hammond essentially waves those arguments away: "Let's keep it in perspective. You get the engineering correct and the animals will fall into place."
Crichton's cautionary tale of a novel is fun to read, but it also highlights the sociopathic nature of these companies: in their rush toward market share, profitability, or power, corporations are often willing to bend ethics, guidelines, laws, principles, and rules to make their investors happy and rich, glossing over the finer details and potential problems that stand in their way.
This is a book about pushing back on these types of actors, about pointing out the clear flaws and dangers embedded in a system, and about not taking someone at their word when they promise incredible things. These are lessons that apply throughout life and society; the tech barons of the 2020s aren't the only people in power who exhibit these traits.
This is a key reason why Jurassic Park is well worth reading: Crichton, for all his own issues, hit the nail on the head with this particular theme. We can do incredible things with science and technology, but those incredible things are often facilitated by actors with motives that are less than altruistic. It's a book I wish I'd returned to earlier: an essential lesson about today, wrapped up in the guise of a gripping thriller.