In 1858, Ralph Waldo Emerson traveled to the bustling industrial center of Cincinnati and gave a lecture that would become “Works and Days,” on the promise and peril of mechanical progress. “Our 19th century,” Emerson said, “is the age of tools ... and we pity our fathers for dying before steam and galvanism, sulfuric ether and ocean telegraphs, photograph and spectroscope arrived, as cheated out of half their human estate.”
If the 19th century was the age of tools, the 21st century is the age of very, very cool tools. And we would join most of our readers in pitying Emerson for never experiencing the joys of iPhones and microwave ovens.
Engineers—and the people who pay for what they invent—have inherited the earth.
But in our more sober moments, we might envy Emerson. He never had to live through mechanized warfare and environmental degradation. Many of his prophecies concerning the fate of the machine age remained mere prophecies; our generation—not his—has the unfortunate task of guarding against their eventuality. And so our engineers have a task ahead of them that has little to do with becoming more proficient inventors and everything to do with becoming better humans.
When Henry David Thoreau chided the readers of Walden that “Men have become tools of their tools,” he was suggesting, following his friend Emerson, that tools—initially developed to give us more freedom in our daily routines—have come to structure, and in some cases dominate, the lives that they were meant to liberate. Today, technology structures our communication, transportation, education, health care, personal records, and financial systems. Technology drives the distribution of our food, water, energy, and shelter. Technology shapes the way we work, the way we are born, the way we die, and the human relationships we form in between. The sonogram signals our first sparks of life; the heart monitor announces the point at which we’re snuffed out; and social media document the number of “friends” we’ve made in the interim.
Even in his own age of tools, to say nothing of ours, Thoreau suspected that we often confuse technological advancement with moral and legal progress. And so that which is easy and convenient is often regarded as necessarily moral and just. Just as long as we don’t misplace our smartphones, our lives can remain intact.
But what if the elision between efficiency and the good is not as simple as we would like to think? What if there is no elision at all? These are the questions that engineers need to ask in the coming century. But we fear they won’t.
“The machine unmakes the man,” Emerson continued in his lecture. “Now that the machine is so perfect, the engineer is nobody.” A hundred years after Emerson’s lecture, John von Neumann imagined the unsettling point known as “technological singularity,” when artificial intelligence would outstrip, and then make obsolete, the functions of human reason. Fringe critics of technology have been wringing their hands ever since.
But our concerns are more immediate: We worry that the many conveniences delivered by modern engineering make citizens loath to interrogate the moral implications of technological progress. As our technology becomes more precise, more perfect in the technical sense, it becomes less and less open to thoughtful criticism. It’s not that technology outstrips our rational capacities, but that its impressive capabilities distract us, or, even worse, that our vested interest in not having our cool toys taken away from us will make us willfully blind to the moral risks associated with those toys. Humans might willingly forfeit their rational and moral autonomy well before we reach the point of singularity.
Defending against this type of moral dystopia is not simple. Being a rational agent means that one should be able to exercise personal freedom in all sorts of ways, even those that are not particularly healthy. So limiting public access to Google Glass or Parrot drones is obviously not the way forward. Paternalism, despite its obvious efficiencies, is not a viable option in a liberal democracy.
The way to avoid forfeiting human agency in our technological age might lie much earlier in the research-and-development cycle—in the ethical decisions of engineers.
Consider a few promising facts: Engineers, unlike most social scientists and scholars in the humanities, already commit themselves to a complex code of ethics before entering their profession. And engineers, on the whole, are the Type A thinkers who could obsess (in a good way) over this code if they were given proper instruction that was reinforced throughout their careers.
The National Society of Professional Engineers articulates this Code of Ethics and its fundamental principles. Some of those cover simple, but not unimportant, issues like not lying to employers and not performing activities outside areas of competence. These make sense, but do not get to the heart of Emerson’s concern about our technological culture: that we give up some of our humanity in the pursuit of technological progress.
The NSPE Code begins to address that danger too, albeit in language not sufficiently urgent. The first, and supposedly most important, canon is to “hold paramount the safety, health, and welfare of the public.” If we take this canon seriously, it implies something counterintuitive about the skill set of tomorrow’s engineers, namely that they should be experts in discussing and determining what constitutes the common good. That might seem like a crazy idea, but Plato had a similar one when he suggested in the Republic that the most important training of soldiers is not in the technical art of swords and spears but in the practice of ethical judgments.
The engineers’ society proposes that its members can look out for the common good by being “encouraged to participate in civic affairs; career guidance for youths; and work for the advancement of the safety, health, and well-being of their community.” Encouraged? That tepid suggestion fails to reflect the nature of the obligation that engineers should have to their fellow citizens. The NSPE is not afraid of stating plainly that professionals “shall acknowledge their errors and shall not distort or alter the facts”; a similarly definitive statement is warranted concerning their pursuit of the public good: “Engineers shall henceforth be philosophers.”
The most experienced and seasoned engineers, those who have witnessed multiple life cycles of technology, recognize the need for philosophical reflection. The renowned chemical engineer John M. Prausnitz has noted, “If engineering is the application of science for human benefit, then the engineer must be a student not only of the application of science, but of human benefit as well.”
Today the public good is largely defined in the narrow terms of instrumental reason and technical efficiency, but that is not the best, much less the only, way to define it. Following philosophers such as Plato and Emerson, engineers need to recognize that there is a radical difference between efficiency and virtue and that it’s profoundly wrongheaded to confuse the two.
Engineers need to consider and debate the far-reaching outcomes of their inventions. That means speculating about the second- and third-order effects of even seemingly minor discoveries. If that slows down production cycles, so be it. If engineers fail to carefully weigh the long-term impact of their innovations and neglect to provide appropriate guidance for novel devices, then engineers share the culpability if their machines are used in ways that harm the public good—for example, in nuclear warfare and unmanned combat.
Attending to the common good also means that engineers should become intimately familiar with, and then employ, longstanding theoretical frameworks when justifying their activities, much the way that soldiers and military leaders have recourse to a martial code of ethics. For instance, engineers might consider principles of Kantian ethics (which underscore the importance of human autonomy and responsibility) in designing drone technology for surveillance and combat.
Such theoretical frameworks can help engineers think through important questions: What is the scope of the public that I am serving? What are the moral and legal objectives of an ideal society? How might the technology that I am developing further or undermine that ideal?
A key feature that distinguishes the art of engineering from the pursuit of fundamental science is the number of possible solutions for any given problem. Whereas scientists recognize that there is only one scientific hypothesis that correctly explains a given observation, engineers appreciate that several different designs might meet the same social need. It is the job of the scientist to find the one true universal law underlying a natural phenomenon; it is the job of the engineer to choose the best design solution given a number of possible choices.
Engineers bear a responsibility to consider both nontechnical and technical solutions to societal challenges. Because engineers have the skills to bring their visions to fruition, the engineer is both an arrowsmith and an archer, shaping the future direction of society. The key question for the engineer is: What kind of society am I creating?
Of course, such frameworks won’t guarantee that engineer-ethicists come up with the right moral answers; that’s not how moral deliberation works. Ethical deliberation, according to Aristotle in the Nicomachean Ethics, starts in vague outline and then, over time, comes into greater focus. Such deliberation should be natural for engineers, since their design process is iterative; they continually refine their designs.
Defining the common good should not be an afterthought, but rather the first step in a continuing discussion through the process of research and development. This is what it means to avoid becoming a tool of our tools.
Most engineering programs have some sort of ethics course requirement. But we work with engineers every day, and we suspect that most students view this course as just something to get through, a necessary but routine part of their curriculum. It doesn’t have to be that way.
Most engineering-ethics courses are chock full of case studies that outline the dangers of malpractice: the Challenger explosion, Chernobyl, the Ford Pinto debacle. In each case, the danger is clear and present. There is no mystery that something went wrong; the trick is to find the solution.
Solving the “dilemmas” of these case studies is relatively easy (and boring) when compared with the fascinating difficulty of defining virtue and responsibility in the 21st century. Give engineers the real dilemma, in personal and riveting terms, and we suspect they will fulfill their new roles as philosophers quite nicely.
Buried in a reading list of a dozen books—most of them case studies and dry treatises of professional ethics—for the Massachusetts Institute of Technology’s graduate course in engineering ethics is a little red volume first published in 1945. General Education in a Free Society was written by a commission of American intellectuals after World War II as an educational how-to manual for avoiding totalitarianism and fascism. More specifically, it became the general-education scheme for Harvard College after the war.
This sort of book, both general and philosophically sophisticated, should be central in the training of engineers in our age of tools. Its central thesis is more pressing today than it was when it was first published: General education in a free society, an education that centers on the broadest of humanistic questions concerning responsibility and the common good, is the only appropriate education to supplement the expert knowledge of a growing number of technologists who will, we are sure, create ever-more-perfect machines.
The final canon of the NSPE ethics code is as vague as the first, and as pivotal in ensuring that we do not relinquish the very thing that our tools were meant to facilitate: our humanity. Engineers are to “conduct themselves honorably, responsibly, ethically, and lawfully so as to enhance the honor, reputation, and usefulness of the profession.”
That is a banal platitude, but in the words of David Foster Wallace, “banal platitudes can have a life-or-death importance.” The call to conscience of the final canon of the National Society of Professional Engineers might be regarded as a moment of wishful thinking or as the linchpin that ensures that all of the other, more specific tenets of its ethical code will be followed. We prefer to think of it as the latter.
John Kaag is an associate professor of philosophy at the University of Massachusetts at Lowell. He is the author of A Wilderness of Books, forthcoming from Farrar, Straus and Giroux. Sujata K. Bhatia is a lecturer and assistant director of undergraduate studies in biomedical engineering at Harvard University and an associate of the John F. Kennedy School of Government.