Your computer is on fire
edited by Thomas S. Mullaney, Benjamin Peters, Mar Hicks and Kavita Philip (MIT Press, 2021)
A simple philosophy guides many leaders today: create companies that can harness the power of new technology, grow as quickly as possible, and show investors that they can compete in an uncertain future. The result has been the rapid proliferation of digital (and digitally powered) businesses that have become increasingly relevant and essential to our lives. But these companies have also produced a wide range of unintended adverse consequences.
The pressure on companies to confront these consequences, including breaches of personal information, algorithmic bias, the spread of disinformation, widening inequality, and environmental damage, grows by the day. Activist shareholders, among many other stakeholders, are advocating for “responsible technology” policies and for closer links between technology ethics and executive compensation. Environmentally and socially conscious consumers are voting with their wallets, pushing companies to reevaluate their products and purpose, including their role as employers of a diverse and engaged workforce. The global pandemic has only added to the momentum for change.
But how can companies maximize the positive impacts of technology while minimizing the negative ones? This is the next big challenge facing business leaders and our economic system as a whole. And that’s why Your Computer Is on Fire, a critical read for business leaders looking to tackle this issue head-on, is the best tech book of 2021.
In some ways, this is an unusual choice. The book consists of 16 essays rather than a single account, and they’re written by academics for a target audience of STEM students, humanists, technologists, and social science researchers. But readers who make their way through the more than 400 pages of Your Computer Is on Fire will come away with their frames of reference more than a little shaken. The authors fearlessly dismantle the tech industry’s most sacred assumptions, forcing a rethink of everything we accept as true about our digital lives and the multibillion-dollar digital transformations underway within our businesses. Chapter titles such as “Gender Is a Business Tool,” “A Network Is Not a Network,” and “Coding Is Not Empowerment” give a sense of how far the provocations go.
In the collection’s first and most provocative essay, “The Cloud Is a Factory,” Indiana University associate professor Nathan Ensmenger challenges readers to think differently about one of the most popular and transformative business technologies of a generation: the cloud.
What exactly is the cloud? The quick answer is that it is a collection of computing services, ranging from email to inventory-tracking software, that users access through the internet rather than through desktop machines or in-house servers. Cloud computing platforms have proven to be a powerful way to test new approaches and experiment with new technologies, including advanced analytics and 3D printing.
But in much simpler terms, the cloud is a collection of computers located in a data center somewhere, computers that require physical materials such as metal and plastic, as well as electricity, water, and people. A bit like… an industrial factory. As Ensmenger observes, a typical data center consumes between 350 and 500 megawatts of electricity and requires about 400,000 gallons of fresh water per day for cooling.
Yet, because the term cloud has been used as a metaphorical device, and because the cloud tends to be seen as a benign virtual technology solution, the IT industry has managed to bypass the long history of regulating physical infrastructure resources. In the past, when a traditional factory polluted the water supply or maimed workers, public policy reacted, even belatedly. But the cloud remains largely unregulated, with all of its factory-like negative effects underestimated. “Let’s bring this deliberately ambiguous and ethereal metaphor back to earth by rooting it in a larger history of technology, labor and the built environment, before it’s too late,” Ensmenger pleads.
In another essay, “Your Robot Isn’t Neutral,” Safiya Umoja Noble, an associate professor at the University of California, Los Angeles, calls for a deeper, common-sense understanding of how data is formed, which, she argues, is essentially a social process. Just as race and gender are social constructs, things we decide on rather than things that are immutable or naturally occurring, so too is the data that has come to dominate our lives. The problem is that the fabrication of this data has become disconnected from the historical social practices that inform its construction. When data grows out of a set of discriminatory social processes, such as the compilation of policing statistics in a city, it often goes unacknowledged that the data also reflects practices such as excessive surveillance and disproportionate arrest rates in African American, Latinx, and low-income neighborhoods, Noble argues. “The concepts of data purity and neutrality are so deeply embedded in the training and discourses of what data is that it is very difficult to move away from the reductionist argument that ‘mathematics cannot discriminate because it’s math,’” she writes.
The essays also address gender inequality, another persistent issue in the high-tech world. In “Sexism Is a Feature, Not a Bug,” Mar Hicks, an associate professor at the Illinois Institute of Technology, tells the story of sexist hiring and firing practices in the computing industry in England, and how computer technology became an “abstraction of political power in machine form.” These failures are not just accidents, writes Hicks; “they are features of how systems were designed to operate and, without significant outside intervention, how they will continue to operate.”
Although each writer examines a different question through their own lens, the collection of essays in Your Computer Is on Fire achieves narrative cohesion. History, especially the history of computing and industrial society, serves as a thoughtful and astute organizing device, because the tech industry is not designed to look back, only forward. The industry has been built on the notion of constant reinvention, and the authors know that mining history for lessons is not in its DNA. But it should be. As the writers of Your Computer Is on Fire make clear, there’s a lot at stake when we ignore history and fail to think more humanistically about IT.
Undoubtedly, companies need to tackle the damage created by technology to prevent the harm from outweighing the gains. The book does not offer concrete recommendations for crafting responsible technology policies, nor does it describe major policy changes. But Your Computer Is on Fire succeeds by forcing us to adjust the way we think and talk about the core issues at the center of business and society. And that, the authors note, is a great starting point for change.
As Benjamin Peters, a professor at the University of Tulsa and the author of the essay “A Network Is Not a Network,” explains: “Technology will keep neither its promises nor its curses, and technological observers should avoid both the utopian dreamers and the dystopian catastrophists. The world is truly on fire, but that is no reason for it to be cleansed or ravaged at the exact day and time predicted by the self-proclaimed prophets of profit and doom. The flow of history will continue to surprise.”
Futureproof: 9 Rules for Humans in the Age of Automation
by Kevin Roose (Random House, 2021)
Artificial intelligence and advanced robotics allow machines to perform tasks that once required a person. According to some estimates, nearly half of all jobs in the US economy could become obsolete. But what if our future reality were more nuanced than that? What if automation removed millions of people from their jobs while also improving healthcare diagnostics and slowing climate change? And how do you thrive in this kind of hybrid environment? These are the questions at the heart of Futureproof, a fascinating book by New York Times columnist Kevin Roose. With honesty and humor, Roose attempts to correct some flaws in the way we think about AI and suggests ways to make the most of its benefits. Whether he is advocating the integration of “consequentialist thinking” into a standard STEM curriculum or encouraging “digital discernment,” Roose makes an important contribution to the scholarship surrounding our AI future in this immensely readable and practical book.
A World Without Email: Reimagining Work in an Age of Communication Overload
by Cal Newport (Portfolio / Penguin, 2021)
Did you get my email? Email, and its ever-increasing volume, has become the bane of working life in the 21st century. But Cal Newport, an associate professor of computer science at Georgetown University, thinks we can do without it. In his book A World Without Email, Newport takes on the way workplaces create a “hyperactive hive mind” (constantly and quickly communicating, responding, and sharing information) and the problems that result from it. This style of working, Newport argues, forces people to constantly check their inboxes or messaging platforms, which reduces their ability to focus and concentrate, causes mental fatigue, and contributes to job dissatisfaction. His very accessible book sets out four simple principles for redesigning work without email: the attention capital principle (treating attention as a valuable resource), the process principle (developing work processes that maximize the value generated by your attention), the protocol principle (structuring work processes to optimize coordination among employees), and the specialization principle (allowing employees to work more deeply on fewer things). While it’s not easy to change our workplace email culture, Newport notes that doing so is “one of the most exciting and impactful challenges” we face today.