IN THE PAST SEVERAL YEARS, the media has produced a steady stream of stories about Silicon Valley tech executives who send their children to tech-shunning private schools. Early coverage included a widely discussed 2011 New York Times article about the preponderance of “digerati” offspring, including the children of eBay’s chief technology officer, at the tech-averse Waldorf School of the Peninsula. A 2017 article in the Independent discussed the technology-free childhoods of Bill Gates’s and Steve Jobs’s kids. Haplessly conflating correlation and causation, it linked teens’ technology use to depression and suicide and smugly concluded, “wealthy Silicon Valley parents seem to grasp the addictive powers of smartphones, tablets, and computers more than the general public does.” A 2018 New York Times article called smartphones and other screens “toxic,” “the devil,” and tantamount to “crack cocaine,” and intoned, “[t]echnologists know how phones really work, and many have decided they don’t want their own children anywhere near them.”
These articles assume that techies have access to secret wisdom about the harmful effects of technology on children. Based on two decades of living among, working with, and researching Silicon Valley technology employees, I can confidently assert that this secret knowledge does not exist.
To be sure, techies may know more than most people do about the technical details of the systems they build, but that’s a far cry from having expertise in child development or the broader social implications of technologies. Indeed, most are in thrall to the same myths and media narratives about the supposed evils of screen time as the rest of us, just as they can be susceptible to the same myths about, say, vaccines or fad diets. Nothing in their training, in other words, makes them uniquely able to understand arenas of knowledge or practice far from their own.
As a case in point, many techies’ conviction that they must monitor and cultivate — with concerted effort — their children’s technology habits is firmly and prosaically rooted in the values and worldviews shared by many non-techie middle-class parents. Private schools almost by definition have to craft stories that appeal to privileged strivers anxious about their children’s futures. Some of these stories recount how their graduates’ creative brilliance was spawned in their school’s tech-free environment. Related ones ply anti-contamination themes and fetishize the purity of childhood. Techie parents are as susceptible as anyone else. Moreover, the ways in which technology fits into these narratives — or is actively excluded from them — have far more to do with parents’ age-old fears about social change and new media than with any special knowledge vouchsafed to tech workers. Indeed, such stories are similar to widely held beliefs in 18th-century England that novels corrupted the soul. In the latter half of the 20th century, first television and then video games became the sources of this alleged corruption, joined by the internet at the dawn of this century.
This isn’t to say that these media are universally good for us — not at all — but their influence is far more nuanced and contextually dependent, and far less dystopian, than the stories quoted above lead us to believe. Research consistently shows that what really matters is the context of children’s technology use (is this time for the family to be together or a digital babysitter?), the content they consume (is this videochatting with grandparents or violent videos?), and how adults communicate with them about what they are seeing.
The more important point here is that believing that techie parents have secret insider knowledge about the harmful effects of children’s technology use reinforces the dangerous myth that techies are always the smartest people in the room — and that the more technical they are, the more wide-ranging their expertise.
As an example of how a technical background does not insulate people from specious reasoning, we need look no further than the vaccination rates at the elite, technology-shunning Waldorf schools in and around Silicon Valley. Scientific consensus has long upheld the safety and importance of vaccines. Yet at the Waldorf School of the Peninsula, the techie-dominated, tech-shunning school featured in the 2011 New York Times article, an average of only 36 percent of kindergartners were fully vaccinated in the five years before California’s personal belief exemption was removed in 2016. If so many techie parents at this school are susceptible to vaccine misinformation, they are surely just as susceptible to screen-time dystopianism, unfounded fears of “contamination,” and other forms of misinformation.
And yet I encounter this myth almost every day. As a culture, we look to tech company founders and executives for advice on solving the world’s problems, whether through their charismatic philanthropy, their shepherding of the high-profile TED Talk ecosystem, or their calls to simply “lean in” to overcome sexism in the workplace — topics generally falling well outside their expertise. And this extends beyond technology executives to employees as well.
¤
My first exposure to this myth of “techies as thought leaders” was actually from the inside — as a computer science major at UC Berkeley in the early 2000s, where I constantly heard stories about engineering exceptionalism. Computer science was rumored to be both the most difficult and the most desirable major on campus. I heard the usual array of sexist jokes about the relative value of “hard” and “soft” disciplines, used to dismiss any non-techie and all non-technical majors. And perhaps most damning and dangerous, we were told, and many came to believe, that technology could be the solution to every problem. This has emboldened many techies to ignore evidence to the contrary — including the fact that many online spaces are indeed harmful to most of the population; that technology-driven education and development projects are often short-lived and come at the expense of long-term improvements; and that far from flattening hierarchies, technology has enabled ever more power consolidation, surveillance, and control. Beliefs in techie superiority are, unfortunately, buttressed by the fact that money confers credibility: even inexperienced computer science majors can earn three times more from a summer internship at a tech company than from a whole year of work-study at the university. In short, when I was a student, it was all too easy to believe that we were demigods with the ability to do anything.
More crucially, our classes taught us that the power of computer science and engineering lay in “modularizing,” “parameterizing,” and creating solvable “abstractions” that separate out the messiness of real-world contexts. Grappling with the world’s complexity was, quite literally and intentionally, beyond the scope of computer science education, and dismissed as unimportant. To the extent that this complexity was discussed at all, it was in the department’s few and often-derided “human-computer interaction” classes or in its sole “ethics” seminar, which tended to focus on how to avoid software-caused disasters like the Therac-25 radiation deaths. Not only were these classes optional for computer science majors, but the instructor in one such class quipped one day that classes like his were always assigned undesirable morning time slots, a signal of their low standing in the department.
At Stanford, where my PhD in communication and minor in anthropology expanded my expertise beyond the limits of this technical education, the culture and training in the computer science department from the mid-2000s to the early 2010s appeared to be much the same as they had been at Berkeley. These norms have defined the technical world more broadly. Grants abound for researchers in engineering departments to tackle big ethical issues, while those in the social sciences who actually study these issues in depth have to scrounge for funding. Even though an internal study at Google found that technical skills were among the least important variables in predicting the effectiveness of its team leadership, many companies across the industry have continued to favor technical degrees when hiring managers. Moreover, they elevate CS majors into titles like “Design Ethicist,” though such degrees include no specific training in ethics or social science. Not only have companies like Google actively disregarded the “soft” fields that have been studying these areas for decades, but they have also ignored the many lessons they could learn from those fields’ findings. And on a personal note, I have found that my computer science undergraduate degree has opened far more doors at tech companies and started more conversations with potential tech research sites than it really should, given how little I draw on that training in my interpretive work today.
In short, there is nothing about being a techie — either in terms of training or work — that naturally equips them to be moral or thought leaders. If anything, the apparent disjuncture between the technologies they help to build and the technology-free elite educations some of them choose for their own children says more about their comfort with the deep inequities that their work and personal choices help to sustain than it does about insider wisdom. It’s of a piece with their choosing not to vaccinate their own children, which is meant, in part, to give them an alleged competitive health edge at the expense of everyone else. Even during my own computer science training, when the idealism of the tech world was still relatively strong, the few women and minorities in our department ran up against its inherent biases, which have only gotten worse with time. Nowadays, with almost metronomic regularity, we hear about how racism, sexism, and sexual harassment within the industry largely go unpunished; how technologies surveil and discipline the most vulnerable; how companies cooperate with totalitarian regimes and compromise democratic processes; and how the industry has enabled and profited from unprecedented data consolidation. The various mantras I first learned as a computer science student and heard repeated hundreds of times across the industry — such as “move fast and break things” and “it’s easier to ask forgiveness than to get permission” — are having their “disruptive” effects, and the results aren’t pretty.
¤
This is not to say that technology executives and employees are, by and large, trying to be evil any more than most anti-vaxxers are. Rather, many of them simply get wrapped up in the interesting technical details of the specific project they are building, or in the relentless drive for funding or promotions or recognition. Few of them consider more than abstractly and superficially the broader moral dimensions of their work — except, perhaps, when it comes to giving their children a leg up by protecting them from its purported effects. The companies that hire them institutionalize these norms by rewarding launches and lines of code rather than careful ethical reviews or decisions not to pursue a project. The tech executives who set these policies, even further removed by privilege from much of the population they might affect, are also compelled by shareholders to wring profit out of ventures that are often unprofitable without personalized advertising. As we now know all too well, this personalization has demanded ever more data collection — however dubious the ethics of the collection techniques and the quality of the resulting data.
Of the few who do go against the grain of engineering education and the norms of their industry to question the moral valence of their work, fewer still have the language or perspective to really grapple with the complexities they encounter. We can see this in the messaging of Tristan Harris, Google’s former “design ethicist,” trained in computer science at Stanford. He has been called “the closest thing Silicon Valley has to a conscience.” Nonetheless, he, too, leans on simplistically dystopian and technologically deterministic tropes and cherry-picked examples in fearmongering about our “addictions” to technology — the same kinds of stories that some techie parents lean on to justify technology-free private schools for their kids. In fact, the very same cultural fears about purity and contamination underlie desires to avoid technology, to eat organic and non-GMO food — and to avoid vaccines. Parents often displace their own anxieties onto their children, whether withholding screens or vaccines, and techies are no different. In short, even though this population is less likely to be enthralled by the inflated promises of artificial intelligence and other algorithmic sublimes, most lack the education and experience to unpack the intricate interplay between technical artifacts and the social worlds they inhabit.
When combined with the myth that techies are the smartest people in the room, this moral lacuna takes on new urgency. Just as technical backgrounds have not insulated techie Waldorf parents from specious reasoning about vaccines, they have afforded them no privileged ability to assess technology’s influence on us more generally. Technical training as it generally exists today may even do the opposite, encouraging a degree of arrogance. As a society, we must see the technology world for what it is: an industry as insular as it is influential, and in desperate need of many more kinds of expertise.
¤
Morgan G. Ames’s book, The Charisma Machine: The Life, Death, and Legacy of One Laptop per Child (MIT Press, 2019), chronicles the history and effects of the One Laptop per Child project and explains why — despite its failures — the same utopian visions that inspired OLPC still motivate other projects trying to use technology to “disrupt” education and development. Ames is on the faculty of the School of Information at the University of California, Berkeley.
¤
Featured image: “Watching sid the science kid on the ipad” by jencu is licensed under CC BY 2.0.
Banner image: “The kids, with their heavy backpacks, head out to the bus” by woodleywonderworks is licensed under CC BY 2.0.