“I have a theory that much recent tech development and innovation over the last decade or so has had an unspoken overarching agenda—it has been about facilitating the need for LESS human interaction. It’s not a bug—it’s a feature. … I see a pattern emerging in the innovative technology that has gotten the most attention, gets the bucks and often, no surprise, ends up getting developed and implemented. What much of this technology seems to have in common is that it removes the need to deal with humans directly. The tech doesn’t claim or acknowledge this as its primary goal, but it seems to often be the consequence. I’m sort of thinking maybe it is the primary goal. There are so many ways imagination can be manifested in the technical sphere. Many are wonderful and seem like social goods, but allow me a little conspiracy mongering here—an awful lot of them have the consequence of lessening human interaction.
…
… Engineers and coders as people are often less than comfortable with human interaction, so naturally they are making a world that is more accommodating to themselves.
This last one might be a bit contentious, but hear me out. My theory is that much tech was coded and created by folks somewhere on the spectrum (I should know—I’m different now, but I used to find most social interactions terrifying). Therefore, for those of us who used to or who do find human interactions awkward and uncomfortable, there would naturally be an unconscious drive to make our own lives more comfortable—why wouldn’t we? One way for an engineer to do that would be to remove as much human interaction from their life, and therefore also our lives, as possible.”
I’ve been reading up on robotics lately, and this puts its finger on something that makes me uneasy about the implicit Silicon Valley vision of the future: it looks like the utopia of somebody who doesn’t particularly like dealing with other people (or at least with strangers).
Drone deliveries instead of mailmen, robo-cars instead of buses and taxis, order screens instead of cashiers, robots instead of waiters, cleaners, and nurses, online education instead of classrooms … put it all together and there’s a definite arc of transforming the utilitarian side of daily life from a social experience into a solitary one. You could even see the grand UBI + automation vision as fitting into this: most jobs are social experiences, while having money unconditionally auto-deposited into your account every month is a solitary experience. In such a robotopian future people will still socialize, e.g. by getting together with friends or seeking partners on dating sites, but they will increasingly do so only on their own terms; the necessity of dealing with other humans just to survive will be reduced or eliminated. Think of common stereotypes of tech types in relation to this, and it’s not hard to read in a desire to eliminate potential sources of social friction by replacing interactions with humans (who have independent will and desire) with interactions with machines (which do not). A self-checkout machine will never get annoyed with you for insisting on digging out 11 cents’ worth of change instead of just handing over a ten-dollar bill, and a robot bus will never get sassy with you over a miscommunication (as a human bus driver once did with me).
I strongly suspect I’m on the spectrum myself and have struggled with social stuff, so I have sympathy for this impulse … but I’m uneasy with it. If this is happening, it seems like an unrepresentative clique optimizing the future for their own atypical priorities and preferences, without realizing that’s what they’re doing.
Of course, some critiques come to mind:
If you look at what capitalism is actually doing instead of at futurological speculations, what’s happening is kind of the opposite of this. Communication technology has exploded, while the roboticization of daily life remains mostly a fantasy. And jobs that are relatively low on social interaction have declined, while the social-intensive jobs (service and helping professions and bureaucracy) have become more important. Say what you will about factory work, but it required little emotional labor, and that’s an option that’s increasingly being foreclosed. Near-term trends seem likely to push this further: the near future will probably have fewer factory workers and drivers and more waiters, cleaners, nurses, doctors, salespeople, and bureaucrats. This is terrible for socially awkward people, at least on the employment end. Incidentally, I think this is a major factor behind the rise of gender anxiety: women tend to be better than men at the sort of servile sociality this kind of economy demands.
I think this may have something to do with the fact that Silicon Valley doesn’t actually rule our civilization: Wall Street, Washington, and Main Street get a say too, and they’re either psychologically typical (Main Street, by definition) or skew toward different atypical personalities (my cynical side suspects that Washington and Wall Street skew toward sociopaths with high cognitive empathy, i.e. the socially awkward person’s natural predator). Futurology looks different from reality because Silicon Valley and its adjacent fellow travelers have much more influence over futurological fantasies than over the real shape of society.
As @bambamramfan said, the actual trend isn’t so much toward simple dehumanization as transactionalization. I think this actually fits pretty well into an expanded version of Byrne’s thesis though.
- The great nemesis of socially awkward people is the tendency of human societies to be full of unwritten rules, implicit contracts, unacknowledged hierarchies, and unpredictable impulsive spontaneity. Transactionalization means the rules and contracts become explicit, and spontaneous impulsivity is circumscribed (breaking the contract means negative consequences). I’ve talked before about the possible relationship I see between autism adjacency and a preference for explicit contracts. I think you could tie this back to David Byrne’s idea and suggest roboticization, transactionalization, bureaucratization, feminist “no means no” sex norms, etc. as aspects of an attempt by socially awkward people to “terraform” society into something more hospitable to themselves (or perhaps I should say ourselves).
- I’ve been reading Just Ordinary Robots, and one of the points the authors make is that social rationalization often prefigures and paves the way for mechanization. First, social practices are reformed to be more consistent and efficient; this makes them more amenable to automation, and then they are automated. The assembly-line factory is much more robot-friendly than the sixteenth-century workshop. One of the reasons industry is so friendly to robots is that it’s a controlled environment that’s already heavily rationalized. It would be easier to create home robots or car robots if our homes or highways were more rationalized (think of railroads as a much more rationalized version of the highway).
Of course, this is all based on fairly crude stereotyping. All I can say about that is that I don’t think stereotypes are an entirely useless heuristic. And I’m a socially awkward nerd who may be on the spectrum, so Byrne’s idea sounds believable to me because I can see the appeal, for somebody like myself, of replacing interactions with potentially surly humans with interactions with safe and obedient robots.
Edit: I also want to say I disagree with Byrne about social media. It’s true that it isn’t as rich as face-to-face interaction, but I think it contains the essence of the social: you’re dealing with other intelligent beings with minds, agendas, desires, etc. of their own.