Here in the Year of Our Simulation 2024, humans have never been better at hating the very forces underlying this simulation – which is to say, at hating digital technology itself. And good for them. Today's ever-present critics of technology no longer rely, in their trend-chasing preoccupations, on vague, nostalgic, technophobic feelings alone. Now they have the science to prove it. They have bestsellers by Harari and Haidt. They have – imagine their smugness – statistics. Kids, in case you haven't heard, are killing themselves.
Little of this bothers me. Teen suicide, obviously, is real and terrible, but the arguments that blame it on technology are not hard to refute. What worries me is what I take to be the one exception to this rule: the anti-technology argument put forward by a serious philosopher.
When I say philosopher, I don't mean a writer who publishes statistics and peddles self-help. I mean an analyzer of the deepest, most exacting sort, someone who breaks a problem down into its necessary pieces such that, when the pieces are put back together, nothing looks quite the same. Descartes didn't just toss off "I think, therefore I am." He had to burrow as far down into his own head as he could, stripping away everything else, before he arrived at his classic one-liner. (Plus God. People always seem to forget that Descartes, inventor of the so-called rational mind, could not do away with God.)
So for someone trying to make a case against technology, a Descartes-style line of attack might look something like this: When we burrow as far into the technology as we can, stripping away everything else and breaking the problem down into its component parts, where do we end up? Right there, of course: in the literal bits, the ones and zeros of digital computation. And what do bits tell us about the world? I'm simplifying here, but basically: everything. Cat or dog. Harris or Trump. Black or white. Everyone thinks in binary these days, because binary is what the dominant machinery imposes on us and perpetuates in us.
That, in compact form, is how the craziest argument against digital technology can be put: "I binarize" – the computers teach us – "therefore I am." Technoliterate types have been mulling over versions of this Theory of Everything for some time; earlier this year, Dartmouth English professor Aden Evens published what is, as far as I know, its first properly philosophical codification, Digitality and Its Discontents. I talked to Evens a bit. Nice guy. He claims he's not a technophobe, but still: the digital world clearly depresses him, and he roots that anxiety in the very foundations of the technology.
One day I might agree. For now, as I say, I'm merely worried. I'm discontented. The more I think about Evens et al.'s technophilosophy, the less I want to accept it, and I think there are two reasons for my discontent. First: since when do base units dictate everything expressed at higher levels? Genes, the basic units of life, account for only a fraction of our development and behavior. Quantum phenomena, the fundamental units of physics, have no discernible influence on my physical actions. (Otherwise I'd be walking through walls – and when I wasn't, I'd be dead half the time.) So why must binary digits forever define the limits of computation and our experience of it? Fresh behaviors always have the potential to emerge, mysteriously, when complicated systems interact. The flocking algorithm is nowhere to be found in a single bird! Turing himself said that you can't look at computer code and know entirely what it will do.
And second: blaming our dissatisfactions with technology on ones and zeros treats digitality as an end point, a kind of logical conclusion of the history of human thought – as if humanity, as Evens suggests, had finally achieved the dreams of enlightened rationality. There is no reason to believe any such thing. For most of its history, computing was not digital. And if predictions of an analog comeback prove true, it will not remain fully digital for long. I'm not here to say whether computer scientists should or shouldn't develop analog chips, just to say that, if they did, it would be foolish to claim that all the binaries of modern existence, so thoroughly drilled into us by our digital machinery, would suddenly dissolve into nuance and glorious analog complexity. We invent technology. Technology doesn't invent us.