Five months into my eight-month solitary confinement and right before the Persian New Year, Nowruz, the guards put me in a new cell at the other end of the Evin prison high-security facility in Tehran. Measuring 3 by 3 meters, it was much larger than my old cell, which meant I could walk in a figure eight across the corners. In the absence of anything else to do, continuous walks were my sole routine, and they quickly became an addiction.
I walked and walked. Remembered and imagined, anticipated and planned for all possible scenarios, and often conversed with myself out loud, in whatever languages I had any knowledge of. During these figure-eight walks, I faced the windows or the half-marble-covered walls. Sunlight seeped into the room, tracing paths of gold over the floor, then scaling the walls. It danced, warmed, then vanished, promising to return tomorrow. The marble canvas revealed images: the curved, nude back of a seated woman, surrounded by profiles of faces and clouds.
Deprived of sights, I sought refuge in sounds. The new cell received less light because of the tall, gorgeous plane and mulberry trees right outside, but it was right next to the main entrance and thus, by Evin standards, more eventful and entertaining—even if only through hearing. I could hear when the bored guards gossiped about their shift supervisors at the end of the hall, or when they responded to other inmates’ requests, or when they watched football or drama on state television. (I never heard any news, since they were strictly advised not to watch the news.) Once, a few seconds of an instrumental version of Radiohead’s “A Punch Up at a Wedding” on a stupid TV commercial made me cry my heart out. I wasn’t sure which I craved more: hugs or books. I suspect it is very rare to be deprived of both at the same time.
My only comfort came from our equality in this misery, or at least the perception of it. The guards and interrogators had always said that no one was given books or newspapers in our ward. I had believed them, because I had seen no sign (nor heard any sound) of any.
One afternoon, though, I heard something that shattered this tiny comfort. Four pairs of slippers had appeared outside a cell two doors down from mine, hinting at four inmates who had most likely just come out of solitary to be kept together in a large cell. A few hours later, through the ventilation shafts that connected the cells, I heard newspaper rustling. It broke my heart, truly. That common shaft and what I could hear through it deeply unsettled me for the next three months. Of all the injustices of a high-security prison ward, from the blindfolded walking breaks in the yard to the awful gray polyester uniform and the cheap blue nylon underwear, this one felt the harshest.
But what if there had been no shared ventilation shafts between the cells? What if the ward were so vast that we never felt the presence of others? What if they could make us deaf as they made us blind? What if they could enclose our senses as they trapped our bodies? Broader questions emerge: If we know nothing about our colleagues’ salaries, or where and by what standards they live, can we even know whether we are treated fairly? Can injustice be felt if there is no shared space where we can see and learn about others’ lives?
The rare blend of physical and cognitive isolation that I experienced in prison was an exaggerated version of the social fragmentation that is quickly becoming reality for many people in more developed urban areas around the world. The pandemic accelerated that reality. Many of us stopped going to the office, to events, shops, cafés, and restaurants. We drove cars or rode bicycles and avoided public transport. Face masks and other physical barriers shielded us from other people. Nearly all the public or shared spaces where we could interact with, or even gaze at, strangers vanished, turning our lives into real physical cocoons, not the metaphoric cognitive ones we had long feared.
I call this blend of material and cognitive isolation in everyday life the mass personalization of truth. It is a much broader argument than the infamous “filter bubble,” which focused only on cognitive or informational filtering.
Platforms are quickly becoming social institutions with deep and extended impact on our lives, embodied as well as cognitive. Near-future technologies such as self-driving cars, mixed-reality headsets, and drone deliveries will turn the isolation we experienced during the pandemic into a permanent, everyday reality. Our chances of meeting or interacting with anyone we don’t already know will dramatically shrink, because the shared spaces for these interactions will diminish, or our access to them will be limited. These technologies will affect our mental as well as our material lives, somewhat as our bodies and minds are controlled in prison.
Society of One
A “market of one” used to be the dream of marketers and manufacturers around the world. If you are certain about someone’s unique and vital needs, you have already sold that product before making it. This ultimate form of personalization is where consumption and production become one.
Before the era of AI and machine learning, it was difficult to imagine personalization on a massive scale. But with big digital platforms like Google or Facebook, mass personalization has finally emerged: the automated, continuous process of hyper-fragmenting consumers and predicting their needs or desires based on massive data surveillance and complex technologies of classification. From Facebook, Instagram, and Twitter feeds and their embedded adverts, to Amazon and Netflix recommendations and Spotify’s Discover Weekly playlist, companies use statistics and probability to quickly learn what kind of things we may need or desire and nudge us toward them accordingly.
The question now is: What if the market of one is expanded to other zones of life and turns into a society of one?
When mass personalization extends beyond feeds or ads, it becomes an entirely different thing: mass personalization of truth. “Truth” here refers to long-term, embodied, lived experiences, and to the practical as well as instinctive knowledge every individual has about the outside world.
Think about how platforms could control our bodies and material experiences instead of just our cognitive experience. They could navigate us in self-driving cars, selecting routes where we’ll shop for things we don’t need; they could choose which events to take us to and which people to expose us to, perhaps with visual cues above their heads indicating whom to approach or avoid; they could order things they decide we will not return, at prices personalized for us; they could decide whom we date and mate and reproduce with. They may fail to confine our minds, but they are fully able to govern our bodies, and our minds will eventually follow where our bodies go.
A society of one means we will live by different personalized truths in both the mental and physical worlds, with little chance to experience the truths of others. The costs show up in at least two ways. As I discovered in solitary, it was only through the little shared space of an air shaft that I learned some inmates had access to newspapers; justice cannot even be realized without a form of collectivity (or shared space). Research has also shown that when poor kids befriend wealthier ones, they are significantly more likely to finish high school and go on to earn, on average, 20 percent more as adults. It is no secret that segregation deepens inequality.
Not only justice and equality but also democracy will suffer from mass personalization, because it undermines autonomy, a prerequisite for any notion of citizenship. Imagine how a politician could run a simultaneously racist and anti-racist election campaign, and even win, if people are not exposed to each other’s lives or embodied “truths” in shared public spaces. Even after victory, the politician can keep manipulating voters by selectively framing their plans and achievements for different audiences, while people have less and less meaningful interaction with those they don’t know.
This was a common tactic of interrogators in prison. They told different prisoners different stories, tailored to their ethnic background and their politics. Only if inmates were transferred into public wards, or found a way to cross-examine the interrogators’ accounts, would they be able to know they were being manipulated.
Trust is also threatened by mass personalization, because trust is formed only in collectivity. Who wants to fly on an empty plane of an unknown airline? A very disturbing aspect of my time in solitary was that I wasn’t able to trust any facts they gave me about the outside world. I constantly believed that every piece of information they shared with me was intended to manipulate me into confessing things they thought I was hiding.
For example, because I was arrested a few months before the very tense 2009 elections, I didn’t believe a single thing they said about which candidates had begun their campaigns. The mistrust extended even to mundane facts, such as who had been appointed head coach of the Iranian national soccer team. Only months later, when I met other inmates in a shared space, did I realize they had not lied.
Platform Neutrality
A society of one may, in 2023, still sound like an impossible dream (or nightmare, depending on who you are)—but so was the market of one before the blend of big data and machine learning led to the emergence of giant digital platforms.
There is still time to preempt the dark consequences of mass personalization. One concrete policy idea I’ve been promoting since 2018 is something that I call “platform neutrality”: regulating platforms to unbundle their AI models or algorithms from their core code, thereby creating a free market of third-party algorithms and models that users can buy and install on any given platform.
Think of installing a third-party AI model on Google Maps that replaces the default one and lets you avoid chain cafés or businesses with racist or polluting tendencies. Imagine if you could buy and use a third-party algorithm on Instagram that protects teenage girls from bullying or self-harm. Or think about a third-party Tinder plug-in that makes your profile invisible to your colleagues, family, or ex-partners.
At the very least this will make AI models and algorithms more transparent and more accountable.
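To make the unbundling idea concrete, here is a minimal sketch in Python of what such an interface could look like, assuming a hypothetical contract between a platform's core code and a user-installed ranker. The names (FeedItem, Ranker, AvoidChainsRanker, build_feed) are illustrative inventions, not any real platform's API.

```python
from dataclasses import dataclass
from typing import Protocol


@dataclass
class FeedItem:
    """A candidate item the platform hands to whichever ranker the user installed."""
    item_id: str
    tags: set[str]           # e.g. {"chain_cafe"} or {"local_business"}
    engagement_score: float  # the platform's own default relevance signal


class Ranker(Protocol):
    """The contract any third-party algorithm must satisfy to be installable."""
    def rank(self, candidates: list[FeedItem]) -> list[FeedItem]: ...


class DefaultRanker:
    """Stand-in for the platform's built-in, engagement-maximizing model."""
    def rank(self, candidates: list[FeedItem]) -> list[FeedItem]:
        return sorted(candidates, key=lambda item: item.engagement_score, reverse=True)


class AvoidChainsRanker:
    """A hypothetical third-party model a user installs instead, echoing the
    Google Maps example above: push chain cafés to the bottom of the list."""
    def rank(self, candidates: list[FeedItem]) -> list[FeedItem]:
        independents = [i for i in candidates if "chain_cafe" not in i.tags]
        chains = [i for i in candidates if "chain_cafe" in i.tags]
        return independents + chains


def build_feed(candidates: list[FeedItem], ranker: Ranker) -> list[FeedItem]:
    """The platform's core code calls whichever ranker the user has chosen,
    without knowing anything about its internals."""
    return ranker.rank(candidates)


if __name__ == "__main__":
    items = [
        FeedItem("a", {"chain_cafe"}, 0.9),
        FeedItem("b", {"local_business"}, 0.4),
    ]
    print([i.item_id for i in build_feed(items, DefaultRanker())])      # ['a', 'b']
    print([i.item_id for i in build_feed(items, AvoidChainsRanker())])  # ['b', 'a']
```

The design point is simply that the platform's core code never needs to know which model ranked the feed; the user chooses, and the choice is swappable.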
Another solution may be what I did during my solitary detention. Using a pen I had once stolen and taken to my cell, I kept writing short sentences in fine letters along the natural lines of the marble slabs on the walls: my situation, the stupid things interrogators said or asked, what I missed most, lines from song lyrics, advice for other prisoners, and so on. I signed them all with a date. I kept doing this in each of the three or four cells where I was held during my eight months of solitary, and I continued it afterwards.
To this day, dozens of people who have spent time in those cells have seen my words, learnt from them, sung and danced to them. This is how I managed to disrupt their personalized, embodied truths.
The primary danger of mass personalization lies not in its effects on our minds but in its effects on our bodies. As many Asian civilizations figured out long ago, the body is not separate from the mind, and it is often through the body that the mind shifts, not the reverse.