As #BlackLivesMatter protests rocked the globe, dating services that had formerly endorsed racial exclusion-based dating appeared to change their tunes.
Some announced they would be removing ethnic and race filter options. Others declined, citing the longstanding motivation for preference filters: maximizing user choice. (Choice which I’ve argued is largely an illusion.)
Such changes could be taken as a mark of progress. They may have also just been virtue signaling.
When the powers-that-be make a concession to the oppressed, more often than not it is less a concession than a convergence of interests. The timing of this decision suggests as much.
The popularity of dating apps and services depends on their promise of greater ease and convenience, but also on the pleasures offered by gamification. Exclusion-based dating persists because app design reinforces it.
To explain: sorting through the sheer number of prospects on dating services requires a process of elimination. Filters present themselves as the logical conclusion.
And so users are funneled into a preset selection of behaviors, responding to each candidate on a binary yes/no basis.
Sorting through an ocean of romantic/sexual options demands significant mental energy. Our brains quickly learn to conserve that energy by automating the process.
A careful profile survey is refined into a reflexive swipe. Preferences shift into hard “no”s. Full sentences degrade into monosyllables.
The apps positively reinforce our continued engagement with this pattern through instant notifications. Flashy animations and sounds signal success, assuring us that whatever we are doing must be right.
We in turn interpret our behaviors as winning strategies, leaving us less prone to questioning our own biases.
Left unchallenged, biases color our perception of the world. They harden into prejudice, fueling “isms” such as ableism, ageism, classism, ethnocentrism, heterosexism, transphobia, and sexism.
The first step to overcoming biases is awareness. If you suspect you are prone to a particular bias, consider taking a free implicit bias test.
The second step is introducing information that directly conflicts with our automatic patterns of thinking about a particular group.1
Consider writing a letter to yourself, exploring the reasoning behind a specific bias or prejudice. Describe the experiences that may have given rise to it. Find possible flaws or contradictions in your biased belief system.
Weigh your dating preferences against your values. Do the two align, and if not, what then are you willing to do to address it?
Revising any attitude, belief, or response involves some mental effort. Dating apps, on the other hand, encourage us to suspend “intention, attention, and effort”2 for the sake of convenience and efficiency—then reward us for doing so.
Giving in to automaticity means falling back on old habits. Like a car following grooves and ruts in the road, we will very quickly “tramline” our way back into bias.
Without self-reflection, we are at the mercy of our worst instincts. Only by developing awareness about our own thinking can we escape the toxic hold of exclusion-based dating.
Collecting our behavioral data for private profit is now a standard business practice, pioneered by tech giants like Google and Facebook.
On the surface, this may seem to be a mutual exchange: products and services, in return for personal information and what The Age of Surveillance Capitalism author Shoshana Zuboff calls “behavioral surplus” data.
From this surplus, these companies are able to construct profiles that are then sold as a commodity to other businesses.
These profiles can also be used to “nudge, coax, tune, and herd [our] behavior” in a way that serves the interest of top bidders, such as through targeted advertising.
The people guiding this process—a mysterious, corporate-run “data priesthood”—operate from behind a one-way mirror. They might know everything about us, but we know next to nothing about them.
This priesthood’s practice of collecting, selling, and exploiting our behavioral data has since been adopted by the likes of dating and hookup app operators, at great cost to our privacy—and wellbeing.
The normalization of surveillance capitalism
Zuboff argues that every time we give in to these companies and sign their obscure, incomprehensible terms-of-service agreements, we are handing over exploitable information about ourselves.
We comply with these agreements only because by now they appear bog-standard, and because they are a necessary hurdle to accessing services upon which we depend.
Fashioning an image of themselves as heroic entrepreneurs or authorities, data collectors buy our trust by promising “social connection, access to information, time-saving convenience, and, too often, the illusion of support”.
Yet their true goal, as Zuboff points out, is to extract human experience as a raw material for profit.
But succumbing to the new form of power represented by these organizations shouldn’t seem so inevitable. We still have the power to opt out.
Here’s why it’s crucial we exercise that power.
Surveillance capitalism in gay apps
In The Age of Surveillance Capitalism, Zuboff explains how the social media platform Facebook uses “closed loops of obsession and compulsion” pioneered by the gaming industry to engage and captivate users.
These loops rely upon “social pressure, social comparison, modeling, subliminal priming” to generate continued usage—and even addiction.
What’s not often discussed, however, is how app creators use behavioral data to shape app design and to deepen the “hand-and-glove relationship of technology addiction”, to use Zuboff’s phrase.
For example, a cursory glance at Tinder reveals the creators have tuned the app design to generate more rewarding feedback, and thus more user engagement.
Consider the flashy animation and audio tone whenever you “match” with another user on Tinder—stimulation that’s likely to cause a release of the neurochemical dopamine, associated with the sensation of pleasure.
This is a form of positive reinforcement that ensures many of us keep on playing the swiping game, at least until we hit a paywall.
Paywalls in this case are used to create the illusion of scarcity. When free users swipe “no” on an interested candidate, the app notifies them that they have missed a potential match, then offers to relieve the resulting fear of missing out (FOMO) by selling them the right to chat with that user.
Similarly, by offering a limited number of free “likes”, the app leverages loss aversion to coax users into buying a subscription.
App designers also nudge us to return to the app using push notifications, which double as a channel for flash sales and advertisements.
The examples provided here are blatant instances of the manipulation Zuboff describes. But it’s the ones we don’t know about that, I believe, should worry us most.
The danger of manipulative app design
Zuboff cites studies that reveal the particular vulnerability of teenagers to social media addiction, owing to their developmental stage.
If we don’t practice mindfulness, we are at risk of being caught in a toxic cycle, wherein “ego gratification and ego injury drive the chase for more external cues”.
To explain: when we are ignored or rejected on these apps, gratification is denied, and our ego is injured.
We may try to soothe that injury by pursuing still more gratification, returning over and over to the app for our fix.
The shallow, mechanical, and objectifying exchanges that often ensue are a far cry from the acknowledgment and affirmation we are seeking.
As we hover over our phones “anxiously awaiting the appearance of the little notification box as a sign” of our self-worth, we suffer a slow extinction by a thousand snubs.
For “Without the ‘others’,” Zuboff writes, “the lights go out.”
How surveillance capitalism hurts us
Enter dating and hookup apps with their endless stacks and grids of attractive faces and torsos.
In the case of gay men, this social comparison is taken to a new level: we aren’t just competing for the attention of other users, but also against them.
The competition for the best possible “match”, combined with the illusion of scarcity, fuels FOMO regarding potential romantic or sexual interests.
Our interactions on these apps come to resemble some overwhelming game of chat whack-a-mole, in which we try desperately to catch, hold, and hoard others’ attention.
It’s a game that often feels futile, as interest fluxes and users log on and off, often without explanation. Being shunned or ignored is commonplace, as is deception.
For instance, it’s not unusual to realize mid-chat that the person on the other end either isn’t who they claim to be—or is actually a chatbot.
Certainly, where dating is concerned, rejection is par for the course. But when identity and self-worth come into question, as they so often do on these apps, the stakes feel much higher, as anyone who has ever found themselves caught in a flame war can attest.
Creating app-based addiction
To recap: surveillance capitalism allows creators to monitor users’ behavior and then use the resulting data to control us, for example through the gamification I’ve described above.
Like gamblers denying the odds, we keep coming back, even attempting to turn the odds in our favor by curating a profile we know will maximize engagement, even to the point of trickery.
It is human nature to selectively present the best parts of ourselves, but these apps seem to actively encourage selective self-representation by providing profile fields that cater to one-dimensional hypersexuality.
Limiting as it is to be defined only by the minutiae of one’s erotic interests, many users inevitably fall into line. Some do it in the name of efficiency or practicality; others to secure a date, a hookup, or simply a message.
Taken to the extreme, some users adopt a completely different identity, knowing it will likely entice messages or photo exchanges.
Instant messaging is inherently rewarding, but add to this the ever-present possibility of sexual attraction or rejection, and users are pushed into heightened states of anxious arousal.
With enough exposure, we run the risk of developing an app-based process addiction.
Defying surveillance capitalism
Today’s tech-dependent world has arguably left us all pawns of surveillance capitalism.