
Opinionista

The Other News Round-Up: The AI Gaydar

Marelise van der Merwe and Daily Maverick grew up together, so her past life increasingly resembles a speck in the rearview mirror. She vaguely recalls writing, editing, teaching and researching, before joining the Daily Maverick team as Production Editor. She spent a few years keeping vampire hours in order to bring you each shiny new edition (you're welcome) before venturing into the daylight to write features. She still blinks in the sunlight.

Each week, Daily Maverick brings you some of the lesser-reported news from South Africa and further afield. This week: artificial intelligence – with better gaydar than you.

Kids these days have a mixed bag, don’t they? There are things I feel a little sorry for them about, e.g. by the time our leaders are finished with them, they’ll probably be living in a nuclear wasteland. Or the smaller, everyday humiliations: those God-awful socks they are socially obliged to wear, and other relics of the inexplicable Fresh Prince revival. (Unlike us, they will have several hundred selfies to prove it in middle age too.)

But there are things they have going for them. It’s so much easier to get access to anything they want. They can download anything, or Google song lyrics any time, rather than suffering a childhood full of mondegreens or the horror of taping their favourite song from the Top 40 over the strains of an over-talkative DJ. They have instant access to the object of their affection via mobile, social media and text. That beats the heck out of the agonising, family-operated channel we knew: the land line.

But all this pales in comparison to the cherry on top. This week, the Daily Telegraph reported that software developed at Stanford University can tell, just by looking at someone’s face, whether they are gay or straight.

Researchers trained the AI using pictures of 36,630 men and 38,593 women taken from online dating profiles of gay and straight people.

Say what?

Truly. The awkwardness of youth just got even less awkward, dammit. The software is a whole lot better at predicting sexual orientation than humans are: given just one picture, it correctly determined men’s orientation 81% of the time and women’s 75% of the time. Given five facial images per person, its accuracy rose to 91% and 83% respectively.

Human beings, by comparison, were correct just 61% and 54% of the time respectively, which explains the plot of a lot of romantic comedies.

“Artificial intelligence software developed at Stanford University can predict a person’s sexuality with far more accuracy than humans, suggesting a ‘gaydar’ app may not be far away,” the publication reported. “This has significant implications for people entering the dating scene, and suggests they will not feel forced to wear carabiners, or resort to bringing up Barbra Streisand or Mariah Carey in conversation to test the waters,” it added.

It didn’t really say that last part.

In all seriousness, though, the report did my head in a bit. A gaydar app? This is really going to mess with LGBTI identity politics, isn’t it? (I can’t wait to see what the app does with concepts like “non-binary”.) You can call it whatever you like – reclaiming, identifying, pride, stereotyping or rebellion – but I wonder how much of the labelling we did back in the day was tied to a homing signal, and perhaps still is. When we were growing up, you couldn’t just find your tribe on social media or on Tinder. It required serious social literacy and complex navigation of a series of subtle signals. And if you were queer and you didn’t “look” it, sorry for you. And now there’s an app that can do all that just by scanning one picture?!

And yet – what a strange world we live in, where we are advanced enough to create artificial intelligence that has better gaydar than we do, but homosexuality is still illegal in over 70 countries and punishable by death in a dozen. Even in countries where it is legal, hate crimes are rife. That means, in reality, that although this research is theoretically permissive, enlightening, and interesting, it has frightening implications.

My greatest concern, as you’ve probably guessed, is not the recreational use of a “gaydar app”. Who am I to stop a generation of teenagers from escaping a few excruciating rites of passage? (Although, if I have a vote, I’m going to have to insist that they all go to at least one terrible small-town bar and do the YMCA.) Even among isolated homophobes I can’t see it making much impact, since block-headed bigots are not generally known for their commitment to rigorous research before action. Someone who wants to donder you for being gay is not, I’ll wager, about to hold up the app to your face and say to his mates: “Hold up, fellers! The app says it’s only 76% certain he’s a little light in the loafers! Sorry for the confusion, friend. Didn’t mean any harm. We’ll just be moving on then. Have a good night!” And trundle off with an indulgent wink.

Nope. History and experience tell us that if someone’s intent on a night of violence, they’ll be doing it regardless, and usually based on their own assessment of whether or not they like your face.

But in the hands of authorities, in the wrong regime… it doesn’t bear thinking about.

Let’s take a deeper look. The researchers say they “show that faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain”. They “used deep neural networks to extract features from 35,326 facial images”, which were then fed into a logistic regression that classified sexual orientation. The identifying features were both fixed, like nose shape, and transient, like grooming style.
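
For the technically curious, the pipeline described above is less exotic than it sounds: a deep network turns each face photo into a long list of numbers, and an ordinary logistic regression learns to separate the two groups. The sketch below is purely illustrative, and nothing in it comes from the study itself: the feature extractor is a random stand-in (the real work reportedly used a pre-trained face-recognition network for that step), and the dataset, labels and figures are all hypothetical. It simply shows the shape of the approach.

```python
# Illustrative sketch only: deep-network features per face image, fed to a
# logistic regression classifier. The "deep network" here is a random stand-in
# and the labels are synthetic, so the classifier learns nothing real; the
# point is the shape of the pipeline, not a working gaydar.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def extract_face_features(n_images: int, dim: int = 512) -> np.ndarray:
    """Stand-in for a deep-network face embedding: one feature vector per image.
    A real pipeline would run each photo through a pre-trained face model."""
    return rng.normal(size=(n_images, dim))

# Hypothetical labelled set of profile photos (1 = one group, 0 = the other).
X = extract_face_features(n_images=2000)
y = rng.integers(0, 2, size=2000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# The classifier itself is plain logistic regression on top of the features.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

# With random features and random labels this hovers around 0.5 (chance);
# the headline accuracy figures come from real embeddings on real photos.
print("Held-out AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```

That, in essence, is it: a face-to-numbers step plus a very old-fashioned statistical classifier, which is exactly why the worries further down are not science fiction.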

“Consistent with the prenatal hormone theory of sexual orientation, gay men and women tended to have gender-atypical facial morphology, expression, and grooming styles,” the researchers said. They also built prediction models for gender alone, which could detect differences in facial structure linked to hormone levels in the womb.

This is where I start squirming. Firstly, because arguing over whether homosexuality is biologically predetermined is, to me, spectacularly beside the point. Freedom of choice between consenting adults is a fundamental human right and what said adults do is nobody’s bloody business but their own – the end. And secondly, when we start trying to predict sexual orientation, especially in childhood, we head into very murky waters. Just last week, an eight-year-old girl in Uganda was arrested on suspicion of being gay after apparently “luring” other girls to a nearby farm. Imagine a world where homophobic societies can make such “predictions” before birth.

It is also just a little odd that the rationale for the study is “given that companies and governments are increasingly using computer vision algorithms to detect people’s intimate traits, our findings expose a threat to the privacy and safety of gay men and women”. From where I’m sitting, that looks a heck of a lot like the researchers saying: “We see the frightening potential to invade the safety and privacy of gay people using artificial intelligence… and here’s how to do it!”

Where it could get really ugly is in countries that have – as the Telegraph delicately put it – “questionable” human rights records. Given how little control we have over our information, that’s worrying. So forgive me if I look this gift horse in the mouth; if I proceed with caution; if I’m not that excited at the prospect of biological absolution I didn’t ask for, or improving my (terrible) gaydar.

Growing up these days, eh? It really is so much easier to get that direct access to what you want. It’s just that, unfortunately, that goes for the good guys and the bad. DM
