The researchers culled tens of thousands of photos from an
online-dating site, then used an off-the-shelf computer model to extract
users’ facial characteristics—both transient ones, like eye makeup and
hair color, and more fixed ones, like jaw shape. Then they fed the data
into their own model, which classified users by their apparent
sexuality. When shown two photos, one of a gay man and one of a straight man,
Kosinski and Wang’s model could distinguish between them eighty-one per
cent of the time; for women, its accuracy dropped to seventy-one per
cent.
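The pairwise test described here, in which the model sees one photo from each group and must score the gay man's photo higher, is equivalent to the AUC metric common in machine learning. The sketch below is purely illustrative, not Kosinski and Wang's actual code: the feature vectors are invented random numbers standing in for the facial characteristics the off-the-shelf model extracted, and the classifier is a minimal logistic regression.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for facial-feature vectors; the study used features
# extracted by an off-the-shelf face model, not random data like this.
n, d = 200, 5
X_pos = rng.normal(0.5, 1.0, size=(n, d))   # one group's features
X_neg = rng.normal(-0.5, 1.0, size=(n, d))  # the other group's features
X = np.vstack([X_pos, X_neg])
y = np.array([1] * n + [0] * n)

# Minimal logistic-regression classifier, fit by gradient descent.
w, b = np.zeros(d), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))          # predicted probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)          # gradient step on weights
    b -= 0.5 * np.mean(p - y)                    # gradient step on bias

# Pairwise accuracy: over all cross-group photo pairs, how often does
# the model score the positive-class photo higher? This is the style of
# figure behind the "eighty-one per cent" result.
scores = X @ w + b
pos, neg = scores[y == 1], scores[y == 0]
pairwise_acc = float(np.mean(pos[:, None] > neg[None, :]))
```

On well-separated synthetic data like this, the pairwise accuracy lands well above the fifty-per-cent chance baseline, which is the right way to read the study's numbers: eighty-one per cent against a coin-flip of fifty, not against zero.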