Posted: Nov. 22, 2018 | Updated: Nov. 22, 2018

It's no secret that racial biases factor into swiping choices on dating apps; even in 2018, people feel bold enough to write things like "no blacks" and "no Asians" on their profiles. But a new study suggests the apps themselves might reinforce those prejudices.

Researchers at Cornell University found that dating apps, including Tinder, Hinge and OkCupid, can reinforce the biases, or "sexual racism," of their users, depending on how their algorithms work.

"People may have no idea that a matching algorithm is limiting their matches by something like race, since apps are often very vague about how their algorithms work," said Jessie Taft, a research coordinator at Cornell Tech and co-author of the study.

To conduct the study, the researchers downloaded the 25 top-grossing dating apps in the iOS App Store as of fall 2017, including Tinder, OkCupid, Hinge, Grindr and some lesser-known apps like Meetville and Coffee Meets Bagel.

Then they logged in and looked for functionality and design features that could affect users' discriminatory behavior toward other users. This included things like the apps' terms of service; their sorting, filtering and matching algorithms; and how users are presented to each other. (Do they get pictures or bios? Can you sort matches according to different categories?)

They found that most apps employ algorithms that cater to users' past personal preferences, and even to the matching history of people who are similar to them demographically.

So, for instance, if a user had matched with white users repeatedly in the past, the algorithm was more likely to suggest more white people as good matches moving forward.
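
To see how that feedback loop can arise, here is a minimal sketch of a preference-reinforcing recommender. It is illustrative only, not the actual logic of any app in the study, and the profile fields (including the "ethnicity" attribute) are hypothetical.

```python
from collections import Counter

def rank_candidates(past_likes, candidates):
    """Rank candidates by how often their attributes appeared in past likes.

    past_likes: list of dicts, each a profile the user previously matched with
    candidates: list of dicts, profiles not yet shown to the user
    """
    # Count how often each (attribute, value) pair appears in past likes.
    counts = Counter(
        (key, value) for profile in past_likes for key, value in profile.items()
    )

    # Score each candidate by summing the frequencies of its attribute values.
    def score(profile):
        return sum(counts[(key, value)] for key, value in profile.items())

    return sorted(candidates, key=score, reverse=True)

# Example: three past matches, two of them white, and two new candidates.
past = [
    {"ethnicity": "white", "interest": "hiking"},
    {"ethnicity": "white", "interest": "film"},
    {"ethnicity": "asian", "interest": "hiking"},
]
new = [
    {"ethnicity": "white", "interest": "film"},    # scores 2 + 1 = 3
    {"ethnicity": "black", "interest": "hiking"},  # scores 0 + 2 = 2
]
print(rank_candidates(past, new))  # the white candidate is ranked first
```

Because the score is built only from past likes, an early skew in matching history compounds over time, even if the user never expressed an explicit racial preference.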

When apps encourage users to act on quick impressions and filter other people out, serendipity is lost, the researchers say.

"Users who may not have a preference for race or ethnicity in their partner may find their matching results artificially limited by an algorithm that's calculated to repeat good past matches without considering what good future matches might be," Taft told HuffPost.

Data released by the apps themselves support the research. In 2014, OkCupid released a study showing that Asian men and African-American women got fewer matches than members of other races. White men and Asian women, meanwhile, were consistently seen as more desirable on dating sites.

"We don't want to stop users from dating the people they want to date; we want to ensure that minority users aren't excluded, abused, or stereotyped as a result of those choices."

- Jessie Taft, a research coordinator at Cornell Tech and co-author of the study

While many of us have types we're drawn to, it's worth looking at whether lack of exposure, stereotypes and cultural expectations are influencing our preferences. (For instance, women may exclude Asian men in their search because the group has long been portrayed as effeminate or asexual in film and on television.)

Given how widely used dating apps are (one study suggested more than one-third of U.S. marriages begin with online dating), developers have a rare opportunity to encourage people to move beyond racial and sexual stereotypes rather than entrench them, Taft said.

"The problem with giving users what they want, as the apps claim they do, is that more often than not the users who are getting what they want are the ones who are being discriminatory, not the ones who are being discriminated against," the researcher said.

Even small tweaks could make the experience more beneficial to users across the board.

"The solutions that we propose in the paper (adding community guidelines and educational materials, rethinking sorting and filtering categories, and changing up algorithms) can make outcomes better for marginalized users without interfering in anyone's right to choose a partner," Taft added.