How Meetup Counters Algorithmic Sexism

Algorithms can easily reflect societal inequalities, but at least one company is trying to build a fair recommendation system.

Bias against women in the technology sector is still unfortunately pervasive. In a recent survey, only eight percent of female technologists said that they had never experienced gender bias in the workplace. The same study cited other problems women face in the sector, including underrepresentation and a lack of networking opportunities and mentorship. These biases can make their way into the systems that increasingly shape our world, but the technology company Meetup has tried to design its recommendation system so that women are not pigeonholed or left out because of their gender.

Meetup is a social network created to facilitate in-person meetings, and can be used for everything from finding a book club to meeting and mingling with others in your industry.

“The way the platform works is that people come and register and share what their interests and passions are, and we connect them to other people near them who share those passions,” said Meetup communications director Kristin Hodgson. Sounds simple enough, but the Meetup team is aware of ways in which recommendation systems can go bad.

For example: “If you let the algorithm auto-optimize, it would see that men are more likely to join tech groups than women,” Hodgson said. The site would then learn to recommend tech groups to men more often than to women.
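To make that feedback loop concrete, here is a toy sketch in Python (the data and names are invented for illustration; this is not Meetup’s code) of what “auto-optimizing” on historical joins looks like. Because the training data skews male for tech, the learned score does too, and each round of skewed recommendations generates more skewed data to train on:

```python
# A toy illustration of the failure mode Hodgson describes: a
# recommender that "auto-optimizes" on historical join rates learns
# whatever gender skew its training data contains.
from collections import Counter

# Made-up historical joins: (member_gender, topic) pairs.
past_joins = [
    ("man", "tech"), ("man", "tech"), ("man", "tech"), ("man", "art"),
    ("woman", "tech"), ("woman", "art"), ("woman", "art"),
]

joins = Counter(past_joins)
members = Counter(gender for gender, _ in past_joins)

def naive_score(gender: str, topic: str) -> float:
    """Observed join rate for this gender-topic pair."""
    return joins[(gender, topic)] / members[gender]

print(naive_score("man", "tech"))    # 0.75
print(naive_score("woman", "tech"))  # ~0.33 -- so women see fewer tech groups
```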

The team in charge of machine learning at Meetup has taken steps to counter that tendency.

“We are careful in terms of how we use gender in recommendations,” said machine learning lead Zachary Cohn. “Primarily, it is restricted to being used to promote member-event fit with groups which have a gender restriction.”

So, if a woman says she is interested in technology and art, the Meetup site might recommend women-only groups for technology or art, but won’t recommend a sewing group simply because the user is a woman.
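In code, that policy might look something like the sketch below (the names and structures are hypothetical; Meetup hasn’t published its implementation). Gender appears only in an eligibility check for groups that carry their own gender restriction, and never as a ranking signal:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Group:
    name: str
    topics: set
    gender_restriction: Optional[str] = None  # e.g. "woman", or None

@dataclass
class Member:
    interests: set
    gender: str

def eligible(member: Member, group: Group) -> bool:
    # Gender is consulted ONLY here, to respect a group's own
    # membership restriction; it is never a ranking feature.
    return (group.gender_restriction is None
            or member.gender == group.gender_restriction)

def recommend(member: Member, groups: list) -> list:
    # Rank eligible groups purely on interest overlap.
    candidates = [g for g in groups if eligible(member, g)]
    candidates = [g for g in candidates if member.interests & g.topics]
    return sorted(candidates,
                  key=lambda g: len(member.interests & g.topics),
                  reverse=True)

groups = [
    Group("Women Who Code", {"tech"}, gender_restriction="woman"),
    Group("Open Hack Night", {"tech"}),
    Group("Stitch & Sew", {"sewing"}),
]
alice = Member(interests={"tech", "art"}, gender="woman")

# Alice sees both tech groups, including the women-only one, but the
# sewing group never appears: being a woman alone boosts nothing.
print([g.name for g in recommend(alice, groups)])
```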

“There is an awareness of the potential for pure collaborative filtering to reflect societal bias and so we don’t use the member-topic relationship directly to infer individuals’ topic interests but mediate these through groups,” said Cohn. “For example, if you join a women-only tech group, we have chosen a model structure which won’t recommend you topics that other women like, but rather ones that are related to the technical content.”

Collaborative filtering, Cohn explained, is “a technique where you’re trying to learn from the data, so for example, people who like this like that. It’s essentially making inferences about individuals based on a larger dataset that contains them.”
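A minimal contrast between the two model structures Cohn describes might look like the following, again with invented data. The naive version infers Carol’s interests from whatever her co-members also joined, so a knitting habit among members of a women-only tech group leaks into her recommendations; the group-mediated version follows only the topical content of groups:

```python
# Hypothetical data: which topics describe each group, and who joined.
group_topics = {
    "women-tech":      {"tech"},
    "knitting-circle": {"knitting"},
    "hack-night":      {"tech", "open source"},
    "devops-meetup":   {"tech", "cloud"},
}
member_groups = {
    "alice": {"women-tech", "knitting-circle"},
    "bea":   {"women-tech", "knitting-circle"},
    "carol": {"women-tech"},
}

def joined_topics(member):
    return set().union(*(group_topics[g] for g in member_groups[member]))

def naive_cf(member):
    """People-who-joined-this-also-joined-that: demographics leak in."""
    peers = {m for m in member_groups if m != member
             and member_groups[m] & member_groups[member]}
    suggestions = set()
    for peer in peers:
        for group in member_groups[peer]:
            suggestions |= group_topics[group]
    return suggestions - joined_topics(member)

def group_mediated(member):
    """Cohn's alternative: relate topics through groups' content only."""
    mine = joined_topics(member)
    related = set()
    for topics in group_topics.values():
        if topics & mine:
            related |= topics
    return related - mine

print(naive_cf("carol"))        # {'knitting'} -- inherited from co-members
print(group_mediated("carol"))  # {'open source', 'cloud'} -- tech-adjacent
```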

Data scientist and mathematician Cathy O’Neil told Civicist that gender is one of the most common identifiers used in recommendation systems, and that programming without or around gender is very difficult because interests often correlate with gender. “Just ignoring that one question doesn’t mean that you’re ignoring gender,” she said. “Things like knitting are signals.”

“It’s not entirely clear to me how to bypass this issue,” O’Neil said. “In general, this is how recommendation systems work. It’s a difficult problem to solve.”

It may not seem like a big deal, but when steps like these aren’t taken, the consequences are significant. For example, in 2015, Julia Carpenter reported for The Washington Post that Google displayed fewer ads for high-paying executive jobs to women than to men.

Meetup has a slight advantage over other recommendation systems, like those that try to target web users with specific advertisements, because its users have willingly told the company what interests them. However, Cohn said that this is an advantage only up to a point. “Our users expect us to make those inferences and not just rely on them to tell us what they want,” he said. “[They] want the platform to be a little bit smarter and help them to discover new groups and events.”

So the company still wants to use the information that machine learning can surface to infer that someone who likes hiking may also be interested in a group about rock climbing, even if they have never gone rock climbing before. It just wants to do that without limiting options for women, or for men, based on their gender.
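One plausible shape for that hiking-to-rock-climbing inference (a guess at the flavor of the technique, not Meetup’s documented method) is topic-to-topic relatedness computed from how often topics tag the same groups, with no member demographics in the loop:

```python
from collections import Counter
from itertools import combinations

# Hypothetical topic tags on groups; relatedness comes from topics
# co-occurring on the same group, not from who joined it.
group_tags = [
    {"hiking", "rock climbing"},
    {"hiking", "trail running"},
    {"rock climbing", "bouldering"},
    {"hiking", "rock climbing", "camping"},
]

cooccurrence = Counter()
for tags in group_tags:
    for a, b in combinations(sorted(tags), 2):
        cooccurrence[(a, b)] += 1

def related_topics(topic, k=3):
    scores = Counter()
    for (a, b), n in cooccurrence.items():
        if a == topic:
            scores[b] += n
        elif b == topic:
            scores[a] += n
    return scores.most_common(k)

# A hiker sees rock climbing first because the topics share groups,
# not because of anything about the hiker.
print(related_topics("hiking"))  # [('rock climbing', 2), ...]
```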

It’s a formidable task, and Cohn said he’s not ready to declare victory yet. As for judging the team’s success, he said that’s difficult. “Most of how we’re accounting for this now is based on putting restrictions on the types of techniques that we use to make inferences and recommendations,” he said. “As opposed to having some prior assumptions as to what the distribution should be…although that might be interesting to look at.”

O’Neil agreed that it’s difficult to judge success, but said that companies interested in actually changing systems need to track the results.

“Are you monitoring how many men and how many women get tech recommendations?” she asked. “If you’re not counting then how do you know what you’re doing is fair?”
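Her test is straightforward to operationalize, at least as a sketch (the log format here is hypothetical): count recommendation impressions by gender and compare the rates.

```python
from collections import Counter

# Hypothetical impression log: (member_gender, recommended_topic).
log = [
    ("woman", "tech"), ("woman", "art"), ("woman", "tech"),
    ("man", "tech"), ("man", "tech"), ("man", "tech"),
]

impressions = Counter(gender for gender, _ in log)
tech_impressions = Counter(gender for gender, topic in log if topic == "tech")

# The count O'Neil is asking for: what share of each group's
# impressions are tech recommendations?
for gender in impressions:
    rate = tech_impressions[gender] / impressions[gender]
    print(f"{gender}: {rate:.0%} of recommendations are tech")
```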