AI Weekly: Celeste Kidd on how to close the AI research gender gap


This week, VentureBeat published a collection of predictions about where machine learning is heading in 2020 from industry leaders like PyTorch creator Soumith Chintala, IBM Research director Dario Gil, Nvidia machine learning research director Anima Anandkumar, and Google AI chief Jeff Dean.

Each expert shared insights about subfields they expect to make strides in the year ahead, such as multitask learning and semi-supervised learning. All seemed to agree that Transformers indeed transformed natural language AI in 2019, and they coalesced around a shared hope that the AI field will continue to change for the better in 2020.

One person who spoke at length with VentureBeat about how the AI field can evolve in the year ahead is Celeste Kidd, director of the Kidd Lab at University of California, Berkeley. She told us she hopes neural networks lose their reputation for being black boxes, and that more people in machine learning develop realistic opinions of what babies can learn compared to neural nets, but she also talked about the lack of women in machine learning and sexual harassment.

Kidd was among the women associated with the #MeToo movement collectively named Time's Person of the Year in 2017, and last month she gave the opening keynote address at NeurIPS, the largest AI research conference in the world.

In her speech, Kidd took a deep dive into what machine learning practitioners should know about the human mind – how people form beliefs and how they can be quickly led to believe falsities when content recommendation AI maximizes for engagement.


She also talked about her own experience with sexual harassment and the need to dispel a fear common among men in machine learning: that merely being alone with a female colleague could lead to sexual harassment allegations and the end of their careers. When that fear leads to missed opportunities for women in the field, Kidd said, even well-intentioned people with no desire to inflict harm can end up damaging women's careers.

The speech ended with a standing ovation, a rarity at a machine learning research conference.

Misperceptions held by men in machine learning are something she said nobody wants to talk about in a NeurIPS keynote address, but she felt she had to, given the opportunity to speak to so many people who are directly responsible for decisions made at their companies or for training female students at universities.

In 2018, analysis by Element AI found that the number of women authors of papers published at major AI research conferences like NeurIPS remains below 20%, while a 2019 Nesta report on gender diversity in AI found that less than 30% of AI research published on arXiv in the U.S. had a female author. Some countries like the Netherlands surpass 40% female authors but no nation achieves gender parity.

Bringing more women into machine learning research requires taking sexual harassment seriously and exposing predators, Kidd said, because she believes it’s a contributor to the leaky tech pipeline. She also stressed that for the average person, it’s not a single dramatic event, but more often 1,000 seemingly small events – what she called “death by 1,000 paper cuts” – that push women out of the field.


The day after her speech, Kidd said, she tried without success to reach the conference's poster session, stopped along the way by men thanking her for calling this fear misguided and by women describing being left out of social events with peers.

“You learn just as much from your peers, if not more than you do from your mentors,” she said. “So when you have a lab treating a woman as otherly, if you’re not treating her the same, she doesn’t get the same access to all of the informal training opportunities that exist, all the opportunities that everybody else in the lab has for learning from their peers.”

Inviting women to be part of social outings and dispelling the misperception that mentoring women will lead to sexual harassment allegations and the end of men's careers is important, but getting rid of serial predators is critical, Kidd believes, to achieving parity and closing the AI research gender gap. Rejecting the Pence Rule, under which men avoid being alone with a female colleague unless their wives or others are present, could also help.

“If you set up a rule like ‘I’m only going to meet with women with the door open’ [or] ‘I’m only going to meet with women when there’s somebody else present,’ you’re introducing a systemic inequity that means that she doesn’t get as much access to your mentorship to somebody that doesn’t have to have those particular circumstances in place,” Kidd said.

One thing that stood out from the interview: It’s not just an individual who loses out when a woman is pushed out of the industry or a persistent gender gap emerges in machine learning research. It’s a loss for the machine learning industry, as well. And as AI spreads to all corners of business and society, that means everyone loses.


For AI coverage, send news tips to Khari Johnson and Kyle Wiggers and AI editor Seth Colaner — and be sure to subscribe to the AI Weekly newsletter and bookmark our AI Channel.

Thanks for reading,

Khari Johnson

Senior AI Staff Writer


