Universities Are Using Innovative Learning Programs to Tackle AI’s Diversity Problem

Students in the 2019 Carnegie Mellon University AI4ALL summer program take part in a group presentation to showcase their artificial intelligence research project.

From smart assistants like Alexa and Siri to Google search algorithms and social media feeds, artificial intelligence (AI) technology plays a central role in our everyday lives, providing us personalized recommendations and streamlined access to information.

However, while many see AI as a useful — even life-changing — innovation, researchers are concerned about its track record for perpetuating stereotypes and discriminating against women, people of color, LGBTQ individuals, and other marginalized groups.

[Figure: Statistics of underrepresented individuals in AI conferences and the workplace.]

To understand where the biases in AI originate, it’s important to know how AI technology works. As the AI Now Institute, a research center housed within New York University, explains on its website, “[AI] systems ‘learn’ based on the data they are given. This, along with many other factors, can lead to biased, inaccurate, and unfair outcomes.”

When the majority of the individuals creating AI systems are White and Asian American cisgender men, the data used in those systems will accordingly match their perspectives, which often include implicit biases about gender, race, and sexual orientation.

This bias has serious consequences for those who do not resemble the creators of the technology. According to research by the MIT Media Lab, IBM’s and Microsoft’s facial recognition systems were found to be more accurate when used on lighter-skinned individuals. Amazon discovered its own AI-based machine learning tool for hiring contained an algorithm that selected men over women. And researchers at the University of California, Berkeley found that a healthcare AI algorithm designed to identify patients who would most benefit from additional care favored White patients over people of color.

Considering the far-reaching effects of AI, the need to decrease the prevailing homogeneity within its ranks is imperative, according to Jonathan Reynolds, outreach project manager at Carnegie Mellon University School of Computer Science (CMU SCS).

“Artificial intelligence isn’t designed to be disruptive. It is designed to offer convenience and make things more efficient,” he says. “That’s why it’s critical to have diverse perspectives at the front end. If you have AI making autonomous decisions in all the domains of life, that has the potential to exacerbate the inequalities that already exist in society.”

Every summer, CMU SCS hosts a three-week AI4ALL program to introduce high school students to this growing field. The experience includes lectures, demonstrations, field trips, and hands-on activities for students from underrepresented ethnic and racial groups, underserved backgrounds, and geographically diverse areas.

A critical component of the AI4ALL summer program is its low barrier to entry, which is essential for ensuring a diverse group of students can participate. Unlike many computer science degree programs, for instance, AI4ALL does not require attendees to have prior coding experience.

“Not every [high] school is able to offer coding experience, and not every student has the same opportunity, access, or even knowledge that it exists,” says Ashley Williams Patton, director of CMU SCS’s Computer Science Pathways program, which works to bridge the knowledge gaps between students from different socioeconomic backgrounds. “What we really care about is curiosity and the willingness to work hard and grapple with difficult topics to solve problems.”

The AI4ALL program at CMU SCS is also dedicated to eliminating barriers for students who may not be able to afford travel and living expenses. “We’ve done everything from loaning laptops for the duration of the program to even buying sheets for their dorm bed,” Patton says.

Program participants engage in project-based learning in which they are charged with applying AI from a humanistic perspective to address issues such as community policing, autonomous farming vehicles, and disaster relief allocation.

Such topics are extremely relevant to their home communities, Reynolds explains.

“I think when students are passionate about something, they are more likely to be retained in that field for the lifespan of their academic journey to their careers,” he says.

Based on AI4ALL’s track record at CMU SCS so far, this appears to be true. A high proportion of alumni have gone on to enter highly selective computer science programs at institutions such as MIT, Stanford, and Yale.

While the program is currently on hiatus at CMU due to the COVID-19 pandemic, the university and some other host institutions have plans to resume in 2021; some schools shifted their 2020 summer program to a virtual experience.

Despite the interruptions of the pandemic, Reynolds is optimistic that the push for diversity in this growing field will continue to gather momentum.

“I do think that we are better positioned than other STEM disciplines, the reason being that AI and machine learning are relatively new phenomena,” he says. “Dealing with the issue of diversity now, while the discipline is in its infancy, will forcibly nudge companies to have to make changes for the future.”●

Lisa O’Malley is the assistant editor of INSIGHT Into Diversity. This article was published in our September 2020 issue.