The Daily Tar Heel
Column: Twitter's and Zoom's algorithms illustrate why diversity in tech matters

DTH Photo Illustration. Female computer science majors at UNC Chapel Hill turn to each other for support as they navigate being in a male-dominated department.

Machine learning and other data-driven algorithms are quickly growing in popularity across the computer science industry. Their applications are countless, from predicting which Netflix shows you'd enjoy to steering self-driving cars. But with these high-level algorithms come bias and ethical concerns, many of which aren't addressed in the courses required by degree programs in the field.

Recently, Twitter came under fire for the neural network it uses to create photo previews on user timelines. Users found that the previews tended to choose the faces of white individuals over those of people of color. Researchers explained that the bias arose largely because the network wasn't trained to detect faces in the first place.

While the company hadn’t found any evidence of racial or gender bias in its internal testing, the Twitter communications team pledged to open-source its work for further review. Zoom, the video call platform that quickly rose in popularity with the push in online courses and meetings, had similar issues with recognizing the faces of Black people who use a virtual background.

Although these issues may seem trivial, they represent a larger pattern of racism in the tools put out by the industry's biggest names. Google Photos' image-labeling algorithm mistakenly tagged Black people as gorillas, and Microsoft's Tay chatbot turned racist and profane because of the way it learned from the data internet users fed it.

These problems can partly be linked to the racial disparities in the industry's workforce. In 2014, Apple, Facebook, Google and Microsoft released their first diversity reports and pledged to diversify their workforces. Since then, each of those employers has made immense advances in technology, but far less progress in who it hires.

According to data from 2019, the share of U.S. technical employees at Google and Microsoft who are Black or Latinx had risen by less than a percentage point, while Apple's held flat at 6 percent. While things have somewhat improved for women since 2014, no company is close to achieving parity with white and Asian men in its technical workforce.

Without a diverse group of coders and other contributors involved in the development process, it's difficult to give an algorithm a well-rounded review from differing perspectives before it goes to production, and that gap allows mishaps like those at Twitter and Zoom to slip through.
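One concrete form that kind of review can take is a simple fairness audit run before release. The sketch below is a minimal example in Python using made-up data and group labels — nothing in it comes from Twitter, Zoom or any other company mentioned in this column — showing how a team might compare how often a model selects members of different demographic groups.

```python
# A minimal fairness-audit sketch: compare how often a model selects
# members of each demographic group. All data and labels are hypothetical.
from collections import defaultdict

def selection_rates(predictions, groups):
    """Return the fraction of positive predictions for each group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
    for pred, group in zip(predictions, groups):
        counts[group][0] += int(pred)
        counts[group][1] += 1
    return {g: pos / total for g, (pos, total) in counts.items()}

def demographic_parity_gap(rates):
    """Largest difference in selection rate between any two groups."""
    return max(rates.values()) - min(rates.values())

# Hypothetical audit data: 1 means the model selected (e.g., cropped to) that face.
predictions = [1, 1, 0, 1, 0, 0, 1, 0]
groups      = ["A", "A", "A", "A", "B", "B", "B", "B"]

rates = selection_rates(predictions, groups)
gap = demographic_parity_gap(rates)
print(rates)           # e.g. {'A': 0.75, 'B': 0.25}
print(f"gap = {gap}")  # a large gap is a red flag worth investigating
```

A check like this is far from a full ethical review, but it surfaces disparities early enough that a team can question its training data instead of letting users discover the problem on a live timeline.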

This isn't just an issue within the computer science industry; questionable results and questionable science show up in other research fields as well.

An example came earlier this week, when a trio of researchers published a paper in Nature that borders on phrenology, claiming to use machine learning to evaluate a person's trustworthiness from their physical features. Researchers on Twitter were quick to point out the paper's flawed assumptions, questionable methodology and analysis, superficial engagement with art history and disregard for sociology and economics. The project has drawn comparisons to the way police departments have tried to use empirical analysis to legitimize racial profiling through surveillance technology and predictive analytics.

A lack of diversity in technology, along with a disregard for ethics in the development of algorithms, consistently disadvantages women and people of color, whether through the technology itself or through the lack of support these individuals receive in the workforce. It is imperative that companies and research institutions hire people who are not only skilled coders but who also come from diverse backgrounds and can help address biases before projects are pushed to production.

@rajeeganesan

opinion@dailytarheel.com
