The Daily Texan
Official newspaper of The University of Texas at Austin

Software catches unintentional bias in other programs

Chase Karacostas

Humans are not the only ones who can unconsciously discriminate against minorities or other groups—computer programs can also contain unintentional biases. 

A computer program that predicts the likelihood that someone will commit a future crime recently received criticism from the news organization ProPublica for possible bias against African-Americans. To help confirm or refute claims like these, Aws Albarghouthi, an assistant professor of computer sciences at the University of Wisconsin-Madison, created software he calls FairSquare.

Albarghouthi gave a talk Monday at the Gates Dell Complex on how FairSquare decides whether a program is “correct.” He said computer scientists have previously judged whether a program is “correct” based on its output, but that ethics are now a concern as well.


“Throughout the past half-century, we’ve been working on techniques to prove programs are correct,” Albarghouthi said. “All the properties we used [to prove that] have been either true or false. For example, either the program crashes or it doesn’t. Now, though, the question of what it means for a program to be correct is getting fuzzier and fuzzier.”
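As a minimal, hypothetical sketch of the kind of true-or-false property Albarghouthi describes (not code from the talk), consider a toy function whose specification is strictly Boolean: it either crashes on some input or it never does, and its output either always matches the specification or it does not.

```python
# Hypothetical illustration of a classical true-or-false correctness property.
# Nothing here is from Albarghouthi's talk.

def safe_divide(a: float, b: float) -> float:
    """Divide a by b, returning 0.0 instead of crashing when b is zero."""
    if b == 0:
        return 0.0
    return a / b

# The specification is strictly Boolean: no division-by-zero crash,
# and the output is always a float.
for a, b in [(1.0, 2.0), (3.0, 0.0), (-4.5, 1.5)]:
    result = safe_divide(a, b)        # must not raise ZeroDivisionError
    assert isinstance(result, float)  # output specification

print("Boolean correctness property held on all tested inputs")
```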

To illustrate this point, Albarghouthi brought up Facebook’s news feed experiment, in which the company purposefully supplied users with negative news to study their reactions. Albarghouthi said that the experiment’s algorithm worked in that it accomplished its goal, but there is still a valid question of whether the algorithm was ethically correct. 

According to Albarghouthi, FairSquare uses probabilities to determine whether a program is fair or unfair in the context of discrimination.

To test FairSquare, Albarghouthi used a hypothetical program that hired people based on college rank and years of experience. He said the program was fair if the percentage of minority employees hired was similar to or greater than the percentage of minorities in the total population.
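The article does not reproduce Albarghouthi's program, so the following is only a rough sketch of what such a hiring rule and fairness check might look like; the function names, scoring weights, threshold, and tolerance factor are all invented for illustration.

```python
# Hypothetical sketch only: the hiring rule, weights, threshold, and
# tolerance below are invented for illustration and are not FairSquare's.

def hire(college_rank: float, years_experience: float) -> bool:
    """Toy hiring rule: a lower (better) college rank and more experience help."""
    score = -0.5 * college_rank + 1.0 * years_experience
    return score > 1.0

def minority_share(people):
    """Fraction of (is_minority, college_rank, years_experience) records
    that belong to the minority group."""
    return sum(1 for is_minority, _, _ in people if is_minority) / len(people)

def looks_fair(population, tolerance=0.9):
    """Fairness criterion as the article describes it: the share of minorities
    among those hired should be similar to, or greater than, the share of
    minorities in the whole population. The 0.9 tolerance stands in for
    "similar to" and is an arbitrary choice here."""
    hires = [p for p in population if hire(p[1], p[2])]
    if not hires:
        return True  # nobody hired, so there is nothing to compare
    return minority_share(hires) >= tolerance * minority_share(population)
```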

Albarghouthi then ran another program that randomly generated a variety of population models against which the hiring program could be evaluated. In these models, Albarghouthi said, college ranks for minority applicants were shifted slightly lower to reflect real-world correlations. He said this type of relationship has to be known for FairSquare to work properly.
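Again as a hypothetical sketch, generating such population models might look like the following; the minority fraction, the distributions, and the size of the rank shift are assumptions rather than values from the talk, and the convention that rank 1 is the best school (so "lower" college ranks mean larger numbers) is also assumed.

```python
import random

def sample_population(n=10_000, minority_fraction=0.3, rank_shift=0.5):
    """Draw one random population model of (is_minority, college_rank,
    years_experience) records. Minority college ranks are shifted slightly
    toward worse schools to mimic the real-world correlation the article
    mentions; every constant and distribution here is invented."""
    people = []
    for _ in range(n):
        is_minority = random.random() < minority_fraction
        years_experience = random.uniform(0.0, 10.0)
        # Toy convention: rank 1 is the best school, larger numbers are worse.
        college_rank = max(1.0, random.gauss(5.0, 2.0))
        if is_minority:
            college_rank += rank_shift
        people.append((is_minority, college_rank, years_experience))
    return people

# Combined with the toy hire() and looks_fair() sketch above, one could then
# ask whether the hiring rule stays fair across many randomly drawn models:
#   verdicts = [looks_fair(sample_population()) for _ in range(20)]
#   print(all(verdicts))
```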

“If we don’t know the environment, then our population model won’t be correct,” Albarghouthi said. “We need to know the relationships between variables. We have correlations with certain ethnicities, and now we try to prove we’re not biased against certain groups.”
