The Academy in Context is a dinner-seminar series, created to foster a stronger sense of community among Brown's graduate students. Events focus on topics of interest across disciplines, typically concerning ethics.
Sandstede, a professor of Applied Math, and Blumberg, who teaches the DSI’s Data and Society classes, spoke on “Bias and Transparency in Contemporary Data Science.”
It is by now a well-documented problem that social and cultural biases find their way into software developed through machine learning, even when the underlying algorithms appear objective. Early face recognition software had trouble recognizing the faces of non-white people and women because it was trained predominantly on white male faces. Less well-known examples include algorithms that help process loan or job applications: because past loan or hiring decisions are used as training data, these systems can perpetuate the biases embedded in those decisions. Transparency, or the lack of it, becomes a factor when decisions that affect people’s lives are made by proprietary algorithms—the people affected by the results cannot even see how the results are produced.
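To make the loan/hiring example concrete, here is a minimal sketch with entirely hypothetical data: a naive model that simply learns the majority outcome per group from past decisions will faithfully reproduce whatever bias those decisions contain, without any biased intent in the algorithm itself.

```python
from collections import Counter

# Hypothetical historical records: (group, qualified, hired).
# In this invented history, group "A" applicants were hired and
# group "B" applicants were rejected, regardless of qualification.
history = [
    ("A", True,  True),  ("A", False, True),  ("A", True,  True),
    ("B", True,  False), ("B", True,  False), ("B", False, False),
]

def train(records):
    """Learn the majority hiring outcome per group. The 'algorithm'
    is neutral -- it simply encodes whatever pattern the past
    decisions contain."""
    votes = {}
    for group, _qualified, hired in records:
        votes.setdefault(group, []).append(hired)
    return {g: Counter(v).most_common(1)[0][0] for g, v in votes.items()}

model = train(history)

# Two equally qualified applicants receive different predictions,
# because the training data -- not the applicants -- differ by group.
print(model["A"])  # True:  predicted hire
print(model["B"])  # False: predicted reject
```

Real systems use far more sophisticated models and features, but the core issue is the same: a model trained on biased outcomes learns the bias as if it were signal.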
These are considerations that need to be part of a data science education program, and it is part of the Brown Data Science Initiative’s mission to “explore the impact of the data revolution on culture, society, and social justice.”
Our February panel on Algorithmic Justice, co-sponsored with the Center for the Study of Race and Ethnicity in America, explored the effects of Big Data on social inequalities, including the ways that data can be used in the service of social justice. We will be holding more of these sorts of events in the future, so stay tuned!