Algorithm Bias: Surprisingly, Not a Computer Problem

Matteo Candelaria
2 min read · Jul 26, 2021

We as a population seem to forget that everything we make and design is, at this point, shaped and influenced by us. We design cars to fit people who are the same size as us and create food that we think tastes good. How does this relate to algorithms? Because we impact them with our own ideals and agendas, even if we never say so outright. As Cathy O’Neil discusses in her TED Talk (O’Neil, 2017), people have too much faith in algorithms that are designed by people and kept updated by people. The criteria and data fed into these algorithms are entirely of human design, and because of the bias we all carry, an algorithm has no way of being truly objective in its function. This is a problem because algorithms are already active in fields that demand objectivity, such as law enforcement.
As Joy Buolamwini discusses in her TED Talk (Buolamwini, 2017), 1 in 2 U.S. citizens is logged in a facial recognition program, and that system is used by law enforcement. Not only is the program a serious intrusion on civil liberties, it is also very finicky at the interpretive level. If the AI has problems, such as a training set that is not diverse enough, it could falsely identify someone as a suspect or culprit through a partial match, with human effort filling in the gaps. Until algorithms can function objectively on truly diverse data sets, this will remain an issue. We likely will not see a time when algorithms are not in some way tainted by human bias, but I do believe it is possible; it may just take a while.
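To make the partial-match problem concrete, here is a minimal toy simulation (all numbers are hypothetical, not drawn from any real system): imagine a matcher whose similarity scores are noisier for a group underrepresented in its training data. With the same fixed match threshold for everyone, the noisier group racks up far more false matches.

```python
import random

random.seed(0)

THRESHOLD = 0.6  # hypothetical cutoff above which a "partial match" is declared

def similarity(noise):
    # Toy score between a probe face and a NON-matching gallery face.
    # Both groups center at 0.3, but the underrepresented group's scores
    # are noisier because the model saw fewer examples like them.
    return random.gauss(0.3, noise)

def false_match_rate(noise, trials=100_000):
    # Fraction of non-matching pairs that still clear the threshold.
    return sum(similarity(noise) > THRESHOLD for _ in range(trials)) / trials

rate_well_represented = false_match_rate(noise=0.10)
rate_underrepresented = false_match_rate(noise=0.20)
print(f"well represented:  {rate_well_represented:.4f}")
print(f"underrepresented: {rate_underrepresented:.4f}")
```

The exact numbers do not matter; the point is structural. A single threshold tuned on the well-represented group quietly produces many more false "suspects" in the group the training data skipped, and human judgment then fills in the gaps.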

References:
O’Neil, C. (2017, August 22). The era of blind faith in big data must end [Video]. TED Talks. https://www.ted.com/talks/cathy_o_neil_the_era_of_blind_faith_in_big_data_must_end
Buolamwini, J. (2017, March 9). How I’m fighting bias in algorithms [Video]. TED Talks. https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms
