From hiring practices to teacher evaluations, self-driving cars, and grocery store advertisements, algorithms have an immediate effect on our lives every day. The promise of using math to filter information objectively, without the insertion of human bias, is what has perpetuated the use of these formulas. Unfortunately, these systems are inherently riddled with bias, because each one is built on someone’s idea of success.
According to Cathy O’Neil, algorithms, even when built with good intentions, can have deeply damaging effects when they go unchallenged. This was made evident in New York, where teachers were fired based on scores from an algorithm whose output, when plotted, looked like a random number generator.
“Algorithms can go wrong, even have deeply destructive effects with good intentions. They repeat our past practices, our patterns. They automate the status quo. That would be great if we had a perfect world, but we don’t. An algorithm designed badly can go on for a long time, silently wreaking havoc.” ~Cathy O’Neil
O’Neil suggested five checks that should be applied to any algorithm in use:
- Data integrity check
- Agree on a clear definition of success for each situation
- Consider accuracy
- Consider long term effects
- Demand accountability 
Safiya Noble also demonstrated the destructive impact that unchecked algorithms can have in perpetuating bias in our society, and she shared a similar goal: ending unconditional trust in the misunderstood practice of big data.
Her research centered on the “sexist and discriminatory ways in which women are regarded and denied human rights,” especially through search engines like Google, which have become synonymous with the internet itself.
In short, she found that even simple searches were biased toward the interests of white men above all others, while women, especially women of color, were displayed in almost all instances as overly sexualized.
She concluded that “Google’s monopoly status coupled with its algorithmic practices of biasing info toward interests of the neoliberal capital and social elites in the US has resulted in a provision of information that purports to be credible but is actually a reflection of advertising interests.”
It is important to put these findings to use and challenge these algorithms, whose reach is rapidly expanding, in order to avoid future civil rights issues. The truth of the matter is that algorithms were created to remove human bias from decision making, and as of now we have yet to find a way to completely separate the two. I believe that recognizing the downfalls of these practices is a major step in a positive direction, so that hopefully one day we will live in a world without inherent bias in everything we do.
The most important lesson from this assignment is that the era of blind faith in big data must come to an end, and accountability is everything.
The terms ‘big data’ and ‘algorithms’ were completely foreign to me before encountering the work of Cathy O’Neil and Safiya Noble. These researchers have found a way to accurately identify the shortcomings of a previously ill-defined concept and to call people to action against the embedded bias in the algorithms that control our lives.
Noble, Safiya Umoja. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press. https://www.dropbox.com/s/fllx0pgarkhjjk9/Noble.pdf?dl=0
O’Neil, Cathy. 2017. “The Era of Blind Faith in Big Data Must End.” TED Talk. https://www.ted.com/talks/cathy_o_neil_the_era_of_blind_faith_in_big_data_must_end/transcript?language=en#t-816782.