Week 2 Reflection: On Algorithms

Zhichen Gu
Nov 3, 2020

How should we judge the value of an algorithm? The answer seems to lie either in the completeness of the algorithm itself or in its connection with society and people’s lives.

Nowadays, these invisible and intangible technical terms are gradually being interpreted and understood by different social groups from multiple perspectives. At the same time, the question of whether algorithms are actually fair is increasingly being raised at the level of social morality.

Algorithmic bias may reflect society’s chaotic past. Patterns in supposedly deleted data can still lurk in a model, waiting for a chance to reappear, in part because they are inadvertently programmed into the software and amplified by the algorithm. On the surface, algorithmic bias looks like an engineering problem that can be solved with econometric and statistical methods.

So how does this set of logical steps ensure a just, fair and ethical outcome?

Setting up AI learning systems requires human intervention, and it takes a great deal of responsibility and tenacity to build more equitable ones.

And sometimes an algorithm goes from solving a problem to becoming one.

In 2015, Google Photos classified a photo of two African Americans as “gorillas.” Three years later, Google’s response to the incident turned out to be simply removing the word “gorilla” from its classification labels.

In 2016, data showed that the coverage of Amazon’s same-day delivery service was uneven across American cities, with predominantly Black neighborhoods less likely to be served.

When translating gender-neutral words into gendered pronouns, Google’s neural translation systems, and the word2vec embeddings behind many language models, reflect gender stereotypes. For example, “doctor” (or “boss”, “financier”, etc.) is often rendered as “he”, while “nurse” (or “housewife”, “nanny”, etc.) is rendered as “she”.
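
These associations can be probed directly in the embedding space. Below is a minimal sketch using the gensim library; the pretrained GoogleNews vectors file named here is the standard public distribution, but treat its availability as an assumption:

```python
# A minimal sketch of probing gender associations in word2vec embeddings.
# Assumes the pretrained GoogleNews vectors have been downloaded locally;
# the file name below is the usual distribution name, not something this
# article provides.
from gensim.models import KeyedVectors

model = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# Compare how strongly each occupation word aligns with "he" versus "she"
# by cosine similarity in the embedding space.
for word in ["doctor", "boss", "financier", "nurse", "housewife", "nanny"]:
    to_he = model.similarity(word, "he")
    to_she = model.similarity(word, "she")
    leaning = "he" if to_he > to_she else "she"
    print(f"{word:10s} he={to_he:.3f}  she={to_she:.3f}  -> leans toward '{leaning}'")
```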

In Florida, an algorithm used to recommend detention or parole based on the predicted risk of recidivism showed a higher error rate for African American defendants: those who would not in fact have committed another crime were more likely to be misjudged by the algorithm as high risk and recommended for detention.

It turns out that the algorithm is fairly good at predicting recidivism and less good at predicting the violent variety. So far, so good. But guess what? The algorithm is not colour blind. Black defendants who did not reoffend over a two-year period were nearly twice as likely to be misclassified as higher risk compared with their white counterparts; white defendants who reoffended within the next two years had been mistakenly labelled low risk almost twice as often as black reoffenders.

— John Naughton
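
The disparity Naughton describes is a gap between groups in two error rates: the false positive rate (non-reoffenders wrongly flagged high risk) and the false negative rate (reoffenders wrongly labelled low risk). Here is a minimal sketch of how those rates are computed, using made-up labels rather than the real COMPAS data:

```python
# A minimal sketch of per-group error rates, with made-up data
# (NOT the real COMPAS figures).
def error_rates(predicted_high_risk, reoffended):
    """Return (false positive rate, false negative rate) for one group."""
    fp = sum(p and not y for p, y in zip(predicted_high_risk, reoffended))
    fn = sum((not p) and y for p, y in zip(predicted_high_risk, reoffended))
    negatives = sum(not y for y in reoffended)  # people who did not reoffend
    positives = sum(y for y in reoffended)      # people who did reoffend
    return fp / negatives, fn / positives

# Hypothetical predictions and two-year outcomes for two groups of defendants.
group_a_pred = [True, True, False, True, False, False, True, False]
group_a_true = [False, True, False, False, False, True, True, False]
group_b_pred = [False, True, False, False, False, False, True, False]
group_b_true = [False, True, False, True, False, True, True, False]

for name, pred, true in [("A", group_a_pred, group_a_true),
                         ("B", group_b_pred, group_b_true)]:
    fpr, fnr = error_rates(pred, true)
    print(f"group {name}: false positive rate={fpr:.2f}, false negative rate={fnr:.2f}")
```

Even when overall accuracy looks similar, the two groups can end up with very different false positive rates, which is exactly the kind of imbalance the investigation found.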

And what about art and algorithms?

Generative art that produces painting and sound through algorithms can give artists a means of reducing subjective creative intention. Under different conditions, parameters, and settings, the process runs entirely differently, and the result shows its own specific, unpredictable character.

Harold Cohen and AARON, 1973

Iannis Xenakis and Formalized Music, 1958
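
As a toy illustration of that parameter-dependence (not a reconstruction of AARON or of Xenakis’s stochastic methods), here is a seeded random-walk sketch in which every seed and setting yields a different, unpredictable line:

```python
# A minimal generative-drawing sketch: a random walk whose shape depends
# entirely on the seed and parameters, deterministic yet unpredictable.
import math
import random

def random_walk(seed, steps=500, step_len=1.0, turn_range_deg=90):
    """Generate a 2D path: at each step, turn by a random angle and advance."""
    rng = random.Random(seed)
    x = y = heading = 0.0
    path = [(x, y)]
    for _ in range(steps):
        heading += math.radians(rng.uniform(-turn_range_deg, turn_range_deg))
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        path.append((x, y))
    return path

# The same rule set, different seeds: entirely different forms emerge.
for seed in (1, 2, 3):
    end_x, end_y = random_walk(seed)[-1]
    print(f"seed={seed}: a 500-segment path ends near ({end_x:.1f}, {end_y:.1f})")
```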
