“Algorithms have a level of intelligence just below that of a bacterium”

26 November 2018

David Sumpter in front of a whiteboard.

David Sumpter, a Professor of Applied Mathematics, thinks that articles about what algorithms can achieve are rather exaggerated.

Hi David Sumpter, Professor of Applied Mathematics and author of the book Outnumbered: Exploring the Algorithms That Control Our Lives. Algorithms are constantly being discussed. How much control do they really have over our lives, and how worried should we be?

“Reports about what algorithms can achieve are rather exaggerated. Take ‘fake news’, for example: it is certainly a problem that this sort of thing continues to be shared on social media such as Twitter and Facebook. But the conspiracy groups that exist in their own bubbles amount to only 1–5 per cent of the population, according to American studies. As soon as these groups broaden out too much, they are challenged by people outside the bubble and meet counter-arguments, as you can see on YouTube.

“What is more dangerous is if you are interested in buying a new product and rely on the first review you come across on the internet. An enormous number of web pages exist simply to provide fake reviews and link to retail sites such as Amazon; the page owners then receive a commission on the products sold.”

But do researchers like you bear any responsibility for constructing algorithms that can be used for questionable purposes?
“That is an interesting question. In May the Centre for Interdisciplinary Mathematics at Uppsala University held a workshop on that subject. Its theme, ‘Mathematical Social Activism’, addressed how mathematicians can take greater responsibility for ensuring that our knowledge is used to improve society.

“People often assume that algorithms are more accurate than humans. However, studies show that algorithms and people are about equally reliable when making assessments and predictions about the future. An algorithm is no better at this than a random person.”

But would it not be possible to develop an algorithm that is a better version of us without our human weaknesses and prejudices?
“Usually the problem is that the algorithms pick up our biases from the start. We develop an algorithm based on a particular kind of decision in which our prejudices are already included, and the algorithm then carries them further.

“For example, I have studied how algorithms are developed to understand language. One of my students is currently doing a project on this, and it turns out that the algorithms are incredibly sexist. They are trained on huge amounts of text from news, encyclopaedias and so on. You can then ask an algorithm to complete word analogies: pair the word man with computer programmer, and ask it to fill in the corresponding word for woman. The correct answer here is also computer programmer, but the algorithm proposes housewife as the most suitable word. That is because the algorithm has been trained on all the texts that we humans have written, in which women traditionally worked in the home. And unfortunately, judging by current texts, this bias continues to be built into the algorithms.”
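The analogy test Sumpter describes can be reproduced with off-the-shelf word embeddings. Below is a minimal sketch in Python, assuming the gensim library and a pretrained GloVe model fetched through gensim's dataset downloader; the interview does not say which model or corpus his student actually used, so this is purely illustrative.

import gensim.downloader as api

# Load pretrained GloVe word vectors (trained on Wikipedia and Gigaword news text).
vectors = api.load("glove-wiki-gigaword-100")

# Analogy arithmetic: vector("programmer") - vector("man") + vector("woman") is
# roughly the model's answer to "man is to programmer as woman is to ...".
for word, score in vectors.most_similar(positive=["woman", "programmer"],
                                        negative=["man"], topn=5):
    print(f"{word}\t{score:.3f}")

Exactly which words come out on top depends on the embedding and its training corpus, but occupation analogies of this kind are a standard way of exposing the gender bias that such models absorb from human-written text.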

What does the future with algorithms look like?
“As far as algorithms and artificial intelligence are concerned, I am quite sceptical of the idea that AI will take over the world, at least within the next century.

“Previously I conducted research in mathematical biology and looked, among other things, at different levels of biological organisation. The human brain can be placed at the top level in terms of intellectual development. But researchers in Great Britain have taught a bumblebee to play football by rewarding it with food after it pushed a ball into a goal. They also let another bumblebee watch, and it then managed to score a goal too. So insects can learn from each other in a way that algorithms and robots cannot.

“Algorithms, by contrast, are at a level of intelligence just below that of a bacterium. An E. coli bacterium living in your gut knows, for example, what food it should look for. Algorithms are not even at that level of intelligence. So they don’t scare me very much, ha ha.”

On 3 December David Sumpter will speak at an alumni event at the Swedish Residence in London.

---

Additional reading:

Will AI take over the world?

Mathematics for social activism
