
For every problem there might be an algorithm that can solve it, but for every algorithm there is also a problem. Speaking at the GEN Summit in Portugal yesterday (30 May), Jonathan Albright, research director at the Tow Center for Digital Journalism, shared advice for journalists reporting on algorithms.

Albright explained that investigating algorithms should become part of the journalistic process and be seen by newsrooms as a practice that is “cooked” into what they do.

“How do we shed light on the ways that we receive information and the ways that we are exposed to certain types of things like search results?” he asked.

News organisations such as ProPublica have been reporting on algorithms and the ways they make decisions that affect our lives for years.

Journalists have uncovered bias and discrimination in algorithms that calculated car insurance premiums or determined whether a person was more or less likely to reoffend.

They also found it was possible to post job adverts on Facebook while excluding users over a certain age from seeing them, among other targeting filters the platform allows.

“I believe that the way to look at these algorithms is to look at what decisions they are making and audit these decisions,” Julia Angwin, a former investigative journalist at ProPublica, explained at the International Journalism Festival in April.
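To make “auditing the decisions” concrete, here is a minimal sketch in Python of the kind of comparison such an audit can rest on: given a file of an algorithm’s decisions, it checks whether false positives fall more heavily on one group than another. The file name, column names, and labels are hypothetical placeholders, not ProPublica’s actual data or code.

```python
# Minimal sketch of a decision audit, in the spirit of ProPublica's
# recidivism reporting. The CSV and its columns are hypothetical
# placeholders, not ProPublica's actual data or method.
import csv
from collections import defaultdict

counts = defaultdict(lambda: {"fp": 0, "neg": 0})

with open("decisions.csv", newline="") as f:  # hypothetical audit file
    for row in csv.DictReader(f):
        group = row["group"]                       # e.g. a demographic label
        predicted = row["predicted_high_risk"] == "1"
        actual = row["reoffended"] == "1"
        if not actual:                             # person did not reoffend...
            counts[group]["neg"] += 1
            if predicted:                          # ...but was flagged high risk
                counts[group]["fp"] += 1

# A large gap in false positive rates between groups is one signal that
# the algorithm's mistakes are not distributed evenly.
for group, c in sorted(counts.items()):
    rate = c["fp"] / c["neg"] if c["neg"] else 0.0
    print(f"{group}: false positive rate = {rate:.1%} ({c['fp']}/{c['neg']})")
```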

For Albright, looking at the input is equally important. While the spotlight has been placed on transparency, he explained that this can alienate the public.

Referencing the work of Frank Pasquale, author of The Black Box Society, he said putting a lot of emphasis on revealing how an algorithm works can “decrease understanding because it’s so complicated”.

“The job is to communicate why this matters and the power dynamics. Don’t confuse transparency with algorithmic accountability. It’s just one step along the way.”

Algorithmic accountability reporting should aim to explain these power dynamics, audit the decisions algorithms make, and shine a light on how algorithms prioritise, classify, associate and filter information.

Fact-checking, for example, can be undermined by the way YouTube presents related videos. A person who watches a fact-checking video from a trusted organisation can then fall down a spiral of conspiracy theories and propaganda if autoplay is enabled and YouTube keeps queueing up related videos.
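One way to document that spiral is to log the chain of videos autoplay would serve a passive viewer. The sketch below assumes a hypothetical get_top_related() function standing in for whatever source of recommendation data is available (an API, a scrape, a browser-session export); it illustrates the method, not YouTube’s actual interface.

```python
# Sketch of following an "autoplay chain": start from one video and
# repeatedly hop to the top related video, recording what a passive
# viewer would be served next.
def get_top_related(video_id: str) -> str | None:
    """Hypothetical stand-in: return the first related/up-next video id,
    or None if there is none. Plug in your own recommendations source."""
    raise NotImplementedError("supply a recommendations source here")

def follow_autoplay(start_id: str, hops: int = 10) -> list[str]:
    chain, current = [start_id], start_id
    seen = {start_id}
    for _ in range(hops):
        nxt = get_top_related(current)
        if nxt is None or nxt in seen:  # dead end, or a recommendation loop
            break
        chain.append(nxt)
        seen.add(nxt)
        current = nxt
    return chain  # the sequence a viewer on autoplay would drift through
```

Running the same crawl from fact-checking videos versus conspiracy videos, and comparing where the chains end up, is one way to turn the anecdote into evidence.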

Google’s search bar, where people start typing their questions, autocompletes once enough keywords have been entered, which means questionable suggested searches can surface.
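Those suggestions can themselves be logged and compared as a query grows keyword by keyword. The sketch below uses an unofficial suggestion endpoint that Google has long exposed for browsers; it is undocumented, so its shape could change or access could be blocked at any time, and the example prefixes are placeholders.

```python
# Sketch of logging Google's autocomplete suggestions for a query prefix.
# Uses an unofficial, undocumented endpoint (assumption: it may change
# or disappear without notice).
import json
import urllib.parse
import urllib.request

def suggestions(prefix: str) -> list[str]:
    url = ("https://suggestqueries.google.com/complete/search"
           "?client=firefox&q=" + urllib.parse.quote(prefix))
    with urllib.request.urlopen(url, timeout=10) as resp:
        # Response shape: [prefix, [suggestion, suggestion, ...]]
        data = json.loads(resp.read().decode("utf-8", "replace"))
    return data[1]

# Watch how the suggested searches shift as more keywords are typed.
for prefix in ["algorithms are", "algorithms are b"]:  # placeholder prefixes
    print(prefix, "->", suggestions(prefix))
```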

When journalists think about algorithms for news, they need to explore and report on not just the results, but also tie those results back to behaviour, socioeconomics, and the wider context in which the algorithms operate, he concluded.
