These bots click ads on climate articles to show how broken the news is

Illustration of hands and a New York Times front page.

Illustration: Tega Brain and Sam Lavigne

Watching Synthetic Messenger is a strange experience. It runs in a Zoom call with 100 participants, all of them bots. Observers can watch these bots, which are eerily anthropomorphized with images of disembodied hands and voices that repeat "scroll" and "click," as they methodically scroll through news articles about climate change and click every ad on every page.

The project was created by two New York-based artists and engineers and launched earlier this month. In its first half-week of being live, its bots visited 2 million climate articles (you can see them listed here) and clicked on 6 million ads.

If all this sounds like a bizarre, psychedelic art project, it definitely is. But it is also a critique of how the media has shaped the narrative of the climate crisis.

Most online outlets are funded by advertisers. Stories that get more ad clicks can also become more prominent in Google's search algorithm, drawing more eyes to the page. And when certain stories get more views and engagement, news organizations are more likely to publish similar articles. Perversely, this means that advertising mechanisms and algorithms can play an outsized role in determining what news people see, rather than other factors, such as, well, the importance of the stories.

"Through this project, we want to look at how media ecology affects our actual ecology, and how narrative affects our physical world," said Sam Lavigne, an artist and assistant professor in the Department of Design at the University of Texas.

Of course, as Lavigne was quick to point out, conflicting narratives have always played a role in the climate crisis. Polluters know how important it is to control how people talk and think about the climate crisis, and they have spent fortunes on various misinformation campaigns, including shaping the media narrative.

“The narrative about climate change has always been controlled by the fossil fuel industry and lobbying groups,” Lavigne said.

Algorithms further distort the way news, or increasingly misinformation, reaches people. For example, YouTube's recommendation algorithm has encouraged viewers to watch videos full of climate denial. YouTube also runs ads against those videos, profiting from the misinformation while incentivizing viewers to consume more of it.

A year and a half ago, historically devastating wildfires spread across Australia. A claim sprang up that they were caused by arsonists, not the climate crisis. That misinformation, a group of researchers found, was spread using online troll accounts and bots. Conservative media then turned around and amplified the claims, creating a feedback loop in which everyone was busy debunking lies rather than talking about how to address the climate crisis. (The same scene played out in the U.S. last year.) But Tega Brain, who co-created the project, pointed out that these are not the only ways algorithms have colored the media landscape.

"All news and therefore all public opinion is being shaped [by] algorithms," said Brain, an assistant professor of digital media at New York University with a background in environmental engineering. "The algorithmic systems that affect news are these black box algorithms," she added, referring to the way tech companies hide their code and priorities from the public.

Another perspective on the carbon cycle.

GIF: Tega Brain and Sam Lavigne

Synthetic Messenger, then, plays with this system by signaling bot interest in climate stories. While it may play a small role in boosting climate coverage, there are complications. For one, because its algorithm is imprecise and based on climate-related keywords, it also clicks on ads at outlets that deny climate change. Its creators tried to address this by blacklisting denial sites owned by Rupert Murdoch, but it's not a perfect system.
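The targeting the article describes, keyword matching plus a domain blacklist, can be sketched roughly like this. This is a minimal illustration only: the keywords, domains, and function names below are assumptions, not the project's actual code, and they show exactly why the approach is imprecise (any article mentioning "climate" matches, denial or not, unless its domain is explicitly blacklisted).

```python
# Hypothetical sketch of keyword-based ad targeting with a domain blacklist.
# All keywords and domains here are illustrative placeholders.

CLIMATE_KEYWORDS = {"climate", "warming", "emissions", "carbon", "wildfire"}

# Denial-leaning outlets to skip (placeholder domains, not the real list).
BLACKLIST = {"example-denial-site.com"}

def should_visit(url: str, article_text: str) -> bool:
    """Return True if the article looks climate-related and the
    outlet is not on the blacklist."""
    # Crude domain extraction from the URL.
    domain = url.split("/")[2] if "://" in url else url.split("/")[0]
    if domain in BLACKLIST:
        return False
    # Naive keyword match: strip punctuation, lowercase, intersect.
    words = {w.strip(".,!?\"'").lower() for w in article_text.split()}
    return bool(CLIMATE_KEYWORDS & words)
```

The imprecision the creators acknowledge lives in that last line: a keyword intersection cannot tell a climate report from a climate-denial piece, so the blacklist has to do all the filtering.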

If the project were designed primarily as a tool of political organizing, these could be major sticking points. But Brain and Lavigne are well aware that their project won't change the media landscape or fight the climate crisis on its own.

"We're not going to frame it as, 'This is a truly effective new radical strategy to combat climate change,'" Brain said. "In essence, in this project we're doing what's called 'click fraud.' If we did this for long enough and at a large enough scale, it wouldn't work, because obviously the ad networks are doing what they can to prevent automated behavior. They would block it."

Rather, its purpose is to draw attention to the messed-up incentive structure that determines which climate stories get told and amplified by advertisers and search algorithms.

"It's not that we're offering this as a solution to the problem we're facing. The solution is meaningful climate policy, effective policy," Brain said. "But we are trying to start a conversation and reveal how our media landscape currently works."
