Honest Signals is an artwork that explores information corruption. It focuses on the spread of climate disinformation and how the costs of a polluted information ecosystem are leading to a failure to take urgent action to save the planet.
The project includes a physical sound installation and a playful online game, both of which use data to ‘corrupt’ sound as a metaphor for how disinformation corrupts, masks or drowns out the ‘Honest Signals’ that evidence-based sources are communicating.
The term Honest Signal refers to a theory in animal behavioural studies used to describe a biological signal that carries a cost to the creature, thereby proving its reliability. For example, the spectacular, but heavy, tail of the peacock is a costly investment in advertising the male’s quality as a mate. The costs borne by climate scientists include combatting questions designed to derail and deflect, enduring online abuse, attending to fake solutionism and false narratives, and uncovering alternative terms that hide the facts.
In a Special Report, Newsguard highlights not only how cheap it is to produce and spread fake news, but also how it can generate revenue from advertisers. Mis- and disinformation put pressure on credible news sources, which not only have to produce news but also have to push back against the disinformation and the community of people and bots backing it up. The economic impact of disinformation is significant: hoaxes generate interest, and interest generates money.

Source: Newsguard
TikTok data
Our original plan was to use TikTok’s new Platform Research API, which was slated for release early in 2023. However, the platform enabled the API in Beta mode for only a small number of selected researchers. Manually searching TikTok for climate deniers produces varying results, none of which are 100% focussed on climate disinformation. We discovered that some search phrases are blocked, e.g. “climate change fake”, “climate change hoax”, “climate change is not real”, “global warming hoax”. But these blocks are easy to get around. This led to some interesting research into how users evade moderation systems: alternative phrases and terms (or Internet Algospeak) easily side-step AI moderation. There is a long history of information being encoded to avoid authorities or threats. The new impact of this on social media platforms is that it can create datasets with double meanings: a cosmetics company analysing post and comment data about ‘mascara’ on TikTok may find unexpected results.
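To give a sense of how Algospeak complicates data work, here is a minimal sketch of normalising variant spellings back to the canonical phrases they stand in for before analysis. The term lists are purely illustrative, not our dataset:

```python
# Sketch: mapping hypothetical "algospeak" variants back to canonical phrases
# so that captions and comments can be grouped consistently before analysis.
import re

# Illustrative variant lists only - not drawn from the project's data.
ALGOSPEAK = {
    "climate hoax": ["cl1mate hoax", "climate h0ax"],
    "not real": ["n0t real", "notreal"],
}

def normalise(text: str) -> str:
    """Replace known variant spellings with their canonical phrase."""
    lowered = text.lower()
    for canonical, variants in ALGOSPEAK.items():
        for variant in variants:
            lowered = re.sub(re.escape(variant), canonical, lowered)
    return lowered

print(normalise("They say the CL1MATE HOAX is n0t real"))
# -> "they say the climate hoax is not real"
```

In practice the variant lists themselves shift constantly, which is exactly what makes automated moderation (and research) hard.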
As TikTok data is proving difficult to access we have turned our attention to YouTube, from which we have captured audio samples from climate deniers. There is a lot of highly produced, highly viewed and commented on content. The platform is flooded with climate disinformation and the surrounding debates, and oil company channels all have a focus on renewables and decarbonisation that is disproportionate to their activities. Fossil fuel greenwashing (a form of marketing that gives the impression the companies are acting for the good of the climate) is a part of the climate disinformation problem as the techniques of deflect, distract and delay are employed. The meta data that we acquire will be used to generate the corruption patterns in the audio files.
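As an illustration of how engagement metadata could drive the corruption, the sketch below uses the public YouTube Data API v3 to pull view and comment counts and map them to an intensity between 0 and 1. The scaling constants and the mapping itself are placeholders, not the project’s final method:

```python
# Sketch: pulling engagement metadata for a video with the YouTube Data API v3
# and mapping it to a corruption intensity between 0 and 1. Scaling constants
# are placeholders; the real corruption patterns will come from the project's
# own analysis of how the disinformation spreads.
import math
from googleapiclient.discovery import build  # pip install google-api-python-client

API_KEY = "YOUR_API_KEY"  # hypothetical credential

def corruption_intensity(video_id: str) -> float:
    youtube = build("youtube", "v3", developerKey=API_KEY)
    response = youtube.videos().list(part="statistics", id=video_id).execute()
    stats = response["items"][0]["statistics"]
    views = int(stats.get("viewCount", 0))
    comments = int(stats.get("commentCount", 0))
    # Log-scale both counts so one viral video does not saturate the mapping.
    score = math.log10(views + 1) + math.log10(comments + 1)
    return min(score / 12.0, 1.0)  # 12 is an arbitrary illustrative ceiling
```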
Young people’s workshop
We ran a qualitative research session with seven young people (aged between 18 and 25) to understand their views on climate disinformation. The key findings from this small sample included:
- Before posting on Instagram they often check the ‘fact’ by going to a trusted source, i.e. finding government or academic papers on the topic.
- All were aware that social media loses context, nuance and complexity in its presentation. The ability to post longer texts on Instagram gives space for people to show their evidence, so such posts are seen as more reliable.
- The inability on some platforms to click away from the app to find out more makes verifying facts harder and less ‘in the moment’.
- Some preferred to get information from videos, and lost focus with text-only information.
- Different platforms are treated with different levels of seriousness, i.e. an Instagram post is trusted more than one on Twitter or TikTok.
- Some had posts flagged and blocked on Facebook, which surprised them but gave them a new awareness of how easily things spread by accident.
- The Twitter flagging system, which disputes the truth of a post, was found to be very useful (Trump’s account was cited).
The cohort also said that they felt they hadn’t received the critical thinking and analysis skills to prepare them for assessing disinformation, and that they thought a culture change was needed in how we consume information through the media.
Audio corruption
Our research into disinformation techniques has inspired the type of spoken word audio corruption we are developing. In our tests we have explored cut-up techniques (chopping out words or phrases and deleting or reinserting them elsewhere), distortion, discordance, glitching, and emphasis switching. Given what we understand about the importance of language in this project, we are leaning toward the effects of re-mixing and changing emphasis for the final work. The application of the corruption filters will be determined by data patterns drawn from the spread of disinformation.
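The sketch below shows the cut-up idea in its simplest form: the audio is sliced into short segments, and a data-derived pattern decides which segments are dropped, repeated or re-inserted elsewhere. The segment length, thresholds and example pattern are illustrative only, not the settings of the final work:

```python
# Sketch of the cut-up style corruption described above: the audio is split
# into short segments and a 0..1 pattern (drawn, hypothetically, from
# disinformation-spread data) decides which segments are dropped, repeated
# or re-inserted. All numbers here are illustrative.
import numpy as np

def cut_up(audio: np.ndarray, sample_rate: int, pattern: list[float],
           segment_ms: int = 250) -> np.ndarray:
    """Re-mix audio segment by segment according to a corruption pattern."""
    seg_len = int(sample_rate * segment_ms / 1000)
    segments = [audio[i:i + seg_len] for i in range(0, len(audio), seg_len)]
    out = []
    for i, seg in enumerate(segments):
        level = pattern[i % len(pattern)]
        if level > 0.8:
            continue                            # drop the segment entirely
        elif level > 0.5:
            out.extend([seg, seg])              # stutter: repeat the segment
        elif level > 0.3 and i > 0:
            out.extend([seg, segments[i - 1]])  # reinsert an earlier phrase
        else:
            out.append(seg)                     # leave untouched
    return np.concatenate(out)

# Example: two seconds of test tone, corrupted by a short illustrative pattern.
rate = 16_000
audio = np.sin(2 * np.pi * 440 * np.arange(rate * 2) / rate).astype(np.float32)
corrupted = cut_up(audio, rate, pattern=[0.1, 0.6, 0.9, 0.4])
```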

The online game
The online component of the work is subtitled DIS or DAT: Is it DISinformation or DATa? The interface allows you to select an unknown track and play it before hitting the DIS or DAT button. If players get it right, they are rewarded with the source and with finding out how many others got it right first time. If players get it wrong, the corruption gets worse (more obvious) on the second playback. It’s a playful way to get people talking about the information corruption we see, hear and watch online. honestsignals.eu is being developed as a low carbon website.
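For illustration, a single DIS or DAT round could be modelled along these lines; the names and the fixed corruption step are hypothetical, not the site’s actual code:

```python
# Minimal sketch of one DIS or DAT round, as described above: a correct guess
# reveals the source, a wrong guess replays the track with heavier corruption.
# Class and field names are illustrative only.
from dataclasses import dataclass

@dataclass
class Track:
    source: str          # link to the original evidence-based clip
    is_disinfo: bool     # True if the sample is the corrupted / DIS version
    corruption: float    # 0..1, how strongly the corruption filters are applied

def guess(track: Track, player_says_dis: bool) -> str:
    if player_says_dis == track.is_disinfo:
        return f"Correct! Source: {track.source}"
    # Wrong: make the corruption more obvious on the second playback.
    track.corruption = min(track.corruption + 0.3, 1.0)
    return "Not quite. Listen again: the corruption is now more obvious."
```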
Sonic installation
The physical artwork is also a game. It will comprise a room full of cardboard speakers, each emitting sound samples: both corrupted versions and the originals. In this room of competing noise, the audience has to move around and listen carefully to each speaker to identify matching pairs of sounds, the original and its corrupted counterpart.
To build the work in the first instance we will run a community-build session where everyone gets to create their own speaker or set of speakers from very basic materials – cardboard, a bottle top, some wire and a magnet. The work is then installed at a venue for a fixed duration, after which the speaker builders are free to return and collect their speaker(s) to take home.


We are actively looking for partners to host the speaker workshop and the installation, so please get in touch.
With thanks to our partners Global Witness and Collective Discovery.