DAVID YOUNG

© Copyright 2024 David Young.
All rights reserved.

Hallucinations

Manipulated AI / machine-learning-generated images. 2023 - ongoing.


When an AI large language model (LLM) generates false information, it is said to hallucinate. The same can be said of the hype generated by the tech industry about the promise and future of these systems.

For this series I start by training an AI system, a Generative Adversarial Network (GAN), on just a handful of images - photographs taken in nature, for example, or just one or two solid colors. This is in contrast to the millions or billions of images typically used to train AI models. My system struggles to “understand” what it is being shown and produces images that reveal both the digital and organic qualities of its processing. I then take these images and manipulate them with my own custom code, looking for and amplifying hidden patterns that may be obvious to the machine but invisible to the human eye. The final images contain both the original grid of the AI-generated image and the possible understanding of the AI system. The titles are quotes from tech leaders that reflect the hype around the promise and future of the systems they represent - hype often as hallucinatory as the output of those systems themselves.

A Very Strange Decision (m651)

Mitigating the Risk of Extinction (m657)

People Are Begging to Be Disappointed (m702)

Loss of Control of Our Civilization (m695)

Superficial and Sloppy (m698)

The Tech the World Has Always Wanted (m689)
