2023
When large language models generate false information, it’s called hallucination. The same term applies to the inflated promises the tech industry makes about these systems’ transformative potential.
Hallucinations begins with Generative Adversarial Networks trained on minimal datasets — a handful of images rather than millions. With so little to learn from, the systems produce artifacts that expose their own limitations: failed attempts at coherence, digital residue, the gap between what they promise and what they can actually do.
But the work starts where the machine stops. Each image is then individually manipulated through custom code, channeling the artist’s own emotional responses to the tech industry’s mythology. These interventions become gestures of anger and skepticism — searching for patterns the machine perceives but cannot articulate, amplifying what remains hidden in the computational process.
The titles are quotes from tech leaders promoting revolutionary futures. The images emerge from the tension between those promises and the reality of systems struggling at the edge of their capability. Together they invite a felt encounter with the distance between what we’re told and what actually is.
Tabula Rasa - AI / machine learning generated images
Manipulations - manipulated AI / machine learning generated images
Flag - manipulated AI / machine learning generated images