I collaborated with Lucas Wozniak on an immersive AR installation in which:
  • Participants, guided by audio-first and touch-first AR, virtually forage for mushrooms above a projection of underground mycelial networks. Their actions gradually string together voiceovers of a food-for-thought folktale, assembled with the help of AI, fungi, and a corpus of anthropology texts.

Exhibitions: ITP Spring Show 2022, FAYD Digital Issue 002

Lucas has long been inspired by indigenous philosophy and fields such as ecopsychology, which offer fresh ways of viewing the world and humans' place in it, for instance by seeing nature and its organisms and processes as vibrantly intelligent and expressive of their own meaningful languages.

While brainstorming how to express curiosity about wild intelligences beyond human ways of knowing in a direct and literally lively way, my collaborator and I settled on fungi as the main character for the audience to engage with through a participatory story. We were inspired by research into fungal "language" and into fungi's underground mycelial networks as computing devices and electrical networks.

Questions driving the project development included:
  • How might interactions with fungal systems offer new ways of thinking?
  • What unique ways of processing information do mycelial networks express?
  • How might a speculative human-fungal interface communicate knowledge and perspectives on the world?


In order to bring communication with fungi and their fascinating ways of processing biological information to life in an embodied and intelligent way, we combined a wide variety of media, including:
  • AI text generation
  • natural language processing text extraction
  • voice-cloning techniques (spatial audio storytelling)
  • haptic feedback
  • audio feedback (immersive soundscape)
  • projection mapping of an animated mycelial network underlying a 3D layer of a local landscape
  • a physical encounter with mycelial soil
  • a virtually gamified version of foraging in AR

Tools we incorporated:
  • iOS App: RealityKit, ARKit, Core Haptics, Core Location, Spatial Audio, SwiftUI, Firebase
  • AI voiceovers: Mushroom-related texts, GPT-2, Python (NLP), Voice Cloning ML, Headphones
  • Unity: GLSL shader, Firebase, Photogrammetry (DJI Mavic Mini Drone, Meshmixer, Blender), Projector, Ableton patch
  • Ableton Live: Environmental sensor, Synths, Unity patch, Mixer, Speakers
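The Ableton Live side of the stack turns environmental sensor readings into synthesis parameters for the soundscape. A minimal sketch of that kind of mapping, with hypothetical sensor fields and parameter ranges rather than our exact patch:

```python
def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map a sensor reading from its expected range to a synth-parameter range."""
    value = max(in_min, min(in_max, value))  # clamp to the expected input range
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

def sensor_to_synth(reading):
    """Map one environmental sample (hypothetical field names) to parameters
    that could be sent on to a synth, e.g. over OSC or MIDI CC."""
    return {
        "filter_cutoff_hz": scale(reading["soil_moisture"], 0.0, 1.0, 200.0, 8000.0),
        "reverb_wet": scale(reading["humidity_pct"], 20.0, 100.0, 0.1, 0.9),
        "drone_pitch_midi": round(scale(reading["temp_c"], 5.0, 30.0, 36, 60)),
    }

params = sensor_to_synth({"soil_moisture": 0.5, "humidity_pct": 60.0, "temp_c": 17.5})
print(params)
```

The clamp keeps a noisy or out-of-range sensor from pushing a parameter past musically usable bounds.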

The final experience from our multimodal prototyping is an immersive installation game in which:
  • the participant begins by reading a written reflection planted in mycelial dirt by a previous participant
  • the participant then forages for mysterious, invisible mushrooms by walking above a mycelial projection
  • with a "foraging beacon" (encased iPhone) that vibrates when it intersects with a mushroom's invisible boundary
  • while enmeshed in an immersive soundscape generated from environmental sensor data recorded at a local fungi habitat
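At its core, the beacon interaction is a proximity test: each invisible mushroom has a trigger boundary, and the phone vibrates when the beacon crosses it. A minimal sketch of that logic, with hypothetical coordinates and radii (the actual app does this in ARKit world space and fires Core Haptics pulses):

```python
import math

def inside_boundary(beacon_xy, mushroom_xy, radius):
    """True when the foraging beacon is within a mushroom's invisible boundary."""
    return math.dist(beacon_xy, mushroom_xy) <= radius

def check_foraging(beacon_xy, mushrooms):
    """Return indices of mushrooms whose boundaries the beacon currently intersects;
    in the installation, each hit triggers a haptic pulse and a voiceover."""
    return [i for i, (pos, radius) in enumerate(mushrooms)
            if inside_boundary(beacon_xy, pos, radius)]

# Five hypothetical mushrooms placed over the projection plane: (position, radius)
mushrooms = [((1.0, 2.0), 0.3), ((4.0, 1.0), 0.3), ((2.5, 3.5), 0.3),
             ((0.5, 4.0), 0.3), ((3.0, 0.5), 0.3)]
print(check_foraging((1.1, 2.1), mushrooms))  # beacon near the first mushroom
```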

As each of five generatively placed mushrooms is foraged:
  • the participant hears a curated voiceover selection from a philosophical story generated by GPT-2 and built on an NLP algorithm's extractions from mushroom-related texts
  • (these represent nuggets of unexpected wisdom that emerge from interfacing with the mycelial network's information processing)
  • they gradually reveal pieces of a 3D landscape (a photogrammetry scan) of a local fungi habitat

Until finally:
  • the landscape has been restored (fully pieced together)
  • the soundscape quiets down
  • the participant returns the foraging beacon to its origin point and leaves their own written reflection in the mycelial dirt

User Flow

Tech Stack 

Small video demo

At the ITP Spring Show

© Henry - Haoyu Wang, 2018 - 2023. All rights reserved