
Future Fauna is an interactive augmented-reality art installation at the Swedish Museum of Natural History in Stockholm. Visitors interact with the taxidermy specimens in the showcases and watch the long-since-deceased animals return to life.
Visitors can play with, feed and breed the animals. The virtual beings are decoupled from the laws of biology and can interbreed across species. Strange beings populate the exhibition rooms: owls with antlers, foxes with eagle wings, wolves with moose bodies.
What role do human perceptions of beauty play in our interactions with non-human beings? What happens to nature in virtual worlds outside human control? What cryptozoological potential comes with a reality freed from the implacable laws of nature?
Find the app for iOS here and for Android here.
You can use the app without visiting the museum: the first info prompt gives you the option to start with 9 animals. You breed the animals by cuddling with them.

A stuffed Roe Deer comes alive in its showcase. Click for a walkthrough of the project.
The project enhances the traditional museum diorama with augmented reality, creating an augmented diorama. The static displays of taxidermy become interactive, and the visitors can play with and feed the animals. This explores the possibilities of using novel visualisation technology in a museum context.
What happens when visitors can interact with otherwise “dead” installations? AR opens up a deeper engagement with the exhibitions, and a deeper understanding of the objects on display.
Future Fauna is made by Jakob Skote in collaboration with interaction designer Sandra Dang and the Swedish Museum of Natural History. The project is funded by Kulturbryggan and Stockholms stad.
The project has been used by schools visiting the museum to teach evolutionary processes. A fictionalised text version of the project was featured in NORK magazine #4.

I did a number of user tests at the museum before release, especially with kids, since they are the museum's primary audience. Their main request was to be able to really pet the animals, to feel their fur.
The 3D models are created by scanning specimens in the archive of the museum.
The basement of the museum is filled with taxidermy animals, a large dusty archive of life. Beautiful African antelopes and towering bears, as well as ragged old cats and a reindeer broken in half. Nothing is thrown away in case it contains valuable genetic material, so that in the future we can recreate the beings we are now making extinct.




Screenshots from Agisoft photoscans.

The resulting 3D mesh + clean-up and UV mapping in 3D-Coat.
The mixed animals are made by a genetic algorithm that randomly selects genes from the parents and creates a new being with the respective limbs. The users themselves choose which animals to breed, but cannot choose how they are combined.
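The selection step can be sketched in a few lines. This is my own minimal illustration, not the project's actual implementation; the gene names are hypothetical placeholders for whatever body-part categories the app uses.

```python
import random

# Hypothetical body-part "genes" (placeholders, not the project's real trait set).
GENES = ["head", "body", "legs", "forelimbs", "tail"]

def breed(parent_a: dict, parent_b: dict, rng=random) -> dict:
    """For each gene, randomly inherit that limb from one of the two parents."""
    return {gene: rng.choice([parent_a[gene], parent_b[gene]]) for gene in GENES}

owl = {g: f"owl_{g}" for g in GENES}
moose = {g: f"moose_{g}" for g in GENES}

offspring = breed(owl, moose)
# Every part comes from one of the two parents, but the combination is random:
# the user picks the parents, the algorithm picks the mix.
assert all(offspring[g] in (owl[g], moose[g]) for g in GENES)
```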
The initial aim was to use a neural network for this process, in collaboration with computer engineer Hiroharu Kato, creator of the world's first Neural Renderer. For several reasons this method was unfortunately not viable, and we had to resort to a simpler one. This is an initial test of a tiger reimagined by Hiroharu's neural network.



AR Technicals - The animals follow a NavMesh based on the floor plan of the museum. This forces the animals to avoid the same obstacles as you do, creating the illusion of a shared space. The NavMesh is re-instantiated at each showcase, which solves drift and misalignment and makes for a seamless experience.
The exhibition has 9 active showcases, so the virtual world is made up of 9 different virtual spaces, each with its own origin as the point of instantiation to be aligned with the real space. When swapping one virtual space for another, the animals are translated to the corresponding point in the new space.
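The swap amounts to a change of coordinate frame. A minimal sketch of the idea, under assumed names rather than the project's actual code, and ignoring rotation (which a real AR alignment would also have to handle):

```python
# Each showcase space has its own origin expressed in real-world coordinates.
# When the tracker switches spaces, a point from the old space is re-expressed
# in the new space so the animal appears to stay put in the shared room.

def to_world(local_point, origin):
    """Local coordinates -> real-world coordinates."""
    return tuple(p + o for p, o in zip(local_point, origin))

def to_local(world_point, origin):
    """Real-world coordinates -> local coordinates of a given space."""
    return tuple(p - o for p, o in zip(world_point, origin))

def translate(point, old_origin, new_origin):
    """Re-express a point from the old space in the new space's frame."""
    return to_local(to_world(point, old_origin), new_origin)
```

For example, `translate((1, 2, 3), (0, 0, 0), (1, 1, 1))` gives `(0, 1, 2)`: the animal occupies the same real-world spot while its coordinates change with the space.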

The virtual spaces with their instantiation points overlaid on the real space. One important lesson from this process is the need for a shared language for talking about multi-layered spatialities. We named the virtual spaces after their respective animals.





Screenshots from custom alignment app
A printed marker mounted on the side of the showcase constitutes the key between the virtual and the real space. The computer recognises the marker as the instantiation point of the virtual room.
The markers are designed to be as easily readable by the computer as possible: a lot of detail, a broad histogram and no repetition. At the same time they must be easily identifiable by humans, fit aesthetically within the exhibition space, and match the general graphic profile of the museum.
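The "broad histogram" criterion means the marker's pixel intensities should spread across many gray levels rather than cluster around a few. A rough illustrative check, my own sketch and not the museum's actual tooling:

```python
# Heuristic: count how many coarse gray-level buckets a marker's pixels fill.
# A flat, low-contrast image fills few buckets; a detailed, high-contrast
# marker fills many, which trackers generally find easier to lock onto.

def histogram_spread(pixels, levels=256):
    """Fraction of 16 coarse gray buckets that actually occur in the image."""
    buckets = {p * 16 // levels for p in pixels}
    return len(buckets) / 16

flat = [128] * 100                    # uniform gray: poor marker candidate
varied = list(range(0, 256, 2)) * 2   # intensities spread out: better candidate

assert histogram_spread(flat) < histogram_spread(varied)
```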




