A collection of key project prototype images - including diagrams, 3D animation stills and computationally generated artworks - expressing core project concepts, followed by a series of videos of talks about iscri.

Images

Etic Lab's Alex Hogan and Dr Kevin Hogan designed the initial experimental set-up. Alex's system diagrams shown here are informed by the Lab's whole-systems approach to developing Machine Learning models, which draws on the thought of biologists Maturana and Varela.

This diagram shows the inseparability of the octopus and its environment, how the Reinforcement Learning AI model becomes part of that environment, and the questions the setup aims to explore.

The questions this setup explores are: is there a communication relationship between us and the octopus, mediated by the environment? If so, does the octopus set the rules of communication, and can we learn them? And how do changes in our expectation of equilibrium affect the above?

The second system diagram shows that the AI model's detectors monitor changes in the underwater environment, while its emitters broadcast video into it. The streamed video is modulated by the changes detected in the environment, meaning that, if it chose to respond, the octopus could itself modulate the streamed output.
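To make that coupling concrete, here is a minimal Python sketch of the detector-to-emitter feedback loop described above. It is an illustration only, under assumed names: EnvironmentDetector, VideoEmitter, the motion and luminance readings, and the speed/brightness modulation are all hypothetical stand-ins, not the project's actual sensors, video pipeline or Reinforcement Learning components.

```python
# A minimal sketch of the detector -> modulation -> emitter loop described above.
# All names (EnvironmentDetector, VideoEmitter, modulate) are hypothetical
# illustrations, not part of the actual iscri system.

import random
import time
from dataclasses import dataclass


@dataclass
class EnvironmentReading:
    """A single snapshot of change detected in the underwater environment."""
    motion: float      # e.g. normalised amount of movement near the screen
    luminance: float   # e.g. normalised change in ambient light


class EnvironmentDetector:
    """Stands in for the underwater sensors; here it just returns random values."""
    def sense(self) -> EnvironmentReading:
        return EnvironmentReading(motion=random.random(), luminance=random.random())


class VideoEmitter:
    """Stands in for the underwater screens that broadcast the video stream."""
    def emit(self, frame_params: dict) -> None:
        print(f"emitting frame with params: {frame_params}")


def modulate(base_params: dict, reading: EnvironmentReading) -> dict:
    """Couple the stream to the environment: detected change alters playback,
    so activity near the detectors indirectly edits what is streamed."""
    return {
        "speed": base_params["speed"] * (1.0 + reading.motion),
        "brightness": base_params["brightness"] * (0.5 + reading.luminance),
    }


if __name__ == "__main__":
    detector, emitter = EnvironmentDetector(), VideoEmitter()
    base = {"speed": 1.0, "brightness": 1.0}
    for _ in range(3):                         # a few iterations of the feedback loop
        reading = detector.sense()             # monitor changes in the environment
        emitter.emit(modulate(base, reading))  # broadcast the modulated stream
        time.sleep(0.1)
```

In this sketch the detected change simply scales playback parameters; in the project's framing, this loop is where the learned model sits, so that an octopus acting near the screen could, if it chose, shape the stream being broadcast back to it.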

This setup also acknowledges the entanglement of experimental setup, tools, questions and aims with results. It aims to remove human expectations and biases as far as possible, by training the AI model only on environmentally gathered data that includes the octopus's behaviour, and by creating an interface that could allow the octopus to engage with, play with and edit the video stream.


The underwater experimental setup is visualised in this image, showing the underwater screens - emitters - in an ocean mesocosm. The setup was developed through conversations across the whole group, and our ideas about it changed over the course of the project. Other possible emitters included underwater holographic projections and haptic objects, as touch is so important for octopus vision. Thinking about objects in the environment, we also recognised how important play as a response might be for an octopus. Our hypothesis was that an octopus's participation might constitute a form of communication, mediated by the AI.


Maggie was regularly diving off the coast of Cape Town, in the same waters where Craig Foster’s Netflix documentary 'My Octopus Teacher' was filmed. She worked with local BBC underwater cameraman James Loudon to capture high-quality underwater images of an octopus's environment, as part of the visual research.


Imagery included close-ups of the ocean floor and of objects from the octopus's environment that make up an octopus visual-tactile vocabulary, whose textures, forms and patterns are mimicked in octopus camouflage and shapeshifting. The collection of essays 'Cephalopod Cognition' (eds. Darmaillacq, Dickel and Mather) was a key recent text for our understanding and interpretation of octopus visual cognition.


0rphan Drift produced a body of visual prototype experiments, developed through Lidar animations, computational art and the 3D animation software Blender. These were based on Maggie's experiences of diving in Cape Town’s Great African Sea Forest and working with South African interspecies communicator Anna Breytenbach to gather more sensory-based information, together with our collaborative marine biology research and project discussions.

These still and animated image prototypes played with forms and textures from octopus environments and speculatively explored ways an octopus might experience its environment, visually and as a distributed consciousness.

The iscri team developed what we called the 'Encounter Diagram' for our meeting with Serpentine Creative Director Hans Ulrich Obrist.

The diagram was an attempt to map out iscri's key aesthetic, ethical and philosophical approaches and concepts that we had developed over four years through octopus cognition research and in-depth, cross-disciplinary group discussion.




Videos

A selection of talks and a 3D video showreel of experiments using computational art to visually prototype hypotheses about how octopuses might see - or 'hypotypes', to use the term coined by Alasdair Milne in his iscri archive text. Click on the images to watch the videos.

'Kraken?' was the first collaborative talk given by 0rphan Drift and Etic Lab, in the early stages of thinking about what it might mean to try to communicate with an octopus, before we named the project iscri.

We gave this talk at Goldsmiths, hosted by Dr Ramon Amaro in the Visual Cultures department.

Maggie Roberts spoke on behalf of 0rphan Drift; Dr Kevin Hogan and Stephanie Moran spoke on Etic Lab's behalf.

Maggie spoke about iscri as part of her talk on 0D for Dr Betti Marenko’s ‘Hybrid Futures Lab’, part of the University of London's ‘Digital Innovations Season’, 2020.

iscri showreel with 3D animation prototypes developed in Blender, Lidar animations and computational art experiments by 0rphan Drift, 2020-21, with computational artists Megan Bagshaw, Duncan Paterson, George Simms and Jason Stapleton.

Maggie and Stephanie discuss iscri as part of the panel, 'Strange Relations: Exploring Interspecies Communication through AI and the Arts', for FIBER Festival 2021, Amsterdam, alongside Špela Petrič speaking about her work with AI and plants. Moderated by Ruben Baart.

Stephanie and Kevin are interviewed by Guy J. Baker, Editor of the Marine Biological Association's publication, The Marine Biologist. This followed on from an article about iscri that was published in the magazine.

‘A Cephalopod - Machine Encounter’: Maggie in conversation with the Serpentine Gallery’s Creative AI Lab team on Twitch for London Frieze Week, 2021.

Maggie and Stephanie present a paper about iscri at Art Machines 2: International Symposium on Machine Learning and Art 2021, as part of the panel 'Interspecies Research and Becoming Animal'.

A promo video for an AI tech industry pitch by Beth and Alex, packaging up some of the ideas from the research into a product idea. This was part of a short-lived attempt to raise tech investment for the AI development part of the project (which would involve a lengthy and costly period of training an AI model in the Mediterranean Sea).