Haptics and Robotics Projects by Ad Spiers: Lunchtime Talk Write-Up
Operating Theatre to Immersive Theatre
Last Friday we were joined in the Studio by the lovely Ad (Adam) Spiers. Ad is a research associate at Bristol Robotics Laboratory (BRL) where he primarily researches haptic technology (which interfaces with a user via their sense of touch) and its application to robotic surgery.
Ad started the talk by offering us a dictionary definition of what haptics is:
Adjective: Relating to the sense of touch, in particular relating to the perception and manipulation of objects using the senses of touch and proprioception. - Oxford English Dictionary
He then explained that haptic sensation can be simulated: when a human holds on to a robot and exerts a force, i.e. pushes on it, the robot can use its motors and some very clever maths to push back. This gives the illusion of pushing on a textured, solid object. We are able to get the sensation of something physical without actually needing to touch it. This can be incredibly useful; it allows Ad to prototype and test surgical equipment without having to physically build it. It has also recently been used to train medical students and vets, for example by recreating the sensation and feel of a pregnant cow at multiple stages throughout her pregnancy.
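The "pushing back" idea can be sketched in a few lines of code. This is a minimal illustration of impedance-style rendering of a virtual wall, not Ad's actual implementation; the function name, wall position and stiffness value are all assumptions made for the example.

```python
def wall_force(position_mm, wall_mm=50.0, stiffness=0.8):
    """Force (in newtons) a haptic device would command its motors to apply.

    In free space the device offers no resistance; once the user's hand
    penetrates the virtual wall, a spring-like force (Hooke's law,
    F = stiffness * penetration) pushes back, creating the illusion of
    a solid surface.
    """
    penetration = position_mm - wall_mm
    if penetration <= 0:
        return 0.0                      # free space: no resistance
    return stiffness * penetration      # inside the wall: push back

# The deeper you push, the harder the device resists:
print(wall_force(40.0))  # 0.0 - hand still in free space
print(wall_force(55.0))  # 4.0 - 5 mm past the wall at 0.8 N/mm
```

Changing the stiffness (and adding damping or texture terms) is how the same hardware can be made to feel like very different materials.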
Ad then said that by developing haptic technology, surgeons would be able to have all the benefits of keyhole surgery (less scarring and less chance of infection) while still having the sensation of tactile touch that would normally require invasive surgery.
Ad has also been (and continues to be) involved in a number of collaborative creative technology projects related to haptics and robotics. These have included shape-shifting navigation aids for pitch-black installations and moving origami foxes to intrigue (and confuse) ravers. He introduced us to four of his recent creative projects:
Milk Pixel is a collaboration between Ad, Paul O’Dowd, Tom Burton and Jason Welsby. It is a giant interactive RGB matrix of milk bottles, which can respond to webcam and sound input, or any other data. Milk Pixel was first shown at the Arnolfini as part of unCraftivism, and Ad hopes to have version two ready for the festivals in 2013. You can watch a video of it in action here.
Ad then introduced us to Roborigami, a collaboration with Coco Sato, a giant origami artist, and Peter Bennett, an interaction design researcher. The project was born out of a chance meeting at the Secret Garden Party 2011, where they decided they wanted to merge their three skills and create a paper sculpture that would incorporate movement and sound. They settled on foxes, and the animals were created out of paper. Each fox was then fitted with a motor and speaker that allowed them to randomly cock their heads and make intermittent sounds. They showcased the foxes at the Secret Garden Party 2012 and then at Playgroup Festival. They are currently developing the foxes further and are looking into making them battery-powered, waterproof and interactive. You can watch a video of them at the Secret Garden Party here.
Ad then spoke about DADA, a retired, recycled and re-purposed robot arm that Ad rescued from a skip in 2005. It had been stripped of all its assets, leaving just the motors, transmissions and metal. It sat idle in Ad’s flat until he got talking to Justin Windle (Soulwire) at a party and they decided to collaborate on a project to make tangible generative art. They decided to experiment with DADA and see what could be created from Justin’s artwork. Because Ad programs only the end points of each drawn line, and because of the arm’s unusual mechanical structure, the work that DADA produces can’t be predicted. This leads to some astonishing pictures being created. During his time in the Studio, Ad has been working on creating a simple Processing class, and he hopes to eventually create a gallery show and live drawing exhibit with DADA. Ad is currently looking for more programmers who want to be involved; you can get in touch with him on firstname.lastname@example.org
The Haptic Lotus
The Haptic Lotus was developed after Ad was approached by Maria Oshodi from EXTANT, the UK’s only professional performing arts company of visually impaired people, to develop something that could ‘provide a rich and equivalent cultural experience to both visually impaired and sighted people.’ They wanted to create a cultural experience that was not dependent on sight.
They looked at the traditional experience of theatre and decided straight away to move away from the traditional stage format. This led them to think about immersive theatre and the work of companies like Punchdrunk, where you are expected to move around the space during a performance and search for your own experiences. They felt that this would be the most interesting way to create an equivalent performance, and decided that in order to level the sensory playing field it should take place entirely in the dark. This was where Ad came in: Maria and Ad developed a gadget that would guide the audience through the space and help them make sense of their environment.
This led Ad to investigate how he could use haptic sensation to imply proximity. As this is not an experience that we naturally have, it creates a ‘new sense’ and again helps to “level the sensory playing field” between sighted and non-sighted people. After working through various prototypes and user testing, Ad decided on the Haptic Lotus. The lotus ‘petals’ slowly open up in your hand as you get closer to a key feature or experience in the room, and if you stay still (as your impulse might be in a dark room) it doesn’t move; you are therefore encouraged by the lotus to explore the space. The research and development phase culminated in a showing at BAC (Battersea Arts Centre) in June 2010.
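The distance-to-petal mapping described above can be sketched very simply. This is a hypothetical illustration of the behaviour, not the device's actual firmware; the function name, sensing range and linear mapping are all assumptions made for the example.

```python
def petal_opening(distance_m, max_range_m=5.0):
    """Map distance to the nearest key feature onto a petal opening fraction.

    0.0 = fully closed (far away or out of range),
    1.0 = fully open (standing right at the feature).
    Because the output depends only on distance, standing still leaves
    the petals exactly where they are - moving is the only way to make
    the lotus respond, which is what nudges the holder to explore.
    """
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - (distance_m / max_range_m)

print(petal_opening(5.0))   # 0.0 - closed beyond the sensing range
print(petal_opening(2.5))   # 0.5 - half open, halfway there
print(petal_opening(0.0))   # 1.0 - fully open at the feature
```

In a real device this fraction would drive a servo or motor that spreads the petals, but the interesting design decision is the one in the docstring: proximity, a thing we cannot normally feel, is turned into a shape you can hold.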
You can find out more about The Haptic Lotus, watch a video about the project, and find a full list of collaborators by visiting: http://www.thequestion.org.uk/
You can find out more about Ad’s work on his blog here: http://theadlab.blogspot.co.uk/