Mscapes overlay digital sights, sounds and interactions onto the physical world to create immersive and interactive experiences.
Users equipped with a mobile device running the mscape player can move through the physical world, triggering digital media (including images, text, audio and video) in response to physical events such as location, proximity, time and movement, making for experiences that are unpredictable, memorable and entertaining. The most commonly used trigger is GPS, but a range of sensors can be used and combined, such as Bluetooth, Wi-Fi, active RF beacons and 2D barcodes. Mscapes can be anchored to a specific location or designed to work in a generic space, such as a large playing field.
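To make the trigger model concrete, the sketch below shows one way a GPS "enter region" trigger could work: a circular zone anchored to a coordinate fires its media once when the user crosses into it, and re-arms only after they leave. This is an illustration only; the names, structure and `Region` class are hypothetical and do not reflect the actual mscape authoring model or player API.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

class Region:
    """A circular trigger zone anchored to a GPS coordinate (hypothetical)."""

    def __init__(self, name, lat, lon, radius_m, media):
        self.name, self.lat, self.lon = name, lat, lon
        self.radius_m, self.media = radius_m, media
        self.inside = False  # tracks enter/exit so media fires once per entry

    def update(self, lat, lon):
        """Feed in a new GPS fix; return media to play on an 'enter' event, else None."""
        now_inside = haversine_m(lat, lon, self.lat, self.lon) <= self.radius_m
        fired = now_inside and not self.inside
        self.inside = now_inside
        return self.media if fired else None

# Example: a 30 m zone around a (made-up) harbourside coordinate.
zone = Region("harbour", 51.4495, -2.5967, 30.0, "intro.mp3")
print(zone.update(51.4600, -2.6000))  # far away: no trigger
print(zone.update(51.4495, -2.5967))  # entering the zone: media fires
print(zone.update(51.4495, -2.5967))  # still inside: no re-trigger
```

A real player would also debounce noisy GPS fixes and handle exit events; this sketch only shows the core enter-transition logic behind location triggering.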
mscape was originally developed by HP Labs through the Mobile Bristol research programme (http://www.mobilebristol.com/). It received further updates under the Pervasive Media Project, part of the Pervasive Computing Lab at HP Labs in Bristol, and is now being carried forward within the Pervasive Media Studio by the start-up company Calvium.