A prototype Internet of Things system aimed at unlocking urban wayfinding for the visually impaired, and more besides.
We (Future Cities Catapult) launched a major project today: Cities Unlocked (Ed. this article was posted on 7th November 2014, when the project launched.) Phase one of a collaboration with Microsoft UK and Guide Dogs, it’s a demonstrator of how wearables and 3D soundscapes could improve mobility and wayfinding in a beacon-enabled city. In other words, how might we get around cities in the near future?
The collaboration centres on augmented services for the visually impaired, who are a kind of ‘lead user’ in this case, as well as a group of people with a genuine need for better ways of getting around cities. The statistics are fairly sobering: 180,000 visually impaired people in the UK rarely leave their house alone, as the city is rendered largely inaccessible to them. The unemployment rate amongst those with sight loss is around 70%, compared to 7% for the general population. And by 2050, as the population ages, the number of people with visual impairment in the UK is expected to double, to around 4 million people.
So there is a genuine need underlying this project, and the technology has real promise in terms of beginning to address those issues. Equally, designing a better urban environment, from kerbs to beacons to maps to transport services, is likely to enable a better city for all. That’s the thinking behind the project.
The technology is designed to augment, rather than replace, other assistive elements like guide dogs themselves, and the white cane. It uses a variety of technologies, from beacons embedded in the urban environment (in this case, along a journey from London Paddington to a suburban street in Reading) to a modified Aftershokz headset, packed with an accelerometer, compass, gyroscope and so on, to sense which direction the head is pointing. The 3D soundscape keeps the user on track, as well as noting ‘points of interest’, including real-time elements such as “The number 9 bus is pulling in here in 30 seconds; reserved seats at the front …” etc.
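The core of that guidance loop — comparing where the head is pointing with the bearing to the next beacon or point of interest, then panning an audio cue accordingly — can be sketched roughly as follows. This is purely my own illustration of the idea, not the project’s actual code; all function names and the 15-degree “ahead” window are assumptions:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees (0 = north)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def cue_direction(head_heading_deg, poi_bearing_deg, ahead_window=15):
    """Relate a point of interest to where the headset says the head is pointing.

    Returns 'ahead', 'left' or 'right' -- a stand-in for panning a
    spatialised audio cue in the corresponding direction.
    """
    # Signed difference in [-180, 180): positive means the POI is to the right.
    diff = (poi_bearing_deg - head_heading_deg + 180) % 360 - 180
    if abs(diff) <= ahead_window:
        return "ahead"
    return "right" if diff > 0 else "left"
```

In practice the real system does far more (3D audio rendering via bone conduction, beacon proximity, live transport data), but the head-relative bearing calculation is the kernel that makes the soundscape feel anchored in the world rather than in the ears.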
Preparing for the launch, I was blindfolded (actually a ‘mindfold’, which prevents any light whatsoever entering the eyes) and went for a walk from Microsoft’s HQ in Paddington to Paddington Station, with both a sighted guide and a white cane.
It was thoroughly disorientating, even though it gave only the most basic, and passing, understanding of what it might be like to be blind. (And I was holding onto someone’s arm all the way.) But feeling the other senses begin to ‘kick in’ after a minute or so was fascinating. The warm sun on my face as I turned a corner, smelling a coffee shop to my left, hearing a taxi to my right alongside the trundle of suitcases on gravel, the bouncing of the cane in my hand over some cobblestones (I’ve gone off cobbles) …
To use the wearable, one’s head is scanned (via a Kinect) to ensure correct fit. The headset itself is bone-conducting — thus leaving the ear free — which is an odd sensation initially, but the brain soon figures out how to process it. It’s quite something. The headset also uses a smartphone app, as well as beacons embedded in the environment (via MiBeacons.)
It was so instructive to talk to Microsoft’s Amos Miller and Bill Buxton about these elements. While many architects and designers do have a good instinctive understanding of sound in the city, it’s fair to say that it is generally little understood in urban projects (as noted here many times, from ‘urban soundlabs’ to energy to the potential impact of electric cars on the street’s soundscape.)
At the Catapult, we particularly focused on the human-centred design research elements, augmenting our in-house team with a series of partners.
Superflux did a great workshop for us, exploring the wider urban issues and opportunities around the core questions.
Helen Hamlyn Centre at the Royal College of Art shadowed eight people with sight loss as they planned and undertook journeys, developing a deeper understanding of how they experienced the city. You can watch a lot of the videos they produced here.
Arup Foresight produced a wide-ranging database of potential innovations around accessible urban design for the project, which we’re posting on the site here. We hope those are of use to designers generally, from kerbs to braille interfaces for smartphones to installations.
We asked the Centre for Advanced Spatial Analysis (CASA) at UCL Bartlett to use their ‘appropriated’ electroencephalography (EEG) brain monitoring to measure the cognitive and emotional responses of people with sight loss as they moved around the city. These enabled new ‘stress maps’ of the city to be produced. There were some really interesting insights here, over and above the obvious ones (such as that Tottenham Court Road is not well liked by anyone): for instance, that green spaces are calming for both people with sight loss and those without.
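The ‘stress map’ idea is conceptually simple: geotagged stress scores from the EEG sessions, binned into a spatial grid and averaged, so that calm and stressful patches of the city become visible. A minimal sketch of that aggregation step, assuming readings arrive as (lat, lon, stress) tuples — the function name, cell size and 0–1 stress scale are my own illustration, not CASA’s method:

```python
from collections import defaultdict

def stress_map(readings, cell_size=0.001):
    """Average geotagged stress scores into a coarse lat/lon grid.

    readings: iterable of (lat, lon, stress) tuples, stress in [0, 1].
    Returns a dict mapping (lat_cell, lon_cell) grid indices to mean stress.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for lat, lon, stress in readings:
        # Floor-divide coordinates into grid cells (~100m at this cell size).
        cell = (int(lat // cell_size), int(lon // cell_size))
        sums[cell] += stress
        counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}
```

Rendered as a choropleth over the street network, a grid like this is what lets the calming effect of green spaces, or the stress of a junction like Tottenham Court Road, show up for sighted and non-sighted participants alike.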
Finally, Nottingham University carried out an initial verification of the results, finding significant improvements.
Guide Dogs’ research found that 10 out of 17 measures of wellbeing were significantly increased when using the technology, with 62% of participants showing an increase in safety, confidence and resilience, allowing them to relax into the journey.
That comes from a report we produced about the project, and the research, which you can download here.
This is just phase one; planning for phase two starts in a couple of weeks. The goal is in understanding and demonstrating the potential benefits of a series of interoperable, accessible, location-based services for complex urban environments.
In terms of our methods, it’s another example of urban prototypes, described here (see also Connected Displays with BERG, and a couple more to follow.) We bring people together to make projects happen, projects which try to flush out the true possibilities of Internet of Things and ‘smart cities’ by taking a human-centred design research approach. We’re also interested in bringing these techniques from user experience design into architecture, urban design and urban planning — which are often conducted without such insights and approaches.
To this end, our Cities Unlocked site unpacks all the research we did with our partners, partly to share our findings and methods, but partly so that others can build on it. #legiblepractice
Thanks to Amos Miller, Angus Foreman, Jarnail Chudge and Bill Buxton at Microsoft, and to Jenny Cook and John Shelton at Guide Dogs. Thanks also to Lord Holmes of Richmond. (Was great to meet Bill Buxton, actually; something of an HCI legend. I think I studied his research on my Comp Sci degree a long time ago!)
Particular thanks to Claire Mookerjee, Project Lead: Urbanism in my team at the Catapult, who led the project for us. Her knowledge of the urban context was an invaluable addition to the various specialisms of Microsoft and Guide Dogs. She also orchestrated that stellar array of UK design research talent for the project, in the form of Superflux, Royal College of Art Helen Hamlyn Centre, Arup Foresight and the Bartlett’s Centre for Advanced Spatial Analysis. Thanks to those collaborators too. Thanks also to the rest of the Catapult team for their support too — especially Lucy Warin and Scott Cain. Find out more at Cities Unlocked.
(We’ve had some good press coverage already, including BBC News, Radio 4 Today, Daily Telegraph, Dezeen, The Verge, CityMetric etc. More to follow.)
Ed. A version of this article was originally published at cityofsound.com on 7th November 2014. I was Executive Director at the Future Cities Catapult at the time.