A few months ago, when Apple announced ARKit, a new augmented reality framework, it was difficult not to be impressed by the accuracy of the tracking and the cohesiveness of the whole experience. Since then, some very interesting demo projects have started cropping up.
In August, we had just finished a few months of back-to-back sprinting, so it was the perfect time to indulge in a creative break. We were really excited about the possibilities of AR and so spent a couple of weeks exploring potential uses for AR in Health-related apps.
As anyone familiar with Pokémon Go will understand, AR has the potential health benefit of encouraging people to increase their physical activity. We’d love to see this taken further in AR applications for physiotherapy and rehabilitation. AR also has the very interesting potential to measure and record data about a person’s interactions with the world. We developed two prototypes to explore how this potential could be useful in a health-related context.
After lots of sketching, we eventually decided to take two ideas forward into prototypes: LifeTagger and ShapeSolver (full disclosure: we also considered LifeTggr and ShapeSlvr as names).
LifeTagger arose from thinking about how AR could be used to augment memory and add stories to your personal possessions. A bit like a distributed life story. To use the prototype, you first find an object that is important to you. You then tag it and add some information to it. You can then return to that tag later and view its associated information.
This approach could be useful in a few different situations, such as reminiscence therapy, where family photos and memorabilia might be tagged with an associated story. These tags could also help people with memory impairments live more independently by reminding them how to perform household tasks. Lastly, we could imagine these tags being used in a more formal dementia care setting, supporting people with dementia in communicating important parts of their identities or preferences to carers.
ShapeSolver evolved from similar themes that we are currently exploring as part of CognitionKit. We wanted to develop some simple AR games that might help us understand how people solve problems. ShapeSolver asks you to find an open space and then gives you a puzzle. None of the puzzles can be solved by standing still so you have to move around!
We speculated that different people would have different strategies for solving each problem. Information about how people move around the objects could be used for understanding a few different cognitive processes such as spatial reasoning, memory and executive function. Or it could even be used for cognitive training or rehabilitation.
Whatever the use case, the most important thing is that it’s fun to play!
Toying with the existing AR demos created a fair amount of buzz around the office. But witnessing people’s reactions to our own explorations was definitely the most fun part of the process.
Two observations in particular stood out:
The level of immersion was higher than we would have expected from an AR game you experience through just your phone screen – no Oculus headset required. People were careful not to walk “through” AR objects, preferring to squeeze themselves into awkward spaces instead. Several “Oh, I was expecting it to still be there!” reactions were also prompted as people returned their gaze to the real world. While some (perhaps those among us with the more Orwellian inclinations?) described the immersion as creepy and disturbing, most took it as light-hearted entertainment.
People instinctively used their whole body rather than just interacting with the phone screen. As one of the premises of ShapeSolver was to combine a cognitive test with physical interactions, seeing people walk around, squat and tiptoe unprompted was encouraging. The designs of the 3D objects had some effect as well. When a pyramid was aligned directly towards the user, like a neat geometry-book illustration, people were less likely to doubt its symmetry, and so less likely to explore its opposite sides. Some visual hints, such as tilting the shapes or hiding them behind each other, were enough to prompt people to move around.
Having seen the variety of sophisticated ARKit demos out there, we initially expected to spend a lot of time getting to grips with the new APIs before we could try out even the most basic features. In reality, the bulk of our preparation was taken up by installing and setting up the beta versions of High Sierra, Xcode 9 and iOS 11. Some brushing up on matrices and 3D geometry is, of course, necessary. But starting with the Xcode AR boilerplate project makes placing custom objects on the screen surprisingly easy. You quickly get a feel for the basic possibilities of AR!
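Under the hood, placing an object in front of the user amounts to little more than matrix arithmetic: take the camera’s 4×4 transform and offset it along the camera’s own -z axis (the direction it is looking). The sketch below shows that arithmetic in plain Swift with a hand-rolled matrix type, so it runs without ARKit; in a real project you would use `simd_float4x4` from `frame.camera.transform` instead, and the names here are purely illustrative.

```swift
// Column-major 4x4 matrix, mirroring the layout of simd_float4x4.
struct Matrix4 {
    var columns: [[Float]]  // four columns of four floats each

    static func translation(x: Float, y: Float, z: Float) -> Matrix4 {
        Matrix4(columns: [
            [1, 0, 0, 0],
            [0, 1, 0, 0],
            [0, 0, 1, 0],
            [x, y, z, 1],
        ])
    }

    // Standard column-major matrix multiply: result = a * b.
    static func * (a: Matrix4, b: Matrix4) -> Matrix4 {
        var out = [[Float]](repeating: [Float](repeating: 0, count: 4), count: 4)
        for c in 0..<4 {
            for r in 0..<4 {
                var sum: Float = 0
                for k in 0..<4 { sum += a.columns[k][r] * b.columns[c][k] }
                out[c][r] = sum
            }
        }
        return Matrix4(columns: out)
    }

    // The fourth column holds the transform's world position.
    var position: (x: Float, y: Float, z: Float) {
        (columns[3][0], columns[3][1], columns[3][2])
    }
}

// The camera looks down its own -z axis, so a point "in front of" it
// is the camera transform multiplied by a negative-z translation.
func positionInFront(ofCamera camera: Matrix4, metres: Float) -> (x: Float, y: Float, z: Float) {
    (camera * Matrix4.translation(x: 0, y: 0, z: -metres)).position
}
```

With the camera at the origin looking along -z (an identity transform), a 0.3 m offset yields the point (0, 0, -0.3); in an ARKit scene you would assign that result to a node’s position before adding it to the scene.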
Creating 3D models to occupy our AR scenes had a similarly low barrier to entry. In this case, though, having somebody on the team already familiar with 3D modelling applications will likely save a lot of time during prototyping. We found it faster to create our ShapeSolver objects in Blender and import DAE models into Xcode rather than attempt a programmatic approach or use the (somewhat buggy) built-in SceneKit editor in Xcode.
Note: When importing 3D models it’s helpful to think about size. We spent about 30 minutes wondering why our object wouldn’t show up in the scene, only to realise we had been sitting inside it all along (it was huge).
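The root of the problem is that ARKit treats one SceneKit unit as one metre, while modelling tools often work at a very different scale. A quick sanity check is to derive a uniform scale factor from the model’s bounding box before adding it to the scene; the function name and the 0.2 m target below are illustrative, not part of any API.

```swift
// ARKit interprets 1 unit as 1 metre, so an object modelled at
// "size 100" in Blender imports as a 100 m monolith you may be
// sitting inside. Compute a uniform scale that shrinks the longest
// bounding-box side to a sensible real-world size.
func uniformScale(boundingBoxMin: (Float, Float, Float),
                  boundingBoxMax: (Float, Float, Float),
                  targetLongestSideMetres: Float) -> Float {
    let sides = [boundingBoxMax.0 - boundingBoxMin.0,
                 boundingBoxMax.1 - boundingBoxMin.1,
                 boundingBoxMax.2 - boundingBoxMin.2]
    guard let longest = sides.max(), longest > 0 else { return 1 }
    return targetLongestSideMetres / longest
}
```

For a model spanning 100 units on its longest side and a 0.2 m target, this gives a scale of 0.002, which you would apply uniformly to the imported node.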
To be sure, providing real value beyond these initial prototypes is more technically challenging. The concept behind LifeTagger, for example, relies heavily on the premise that the tags you create will still appear in the same physical space the next time you launch the app. Persisting an ARKit scene, however, is currently not trivial: it involves machine learning, image recognition, and storing GPS and device coordinates in order to backwards-engineer where earlier tags should appear relative to the device’s new origin point. At least until something like the AR Cloud exists.
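To give a flavour of the backwards-engineering involved: if you can estimate where the old session’s origin sits in the new session (an offset plus a heading difference, say from GPS and the compass), mapping a stored tag into the new coordinate space is just a rotation plus a translation. A minimal ground-plane sketch, with all names illustrative and none of the hard estimation work included:

```swift
import Foundation

// A tag position on the ground plane, in metres, relative to a session origin.
struct TagPosition {
    var x: Float
    var z: Float
}

// Map a tag saved in an old session into the new session's coordinates,
// given the old origin's offset and heading difference in the new session.
func reanchor(_ tag: TagPosition,
              oldOriginOffset: TagPosition,
              headingDeltaRadians: Float) -> TagPosition {
    let c = Float(cos(Double(headingDeltaRadians)))
    let s = Float(sin(Double(headingDeltaRadians)))
    // Rotate about the vertical axis, then translate by the old origin's offset.
    return TagPosition(x: c * tag.x + s * tag.z + oldOriginOffset.x,
                       z: -s * tag.x + c * tag.z + oldOriginOffset.z)
}
```

The real difficulty, of course, is estimating that offset and heading delta accurately enough, which is where the image recognition and machine learning come in.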
The release of ARKit can be likened to the early days of the App Store, where the waters were tested by building wacky and gimmicky apps whose value was more entertainment than utility. Our own explorations fell largely into the entertainment category, but we look forward to experimenting further with AR and feel confident that we will see some genuinely useful applications in healthcare very soon.
(Thank you to Danielle for letting us film her while she solved puzzles on ShapeSolver.)
For our Spring Mobile Health Meetup we decided to focus on the use of Virtual Reality (VR) technologies as therapies in healthcare. We were fortunate to be joined by Isabel Van De Keere, founder of Immersive Rehab, and VR pioneer Skip Rizzo, who – quite appropriately – was present through Google Hangouts.
Isabel kicked off the conversation with a personal account of her experiences going through physical rehabilitation following a serious accident 6 years ago. She spoke of her frustrations at the therapeutic tools available, and the difficulty in maintaining the motivation to follow repetitive exercise regimes.
She was inspired by this experience to improve the tools available to physiotherapists, and as a result founded Immersive Rehab, where she works to design engaging and motivating VR therapeutic programmes for physio rehabilitation. These take advantage of the brain’s neuroplasticity to overcome mobility difficulties, such as a patient’s perceived inability to perform certain movements. As Isabel describes it: “It’s like the wifi connection between the mind and muscle has been lost, and VR experiences are a tool to help reconnect.”
Skip Rizzo echoed Isabel’s findings in his own account of extensive work with VR in medical practice, from using VR to help soldiers with PTSD, to the development of AI agents within VR worlds. Much like in physical rehabilitation, Skip’s work with VR and mental health allows people to suspend their disbelief, allowing them to revisit traumatic experiences with the knowledge that they are in a safe and controllable environment. Crucially, the brain responds to stimuli in a similar way to how it would in real situations, despite the fact that the patient is aware that they are in a simulation.
As the conversation turned to the ethics of VR in health, the optimism of both speakers was infectious, along with their explanation of the issues still to be addressed. Contrary to criticisms of VR being an isolating activity, VR in therapy can be very social, even collaborative. Isabel spoke of her work with physiotherapists, in which the engagement between patient and therapist was aided by the motivating elements of the VR tool. Similarly, in treatment for PTSD, VR can allow a therapist to be present in the same environment as the patient, and in so doing gain a much clearer picture of the patient’s trauma.
Both Isabel and Skip feel that we are on the brink of widespread uptake and use of VR for a variety of medical applications. With the decline in equipment costs and mounting evidence of the impact of VR in projects such as Isabel’s and Skip’s, the gap between medical research and practice will start to close. Standalone headsets, haptic technology that brings a sense of touch to VR worlds, and responsive interactions through AI will become increasingly common, opening up the possibilities for even more immersive experiences, and thus therapies.
But as VR becomes normalised, there will be a host of ethical and regulatory questions to solve. How should we handle patient data from VR therapies? Would harassment in a VR environment be treated the same as harassment in the real world? Will actions in a VR world have legal consequences in the real world? As with any potentially transformative technology the possibilities for positive and negative impacts will make ethics a vital component in conversations about the future of VR.
If you’d like to watch the conversation from start to finish, make sure to check out the video of the event below.
Finally a big thank you to Isabel and Skip for sharing their experiences and insights on a fascinating topic, and to the audience for coming over to our studios and bringing such thoughtful questions to the discussion.
We look forward to seeing you at the next Mobile Health Meetup!
Back in August we launched two Dementia Citizens beta apps. Today we’re happy to announce that the research code we wrote for the Dementia Citizens apps is now publicly available on GitHub. This project taught us just how much digital tools can empower people living with dementia, so it is a real pleasure to share the work more widely. The code is shared under the MIT license, which means that anyone is free to use it.
The Dementia Citizens research code repository contains the code that implemented the research features of the two beta iOS apps: Playlist for Life and Book of You. It includes the elements common to both apps that focused on the user’s participation in a research study: onboarding, data collection, and the underlying dementia-friendly design guidelines.
It’s important to note that this code is not a finished framework or library. Instead we hope that other people working in dementia care, design and research can benefit from this work and expand on what we have done. We would love to hear from you if there are any questions about the code, the project, or if you plan to use it. This is an area that we’ll continue to be committed to, so we’d love to talk.
Dementia Citizens is a partnership with the Nesta Health Lab, Glasgow Caledonian University, Playlist for Life, Bangor University and Book of You, and with the dementia community, supported by Alzheimer’s Research UK, Alzheimer’s Society and the Department of Health.
Welcome to our first post of the year! To kick off our first Mobile Health Meetup of 2017 (and our 7th meetup so far) we invited Ivor Williams - Senior Design Associate at the Helix Centre and co-founder of Humane Engineering - and Cassie Robinson - director of strategy and research at Doteveryone, service designer and researcher at the Co-op, and co-founder of The Point People - to talk about digital and service design for end of life care.
Ivor introduced the conversation with an overview of how our traditional approaches to death - across both design and contemporary culture - aren’t appropriate for the reality of people’s experience. Designers, for instance, have a tendency to simplify complex topics. While this is often heralded as a virtue, in the context of end of life care it can miss important nuances and the deep sensitivity required at every stage.
Similarly, the potency of myths around heroic, peaceful or poetic deaths actually impedes our ability to plan for death in a realistic way. Just as patients, friends and family may find it difficult to make decisions around end of life care, the conversations can be equally difficult from the perspective of healthcare practitioners. Trained to cure disease and save lives, junior doctors often feel unsupported and lack confidence around the relevant vocabulary to engage in meaningful end of life care discussions.
Changing these behaviours is difficult. Both Cassie and Ivor described how the culture of healthcare is so embedded, and how challenging it is to replace or introduce new processes or artefacts. As an example, Cassie described how even just addressing end of life issues across a person’s lifetime could improve how we deal with death at both an individual and system level.
Compassionate design can also play a role. In a project to redesign the Do Not Attempt Resuscitation (DNAR) form, Ivor illustrated how a new type of tool - in both physical paper and digital formats - could help to prompt and guide humane conversations around end of life care. Cassie suggested that technology can be used to develop a better understanding of who is having end of life care conversations, how authorship of information is distributed across the healthcare system, and what kind of information is valuable, to whom, and when.
Through the conversation, we also explored the relevance of aesthetics in designing end of life care, and how digital tools can make advanced care planning simple and frictionless. We talked about how technology, now seamlessly embedded into our lives, is starting to shape how we understand and deal with death. To listen to the whole conversation, check out the audio/video recordings of the event below.
Thank you very much to Cassie and Ivor for sharing such valuable insights and thoughtful conversations.
And as always, a massive thank you to everyone who attended the meetup. It was lovely to see both new and familiar faces. We look forward to seeing you all at the next one!
For our 6th Mobile Health Meetup, we were joined in the studio by Holly Brenan, user experience designer at ustwo, and Alice Osborne, head of design at Active Minds, to talk about designing with the dementia community across digital and non-digital products and services. Our very own Emilie Glazer facilitated the conversation and shared some of our experiences of working on Dementia Citizens.
Holly kicked off the discussion with an overview of Keepsake, a pilot for a digital product that uses natural language processing to support care and management staff in dementia care homes by reducing the amount of time they spend documenting with pen and paper. She spoke about the added value of designing a product from the bottom-up with the direct input of carers and staff.
Alice introduced us to Active Minds and some of their activity products and games for people living with dementia. We had the pleasure of experiencing first-hand the thought, consideration and creativity that go into creating these products, as Alice brought some in for us to explore. She talked about the challenges of designing meaningful activities for people living with dementia, but also about the immense rewards.
Across the initial conversation and later Q&A we spoke about the importance of creating products and services that harness people’s skills, knowledge and stories, and about their potential to bring people closer together. We talked about how digital technology enables personalisation, while physical objects can engage senses in very visceral ways. We shared experiences on how the processes of design, prototyping and research have to adapt in order to be sensitive to the context of dementia, and how a small thing like a sticker can help overcome some of the challenges of introducing new products and technologies into care environments.
For all this and more, listen/watch the whole conversation below and have a look at our Storify roundup of the event.
A big thank you to Holly and Alice for their time and insights and to all those who attended the meetup and joined the conversation!
Also… that wraps up our Mobile Health Meetups for 2016! This year we exchanged ideas around rethinking research ethics for digital health and overcoming barriers in developing medical apps. Thank you to everyone who took part in our events, as a speaker or a curious attendee. We’re already preparing our first meetup of 2017, so make sure you join the community here and we’ll be in touch. See you in 2017!