It’s easy to love or hate technology, to blame it for social ills or to imagine that it will fix what people cannot. But technology is made by people. In a society. And it has a tendency to mirror and magnify the issues that affect everyday life. The good, bad, and ugly.
Danah Boyd
Digital innovation is routinely heralded as the panacea for modern health and social care – promising more efficient and effective services, enabling patients to take more control of their health, and allowing citizens to manage their transactions with government online. Personalised Health and Care 2020 (November 2015) sets out a framework for digital technologies with a bold ambition:
One of the greatest opportunities of the 21st century is the potential to safely harness the power of the technology revolution, which has transformed our society, to meet the challenges of improving health and providing better, safer, sustainable care for all. To date the health and care system has only begun to exploit the potential of using data and technology at a national or local level. Our ambition is for a health and care system that enables people to make healthier choices, to be more resilient, to deal more effectively with illness and disability when it arises, and to have happier, longer lives in old age; a health and care system where technology can help tackle inequalities and improve access to services for the vulnerable.
But in our rush to embrace digital technologies, are we paying proper attention to the implications for all of us as patients and citizens? What does digitally transformed health and care mean for privacy and surveillance? Who benefits, and who might get left behind by the so-called ‘digital revolution’? How do we reconcile patient and citizen choice with professional expertise?
Last week saw a conversation on these themes, one that aimed to open a dialogue across disciplines and sectors. It was aimed at academics, people accessing health and care services, practitioners, digital innovators and anyone else with an interest in dignity and humanity in a digital world. The session was hosted by yours truly along with Dr Helen Thornham, Associate Professor of Digital Cultures at the School of Media and Communications; Dr Ian Kellar, Associate Professor and lead for behaviour change at the School of Psychology; and Imran Ali, Founder of Carbon Imagineering and Living Lab, with an interest in emerging technologies. In fact, the idea for the session arose from a conversation between Imran and me some months back.
On the morning of our session, I happened to clock a Telegraph headline screaming ‘Strike all you like, doctors – technology will soon take away your power’. Published the day after the junior doctors’ strike, it was a salutary reminder that technology is deeply rooted in the social, cultural and political context from which it emerges. Should we be pitting health practitioners and technology against each other? Should we frame digital innovation purely in terms of efficiency and reduced public resources? Or should we be considering the community-enhancing and social impacts that digital technologies afford?
These are all questions that we should be challenging ourselves to think about deeply and critically. But I wonder if we’re all in too much of a rush, juggling too much with too little, to spend the time. I might just be in the wrong places, but in my experience this conversation is nowhere near central enough in public sector debates about digital.
Back to the conversation… it comprised people from all sorts of backgrounds, and the range of themes was almost overwhelming. We covered everything from digital technologies being deployed as a means of state sanction against benefit claimants, through to the practical issue of data packages for people living in poverty. We ruminated on the plethora of code clubs and the absence of digital citizenship in educational contexts. We balked at the amount of data we routinely give away to commercial companies and shared examples of how much of the day-to-day detail of our lives we inadvertently share. We discussed the extent to which ethics are an individual or a collective endeavour, and we contemplated the value of big data without context. We pondered whether resistance to the surveillance capability built into the products we own will become a marketplace of its own.
We didn’t come up with any answers, but we did surface a lot of fascinating and rich themes that each lend themselves to further inquiry. I began this post with a quote from Danah Boyd, and I’ll leave you with a recent post of hers (thank you, Imran, for sharing) which resonates with the theme of dignity and humanity in a digital world.
NB: we are looking at funding to run a series of conversations on various themes of dignity and humanity in a digital world, so do get in touch if you’d like to be involved. If you’re interested in the themes in this post, you might also be interested in this upcoming conference – Digital Health / Digital Capitalism in Leeds on 4 July.