Report (Open Access)

Trustworthy Assurance of Digital Mental Healthcare

Burr, Christopher; Powell, Rosamund

There is a culture of distrust surrounding the development and use of digital mental health technologies.

As many organisations continue to grapple with the long-term impacts of the COVID-19 pandemic on mental health and well-being, a growing number are turning to digital technologies to increase their capacity and meet the rising demand for mental health services.

In this report, we argue that clearer assurance of how ethical principles have been considered and implemented in the design, development, and deployment of digital mental health technologies is necessary to help build a more trustworthy and responsible ecosystem. To help address this need, we set out a positive proposal for a framework and methodology we call 'Trustworthy Assurance'.

To support the development and evaluation of Trustworthy Assurance, we conducted a series of participatory stakeholder engagement events with students, university administrators, regulators and policy-makers, developers, researchers, and users of digital mental health technologies. Our objectives were a) to identify and explore how stakeholders understood and interpreted relevant ethical objectives for digital mental health technologies, b) to evaluate and co-design the Trustworthy Assurance framework and methodology, and c) to solicit feedback on the possible reasons for distrust in digital mental health.

 

Research and production for this report were undertaken at the Alan Turing Institute and supported by funding from the UKRI's Trustworthy Autonomous Systems Hub, which was awarded to Dr Christopher Burr (Grant number: TAS_PP_00040).
