Designing for Sharing and Trust: Opening the Access to Personal Data

Pervasive sensing technologies can be used for the assessment and monitoring of mental health issues, behaviours and affective states. Whilst continuous tracking has often been used for self-reflection, other beneficial uses include sharing data with circles of support, clinicians and researchers. However, opening access to personal data can put users in a vulnerable position, and it contrasts with current discourses about data privacy protection. In particular, data related to mental health can bring stigma and discrimination. There is therefore a need for an in-depth analysis of user requirements, concerns and expectations, in order to create technologies that benefit society and individuals. This paper presents the context and plans of a PhD project focused on developing a conceptual framework to inform designers of future behavioural data sharing platforms.


Context and motivation
The healthcare domain has benefited greatly from advances in pervasive sensing technologies, such as sensors embedded in wearables, smartphones and the ambient environment [2]. Data captured by these devices can be used in mental health care to help identify behaviours and affective states [12]. For instance, previous research has found that smartphone-collected data can serve as an indicator of depression symptoms [13].
Self-tracking practices often aim to support personal reflection and the management of chronic health conditions [1]. Beyond this private use, behavioural data can also be shared with circles of support, clinicians and researchers. For instance, doctors can use data collected from patients' daily activities to monitor their progress more closely [3]. Another possibility is the donation of personal data to open-access research platforms, so that researchers can use the data in their scientific investigations [7].
Despite the benefits of sharing data with others, there are important aspects to take into account regarding user privacy and the different needs of the user groups involved [16]. Firstly, when agreeing to have data collected, users must make a quick decision about giving their consent; a number of factors are involved, and it is often difficult for them to foresee all the risks [4]. The data collected may contain very intimate details of individuals' lives [11] or might uncover health disorders that can bring social stigma and discrimination [6]. Such privacy risks might prove costly to the user.
Secondly, there is a gap between the needs and preferences of those sharing and those accessing the data. Previous research has found that people are more reluctant to share certain data types, even when allowing clinicians to see them would benefit their own health [8].
Regarding the data reported, it is challenging to decide what level of detail is adequate for each user group, as some data visualisations can have a negative impact on individuals, whilst doctors might benefit from more detailed information [9]. In addition, when it comes to agreeing to share data, a relationship of trust with those who will access the data is a key requirement, but trust alone might not guarantee that both parties benefit from the process. Designing for trust calls for extra efforts in transparency regarding reliability and responsibility [14].

Research goals
The contribution of this PhD will be a conceptual framework with design guidelines to inform designers of future shared-access data platforms. The goal is to provide validated recommendations for creating systems that those contributing data feel they can trust, and from which those accessing the data can benefit. Open challenges regarding acceptability, privacy and the potential for negative consequences are among the aspects that will be considered throughout the PhD [10].
In summary, the research goals of this PhD project are:
1. Identifying the challenges and requirements of the design of shared-access data platforms.
2. Investigating how to address these needs and build systems that users can trust when they decide to share their data.
3. Organising the design guidelines derived from this knowledge into a conceptual framework for future researchers and designers.

Doctoral Consortium, DIS '19 Companion, June 23-28, 2019, San Diego, CA, USA

The empirical basis for this project is twofold. First, the application domain will be digital stress, which refers to the negative psychological effects of being always connected to digital technologies [15]. In particular, for young people, social media has been found to potentially trigger stress, anxiety, and addiction in some users [5]. As this project is part of the TEAM research network, which supports the assessment, prevention and treatment of mental health difficulties in young people, the focus of this PhD will be a shared-access platform with data about social media usage among young users. Second, this research will be part of the ongoing design of the CACHET Research Platform (CARP), an open-access data sharing platform. This research will take the requirements for CARP as input and, in return, inform the design of CARP in terms of support for increasing user trust.

Current status and next steps
The first stage of the project was completed in 12 weeks. A focus group was conducted to begin gathering input from designers and developers who work on real health technology projects involving patients and systems. The study showed that even though there are a number of reasons why people are willing to share their data, open challenges still need to be overcome. These findings are linked to the first research goal. The participants highlighted the lack of transparency about data ownership, the fact that people can be intrinsically reluctant to share, the lack of clarity on how the data will be used, the lack of mechanisms to protect data once it has been collected, and the lack of trust in companies and privacy protocols.
The insights from the focus group pinpointed which aspects designers and developers consider most challenging when it comes to sharing data. As next steps, a research protocol will be created to propose suitable empirical studies, and the literature review will continue in parallel. Further focus groups and interviews will be conducted to collect qualitative data on the perspectives of other healthcare experts, patients, developers and researchers, and surveys will collect quantitative data to complement these findings. The studies will later involve technology probes, to be selected after a technology review of existing applications for behavioural monitoring. The probes will make it possible to place users in a realistic scenario in which they are asked whether they want to share their data. Once the needs and requirements for data sharing are identified, the investigation of how to address them through design will begin. The ultimate goal is to inform and validate a conceptual framework for designing more responsible and trustworthy ways to share data, which can bring significant advances to the field of pervasive healthcare.