
# Pervasive Data Profiling, Moral equality and Civic Responsibility

Delacroix, Sylvie

### DataCite XML Export

<?xml version='1.0' encoding='utf-8'?>
<resource xmlns="http://datacite.org/schema/kernel-4">
<identifier identifierType="DOI">10.5281/zenodo.903488</identifier>
<creators>
<creator>
<creatorName>Delacroix, Sylvie</creatorName>
<givenName>Sylvie</givenName>
<familyName>Delacroix</familyName>
<affiliation>University College London</affiliation>
</creator>
</creators>
<titles>
<title>Pervasive Data Profiling, Moral equality and Civic Responsibility</title>
</titles>
<publisher>Zenodo</publisher>
<publicationYear>2017</publicationYear>
<subjects>
<subject>pervasive data profiling</subject>
<subject>ubiquitous computing</subject>
<subject>knowledge asymmetry</subject>
<subject>moral equality</subject>
<subject>civic responsibility</subject>
</subjects>
<dates>
<date dateType="Issued">2017-09-21</date>
</dates>
<resourceType resourceTypeGeneral="Text">Preprint</resourceType>
<alternateIdentifiers>
<alternateIdentifier alternateIdentifierType="url">https://zenodo.org/record/903488</alternateIdentifier>
</alternateIdentifiers>
<relatedIdentifiers>
<relatedIdentifier relatedIdentifierType="DOI" relationType="IsVersionOf">10.5281/zenodo.903487</relatedIdentifier>
<relatedIdentifier relatedIdentifierType="URL" relationType="IsPartOf">https://zenodo.org/communities/dfp17</relatedIdentifier>
</relatedIdentifiers>
<rightsList>
<rights rightsURI="info:eu-repo/semantics/openAccess">Open Access</rights>
</rightsList>
<descriptions>
<description descriptionType="Abstract">&lt;p&gt;So far, neither regulatory frameworks nor endeavours to develop bottom-up approaches to data protection have had much success in mitigating the moral hazard inherent in the pervasive use of seemingly innocuous, “leaked data”. This paper focuses on two aspects of those risks that are too rarely discussed. They relate to the possibility of civic responsibility (1) and moral equality (2). (1) Ethical agency (and the responsibility it entails) presupposes the possibility of change: when our practices are wanting, someone needs to be able to stand up and question them. Change, in turn, presupposes the ability to break from habitual frames of thought. To what extent is the profile-based tailoring of the environmental architecture that shapes our choices and attitudes hampering the possibility of civic responsibility? The question at stake here is different from the better-known “right to be forgotten” issue: the latter focuses on the impact of recorded data on others’ perception of oneself. This paper, by contrast, focuses on what may be called the “anchoring effect”: the impact of recorded data on our ability to see the world (and, most importantly, our role in it) differently. The ability to acknowledge a discrepancy between one’s present ethical stand and the person one seeks to be is essential to making sense of any notion of civic responsibility. The danger is that such discrepancy will simply cease to arise in an environment that has been systematically “optimized” in accordance with one’s profile. (2) The second section of this paper explores the extent to which ubiquitous, proactive computing gives rise to a very particular type of vulnerability, one that potentially threatens the moral equality of data subjects. This vulnerability cannot be addressed by correcting epistemic imbalances.
The current regulatory focus on information disclosure has obscured what mere disclosure cannot achieve: empowering data subjects to maintain and develop their sense of self. This may sound overly ambitious. I argue that it is no more so than a commitment to retaining a meaningful, practice-relevant concept of civic responsibility.&lt;/p&gt;</description>
</descriptions>
</resource>
