Preprint Open Access

Pervasive Data Profiling, Moral equality and Civic Responsibility

Delacroix, Sylvie

DCAT Export

<?xml version='1.0' encoding='utf-8'?>
<rdf:RDF xmlns:rdf="" xmlns:adms="" xmlns:cnt="" xmlns:dc="" xmlns:dct="" xmlns:dctype="" xmlns:dcat="" xmlns:duv="" xmlns:foaf="" xmlns:frapo="" xmlns:geo="" xmlns:gsp="" xmlns:locn="" xmlns:org="" xmlns:owl="" xmlns:prov="" xmlns:rdfs="" xmlns:schema="" xmlns:skos="" xmlns:vcard="" xmlns:wdrs="">
  <rdf:Description rdf:about="">
    <rdf:type rdf:resource=""/>
    <dct:type rdf:resource=""/>
    <dct:identifier rdf:datatype=""></dct:identifier>
    <foaf:page rdf:resource=""/>
    <dct:creator>
      <rdf:Description>
        <rdf:type rdf:resource=""/>
        <foaf:name>Delacroix, Sylvie</foaf:name>
        <org:memberOf>
          <foaf:Organization>
            <foaf:name>University College London</foaf:name>
          </foaf:Organization>
        </org:memberOf>
      </rdf:Description>
    </dct:creator>
    <dct:title>Pervasive Data Profiling, Moral equality and Civic Responsibility</dct:title>
    <dct:issued rdf:datatype="">2017</dct:issued>
    <dcat:keyword>pervasive data profiling</dcat:keyword>
    <dcat:keyword>ubiquitous computing</dcat:keyword>
    <dcat:keyword>knowledge asymmetry</dcat:keyword>
    <dcat:keyword>moral equality</dcat:keyword>
    <dcat:keyword>civic responsibility</dcat:keyword>
    <dct:issued rdf:datatype="">2017-09-21</dct:issued>
    <owl:sameAs rdf:resource=""/>
    <adms:identifier>
      <adms:Identifier>
        <skos:notation rdf:datatype=""></skos:notation>
      </adms:Identifier>
    </adms:identifier>
    <dct:isVersionOf rdf:resource=""/>
    <dct:isPartOf rdf:resource=""/>
    <dct:description>&lt;p&gt;So far, neither regulatory frameworks nor endeavours to develop bottom-up approaches to data protection have had much success in mitigating the moral hazard inherent in the pervasive use of seemingly innocuous, “leaked data”. This paper focuses on two aspects of those risks that are too rarely discussed. They relate to the possibility of civic responsibility (1) and moral equality (2). (1) Ethical agency (and the responsibility it entails) presupposes the possibility of change: when our practices are wanting, someone needs to be able to stand up and question them. Change, in turn, presupposes the ability to break from habitual frames of thought. To what extent is the profile-based tailoring of the environmental architecture that shapes our choices and attitudes hampering the possibility of civic responsibility? The question at stake here is different from the better-known “right to be forgotten” issue: the latter focuses on the impact of recorded data on others’ perception of oneself. This paper, by contrast, focuses on what may be called the “anchoring effect”: the impact of recorded data on our ability to see the world (and, most importantly, our role in it) differently. The ability to acknowledge a discrepancy between one’s present ethical stance and the person one seeks to be is essential to making sense of any notion of civic responsibility. The danger is that such a discrepancy will simply cease to arise in an environment that has been systematically “optimized” in accordance with one’s profile. (2) The second section of this paper explores the extent to which ubiquitous, proactive computing gives rise to a very particular type of vulnerability, one that potentially threatens the moral equality of data subjects. This vulnerability cannot be addressed by correcting epistemic imbalances.
The current regulatory focus on information disclosure has obscured what mere disclosure cannot achieve: empowering data subjects to maintain and develop their sense of self. This may sound overly ambitious. I argue that it is no more so than a commitment to retain a meaningful, practice-relevant concept of civic responsibility.&lt;/p&gt;</dct:description>
    <dct:accessRights>
      <dct:RightsStatement rdf:about="info:eu-repo/semantics/openAccess">
        <rdfs:label>Open Access</rdfs:label>
      </dct:RightsStatement>
    </dct:accessRights>
    <dct:license>
      <dct:RightsStatement rdf:about="">
        <rdfs:label>Creative Commons Attribution 4.0 International</rdfs:label>
      </dct:RightsStatement>
    </dct:license>
    <dcat:accessURL rdf:resource=""/>
  </rdf:Description>
</rdf:RDF>