Published November 14, 2022 | Version v1
Thesis | Open Access

Privacy of Vehicular Time Series Data

Authors/Creators

  • 1. BME-HIT

Contributors

Supervisor:

  • 1. BME-HIT

Description

Data is the new oil of the car industry. Cars generate data about how and where they are used and who is behind the wheel, which gives rise to a novel way of profiling individuals in order to provide personalized, value-added services. Information about how people use their vehicles enables a wide range of applications, e.g., personalized insurance or advertising, traffic maps, or urban planning, where the benign side aims to improve quality of life. Alas, the availability of drivers' vehicle data can potentially reveal sensitive information about them, such as home and work locations, lifestyle, political or religious inclinations, or even sexual orientation.

In this dissertation we address both the anonymization and the de-anonymization of vehicular data. In the first scenario, we assume an attacker that is capable of acquiring in-vehicle network logs from the targeted driver and can also make test drives with an arbitrarily chosen vehicle. The goal of the attacker is to identify any individual among the people present in the dataset, i.e., singling out. Since the format of in-vehicle network logs is proprietary, the attacker must find a way to utilize the data even without knowledge of the logging protocol. In the second scenario, we demonstrate a mitigation against a weaker adversary that possesses only the location data of vehicles (e.g., GPS coordinates). We present a novel technique for privately releasing a composite generative model, and with it whole high-dimensional location datasets with detailed time information, under the guarantee of Differential Privacy.

My work raises a flag not only to car drivers but also to companies collecting vehicular logs: the fact that drivers can be re-identified (and/or profiled) so effortlessly means that vehicular logs indeed constitute personal data and, as such, are subject to the European General Data Protection Regulation (GDPR), while mitigation often falls short of utility expectations.
Overall, the techniques presented in this thesis can be used by data controllers and processors both to mitigate such attacks and to assess the level of privacy protection offered by their existing anonymization methods.
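As an illustrative aside (not taken from the thesis itself), the Differential Privacy guarantee mentioned above is most commonly achieved by adding calibrated noise to a query result, e.g., the Laplace mechanism for a counting query. The sketch below assumes a standard counting-query setting with sensitivity 1; the function names are hypothetical, not from the dissertation:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Sample Laplace(0, scale) via inverse-CDF transform of a uniform draw.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float) -> float:
    # A counting query changes by at most 1 when one individual's record is
    # added or removed (sensitivity 1), so adding Laplace(1/epsilon) noise
    # yields epsilon-differential privacy for the released count.
    return true_count + laplace_noise(1.0 / epsilon)

# Example: release the number of vehicles seen in some region with eps = 1.
noisy = dp_count(100, epsilon=1.0)
```

Smaller `epsilon` means more noise and stronger privacy; releasing an entire generative model, as the thesis does, requires splitting such a privacy budget across many noisy measurements.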

Files

ertekezes.pdf (3.0 MB)
md5:4d4ea1cde2ba4b5a3c3a344d2a70676c

Additional details

Related works

Is published in
10890/18974 (Handle)