Document View
  • Home
  • Naturwissenschaften, Mathematik und Informatik
  • Fakultät für Mathematik und Informatik (inkl. GAUSS)
A Cross-Cultural Investigation of Privacy Perceptions Regarding Concerns, Responsibility, and Understandability

by Patrick Kühtreiber
Dissertation
Date of oral examination: 2025-06-19
Published: 2025-07-01
Advisor: Prof. Dr. Delphine Reinhardt
Referee: Prof. Dr. Dr. Simone Fischer-Hübner
Referee: Prof. Dr. Joachim Meyer
To link/cite: http://dx.doi.org/10.53846/goediss-11363

Files

Name: Thesis__Revision_ (3).pdf
Size: 18.8 MB
Format: PDF
Abstract

In today's data-driven world, individuals' data is collected almost constantly. Whether through apps on mobile phones or social media sites, companies collect, store, and process their users' data to enhance their services, tailor marketing campaigns to each user, or gather insights into users' behavior. Critically, individuals' Personally Identifiable Information (PII) is also collected, which makes it important to control these data flows, a difficult task given the ubiquity of data collection. It is nearly impossible for laypeople and experts alike to guess how much data each individual company holds about them. Hence, technical tools and laws are in place that aim to protect individuals' privacy. A landmark regulation in this regard is the European General Data Protection Regulation (GDPR). It was created to prevent malicious data practices and to protect EU citizens' data, and has since its inception influenced many data protection laws around the globe. While most criticism of the GDPR revolves around the nuisances it introduced, such as cookie banners, it is also unclear whether the regulation protects against the threats that individuals are most concerned about. This gap becomes especially apparent when considering the pervasive role of modern technology in everyday life, where data collection often happens unnoticed. Smart speakers are among the devices that collect more data than most users assume. These devices are widely used and, while convenient, collect a lot of speech data. This affects everyone who interacts with the smart speaker, i.e., all users in the household as well as visitors, who might not even be aware that a smart speaker is active. Hence, methods that protect people's privacy must be in place. To evaluate the efficacy of these protections, it is important to understand whom users and bystanders perceive as most responsible for protecting their privacy.
Laws and technical solutions should adequately address these different data protection needs. Moreover, since smart speakers are used worldwide, it is important to consider cultural differences in these perceptions and whether diverse privacy control mechanisms are needed depending on the user's culture. Besides laws and regulations, technical solutions to protect privacy also exist. However, they are often hard to locate, and it is unclear whom smart speaker users and bystanders perceive as responsible for conceptualizing, implementing, and activating them. One widely used solution for privacy-preserving data collection is Differential Privacy (DP). Data processed under DP are perturbed in a way that balances the privacy needs of the individuals in the data set against data analysts' need to still extract useful information from the data. Thus, any individual whose data is in a DP data set gains an extra layer of privacy. One key challenge regarding DP is the difficulty of explaining it to laypeople, which is nevertheless necessary, as consent to data processing under the GDPR must be given in an informed way. In this thesis, we first address the question of what smart speaker users and bystanders are most concerned about and who they believe bears the greatest responsibility for protecting their privacy. We do this via two large-scale quantitative vignette studies aimed at smart speaker users and bystanders from Germany and the UK, two countries whose citizens are both protected by the GDPR but which still differ culturally in dimensions relevant to privacy perceptions. Next, we evaluate approaches to making DP more understandable for laypeople and measure which other factors are correlated with data sharing attitudes.
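The perturbation idea behind DP can be sketched with the classic Laplace mechanism for releasing a noisy mean of bounded values; the thesis studies perceptions of DP rather than a specific mechanism, so the bounds, data, and epsilon below are purely illustrative assumptions.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample zero-mean Laplace noise via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_mean(values, lower, upper, epsilon):
    """Epsilon-differentially-private mean of bounded values.

    Each value is clamped to [lower, upper], so the sensitivity of the
    mean over n values is (upper - lower) / n. Adding Laplace noise with
    scale sensitivity / epsilon then satisfies epsilon-DP.
    """
    n = len(values)
    clamped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clamped) / n
    sensitivity = (upper - lower) / n
    return true_mean + laplace_noise(sensitivity / epsilon)

# Hypothetical survey data: smaller epsilon means more noise,
# i.e., stronger privacy but less accuracy for the analyst.
ages = [34, 29, 41, 52, 38, 27, 45, 31]
print(dp_mean(ages, lower=18, upper=90, epsilon=1.0))
```

The tension the abstract describes is visible in the `epsilon` parameter: the data analyst wants it large (accurate statistics), while each individual in the data set benefits from it being small.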
Here, we first replicate a mixed-design study examining how different descriptions of DP are correlated with data sharing attitudes, before conducting another mixed-design study to investigate how visualizations compare to textual descriptions and which factors are ultimately correlated with an increase in data sharing attitudes when data is protected with DP. Both studies primarily involve German participants to ensure comparability. The results indicate that most smart speaker users and bystanders are concerned about the device manufacturer, third parties, and the state. We show that the various user groups have different perceptions in this regard and that cultural differences between Germany and the UK are indeed significant. Moreover, we discovered a trend towards rising privacy concern among younger generations, which points in the opposite direction of most previous research in this area. Lastly, we show that understanding DP is not correlated with data sharing attitudes. While visualizations of DP perform poorly in terms of participants' understanding, they are correlated with an increase in data sharing attitudes. Ultimately, though, the underlying privacy persona is the most significant factor with respect to a positive change in data sharing attitudes when data is protected with DP.
Software on smart speakers needs Privacy-by-Default, i.e., default settings that provide maximum data protection. At the same time, it must be flexible enough to let users change their privacy settings if they want to take control of data processing. Moreover, solutions for bystanders are needed, as they are a large group affected by IoT data collection with almost no agency regarding their privacy. One possible solution is indicators of data collection. Regarding DP, simple visualizations work well for increasing data sharing attitudes, but they do not increase understanding of the method. That means that for most non-experts, showing the implications of DP is a simple and effective way to increase trust in the method.
Keywords: Privacy Concerns; Differential Privacy; Privacy Responsibility; Cross-Cultural