Show simple item record

A Cross-Cultural Investigation of Privacy Perceptions Regarding Concerns, Responsibility, and Understandability

dc.contributor.advisor  Reinhardt, Delphine Prof. Dr.
dc.contributor.author  Kühtreiber, Patrick
dc.date.accessioned  2025-07-01T07:53:15Z
dc.date.available  2025-07-01T07:53:15Z
dc.date.issued  2025-07-01
dc.identifier.uri  http://resolver.sub.uni-goettingen.de/purl?ediss-11858/16090
dc.identifier.uri  http://dx.doi.org/10.53846/goediss-11363
dc.format.extent  175  de
dc.language.iso  eng  de
dc.rights.uri  http://creativecommons.org/licenses/by/4.0/
dc.subject.ddc  510  de
dc.title  A Cross-Cultural Investigation of Privacy Perceptions Regarding Concerns, Responsibility, and Understandability  de
dc.type  doctoralThesis  de
dc.contributor.referee  Fischer-Hübner, Simone Prof. Dr. Dr.
dc.date.examination  2025-06-19  de
dc.description.abstracteng  In today's data-driven world, individuals' data is collected almost constantly. Whether through apps on mobile phones or social media sites, companies collect, store, and process their users' data to enhance their services, to tailor marketing campaigns to each user, or to gather insights into users' behavior. Critically, individuals' Personally Identifiable Information (PII) is also collected, which makes it important to control these data flows, a difficult task given the ubiquity of data collection. It is nearly impossible for laypeople and experts alike to estimate how much data each individual company holds about them. Hence, technical tools and laws are in place that aim to protect individuals' privacy. A landmark regulation in this regard is the European General Data Protection Regulation (GDPR). It was created to prevent malicious data practices and to protect EU citizens' data, and has since its inception influenced many data protection laws around the globe. While most criticism of the GDPR revolves around the nuisances it introduced, such as cookie banners, it is also unclear whether the regulation protects against the threats individuals are most concerned about. This gap becomes especially apparent when considering the pervasive role of modern technology in everyday life, where data collection often happens unnoticed. Smart speakers are among the devices that collect more data than most users assume. These devices are widely used and, while convenient, collect large amounts of speech data from everyone who interacts with the smart speaker, i.e., all users in the household and visitors, who might not even be aware that a smart speaker is active. Hence, methods that protect people's privacy must be in place. To evaluate the efficacy of these protections, it is important to understand whom users and bystanders perceive as most responsible for protecting their privacy.
Laws and technical solutions should adequately address these different data protection needs. Moreover, since smart speakers are used worldwide, it is important to consider cultural differences in these perceptions and whether diverse privacy control mechanisms are needed depending on the user's culture. Alongside laws and regulations, technical solutions for protecting privacy also exist. However, they are often hard to locate, and it is unclear whom smart speaker users and bystanders perceive as responsible for conceptualizing, implementing, and activating them. One widely used approach to privacy-preserving data collection is Differential Privacy (DP). Data processed under DP are perturbed in a way that balances the privacy needs of the individuals in the data set against the need of data analysts to still extract useful information from the data. Thus, any individual whose data is in a DP data set gains an extra layer of privacy. One key challenge regarding DP is the difficulty of explaining it to laypeople, which is nevertheless necessary because consent to data processing under the GDPR must be given in an informed way. In this thesis, we first address the question of what smart speaker users and bystanders are most concerned about and who they believe bears the greatest responsibility for protecting their privacy. We do this via two large-scale quantitative vignette studies aimed at smart speaker users and bystanders from Germany and the UK, comparing two countries whose citizens are both protected by the GDPR but that still differ culturally in dimensions relevant to privacy perceptions. Next, we evaluate approaches to making DP more understandable for laypeople and measure which other factors are correlated with their data sharing attitudes.
Here, we first replicate a mixed-design study examining how different descriptions of DP are correlated with data sharing attitudes, before conducting another mixed-design study to investigate how visualizations compare to textual descriptions and which factors are ultimately correlated with an increase in data sharing attitudes when data is protected with DP. Both studies primarily involve German participants to ensure comparability. The results indicate that most smart speaker users and bystanders are concerned about the device manufacturer, third parties, and the state. We show that the different user groups have different perceptions in this regard and that cultural differences between Germany and the UK are indeed significant. Moreover, we discovered a trend towards rising privacy concern in younger generations, which points in the opposite direction of most previous research in this area. Lastly, we show that understanding DP is not correlated with data sharing attitudes. While visualizations of DP perform poorly in terms of participants' understanding, they are correlated with an increase in data sharing attitudes. Ultimately, though, the underlying privacy persona is the most significant factor with respect to a positive change in data sharing attitude when data is protected with DP. Software on smart speakers needs to provide Privacy-by-Default, i.e., standard settings configured for maximum data protection. However, it must also be flexible enough to allow users to change their privacy settings if they want to take control of data processing. Moreover, solutions for bystanders are needed, as they are a large group affected by IoT data collection with almost no agency regarding their privacy. One possible solution is indicators of data collection. Regarding DP, simple visualizations work well for increasing data sharing attitudes, but they do not increase understanding of the method.
This means that for most people who are not experts, showing the implications of DP is a simple and effective way to increase trust in the method.  de
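The perturbation that the abstract attributes to DP can be illustrated with the classic Laplace mechanism, the textbook instance of DP for counting queries. The sketch below is our own minimal illustration, not code from the thesis; the function names, the example query, and the epsilon value are illustrative choices.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw a sample from Laplace(0, scale) via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, predicate, epsilon: float) -> float:
    """Epsilon-DP count query.

    A counting query changes by at most 1 when one individual's record is
    added or removed (sensitivity 1), so adding Laplace noise with scale
    1/epsilon yields epsilon-differential privacy. Smaller epsilon means
    more noise and stronger privacy, at the cost of accuracy.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: how many of 100 hypothetical smart speaker users are under 30?
ages = [random.randint(18, 80) for _ in range(100)]
noisy = dp_count(ages, lambda age: age < 30, epsilon=1.0)
```

The analyst only ever sees `noisy`, never `true_count`, which is the "extra layer of privacy" the abstract refers to: no single individual's presence in the data set noticeably changes the published result.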
dc.contributor.coReferee  Meyer, Joachim Prof. Dr.
dc.subject.eng  Privacy Concerns  de
dc.subject.eng  Differential Privacy  de
dc.subject.eng  Privacy Responsibility  de
dc.subject.eng  Cross-Cultural  de
dc.identifier.urn  urn:nbn:de:gbv:7-ediss-16090-1
dc.affiliation.institute  Fakultät für Mathematik und Informatik  de
dc.subject.gokfull  Informatik (PPN619939052)  de
dc.identifier.ppn  1929651767
dc.identifier.orcid  0000-0002-0642-3907  de
dc.notes.confirmationsent  Confirmation sent 2025-07-01T08:15:01  de

