Investigating User Perspectives on Differentially Private Text Privatization
Recent literature has seen a considerable uptick in work on Differentially Private Natural Language Processing (DP NLP). This includes DP text privatization, where potentially sensitive input texts are transformed under DP to produce privatized output texts that ideally mask sensitive information while preserving the original semantics. Despite continued work on the open challenges of DP text privatization, there remains a scarcity of research on user perceptions of this technology, a crucial aspect that serves as the final barrier to practical adoption. In this work, we conduct a survey study with 721 laypersons around the globe, investigating how the factors of scenario, data sensitivity, mechanism type, and reason for data collection impact user preferences for text privatization. We find that while all of these factors influence privacy decisions, users are highly sensitive to the utility and coherence of the privatized output texts. Our findings highlight the socio-technical factors that must be considered in the study of DP NLP, opening the door to further user-based investigations.
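For readers unfamiliar with how DP text privatization mechanisms operate, the sketch below illustrates one common class of approaches: word-level metric DP, in which each token's embedding is perturbed with noise calibrated to a privacy parameter ε and then mapped back to the nearest vocabulary word. This is a minimal, self-contained illustration only; the toy vocabulary, the 3-dimensional embeddings, and the choice of ε are assumptions made for this page and are not taken from the paper.

```python
# Minimal sketch of word-level metric-DP text privatization.
# The vocabulary, embeddings, and epsilon below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-dimensional embeddings for a toy vocabulary.
VOCAB = {
    "doctor":  np.array([0.9, 0.1, 0.0]),
    "nurse":   np.array([0.8, 0.2, 0.1]),
    "teacher": np.array([0.1, 0.9, 0.0]),
    "lawyer":  np.array([0.2, 0.1, 0.9]),
}

def sample_metric_laplace(dim: int, epsilon: float) -> np.ndarray:
    """Sample noise with density proportional to exp(-epsilon * ||z||)."""
    direction = rng.normal(size=dim)
    direction /= np.linalg.norm(direction)
    magnitude = rng.gamma(shape=dim, scale=1.0 / epsilon)
    return magnitude * direction

def privatize_word(word: str, epsilon: float) -> str:
    """Perturb a word's embedding and return the nearest vocabulary word."""
    noisy = VOCAB[word] + sample_metric_laplace(len(VOCAB[word]), epsilon)
    return min(VOCAB, key=lambda w: np.linalg.norm(VOCAB[w] - noisy))

def privatize_text(text: str, epsilon: float) -> str:
    """Apply the mechanism independently to each known token (word-level DP)."""
    return " ".join(privatize_word(t, epsilon) if t in VOCAB else t
                    for t in text.split())

print(privatize_text("the doctor met the lawyer", epsilon=5.0))
```

Larger values of ε make the mechanism more likely to return the original word (higher utility), while smaller values yield more distant replacements: the utility and coherence trade-off that the surveyed users react to.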
| Attribute | Value |
|---|---|
| Address | Albuquerque, New Mexico, USA |
| Authors | Stephen Meisenbacher, Alexandra Klymenko |
| Citation | Meisenbacher, S.; Klymenko, A.; Karpp, A.; Matthes, F.: 2025. Investigating User Perspectives on Differentially Private Text Privatization. In Proceedings of the Sixth Workshop on Privacy in Natural Language Processing, pages 86–105, Albuquerque, New Mexico. Association for Computational Linguistics. |
| Key | Me25b |
| Research project | |
| Title | Investigating User Perspectives on Differentially Private Text Privatization |
| Type of publication | Workshop |
| Year | 2025 |
| Team members | Stephen Meisenbacher, Alexandra Klymenko |
| Publication URL | https://aclanthology.org/2025.privatenlp-main.8/ |
| Acronym | PrivateNLP |
| Project | |