Teaching at the Professorship of Cyber Trust

Summer Term 2018

Proseminar: Privacy

The seminar explores key facets of the concept of privacy. Questions that will be considered include the following: What is the history and origin of the concept of privacy? What are approaches to define and conceptualize privacy? What is the value of privacy seen from different perspectives such as economics and human rights? How is privacy currently regulated in different geographical regions (U.S., Europe, Germany) and across different business sectors? How do consumers express their desire for privacy, and how do they act to protect or divulge personal information? How is privacy discussed in public and by various stakeholders (e.g., companies)? What is the relationship of privacy to other important topics including identity, anonymity, and security? What technologies exist to protect and manage privacy, how do they work, and what do we know about their effectiveness? To address these questions, a mix of theoretical, practice-oriented, and policy literature and case examples will be used and evaluated by seminar participants.

TUM Online: Link

Seminar: Data Analytics for Cybercrime and Undesirable Online Behaviors

Cybercriminal activities, as well as other undesirable or malicious online activities, have increased in prevalence over the last decade. At the same time, the ability of industrial and academic researchers to understand these phenomena has improved significantly. In this seminar, we will discuss a range of recent data-driven studies focusing, for example, on Spear-Phishing, Ransomware, Cybercriminal Marketplaces, and Online Fraud, but also on other challenges of societal interest such as Cyber-Bullying and Fake News. Each participant of the seminar will engage deeply with one key study to understand its focus, methodology, (data) limitations, and achievements. Participants are further expected to place each work in the context of related studies, e.g., from security industry research labs, and to build on the literature to develop research objectives for further study.

TUM Online: Link

Seminar: Trust in Automated Decision-Making

Why do humans have such trouble trusting algorithmic decision-making? The predictive strength of decision-making algorithms has led to their growing application in society, for example, in autonomous driving, online behavioral advertising, digital health, court decisions on recidivism, and credit scoring. There are even plans to deploy predictive algorithms as a replacement for human juries at the Olympics in 2022. The reason is simple: even rudimentary algorithmic models consistently outperform humans on various prediction tasks. However, research indicates that humans are reluctant to trust automated decision-making models. For example, almost 80% of Americans say they would not want to travel in an autonomous car because they don’t trust it. The same phenomenon holds for simpler applications of algorithmic models. The aim of this seminar is to explore the key factors that underlie human trust and distrust in algorithmic decision-making. Students will engage with a range of literature on human-machine interaction, deceptive and trust-enhancing interfaces, policy measures to create algorithmic trust, and human psychological dispositions of trust in automated decision-making. Each student will comprehensively review a paper to understand how it potentially informs academia, industry, or policy making. Overall, this seminar addresses an emergent scientific field, and students are encouraged to examine the implications of learning algorithms and novel data analytics methods for human trust from a variety of different perspectives. In order to complete the seminar successfully, students are required to prepare a presentation and hand in an 8-10-page report.

TUM Online: Link

Seminar: Transparency of Algorithmic Systems

When consequential decisions are made about individuals on the basis of the outputs of machine learning (ML) models, concerns about biases inevitably arise. ML also has significant privacy implications, which are now addressed by the European Union’s new General Data Protection Regulation (GDPR). The regulation, which comes into effect in May 2018, includes the so-called “right to explanation”. Since algorithmic accountability and transparency involve providing reasons, explanations, and justifications for decisions, one would expect these elements to form a substantial part of a decision-maker’s account. But how can decision-makers provide an explanation of their decision systems? What are the benefits and limits of methods such as white- and black-box analyses? More generally, what methodologies can help make AI-based decisions explainable as mandated by the GDPR?

Furthermore, many centralized parties such as certificate authorities, app markets, health care systems, and financial institutions are largely opaque in their core operations, such as issuing and revoking certificates, publishing apps, or processing private health data. As such, centralized parties act as trust anchors. Internal and external compromise allows exploitation of their trusted position, which, in the worst case, remains undetected. Recently published and applied transparency methods provide means for proving valid behaviour and detecting misuse. With these concepts, users no longer have to blindly trust central parties but can verify their behaviour. Examples are cryptographic data structures such as transparency logs and blockchains, which are tamper-evident and verifiable. Overall, the growing opaqueness and vulnerability of different algorithmic systems necessitates novel measures to ensure transparency. Consequently, in this seminar, students will explore and work towards novel methodologies that help make such systems more transparent. In order to complete the seminar successfully, students are required to prepare a presentation and hand in an 8-10-page report.
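As a rough illustration of why such logs are called tamper-evident, the following minimal Python sketch (an illustrative assumption, not part of the seminar readings; all names and conventions are hypothetical) hashes log entries into a Merkle tree and checks a short inclusion proof, so a client can verify that an entry is in the log without trusting the log operator.

import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def _leaf(entry: bytes) -> bytes:
    # Domain-separate leaf hashes from interior-node hashes.
    return _h(b"\x00" + entry)

def _node(left: bytes, right: bytes) -> bytes:
    return _h(b"\x01" + left + right)

def root_and_proof(entries, index):
    """Return (root, proof): proof is a list of (sibling_hash, sibling_is_right)
    pairs showing that entries[index] is included under root."""
    level = [_leaf(e) for e in entries]
    idx, proof = index, []
    while len(level) > 1:
        if idx ^ 1 < len(level):                  # record the sibling, if any
            proof.append((level[idx ^ 1], idx % 2 == 0))
        nxt = [_node(level[i], level[i + 1]) for i in range(0, len(level) - 1, 2)]
        if len(level) % 2 == 1:                   # an odd last node is promoted unchanged
            nxt.append(level[-1])
        level, idx = nxt, idx // 2
    return level[0], proof

def verify(entry, proof, root):
    """Recompute the root from a single entry and its inclusion proof."""
    acc = _leaf(entry)
    for sibling, sibling_is_right in proof:
        acc = _node(acc, sibling) if sibling_is_right else _node(sibling, acc)
    return acc == root

# Example: a hypothetical certificate log; a forged entry fails verification.
log = [b"cert:example.org", b"cert:tum.de", b"cert:mail.example.org"]
root, proof = root_and_proof(log, 1)
assert verify(b"cert:tum.de", proof, root)              # genuine entry verifies
assert not verify(b"cert:forged.example", proof, root)  # tampering is detected

Production systems (e.g., Certificate Transparency-style logs) add further machinery such as signed tree heads and consistency proofs between log versions; the sketch only conveys the core "verify, don't trust" idea.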

TUM Online: Link

Seminar: The Value of Privacy

What does privacy mean? What values do we address when we speak of privacy? How do these different values relate to each other? Is there a commercial value of privacy? Are privacy and security trade-offs? Overall, how can we protect the right to privacy in a digitalized society? Recently, in light of several global data breach scandals, such questions have become the subject of intense debate among the public, in academia, industry, and law. The aim of this seminar is to first explore the different conceptualizations of privacy from the literature in law, sociology, philosophy, policy, and privacy-enhancing technologies. Second, students will review how current digital technologies, in particular machine learning and big data methods in social media, online behavioural advertising, or intelligent personal assistants (and others), influence and shape our understanding of privacy. In order to complete the seminar successfully, students are required to prepare a presentation and, if desired, hand in an 8-10-page report.

TUM Online: Link

Seminar: Deep Learning and Security

Machine learning methods have proven their applicability to several computer security problems and have outperformed recent state-of-the-art solutions for spam, intrusion, vulnerability, and malware detection. Moreover, machine learning methods continue to improve dramatically. In particular, Deep Learning has shown major improvements in tasks such as image classification and natural language processing. The seminar will investigate recent applications of Deep Learning with an emphasis on security and program analysis. Students will summarize, present, and discuss recent scientific papers in order to identify new problems and interesting research questions. Participants with exceptional and promising ideas will be considered for theses or internships.

TUM Online: Link

Seminar: Usable Security and Privacy

In recent years, the need for secure and privacy-preserving computer systems has become an increasingly high priority. Fortunately, a variety of technologies specifically designed to meet these requirements already exist. However, most of these technologies were not designed with usability in mind. Consequently, important questions arise when integrating and applying them: What are the implications for the usability of computer systems? And vice versa, does usability have an impact on security and privacy? Do security and privacy requirements conflict with (often considered more important) functional requirements? Do these conflicts lead users to reject secure systems? Is security and privacy versus usability an unavoidable trade-off? Currently, this trade-off tends to be biased either towards functionality and usability or towards security and privacy. This seminar explores this problem and investigates state-of-the-art research on how to rebalance the trade-off. Moreover, based on related work, students will identify new problems, formulate research questions, and justify their relevance. Students with exceptional and interesting ideas will be considered for theses or internships.

TUM Online: Link

Research Seminar at the Chair of Cyber Trust

Weekly group meeting of the Chair of Cyber Trust for members and guests of the chair. The seminar includes research discussions and talks about topics related to the activities of the chair.