The working groups make up the largest part of the academy. Each participant chooses a working group in which they will spend 3-5 hours per day over the course of the week, together with up to 15 other people. Each working group is led by an expert from the respective field.
One of the most important tasks of the state is to ensure the security of its population. To fight crime, the state has increasingly sophisticated data-driven methods at its disposal. Intelligent video surveillance with facial recognition, the combing of social media, and the analysis and monitoring of cell phone data have made the fight against crime far more effective. The collection of personal data can also aid in containing pandemics. In response to the attacks of September 11, 2001, the United States enacted comprehensive telecommunications surveillance laws. The Snowden revelations, however, showed just how far-reaching these actually are: it is not only suspected criminals who are monitored, but everyone – even without any suspicion of criminal activity. This raises numerous questions. How much data may be collected, and under what circumstances? May the state store data on all citizens across the board, even without grounds for suspicion? What are the potential dangers?
Sebastian Krüsmann is a cyber and strategic risk manager at Deloitte and a lecturer at Mannheim University of Applied Sciences. As a BSI IT security consultant, he is one of about 100 experts certified by the German Federal Office for Information Security (BSI).
Data privacy is a profoundly social phenomenon. What people understand by it, and how desirable they find it, is determined by society. In this working group, we want to take a look behind the scenes of society and try to better understand the socio-cultural issue of data protection. What distinguishes privacy protection, data protection, and data sovereignty? How did the desire for data protection arise, and how has it changed? What predictions can be made in light of individualization, singularization, and the increasingly entrepreneurial self? To what extent do capitalist logics of exploitation and stricter protection of private data contradict each other? And what actually is that: data? These are the questions we will explore in the working group, taking a step back to identify both the structural causes and the effects of surveillance capitalism on our everyday lives and society. Applications from outside the social sciences are strongly encouraged. After all, a public sociology benefits from developing interdisciplinary knowledge with diverse perspectives. This is exactly what we want to do together, and we are very much looking forward to it!
PD Dr. Nils Zurawski is a social anthropologist and criminologist and heads the Research Unit for Strategic Police Research at the Hamburg Police Academy. He teaches at the University of Hamburg on criminology, conflict and mediation.
Katika Kühnreich is a political scientist and sinologist. She works on the social impact of digitalization.
In this working group, we address the collection of data for the automated assessment of human behavior and investigate the resulting potentials and risks. In all areas of life, we generate data that can potentially be used to describe our behavior in detail and – if desired – to evaluate it and automatically derive implications from these evaluations. Smartphones record our movements, from which conclusions about our health behavior can be drawn. On social media, we reveal our personality in speech and writing, from which employers can infer our conscientiousness. From email and video-conferencing interactions, and from the use of programs on our work laptops, our work performance can be inferred, as well as whether we are stressed or even on the verge of burnout. This working group will examine the everyday collection and algorithmic evaluation of our behavior in these areas in more detail, taking different stakeholders into account. Exemplary topics include privacy issues, gamification as a means of influencing human behavior, criteria for the quality of algorithmic evaluation, biases and discrimination in algorithmic evaluation, and the opacity of algorithmic evaluation and its implications.
Dr. Markus Langer conducts research at Saarland University in the field of industrial and organizational psychology, focusing, among other things, on the relationship between artificial intelligence, algorithms, and humans.
In this working group, we want to investigate the potentials and risks of data collection for science. The potentials for research and development are enormous and competitive progress in the field of medicine and in technology companies increasingly depends on the use of Big Data and AI. In turn, the development of these technologies typically requires large amounts of data, some of it highly sensitive. But what forms of misuse can there be and what are the implications? To what degree is the collection of data for research-related activities justifiable, even considering a global research network? What rights should consumers have, and what obligations should research institutions have?
[Lecturer still to be confirmed] As soon as staffing is finalized, we will update the working group description on our website and notify accepted participants via email.
The Cambridge Analytica scandal showed how easily entire populations can be influenced in democratic elections by targeted, personalized advertising. Authoritarian regimes use censorship and surveillance to systematically undermine potential opponents. Is targeted advertising a normal part of political campaigning or a threat to free elections? What tools can be used to promote democracy and protect against the consequences of the filter bubble, without necessarily having to abandon data collection entirely? What is the risk of state abuse of power with regard to freedom of the press? In addition to socio-historical, theoretical, and empirical program points, the working group will also focus on media-ethical and normative questions.
Dr. Carsten Ochs is a research associate in the BMBF project “Democracy Development, Artificial Intelligence and Privacy”.
Dr. Ingrid Stapf is a media ethicist at the International Center for Ethics in the Sciences and Humanities at the University of Tübingen and heads a BMBF project on the safety of children in online communication (SIKID). Previously, she conducted research on privacy issues at the Forum Privatheit and published a volume on children’s privacy in the context of digitalization.
More and more companies are outsourcing their IT processes and databases to the cloud. The largest providers in this area are the American corporations Amazon, Google, and Microsoft. But banks and the healthcare sector in particular work with highly confidential personal information that cannot simply be handed over to an external provider. For some years now, a specialist area of cryptography has dealt with precisely this problem: computing on data without revealing it. Using “Secure Multi-Party Computation” (MPC), it is possible for multiple parties, each holding a secret input, to jointly compute a public function without sharing their inputs with the others. “Yao’s Millionaire Problem,” for example, addresses the question of how two millionaires can jointly determine which of them has more money, without either revealing how much money they actually have.
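To give a flavor of the idea, here is a minimal toy sketch of one basic MPC building block, additive secret sharing, in Python. It illustrates a secure sum among three parties in the semi-honest model; it is an illustration of the principle only, not the protocols the working group will cover in depth (and the variable names and modulus are our own choices, not taken from any particular library):

```python
import secrets

P = 2**61 - 1  # public prime modulus; all arithmetic is done mod P

def share(x, n):
    """Split secret x into n additive shares that sum to x mod P.
    Any n-1 shares look uniformly random and reveal nothing about x."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recombine shares to recover the shared value."""
    return sum(shares) % P

# Three parties each hold a private input.
inputs = [12, 40, 7]
n = len(inputs)

# Each party splits its input into shares; party j receives all_shares[i][j].
all_shares = [share(x, n) for x in inputs]

# Each party locally adds the shares it received -- no secret is revealed.
local_sums = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]

# Only when the parties combine their local results does the sum appear.
print(reconstruct(local_sums))  # 59 = 12 + 40 + 7
```

Addition comes almost for free in this scheme because shares can be added locally; secure multiplication and comparisons (as in the Millionaire Problem) require additional interaction between the parties, which is where frameworks such as MP-SPDZ come in.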
Participants will first learn the theoretical basics of MPC during the first half of the academy and will then, in the second half, work in small groups on practical projects with support from the instructor. Due to the depth of the material, we expect participants to be studying mathematics or computer science. Prior knowledge of cryptography is helpful, but not a prerequisite.
Dr. Marcel Keller is a senior researcher at CSIRO’s Data61 and developer of the MP-SPDZ Library for Multi-Party Computation.
Daily expert presentations on different areas and perspectives will complement the working groups. The afternoon program will be rounded off by workshops run by the participants themselves. Whether theater, photography course, reading circle, running group, or film team – we look forward to your creativity here! We will plan the workshops together on site.