Jürgen Jost, Max Planck Institute for Mathematics in the Sciences, Leipzig, Germany
Information has been a fundamental concept since the pioneering works of Shannon and Kolmogorov, but its application is often unclear or controversial. In particular, in interactive situations, the question emerges of who knows, or needs to know, what. It is also an open question to what extent the complexity of a system can be evaluated in information-theoretic terms. In this talk, I shall present some of our contributions towards these questions, namely the theory of information decomposition and the information-theoretic approach to complexity, and sketch some applications in various domains, such as evolutionary and molecular biology or strategy science.
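To see why a decomposition of information is needed at all, consider a minimal example (my own illustration, not taken from the talk): for the XOR of two fair bits, each input alone carries zero information about the output, yet the two inputs jointly determine it completely. Classical mutual information cannot separate this synergistic contribution from unique or redundant ones, which is the gap information decomposition addresses.

```python
from collections import Counter
from itertools import product
from math import log2

def mutual_information(pairs):
    """I(A; B) in bits, from a list of equally likely (a, b) outcomes."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    return sum((c / n) * log2((c / n) / ((p_a[a] / n) * (p_b[b] / n)))
               for (a, b), c in p_ab.items())

# XOR: the target T is determined only by X1 and X2 jointly.
outcomes = [(x1, x2, x1 ^ x2) for x1, x2 in product((0, 1), repeat=2)]

i_x1 = mutual_information([(x1, t) for x1, _, t in outcomes])
i_x2 = mutual_information([(x2, t) for _, x2, t in outcomes])
i_joint = mutual_information([((x1, x2), t) for x1, x2, t in outcomes])

# Each input alone: 0 bits; both together: 1 bit (pure synergy).
print(i_x1, i_x2, i_joint)  # → 0.0 0.0 1.0
```

The decomposition theories discussed in the talk make precisely this kind of synergy, along with unique and redundant information, into well-defined quantities.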
Lars Hornuf, Betriebswirtschaftslehre, Finanzdienstleistungen und Finanztechnologie, Universität Bremen
When using digital devices and services, individuals provide their personal data to organizations in exchange for gains in various domains of life. Organizations use these data to run technologies such as smart assistants, augmented reality, and robotics. Most often, these organizations seek to make a profit. Individuals can, however, also provide personal data to public databases that enable nonprofit organizations to promote social welfare, provided sufficient data are contributed. Regulators have therefore called for efficient ways to help the public collectively benefit from its own data. In an online experiment among 1,696 US citizens, we find that individuals are willing to donate their data even when the data are at risk of being leaked. The willingness to provide personal data depends on the risk level of a data leak but not on a realistic impact of the data on social welfare. Individuals are less willing to donate their data to private industry than to academia or the government. Finally, individuals are not sensitive to whether the data are processed by a human-supervised or a self-learning smart assistant.
Jan Nagler, Centre for Human and Machine Intelligence (HMI), Frankfurt School of Finance & Management
We show when ergodicity breaking in coupled and uncoupled ecosystems dominates other effects of fluctuating environmental factors. This involves a field study in nematodes. Ergodicity breaking leads to a shift in the evolutionarily adapted mean temperature and thereby to a correction of tipping-point thresholds, for example those arising from climate change. We are able to analytically predict this temperature shift from first principles. More generally, possible implications regarding sudden breakdowns of ecosystems are discussed.
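The distinction at stake can be illustrated with a standard toy model of multiplicative growth (my own sketch, unrelated to the nematode study): under multiplicative dynamics the ensemble average can grow while almost every individual trajectory decays, which is precisely the sense in which ergodicity is broken and time averages must replace ensemble averages.

```python
import random
from math import exp, log
from statistics import median

random.seed(42)

UP, DOWN = 1.5, 0.6      # multiplicative growth factors, each with probability 1/2
STEPS, N = 1000, 5000    # trajectory length and ensemble size

# Ensemble perspective: the expected per-step factor exceeds 1, so the mean grows.
ensemble_factor = 0.5 * UP + 0.5 * DOWN            # 1.05

# Time perspective: a single long trajectory grows at the geometric-mean rate,
# which here is below 1, so the typical trajectory decays.
time_factor = exp(0.5 * (log(UP) + log(DOWN)))     # sqrt(0.9) ≈ 0.949

# Simulation: the median final value collapses even though the mean explodes.
finals = []
for _ in range(N):
    x = 1.0
    for _ in range(STEPS):
        x *= UP if random.random() < 0.5 else DOWN
    finals.append(x)

print(ensemble_factor, time_factor, median(finals))
```

In an ecological reading, a population whose "average" environment looks benign can nevertheless be driven toward collapse along almost every realized history, which is why the talk's corrections to adapted means and tipping thresholds matter.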
George FR Ellis, Mathematics Department, University of Cape Town, South Africa
Most physicists focus only on efficient causation, and even then they tend to regard it as happening only at the basic physics level. However, in the real world, efficient causation applies at each emergent level; for example, psychological causation is just as real as physical causation at the microlevel. The emergence of genuine causal powers at higher levels is enabled by a mixture of upwards and downwards causation, where the latter takes place via both material and formal causation. In the social context that is the framework of our lives, all of this is guided by final causation. Additionally, there is a further key kind of causation not characterised by Aristotle, namely abstract causation. Taken together with the stochasticity that occurs at molecular and cellular levels, this forms a solid foundation for claims that agency can emerge from physics.
Peter Wittenburg, Max-Planck Computing & Data Facility, Garching
Digitization, and thus the generation and processing of data, has a long tradition in research but is now becoming an overwhelming trend in society as a whole. We are beginning to understand that this trend will revolutionise societies, research and industry. Not only are data volumes increasing; so, especially, is the complexity arising from the interdependences between the various digital entities. Consequently, we have started talking about infrastructures that will be needed to efficiently manage and process this complex data space, but that must also implement mechanisms to ensure privacy and establish trust.
Yet companies offering platforms see the data they collect as their own goods, and most researchers still create data for their own purposes. Big companies and some governments invest large sums of funding (EOSC, NFDI, etc.) to gain a deeper understanding of the kind of infrastructures needed and to remain at the competitive edge. However, these completely different interests will hamper establishing the degree of harmonisation required to enable the easy, yet secure, exchange that is needed. There is growing interest in not leaving the choice of the basic pillars of the emerging data infrastructure to a few large profit-making companies.
Experience shows that defining new types of standards in such transformational times requires anticipating the trends to be expected in the coming decades. This is the reason that an international group of experts is working on the concept of FAIR Digital Objects (FDOs: Findable, Accessible, Interoperable, Reusable), which are self-standing and secure data entities in the Internet that include all relevant information. The Internet protocols, invented a few decades ago, created a unified international computer network into which everyone can easily plug their computer. FDOs need to enable people to just as easily plug in their data and trace their usage. It will be important to ensure that the FDO concept is not owned by a company, nor by any particular state.
In this talk, all of these issues will be addressed, ending with a few recommendations for discussion.
- FDO Forum
- FDO: De Smedt, K., Koureas, D., & Wittenburg, P. (2020). FAIR digital objects for science: from data pieces to actionable knowledge units. Publications, 8(2), 21.
- FDO Framework
- EOSC Evolution
1 There is much talk about the EOSC Portal, but it should be noted that this is only a development by some institutions and does not represent EOSC as a whole.
Dr. Tim Ruhe, Technische Universität Dortmund
The worldwide COVID-19 pandemic, as well as previous crises such as the global financial crisis and the European refugee crisis, has unearthed the fact that some people are able to cope with these kinds of situations more productively than others. We attribute this to the different coping strategies that people apply in situations of inherent uncertainty. As a wide range of possible coping strategies exists, ranging from gathering information to believing in conspiracy theories, the remaining question is which of them are actually expressed and applied. To this end, we investigated a total of 1.6 million public German tweets with respect to possibly expressed coping strategies. More than 3,000 tweets were annotated using predefined categories, which themselves correspond to certain coping strategies. These annotated tweets were then used to analyze possible correlations with direct mandates in the 299 German constituencies. This talk summarizes the overall analysis approach, as well as the results of an exploratory analysis of the entire 1.6 million tweets. Furthermore, preliminary results on possible correlations with the direct mandates are presented.
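The two analysis steps, tallying annotated category frequencies and correlating per-constituency figures, can be sketched as follows. The category labels, example tweets, and constituency numbers below are all invented for illustration; the study's actual annotation scheme and data are not reproduced here.

```python
from collections import Counter

# Hypothetical (tweet, coping-strategy label) annotations.
ANNOTATED = [
    ("Kurve flacht ab, Quellen checken", "information_seeking"),
    ("Das ist alles geplant von denen da oben", "conspiracy_belief"),
    ("Wir schaffen das zusammen", "social_support"),
    ("Quellen checken, bevor man teilt", "information_seeking"),
]

def category_shares(annotated):
    """Relative frequency of each coping-strategy label in the annotated sample."""
    counts = Counter(label for _, label in annotated)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

def pearson(xs, ys):
    """Plain Pearson correlation, e.g. strategy share vs. vote share per constituency."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

shares = category_shares(ANNOTATED)
print(shares["information_seeking"])  # → 0.5

# Invented per-constituency figures, purely to show the correlation step.
strategy_share = [0.10, 0.25, 0.40, 0.30]
vote_share = [0.20, 0.30, 0.45, 0.35]
print(pearson(strategy_share, vote_share))
```

In the actual study, the annotated sample would additionally train a classifier over the full 1.6 million tweets before aggregating to the 299 constituencies; that step is omitted here.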
John Bateman, Universität Bremen
In this talk I offer an overview of some current theoretical accounts of meaning, and of different kinds of meaning, from the perspective of multimodal semiotics. This draws on the philosophical foundation developed by Charles Sanders Peirce, further modified by incorporating some of the generalised linguistic mechanisms developed in recent multimodality studies. Peirce placed particular importance on one kind of 'iconicity': a structural correlation of descriptions across different domains. Current work from a very different tradition, blending theory, is now bringing this form of iconicity back into focus in several areas. In particular, the application of blending in the area of formal ontology suggests a further round of potential formalisations of structural iconicity as a mechanism for constructing interpretations, that is, for assigning 'meanings' to phenomena. Crucial for such endeavors is the notion of 'heterogeneous ontologies', which I also introduce and motivate on the basis of our previous and ongoing work on ontology engineering. These strands are then brought together as a set of open research directions by which computational approaches to capturing meaning may be developed further.