Fundamental properties of 2D materials are dramatically modified when they are brought next to each other to form a vertical heterostructure. The combination of 2D layers, as well as the relative orientation between them, ultimately determines the performance of the new hybrid material. This presents tremendous new opportunities for manipulating the behaviour of 2D layered materials and ultimately achieving unprecedented control over their performance when integrated into highly specific functional devices. However, research in this field is in its nascent stage and many exciting phenomena remain to be discovered. The main objective of TWISTM is to unravel the most fundamental properties of unexplored graphene- and transition metal dichalcogenide-based bilayers arising from many-body interactions. Progress here requires a comprehensive microscopic picture of the fundamental properties of the heterostructures in clear connection to their macroscopic behaviour. TWISTM will tackle this challenge by using local probe techniques (STM and s-SNOM) with sub-nm resolution combined with mesoscopic electron transport measurements on in-house engineered twisted bilayers with accurate misalignment angles. TWISTM comprises three goals: i) to develop and optimize a fabrication method for high-quality twisted bilayers; ii) to carry out a multiscale search for collective electronic and optical phenomena arising from the coupling between the layers; and iii) to fully characterize the superconducting and magnetic properties as a function of the chosen material combination and twist angle. Overall, the project aims at acquiring fundamental scientific knowledge with potential for technological applications that will be useful in both academia and industry. Furthermore, TWISTM will offer high-quality interdisciplinary training to a young researcher, helping her to develop a promising independent scientific research career as well as her own scientific network.
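As background for why the twist angle matters so much here: two identical lattices rotated by a small angle produce a moiré superlattice whose period diverges as the angle shrinks. This relation is standard geometry, not a result of TWISTM; the sketch below uses graphene's lattice constant purely as an illustrative input.

```python
import math

# Moire superlattice period of a twisted bilayer (standard geometry):
# two identical lattices with constant a, rotated by a small angle theta,
# form a moire pattern with period L = a / (2 * sin(theta / 2)).

def moire_period(a_nm, theta_deg):
    """Moire period (nm) for lattice constant a_nm and twist angle theta_deg."""
    theta = math.radians(theta_deg)
    return a_nm / (2.0 * math.sin(theta / 2.0))

# Graphene (a = 0.246 nm) near the "magic" angle of about 1.1 degrees:
L_magic = moire_period(0.246, 1.1)  # roughly 13 nm
```

The rapid growth of the moiré period at small angles is what makes accurate misalignment control, as targeted in goal i), essential.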
Traditional complexity theory focuses on the dichotomy between P and NP-hard problems. Lately, it has become increasingly clear that this misses a major part of the picture. Results by the PI and others offer glimpses of a fascinating structure hiding inside NP: new computational problems that seem to lie between polynomial-time and NP-hard have been identified; new conditional lower bounds for problems with large polynomial running times have been found; long-held beliefs on the difficulty of problems in P have been overturned. Computational geometry plays a major role in these developments, providing some of the main questions and concepts. We propose to explore this fascinating landscape inside NP from the perspective of computational geometry, guided by three complementary questions: (A) What can we say about the complexity of search problems derived from existence theorems in discrete geometry? These problems offer a new perspective on complexity classes previously studied in algorithmic game theory (PPAD, PLS, CLS). Preliminary work indicates that they have the potential to answer long-standing open questions on these classes. (B) Can we provide meaningful conditional lower bounds for geometric problems for which we have only algorithms with large polynomial running time? Prompted by a question raised by the PI and collaborators, such lower bounds were developed for the Fréchet distance. Are similar results possible for problems not related to distance measures? If so, this could dramatically extend the traditional theory based on 3SUM-hardness to a much more diverse and nuanced picture. (C) Can we find subquadratic decision trees and faster algorithms for 3SUM-hard problems? After recent results by Pettie and Grønlund on 3SUM and by the PI and collaborators on the Fréchet distance, we have the potential to gain new insights into this large class of well-studied problems and to improve long-standing complexity bounds for them.
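For context on goal (C): the 3SUM problem asks whether a set of numbers contains three elements (at distinct indices) summing to zero, and the classic algorithm runs in quadratic time; it is this quadratic barrier that subquadratic decision trees chip away at. A minimal sketch of the standard sort-plus-two-pointers approach (purely illustrative, not one of the proposed new algorithms):

```python
def three_sum_exists(nums):
    """Return True if some three entries (distinct indices) sum to zero.

    Classic O(n^2) approach: sort once, then for each anchor element
    scan the remaining suffix with two converging pointers.
    """
    a = sorted(nums)
    n = len(a)
    for i in range(n - 2):
        lo, hi = i + 1, n - 1
        while lo < hi:
            s = a[i] + a[lo] + a[hi]
            if s == 0:
                return True
            if s < 0:
                lo += 1    # sum too small: move the lower pointer up
            else:
                hi -= 1    # sum too large: move the upper pointer down
    return False
```

Many geometric problems (e.g. detecting three collinear points) are "3SUM-hard" in the sense that a truly subquadratic algorithm for them would yield one for 3SUM as well.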
Precise synaptic connectivity is a prerequisite for the function of neural circuits, yet individual neurons, taken out of their developmental context, readily form unspecific synapses. The goal of this proposal is to understand the roles and requirements of such promiscuous synapse formation during brain development. The observation of promiscuous synapse formation is not at odds with precise outcomes: the developmental program can ensure correct partnerships between neurons that form synapses promiscuously, but to what degree it does so remains largely unresolved. My group has developed live imaging and optogenetic manipulations of dynamic synaptic choice processes in the intact developing fly brain. We found that time, location and the kinetics of filopodial interactions determine to a remarkable degree the specificity of synaptic contacts between neurons that can form synapses with many partners if not actively prevented from doing so. A second surprising finding that motivates this proposal is that flies that developed at 18°C and at 25°C have markedly different connectomes, including differences in synapse numbers and partnerships. These differences can be traced back to interaction kinetics during development, revealing a level of synaptic promiscuity that can be exposed through temperature alone. The time is ripe for a quantitative assessment of the extent to which synaptic connections are the result of developmentally regulated promiscuous synapse formation. The quest to understand the molecular mechanisms of brain wiring has largely focused on guidance cues and synaptic recognition, fields in which great progress continues to be made. This proposal approaches the question of synaptic specificity from the much rarer, but complementary, perspective of the opposite limiting case: the hypothesis of synaptic promiscuity. SynPromiscuity is devised to balance the risks of a contrarian approach with its relevance for neurodevelopmental precision and plasticity in health and disease.
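The idea that interaction kinetics alone can shift synapse numbers can be illustrated with a deliberately crude toy model (every number and mechanism below is an assumption for illustration, not a result or method of the proposal): filopodia make transient contacts with random partners, a contact stabilizes into a synapse only if it outlasts a threshold, and warmer development speeds up filopodial turnover, shortening contact lifetimes.

```python
import random

random.seed(1)

def simulated_synapse_count(mean_contact_lifetime, n_contacts=10000,
                            stabilization_threshold=2.0):
    """Toy model: each transient contact has an exponentially distributed
    lifetime; only contacts lasting beyond the threshold stabilize."""
    count = 0
    for _ in range(n_contacts):
        lifetime = random.expovariate(1.0 / mean_contact_lifetime)
        if lifetime > stabilization_threshold:
            count += 1
    return count

# Hypothetical temperature effect: faster turnover (shorter contacts) at 25C.
n_18C = simulated_synapse_count(mean_contact_lifetime=1.5)
n_25C = simulated_synapse_count(mean_contact_lifetime=1.0)
```

Even with identical molecular recognition, such a kinetic filter alone yields systematically fewer stabilized synapses at the higher turnover rate, which is the qualitative direction of the temperature finding described above.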
A major goal in ecology is to predict how environmental changes, including drivers of global change, affect communities and ecosystem functioning, with society demanding answers to these pressing questions. A key limitation of virtually all experimental approaches addressing such questions is that treatments are delivered abruptly, while many changes occurring in nature are gradual. Here I propose to comprehensively study the consequences of environmental change when delivered abruptly vs. gradually. In order to understand and model the effects of gradual vs. abrupt changes, we need to simultaneously consider physiological effects (e.g. acclimation), evolutionary changes (e.g. adaptation) and changes in community composition and functioning. Even though changes at these levels likely interact, no study has yet examined physiology, evolutionary change and community shifts together in response to a changing environmental factor. This research program thus enters uncharted territory of empirical environmental research in proposing work at this nexus of physiology, environmental change and community composition/function. I focus on soil fungi, key players in terrestrial ecosystems, testing a range of gradually vs. abruptly changing environmental factors, in a range of soils, in the field and in microcosms. We connect differential responses to species traits, apply modeling and employ data syntheses across all biomes and organisms to achieve high external validity. We carry out a set of core experiments that will afford unprecedented insight into the nature of change in a community context in response to warming, focusing on soil fungi. In these we follow evolutionary change (phenotype and genotype), test physiological shifts by re-isolating fungi and monitor community changes. This work will be transformative, providing not only new mechanistic insights into the effects of environmental change, but also a step change in fungal ecology.
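Why the abrupt-vs-gradual distinction interacts with acclimation can be made concrete with a minimal toy model (all parameters and functional forms are illustrative assumptions, not part of the proposed experiments): an organism's acclimated optimum relaxes toward the current temperature at a fixed rate, and performance drops with the mismatch between the two. A step change then produces a large transient mismatch that a slow ramp avoids.

```python
import numpy as np

# Toy acclimation model: environment T(t) shifts from 15 to 25 degrees C,
# either as a step or as a linear ramp; the acclimated optimum T_acc tracks
# the environment at rate alpha, and performance decays with the mismatch.
dt, t_end, alpha, width = 0.1, 100.0, 0.05, 4.0
t = np.arange(0.0, t_end, dt)

def performance_trace(T_env):
    T_acc = np.empty_like(T_env)
    T_acc[0] = T_env[0]
    for i in range(len(T_env) - 1):
        # relaxation of the acclimated optimum toward the current environment
        T_acc[i + 1] = T_acc[i] + dt * alpha * (T_env[i] - T_acc[i])
    return np.exp(-((T_env - T_acc) / width) ** 2)

T_abrupt = np.where(t < 10.0, 15.0, 25.0)                       # step at t = 10
T_gradual = 15.0 + 10.0 * np.clip((t - 10.0) / 60.0, 0.0, 1.0)  # 60-unit ramp

p_abrupt = performance_trace(T_abrupt)
p_gradual = performance_trace(T_gradual)
```

In this caricature both treatments end at the same temperature, yet the minimum performance along the way differs sharply, which is exactly the kind of pathway dependence the proposed experiments are designed to measure in real fungal communities.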
Time series characterize diverse systems; the examples in this proposal are: i) proton motion in an inhomogeneous aqueous environment, ii) folding and unfolding of a peptide described by a suitably chosen reaction coordinate, iii) migration of a living cell on a substrate, and iv) the US Dollar / Yen exchange rate. Examples i) and ii) are close to equilibrium, iii) is far from equilibrium since energy is constantly dissipated, while example iv) at first sight defies classification as equilibrium or non-equilibrium. For the understanding, comparison, classification and forecasting of time series data, stochastic differential equations, diverse random walk models and, more recently, machine-learning algorithms are commonly used. But fundamental questions remain unanswered: Is a unified description of such diverse systems possible? What is the relation between the different proposed models? Can the degree of non-equilibrium of a time series be estimated? NoMaMemo provides a unified description of generic time series data in terms of non-linear integro-differential stochastic equations based on memory functions that are extracted from the data. NoMaMemo accounts for non-linear and non-equilibrium effects as well as for non-Gaussian noise and connects with fundamental concepts such as equilibrium statistical mechanics, response theory and entropy production. The general formulation contains previously proposed models and thus allows their comparison; its forecasting quality will be benchmarked against modern machine-learning algorithms. NoMaMemo creates a generic platform to analyse, understand, compare, classify and predict time series data and to optimize stochastic systems with respect to search efficiency, barrier-crossing speed or other figures of merit. NoMaMemo will significantly advance the understanding of chemical reaction and protein folding kinetics, the interpretation of THz and IR spectroscopy of liquids, and the analysis of living matter and socio-economic data.
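The core idea of a memory function can be sketched with a deliberately simplified toy (linear, noise-free, arbitrary parameters; this is not the NoMaMemo method itself): a generalized Langevin equation where the friction at time t depends on the whole velocity history through a kernel K, and where that kernel can be recovered from the trajectory by inverting the discretized memory integral step by step.

```python
import numpy as np

# Toy generalized Langevin equation with an exponential memory kernel:
#   dv/dt = -int_0^t K(t - s) v(s) ds      (linear, noise-free sketch)
dt, n_steps = 0.01, 400
gamma, tau = 1.0, 0.5
k_true = (gamma / tau) * np.exp(-np.arange(n_steps) * dt / tau)

# forward integration, approximating the memory integral by a rectangle sum
v = np.empty(n_steps)
v[0] = 1.0
for n in range(n_steps - 1):
    memory_force = np.dot(k_true[: n + 1], v[n::-1])   # sum_j K[j] * v[n - j]
    v[n + 1] = v[n] - dt * dt * memory_force

# kernel extraction: invert the same discrete convolution, one step at a time
n_rec = 50
k_rec = np.empty(n_rec)
k_rec[0] = (v[0] - v[1]) / (dt * dt * v[0])
for n in range(1, n_rec):
    partial = np.dot(k_rec[:n], v[n:0:-1])   # contribution of known K values
    k_rec[n] = ((v[n] - v[n + 1]) / (dt * dt) - partial) / v[0]
```

Because the same discretization is used forwards and backwards, the extracted kernel matches the true one here; extracting memory functions from noisy, non-linear, non-equilibrium data is precisely where the real methodological work of the project lies.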