Part 2: Infrastructure Protection. State estimation is vital to the stability of control systems, especially in power systems, which rely heavily on measurement devices installed throughout wide-area power networks. Several researchers have analyzed the problems arising from bad data injection and topology errors, and have proposed protection and mitigation schemes. This chapter employs hierarchical state estimation based on the common weighted-least-squares formulation to study how faults propagate into intermediate and top-level state estimates as a result of measurement reordering attacks on a single region at the bottom level. Although power grids are equipped with modern defense mechanisms such as those recommended by the IEC 62351 standard, reordering attacks remain possible. This chapter concentrates on how an inexpensive data-swapping attack in one lower-level region can degrade the accuracy of other regions at the same level and at upper levels, and force the system towards undesirable states. The results are validated using the IEEE 118-bus test case.
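As a rough illustration of the weighted-least-squares formulation, the sketch below is a minimal linear (DC) state estimator; the Jacobian, weights, and measurement values are invented for illustration and are not the chapter's IEEE 118-bus setup. It also shows how swapping two measurements, a reordering attack that leaves individual values plausible, shifts the estimate:

```python
import numpy as np

# Hypothetical measurement model: 4 measurements of 2 state variables.
H = np.array([[1.0, 0.0],
              [-1.0, 1.0],
              [0.0, 1.0],
              [1.0, 1.0]])             # measurement Jacobian (made up)
z = np.array([1.02, -0.05, 0.97, 2.0])   # measurement vector (made up)
W = np.diag([100.0, 100.0, 50.0, 50.0])  # inverse-variance weights (made up)

def wls(H, W, z):
    """Solve x_hat = argmin_x (z - Hx)^T W (z - Hx)."""
    G = H.T @ W @ H                    # gain matrix
    return np.linalg.solve(G, H.T @ W @ z)

x_clean = wls(H, W, z)

# A reordering attack swaps two measurements without altering their values,
# so simple range checks pass, yet the estimate shifts.
z_attacked = z.copy()
z_attacked[[0, 2]] = z_attacked[[2, 0]]
x_attacked = wls(H, W, z_attacked)
print(x_clean, x_attacked)
```

Because the swapped entries are routed through different rows of H and different weights, the resulting estimate differs from the clean one even though the set of measured values is unchanged.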
The use of land cover mappings built from remotely sensed imagery has become increasingly popular in recent years. However, these mappings are ultimately only models. Consequently, it is vital to be able to assess and verify the quality of a mapping and to quantify uncertainty for any estimates derived from it in a reliable manner. For this, the use of validation sets and error matrices is a long-standing practice in land cover mapping applications. In this paper, we review current state-of-the-art methods for quantifying uncertainty for estimates obtained from error matrices in a land cover mapping context. Specifically, we assess methods by their transparency, their generalisability, their suitability under stratified sampling, and their suitability in low-count situations. This is done using a third-party case study as a motivating and demonstrative example throughout the paper. The main finding is that there is a major issue of transparency for methods that quantify uncertainty in terms of confidence intervals (frequentist methods). This is primarily because of the difficulty of analysing nominal coverages in common situations. Effectively, this leaves one without the necessary tools to know when a frequentist method is reliable in all but a few niche situations. The paper then discusses how a Bayesian approach may be better suited as a default method for uncertainty quantification when judged by our criteria.
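As a toy illustration of deriving an estimate and a frequentist interval from an error matrix, the sketch below computes overall accuracy from an invented 3-class confusion matrix and attaches a 95% Wilson score interval; the counts are hypothetical, and the Wilson interval is one standard choice rather than necessarily one of the methods the paper reviews:

```python
import math

# Hypothetical 3-class error matrix (rows = map class, cols = reference class).
conf = [[48, 2, 0],
        [3, 40, 2],
        [1, 4, 50]]

n = sum(sum(row) for row in conf)                 # total validation samples
correct = sum(conf[i][i] for i in range(len(conf)))
p_hat = correct / n                               # overall accuracy estimate

def wilson_interval(p, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

lo, hi = wilson_interval(p_hat, n)
print(f"accuracy = {p_hat:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

Note that this simple interval assumes simple random sampling of the validation set; under stratified sampling (one of the paper's criteria) the estimator and its variance must be weighted by stratum, which is exactly where nominal coverage becomes hard to analyse.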
Genome-wide screens are a powerful technique to dissect the complex network of genes regulating diverse cellular phenotypes. The recent adaptation of the CRISPR-Cas9 system for genome engineering has revolutionized functional genomic screening. Here, we present protocols used to introduce Cas9 into human lymphoma cell lines, produce high-titer lentivirus of a genome-wide sgRNA library, transduce and culture cells during the screen, isolate genomic DNA, and prepare a custom library for next-generation sequencing. These protocols were tailored for loss-of-function CRISPR screens in human lymphoma cell lines but are highly amenable for other experimental purposes.
We discuss the use of parametric phase-diverse phase retrieval to characterize and optimize the transmitted wavefront of a high-contrast apodized pupil coronagraph with and without an apodizer. We apply our method to correct the transmitted wavefront of the HiCAT (High contrast imager for Complex Aperture Telescopes) coronagraphic testbed. This correction requires a series of calibration steps, which we describe. The correction improves the system wavefront from 16 nm RMS to 3.0 nm RMS for the case where a uniform circular aperture is in place. We further measure the wavefront with the apodizer in place to be 11.7 nm RMS. Improvement to the apodized pupil phase retrieval process is necessary before a correction based on this measurement can be applied.
In this paper, we study the effect of smartphone camera exposure on the performance of an optical camera communications (OCC) link. The exposure parameters considered are image sensor sensitivity (ISO), aperture, and shutter speed. A static OCC link with an 8 × 8 red, green and blue (RGB) LED array as the transmitter and a smartphone camera as the receiver is demonstrated to verify the study. A signal-to-noise ratio (SNR) analysis at different ISO values is performed, together with an assessment of the effect of aperture and shutter speed on link quality. While SNRs of 20.6 dB and 16.9 dB are measured at transmission distances of 1 m and 2 m, respectively, for an ISO value of 100, they decrease to 17.4 dB and 13.32 dB for an ISO of 800. The bit error rate (BER) of a 1 m OCC link at a shutter speed of 1/6000 s is 1.3 × 10⁻³ (i.e., below the forward error correction BER limit of 3.8 × 10⁻³) and rises to 0.0125 at a shutter speed of 1/20 s. This study provides insight into basic smartphone settings and exposure adjustment for more complex OCC links.
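The dB figures above follow directly from the definition of SNR; a small sketch of the arithmetic, with the FEC threshold taken from the abstract:

```python
import math

def snr_db(signal_power, noise_power):
    """SNR in decibels from linear powers: 10 * log10(Ps / Pn)."""
    return 10 * math.log10(signal_power / noise_power)

# An SNR of 20.6 dB corresponds to a signal/noise power ratio of
# 10**(20.6/10), roughly 115:1.
ratio = 10 ** (20.6 / 10)

# FEC comparison from the abstract: 1.3e-3 is below the hard-decision
# FEC BER limit of 3.8e-3, while 0.0125 (at 1/20 s shutter) is not.
FEC_LIMIT = 3.8e-3
print(snr_db(ratio, 1.0), 1.3e-3 < FEC_LIMIT, 0.0125 < FEC_LIMIT)
```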
Wildfire prediction from Earth Observation (EO) data has gained much attention in recent years, through the development of connected sensors and weather satellites. Nowadays, it is possible to extract knowledge from collected EO data and to learn from this knowledge without human intervention to trigger wildfire alerts. However, exploiting knowledge extracted from multiple EO data sources at run-time and predicting wildfires raise multiple challenges. One major challenge is to provide dynamic construction of service composition plans, according to the data obtained from sensors. In this paper, we present a knowledge-driven Machine Learning approach that relies on historical data related to wildfire observations to guide the collection of EO data and to automatically and dynamically compose services for triggering wildfire alerts.
We study the Horn theories of Kleene algebras and star continuous Kleene algebras, from the complexity point of view. While their equational theories coincide and are PSpace-complete, their Horn theories differ and are undecidable. We characterise the Horn theory of star continuous Kleene algebras in terms of downward closed languages and we show that when restricting the shape of allowed hypotheses, the problems lie in various levels of the arithmetical or analytical hierarchy. We also answer a question posed by Cohen about hypotheses of the form 1 = S where S is a sum of letters: we show that it is decidable.
Multiple theories of working memory are described in the chapters of this book, and often these theories are viewed as being mutually incompatible, yet each is associated with a supporting body of empirical evidence. This chapter argues that many of these differences reflect different research questions, different levels of explanation, and differences in how participants perform their assigned tasks in different laboratories, rather than fundamental theoretical incompatibility. It describes a version of a multiple-component working memory in which a range of specialized cognitive functions (or mental tools) act in concert, giving the impression, at a different level of explanation, of a unified cognitive system. The chapter argues that more rapid and more substantial scientific progress on the understanding of the concept of working memory would be achieved by identifying the levels of explanation explored within each theoretical framework, and by attempting to integrate theoretical frameworks rather than perpetuating debate with no clear resolution in sight.
The definition of antipower introduced by Fici et al. (ICALP 2016) captures the notion of being the opposite of a power: a sequence of k pairwise distinct blocks of the same length. Recently, Alamro et al. (CPM 2019) defined a string to have an antiperiod if it is a prefix of an antipower, and gave complexity bounds for the offline computation of the minimum antiperiod and all the antiperiods of a word. In this paper, we address the same problems in the online setting. Our solutions rely on new arrays that compactly and incrementally store antiperiods and antipowers as the word grows, obtaining this information for all the word's prefixes in the process. We show how to compute those arrays online in \(O(n\log n)\) space, \(O(n\log n)\) time, and \(o(n^\epsilon )\) delay per character, for any constant \(\epsilon >0\). Running times are worst-case and hold with high probability. We also discuss more space-efficient solutions returning the correct result with high probability, and small data structures to support random access to those arrays.
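A naive quadratic-time baseline helps clarify the definitions; the sketch below checks only that the complete length-k blocks of a word are pairwise distinct, a simplification that ignores the trailing partial block, which the formal prefix-of-antipower definition also constrains:

```python
def is_antiperiod(w, k):
    """Naive test that w looks like a prefix of a k-antipower: all
    complete length-k blocks of w are pairwise distinct.  (Simplified:
    the trailing partial block, if any, is ignored.)"""
    blocks = [w[i:i + k] for i in range(0, len(w) - k + 1, k)]
    return len(blocks) == len(set(blocks))

def min_antiperiod(w):
    """Smallest k passing the naive test.  This baseline is quadratic
    overall; the paper computes such information online in O(n log n)
    time with sublinear delay per character."""
    return next(k for k in range(1, len(w) + 1) if is_antiperiod(w, k))

print(min_antiperiod("aabb"))  # blocks "aa", "bb" are distinct at k = 2
```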
Automata networks are mappings of the form \(f: Q^Z \rightarrow Q^Z\), where Q is a finite alphabet and Z is a set of entities; they generalise Cellular Automata and Boolean networks. An update schedule dictates when each entity updates its state according to its local function \(f_i: Q^Z \rightarrow Q\). One major question is to study the behaviour of a given automata network under different update schedules. In this paper, we study automata networks that are invariant under many different update schedules. This gives rise to two definitions, locally commutative and globally commutative networks. We investigate the relation between commutativity and different forms of locality of update functions; one main conclusion is that globally commutative networks have strong dynamical properties, while locally commutative networks are much less constrained. We also give a complete classification of all globally commutative Boolean networks.
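A brute-force check on toy Boolean networks illustrates what invariance under update order means; the local functions below are invented examples, not taken from the paper. Here an update of entity i replaces only coordinate i with \(f_i(x)\), and we test whether every pair of single-entity updates commutes on every state:

```python
from itertools import product

def make_update(f):
    """Single-entity update: replace coordinate i with f[i](x)."""
    def update(x, i):
        y = list(x)
        y[i] = f[i](x)
        return tuple(y)
    return update

def globally_commutative(f, n):
    """Exhaustively check update(update(x,i),j) == update(update(x,j),i)
    for all Boolean states x and all pairs of entities i, j."""
    update = make_update(f)
    return all(update(update(x, i), j) == update(update(x, j), i)
               for x in product((False, True), repeat=n)
               for i in f for j in f)

# Each entity toggles its own state: updates touch disjoint coordinates,
# so every update order agrees.
toggles = {i: (lambda x, i=i: not x[i]) for i in range(3)}

# Entity 0 copies entity 1 while entity 1 toggles itself: order matters.
coupled = {0: lambda x: x[1], 1: lambda x: not x[1]}

print(globally_commutative(toggles, 3), globally_commutative(coupled, 2))
```

The exhaustive check is exponential in the number of entities, which is precisely why structural classifications like the paper's are valuable.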