  • Open Access
    Authors: 
    Jian Jia-Zheng; Tzong-Rong Ger; Han-Hua Lai; Chi-Ming Ku; Chiung-An Chen; Patricia Angela R. Abu; Shih-Lun Chen;
    Publisher: MDPI AG

    Diverse computer-aided diagnosis systems based on convolutional neural networks have been applied to automate the detection of myocardial infarction (MI) in electrocardiogram (ECG) recordings for early diagnosis and prevention. However, overfitting and underfitting are often not taken into account; in other words, it is unclear whether a given network structure is too simple or too complex. Toward this end, the proposed models were developed starting from the simplest structure: a multi-lead features-concatenate narrow network (N-Net), in which each lead branch contains only two convolutional layers. Additionally, multi-scale features-concatenate networks (MSN-Net) were implemented, in which larger-scale features are extracted by pooling the signals. The best structure was obtained by tuning both the number of filters in the convolutional layers and the number of input signal scales. As a result, the N-Net reached 95.76% accuracy on the MI detection task, whereas the MSN-Net reached 61.82% accuracy on the MI locating task. Both networks achieve higher average accuracy than the state of the art, with a significant difference (p < 0.001, U test). The models are also smaller, making them suitable for wearable devices performing offline monitoring. In conclusion, exploring both simple and complex network structures is indispensable. However, how to deal with the class imbalance problem and how to assess the quality of the extracted features remain open questions.
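
    A rough picture of the multi-lead features-concatenate idea (a minimal PyTorch sketch, not the authors' code; the number of leads, filter counts, and kernel sizes are assumptions): each lead gets its own two-layer convolutional branch, and the per-lead features are concatenated before classification.

```python
import torch
import torch.nn as nn

class LeadBranch(nn.Module):
    """Two convolutional layers applied to a single ECG lead (assumed sizes)."""
    def __init__(self, filters=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, filters, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(filters, filters, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # one feature vector per lead
        )

    def forward(self, x):              # x: (batch, 1, samples)
        return self.net(x).flatten(1)  # -> (batch, filters)

class NNet(nn.Module):
    """Concatenate per-lead features, then classify MI vs. normal."""
    def __init__(self, n_leads=12, filters=16):
        super().__init__()
        self.branches = nn.ModuleList(LeadBranch(filters) for _ in range(n_leads))
        self.head = nn.Linear(n_leads * filters, 2)

    def forward(self, x):              # x: (batch, n_leads, samples)
        feats = [b(x[:, i:i + 1]) for i, b in enumerate(self.branches)]
        return self.head(torch.cat(feats, dim=1))
```

    The MSN-Net variant would additionally feed pooled (downsampled) copies of each lead into parallel branches and concatenate across scales as well as across leads.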

  • Authors: 
    Luca Pilato; Gabriele Meoni; Luca Fanucci;
    Publisher: IEEE

    Recursive Systematic Convolutional (RSC) codes are building blocks of modern communication systems. In this paper we propose a new analytical model for manipulating the modulo-2 algebraic operations, together with a finite-state-machine model of the single-cycle RSC architecture, in order to design high-throughput RSC codes, with special emphasis on parallel implementation and on a puncturing scheme embedded in the design. The new design approach is suitable for any RSC code and for almost any degree of parallelism. We also present some case studies on the RSC code architecture and some Bit Error Rate simulation results, comparing commonly used RSC codes with different constraint lengths, redesigned with the proposed methodology.
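
    For reference, a minimal bit-serial sketch of an RSC encoder as a finite state machine (an illustration in Python, not the paper's parallel hardware design; the classic rate-1/2 generator pair (1, 5/7) in octal is an assumed example):

```python
def rsc_encode(bits, g_fb=(1, 1, 1), g_ff=(1, 0, 1)):
    """Rate-1/2 RSC encoder; g_fb/g_ff are the feedback and feedforward
    generator taps (MSB first), here the classic (1, 5/7) octal pair.
    All algebra is modulo 2, i.e. XOR on single bits."""
    mem = len(g_fb) - 1
    state = [0] * mem                   # shift-register contents, newest first
    sys_out, par_out = [], []
    for u in bits:
        # feedback bit: input XOR the state taps selected by the feedback poly
        fb = u
        for tap, s in zip(g_fb[1:], state):
            fb ^= tap & s
        # parity bit: feedforward taps applied to [fb] + state
        p = 0
        for tap, r in zip(g_ff, [fb] + state):
            p ^= tap & r
        sys_out.append(u)               # systematic stream = the input bits
        par_out.append(p)
        state = [fb] + state[:-1]       # single-cycle state update
    return sys_out, par_out
```

    A puncturing scheme of the kind the paper embeds would simply drop parity bits according to a fixed pattern, e.g. keeping par_out[::2] to raise the rate from 1/2 to 2/3.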

  • Closed Access
    Authors: 
    Craig E. Kuziemsky; Liam Peyton;
    Publisher: Elsevier BV

    Objective: While health information technology (HIT) offers great potential for supporting healthcare delivery, interoperability issues can be a barrier to its effective use. Technical and semantic interoperability have been well studied, but there is a shortage of research addressing process interoperability. Methods: This paper uses a two-year case study of a Palliative Care Information System (PAL-IS) to study process interoperability and HIT. We describe the design of PAL-IS and identify three types of process interoperability issues that arose from its implementation. Results: The implementation of PAL-IS caused process interoperability issues in care delivery, clinical practice, and administration. Many of these issues emerged over time, and a solution to one type of process interoperability issue often led to a different type of issue. We used our evaluation of PAL-IS to develop a general framework for understanding process interoperability and HIT. Conclusion: Designing HIT to support care delivery is a complex sociotechnical endeavor that can result in different types of process interoperability issues. Evaluating process interoperability takes time, and longitudinal studies are necessary to understand the overall ecosystem in which technology, processes, and people interact. The framework developed in this paper provides a starting point for the evaluation of process interoperability and HIT.

  • Open Access English
    Authors: 
    Frédéric Jurie;
    Publisher: HAL CCSD
    Country: France

    We propose an efficient method for tracking 3D modelled objects in cluttered scenes. Rather than tracking objects in the image, our approach relies on the object-recognition aspect of tracking. Candidate matches between image and model features define volumes in the space of transformations. The volumes of the pose space satisfying the maximum number of correspondences are those that best align the model with the image. Object motion defines a trajectory in the pose space. We give results showing that the presented method allows tracking of objects even when they are totally occluded for a short while, without assuming any motion model and with a low computational cost (below 200 ms per frame on a basic workstation). Furthermore, this algorithm can also be used to initialize the tracking.
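
    A crude way to picture the pose-space voting idea (our reading, not the paper's implementation; the paper handles full 3D pose, while this NumPy sketch restricts the transformation space to 2D translation for brevity): each candidate match votes for the whole region of pose space consistent with it, and the cell with the most votes best aligns model and image.

```python
import numpy as np

def vote_poses(matches, grid=(81, 81), span=100.0, tol=2.0):
    """matches: list of ((mx, my), (ix, iy)) model/image point pairs.
    Accumulate votes over a discretized translation space; each match
    votes for every cell consistent with it within tolerance `tol`."""
    tx = np.linspace(-span, span, grid[0])
    ty = np.linspace(-span, span, grid[1])
    acc = np.zeros(grid, dtype=int)
    for (mx, my), (ix, iy) in matches:
        dx, dy = ix - mx, iy - my             # translation implied by the match
        acc[np.ix_(np.abs(tx - dx) <= tol,    # vote over a volume, not a point
                   np.abs(ty - dy) <= tol)] += 1
    i, j = np.unravel_index(acc.argmax(), grid)
    return (tx[i], ty[j]), acc[i, j]          # best pose cell and its support
```

    Because the vote is recomputed independently every frame, the object can be re-acquired after a total occlusion, which matches the behavior described above.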

  • Publication . Contribution for newspaper or weekly magazine . Conference object . 2020
    Closed Access
    Authors: 
    Anooshmita Das; Emil Stubbe Kolvig-Raun; Mikkel Baun Kjærgaard;
    Publisher: ACM
    Country: Denmark

    Occupant behavioral patterns, once extracted, can reveal cues about activities and space usage that can effectively be used by building systems to achieve energy savings. The ability to accurately predict the trajectories of occupants inside a room divided into different zones has many notable and compelling applications, for example: efficient space utilization and floor plans, intelligent building operations, crowd management, comfortable indoor environments, security, and evacuation or personnel management. This paper proposes future occupant trajectory prediction using state-of-the-art time series prediction methods, namely Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) models. These models are implemented and compared to forecast occupant trajectories at a given time and location in a non-intrusive and reliable manner. The test space considered for data collection is a multi-utility area in an instrumented public building. The deployed 3D stereo-vision cameras capture spatial location coordinates (x and y) from a bird's-eye view without capturing any other information that could reveal confidential data or uniquely identify a person. Our results showed that the GRU forecasts were considerably more accurate than the LSTM forecasts for trajectory prediction. For multiple occupant trajectories within the monitored area, the GRU model achieved a Mean Squared Error (MSE) of 30.72 cm between actual and predicted location coordinates, while the LSTM model achieved an MSE of 47.13 cm. On a second evaluation metric, Mean Absolute Error (MAE), the GRU model achieved 3.14 cm and the LSTM model 4.07 cm. The GRU model thus provides higher-accuracy occupant trajectory prediction than the baseline LSTM model.
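
    A minimal sketch of the forecasting setup (an assumed PyTorch formulation, not the authors' code; window length and hidden size are illustrative): a GRU maps a window of past (x, y) positions to the next position, trained with the same MSE criterion the abstract reports.

```python
import torch
import torch.nn as nn

class TrajectoryGRU(nn.Module):
    """Map a window of past (x, y) coordinates to the next position."""
    def __init__(self, hidden=64):
        super().__init__()
        self.gru = nn.GRU(input_size=2, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, 2)   # next (x, y)

    def forward(self, past):              # past: (batch, steps, 2)
        _, h = self.gru(past)             # h: (num_layers, batch, hidden)
        return self.out(h[-1])            # -> (batch, 2)

# Training would minimize MSE between predicted and actual coordinates,
# the metric compared above (30.72 cm for GRU vs. 47.13 cm for LSTM).
model = TrajectoryGRU()
loss_fn = nn.MSELoss()
pred = model(torch.randn(8, 20, 2))       # 8 trajectories, 20 past positions
loss = loss_fn(pred, torch.randn(8, 2))   # dummy targets for illustration
```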

  • Publication . Part of book or chapter of book . Conference object . 2012
    English
    Authors: 
    Patrick Bosc; Olivier Pivert;
    Publisher: HAL CCSD
    Country: France

    This paper introduces a fuzzy inclusion indicator derived from a connective aimed at modulating one fuzzy criterion according to the satisfaction of another. The idea is to express that one is all the more demanding about the degree attached to an element x in a set B as this element has a high degree of membership in a set A. The use of this reinforced inclusion indicator is illustrated in the context of database querying.
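
    As a generic illustration of graded inclusion (a sketch of the standard construction, not the paper's reinforced indicator), the degree to which A is included in B can be computed as the minimum over x of an implication I(mu_A(x), mu_B(x)), so that elements with high membership in A impose stronger demands on their membership in B; Goguen's implication below is an assumed example.

```python
def goguen(a, b):
    """Goguen implication: fully satisfied when b >= a, else penalized by b/a."""
    return 1.0 if a <= b else b / a

def inclusion_degree(mu_a, mu_b, implication=goguen):
    """mu_a, mu_b: dicts mapping elements to membership degrees in [0, 1]."""
    return min(implication(mu_a.get(x, 0.0), mu_b.get(x, 0.0)) for x in mu_a)

# Example: the high-degree element of A drives the demand on B.
A = {"x1": 0.9, "x2": 0.4}
B = {"x1": 0.8, "x2": 0.7}
print(inclusion_degree(A, B))   # 0.8/0.9 ~= 0.89, dominated by x1
```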

  • Closed Access English
    Authors: 
    Nesrine Ben Yahmed; Hélène Carrère; M. Nejib Marzouki; Issam Smaali;
    Publisher: HAL CCSD
    Country: France

    Green macroalgal biomass is an emerging and promising biofuel feedstock. Its biological pretreatment and energetic conversion to biomethane were investigated, and the enhancement of biogas production using solid-state fermentation (SSF) as an eco-friendly, innovative pretreatment of Ulva sp. was precisely assessed. Compared to conventional acid and alkali pretreatments, the highest methane potential of 153 ± 3 mL CH₄ g⁻¹ VS, with an anaerobic biodegradability of 57%, was obtained using SSF pretreatment with a locally isolated Aspergillus fumigatus SL1 strain. It was 132 ± 2 mL CH₄ g⁻¹ VS for raw Ulva sp., with a biodegradability of 49%. Acid pretreatment with 4% HCl at 150 °C had a negative effect on the methane potential of Ulva sp., while alkali pretreatment with 4% NaOH at 20 °C showed a significant effect. The proposed SSF-based pretreatment therefore enhanced biogas production by 21% and permitted an eco-friendly valorization of large amounts of abundant macroalgae.

  • Open Access English
    Authors: 
    Vladik Kreinovich; Christelle Jacob; Didier Dubois; Janette Cardoso; Martine Ceberio; Ildar Z. Batyrshin;
    Publisher: HAL CCSD
    Country: France
    Project: NIH | Enhancement of Quantitati... (1T36GM078000-01)

    In many real-life applications (e.g., in aircraft maintenance), we need to estimate the probability of failure of a complex system (such as an aircraft as a whole or one of its subsystems). Complex systems are usually built with redundancy allowing them to withstand the failure of a small number of components. In this paper, we assume that we know the structure of the system, and, as a result, for each possible set of failed components, we can tell whether this set will lead to a system failure. For each component A, we know the probability P(A) of its failure with some uncertainty: e.g., we know lower and upper bounds P̲(A) and P̄(A) for this probability. Usually, it is assumed that failures of different components are independent events. Our objective is to use all this information to estimate the probability of failure of the entire complex system. In this paper, we describe several methods for solving this problem, including a new efficient method for such estimation based on Cauchy deviates.
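
    The gist of the Cauchy-deviates technique (a minimal sketch under a toy failure model of our own, not the paper's code): for an approximately linear f, perturbing each input by a Cauchy deviate scaled to its interval half-width makes the difference f(p + δ) − f(p) itself Cauchy-distributed, with scale equal to the half-width D of the range of f; D is then recovered by maximum likelihood.

```python
import numpy as np

def cauchy_halfwidth(f, mid, half, n_sim=200, seed=0):
    """Estimate the half-width D of the range of f over the box
    [mid - half, mid + half], assuming f is approximately linear there."""
    rng = np.random.default_rng(seed)
    f0 = f(mid)
    diffs = []
    for _ in range(n_sim):
        c = rng.standard_cauchy(mid.size)
        K = max(1.0, np.abs(c).max())   # rescale to stay inside the box;
        diffs.append(K * (f(mid + half * (c / K)) - f0))  # exact for linear f
    diffs = np.array(diffs)
    # maximum-likelihood Cauchy scale: solve sum 1/(1+(d/D)^2) = n/2 by bisection
    lo, hi = 1e-12, max(np.abs(diffs).max(), 1e-12)
    for _ in range(60):
        D = 0.5 * (lo + hi)
        if np.sum(1.0 / (1.0 + (diffs / D) ** 2)) < n_sim / 2:
            lo = D
        else:
            hi = D
    return 0.5 * (lo + hi)

# Toy system structure (hypothetical, for illustration only): the system
# fails when at least 2 of 3 independent components fail.
def p_fail(p):
    a, b, c = p
    return a * b + a * c + b * c - 2 * a * b * c

mid = np.array([0.10, 0.20, 0.15])      # midpoint failure probabilities
half = np.array([0.02, 0.03, 0.01])     # interval half-widths
D = cauchy_halfwidth(p_fail, mid, half)
print(f"P(failure) in approx. [{p_fail(mid) - D:.4f}, {p_fail(mid) + D:.4f}]")
```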

  • Open Access
    Authors: 
    Jocelyn Tillería González; Fernando Vela Cossío;
    Publisher: Universidad Politecnica de Madrid - University Library

    The effective occupation of Chile's southern territory took place in the second half of the nineteenth century. Key to its execution were the measures enacted by the State through the so-called "colonization project" (proyecto de colonización), which encouraged the arrival of immigrants from Europe during a period of large migratory flows toward the American continent. The city of Valdivia was the starting point of this project. In 1845, settlers from the territories of the German Confederation disembarked there. In this city the first cultural exchanges between the local and foreign populations took place, in a "constructive laboratory" that laid the foundations of the architecture of colonization and shaped the traditional housing of the south of the country. Through a historical and constructional study, we have identified the architectural invariants generated in this period of colonization. We have verified their persistence in various examples of traditional houses located between the cities of Valdivia and Puerto Montt, throughout the territory occupied by this German population between 1845 and 1875.

  • Closed Access
    Authors: 
    Ayman Younis; Tuyen X. Tran; Dario Pompili;
    Publisher: IEEE

    Task offloading with Mobile-Edge Computing (MEC) is envisioned as a promising technique for prolonging battery lifetime and enhancing the computation capacity of mobile devices. In this paper, we consider a multi-user MEC system in which a Base Station (BS) equipped with a computation server assists mobile users in executing computation-intensive real-time tasks via offloading. We formulate the Energy-Latency-aware Task Offloading and Approximate Computing (ETORS) problem, which aims at optimizing the trade-off between energy consumption and application completion time. Due to the centralized and mixed-integer nature of this problem, it is very challenging to derive the optimal solution in practical time. This motivates us to employ the Dual-Decomposition Method (DDM) to decompose the original problem into three subproblems, namely the Task-Offloading Decision (TOD), CPU Frequency Scaling (CFS), and Quality of Computation Control (QoCC). Our approach consists of two iterative layers: in the outer layer, we use duality techniques to find the optimal values of the Lagrange multipliers associated with the primal problem; in the inner layer, we formulate subproblems that can be solved efficiently using convex optimization techniques. We show that the offloading decision depends not only on the computing workload of a task, but also on the maximum completion time of its immediate predecessors, on the clock frequency, and on the transmission power of the mobile device. Simulation results, coupled with real-time experiments on a small-scale MEC testbed, show the effectiveness of the proposed resource allocation scheme and its advantages over existing approaches.
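
    The two-layer structure can be pictured with a generic dual-decomposition sketch (an illustration, not the ETORS formulation; the log utilities, budget, and step size below are all assumptions): the outer loop runs projected subgradient ascent on the Lagrange multiplier of a coupling constraint, while the inner layer solves each decoupled convex subproblem in closed form.

```python
import numpy as np

def dual_decomposition(utilities=(1.0, 2.0, 4.0), budget=5.0,
                       step=0.1, iters=500):
    """max sum_i u_i * log(1 + x_i)  s.t.  sum_i x_i <= budget, x_i >= 0."""
    u = np.array(utilities)
    lam = 1.0
    for _ in range(iters):
        # inner layer: each subproblem max u*log(1+x) - lam*x is convex,
        # with closed-form optimum x* = max(0, u/lam - 1)
        xs = np.maximum(0.0, u / lam - 1.0)
        # outer layer: subgradient step on the dual, projected to lam > 0
        lam = max(1e-9, lam + step * (xs.sum() - budget))
    return xs, lam

xs, lam = dual_decomposition()
print(xs, xs.sum(), lam)   # allocations approach the budget at optimality
```

    In the paper's setting, the three inner subproblems (TOD, CFS, QoCC) play the role of the closed-form allocation step here, and the coupling constraint reflects the shared energy/latency budget.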
