
This report describes the work carried out within Tasks 5.3 and 5.4 of the TRUSTY project on the development of Human-AI Teaming (HAIT), multimodal Human-Machine Interfaces (HMI), and advanced interactive data visualization techniques. The main points investigated include AI acceptability, perceived barriers to the use of AI-based systems, and the mechanisms for establishing trust between humans and AI-driven machines. The work covers the design of a user-centric interface with adaptive visualization tools, ergonomic enhancements, and AI-driven anomaly detection. The HMI integrates real-time environmental data, video-stream monitoring, and AI-assisted voice transcription to provide operators with an intuitive and responsive decision-support system, enhancing the trustworthiness and usability of AI-powered decision support in Remote Digital Towers (RDTs).
