
The nuclear industry has some of the most extreme environments in the world, with radiation levels and other hazards frequently restricting human access to facilities. Even when human entry is possible, the risks can be significant and productivity very low. To date, robotic systems have had limited impact on the nuclear industry, but it is clear that they offer considerable opportunities for improved productivity and significantly reduced human risk. The nuclear industry has a vast array of highly complex and diverse challenges that span the entire industry: decommissioning and waste management, Plant Life Extension (PLEX), Nuclear New Build (NNB), small modular reactors (SMRs) and fusion. Whilst the challenges across the nuclear industry are varied, they share many similarities arising from the extreme conditions that are present. Vitally, these similarities also translate into other environments, such as space, oil and gas, and mining, all of which, for example, have challenges associated with radiation (high-energy cosmic rays in space and the presence of naturally occurring radioactive materials (NORM) in mining and oil and gas). Major hazards associated with the nuclear industry include radiation; storage media (for example water, air or vacuum); lack of utilities (such as lighting, power or communications); restricted access; and unstructured environments. These hazards mean that some challenges are currently intractable without solutions that rely on future capabilities in Robotics and Artificial Intelligence (RAI).

Reliable robotic systems are not just essential for future operations in the nuclear industry; they also offer the potential to transform the industry globally. In decommissioning, robots will be required to characterise facilities (e.g. map dose rates, generate topographical maps and identify materials), inspect vessels and infrastructure, move, manipulate, cut, sort and segregate waste, and assist operations staff. To support the life extension of existing nuclear power plants, robotic systems will be required to inspect and assess the integrity and condition of equipment and facilities, and might even be used to implement urgent repairs in hard-to-reach areas of the plant. Similar systems will be required in NNB, fusion reactors and SMRs. Furthermore, it is essential that past mistakes in the design of nuclear facilities, which make the deployment of robotic systems highly challenging, are not perpetuated in future builds. Even newly constructed facilities such as CERN, which now has many areas that are inaccessible to humans because of high radiation dose rates, have been designed for human rather than robotic intervention.

Another major challenge that RAIN will grapple with is the use of digital technologies within the nuclear sector. Virtual and Augmented Reality, AI and machine learning have arrived, but the nuclear sector is poorly positioned to understand and use these rapidly emerging technologies. RAIN will deliver the necessary step changes in fundamental robotics science and establish the pathways to impact that will enable the creation of a research and innovation ecosystem with the capability to lead the world in nuclear robotics. While our centre of gravity is nuclear, we maintain a keen focus on applications and exploitation in a much wider range of challenging environments.
The focus of the project will be on how to use an arbitrary number of commercial off-the-shelf sensors to stream a real-time 3D view of a scene into VR for remote operations, for example the control of robots in hazardous places such as those found in nuclear decommissioning. The innovation will focus on novel techniques for efficiently processing the large amounts of data produced by multiple 3D sensors and for delivering that data to multiple users in VR, giving them a reliably high-quality, low-latency virtual presence in a remote scene. The project will contribute to producing an out-of-the-box module that allows users to easily combine any number of 3D sensors for sensed-VR applications.
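As a hedged illustration only (the project's actual pipeline is not described here), the Python sketch below uses the open-source Open3D library to show the basic step of combining per-frame point clouds from several 3D sensors into one downsampled cloud that could then be streamed to VR clients; the sensor names, extrinsic transforms and voxel size are assumed placeholders.

```python
# Minimal sketch: fuse one frame of point clouds from several 3D sensors
# into a single downsampled cloud, bounding the data rate sent to VR clients.
# Sensor names, extrinsics and parameters are illustrative only.
import numpy as np
import open3d as o3d

# Extrinsic calibration: 4x4 transform from each sensor's frame into a
# common world frame (identity and a simple translation as placeholders).
SENSOR_EXTRINSICS = {
    "sensor_front": np.eye(4),
    "sensor_side": np.array([[1.0, 0.0, 0.0, 0.5],
                             [0.0, 1.0, 0.0, 0.0],
                             [0.0, 0.0, 1.0, 0.0],
                             [0.0, 0.0, 0.0, 1.0]]),
}

def fuse_frame(clouds_by_sensor, voxel_size=0.01):
    """Merge one frame of per-sensor point clouds into a single
    voxel-downsampled cloud in the common world frame."""
    merged = o3d.geometry.PointCloud()
    for name, cloud in clouds_by_sensor.items():
        cloud.transform(SENSOR_EXTRINSICS[name])  # in place, into world frame
        merged += cloud                           # concatenate points
    return merged.voxel_down_sample(voxel_size)

# Example usage with clouds loaded from disk instead of live sensor streams:
# fused = fuse_frame({name: o3d.io.read_point_cloud(f"{name}.pcd")
#                     for name in SENSOR_EXTRINSICS})
```

In a live system the downsampling step would be tuned against network bandwidth and the latency budget of the VR clients; the sketch only shows where that trade-off sits in the pipeline.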
Decommissioning costs are strongly affected by the quality of data that can be captured in the planning stages of a typical project. 'What are we dealing with?' is often the first question asked by the decommissioning team. The primary challenge is to identify the location of any radiation sources; an ideal solution would be a complete, accurate, high-resolution 3D map of radiation overlaid on the visible contents. SeeSnake will be a major advance in this area, as it will combine and demonstrate the N-Visage radiation mapping capability with snake-arm manipulators. This addresses weaknesses experienced when N-Visage is delivered 'on a stick', namely being unable to see behind obstructions. Combining the technologies of two successful, innovative SMEs exemplifies the industry's need for mature solutions. This project will de-risk purchasing decisions and demonstrate the multi-functional capability of snake-arm systems. Support from Sellafield and Culham is indicative of end-user need and pull.
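As a rough sketch only (the actual N-Visage reconstruction method is proprietary and not described here), the Python snippet below illustrates the general idea of accumulating dose-rate readings taken at known sensor positions, such as snake-arm tip poses, into a coarse voxel map that could be overlaid on a 3D model of a cell; the voxel size, units and class names are assumptions.

```python
# Minimal sketch: accumulate dose-rate readings taken at known positions
# into a coarse voxel map for overlay on a 3D model of a cell.
# Voxel size, units and structure are illustrative only.
from collections import defaultdict
import numpy as np

VOXEL_SIZE = 0.05  # metres

def voxel_key(position):
    """Quantise a 3D position (metres) to an integer voxel index."""
    return tuple(np.floor(np.asarray(position) / VOXEL_SIZE).astype(int))

class DoseMap:
    def __init__(self):
        self._sums = defaultdict(float)
        self._counts = defaultdict(int)

    def add_reading(self, position, dose_rate):
        """Record one dose-rate measurement (e.g. microSv/h) at a tip pose."""
        key = voxel_key(position)
        self._sums[key] += dose_rate
        self._counts[key] += 1

    def mean_dose(self):
        """Return {voxel index: mean dose rate} for overlay on a 3D map."""
        return {k: self._sums[k] / self._counts[k] for k in self._sums}
```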
Createc and the Oxford Robotics Institute are collaborating to develop an autonomous industrial inspection solution. This will combine a sensor payload that captures accurate visual, auditory and thermal measurements with a navigation framework that enables high-precision, repeatable data collection. The result is a package that can be easily used with any mobile robot system, with the current configuration tailored to the needs of Energy and Oil & Gas industrial facilities.
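As a hedged sketch only (the actual Createc/ORI data format is not described here), the Python snippet below shows one simple way such multi-modal inspection measurements could be logged against robot pose and a named waypoint so that repeat missions can be compared like-for-like; all field and function names are illustrative assumptions.

```python
# Minimal sketch of the kind of record an inspection payload might log:
# each measurement is tagged with the robot pose and a waypoint ID so that
# repeat visits can be compared like-for-like. Field names are assumptions,
# not the actual Createc/ORI data format.
from dataclasses import dataclass, field
from typing import Optional
import time

@dataclass
class InspectionRecord:
    waypoint_id: str                 # named inspection point, e.g. "pump_3_flange"
    position: tuple                  # (x, y, z) in the site map frame, metres
    orientation: tuple               # quaternion (x, y, z, w)
    timestamp: float = field(default_factory=time.time)
    image_path: Optional[str] = None          # visual measurement
    audio_path: Optional[str] = None          # auditory measurement
    surface_temp_c: Optional[float] = None    # thermal measurement

def same_waypoint(a: InspectionRecord, b: InspectionRecord) -> bool:
    """True if two records were taken at the same named inspection point,
    enabling like-for-like comparison between repeat missions."""
    return a.waypoint_id == b.waypoint_id
```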