
SeeByte Ltd

13 Projects, page 1 of 3
  • Funder: UK Research and Innovation Project Code: EP/K014277/1
    Funder Contribution: 3,837,580 GBP

    Sensors have long played a vital role in battle awareness for all our armed forces, ranging from advanced imaging technologies, such as radar and sonar, to acoustic and electronic surveillance. Sensors are the "eyes and ears" of the military, providing tactical information and assisting in the identification and assessment of threats. Integral to achieving these goals is signal processing. Indeed, through modern signal processing we have seen the basic radar transformed into a highly sophisticated sensing system with waveform agility and adaptive beam patterns, capable of high-resolution imaging and the detection and discrimination of multiple moving targets. Today, the modern defence world aspires to a network of interconnected sensors providing persistent and wide-area surveillance of scenes of interest. This requires the collection, dissemination and fusion of data from a range of sensors of widely varying complexity and scale - from satellite imaging to mobile phones. In order to achieve such interconnected sensing, and to avoid the dangers of data overload, it is necessary to re-examine the full signal processing chain from sensor to final decision. The need to reconcile the use of more computationally demanding algorithms, and a potentially massive increase in data, with fundamental resource limitations, both in terms of computation and bandwidth, provides new mathematical and computational challenges. This has led in recent years to the exploration of a number of new techniques, such as compressed sensing, adaptive sensor management and distributed processing, to minimize the amount of data that is acquired or transmitted through the sensor network while maximizing its relevance. While there have been a number of targeted research programmes to explore these new ideas, such as the US's "Integrated Sensing and Processing" and "Analog to Information" programs, this field is still generally in its infancy. 
This project will study the processing of multi-sensor systems in a coherent programme of work, from efficient sampling, through distributed data processing and fusion, to efficient implementations. Underpinning all this work, we will investigate the significant issues involved in implementing complex algorithms on smaller, lighter and lower-power computing platforms. Exemplar challenges covering all major sensing domains - radar/radio frequency, sonar/acoustics, and electro-optics/infrared - will be used throughout the project to demonstrate the performance of the innovations we develop.
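The compressed sensing idea mentioned above can be illustrated with a minimal sketch (not taken from the project itself; all names and dimensions here are illustrative assumptions): a sparse signal is recovered from far fewer random measurements than its length, using a simple Orthogonal Matching Pursuit recovery loop in NumPy.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily recover a k-sparse x from y = A @ x."""
    m, n = A.shape
    residual = y.copy()
    support = []
    x_hat = np.zeros(n)
    for _ in range(k):
        # pick the column of A most correlated with the current residual
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        # least-squares fit of y on the selected columns
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x_hat[support] = coeffs
    return x_hat

rng = np.random.default_rng(0)
n, m, k = 128, 60, 5            # signal length, number of measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
y = A @ x                                  # m << n compressed measurements
x_rec = omp(A, y, k)
print("recovery error:", np.linalg.norm(x - x_rec))
```

The point of the sketch is the ratio m/n: only 60 measurements of a length-128 signal suffice because the signal is 5-sparse, which is the data-reduction argument the paragraph above makes for sensor networks.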

  • Funder: UK Research and Innovation Project Code: EP/W001136/1
    Funder Contribution: 1,915,360 GBP

    The international offshore energy industry is undergoing a revolution, adopting aggressive net-zero objectives and shifting rapidly towards large-scale offshore wind energy production. This revolution cannot be achieved using 'business as usual' approaches in a competitive market with low margins. Further, the offshore workforce is ageing, as new generations of suitable graduates prefer not to work in hazardous offshore environments. Operators therefore seek more cost-effective, safe methods and business models for inspection, repair and maintenance of their topside and marine offshore infrastructure. Robotics and artificial intelligence are seen as key enablers in this regard, as having fewer staff offshore reduces cost and increases safety and workplace appeal. The long-term industry vision is thus for a digitised offshore energy field, operated, inspected and maintained from the shore using robots, digital architectures and cloud-based processes. In the last 3 years, we have made significant advances to bring robots closer to widespread adoption in the offshore domain, developing close ties with industrial actors across the sector. The recent pandemic has highlighted a widespread need for remote operations in many other industrial sectors. The ORCA Hub extension is a one-year project from 5 leading UK universities with over 20 industry partners (>£2.6M investment) which aims at translating the research done in the first phase of the Hub into industry-led use cases. Led by the Edinburgh Centre of Robotics (HWU/UoE), in collaboration with Imperial College, Oxford and Liverpool Universities, this multi-disciplinary consortium brings its unique expertise in: subsea (HWU), ground (UoE, Oxf) and aerial robotics (ICL); human-machine interaction (HWU, UoE); innovative sensors for Non-Destructive Evaluation and low-cost sensor networks (ICL, UoE); and asset management and certification (HWU, UoE, LIV). 
The Hub will provide remote solutions using robotics and AI that are applicable across a wide range of industrial sectors and that can operate and interact safely in autonomous or semi-autonomous modes in complex and cluttered environments. We will develop robotics solutions enabling accurate mapping of, navigation around and interaction with assets in the marine, aerial and ground environments, supporting the deployment of sensors for asset monitoring. This will be demonstrated through 4 industry-led use cases developed in close collaboration with our industry partners and feeding directly into their technology roadmaps: Offshore Renewable Energy Subsea Inspection, in collaboration with EDF, Wood, Fugro, OREC, SeeByte Ltd and Rovco; Aerial Inspection of Large Infrastructures in Challenging Conditions, in collaboration with Barrnon, BP, Flyability, SLAMCore, Voliro and Helvetis; Robust Inspection and Manipulation in Hazardous Environments, in collaboration with ARUP, Babcock, Chevron, EMR, Lafarge, Createc and Ross Robotics; and Symbiotic Systems for Resilient Autonomous Missions, in collaboration with TLB, Total, Wood and Lloyd's Register. This will see the Hub break into new sectors and demonstrate the potential of our technology on a wider scale.

  • Funder: UK Research and Innovation Project Code: EP/V05676X/1
    Funder Contribution: 1,129,920 GBP

    The offshore energy and defence sectors share a vision of the future where people are taken out of harsh, extreme environments and replaced by teams of smart robots able to do the 'dirty and dangerous' jobs, collaborating seamlessly as a team with each other and with the human operators and experts onshore. In this new world, remote data collection, fusion and interpretation become central, together with the ability to generate transparent, safe, actionable decisions from this data. We propose the HUME project (HUman-machine teaming for Maritime Environments), whose vision is to develop a coherent framework that enables humans and machines to work seamlessly as a team by establishing and maintaining a single shared view of the world and each other's intents through transparent interaction, robust to a highly dynamic and unpredictable maritime environment. The HUME project's ambitious research programme will address fundamental questions in the fields of machine-machine and human-machine collaboration, robot perception, and explainable autonomy and AI. The Prosperity Partnership would build on a 20-year strategic relationship between SeeByte and HWU, with SeeByte originally a spin-out of Heriot-Watt University in 2001 and now a world leader in maritime autonomy in the Oil & Gas and Defence sectors. This grant would facilitate a shift to lower-TRL research and development, providing seeding for early-stage research that can have a broad, longer-term and more disruptive impact. The proposed work aims at establishing a durable model through which SeeByte and HWU can remain connected, fostering long-term research relationships on projects of interest as they emerge in this rapidly changing field.

  • Funder: UK Research and Innovation Project Code: NE/P016561/1
    Funder Contribution: 132,243 GBP

    Many oil and gas fields are reaching the end of their lives. It is estimated that, of the 475 structures that will eventually have to be decommissioned, over 50 will be decommissioned by 2018. While most structures will be removed entirely, some larger platforms and pipelines currently cannot be removed without causing serious environmental harm. The structures that remain in place are cleaned and made safe, and then regularly monitored by the oil and gas company that owns them to ensure that they are not causing any adverse impacts on the environment. Currently this monitoring is done using a survey ship, which, at tens of thousands of pounds a day, is very expensive. This is not conducive to regular monitoring, which in turn increases the risk of any impacts going undetected. This project aims to introduce a new approach for monitoring decommissioned structures - autonomous submarines. These vehicles are becoming more widely adopted by industry. It has been demonstrated that they can be launched from shore and carry out complex missions underwater, collecting information on the seabed type and biological environment while also monitoring pollution. However, they have not yet been applied to decommissioning, and existing sensors and protocols need to be reconfigured to collect the data required for environmental monitoring around decommissioned structures. Working with our project partners, Shell, BP, BEIS, Gardline and SeeByte, this project will address these challenges and develop approaches for collecting appropriate monitoring data to regularly assess the impact of decommissioned structures with autonomous vehicles. The approaches will be integrated within the standard practices of oil companies to ensure that they are realistic and widely adopted for monitoring. The integration of this new information with existing baseline information on the areas will also be considered. 
This will ultimately reduce the costs associated with monitoring and improve the quality and quantity of data that can be obtained, lowering the risk of environmental damage from decommissioned structures.

  • Funder: UK Research and Innovation Project Code: EP/L016834/1
    Funder Contribution: 5,784,700 GBP

    Robots will revolutionise the world's economy and society over the next twenty years, working for us, beside us and interacting with us. The UK urgently needs graduates with the technical skills and industry awareness to create an innovation pipeline from academic research to global markets. Key application areas include manufacturing, assistive and medical robots, offshore energy, environmental monitoring, search and rescue, defence, and support for the ageing population. The robotics and autonomous systems area was highlighted by the UK Government in 2013 as one of the 8 Great Technologies that underpin the UK's Industrial Strategy for jobs and growth. The essential challenge can be characterised as how to obtain successful INTERACTIONS. Robots must interact physically with environments, requiring compliant manipulation, active sensing, world modelling and planning. Robots must interact with each other, making collaborative decisions between multiple, decentralised, heterogeneous robotic systems to achieve complex tasks. Robots must interact with people in smart spaces, taking into account human perception mechanisms, shared control, affective computing and natural multi-modal interfaces. Robots must introspect for condition monitoring, prognostics and health management, and long-term persistent autonomy, including validation and verification. Finally, success in all these interactions depends on engineering enablers, including architectural system design, novel embodiment, micro- and nano-sensors, and embedded multi-core computing. The Edinburgh alliance in Robotics and Autonomous Systems (EDU-RAS) provides an ideal environment for a Centre for Doctoral Training (CDT) to meet these needs. Heriot-Watt University and the University of Edinburgh combine internationally leading science with an outstanding track record of exploitation, and world-class infrastructure enhanced by a recent £7.2M EPSRC and industry capital equipment award (ROBOTARIUM). 
A critical mass of experienced supervisors covers the underpinning disciplines crucial to autonomous interaction, including robot learning, field robotics, anthropomorphic & bio-inspired designs, human-robot interaction, embedded control and sensing systems, multi-agent decision making and planning, and multimodal interaction. The CDT will enable student-centred collaboration across topic boundaries, seeking new research synergies as well as developing and fielding complete robotic or autonomous systems. The CDT will create a cohort of students able to support each other in making novel connections between problems and methods, with sufficient shared understanding to communicate easily but able to draw on each other's different, developing areas of cutting-edge expertise. The CDT will draw on a well-established programme in postgraduate training to create an innovative four-year PhD, with taught courses on the underpinning theory and state of the art, and research training closely linked to career-relevant skills in creativity, ethics and innovation. The proposed centre will have a strong participative industrial presence: thirty-two user partners have committed £9M (£2.4M direct, £6.6M in kind) of support, and to involvement including membership of the External Advisory Board to direct and govern the programme, scoping particular projects around specific interests, co-funding of PhD studentships, access to equipment and software, co-supervision of students, student placements, contribution to MSc taught programmes, support for student robot competition entries including prize money, and industry-led training on business skills. Our vision for the Centre is as a major international force that can make a generational leap in the training of innovation-ready postgraduates who are experienced in the deployment of robotic and autonomous systems in the real world.

