
TrustMe: Secure and Trustworthy AI platform

Funder: UK Research and Innovation
Project code: 10017288
Funded under: Innovate UK
Funder Contribution: 59,899 GBP


Description

According to a recent survey by the global analytics firm FICO and Corinium, 65% of companies cannot explain how their Artificial Intelligence (AI) models reach decisions or predictions, and poor data causes, on average, 11.8 million per year in financial waste. In recent years many companies have invested in AI applications, yet have not given responsible, well-governed AI the priority it deserves. Most of these companies (particularly small and medium-sized businesses) choose AI-as-a-Service to save cost, since it offers AI functionality that can be incorporated without in-house expertise. The key challenge is: how much trust can be placed in an output produced by an AI-as-a-Service platform? Several studies have shown that the standard AI algorithms in use today are not suitable for regulated services because they lack security, transparency, reliability and explainability.

The vision of this project is to develop a secure and trustworthy AI platform for AI developers and data scientists, providing a scoring mechanism that measures the quality and trust levels of datasets and AI/ML algorithms during both the development and deployment phases. The TrustMe platform will run as a locally hosted web application with features for designing, developing and implementing explainable and trustworthy AI applications. TrustMe also provides a data quality score through its Quality of Data (QoD) estimator. Using QoD, end users can obtain the quality level of their data along with auto-generated reports highlighting all bad records. QoD also offers automated data enrichment and processing according to pre-defined rules that data administrators can edit. To deal with possible bias in training/testing data, TrustMe offers a Quality of Training/Testing (QoT) toolkit, which helps developers automatically generate training/testing samples, applying over- or under-sampling where the data acquisition process has produced imbalanced classes (a simplified illustration of both ideas is given below).

We envisage that within the next three years the TrustMe Score and its generated reports will become a universal benchmark that organisations use to demonstrate how secure and trustworthy their AI platforms are.
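To make the QoD and QoT ideas above more concrete, the following is a minimal, hypothetical sketch in Python (not the TrustMe implementation) of a rule-based data-quality score with a "bad records" report, plus a stratified train/test split whose training set is over-sampled to balance the classes. The column names, quality rules and thresholds are illustrative assumptions only.

```python
# Illustrative sketch only: a stand-in for the QoD score/report and the
# QoT-style balanced split described above. Columns ("age", "label") and the
# two quality rules are hypothetical, admin-editable rules in spirit.
import pandas as pd
from sklearn.model_selection import train_test_split

def quality_score(df: pd.DataFrame):
    """Return (score in [0, 1], DataFrame of rows failing any rule)."""
    complete = df.notna().all(axis=1)        # completeness rule: no missing fields
    valid_age = df["age"].between(0, 120)    # validity rule: plausible age range
    good = complete & valid_age
    return good.mean(), df.loc[~good]        # overall score + "bad records" report

def oversample_minority(train: pd.DataFrame, label: str, seed: int = 0) -> pd.DataFrame:
    """Naive over-sampling: resample every class up to the size of the largest."""
    target = train[label].value_counts().max()
    parts = [
        grp.sample(target, replace=True, random_state=seed)
        for _, grp in train.groupby(label)
    ]
    return pd.concat(parts).sample(frac=1, random_state=seed)  # shuffle rows

# Toy, imbalanced dataset
toy = pd.DataFrame({
    "age":   [25, 34, None, 130, 41, 29, 55, 62, 38, 47, 51, 23],
    "label": [0,  0,  0,    0,   0,  0,  0,  1,  1,  0,  1,  1],
})

score, bad_rows = quality_score(toy)
print(f"quality score: {score:.2f}")         # 0.83, since 2 of 12 rows fail a rule
print(bad_rows)                              # rows flagged for the report

clean = toy.drop(bad_rows.index)             # keep only rows passing the rules
train, test = train_test_split(
    clean, test_size=0.25, stratify=clean["label"], random_state=0
)
train = oversample_minority(train, "label")
print(train["label"].value_counts())         # classes now equally represented
```

In a real pipeline the hard-coded rules would be replaced by the configurable rule set the description mentions, and the naive resampling by a more careful strategy, but the score/report/split structure is the same.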



