AI Ethics Self-Assessment Tool

This self-assessment tool is in beta.

WHAT IS THE SELF-ASSESSMENT TOOL FOR?

This self-assessment tool is built to enable AI developer or AI operator organisations to evaluate the ethical performance of an AI system against the UAE’s AI Ethics Principles and Guidelines.

We are making it available so that the ethical performance of each AI system can be assessed from the very outset. This is to help the team think about the potential ethical issues that may arise throughout the development process, from the initial idea stages through to the maintenance of the fully operational system. It should also help identify which guidelines deserve particular attention for a given AI system, and suggest what kinds of mitigation measures could be introduced.

It is important to state that the guidelines in this self-assessment tool are recommended rather than compulsory. They do, however, carry different recommendation strengths. Guidelines with a high recommendation level (phrased as “should” in the AI ethics guidelines document) are highly recommended and carry a higher weight in the final score. Guidelines with a moderate recommendation level (phrased as “should consider”) are moderately recommended and carry less weight in the final performance score. Proceeding with AI system implementation is not advised unless a certain level of ethical performance is reached. The tool is for self-assessment purposes only and will not be audited, checked or regulated at this time.
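The weighted scoring described above could be sketched as follows. Note that the specific weight values, the `ethics_score` function, and the example inputs are illustrative assumptions for demonstration only; the actual tool defines its own weights and scoring method.

```python
# Illustrative sketch of a weighted scoring scheme, as described above.
# The weights below are ASSUMED values, not taken from the tool itself.
HIGH_WEIGHT = 2.0      # "should" guidelines (recommendation level: high)
MODERATE_WEIGHT = 1.0  # "should consider" guidelines (recommendation level: moderate)

def ethics_score(answers):
    """Compute a weighted performance score in [0, 1].

    `answers` is a list of (compliance, level) pairs, where
    compliance is a value in [0, 1] and level is "high" or "moderate".
    High-level guidelines contribute more to the final score.
    """
    weighted = 0.0
    total = 0.0
    for compliance, level in answers:
        w = HIGH_WEIGHT if level == "high" else MODERATE_WEIGHT
        weighted += w * compliance
        total += w
    return weighted / total if total else 0.0

# Example: full compliance on one high guideline, half compliance
# on one moderate guideline:
score = ethics_score([(1.0, "high"), (0.5, "moderate")])
# (2.0 * 1.0 + 1.0 * 0.5) / (2.0 + 1.0) = 0.8333...
```

The key design point is that falling short on a “should” guideline lowers the final score more than falling short on a “should consider” guideline, which matches the recommendation-strength distinction above.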

To improve the effectiveness of the guidelines, we actively welcome feedback and improvement suggestions, as well as examples of use cases to which the guidelines have been applied. These can be submitted through the feedback form on our website. The data will be used to monitor broader adoption of the toolkit and, in time, to improve it (e.g. by providing benchmarking or relative scoring).

WHAT DOES IT CONTAIN?

The self-assessment tool contains the following components:

  1. Impact
  2. Data Use Risk
  3. Accountability Risk
  4. Third-Party Methodology Risk
  5. Historic Bias Risk
  6. Technical Bias Risk

LICENSING AND RESPONSIBILITY NOTE

The Minister of State for Artificial Intelligence, Digital Economy & Remote Work Applications Office (AI Office) is not responsible for any misuse of the AI System Ethics Self-Assessment Tool; the user bears all consequences of such use.

This self-assessment tool is published under the terms of a Creative Commons Attribution 4.0 International Licence in order to facilitate its re-use by other governments and private sector organisations. In summary, this means you are free to share and adapt the material, including for commercial purposes, provided that you give appropriate credit to the AI Office as its owner and do not suggest that the AI Office endorses your use.