Technology and algorithms offer many opportunities for businesses and governments to organize their operations more efficiently and effectively. However, they also come with pitfalls—especially when algorithmic decisions have an impact on people. In such cases, fundamental rights may be at stake.

Which fundamental rights might be infringed? How likely is that to happen? What is the impact on the individual? Is that impact proportionate to the purpose of the algorithm? And is the resulting assessment, whether the outcome is acceptable or not, transparent and easy to explain? This, in a nutshell, is the mirror that the FRAIA holds up to algorithm users. Both governments and businesses can make use of the FRAIA.

The authors of the FRAIA are prof. mr. Janneke Gerards, dr. Mirko Tobias Schäfer, Arthur Vankan and Iris Muis, all (former) employees of Utrecht University. Its development was commissioned by the Ministry of the Interior.

FRAIA course

The FRAIA can generally be used independently by organizations. However, organizations that expect to use the FRAIA frequently may wish to train internal staff to facilitate FRAIA sessions. For this purpose, we have developed a special FRAIA course day, which can be attended individually at Utrecht University or delivered in-house at your organization.

Guidance during a FRAIA assessment

The FRAIA can generally be used independently by organizations to assess algorithms that may impact fundamental rights. Even so, external guidance can be helpful, especially the first time, when your organization is still becoming familiar with the tool. In that case, you can enlist the help of Data School's experienced moderators.

For more information, contact us at dataschool@uu.nl or via the contact form.