Artificial intelligence needs to be regulated as soon as possible, and governments and human rights organisations should make this a priority. That is the message from researchers from Google, Microsoft and other organisations within AI Now, who argue that the tech industry has proven unable to regulate itself.
The researchers make their case in a forty-page report (PDF) published this week. It argues that AI tools are too easy to roll out, and that the companies deploying them take little account of potentially harmful effects. The fact that these systems are often used in environments where hundreds of thousands, and sometimes millions, of people may be affected makes this especially dangerous.
Self-regulation does not work
The frameworks currently in place to govern AI have failed to ensure accountability, the researchers write. This is all the more problematic because the systems keep growing in scale and complexity. In the absence of safeguards in the form of accountability, liability and procedural guarantees, applications are now proliferating in areas where it might be better to rein them in.
At the moment, AI solutions are in development for just about every conceivable application: there are systems that grade students' papers and systems that assess whether an immigrant has a criminal history. The companies developing these systems often rely on little more than a handful of ethical principles, and even when they fail to live up to those, they can rarely be held to account.
Recommendations for the industry
AI Now’s report contains a number of recommendations for the industry:
- We need regulation as soon as possible. For each domain, companies should work with governments and domain experts to determine what is needed.
- Facial recognition must meet strict requirements before it may be applied, especially when it is used to recognise emotions or identify criminals.
- A framework should be established under which companies publicly account for their AI activities. Everything must be documented: a system's internal processes, its datasets and its decision-making process.
- More budget should also be made available for this kind of oversight.
- The AI industry must accept that it is more than just a technical industry, and that its tools have a much broader societal impact. That, in turn, implies new rules that must be observed.