STATEMENT: The requirements and fines of the new European AI law, with PredictLand

(Information sent by the signatory company).

Madrid, March 22

Penalties for non-compliance can reach 7% of turnover, warns Bruno Gerlic, growth director of this company

Six months remain until its entry into force, and application developers and data scientists are already reorienting their work around the new regulation. Published on March 13, it sets the limits on the professional use of machine learning and generative AI according to the level of risk they pose to the safety, health and fundamental rights of citizens. In other words, it was conceived so that these new tools do not cause harm to people's real lives.

EU AI Act, a law of global scope

Moreover, PredictLand AI observes, the law has an extra-EU reach: any company that buys, develops, customizes or uses AI systems in services that could affect a citizen of the European Union will have to answer to the EU AI Act, as the new law is known. In this sense, Bruno Gerlic points out, we must not forget that users of generative AI platforms such as OpenAI, Google or Microsoft are found in every corner of the planet. In addition, the law is likely to affect organizations based outside the EU that have relationships or transactions with European citizens, for example when running recruitment processes with AI or assessing microcredit applications using scoring algorithms.

Legislating based on levels of AI risk

"A machine-based system designed to operate with variable levels of autonomy, which can show adaptability after its implementation and which, for explicit or implicit objectives, infers from the input it receives how to generate outputs such as predictions, content, recommendations or decisions that can influence physical or virtual environments." This is how the European authorities define artificial intelligence, for which they establish a system of risk levels: unacceptable, high, low and general purpose.
Systems falling under the "unacceptable" heading involve the violation of identity, breaches of privacy, the manipulation of disabled people in order to change their consumption patterns and, of course, systems with biases that result in discriminatory social treatment. The high-risk list covers systems that are not prohibited but are closely scrutinized. The EU AI Act specifically regulates AI in sectors such as medical devices and vehicles, as well as emotion recognition and surveillance and control systems. Thus, in addition to maintaining and ensuring the quality of each model's data, companies will have to provide documentation and traceability for it. With transparency as its banner, the law will also require human supervision, especially over how a system's results and responses are used. Cybersecurity and compliance with impact assessments must also be guaranteed.

General-purpose models, for their part, are programs such as GPT4, Claude or Gemini: systems with very broad use cases that can be embedded in other programs. Thus, GPT4 per se is not in a high-risk or prohibited category, but it cannot be used in predictive policing systems.

The professionals at PredictLand AI, which focuses on building predictive and generative AI applications, acknowledge that in the current context EU law pushes companies to be part of the response. Once a breach is detected, data scientists will be expected to fix the error in record time. With AI, prevention is better than cure. This law clearly puts companies on notice, insists Bruno Gerlic. "It will be difficult for them to evade their responsibility, and I recommend that AI professionals, wherever they are, familiarize themselves with the requirements of the law as soon as possible, especially before designing their solutions. Better safe than sorry, as they say.
And in AI systems, which have only data and no instructions, the cure can carry a very high cost, even requiring the product to be withdrawn for lack of viable alternatives." Penalties for breaking the law, Bruno Gerlic recalls, can reach 35 million euros or 7% of the previous year's global revenue. Finally, an important point for individual users: the law does not cover personal activities, provided they do not fall into the categories described above.

Contact
Contact name: Carmen de Blas
Contact description: PredictLand AI
Contact phone: 639 00 72 10
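The fine ceiling described above ("35 million euros, or 7% of global revenue") can be read as a whichever-is-higher rule. The following minimal sketch illustrates that reading; the function name and the assumption that the higher of the two figures applies are illustrative, not a statement of the law's exact mechanics.

```python
def max_penalty_eur(global_annual_turnover_eur: float) -> float:
    """Illustrative sketch, not legal advice: the top fine tier is
    described as up to EUR 35 million or 7% of global annual turnover,
    read here as whichever amount is higher."""
    FIXED_CAP_EUR = 35_000_000
    TURNOVER_SHARE = 0.07  # 7% of last year's global revenue
    return max(FIXED_CAP_EUR, TURNOVER_SHARE * global_annual_turnover_eur)

# For a company with EUR 1 billion in turnover, 7% (EUR 70 million)
# exceeds the fixed EUR 35 million cap.
print(max_penalty_eur(1_000_000_000))  # 70000000.0
```

For smaller companies the fixed 35-million-euro figure dominates: at 100 million euros of turnover, 7% is only 7 million, so the sketch returns 35 million.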
