The European Association for Medical Devices of Notified Bodies (TEAM-NB) is asking the European Commission to reconsider a proposed regulation that could lead to duplication of regulatory requirements for artificial intelligence (AI) products. Under the Artificial Intelligence Act (AIA), AI products, including medical devices and certain medical software, would face risk-based regulatory requirements before they can be placed on the European market. Medical devices and Software as a Medical Device (SaMD) that use AI would be deemed high-risk.
However, TEAM-NB says sector-specific safety requirements are already addressed under the European Commission’s New Legislative Framework (NLF), and more specifically under Regulation (EU) 2017/745 on medical devices (MDR). When it comes to AI regulation, the association says it is vital to focus on enforcement under the existing NLF to prevent fragmentation and duplication of work, save costs, and avoid disruption to operational efficiency.
The association advises the EU to create industry-specific guidance for putting AI regulations into practice under the existing NLF to address risk, state of the art, and testing and assessment obligations.
TEAM-NB says that the AIA proposal uses a broad definition of AI that could capture non-AI systems and confuse regulators, manufacturers and notified bodies. To fulfil the objective of standardisation, harmonising definitions of technologies so that the relevant stakeholders can communicate accurately, the association says the definition found in the international standard ISO/IEC 2382:2015 on IT vocabulary ought to be used.
The association is also concerned about the AIA proposal’s requirements for vigilance reporting and conformity assessments. Systems are already in operation to deal with these matters, and introducing new requirements would duplicate work and add unnecessary burden.
Source: Medtech Insight (an Informa product)