Shorter Reads
Artificial intelligence (AI) is growing rapidly in complexity and sophistication and is disrupting business processes around the world. Governments and regulatory bodies are struggling to develop frameworks that keep pace with the ever-changing technology.
2 minute read
Published 21 January 2022
There is no AI-specific legislation in the UK. Rather, all UK businesses must take into account various existing legal obligations when developing and using AI, just as they would in adopting any other new technology. There are three main pieces of legislation which UK organisations need to consider:
Due to the digital transformation of financial service providers in recent years, AI has become increasingly crucial in the financial sector.
The regulators — the Prudential Regulation Authority (PRA) and the Financial Conduct Authority (FCA) — have not issued any AI-specific guidance, but existing guidance such as the FCA’s Principles for Businesses (PRIN) and the PRA’s Fundamental Rules will be relevant to firms using AI. Under these rules:
In April 2021, the European Commission published its draft Artificial Intelligence Regulation (draft AI regulation), introducing a first-of-its-kind, comprehensive regulatory framework for AI. The rules are still in draft form and will be the subject of complex legislative negotiations, so the regulation is not expected to be finalised and implemented before 2023. Some important aspects of the draft AI regulation are, briefly:
It will apply to all organisations providing or using AI systems in the EU. It does not apply to personal, non-professional activity.
There is a “risk-based” approach to categorising AI systems:
Fines of €10 million to €30 million, or 2% to 6% of the organisation’s global annual turnover, whichever is higher, may be levied; the tier applied depends on the nature of the infringement (see the illustrative sketch below).
The draft AI regulation will apply to organisations outside of the EU if they are: (1) “providers” (i.e., developers) of AI systems for organisations in the EU; or (2) “users” (i.e., organisations that buy the AI system from the provider) of AI systems where the output produced by the system is used within the EU.
This means that both UK-based AI developers who sell their products into the EU, and UK organisations using AI technology which affects individuals in the EU, will need to comply with the new rules. The extra-territorial impact is significant, and multinationals will need to consider whether to use a common set of AI systems compliant with the draft AI regulation (once finalised) or adopt different technology in different jurisdictions.
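To make the scope and penalty mechanics described above more concrete, the sketch below is a purely illustrative Python pre-check, not a statement of the draft regulation’s actual tests. The class, function names and the worked turnover figure are assumptions invented for this example; only the headline figures (fines of €10-30 million or 2-6% of global annual turnover, whichever is higher) and the provider/user scope test are taken from the summary above.

```python
# Illustrative sketch only, based on the summary of the draft EU AI regulation
# above. All names and thresholds are simplified assumptions; this is not
# legal advice or an implementation of the regulation itself.
from dataclasses import dataclass


@dataclass
class AISystemProfile:
    provider_in_eu_market: bool       # developer selling the AI system into the EU
    output_used_in_eu: bool           # user whose system output is used within the EU
    global_annual_turnover_eur: float


def draft_regulation_applies(profile: AISystemProfile) -> bool:
    """Rough reading of the extra-territorial scope described above:
    the draft regulation reaches providers supplying AI systems to EU
    organisations and users whose system output is used within the EU."""
    return profile.provider_in_eu_market or profile.output_used_in_eu


def maximum_fine_eur(fixed_cap_eur: float, turnover_pct: float,
                     global_annual_turnover_eur: float) -> float:
    """Fines are described as EUR 10m-30m or 2-6% of global annual turnover,
    whichever is higher; the applicable tier depends on the infringement."""
    return max(fixed_cap_eur, turnover_pct * global_annual_turnover_eur)


# Hypothetical example: a UK developer selling into the EU with EUR 2bn
# global turnover, assessed at the top tier (EUR 30m or 6%), would face a
# maximum exposure of EUR 120m, since 6% of turnover exceeds the fixed cap.
profile = AISystemProfile(provider_in_eu_market=True,
                          output_used_in_eu=False,
                          global_annual_turnover_eur=2_000_000_000)
if draft_regulation_applies(profile):
    exposure = maximum_fine_eur(30_000_000, 0.06,
                                profile.global_annual_turnover_eur)
    print(f"Top-tier exposure: EUR {exposure:,.0f}")
```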
The UK government has not indicated any plans to follow in the Commission’s footsteps, nor has it expressed a view on what the future of AI regulation will look like.
In the financial sector, the Bank of England and the FCA have launched the AI Public-Private Forum (AIPPF) to facilitate discussion regarding the use and impact of AI in financial services. A particular aim is to gather views on a possible future AI regulatory framework, and to look at how the UK can “strike the right balance between providing a framework that allows for certainty, regulatory effectiveness and transparency, as well as beneficial innovation”. The AIPPF has discussed and evaluated the Commission’s draft AI regulation, so it is clear that the Commission rules will be closely considered by UK legislators and regulators in developing any new framework. The AIPPF did not, however, give any indication of when a new regulatory framework will be drafted and implemented.
The use of AI brings with it a number of unique regulatory challenges. The new AI framework in the EU is a major step forward, but it remains to be seen how the draft framework will develop during the Commission negotiation period and also how the UK will respond.
The complexity of AI and how it affects consumers, particularly in financial services, certainly supports the position that AI-specific legislation is warranted in the UK.
Originally published by Thomson Reuters © Thomson Reuters.