
Can AI Reliance Meet Regulatory Compliance? Opportunities to Utilise AI Tools for Compliance


In recent years, generative Artificial Intelligence (AI) has emerged as a revolutionary force, igniting conversations and innovation across all facets of society. Professionals across every sector are exploring how to harness AI responsibly, drawn by the enticing opportunity to enhance efficiency whilst remaining cognisant of the need to understand the risks and establish effective AI risk governance arrangements.


In the past two years, ChatGPT has emerged as the fastest-growing consumer application in history, Microsoft has begun rolling out its Copilot program to businesses globally, and countless Australian businesses have begun developing and implementing their own internal AI systems, King & Wood Mallesons included with its very own ‘KWM Chat’. Implementing internally managed AI systems can allow these businesses to mitigate some of the concerns around data confidentiality. The question we’re asking today is: how can compliance teams harness the potential efficiencies offered by AI to scale up their professional impact whilst mitigating some of the very real concerns associated with utilising generative AI in a corporate environment?


So how can compliance professionals responsibly harness AI?


Leveraging Large Language Models (LLMs) and generative AI to create compliance documentation can build capacity in lean, busy compliance teams. Some potential uses of AI include:


  • Regulatory Summaries and Obligation Libraries: AI can distil publicly available regulations into concise summaries, or format them into a tabular obligation library that can be used to communicate obligations throughout the organisation and/or map obligations to controls (a minimal illustration of this pattern is sketched after this list). Risks to consider here include the currency of the source data, the completeness and accuracy of the AI output, and transparency to users that AI-generated content is being utilised. AI also does not negate the need for an understanding of the regulatory requirements: someone must still review and assess whether the summaries and libraries are complete, accurate and tailored in a way that is relevant to your business. However, the greatly reduced time spent on manual tabulation and drafting is a clear benefit.


  • Creation of Compliance Policies: Given relevant templates, parameters and instructions, AI can take the first step in generating compliance policies. For example, a compliance professional could ask generative AI to produce a first draft of an Anti-Bribery and Corruption Policy. Whilst the output is likely to serve as a first draft at most, the compliance professional can then adapt and tailor the policy to meet their organisation’s specific needs, saving time in the process. As with the previous use case, the end result will only be as good as the information provided, so an intimate human knowledge of the business remains necessary and key to the success of the output.


  • Generating Risk Assessments: A higher-risk opportunity to consider where an internal AI environment is available. Fed relevant compliance data (within set parameters), AI tools can summarise that data into a structured risk document that sets out the background together with a summary of critical issues and potential risks. The compliance professional can then adapt and refine that initial draft into the final version. However, compliance professionals need to be mindful of confidentiality, security and accuracy risks; to harness this opportunity responsibly, it would need to be underpinned by an effective AI risk governance model.
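
For readers who want a more concrete picture, the sketch below illustrates in Python the pattern common to these use cases, taking the first (the obligation library) as the example. It is a minimal sketch only, assuming an internally managed, OpenAI-compatible endpoint; the endpoint address, model name and prompt wording are illustrative assumptions rather than recommendations, and anything the tool produces still requires the expert review described above.

# Minimal sketch only: drafting a tabular obligation library from a regulation excerpt.
# Assumptions (all illustrative): an internally managed, OpenAI-compatible endpoint,
# a placeholder model name and example prompt wording. Output requires expert review.
from openai import OpenAI

client = OpenAI(
    base_url="https://ai.internal.example/v1",  # hypothetical internal AI gateway
    api_key="YOUR-INTERNAL-KEY",                # issued under your AI governance arrangements
)

# Publicly available regulation text would be pasted or loaded here (placeholder example).
regulation_excerpt = """
Section 12: A regulated entity must adopt and maintain a compliance program and
must review that program at least annually.
"""

response = client.chat.completions.create(
    model="internal-llm",  # placeholder for whichever model the organisation hosts
    temperature=0,         # favour consistent, conservative output over creativity
    messages=[
        {
            "role": "system",
            "content": (
                "You assist a compliance team. From the regulation text provided, produce a "
                "pipe-delimited table with the columns: Obligation | Source section | "
                "Who it applies to | Suggested control. Do not add obligations that are not "
                "in the text."
            ),
        },
        {"role": "user", "content": regulation_excerpt},
    ],
)

# A draft only: it must be reviewed by someone who understands the regulatory requirements.
print(response.choices[0].message.content)

In practice, the compliance professional would review the draft table against the source regulation, tailor it to the organisation’s structure and controls, and record that AI-generated content was used, consistent with the transparency considerations noted above.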


These examples represent just a fraction of the myriad ways AI and LLMs may be able to support risk management and compliance, and they highlight both the significant time-saving benefits of AI and the associated risks and need for effective governance. The benefit of AI and LLM tools lies in their ability to free up highly skilled specialist resources within an organisation, enabling the compliance team to focus their impact on the complexities of analysing, evaluating and customising compliance solutions to better suit their organisation’s requirements.


Where do we go from here?


There is a significant opportunity in this emerging technology if it is safely and responsibly harnessed. Successful AI adoption depends on both identifying the opportunities and spending the time to consider what good governance over those opportunities looks like. Testing the governance approach on a few lower-risk use cases may be a beneficial next step.


Ultimately, professionals may be able to leverage these benefits to fulfil the requirements of the organisations they support. However, this will require a substantial period of testing and learning to ensure that the tools are appropriately adapted and that challenges and risks have been considered and addressed.


This publication is a joint publication from King & Wood Mallesons and KWM Compliance Pty Ltd (ACN 672 547 027) trading as Owl Advisory by KWM. KWM Compliance Pty Ltd is a company wholly owned by the King & Wood Mallesons Australian partnership. KWM Compliance Pty Ltd provides non-legal compliance and governance risk advisory services for businesses. KWM Compliance Pty Ltd is not an incorporated legal practice and does not provide legal services. Laws concerning the provision of legal services do not apply to KWM Compliance Pty Ltd.
