We’re getting so used to the hype around generative AI (GenAI) that it can seem like we’re on the verge of it being used for all purposes, everywhere, all the time. There is significant pressure on public sector organisations, in particular, not to miss the opportunity to reap its (expected, presumed) benefits.
However, GenAI comes with many challenges and risks, especially when we talk about free-to-use, generally available GenAI models. This is not sufficiently understood or recognised, and most of the conversations I have about GenAI use with public sector leaders and procurement officers tend to quickly reach an awkward moment where I burst the bubble by stressing these risks and ranting about why I think GenAI should not simply be used as available off the shelf (or at all, for public sector activities that must comply with strict requirements of good administration and factuality).
In the context of public sector AI adoption, the widespread availability of these tools poses a significant governance challenge, and I think we are just one bad decision away from a potentially very significant scandal / problem. The challenge comes from many directions, but especially through the embedding (or slipstreaming) of AI tools into existing systems and software packages (AI creep) and access by civil servants and public sector workers through free-to-use platforms (shadow AI).
Given this, I have been glad to see that two recent pieces of guidance on public sector AI use have clearly formulated the default position that non-contracted / generally available GenAI should not be used in the public sector, and that exceptional use should follow a careful assessment and many interventions to ensure compliance with rightly demanding standards and benchmarks.
The Irish Guidelines for the Responsible Use of AI in the Public Service (updated 12 May 2025), building on an earlier 2023 recommendation of the Irish National Cyber Security Centre, recommend “that access is restricted by default to GenAI tools and platforms and allowed only as an exception based on an appropriate approved business case and needs. It is also recommended that its use by any staff should not be permitted until such time as Departments have carried out the relevant risk assessments, have appropriate usage policies in place and staff awareness on safe usage has been implemented” (p 39).
In very similar terms, but perhaps based on a different set of concerns, the Dutch Ministry of Infrastructure and Water Management’s AI Impact Assessment Guidance (updated 31 Dec 2024) has also stated that the provisional position for central government organisations is that GenAI use is in principle not permitted: “The provisional position on the use of generative AI in central government organisations currently sets strict requirements for the use of LLMs in central government: “Non-contracted generative AI applications, such as ChatGPT, Bard and Midjourney, do not generally comply demonstrably with the relevant privacy and copyright legislation. Due to this, their use by (or on behalf of) central government organisations is in principle not permitted in those cases where there is a risk of the law being broken, unless the provider and the user demonstrably comply with relevant laws and regulations.”” (p 41).
I think these are good examples of responsible default positions. Of course, monitoring and enforcing a general prohibition like this will be difficult, and more needs to be done to ensure that organisations put in place governance and technical measures to seek to minimise the risks arising from unauthorised use. This is also a helpful default because it will force organisations that purposefully want to explore GenAI adoption to go through the required processes of impact assessment and careful and structured consideration, as well as to place a focus on the adoption (whether through procurement or not) of GenAI solutions that have appropriate safeguards and are adequately tailored and fine-tuned to the specific use case (if that is possible, which remains to be seen).