[EPA] Interim Decision on Artificial Intelligence Tools (foiaonline.gov)
For those not wanting to download a PDF:
>At this time, agency personnel are not authorized to use Artificial Intelligence tools (e.g., ChatGPT, OpenAI, etc.) for official use. In the interim, EPA has blocked access to them.
>Background
>Artificial Intelligence is a technology that simulates human intelligence processes to make decisions and perform tasks, often using business information. AI tools collect and process large amounts of data that may contain highly sensitive information, such as financial, confidential business and Personally Identifiable Information.
>EPA is currently assessing potential legal, privacy and cybersecurity concerns as releasing information to AI tools could lead to potential data breaches, identity theft, financial fraud or inadvertent release of privileged information. We must implement robust security measures and rules of use to ensure EPA personnel and information are protected.
>Interim Decision
>While the agency works to implement security measures and rules of use, as an interim decision, EPA has blocked access to AI tools. The Agency is continuing to analyze AI tools and will follow up with a final decision.
>The interim decision to block these tools was made after careful consideration of various factors, including potential legal, information security and privacy concerns. If an employee has a compelling requirement to access AI tools, the employee can reach out to their Information Security Officer and request an exception to this interim decision. More detailed communication is forthcoming.
Personally, it makes obvious sense to me. Sensitive data being pasted into an AI tool by someone who doesn't know any better is a serious enough risk to justify restricting access for now.
I wonder if they are allowed to search on Google.
I think this is an interesting question.
I.e., to what extent does search activity reveal confidential information to an external entity (Google) that has no right to that information, nor any restriction on further sharing it?
E.g., could Google legally sell EPA search queries to companies bidding on competitive EPA contracts?
Either way, is there a clear line between using Google and using ChatGPT w.r.t. information leakage?
I think Google and OpenAI are equally likely to collect/use/sell any information you give them, so in that sense the risk is the same, but you are incentivized to give far more information to OpenAI, at least if we're comparing search and ChatGPT. Nobody is copy/pasting an entire document into a Google search (it's useless, and the search box won't accept it anyway), but ChatGPT actively encourages it.
Obviously you could also use Google for email and document storage, potentially giving them all that data, but I'm sure the EPA has separately assessed the risks of using cloud providers for those things.
This. I think the crux of the issue is that people never found copy-pasting blobs of confidential data into Google rewarding because, compared to GPT, pasting into a search box just doesn't accomplish much.
Now we're seeing improvements to search where pasting large bodies of text provides actual value, so it needs to be reiterated that copy-pasting data into a website gives that website access to the data. Although the framing of this discussion has been "pasting into AI chats is bad", it really should be "don't leak confidential information to websites, ChatGPT included".
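To make the mechanics concrete, here's a rough sketch of what pasting a document into a hosted chat tool amounts to. The browser UI isn't literally this call, but the data flow is the same: the full pasted text leaves your machine in the request body. The endpoint and payload shape follow OpenAI's public chat completions API; the file name and API key are placeholders.

    # Sketch: pasting a document into a hosted chat tool sends the full
    # text to the provider in an HTTPS request body.
    import requests

    pasted_document = open("internal_memo.txt").read()  # placeholder file

    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": "Bearer sk-..."},  # placeholder key
        json={
            "model": "gpt-3.5-turbo",
            "messages": [
                {"role": "user", "content": f"Summarize this:\n{pasted_document}"},
            ],
        },
    )
    print(resp.json()["choices"][0]["message"]["content"])

Once the text is in the request, it's governed by the provider's retention and training policies, not yours.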
Not all that surprising, nor is it really special at all. It's probably blocked in most corporate environments as well (it is at Qualcomm) along with any other tool/site where people might be tempted to paste large chunks of company info, whether it's copy, source code, etc. I discovered the other day that we have regex101 blocked! Very annoying, but understandable. Fortunately someone set up something similar internally that we are allowed to access. Of course there are always alternatives, or ways around the blocks, but it makes sense to target whatever is most popular/viral.
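For what it's worth, the blocking itself is usually unglamorous: a domain blocklist enforced at the proxy or DNS layer. A minimal sketch of that kind of check, with a hypothetical blocklist (not any real company's policy):

    # Domain-based blocking as done by a corporate proxy/DNS filter.
    from urllib.parse import urlparse

    BLOCKED_DOMAINS = {"chat.openai.com", "openai.com", "regex101.com"}

    def is_blocked(url: str) -> bool:
        host = urlparse(url).hostname or ""
        # Block the domain itself and any subdomain of it.
        return any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

    assert is_blocked("https://chat.openai.com/chat")
    assert not is_blocked("https://epa.gov")

This is also why targeting "whatever is most popular" is the practical strategy: the filter only sees domains, so the block is only as good as the list.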