If you are using free AI tools to speed up work, cut costs, or solve problems fast, it is worth asking what those tools may be taking in return. I was reminded of this at our recent PowerUp Americas conference in San Antonio by our keynote speaker, Lance Haun. The old rule still applies: if you are not paying for the product, you may be the product. And in the AI era, don’t forget the two caveats: (1) you’re about to become the product, and (2) you’re about to start paying for it anyway.

That idea becomes far more serious when employees paste sensitive company information into free versions of LLMs and other AI platforms. A quick prompt can expose client data, internal strategy, financial details, source code, or confidential documents in ways your business never intended. What feels like a harmless shortcut can create real legal, operational, and reputational risk. You may not realize you are uploading information to external servers that can be accessed by a wide range of users, or that your content may wind up being used to further train the tools, for everyone, not just you.
Here are some ways you could accidentally create exposure:
- Uploading contracts or proposals (yours or your clients’) and asking ChatGPT to tweak, summarize, or explain the contents.
- Uploading resumes or job descriptions and asking for summaries, matching, etc. This could be especially risky in the EU, where both GDPR and the EU AI Act have strict compliance requirements.
- Uploading email or text threads to draft responses.
- Uploading your retained search process to create a marketing campaign.
These risks are *on top* of the more obvious ones surrounding bias, hallucinations, links to sources that don’t exist, and other headaches. Beyond simply being careful, what steps can you take to reduce the risk of exposure? Lance shared a list of questions to ask your vendors, including:
- Who owns the candidate data I input?
- Does the vendor use my inputs to train their models?
- Where is data stored and for how long?
- What happens to my data if I cancel?
- If the tool is free or low-cost, what is the actual revenue model?
- Who are the sub-processors handling candidate data?
Other ways to protect yourself:
- Use paid/enterprise versions that offer better safeguards
- Educate all of your employees and contractors about the risks
- Have a clear policy that prohibits sharing company data and confidential documents in free versions of AI tools
Use of AI tools is only going to increase. Small business owners must be aware of the risks and weigh whether a free tool is worth the exposure. Regular training, monitoring, and usage audits can help mitigate those risks, especially when managing a remote team.