BBB Tip: Here’s how to protect yourself using ChatGPT

By Better Business Bureau. March 8, 2023.
[Image: AI Messaging (Getty)]

Are robots coming to steal your time, money, and maybe even your heart? Recent technology advances have created new AI-related security threats, and while they don’t look like what you’ve seen in the movies, you might be at risk. 

Artificial intelligence has taken mind-boggling leaps forward over the past few months, with tools like ChatGPT making headlines around the globe. AI can create useful content, but it can also generate danger in a whole new way. We're not talking about robot armies marching in lockstep to annihilate mankind. Technology isn't out to get us, but some humans are, and ChatGPT is a shiny new tool in some criminals' arsenals.

In this article, we’ll explore how scammers and cyber criminals use ChatGPT to harm, then provide tips on protecting yourself.

What is ChatGPT?

ChatGPT is a conversational AI model developed by OpenAI that can mimic human language and generate coherent and natural responses to text-based input. It’s designed to learn from massive amounts of online data, soaking up what’s available on the web and then recycling information and phrasing to respond to questions. 

So far, it’s been used for various legitimate purposes, like chatbots and language translation. However, scammers and cybercriminals have also taken advantage of AI’s capabilities for malicious purposes.
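
To make the "chatbot" use case concrete, here is a minimal sketch of how a developer might send one conversational turn to an OpenAI model from Python. It assumes the `openai` package is installed and an API key is set in the OPENAI_API_KEY environment variable; the model name and prompts are illustrative assumptions, not a description of any particular deployment.

```python
# Minimal sketch: one conversational turn with an OpenAI chat model.
# Assumes: `pip install openai` and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model name
    messages=[
        {"role": "system", "content": "You are a helpful customer-service assistant."},
        {"role": "user", "content": "What are your store hours?"},
    ],
)

# The model returns fluent, human-sounding text -- the same quality
# that scammers exploit to make fraudulent messages look legitimate.
print(response.choices[0].message.content)
```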


Potential risks to look out for

Scammers use ChatGPT to make their malware threats, phishing attempts, and fake profiles more convincing and interactive. By generating well-written copy and fast, believable responses to victims' messages, scammers using ChatGPT can create the illusion of a real person on the other end of the conversation. This can make it harder for victims to identify that they’re being scammed. Beware of these potential scams when using artificial intelligence.

  • Phishing scams. Scammers use ChatGPT to create believable phishing scams that mimic legitimate organizations like banks, social media companies, or government agencies. The scammer sends an unsolicited message to the victim via email or messaging app, then prompts them to click a link or provide personal information.  The message can appear legitimate since ChatGPT can generate compelling messages modeled after content from the actual bank, government agency, or other organization. However, clicking the link or providing personal information can lead to identity theft or financial loss.
  • Impersonation scams. Scammers can use AI-generated content to impersonate people like a boss, co-worker, or family member and convince the victim to provide sensitive information or transfer money. The scammer generates an articulate and free-flowing conversation that appears to be from the person they are impersonating, making it tough for the victim to detect the fraud.
  • Malware and viruses. Cybercriminals can use ChatGPT to spread malware and viruses. AI allows them to produce a conversation that appears to be from a legitimate source like a friend or colleague, then prompts the victim to click on a link or download a file. Once the victim clicks the link or downloads the file, their device can become infected with malware or viruses.
  • ChatGPT romance scams. Are you sure it’s a human causing your heart to flutter during online interactions? Romance scams are a type of fraud where scammers create fake profiles on dating websites or social media platforms to establish romantic relationships with their victims. They then use this relationship to gain the trust of their victims and eventually convince them to send money or provide personal information.
  • Purchasing scams. Bad actors are using ChatGPT to trick people into buying fake goods. They create a conversation that appears to be from a legitimate seller, then convince the victim to pay for products or services, usually through a digital funds transfer. However, the goods are either fake or nonexistent, so the money victims send is often a permanent loss.


How to protect yourself from cyber criminals using ChatGPT

Here are several tips on how to protect yourself from scammers and cyber criminals using ChatGPT:

  • Be cautious of unsolicited messages. Use extreme caution if you receive a message from someone you don't know. Don't click on included links or provide requested personal information. If the message appears to be from a legitimate organization, such as a bank or government agency, contact that organization directly to confirm its authenticity.
  • Verify the identity of the person you're chatting with. Verify someone's identity whenever you're chatting with them online, especially if they're asking for personal information or money. Ask for their contact information, like a business email address or personal phone number, and confirm it's legitimate. If possible, communicate through a secure and verified messaging platform rather than an unsecured one.
  • Scrutinize text. AI-written text often repeats the same words, uses short sentences with unimaginative language, and avoids idioms and contractions. Content may also include implausible statements. If something seems off, trust your gut.
  • Use two-factor authentication for your online accounts. Two-factor authentication adds an extra layer of security to your online accounts by requiring an additional code to log in, usually sent to your phone or generated by an authenticator app. This makes it more difficult for scammers to access your accounts even if they manage to steal your password (see the first sketch after this list for how those one-time codes work).
  • Use a password manager to generate and store strong passwords. Don't use your dog's name and your birth date, and avoid reusing the same password everywhere. We know that gets complicated, but password managers help: they generate and store strong, unique passwords for your online accounts, making it much harder for scammers to gain access (a sketch of random password generation follows this list).
  • Be cautious when downloading files or clicking on links. Use care, especially if files or links come from an unknown source. Scammers can use ChatGPT to generate convincing messages that appear to be from a legitimate source, such as a friend or colleague. Verify the source before clicking on any links or downloading any files.
  • Use caution when talking to strangers. To avoid falling victim to a scam, be cautious when engaging with people online, especially those who seem too good to be true. Be wary of anyone asking for money or personal information, especially if you've never met them. Watch out for anyone who refuses to video chat or meet in person, because this could be a sign that they are not who they claim to be. If you suspect you may be a victim, stop communicating with the scammer immediately and report the incident to the authorities.
  • Educate yourself on the latest scams and fraud tactics. Educate yourself so that you can recognize scammers' schemes and protect yourself. Look up the latest scams and read reports on how others have been targeted by using BBB Scam Tracker.
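
As mentioned in the two-factor authentication tip above, many authenticator apps use time-based one-time passwords (TOTP). The sketch below uses the third-party `pyotp` Python library to show roughly how such a code is generated and checked; the secret and library choice are illustrative assumptions, not a description of any particular service.

```python
# Illustrative sketch of time-based one-time passwords (TOTP), the scheme
# behind many authenticator apps. Assumes: `pip install pyotp`.
import pyotp

# A shared secret is set up once between the service and your authenticator app
# (this value is generated here as an example, not a real credential).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Your authenticator app shows a fresh 6-digit code roughly every 30 seconds.
code = totp.now()
print("Current one-time code:", code)

# The service verifies the code you type in; a stolen password alone
# is not enough without this second factor.
print("Code accepted:", totp.verify(code))
```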

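For the password-manager tip, the idea of a "strong password" can be illustrated with a few lines of Python's standard `secrets` module. This is only a sketch of how a manager might generate a random password; real password managers also store and fill those credentials for you.

```python
# Sketch of strong-password generation using only Python's standard library.
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Return a random password drawn from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# A unique password like this for every account is far harder to guess
# than a pet's name plus a birth date.
print(generate_password())
```
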
Scammers and cybercriminals are using ChatGPT to trick people into giving away their personal information and money, and they'd love to take you for everything you've got. However, staying cautious can protect you and your data, even from AI-generated threats.


For more information

Learn more about protecting yourself against cybersecurity threats on BBB's Cybersecurity HQ.

Check out BBB's Scam Tips to stay updated on the latest scams. Learn more about impersonation scams, phishing scams, and social media scams.


BBB of Central East Texas contributed this article.