Artificial Intelligence, commonly known as AI, has permeated our way of life. It feels like every tech corporation is pushing its latest AI tool or feature to make your life better. Unfortunately, with AI going mostly unchecked, it can easily become a detrimental tool in the wrong hands. Deepfakes in particular can be manipulative and dangerous. Here’s what you need to know about deepfakes and how to combat them.
Links:
- Report deepfakes with the Internet Crime Complaint Center
- Check out TCU University for financial education tips and resources!
- Follow us on Facebook, Instagram and Twitter!
- Learn more about Triangle Credit Union
Transcript:
Welcome to Money Tip Tuesday from the Making Money Personal podcast.
If you don’t already know, deepfakes are videos, images, or audio of someone or something that have been altered by AI. These deepfakes can be used to portray a person doing something they haven’t done or wouldn’t do. This can easily become a conduit for spreading disinformation.
For example, deepfakes can be made of politicians to make it seem like they said or did something, when in reality it was created by AI. The deepfake can make the politician say something controversial or visually put the politician in a compromising position. This can be devastating to their campaign, especially if people believe it is real.
Similarly, celebrities have been mimicked by deepfakes. Some of it is innocent, such as de-aging an actor for a movie. However, celebrity deepfakes have also been used to endorse products or politicians without the celebrity’s consent.
You don’t have to be famous to be a victim of deepfakes. If scammers can get a recording of your voice, they can make you say whatever they want with AI. They can then call people that you know and talk to them with your voice. Scammers use this technique to then scam your loved ones into thinking you’re in some kind of trouble and need money. Similarly to politicians and celebrities, your likeness can be recreated with deepfakes. If someone has images or videos of your face, they can make a deepfake of you doing whatever they want.
Fortunately, there are ways to decrease the likelihood of having a deepfake made of you, or at the very least make it more difficult for scammers to create one. Be careful about what you share online and who you share it with. Scammers need images, videos, or audio of you to create a deepfake, and the more media they have, the easier it is to make a realistic one. Only share your photos and videos with people you trust, and if you use social media, limit who can see your posts. You can also watermark your media, which makes creating a deepfake harder and makes it easier to trace who created one.
If you find deepfake content of yourself or someone you know, report it on the platform it’s hosted on. You should also report it to the Internet Crime Complaint Center. If you are the victim of a deepfake, you may want to consult legal counsel and find out what your next steps are.
If there are any other tips or topics you'd like us to cover, let us know at tcupodcast@trianglecu.org. Also, remember to like and follow our Making Money Personal Facebook and Instagram to share your thoughts. Finally, remember to look for our sponsor, Triangle Credit Union, on Facebook and LinkedIn.
Thanks for listening to today's Money Tip Tuesday. Check out our other tips and episodes on the Making Money Personal podcast.