If you think the Taylor Swift deepfakes were bad, buckle up for when the US election is in full swing.
More lies and deceit will surface, and artificial intelligence will fuel the fire.
Here we go again. Just as with the Internet and social media before AI, Mark Zuckerberg’s famous motto made Facebook one of the biggest successes in the startup world, but it also became one of the biggest threats to democracy, as we saw in 2016.
If you think 2016 was bad, 2024 could be worse.
. . .
Taylor Swift AI
It wasn’t the first time; deepfakes and AI porn have been around for the last few years. With more powerful generative AI tools available today, and as with any other technology breakthrough, porn is always among the first to adopt them, as we saw with the Internet and social media.
There is more porn available today than at any time in recent history.
. . .
With generative AI tools, deepfake content has increased, and it is no longer just the traditional deepfake, where one person’s face is swapped onto another’s body; these tools can now create deepfakes from only a few words as a prompt.
As we saw with the Taylor Swift AI pictures, nobody in their right mind would believe it was really her, but these generative tools have come a long way in producing AI porn and deepfakes that weren’t possible a few years ago.
And it will only get worse.
In a way, if not for Taylor Swift, the White House wouldn’t be this “alarmed.”
Congress has yet to act, and so have the biggest AI companies like OpenAI. It is reminiscent of the wild, wild days of social media.
“Move fast and break things.” — Mark Zuckerberg
X, formerly known as Twitter, swiftly removed the images, even though it remains a social media platform where nudity and porn are allowed.
Even though they say non-consensual nudity isn’t allowed, there is a proliferation of porn that very much appears to be non-consensual, especially in third-world countries like the Philippines.
But then, it takes a Taylor Swift AI scandal for Big Tech to take action. I guess they don’t want the White House to be alarmed.
. . .
US Elections
Recently, a jury awarded Trump accuser E. Jean Carroll $83.3 million in a defamation case she filed against Trump. It was the second case, and she won again.
In an unsealed court document, the former president misrepresented her statement from a 2019 interview with Anderson Cooper, claiming that E. Jean Carroll had said that “rape was sexy” and “indicated that she loved it.”
Not only was the context of her statement missing, but his interpretation was false.
It is a lie that was perpetuated on social media; it was even trending on Twitter.
It is very common to see spliced videos and voice recordings of popular politicians. During the New Hampshire primary, fake Biden robocalls were circulating in which the president supposedly urged people not to vote in the primary.
Again, all falsehoods.
. . .
Final words
Will Congress act on generative AI or regulate artificial intelligence? Even if it does, will that stop other countries from launching lies and deception using AI?
As the US elections loom, who knows what kind of misinformation will come out of these generative tools?
The case of Taylor Swift’s AI pictures is just the tip of the iceberg. The future of AI is in the hands of a few, and who knows where it will take us.
Thank you for reading.
—
This post was previously published on DataDrivenInvestor.
—
Photo credit: iStock