Artificial Intelligence is sweeping industries. Is this a good thing?
Let’s talk about A.I.
We tried ChatGPT so you don’t have to.
As SZA said, let’s talk about A.I.
In November 2022, OpenAI unveiled ChatGPT, an A.I. chatbot that can generate articles. To test ChatGPT's accuracy, we gave it a simple prompt: write an article about Megan Thee Stallion. While the result was mostly accurate, it contained a few errors, which we outline below:
Megan Thee Stallion, also known as the “Hot Girl Coach”, is a rapper, singer, and songwriter from Houston, Texas. She burst onto the scene in 2019 with her hit single “Hot Girl Summer” and has since become one of the most popular and influential female rappers in the music industry.
Megan was born Megan Jovon Ruth Pete on February 15, 1995, in Houston, Texas. She grew up in a musical family and began writing her own raps at a young age. She attended Prairie View A&M University, where she studied health administration, but ultimately dropped out to pursue her music career.
Megan first gained attention in 2016 with her mixtape “Rich Ratchet,” which she released under the name Tina Snow. She followed up with a series of successful mixtapes and singles, including “Tina Montana” and “Big Ole Freak,” which helped establish her as a rising star in the hip-hop world.
In 2019, Megan’s career took off with the release of “Hot Girl Summer,” a collaboration with Nicki Minaj and Ty Dolla $ign. The song became a viral hit, spawning countless memes and becoming a cultural phenomenon. Megan’s confident and empowering lyrics, along with her infectious flow and charismatic personality, quickly made her a fan favorite.
Megan’s success continued with the release of her debut album “Fever” in 2019, which received critical acclaim and debuted at number 10 on the Billboard 200. She followed up with her second album, “Good News,” in 2020, which included the hit singles “Savage” and “Body.”
The first inaccuracy is the claim that she dropped out of Prairie View A&M University. In fact, she graduated from Prairie View in December 2021. That was not the only error we noticed. The article also incorrectly names "Fever," a mixtape, as Megan's debut album. Her first album was "Good News" and her second was "Traumazine," which the article fails to mention at all. Outside of these two fixes, however, the article is pretty good. So what does this mean?
It means that A.I. is, as the name implies, intelligent. As the technology evolves, humans are being phased out of creative processes. Moreover, some companies are embracing A.I. in the name of efficiency.
News outlet CNET has been transparent about using A.I. to generate articles, although they explicitly state that they do not use ChatGPT and instead utilize an internally designed generator. In an explainer piece about this, CNET’s Editor-in-Chief Connie Guglielmo said they “published 77 short stories using the tool, about 1% of the total content published on our site during the same period.”
After another news outlet pointed out flaws in the articles, Guglielmo continued, CNET took the criticism to heart and conducted an audit. The audit found that the articles contained errors, including "minor issues such as incomplete company names, transposed numbers, or language that our senior editors viewed as vague."
Guglielmo did not disclose how many of the 77 A.I.-generated stories required corrections, but she committed to learning from these mistakes and to continuing to use A.I. in the future.
A.I. writing is only one of the technology's many applications. In Hollywood, filmmakers have been deepfaking actors into films they never actually appeared in. In 2019, the filmmakers behind Finding Jack drew backlash for digitally resurrecting James Dean and using his likeness in a movie without his permission. Afterward, figures such as Chris Evans and Zelda Williams (Robin Williams' daughter) called out Hollywood for the exploitative nature of deepfaking.
Deepfaking did not stop at casting dead actors in new movies. Now, deepfake pornography has emerged as well.
In these videos, people's faces are superimposed onto others' bodies to create the illusion that they are participating in pornography. The practice has drawn much-needed backlash because of the lack of consent. QTCinderella, a popular Twitch streamer, has spoken about how traumatizing it was to find herself deepfaked into a pornographic video.
She was one of many women who were violated in this way. Yet, “deep fake content is still in a legal gray area in most states, with only Virginia and California banning it outright,” according to Buzzfeed.
These legal concerns are not limited to A.I. pornography. A.I. art has raised some ethical worries as well.
Earlier this month, Getty Images filed a lawsuit alleging that Stability A.I., the company behind the A.I. image generator Stable Diffusion, plagiarized more than 12 million photos and other digital assets, according to Yahoo News. The suit comes just months after A.I.-generated artwork took Twitter by storm, prompting artists to voice their opposition for a number of reasons. One of the main ethical concerns with A.I. art is copyright law.
In response, Stability A.I. has claimed "fair use." The fair use doctrine allows copyrighted material to be used for commentary, news reporting, and other informational purposes. But artists argue that A.I. engineers are "using copyrighted works to train AI tools" and then letting users create new works derived from those copyrighted originals, which is not "fair use."
This is not the only danger of A.I. art. To cartoonize themselves, people must upload photos to these A.I. generators, which creates a potential data privacy issue. Prisma, one of the most popular A.I. art generators, notes in its fine print:
“[Users] grant us a perpetual, revocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable, sub-licensable license to use, reproduce, modify, adapt, translate, create derivative works from and transfer your User Content, without any additional compensation to you and always subject to your additional explicit consent.”
Given these concerns, artists have pushed back against A.I. art on Twitter, and at the heart of their disdain is the issue of plagiarism. In a long-form video exploring the dangers of A.I. art, YouTuber iilluminaughtii noted how the major publisher Tor used A.I. artwork for one of its novel covers. After this was revealed, Tor released a statement via Twitter.
Tor claims they were unaware that this was a computer-generated image but still moved forward with publishing the novel due to “production constraints.” For Gizmodo, Linda Codega wrote, “To base covers almost entirely on AI-generated images devalues the hard work of everyone involved—from the authors and artists to the designers, editors, and everyone else at the publishing house.”
Nonetheless, the issue with A.I. is much larger than Tor; it is a phenomenon that spans industries. As our own ChatGPT demo shows, the technology still has some kinks to work out, but there is no telling how long it will be before everything is automated.