AI harms human integrity

In a world where AI advancements are making remarkable progress, many are left wondering if their job security is at risk. With machine learning algorithms capable of performing tasks once thought to require human skills, the threat of AI taking over the job market is no longer a distant possibility, but a looming reality.

The lead above was actually written by ChatGPT, an AI that writes based on a given prompt. One could argue that AI writing is getting extremely close to human writing, but it isn't just close. It's already there.

Measures need to be taken to ensure the integrity of human cultures before they are muffled by an onslaught of artificiality.

In fact, AI adoption has more than doubled since 2017, according to "The state of AI in 2022-and a half decade in review" on www.mckinsey.com.

The human race is becoming far more dependent on AI, and the only thing keeping artificial intelligence from being human is its digital existence. Still, some artificial intelligence robots, like the Hong Kong-designed Sophia, could easily pass as human if not for their rather inhuman appearance.

Amazon Fresh, Amazon Go and Walmart are now promoting a more technology-reliant grocery shopping experience, which takes away the role of human cashiers. While those who spearhead these companies pocket more money for themselves, they are robbing workers of jobs through the implementation of technology and advancements in AI.

Currently, the only thing stopping artificial intelligence from independent thought is its need for human input. When artificial intelligence begins to feed itself ideas, it will fully realize independent thought.

Organizations like OpenAI promote a business model of free-access AI writing and art with their premier artificial intelligences, ChatGPT and DALL-E 2, respectively.

Students are using ChatGPT to fake their writing, which violates academic integrity. These students are not cultivating their own minds, but rather feeding another prompt into ChatGPT's text generation.

In the study of history, civilizations are defined by what they write, build and draw. When archaeologists dig up the humans of the year 2023, they will most likely mistake AI creations for human work.

AI is ruining humanity’s integrity as a whole. While AI does the work, humans are not encouraged to think for themselves, and the human race becomes duller for it.

Thus, in advancing AI, people are not advancing themselves mentally and creatively because they are claiming AI’s work to be their own.

On top of that, artificial intelligence often learns its behaviors by taking art and writing in unethical ways. Art models like Stable Diffusion pull from a wide variety of copyrighted artworks in order to teach the AI to better create art.

AI is learning to copy art and writing in order to work. Stable Diffusion, for instance, is trained on 2.3 billion images in order to make its art.

Sites like creator.nightcafe.studio monetize Stable Diffusion's implementation, only permitting a user to complete so many artworks before needing more credits to do so. Many of the images the AI learned from carried watermarks because they were locked behind payments.

In fact, Getty Images is suing Stability AI for illegally scraping millions of photos off its website without adhering to copyright. This could be the start of a wave of photographers and artists claiming their art was stolen and shutting down the sites creating it.

One might say that human artists also study copyrighted works in order to learn and become better artists, but human artists are not blatantly reproducing the Getty watermark on their pieces. Human artists are not infringing on copyright when they parody artwork in their own sketchbooks, while Stable Diffusion simply takes the image and repurposes it into another.

AI is becoming too integrated with human culture and needs to be stopped. People must stop using AI for legitimate work because they are only really hurting themselves.