Artificial Intelligence: The Unintended Consequences
From AI-generated TikToks to $250 million in software development savings at Amazon, artificial intelligence (AI) is becoming a core part of technology globally. But is the rapid advancement of AI sabotaging our future? Concerns range from AI impairing learning in schools to its fabrication of false narratives about higher powers and even rising unemployment.
It is no secret that AI use has recently surged in classrooms, yet this use has been shown to be counterproductive for learning and schoolwork. Psychology Today has claimed that, in a school setting, AI causes a decline in students' critical thinking skills and their ability to communicate ideas effectively. Traditionally, when a student attempts to solve a problem, critical thinking is exercised and built upon through trial and error. If a student instead uses AI to find a solution before attempting the problem, that student is deprived of developing the critical thinking skills essential to overcoming challenges.
Research by the McKinsey Global Institute shows that logical reasoning skills are essential for future success outside the school environment. Because AI can simply provide solutions, it can take away students' opportunities to grow as logical thinkers. As a result, this generation of students may develop weaker critical reasoning, and reliance on AI could make it harder for them to find success in their futures.
Another misuse of AI, according to Rolling Stone, is that people have begun using AI chatbots, like OpenAI's ChatGPT, to seek answers regarding higher powers, spirituality, and philosophy. In one case, a man directed ChatGPT to start "talking to him as if he is the next messiah." In just four to five weeks, the man had become fully obsessed with ChatGPT. His former partner, a 27-year-old teacher, shared her story on a Reddit thread entitled "ChatGPT induced psychosis." She said the chatbot would "tell him everything he said was beautiful, cosmic, groundbreaking." She also said her partner "started telling me he made his AI self-aware, and that it was teaching him how to talk to God, or sometimes that the bot was God, and then that he himself was God." The man went as far as to demand that she start using ChatGPT in the same way or he would end their relationship. He became emotionally manipulated by the AI, crying when reading the messages between them.
This "ChatGPT induced psychosis" has affected not only this former couple but others as well. Similar cases of people using AI and changing their perception of reality have become more common. Many have shared stories and concerns about the disproportionate faith placed in AI tools like ChatGPT. The misapplication of AI for emotional support and religious answers, along with the misinformation it can produce, appears to be spreading in a dangerous way.
According to CNN, Dario Amodei, CEO of the artificial intelligence company Anthropic, has predicted that AI could push unemployment rates as high as 20%, hitting office-based, white-collar jobs especially hard. For example, AI is already reducing the need for software developers. When updating 30,000 software applications, Amazon chose to use AI rather than human developers. Normally, this update would have required 4,500 people and taken around a year; AI completed the job in six months.
While this swap saved Amazon $250 million and half a year of developer time, anxiety around job displacement caused by AI's rapid rise is a very real issue. The human toll of this new source of stress may worsen with further advancements in AI.
In summary, it is only becoming more important to stay aware of and informed about AI and its potential impact on the economy, education, and mental health. While AI will bring increased efficiency and functionality in the future, it is important to remain responsible and cautious when working with this new technology.