
Art by Mariah Mapa
Transparency Item: The Perspectives section of the Graphic is composed of articles based on opinion. This is the opinion and perspective of the writer.
Some may expect to catch students frantically jotting down answers from ChatGPT mere minutes before an assignment is due.
These large language models (LLMs), along with other generative AI tools, are commonly used by students. In 2025, 92% of students reported using AI for academic work, according to Programs.
This is far from the most jarring statistic on AI in education: In 2025, 53% of education leaders reported using AI daily to assist with their jobs, according to Microsoft.
With a majority of education leaders using AI, the question arises: Should professors be involving AI in college classes?
Generative AI can pump out every image needed for a cohesive slide deck, and LLMs can supply polished text to go along with it. Teachers are estimated to save up to six hours per week by using AI for their jobs, leaving them with much more time for leisure, according to Gallup News.
Such convenience doesn’t come without a cost, considering “a typical AI data center uses as much electricity as 100,000 households,” according to the International Energy Agency.
AI usage also comes with significant water consumption. The growing data centers used to train AI models require copious amounts of water to cool their large computer systems, according to the Environmental and Energy Study Institute (EESI).
Larger data centers that train AI consume up to 5 million gallons of water a day, according to the EESI. This consumption is suggested to have contributed to the current state of "water bankruptcy," according to The Week.
Regardless of the arguments for and against AI, it is growing ever-present in academic settings and becoming harder to pass off as a minor adjustment.
Despite these tradeoffs, some professors have opted to use AI for lesson plans, lectures, homework rubrics and more, while others maintain an anti-AI stance in both their own work and their students'.
Only 5% of university students said they feel fully informed about AI regulations and policies at their university, according to the Digital Education Council.
In this supposed "age of AI," we should hold educators accountable for defending their policies on AI usage going forward. With AI looming so large over our technological world and environment, an educator's lack of curiosity is often passed down to their students as well.
Choosing whether or not to use AI in the classroom without openly explaining that choice is another common fault. Students need these conversations about topics as pervasive as AI, and having them can lead to an increase in awareness.
A shocking 64% of Americans are unaware when they are actively using AI, which can lead to under-utilization or even cybersecurity threats, according to Gallup News.
In regard to cybersecurity and AI cognizance, ChatGPT is not just a helpful tool for getting students and teachers through the most grueling parts of their obligations. Its parent company, OpenAI, recently partnered with the US Department of War in a contract that allows the Pentagon any lawful use of its AI systems, according to The New York Times.
This leads many to wonder what future intent OpenAI has with its consumer base, and how secure users’ data really is or will become.
Educators must take a stance and express it openly when given the opportunity to foster curiosity and growth, not just in principle but in response to the constantly changing landscape of technology.
Educators should urge their students and colleagues alike to do their own research and to talk openly about this new era of AI. As the world undoubtedly moves forward, we must not let major advancements simply pass by without a second thought.
___________________
Follow the Graphic on X: @PeppGraphic
Contact Karina Guzman via email: karina.guzman@pepperdine.edu or by Instagram @wdymusernamealreadyexists
