Credit: Liz Ridley / Mustang News

Landon Block is a political science junior and the Opinion Editor at Mustang News. The opinions expressed in this article do not necessarily reflect those of Mustang Media Group.

When I check my class syllabus at the start of every new quarter, I look for three things: the final exam date, the percentage I need to earn an A and, more recently, the professor’s AI policy. Each professor and course is different; I’ve had classes where AI was enthusiastically encouraged, and others where it was outright banned.

That’s why I was so surprised to see an announcement in my inbox from the CSU Chancellor’s Office on Feb. 4 detailing our new status as the “nation’s first and largest AI-powered public university system.” 

To me, this reads as an unprecedented systemwide endorsement of AI tools like ChatGPT. 

Don’t get me wrong, I’ll be taking advantage of a free ChatGPT-4o subscription just as quickly as everybody else. At the same time, I’m worried about the dangerous precedent of hooking students on AI tools, seemingly without input from faculty.


Before I dive in, I want to be clear that I understand the profound power AI tools can bring students. As an ethics, public policy, science and technology minor, I’ve spent the past three years studying AI and its impact on society. In my research with the Cal Poly Ethics + Emerging Sciences Group, I’ve contributed to reports on generative AI in entertainment and AI risk assessment frameworks. And, of course, I’m a student who’s used AI tools to help me with my assignments. 

Professors deserve to decide what is best for their classroom environment. Yes, some may be sticklers about new technology in general. But we can’t ignore the underlying message nearly every hesitant professor will tell you: if we’re all relying on AI tools for our assignments, we’re not developing key skills, and our $60,000 degrees begin to lose their value in the eyes of employers. 

One reason I prefer the free versions of AI tools is that they cut off access to premium models after a certain amount of usage. It’s a reminder: ‘Hey, remember you have a brain and critical thinking skills; you don’t need me for everything.’

It’s almost like having the answers at the back of the textbook. At first, you use them to check your work or to get unstuck. After a while, it’s hard to resist flipping to the back any time you hit an inconvenience. 

The free models act as a roadblock to stop you from flipping back too many times, but a free premium subscription provides no external motivation to struggle in search of a solution. 

I’m not against students using ChatGPT, Claude or NotebookLM to help them fill in the gaps between lectures. I use these tools, too. I am, however, hesitant to encourage students to rely on AI tools to finish every assignment. Moreover, I am afraid this announcement will undermine professors’ individual AI policies. 

Patrick Lin, the director of the Ethics + Emerging Sciences Group and a philosophy professor at Cal Poly, told Mustang News that neither his team nor, seemingly, faculty at large was brought into this decision. 

“At Cal Poly, we’re internationally known for our work in AI and other technology ethics, so it’s a bit mind-boggling why CSU wouldn’t even consult with their own in-house experts for free,” Lin said.

An email sent out by the California Faculty Association also indicated resistance to the announcement.    

I’m sure the Cal State administrators put thought into this decision. Maybe they truly believe that giving unfettered access to premium AI tools will only benefit students. But this comes off as an overriding approval of AI, regardless of what individual professors want for their classrooms. 

There are also serious privacy and environmental concerns about introducing AI at such a large scale. These are real fears I share and could discuss at length, but they aren’t my main concern today. AI developers need to focus on those issues; Cal State administrators should focus on enhancing our education. 

Simply put: the Cal State system is unjustly inserting itself into what should be a classroom-to-classroom dynamic. If professors want to encourage AI use for their assignments, great. If students decide on their own that the benefit is worth $20 a month, great. But it is not the Cal State system’s place to provide these tools to students without meaningfully consulting faculty. 

The administration owes more deference to the faculty who make our universities as great as they are. The textbook puts the answer set in the back for a reason: relying on it too much defeats the purpose of learning foundational skills. This is a disrespectful move that undermines tens of thousands of faculty members’ ability to implement their own classroom policies.

Landon Block is the Opinion Editor for Mustang News. He started in journalism as a guest contributor to his high school paper, the SDA Mustang, and has since joined the San Diego Union-Tribune as a Community...