AI revolution in higher education: College professors navigate new territory

As artificial intelligence tools like ChatGPT and Claude evolve, so do universities’ responses to the digital shift. Institutions are quickly developing new strategies to prepare students for an AI-driven future, from reshaping curricula to appointing AI-focused roles.

Florida Atlantic University is among several state universities already grappling with how best to use AI in the classroom. Some faculty have fully embraced the burgeoning technology, while others remain reluctant because of ethical concerns. In any case, they recognize that AI will be an integral part of campus life in the years to come.

James Capp, associate vice president of Strategic Planning & Student Success at FAU, estimates that between 2,000 and 3,000 students are enrolled in AI-related courses at FAU.

AI in the classroom 

Professors are at a crossroads with AI tools like ChatGPT: some see them as allies in teaching, others as threats to learning.

In one FAU classroom, AI philosophy expert Garrett Mindt turned ChatGPT into a teaching tool, having students analyze AI-generated essays to uncover both their brilliance and their flaws. 

Down the hall, computer science professor Sven Thijssen takes the opposite approach in his AI course — ironically banning his students from using the very technology they’re studying. 

“I think once students were able to see the cracks after they learned more about the topic, they kind of see some of the deficiencies, but then also things that it did well and what the limitations are of it,” Mindt said regarding his assignment to critique AI-generated essays. 

But Mindt’s openness toward AI in the classroom has its limits. He remains hesitant to use AI in his own writing and research and prohibits AI usage for most of his courses, as he wants students to provide organic perspectives. 

“The thing I want people to do is sincerely engage with the content of the course and develop their own critical points,” he said, referring to his writing-intensive Intro to Philosophy course. “I think it can be very easy to slip into this state where if you outsource too much of the process of writing about philosophy to something like a large language model… you might lose out on that opportunity to just think about what your own opinion is on a topic.”

Meanwhile, Thijssen stands firm in his complete rejection of AI tools in his Introductory AI course.

“I’m not keen on students using ChatGPT,” he said, adding that whatever the technology returns should be “taken with a grain of salt.” “The students have to learn and not try to use a tool. [AI tools] are not as strong as we think they are.”

Capp explained that faculty members set their own rules for using AI on assignments, which are outlined in their syllabi. FAU provides three standard options for instructors to include: “AI encouraged,” “AI flexible,” or “AI prohibited.” More details can be found on FAU’s AI Syllabus Statement page. Not following these rules could violate FAU’s Code of Academic Integrity.

The evolution of AI, university responses

Beyond the individual classroom debates, universities are racing to implement AI at a broader institutional scale. 

FAU’s Center for Online and Continuing Education and the Office of Information Technology have launched a new resource called the “AI PLAIGROUND.” The platform offers live sessions on using Microsoft Copilot, a generative AI tool, and includes a course called “AI Chronicles: Building Fluency In Generative AI.”

The University of Florida has taken a similar approach, launching an initiative that aims to recruit 100 faculty members dedicated to AI research.

According to Joel Davis, a business professor who joined UF as part of this initiative, more than 12,000 students at the university now take classes on AI.

Davis has 25 years of experience in the analytics and AI industry and is currently teaching AI courses to business students. He said he uses the free version of ChatGPT in the classroom because it’s the generative AI tool most students are familiar with.

He occasionally provides a chatbot with an outline or syllabus and asks it to identify gaps in the course material or suggest ways to enhance it. Davis also allows students to use generative AI in almost all of his courses and assignments, except for exams, under one condition: they must disclose if and how they’ve used it.

“The goal here is to really democratize the use of these particular tools and technology and also develop… AI literacy across our student population,” he said. 

California State University, Sacramento appointed Alexander “Sasha” Sidorkin as its chief artificial intelligence officer and created The National Institute on Artificial Intelligence in Society, a research center focused on understanding AI’s impact on society.

Sidorkin’s role has three main functions: helping faculty adapt to AI in their teaching and assignments, speaking to various groups about AI and making his institution more efficient through the technology.

He also runs weekly workshops for faculty and will implement an “AI Champions” program in the coming academic year. Each college will have a designated faculty member working with colleagues on AI implementation strategies. 

“I use [AI] every day, many times a day, for almost everything I do,” Sidorkin said. 

Paul Marty, an information and technology professor at Florida State University, drew parallels between the current AI revolution and the emergence of the World Wide Web in 1989.

“When ChatGPT was released in November of 2022, suddenly a bunch of people were like ‘Wow look what we now have access to’ and ‘look what the general public has access to,’” said Marty. “As a result, a lot of people were like ‘Well now we really have to think about how this is going to transform education because now we have a tool that’s been made available to everybody that we’re really not prepared to handle.’” 

Marty said the speed with which the public adopts technology can pose a problem for institutions, which aren’t as nimble. That pace, in his opinion, can deprive colleges and universities of the time to consider how best to make use of technological advances.

“Every new technology that’s introduced gets adopted faster and faster,” Marty said. “So as a result, we have less time to think about how our institutions, and particularly the institution of higher education, has to evolve to keep up with this changing technology.”

Cheating and student responses

The rapid rise of AI has forced professors to rethink how they prevent cheating in their classes.

Garret Merriam, a philosophy professor at California State University, Sacramento, has moved away from traditional essays to oral exams that resemble conversations, fearing that students might use AI to short-circuit the writing process.

“Worst case scenario [the students] use AI to put their notes together,” Merriam said. “But even then, if they can’t think on their feet and respond in real time, then I know that they don’t really understand the material and they’re not really thinking for themselves.” 

Merriam conducted an experiment to catch students cheating on one of his final course exams by uploading wrong answers to Quizlet, an online tool designed to help students study, or in some cases, cheat. By tracking which students reproduced the planted wrong answers, he caught 40 of his 96 students cheating on an 80-question test.

“I didn’t get into this job to be a cop,” Merriam said. “I don’t want to be spending all my time and my efforts policing my students and forcing rules on them. There’s a difficult balance between trusting students, letting them run with the new tools at their disposal and finding ways to use it that are ethical and constructive.”

Merriam devotes some class time to talking about the ethics of AI. He occasionally uses this technology to help him come up with quiz questions and advocates for using it as a tool rather than a solution. 

“I tell all of my students before they have to write a paper that I’m interested in something that only they can give me, that no machine can give me,” Merriam said. “I want to know what you think. And if I wanted to know what the machine thought, I could just ask the machine myself.”

However, some educators are shifting their focus from policing to education.

“I think faculty members historically lean too far on the side of ‘I’m trying to catch people who are not doing the right thing.’ If I’m spending my effort policing something, then I’m not spending my effort teaching something,” said Davis.

Sarah Segall, a psychology major at the University of California, San Diego, is conducting a research project examining how both students and teachers use and perceive AI.

Her research has found that students primarily use ChatGPT for brainstorming, or for clarification when they run into a concept they can’t grasp. Segall, who uses AI daily, said her use of ChatGPT is especially effective when subjects are difficult to understand.

“I’ll turn to it to elaborate on concepts. If I have a lecture slide that I didn’t know what the professor was talking about, I’ll take a screenshot, I’ll put it in and ask it to explain it for me,” Segall said. “It’s been really helpful for studying and generating examples for me.”

Segall said she was surprised that students admitted to submitting AI-generated work as their own.

Using AI as a tool 

Merriam believes universities should prioritize faculty development to help those who are unsure about how AI should impact their classes. He also thinks faculty should allow their students to use the available AI tools, with some parameters.

“We need to prepare students for the world in which they’re going to live, and a world where they are going to live is a world in which artificial intelligent writing assistants will be ubiquitous,” he said. “So, it’s foolish to stick your head in the sand and pretend that these things don’t exist, and tell your students to never use them.”

At the beginning of the calendar year, Marty ran a series of faculty workshops at FSU specifically designed to change perspectives about AI. Professors learned to use generative AI to help draft reference letters tailored to a student’s achievements and the specifics of a given job. They also used the technology as a brainstorming tool. 

“For the vast majority of faculty who came into that workshop, the only thing they knew about ChatGPT was that this is some new tool that students were going to use to cheat,” Marty said. “My entire goal with that workshop was to not talk about cheating at all. And instead, spend 90 minutes showing all the things that you can do with generative AI to improve your teaching, to improve your research that has nothing to do with cheating.”

Sidorkin uses AI to analyze student discussions and identify patterns in areas where students may not fully understand the material. Its nimble nature makes for an effective classroom tool, he said.

“[AI] allows to adjust instruction as we go, which is very difficult to do with just human memory and human mind,” Sidorkin said. 

Sidorkin uses both ChatGPT and Claude, another generative AI tool with similar capabilities, for different purposes. He explained that Claude has a larger context window, allowing it to process more information at once than ChatGPT. ChatGPT, however, has direct internet access that lets it browse the web, a feature Claude lacks.

“If I want knowledge, I’ll probably go to ChatGPT. If I want to write and analyze, then I’ll go to Claude,” Sidorkin said. 

Davis said generative AI can be used as a “non-threatening personalized tutor.”

“It might be that [students are] shy, or they don’t want to come and talk to me in office hours because they’re worried how that will look if they ask a stupid question. They can ask a stupid question to the machine and the machine doesn’t care, and they’ll still show them how to get to that answer,” he said.

Sidorkin also sees personal tutoring as a use for AI, one that could eliminate the wait students often face when trying to get academic support services at their institution.

“Of course, unlike tutors, AI is available 24/7,” Sidorkin said.

Marty also mentioned that generative AI can help students create practice quiz questions. He used the technology himself to brainstorm ideas for his book cover.

“That’s what I find to be most valuable for me and my work, that if I’m stuck trying to come up with an idea, I can ask any number of AI tools…” Marty said. “Give me 100 ideas for something, and it’ll just do it. I’m not saying that you’re going to take whatever the AI came up with and just use it without changing it, but when you read through 100 randomly generated ideas, it’s going to get your own creative juices going.” 

When ChatGPT debuted to the public in 2022, Sidorkin began writing a now-published book about how universities can use AI tools to boost student success. His view is that faculty should embrace these tools, and he has few ethical concerns about their doing so.

“All this exaggerated talk about ethics I think is really misguided,” Sidorkin said. “There is nothing unethical about using AI. It’s not something shameful. In fact, you should be proud that you have those skills.” 

Looking forward 

Merriam said the tricky part will be ensuring that students know what the acceptable uses of AI are.

“A number of students who I’ve caught cheating have made the argument that they didn’t realize what they were doing was cheating,” he said. 

Marty pointed out that academic integrity was a concern well before the rise of generative AI. 

“Even 10 years ago, how do you know the student wrote the paper? They could have paid someone to write the paper for them. There were always essay mills out there,” Marty said. “All we’ve done now is, in essence, put those essay mills out of business.”

Robert Ives, a special education and statistics professor at the University of Nevada, Reno who does research on academic integrity, emphasized the need for clear guidelines that may vary “from class to class and assignment to assignment.”

“These applications are going to be changing the way that we do a lot of things in higher education,” Ives said. “I think it would be naive to suggest that we could somehow cloister higher education as an area in which we cut that off. I think that’s both naive and counterproductive.”

Sidorkin believes the integration of AI into education will be gradual.

“It’s not going to be overnight,” he said. “It will probably take 10 years for people to fully embrace AI, but it will make us more efficient if we’re smart about it.”

Marty encouraged students to learn how to use this technology before they enter the professional world.

“If you want a job, if you want a career, you have to be better than the AI, which means you need to know how to use the AI as a tool and then go beyond it. And that’s what we have to help our students learn how to do in higher education.”

The challenge, for some professors, lies in striking a balance between leveraging AI’s benefits and ensuring students develop the critical thinking skills necessary for their future careers.

“All of these tools make critical thinking all the more important than it ever was,” Marty said. “You can’t trust your own eyes, you can’t trust your own ears, you can’t trust what you read. We have to all develop critical thinking skills to survive a world where all of this content, all of this information can be generated on the fly in a way that looks extremely accurate, but may not be.”

Laurie Mermet is the Student Life Editor for the University Press. For information regarding this or other stories, email lmermet2022@fau.edu or DM laurie.mmt on Instagram.