Photo by: Sanket Mishra via Pexels

ChatGPT is a term that has become ubiquitous in modern education, a program emblematic of the rapidly changing field of artificial intelligence. Controversially, the program, just one year old, has found a home in classrooms across America.

As students use the artificial intelligence tool to find answers for their schoolwork, educators are searching for answers of their own about how to manage the platform's use in the classroom. Professors, universities and governments now face the challenging reality of ChatGPT in the world of education.

What is ChatGPT?

Spend a few minutes with ChatGPT and you’ll realize two things: it offers an incredible breadth of information, and its reliability is questionable. The program is a large language model; its intelligence is based not on thought but on prediction.

Photo by: David Go

“What it actually does is, given a sequence of words it predicts what the next word could be,” said Dr. Kaushal Chari, dean of UWM’s Lubar College of Business. “ChatGPT can generate new content from existing content. It is based on the large language model, and the large language model basically is trained on a sequence of words.” 
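
To make that idea concrete, here is a minimal, toy illustration of next-word prediction. It is only a sketch: the sample sentence and helper names in it are invented for this example, and ChatGPT itself relies on a neural network trained on vast amounts of text rather than simple word counts.

```python
# A toy next-word predictor, only to illustrate the idea Chari describes.
# ChatGPT itself is a neural network trained on vast amounts of text; this
# sketch just counts which word tends to follow which in a tiny sample.
import random
from collections import defaultdict

sample_text = "the cat sat on the mat and the cat ate the food".split()

# Count how often each word follows each other word.
follow_counts = defaultdict(lambda: defaultdict(int))
for current, nxt in zip(sample_text, sample_text[1:]):
    follow_counts[current][nxt] += 1

def predict_next(word):
    """Pick a likely next word, weighted by how often it followed `word`."""
    candidates = follow_counts[word]
    if not candidates:
        return None
    words, counts = zip(*candidates.items())
    return random.choices(words, weights=counts)[0]

# Generate a short sequence one predicted word at a time.
sequence = ["the"]
for _ in range(6):
    next_word = predict_next(sequence[-1])
    if next_word is None:
        break
    sequence.append(next_word)
print(" ".join(sequence))  # e.g. "the cat sat on the mat"
```

The scale and method differ enormously, but the principle Chari describes is the same: given the words so far, the model produces a plausible next word, then repeats.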

Chari was recently appointed to Governor Tony Evers’ task force on workforce and artificial intelligence. The dean serves on a subcommittee that provides the governor with informed predictions on AI and its future impact.

How ChatGPT is Being Used

Though AI technology is sure to advance, ChatGPT is already making an impact on daily work, and some professors have found useful ways to integrate it into the classroom.

“I have seen applications of generative AI in creating new text for poems or essays. From textual description, it can create images,” said Chari. “I think it is a great tool for doing a lot of creative work.” 

Photo by: David Go

Many professors see ChatGPT as useful for idea generation and information gathering – helping students with the early stages of an assignment or project. 

“It gives us a starting place to move forward and further nuance our ideas,” said Dr. Sarah Riforgiate, director of UWM’s Center for Excellence in Teaching and Learning (CETL), designed to advise professors in areas such as generative AI use. “I encourage my students to use it for idea generation.” 

“I help my students see AI and how it can be used,” said Dr. Lane Sunwall, learning and technology consultant in the CETL. “ChatGPT, and AI more broadly, can help them, so I want them to use it. I’ve encouraged them to use it.” 

“I think ChatGPT and the AI-generated responses are great ways for students to learn general information about a topic,” said Rachael Jurek, senior teaching faculty member in the JAMS (Journalism, Advertising and Media Studies) program. 

“I don’t have a problem with people starting with it and using it as the beginning of a prompt,” said Dr. Michael Mirer, assistant professor in the communications and JAMS departments. “I do think getting familiar with those tools, there’s some value to it.”

Photo by: Matheus Bertelli via Pexels

Pitfalls and Drawbacks

Despite its usefulness in education, ChatGPT must be approached with caution. Because it is a large language model, it cannot discern fact from fabrication and has a tendency to present false information. Beyond that, it sometimes invents quotations or evidence to support an argument it presents.

“I’ve seen [instructors] using ChatGPT to pull up information on a topic, and then talking directly with their students about where this information comes from and trying to locate quotes’ sources,” said Riforgiate. “[They] realize ChatGPT has completely fabricated the quotations.”

“Around 70-80% of the text it produces tends to be problematic,” said Dr. Andrew Larsen, senior teaching faculty member in UWM’s history department. “The current ChatGPT has less than minimal value.” 

Larsen added that many professors he has spoken with advocate forbidding its use in the classroom. Though his strict stance may be a minority view among professors, there is near-universal agreement that the generative AI program shouldn’t be used too liberally. Even if ChatGPT presented reliable information, Larsen said, its writing ability would be questionable under even the most optimistic view.

“[Experts I know] say no, it’s never going to be an effective tool because by definition it operates by averaging,” said Larsen. “In theory, even if you could get it to produce a reliable text, it would probably be a C-level text.” 

The Plagiarism Question

The world of AI presents a new angle on one of the most loathed words in academia: plagiarism. Presenting ChatGPT’s output as one’s own work is a new form of academic dishonesty, dressing up the product of advanced technology as original thought.

“If you are using ChatGPT to compose your synthesis, that’s plagiarism because ultimately that’s not your synthesis,” said Mirer. 

“If they just use the output and use that as deliverable, there is a problem – that should not be allowed,” said Chari. “The fine line is that they can use ChatGPT to generate ideas and get some ideas to work on their assignments. If students just turn in the output from ChatGPT and submit that as an assignment, then that would be an honor code violation, in my view.”

Because ChatGPT is trained on the creative work of countless writers and thinkers, some see its output as itself plagiarizing the work of other individuals.

“If you go to AI and you say, ‘I want you to write an essay for me,’ that’s clearly not your work – that is the work of lots of different people that have been amalgamated together by AI,” said Sunwall.

Should UWM Create a ChatGPT Policy?

Because the technology is so new and complicated, UWM does not currently have a school-wide policy on the use of generative AI in the classroom. Instead, the university offers suggestions for professors managing its use, such as making expectations clear and teaching students how to properly cite AI-generated work. With the technology’s rising prevalence come calls for a universal policy to manage its use consistently across the institution.

“I think that we have to draw a fine line,” said Chari. “As long as we use ChatGPT to generate new ideas, I’m fine with that; that is a legit use of the tool. Then, of course the students would have to do some additional work to use those ideas and build upon those ideas using their own creative thought process. The fine line is that they can use ChatGPT to generate ideas to work on their assignments.” 

“I think the university should have a policy against it,” said Dr. Eric Lohman, teaching faculty in the JAMS department. He later clarified that the policy should prohibit unrestricted use: there are uses for the technology, but there should also be limits. “[I hope we] make it something where ChatGPT wouldn’t be that attractive to students.”

“I think we need to provide guidance on how we expect our students to use ChatGPT,” said Riforgiate. “Because you’re working with very different disciplines across various schools and colleges, different students have different needs – one blanket policy is not going to get at ‘what are students’ needs?’”

Many believe that students’ needs must be front and center in the conversation about AI and education. Generally, higher education serves two purposes: to transform students into well-rounded, educated individuals, and to prepare them for professional careers, directly or indirectly. At the heart of the debate over AI in education is the reminder that college exists to prepare students for life after graduation.

“I see that there will be greater adoption of AI in all spheres of life,” said Chari. “From banks and companies to educational institutions, all adopting AI for the delivery of their services. They can also be an aide, a personal aide to people as an AI assistant helping persons. I have seen some demos of products like a personal assistant based on AI that could really help somebody, empower somebody to do things. So, I see more and more advanced AI technologies appearing in the horizon in a few years. We have to be prepared.” 

Mirer raised questions about generative AI’s applicability in the world of writing, perhaps the field most affected by the technology in its current form.

“There definitely are ad agencies and PR agencies and even journalistic agencies that are experimenting with ChatGPT tools,” said Mirer. “Ultimately, if ChatGPT can eventually write a better simple news story, we need to understand what those stories can do and don’t do.” 

Riforgiate added that the skills developed at a university hold more value than merely passing classes with good grades. 

“It’s really important that they [students] struggle through that whole writing process and improve their writing.” 

The university’s policies, whether one blanket policy or many, should account for not just the current state of generative AI, but its future. The technology never sits idle; it undergoes continuous refinement and expansion. What could the future of ChatGPT and generative AI look like in the classroom?

“ChatGPT is certainly going to be disruptive, especially with regards to how we are going to do assessments,” said Chari, who is tasked with governing generative AI’s future in Wisconsin. “We have to learn how to live in this world of ChatGPT, which would mean that the type of assessments we give out should be such that students are forced to put in their own work and effort, and not fully depend on ChatGPT. 

“I think ChatGPT could be a great tool for learning, because sometimes there are concepts that students are not able to grasp by reading something,” Chari continued. “It can be a good aid for learning, but we have to be cautious with regards to assessments – that’s where ChatGPT can be very disruptive.” 

“I think that something like AI technology could be useful as a pedagogical tool, it just so happens that right now it’s mostly being used for cheating purposes,” said Lohman. 

“We need to teach students the ethics and the fundamental ways of thinking and doing in order to make the tools work most properly,” said Riforgiate. “In fact, I think it actually makes education much more valuable.” 

Harnessing the Power of ChatGPT

Artificial intelligence has brought about some of the most advanced technology humankind has ever seen. But with great power comes great responsibility: how it is used may dictate the future of education and careers across industries.

“One of Wisconsin’s mottos is sift and winnow,” said Sunwall. “We need to sift and winnow the information that we get, because not all of it is going to be good. We’ve got to separate that chaff from the wheat. But I think there’s a danger, not about cheating but more about scrubbing ourselves out of the work. If you read things like the founding documents of the United States, AI would certainly change some of the run-on sentences or phrasing. But that’s the poetry – that’s why we read it or come back to it.”

Properly managed, AI is a force for good that can produce a world of possibilities, many previously unimaginable. Approached without discernment, it could cause more harm than good in the world of education. At the heart of the debate is its impact on our humanity.

“I think that we need to keep that in mind, that our poetry, what we write isn’t necessarily going to be something that is framed and put on a wall of a building, but it’s something that we can be proud of and say it’s our own,” said Sunwall. 

“If we keep using AI, it’s really not us anymore.” 
