“There’s a lot of cheap knowledge out there. I think this could be a danger in education, and it’s not good for kids,” said one educator of OpenAI’s viral chatbot.
OpenAI’s latest chatbot, ChatGPT, can write elaborate essays and movie scripts, debug code and solve complex math problems. Its ability to generate legible answers to nearly any question could make it a promising supplemental resource in classrooms, especially given the national shortage of teachers. But teachers worry that students will use the free and accessible tool as a Wikipedia replacement to complete homework and write assignments for them, undermining their willingness to develop skills like writing and researching.
“Students are going to think and use this chatbot as if it is a know-all,” says Austin Ambrose, a middle school teacher in Idaho. “That’s because it’s a technology that is creating these things that sound really legitimate, they are going to assume that it is and take it at face value.”
Over the past few weeks, ChatGPT has exploded in usage, with more than a million users signing up within a week of its launch. The algorithm is a language model trained through human feedback and a vast amount of public data from sources like books and articles from the internet. But just because it appears to know what it is talking about doesn’t mean the information it provides is fully accurate. For one, because it was trained on data available only up to 2021, ChatGPT can’t provide up-to-date answers. It also makes minor factual errors its training data should cover; for instance, it said the color of the Royal Marines’ uniforms during the Napoleonic Wars was blue when it was actually red. In addition, ChatGPT struggles with confusingly worded questions, which can also lead to incorrect answers.
The algorithm also has bias problems, given that it was trained on vast amounts of data pulled from the internet. It can produce racially biased content: when asked for a way to assess the security risk of travelers, it proposed code that calculated risk scores and assigned higher scores to Syrian, Iraqi, and Afghan travelers than to other airline passengers.
OpenAI’s CEO Sam Altman himself acknowledged these pitfalls in a tweet, saying, “ChatGPT is incredibly limited but good enough at some things to create a misleading impression of greatness. It’s a mistake to be relying on it for anything important right now.”
With these concerns in mind, teachers say it is more important than ever to teach digital literacy early on and to stress critically assessing where information comes from. They say the tool could also reinforce the importance of citations in academic papers.
“There’s a lot of cheap knowledge out there. I think this could be a danger in education, and it’s not good for kids. And that becomes an issue for the teacher to teach the students what is appropriate and what’s not appropriate in the search for knowledge,” says Beverly Pell, an advisor on technology for children and a former teacher based in Irvine, California.
As digital natives, students become aware of new technologies and start using them much sooner than educators can begin to understand them, education technology experts say. Natalie Crandall, a high school literacy director at Kipp New Jersey, a network of public charter schools, says it is inevitable that students will use these devices both in and out of the classroom, and that it is better for educators to embrace them and teach students how to use the technology ethically and honestly. “We have a saying in education that ‘teenagers are going to teenage,’ meaning they are going to get academically creative in terms of finding things online and using them in classrooms,” Crandall says.
However, not all curricula and school programs are designed with a cutting-edge AI chatbot in mind, Ambrose says. “A lot of times not only the curriculum but also schools and teaching programs are structured in a way where teachers don’t have the knowledge to bring in these advanced and innovative technologies,” he says.
There is a place for AI in the classroom, Ambrose says, but it would have to handle narrower tasks, such as correcting grammar or explaining a math problem, rather than the broad range of tasks ChatGPT attempts, which leaves it vulnerable to errors. “It would be great if we could create a program that would provide students guided feedback in the moment because teachers can’t always get to every kid,” he says, emphasizing that teachers, who are often responsible for a class of 30 to 35 students, could use the additional support.
Because several tools already help students cut corners on assignments, teachers often run submitted work through anti-plagiarism software like Turnitin, which checks for similarities between a student’s work and text found elsewhere. But because the databases used by tools like Turnitin don’t include answers churned out by an AI chatbot, text copied and pasted directly from ChatGPT will slip through the software undetected. Requiring teachers to compare a student’s answers against ChatGPT’s output would add another layer of work and take time away from planning lessons and giving students feedback, says Stephen Parce, a high school principal in Colorado.
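To see why generated text evades these checks, consider how overlap-based similarity detection works in general: a submission is flagged only when it closely matches documents already sitting in a reference database, so freshly generated prose has nothing to match against. The sketch below is a simplified, hypothetical Python illustration of that idea, not Turnitin’s actual algorithm; the corpus, function names, and threshold are assumptions for the example.

```python
# Simplified sketch of overlap-based plagiarism detection (not Turnitin's
# actual method). A submission is flagged only if its word n-grams overlap
# heavily with a document already in the reference corpus; newly generated
# AI text has no source document to match, so it scores low and passes.

def ngrams(text, n=5):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(submission, source, n=5):
    a, b = ngrams(submission, n), ngrams(source, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)  # Jaccard overlap of shared n-grams

def flag_plagiarism(submission, reference_corpus, threshold=0.2):
    # reference_corpus: previously indexed essays, articles, web pages, etc.
    return any(similarity(submission, doc) >= threshold for doc in reference_corpus)
```

Under this kind of scheme, a paragraph copied from an indexed essay overlaps heavily with its source and gets flagged, while a freshly generated ChatGPT answer shares few exact word sequences with anything in the corpus and sails through.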
With essays being a major component of teaching skills like reading, writing and comprehension, using a shortcut like ChatGPT could also mean that students aren’t developing these crucial skills. “With a tool like this at their fingertips, it could muddy the waters when evaluating a student’s actual writing capabilities because you’re giving kids potentially a tool where they could misrepresent their understanding of a prompt,” says Whitney Shashou, founder and advisor at educational consultancy Admit NY.
Despite online chatter about the newly unveiled chatbot potentially replacing teachers, some educators see the invention as an opportunity and a tool to incorporate into classrooms rather than something to be afraid of. Educators say the chatbot could serve as a starting point for students facing writer’s block, or as a source of examples of what an answer should look like. It could also put information at students’ fingertips, encouraging them to conduct research and double-check their facts. At the school level, teachers say it could prompt faculty to update their curricula to openly accommodate such technologies in the classroom, with ChatGPT acting as a co-teacher.
“The ultimate goal is really that we want students to grow and expand their skills,” Parce says. “There’s a need for the curriculum to continue to be reviewed and adapt to the goal of keeping pace with the evolution of different tools and technologies.”
Source: https://www.forbes.com/sites/rashishrivastava/2022/12/12/teachers-fear-chatgpt-will-make-cheating-easier-than-ever/