As ChatGPT becomes more popular, its impact on the education sector grows. Students are using it to produce entire essays, reports, and assignments, often seeing it as an easy way to complete work quickly. This has sparked a heated debate in universities.

Education institutions are now rethinking their approach to plagiarism. The rise of artificial intelligence tools like ChatGPT raises new questions about what counts as “original work” in academia.

Some people argue that using large language models for assignments counts as plagiarism, while others believe students should be allowed to use them for research and inspiration. The question is complex, and universities are struggling to find the right balance.

Some education institutions have already moved to ban ChatGPT entirely. Others have taken a more open approach, viewing it as a helpful research tool. The difference in responses shows that there is no universal agreement on this issue. It will be interesting to see if universities reach a consensus on ChatGPT and plagiarism.

ChatGPT and Academic Integrity

ChatGPT’s wide reach challenges traditional views of plagiarism. Plagiarism typically means taking someone else’s work and presenting it as your own. However, students using ChatGPT are not simply copying from a single source. Instead, they’re using a generative AI model that produces unique responses built from patterns in extensive training data gathered from many sources.

This raises the question: Is it plagiarism if a student didn’t take the text from a known source? Or does the fact that the text isn’t their own mean it’s still plagiarism?

Generative AI models like ChatGPT rely on vast amounts of training data, including text from books, articles, and other digital resources. When a student uses ChatGPT to complete a paper, the text generated doesn’t belong to anyone in particular.

Yet, it’s not the student’s own writing either. This ambiguity leads many education institutions to ask whether using AI tools counts as plagiarism. Is it plagiarism if the content is technically unique each time? Or is it enough that the ideas and structure are not original work?

The Role of Plagiarism Detection Tools

Traditionally, universities have used plagiarism detection tools to combat copied work. Tools like Turnitin and online plagiarism checkers compare submissions against databases of published and previously submitted text to check that the work is original.
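To make the idea concrete, here is a minimal sketch (in Python, for illustration only) of the kind of matching such tools perform: split text into overlapping word sequences, or “n-grams”, and measure how much of a submission overlaps with a known source. Real services like Turnitin index far larger collections and use more sophisticated fingerprinting, so treat this purely as an illustration of the principle, not a description of any product.

```python
# A toy illustration of n-gram ("shingle") matching, the basic idea behind
# text-matching tools. This is not how any specific product works internally.
def shingles(text, n=5):
    """Return the set of n-word sequences in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission, source, n=5):
    """Fraction of the submission's n-grams that also appear in the source."""
    sub = shingles(submission, n)
    return len(sub & shingles(source, n)) / len(sub) if sub else 0.0

source = "Plagiarism typically means taking someone else's work and presenting it as your own."
copied = "Plagiarism typically means taking someone else's work and presenting it as your own, one report noted."
print(f"{overlap_score(copied, source):.0%} of the submission's 5-grams match the source")
```

A largely copied sentence shares most of its n-grams with its source and therefore scores high; freshly generated text usually does not, which is exactly the problem described next.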

However, these detection tools struggle with generative AI content. ChatGPT and similar models create responses in real time, producing text that doesn’t appear in existing plagiarism detection databases. This makes it difficult for free online plagiarism checkers and other detection tools to flag such content.

Plagiarism detection tools will need updates to keep pace with AI-generated content. Some companies are developing new detection systems designed to identify AI-generated text, aiming to give universities accurate results that differentiate between human-written and machine-generated work. However, this technology is still developing, and universities may take some time to fully adapt their systems.
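One signal that has been explored for spotting machine-generated text is perplexity: how predictable a passage looks to a reference language model. The sketch below is an assumption-laden illustration of that idea, not a description of how any detector actually works; it assumes the Hugging Face transformers library and PyTorch are installed and uses the small GPT-2 model purely as an example.

```python
# Sketch: perplexity under a reference model as one (imperfect) signal for
# AI-generated text. Low perplexity = the model finds the text predictable.
# This is an illustrative heuristic, not a reliable or production detector.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    """Return GPT-2's perplexity for the text (lower = more predictable)."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # Passing labels equal to the input makes the model return the
        # average cross-entropy loss over the sequence.
        loss = model(**enc, labels=enc["input_ids"]).loss
    return float(torch.exp(loss))

print(perplexity("The implications of generative AI for academic integrity are significant."))
```

Even with several such signals combined, today’s detectors can misclassify text in both directions, which helps explain why universities are adopting them cautiously.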

Education Institutions’ Response to AI-Generated Content

Many education institutions have reacted to ChatGPT’s rise by revising their academic integrity policies. Some universities now state explicitly that using generative AI tools in assignments will be treated as plagiarism and may result in failing grades for students who submit AI-generated work. Others have taken a softer approach, permitting students to use ChatGPT for specific parts of their research, provided they cite sources correctly and acknowledge the tool’s assistance.

However, avoiding plagiarism in this context requires new guidelines. Students may not fully understand when or how to cite ChatGPT as a source. The concept of citing sources is well-established for books, articles, and websites, but students may not be clear on the rules for AI-generated text.

Without clear guidelines, students may commit accidental plagiarism, believing they have done nothing wrong. To prevent this, universities need to teach students how to use ChatGPT responsibly.

ChatGPT as a Learning Tool: Benefits and Drawbacks

Some argue that ChatGPT and similar tools can be helpful for students. Generative AI models allow students to explore ideas, find inspiration, and structure their thoughts. ChatGPT can guide students in understanding complex topics, providing a learning experience that feels similar to human feedback. For students who struggle with certain subjects, ChatGPT can offer explanations and help them learn at their own pace.

However, using ChatGPT as a learning tool also has risks. Students may rely too heavily on it, bypassing the learning process entirely. If students use AI tools to complete assignments, they may miss out on developing critical writing, research, and problem-solving skills.

Education institutions worry that students who rely on AI will not learn the skills they need for their academic and professional futures. This raises a key question: How can universities ensure students use ChatGPT in a way that supports learning rather than replacing it?

Training Data, Originality, and Intellectual Property

ChatGPT and other large language models generate content from patterns learned in their training data, which spans books, websites, articles, and other online text drawn from a wide range of sources. Because the model reproduces and recombines these patterns rather than forming its own ideas, it does not “think” or “create” in the way humans do.

Instead, it assembles responses based on patterns in its training data. This raises questions about originality. If a student uses ChatGPT to write a paper, can they truly claim the ideas as their own?
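To see in miniature what “assembling responses based on patterns” means, consider the toy sketch below: a tiny bigram model that can only ever reproduce word-to-word transitions it has observed in its training text. A real large language model is incomparably more capable, but the basic point carries over: the output is derived from patterns in other people’s writing rather than from ideas the system formed itself.

```python
# A toy "language model": learn which word follows which in a training text,
# then generate by sampling from those observed transitions. Nothing more.
import random
from collections import defaultdict

training_text = "the student wrote the essay and the tutor read the essay closely"

follows = defaultdict(list)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current].append(nxt)   # record every observed next word

def generate(start, length=8):
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))  # frequent transitions are picked more often
    return " ".join(out)

print(generate("the"))  # e.g. "the essay and the tutor read the essay closely"
```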

The issue also touches on intellectual property. Universities must consider who “owns” the content generated by AI tools. ChatGPT’s responses are unique, but they’re not fully original work either. This can create confusion over ownership, particularly if students use AI-generated text in their submissions.

Some universities argue that using AI content without attribution is unethical because the ideas do not come from the student. Others feel that students should treat ChatGPT as a research tool, similar to Wikipedia or online encyclopaedias, but must not use it to complete entire assignments.

Avoiding Plagiarism with Generative AI: Best Practices for Students

To use ChatGPT responsibly, students need clear guidelines. Universities can help students understand what counts as original work and how to avoid plagiarism when using AI tools. Here are some best practices for students to follow:

  • Always Cite ChatGPT: If students use ChatGPT in their research, they should acknowledge it as a source. Just as they would with a book or article, students should state what role ChatGPT played in their work. This helps avoid potential plagiarism and keeps the academic process transparent.

  • Use AI for Inspiration, Not Answers: Students should avoid using ChatGPT to complete entire assignments. Instead, they can use it to generate ideas, improve their understanding of a topic, or structure their thoughts. By using ChatGPT as a learning tool, students can gain insights without relying on it for final answers.

  • Learn to Paraphrase and Summarise: If students take ideas from ChatGPT, they should paraphrase or summarise them in their own words. This creates a degree of separation from the AI-generated content, ensuring that their work reflects their own understanding.

  • Check Plagiarism Independently: Students should run their work through a plagiarism checker before submission. As noted earlier, such checkers cannot reliably flag AI-generated text, but they do catch unintentional overlap with published sources, and many free online tools make this kind of self-check easy (a minimal example follows this list).

  • Seek Human Feedback: Relying on human feedback is essential. Tutors, professors, and peers can offer guidance and corrections that an AI model can’t provide. Human feedback allows students to refine their work and ensure it meets academic standards.
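As a concrete example of the self-check mentioned above, a few lines of standard-library Python are enough to flag sentences in a draft that sit suspiciously close to the notes or AI output a student worked from. The 0.8 similarity threshold is an arbitrary choice for illustration, not a figure used by any real checker.

```python
# A minimal self-check: compare each sentence of a draft against each
# sentence of a reference text and flag near-duplicates. Standard library only.
from difflib import SequenceMatcher

def split_sentences(text):
    return [s.strip() for s in text.split(".") if s.strip()]

def similar_sentences(draft, reference, threshold=0.8):
    """Yield (draft_sentence, reference_sentence, ratio) for close matches."""
    for d in split_sentences(draft):
        for r in split_sentences(reference):
            ratio = SequenceMatcher(None, d.lower(), r.lower()).ratio()
            if ratio >= threshold:
                yield d, r, ratio

draft = "Generative AI raises new questions about originality. My own analysis follows."
notes = "Generative AI raises new questions about originality in academic work."
for d, r, score in similar_sentences(draft, notes):
    print(f"{score:.2f}  {d!r} resembles {r!r}")
```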

ChatGPT, the Future of Education, and Academic Policies

As ChatGPT and other generative AI models continue to evolve, the education sector will need to adapt its policies on plagiarism and originality. Universities may soon create new standards for what counts as original work in an AI-dominated landscape. Some institutions are already introducing courses on digital ethics, AI ethics, and the responsible use of technology. These courses help students understand the implications of using AI and how to balance it with human knowledge.

However, policy changes alone may not solve every issue. Universities may also need to update their curricula to emphasise critical thinking and creativity, skills that AI cannot easily replicate. By focusing on these human abilities, education institutions can better prepare students for a world where technology plays a significant role in work and research.

ChatGPT’s Impact Beyond the Classroom

The implications of ChatGPT extend beyond just education. The same questions of originality, intellectual property, and plagiarism apply in other sectors. For instance, companies using generative AI to produce content may also need to address the ethical and legal aspects of AI-generated work. As generative AI becomes more common, society will need to clarify what counts as “human” work and what role AI should play.

As ChatGPT becomes a fixture in everyday life, it will be essential for all sectors—not just education institutions—to create policies and guidelines. These will help people use AI responsibly, ensuring that it complements rather than replaces human expertise.

The Road Ahead: Balancing Technology and Academic Integrity

As we move forward, the challenge lies in finding a balance. ChatGPT is here to stay, and its benefits are undeniable. However, the rise of generative AI also brings new responsibilities for students, educators, and society as a whole. Academic integrity remains vital in education, and universities must ensure students understand the value of original work.

At the same time, AI offers opportunities to learn, grow, and innovate. By setting clear guidelines and teaching students how to use technology responsibly, education institutions can help students benefit from AI while upholding academic standards. Avoiding plagiarism in a world where AI is widespread may seem challenging, but with the right approach, students can learn to use these tools effectively and ethically.

As ChatGPT reshapes the landscape of education, the response from universities will continue to evolve. The debate on ChatGPT and plagiarism reflects larger questions about the role of technology in society. How we answer these questions will impact not only the future of education but also how we define creativity, ownership, and originality in an increasingly digital world.

Credits: ChatGPT Is Making Universities Rethink Plagiarism (Sofia Barnett, Jan 30, 2023, Wired).

Huge thanks to Ákos Rúzsa for his valuable insights!