
"According to a report by Anthropic, the company behind Claude, 39% of student interactions with the AI tool involve creating and improving educational content, such as practice questions, essay drafts and study summaries. A further 34% interactions seek technical explanations or solutions for academic assignments - actively producing student work. Most responses to this from schools and universities have been to focus on immediate concerns: plagiarism, how assessments are conducted and job displacement."
"While these are important, what's being overlooked is how evolving generative AI systems are fundamentally changing our relationship with knowledge itself: how we produce, understand and use knowledge. This isn't just about adding new technology to classrooms. It changes how we think about learning and challenges the core ideas behind education. And it risks granting power over how knowledge is created to the tech companies producing generative AI tools."
Generative AI tools are widely used by students and teachers for content creation and technical help, with 39% of interactions aimed at creating educational content and 34% producing student work. Institutional responses have focused on immediate issues such as plagiarism, assessment integrity, job displacement, and teaching AI literacy. The deeper effects are epistemological: instant, authoritative outputs blur the line between original thought and assisted thinking, alter evaluation and reasoning skills, and reshape the nature of learning. These systems risk concentrating influence over knowledge creation in the hands of the companies that build them, challenging core educational principles.
Read at The Conversation