As today’s students become tomorrow’s workers, they will invariably be required to use more technology, including artificial intelligence (AI) and machine learning. To prepare them, it only makes sense to provide students with opportunities to use next-generation technology—not block their access to it. And yet, that is exactly what some educators are attempting to do in response to OpenAI’s ChatGPT, which has been hailed as a “tipping point” for AI.
The latest and much-improved iteration of this technology, GPT-4, is more accurate than earlier versions and shows a degree of “human-level” performance as it generates responses to users’ questions. These capabilities have led many in the technology sector to view GPT-4 as a true breakthrough. Not everyone, however, shares this opinion.
Among the most vocal critics are people in education who see GPT-4 as a tool for cheating, with students using the technology to write essays and other assignments for them. In response, several school districts are weighing limits and outright bans on the technology; most notably, New York City’s Department of Education (DOE) banned ChatGPT from school devices and networks. There has been similar backlash in other parts of the world, including some universities in Australia that have returned to “pen and paper” examinations in an effort to stave off the ChatGPT cheating threat.
As a member of the board of a large university—the Technical University of Denmark (DTU)—I follow the debate with great interest. I understand and respect the need for policies around how students should use tools such as GPT-4. However, my personal opinion is that instituting bans and placing strict limitations on generative AI amounts to throwing out the proverbial baby with the bathwater! Instead of preventing students from using generative AI, educators should focus their efforts on exploring how to incorporate evolving technologies into learning.
Going forward, we need to acknowledge the root of the problem: complacency within education and a reliance on obsolete assessments that measure how much “knowledge” students can memorize and regurgitate. The effectiveness of such testing has waned, and it’s time to face that fact. This should not come as a surprise given reports that have tracked investments in education over the years, with colleges and universities increasing their spending on sports while outlays for instruction, research, and public service either remained flat or declined. In addition, as schools from K-12 to higher education shifted rapidly to widespread remote learning during the Covid-19 pandemic, educators expended much time and effort rooting out new ways of cheating. It is no surprise that online proctoring and other forms of examination surveillance skyrocketed.
Is that really where education should be focused? As GPT-4 shows its prowess in producing content, the solution is not to pretend the technology doesn’t exist. Rather, it’s a reminder that “drill-and-kill” practices have been living on borrowed time. My argument is that, just as schools eagerly embraced technologies such as laser cutters and 3D printers, they should also welcome generative AI and tools such as GPT-4, which can help students build a broad base of knowledge by explaining concepts.
The future lies beyond memorization and the eloquent reproduction of somebody else’s work. Future learners—younger and older—will need to delve much further into real creativity, while also honing their critical thinking skills to an unprecedented degree.
The Gift of GPT-4
In a recent conversation with our chief technology officer—a leading world expert on how to use AI for educational technologies—I asked him why he thought GPT-4 would be an important catalyst for modern education. Without hesitation he replied, “It will help unlock more human creativity.”
For far too long, language has been a “secret code” among educated people who have wielded their vocabularies and knowledge of punctuation to show they are “intelligent.” But there are plenty of people who struggle with written language for a variety of reasons—among them dyslexia—and who are highly intelligent and creative. Case in point: Richard Branson, who dropped out of school at age 15 because he had difficulties with reading and writing, then went on to establish several companies and become a billionaire philanthropist.
Clearly, the world needs more Bransons. If GPT-4 can be a useful tool, then by all means let’s include it in the toolkit. A recent Harvard Business Review article observed that “humans have boundless creativity.” However, difficulty in communicating ideas and concepts—whether in written form or visually—often “restricts vast numbers of people from contributing new ideas.” Generative AI can help address this barrier, “to assist humans in their individual and collective efforts to create hitherto unimaginable solutions.”
And that’s what makes GPT-4 a gift.
But GPT-4 Is Not Enough
Generative AI is a tool, much like any other, that augments the capabilities of a human user. Consider the carpenter who uses a nail gun instead of pounding in each nail with multiple hammer strikes. The nail gun makes the carpenter more efficient and accurate, but the tool alone won’t put on a roof or erect a fence. Admittedly, AI will have an even bigger impact than power tools have had on carpentry. Still, GPT-4 cannot undertake an entire project on its own; what it does is supercharge the capabilities of the experts who use it.
From personal experience, I can attest that ChatGPT is wildly inefficient for building more advanced education programs, although it can quickly help with certain tasks. It is a useful tool for amplifying the efforts of education experts. For example, our company has been working for two years to build technologies that can more systematically leverage general breakthroughs in areas such as generative AI to enhance our processes—including the creation of higher-quality content. Being able to produce superior learning content in a fraction of the time and at a fraction of the cost will move the needle. Combined with other technologies designed specifically for educational purposes, these advances will yield significant reductions in the cost of producing high-quality educational environments for the future.
This is an important takeaway, and one that might help alleviate fears over AI in general and students’ use of generative AI in particular. The future is not about replacing human knowledge with AI. Rather, it’s about people working with tools such as ChatGPT and GPT-4, and—very importantly—about building technologies and products around these breakthroughs.
And that brings us back to the heart of this discussion. GPT-4 and next-gen AI are not to be feared or ignored in some futile attempt to preserve or protect human learning. Rather, these emerging technologies can be valuable in assisting learners in acquiring more knowledge and deepening their understanding.
Yes, GPT-4 can spit out relevant facts and produce an explanation. But if that’s the kind of “learning” educators want to preserve, there’s a far bigger issue here than the use of technology. What matters most is that learners demonstrate mastery of knowledge and skills in new ways.
AI can help prepare learners for that future—but it’s up to the humans to perform.