Why AI and School Are at Odds

on March 1st, 2024

[Image: the upper atmosphere of the Earth]

ChatGPT took the world by storm as a tool of the future.

Almost two years on, the next generation faces a cheat code standing in for learning.

Now that AI can do your homework, is the traditional system of learning we have today ready to face a future that's already here?

ChatGPT was the first "realistic" chatbot in what is now a long line of so-called large language models (LLMs). When it was released back in 2022, many people were amazed by its capabilities. From explaining calculus to a five-year-old to writing A+ college essays, this was tech like nothing seen before. And as a student myself, I get it. Why work harder when you can work smarter?

For example, in a Harvard student's experiment, AI-generated essays received an average GPA of 3.3, suggesting AI models could earn passing grades in liberal arts classes at most universities. One study found that 43% of college students have used ChatGPT or similar AI tools, with 89% of those using it for homework, and 90% of high schoolers are already using ChatGPT for homework.

And clearly, I'm not alone in this kind of thinking: the top result after typing "chatgpt essays" into Google is "chatgpt essays cheating". That is because of the clear ethical dilemma around AI tools for schoolwork.

After all, it is a valid concern that AI tools can do schoolwork entirely on their own. That could leave students unaware of what they are submitting, let alone understanding it.

And even if teachers choose to crack down on AI-generated content, it is still not easy. AI checkers give a statistical estimate of the chance that a given text is AI-generated, but that is far from conclusive proof.

Unlike in regular plagiarism cases, a simple Google search cannot pin down where the text originally came from. So for now, banning AI outright might not be the whole answer. Besides, such a move would likely just draw more attention to the idea.

What do we do then?

The truth is that challenge and learning go hand in hand. Learning is impossible if everything is easy.

Take Google Maps. You might think that using it would help you learn routes and your surroundings, but in practice the opposite happens: turn-by-turn navigation leaves us with poor spatial knowledge of the area. We end up relying on the tool so much that we never pause to take anything in; it goes in one ear and out the other. Turn-by-turn navigation lets us disengage from our surroundings exactly when learning demands active engagement.

And this is why schools have assignments and tests. Learning is not about memorisation, but about facing challenges and overcoming them. In this sense, AI could be used to automate the monotonous tasks that serve no real purpose.

“The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn.” — Alvin Toffler

Think about it. In school, we are taught to memorise formulae and do laborious, boring math all in our heads. But in everyday life, when was the last time you used the 12 times table?

Chances are, a calculator did that work. We already live in a time when we use tools to cut out the mundane, giving us more time to focus on what matters. And this might be the path we should take with AI, too. From reading through articles to analysing them, AI can save us from many mundane tasks and increase our productivity if used right.

Of course, a calculator doesn't make things up. Chatbots work by predicting the next word, one at a time, until they form a hopefully meaningful sentence. Even the chatbot itself does not know in advance what the final sentence will be.
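To make that concrete, here is a minimal, purely illustrative sketch of that word-by-word loop in Python. The tiny probability table is invented for the example; a real LLM scores the next word over a vocabulary of tens of thousands of tokens using a neural network, but the generate-one-word-at-a-time loop is the same idea.

```python
import random

# Toy "model": for a given previous word, the probabilities of the next word.
# These numbers are made up for illustration only.
NEXT_WORD_PROBS = {
    "the": {"cat": 0.5, "dog": 0.4, "essay": 0.1},
    "cat": {"sat": 0.6, "ran": 0.3, "slept": 0.1},
    "dog": {"ran": 0.5, "barked": 0.5},
    "sat": {"down": 1.0},
    "ran": {"away": 1.0},
}

def generate(start: str, max_words: int = 5) -> str:
    """Generate text one word at a time, the way a chatbot does:
    pick the next word from a probability distribution, append it,
    and repeat. The final sentence is not known until the loop ends."""
    words = [start]
    for _ in range(max_words):
        options = NEXT_WORD_PROBS.get(words[-1])
        if not options:  # no known continuation, so stop
            break
        next_word = random.choices(
            list(options), weights=list(options.values())
        )[0]
        words.append(next_word)
    return " ".join(words)

print(generate("the"))  # e.g. "the cat sat down" (output varies per run)
```

Notice that nothing in this loop checks whether the output is true; it only produces a statistically plausible continuation, which is exactly why a chatbot, unlike a calculator, can confidently make things up.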

On top of this, it's important to consider whether what you're asking for impedes real learning or encourages it. Asking ChatGPT for different views on a text encourages learning, while asking for a summary does the exact opposite. Whether it's "right" to use a chatbot to help with an assignment varies drastically based on the task and on the individual teacher or student.

Opinions on whether a given prompt counts as misuse sit at polar opposites. Some think that asking ChatGPT for help brainstorming aids the thinking process, while others think it kills creativity. Some think that editing with ChatGPT just needs to be cited, while others think that asking a chatbot to rework your essay defeats the whole point of writing it.

At the end of the day, some would say that expecting students to be this self-responsible is too much to ask. And maybe that is an issue worth addressing in itself.

While it is indeed a valid concern that relying solely on students' sense of duty could be unrealistic, this also puts educators and the education system itself at a crossroads.

[AI-generated image: an AI-enabled classroom contrasted with a traditional one]

If students can get around traditional assignments and tests, the focus should pivot towards fostering critical thinking, individual analysis, and exploration. That way, AI tools can aid learning rather than restrict it. Tests, grades, and GPAs were designed simply to monitor learning, yet the education system has ended up prioritising these numbers over the actual learning behind them.

To conclude, I believe the education system's priorities have been misplaced for a while now, and it's high time for a revision as AI ushers us into a new era. While AI shouldn't be dismissed outright as harmful, it needs to be approached in a way that truly benefits the children who will end up living in a world even more defined and saturated by computers than today's.

How can we keep tough things tough instead of letting AI simplify them away? How can we get students to use AI responsibly, both now and in the future? This is likely the real challenge for modern education.