The Recent MIT Paper, and How to Protect Students’ Brains From AI


Estimated reading time: 6 minutes

The rise of AI over the past few years has been meteoric. What started as a fun curiosity behind catchy headlines has already become an integral part of many people's lives. AI is being used for anything and everything, whether professional work or studies.

From its launch, college students used ChatGPT to write their thesis papers and help them draft presentations. There has always been a negative stigma attached to how students use these tools. Meanwhile, AI checkers were, and still are, an unreliable way to catch people who do use AI. In a way, the situation is worse than the stigma teachers once attached to Wikipedia. At least Wikipedia has citations, edits, and corrections. It also does not hallucinate and make up answers the way AI tools do when they cannot find any information.

And while news reports of students using ChatGPT to pass their exams already raised some alarms, AI was too new for people to understand how it affects us over the long term.

That lack of understanding changed on the 10th of June 2025, when the Massachusetts Institute of Technology released a paper detailing how using AI affects our brains over a long period of time.


What Does the Paper Say About AI?

The paper observed 54 participants extensively over a period of 4 months, studying how their brains responded over the long term to using AI for their tasks. The study used EEG (electroencephalography) to record participants' brain activity, which gave the researchers a deeper understanding of neural activation during the different sessions.

There were 3 groups tested.

The first group, Group A, used AI for their work.

The second group, Group B, used a web search engine such as Google.

The final group, Group C, used neither tool and relied only on their own brains.

These groups were then asked to write an essay, either with the tools they were given or without any. The researchers conducted 3 such sessions with each group to observe how participants wrote their essays and how much they relied on their tools to do so.

After the first 3 sessions were completed, the researchers asked Group A participants to work without any tools, while Group C participants were allowed to use AI.

The study showed that the 3 groups had very different neural patterns and different cognitive strategies (the mental processes used to solve problems and achieve goals) because of the tools at their disposal. The researchers noted that brain connectivity (the strength of connections between neurons) scaled down significantly with the amount of external support participants had. Group C, which had no external tools, consistently exhibited the strongest neural performance. Group B, which used search engines, showed lower performance, and Group A showed the lowest performance in the study.

The Study’s Conclusions

For Group A, who used AI, the researchers noted that participants relied entirely on ChatGPT to write their essay. This included coming up with the topic, structuring it, finding supporting sources, and writing the final essay itself. Group A participants could not remember quotes from the essay they had written only minutes earlier, and they also had a hard time talking about their essay. The group that used no tools faced none of these problems.

Finally, the paper pointed out that AI usage resulted in lower brain function in all cases but one: experts. The researchers found that experts, and people with deep knowledge of a field, could use AI without any cognitive loss.

They reasoned that this was because the experts used AI only to finish menial tasks that did not require any serious thinking. In fact, they noted that experts who used AI performed better on their essays than all the other groups. This was because the experts were doing most of the brain work for the essay; the LLMs simply found information and handled simple tasks to help write it.

How Schools Should Deal With AI's Effect on Student Brain Activity

The paper highlights how carefully schools have to navigate this issue. Its results show that introducing and relying on AI seriously harms development and learning. But just as with Wikipedia, students will find a way to use these tools to make their homework easier.

Instead of imposing blanket bans and running every assignment through inaccurate AI checkers, teachers need to start designing learning that removes the temptation to let AI do the work. The best way to do this is project-based learning, where students have to solve complex, real-world problems in an engaging and exciting way.

Imagine if, instead of simply answering a computer science question, your students were asked to build an AI Health Assistant app. Students would study their subject, apply what they learn to build real solutions, and use AI in a safe, interactive way. Not only do students learn their school concepts, they also have no time or way to simply have AI do their homework for them. They have to use AI in a controlled manner, one that actually teaches them how to use it. This also lets students understand AI better, with all its strengths and flaws.

Through projects like these, students use AI in a way that lets them see it as simply a tool, rather than something they reach for in every aspect of their studies. This could lead students to become like the experts in the study, who treat AI as a tool that helps them with a task, not one that does the task for them.

A Necessary Evil?

This does not mean that AI is completely evil. Multiple education-focused studies have found that AI is very effective as a 'curiosity builder' among students. These studies let students use AI not to answer questions, but to ask them, and this helped students learn more. AI served as a single interface through which students could ask many questions and get interesting, correct answers. This shows that AI, used in the right way, can be a huge motivator for students to learn.

Of course, there will always be issues with how students learn. Every one of us was once that child who had to be forced to do homework, especially when we wanted to spend that time playing outside instead, and who used Wikipedia to help write an answer rather than writing it on our own.

But dealing with AI incorrectly in a school setting will seriously harm how kids learn, and how equipped they become to deal with the real world.

Kruu Blogs

These blogs are written by Kruu students who have worked on live projects.

We believe in amplifying student voices on impactful and relevant topics, providing them with a platform to share their insights and experiences.

To submit your blog or stay updated on new projects and student stories, subscribe to our newsletter.
