AI in the Middle: Are We The Users or Just The Audience?

With the rapid development of artificial intelligence, we have started handing most of our mental work over to AI. In the past, AI was just a supporter: it filled in the missing parts of our thinking or helped us fix small mistakes. But now the situation is changing. It no longer just supports us; it lets us bypass the whole process.

In this blog post, I want to talk about how our role is shifting from being the “creator” to being a “spectator” of our own work. We are no longer doing the thinking; we are just managing the AI that does it.

The New Workflow

Let’s take a concrete example: a homework assignment or some complex task. When a new assignment arrives, our first instinct is not to read it carefully. Instead, we paste it into the AI to get a quick summary, because we want to understand it fast. And after we get the general idea, we don’t start working. We ask the AI to do the assignment for us. The “struggle” of creating something from zero is gone.

But a funny thing happens after the work is finished. Since we bypassed the effort, we realize we didn’t actually learn the topic. So what do we do? We go back to the AI. We ask, “Explain this part to me,” or “Why did you use this formula?” We are trying to learn the subject by studying the result the machine produced. We are basically doing a “walkthrough” of work that is supposed to be ours.

Where is the Human Intellect?

This creates a weird situation. If the AI summarizes the question, answers the question, and then explains the answer to us, where is the human intellect? We are not inside the process anymore; the AI is in the middle of everything, standing between the problem and the solution.

The Expectation Trap

Using AI for these tasks actually creates a new trap. Because we complete them so quickly and with such high quality, expectations rise. Assignments become harder and more demanding because the system assumes we have these tools. The goal is to force us to apply our “intellect” to the subject.

But unfortunately, it doesn’t work like that. We don’t use our intellect to understand the harder topic. Instead, we use it to figure out how to explain the problem to the AI better. We spend our mental energy on prompting, not on learning. So nothing changes in the process: we are still bypassing the work, just with more elaborate commands.

The Scarier Possibility

But there is an even scarier possibility. It is one thing for students to rely on AI, but what if teachers and academics start relying on it too? Imagine a professor using AI to write the exam questions, or to read and grade the homework.

If that happens, we are really in trouble. It means the entire process of teaching and learning is controlled by AI: the question is written by a machine, answered by a machine, and graded by a machine. If we reach that point, humans will just be the audience in an education loop run entirely by AI.

Conclusion

In conclusion, as we move forward, this habit of “bypassing” may become dangerous for our minds. We are trading the painful process of learning for the comfort of a quick result. If we only navigate through the AI’s output, we may forget how to walk the path ourselves. For now, the AI is in the middle, and we are just watching the show.