Why critical thinking is key to using AI wisely

Returning guest writer Stephanie Simoes is the mind behind Critikid.com, a website that teaches critical thinking to children and teens through interactive courses, worksheets, and lesson plans. This article is meant to help educators (and parents) more effectively teach kids to use large language models and other forms of AI in positive ways.

 
In the Phaedrus, Plato expressed concerns that if men learned writing, it would “implant forgetfulness in their souls.” A 1975 issue of Science News cited a survey in which “72 percent of those polled opposed giving every seventh-grade student a calculator to use during his secondary education.”

Generative artificial intelligence is the newest target of that same opposition, and the debate has intensified since the U.S. Department of Education released its Proposed Priority on Advancing AI in Education.

“Advancing AI in education” can mean different things, but it generally falls into three main areas, all of which are addressed in the Department’s proposal:

  1. Teaching how to use AI—media literacy and how to effectively use LLMs as thinking helpers 

  2. Teaching how AI works—expanding computer science lessons to teach the fundamentals of AI systems

  3. Using AI to support instruction—employing AI-driven tools to provide analytics and virtual teaching assistants

Because I teach critical thinking—and because some critics worry that using AI is destroying our ability to think critically—I will explore the first area in this article.

One of the proposed priorities is teaching students to spot AI‑generated misinformation. That one isn’t especially contentious; spotting misinformation, including AI-generated misinformation, is a core part of modern media literacy.

The more controversial question is whether students should use large language models as “thinking partners.” The virality of the recent MIT study, “Your Brain on ChatGPT,” has amplified the fear that LLM use dampens our thinking skills. In the study, 54 adults wore electroencephalogram (EEG) caps while writing short essays. One group wrote unaided, another used a search engine, and a third relied on ChatGPT. Neural activity was highest in the unaided group, lower with search, and lowest with ChatGPT.

Those results, however, come with big caveats: the paper is still in preprint, the sample was small, and none of the participants were K–12 students.

Moreover, the reduced neural activity during ChatGPT-assisted writing may simply indicate cognitive offloading, the practice of using external tools to reduce mental effort. From maps to calculators to writing lists of things we need to remember, humans have long offloaded mental work this way. Cognitive offloading isn’t necessarily a bad thing, as it frees our mental energy for higher-order tasks. In the classroom, however, it must be implemented carefully.

For instance, calculators support higher‑level math education only after students learn arithmetic. Similarly, children should develop basic writing and reasoning skills before using AI as a helper.

We also need solid subject-specific knowledge before using LLMs as research assistants; otherwise, we lack the expertise to evaluate the results. If we skip these steps, we risk producing a generation of incompetent experts.

But used correctly, AI can be a powerful tool for strengthening students’ critical thinking skills.

Critical thinking is slow, careful thinking. It allows us to question assumptions, spot biases, and weigh evidence. LLM outputs can be flawed or biased like any human source, so their responses deserve the same scrutiny. That scrutiny must sit alongside intellectual humility—recognizing when we don’t (yet) know enough to judge a claim. Students already practice these habits when they evaluate social media posts or websites; LLM outputs are simply the newest arena to apply the same skills.

A drawback of LLMs is that they amplify confirmation bias when we prompt poorly. Ask, “Give me evidence for my belief,” and they may oblige. This flaw can be turned into a lesson about both responsible prompting and confirmation bias. Teach students to prompt, “Show the strongest evidence for and against this claim,” and then point out the human tendency to pay more attention to the evidence that supports our preconceptions.

Better yet, have students ask the LLM to challenge their beliefs: “Show me evidence that I am wrong about this.” By prompting for dissent, students learn to explore their beliefs and may even change their minds about some unsupported ones.

History shows a pattern when it comes to new technology: panic, adaptation, and, finally, integration. The task of educators isn’t to shut the door on AI, but to teach students to use it wisely.


Stephanie Simoes | Critikid.com

What is critical thinking?




“Critical thinking” is a trendy term these days, especially in the education world. Alternative schools in Austin commonly advertise that they encourage kids to think critically. Conversations about critical thinking are often accompanied by some version of the Margaret Mead quote, “Children must be taught how to think, not what to think.” But such discussions often neglect a crucial question: “What does it mean to teach children how to think?” Critical thinking is an abstract term. Are we all on the same page when talking about it?

As the founder of a critical thinking site for kids, I find this question central to my work. We all get what “thinking” is, so the real question is: what makes it “critical”? I like to use a simple definition: critical thinking is careful thinking. It requires slowing down and questioning our assumptions.


Fast and Slow Thinking

Our brains are hardwired to respond to stimuli quickly, a crucial advantage in emergencies. When faced with a potential threat, immediate reaction is essential—there’s no time for deliberation. While this quick thinking might make us mistakenly perceive a harmless situation as dangerous, it’s a safer bet to err on the side of caution in high-stakes moments. It’s a matter of survival: better to assume danger where there is none than to overlook a real threat.

While fast thinking[1] is a valuable skill, it is prone to errors.

Here’s an example. Try to answer this question in less than 5 seconds:

If 1 widget machine can produce a total of 1 widget in 1 minute, how many minutes would it take 100 widget machines to produce 100 widgets?

After you’ve given your quick, intuitive answer, take as much time as you need to think about it.

Many people’s initial, intuitive response is 100 minutes. However, with more careful thought, we see that the correct answer is 1 minute. (The production rate per machine is 1 widget per minute. The rate doesn’t change with the number of machines.)
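
If it helps to see the arithmetic laid out, here is a minimal sketch of the rate reasoning (the variable names are just illustrative):

```python
# Each machine produces widgets at a fixed rate, no matter how many machines are running.
rate_per_machine = 1        # widgets per minute, per machine
machines = 100
widgets_needed = 100

# Total output scales with the number of machines...
total_rate = machines * rate_per_machine    # 100 widgets per minute

# ...so the time needed is the number of widgets divided by the total rate.
minutes_needed = widgets_needed / total_rate
print(minutes_needed)       # 1.0 -- one minute, not one hundred
```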

The key takeaway of this puzzle is that careful, deliberate thinking is often more accurate than quick thinking.

Applying slow, careful thinking to every daily decision would be impractical. Imagine how long you would spend at the grocery store if you conducted a deep analysis of every single choice! In many cases, our intuitive, fast thinking serves us well. However, problems can arise when we cling to the conclusions drawn by our fast thinking—especially in situations where accuracy matters.

In the widget machine problem, it’s relatively straightforward to recognize and correct our intuitive response with a bit of careful thought. However, letting go of our intuitive conclusions is not always this easy.


Humility and Critical Thinking

We might cling to our intuitive answers, even when faced with clear evidence or reasoning that challenges them, for several reasons.

First, it can be hard to change our minds when the intuitive answer feels very obvious or the correct answer is very counterintuitive. A famous example is the Monty Hall Problem. The correct answer to this puzzle is so counterintuitive that when Marilyn vos Savant published the solution in Parade magazine in 1990, the magazine received around 10,000 letters (many from highly educated people) saying she was incorrect!

It can also be challenging to let go of wrong answers when we have invested in them, such as by spending time and energy defending them. Sometimes, it’s simply a matter of not wanting to admit we were wrong.

Critical thinking requires more than just slow, deliberate thought. It also demands an open mind, humility, and an awareness of our minds’ flaws and limitations.


Building Blocks of Critical Thinking

Paired with slow, deliberate thought and humility, the following tools help us to be better critical thinkers so we can communicate more clearly—even when communicating with ourselves:

  1. An understanding of cognitive biases: These are systematic errors in our thinking that can lead us astray. There are many online resources that explore these biases in detail.

  2. An understanding of logical fallacies: These are flawed arguments. Logical fallacies can be used deliberately to “win” a debate, but they’re often made accidentally. Recognizing logical fallacies helps us to keep conversations on track. You can learn about some common logical fallacies in my Logical Fallacy Handbook or teach your kids about them with my online course, Fallacy Detectors.

  3. Science literacy: We were taught many facts in science class, but many of us never really learned what science is and how it works. This is the foundation of science literacy. For an introduction to this, I recommend biology professor Melanie Trecek-King’s outstanding article “Science: what it is, how it works, and why it matters.” Another important part of science literacy is knowing How to Spot Pseudoscience.

  4. Data literacy: Data literacy is the ability to properly interpret data to draw meaningful conclusions from it (and to know when drawing certain conclusions is premature). It means understanding how data is collected, identifying potential biases in data sets, and understanding statistics. Data literacy helps us make sense of the vast amount of information we encounter daily. You can introduce your teens to some common errors in data collection and analysis in Critikid’s course A Statistical Odyssey—a course that adults have enjoyed and learned from, too!


Preparing Kids for the Misinformation Age

A quick scroll through social media reveals a minefield of bad arguments and misinformation. You have probably come across logical fallacies like these:

“You either support A or B.” (False dilemma)
“Buy our product—it’s all natural!” (Appeal to nature)

The lack of science literacy among influential voices is also concerning. I can’t count how many times I have seen or heard the phrase,

“Evolution is just a theory.”

This phrase conflates the scientific and colloquial definitions of theory. If unintentional, it demonstrates a lack of science literacy; if intentional, it is a logical fallacy known as “equivocation,” in which a word is used ambiguously to confuse or mislead the listener.

The need for data literacy is also apparent. You may have heard arguments like:

“Illness X has increased since Y was introduced, so Y must be the cause.” (Mistaking correlation for causation)
“There are fewer cases of food poisoning among people who drink raw milk than those who drink pasteurized milk.” (Base rate neglect)
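
To see why the raw-milk claim can mislead, here is a minimal sketch using made-up numbers (they are purely illustrative): far fewer people drink raw milk, so it can account for fewer total cases even if it carries a higher rate of illness.

```python
# Hypothetical, illustrative numbers only.
raw_drinkers = 1_000
raw_cases = 20                  # 2% of raw milk drinkers get sick

pasteurized_drinkers = 1_000_000
pasteurized_cases = 1_000       # 0.1% of pasteurized milk drinkers get sick

# Comparing raw counts ignores the base rates (how many people are in each group).
print(raw_cases < pasteurized_cases)            # True: fewer total cases among raw milk drinkers

# Comparing rates tells the real story.
print(raw_cases / raw_drinkers)                 # 0.02
print(pasteurized_cases / pasteurized_drinkers) # 0.001
```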

We have an incredible amount of data at our fingertips, but without data literacy, we don’t have the proper tools to make sense of it all.


Critical thinking shouldn’t be taught as an afterthought; it needs dedicated, explicit instruction. Children face a battlefield of misinformation and faulty logic every time they go online. Critical thinking is their armor. Let’s help them forge it.


Stephanie Simoes | Critikid.com



[1] Nobel Prize-winning psychologist Daniel Kahneman calls fast thinking “System 1” thinking in his book Thinking, Fast and Slow. I highly recommend this book to anyone who finds the content of this blog post interesting.