AI in the classroom

3 Things to Know About AI Tools for K-12 Before Choosing One

Merlyn Mind
August 19, 2024

We can confidently assume the following exchange has taken place in a staff meeting at your school or district within the last few months …

"I've been looking into AI tools that can assist our teachers with tasks like lesson planning and managing classroom technology. It could be a game-changer."

"I'm not so sure about that. I worry about the privacy risks associated with giving it access to our content.”

"What about cheating? I’ve heard that students will become too reliant on AI and lose critical thinking skills."

“Maybe we should table the discussion for now and set up a separate time to dig into the topic of AI so we really understand what we’re looking at.”

Sound familiar?

When the topic of artificial intelligence (AI) comes up in the context of education, it often evokes a knee-jerk reaction – and understandably so. While companies like ours have been using AI to build solutions for years, widespread understanding of AI is still new and murky at best. And – to add to the confusion – AI is a term that covers a broad category of technology, yet public familiarity centers on a few specific applications, like the generative AI tools ChatGPT and Google Gemini.

Schools and districts manage a complex ecosystem of needs and challenges on a daily basis, and when the wellbeing of kids is in the mix, there is zero room for risk. It’s understandable that new technology brings hesitation.

So where does the conversation go from here?

As leaders in the development of AI tools built specifically for K-12 settings, we’ve been fielding these questions for years. And from our experience, there are three key insights that change the discussion about AI-based tools like Merlyn, our AI assistant for educators, and how they can empower teachers and improve learning. Let’s separate fact from fiction.

3 Things to Know About AI for K-12

Fact #1: AI should never replace teachers.

At Merlyn Mind, this belief is central to our mission and product development. AI can’t replace teachers, but it can certainly support them – and no one understands this better than we do, because we’re speaking from firsthand experience.

For almost a decade, our founder led the team at IBM Research that worked to apply the Watson AI platform to education. It was a grand challenge of AI: building a computer that could teach – an intelligent tutoring system. And while the technology was advanced, the experiment was ultimately flawed.

Why? Because the most critical parts of teaching can’t be replicated by a machine.

Years of research have shown the importance of student engagement and a strong student-teacher connection for academic success – especially when learners feel challenged or discouraged. While AI can add a lot of value to the overall context of a classroom, it’s not very good at motivating humans on a personal level.

Teachers understand their students at an emotional level. They often sense when a student is grasping a concept or when a student is struggling – simply by looking at them. Teachers can imagine what it will take to get an individual student to an "aha" moment and how to navigate the social dynamics in the classroom. 

Teachers also understand the cultural influences that impact teaching and learning, whether that be the cultural dynamics of their school, district, community, or nation. These nuances are what drive human connection and motivation, and they’re what help teachers bring learning to life for their students. AI simply can’t play that role.

So why even talk about bringing AI into classrooms?

We all know that educators have more needs and challenges to navigate than ever before, and limited resources to use in the process. Cognitive overload is one of many difficult challenges teachers face. One area where AI excels is automating repetitive and administrative tasks – a.k.a. the things that take time away from actually teaching students.

Across the team at Merlyn Mind, we’ve talked with thousands of teachers over the years. We know teaching is a passion for most of them: they chose the profession because they want to help students succeed. And teachers are great at this! Computers and AI are not.

When we design AI solutions to empower people to focus more of their time and energy on the human work – not replace them – we enable them to multiply the impact they can have. Thoughtfully designed AI like Merlyn can give teachers more time to focus their attention on their students while taking more administrative tasks off their plate. Win-win.


Merlyn's voice control of technology allows teachers to stay engaged with their students. (Image credit: Merlyn Mind)

Fact #2: There’s more to AI than generative AI.

While the world of AI spans more than just large language models (LLMs), the public boom of ChatGPT has led many people to equate “AI” with generative AI tools. In the context of classrooms, much attention has been given to the opportunity for students to use generative AI to cut corners or present work that doesn’t represent their true understanding of a concept.

But not all AI is created equal.

At Merlyn, we saw an opportunity to create a purpose-built AI solution specifically for education, long before other companies entered the space and before today’s general-purpose LLMs arrived. While most successful AI solutions in the past have focused on solving big data problems – problems where humans were not part of the process – our deep understanding of classroom needs meant that we designed AI from the ground up to be centered on humans.

For example, the goal of Merlyn Mind’s voice assistant is to make it easier for teachers to engage with students while navigating apps and other digital teaching materials. Teachers access a lot of tech in their classrooms – from learning management systems and interactive resources to assessment platforms and gamification tools. When we enable teachers to use that ecosystem more easily, we remove interruptions to the instructional flow (not to mention tech frustration!) and give them the freedom to move around the class and interact with students.

It’s a model that adds to the dynamics of an engaged classroom rather than taking away from them.

Simple voice commands to Merlyn enable teachers to use their existing devices and applications to do things like the following (see the sketch after this list for a rough idea of how such commands might be routed):

  • Open and progress through lesson plans and materials.
  • Find a short educational movie about the day’s topic.
  • Set a timer for any length of time.
  • Display an illustration to reinforce a concept.
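
To make that list concrete, here is a minimal, hypothetical sketch of how spoken commands like these might be routed to classroom actions. The phrases, handler names, and matching logic below are illustrative stand-ins, not Merlyn’s actual commands or API:

```python
import re

# Illustrative stand-ins for classroom actions – not Merlyn's real API.
def open_lesson(name):       print(f"Opening lesson: {name}")
def play_video(topic):       print(f"Finding a short video about: {topic}")
def set_timer(minutes):      print(f"Timer set for {minutes} minutes")
def show_illustration(term): print(f"Displaying an illustration of: {term}")

# Map simple spoken patterns to actions (mirroring the bullets above).
COMMANDS = [
    (re.compile(r"open (?:the )?lesson (?P<name>.+)", re.I),         lambda m: open_lesson(m["name"])),
    (re.compile(r"find a video about (?P<topic>.+)", re.I),          lambda m: play_video(m["topic"])),
    (re.compile(r"set a timer for (?P<minutes>\d+) minutes?", re.I), lambda m: set_timer(int(m["minutes"]))),
    (re.compile(r"show (?:an )?illustration of (?P<term>.+)", re.I), lambda m: show_illustration(m["term"])),
]

def handle_utterance(text: str) -> bool:
    """Route a transcribed voice command to the first matching action."""
    for pattern, action in COMMANDS:
        match = pattern.match(text.strip())
        if match:
            action(match)
            return True
    return False  # no command matched

handle_utterance("Set a timer for 5 minutes")
handle_utterance("Find a video about photosynthesis")
```

A real assistant’s speech recognition and intent matching are far more sophisticated, but the core idea is the same: short utterances become classroom actions without the teacher touching a keyboard.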

Do those little moments really make that big of a difference?

They do. Independent research on Merlyn found that after just seven weeks, teachers using our AI assistant had more time for teaching and learning, spent less time on administrative tasks, reported reduced technostress, and felt greater comfort with AI in the classroom. That means the extra time and energy go where they really matter – to students.

Merlyn was built from the ground up to support teaching and learning. (Image credit: Adobe)

Fact #3: Domain specificity matters. (Okay, but what does that mean?)

Many of the AI models that are capturing today’s news headlines are considered general-purpose models, which means they’re trained on massive datasets encompassing text, code, and other information. This breadth of knowledge allows them to perform a wide range of tasks, from drafting emails to answering questions, making them incredibly versatile.

But trying to adapt an expansive business tool for a classroom is like trying to fit a square peg into a round hole.

AI succeeds when solutions are purpose-built for specific industries to solve specific problems – think of credit card companies using algorithms built to monitor for fraud, or healthcare companies using models that assist radiologists in reading MRI scans.

With AI, bigger is not always better. 

That’s why we built an AI solution that is domain specific to education – AI that understands the language of teachers, as well as their goals and objectives. AI that is context-aware, understanding the classroom environment, the applications in use, and teachers’ workflows.

To dig into the concerns with large, general-purpose LLMs a bit more, and explain how domain specificity addresses those risks, let’s talk about hallucination (don’t worry, we’ll explain) and data privacy:

General-Purpose LLM Issue 1: Hallucination

While it’s incredible that generative AI can generate answers to prompts, one drawback is the potential for the algorithm to make up facts entirely – something technologists call “hallucination.”  

Domain-specific LLMs (like Merlyn) are less susceptible to hallucination than the jack-of-all-trades models because they’re trained on specific material and smaller in size. With a smaller training dataset, a smaller LLM is also less likely to encounter conflicting or contradictory information, meaning you get more focused responses.

Merlyn is built on the first LLM designed specifically for the classroom – pulling from vetted educational content and resources, not from the entirety of the internet. By focusing on educational content, the model can generate more relevant responses that are aligned with specific learning objectives and curriculum standards.
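
For illustration only, here is a minimal sketch of one common way to ground an assistant’s answers in vetted material: retrieve the most relevant passages from an approved corpus and hand only those to the model. The corpus, scoring function, and prompt below are hypothetical placeholders, not Merlyn’s actual content pipeline:

```python
# Hypothetical vetted corpus – in practice this would be curated educational content.
VETTED_PASSAGES = [
    "Photosynthesis is the process plants use to convert light energy into chemical energy.",
    "The water cycle describes how water evaporates, condenses, and falls as precipitation.",
    "Fractions represent parts of a whole and can be compared using common denominators.",
]

def relevance(query: str, passage: str) -> int:
    """Crude relevance score: shared lowercase words (a stand-in for real retrieval)."""
    return len(set(query.lower().split()) & set(passage.lower().split()))

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k most relevant vetted passages for the query."""
    return sorted(VETTED_PASSAGES, key=lambda p: relevance(query, p), reverse=True)[:k]

def build_prompt(query: str) -> str:
    """Constrain a (hypothetical) model to answer only from retrieved, vetted material."""
    context = "\n".join(retrieve(query))
    return f"Answer using ONLY this vetted material:\n{context}\n\nQuestion: {query}"

print(build_prompt("How does photosynthesis work?"))
```

Constraining generation to retrieved, approved content is one widely used way to keep answers relevant and aligned with learning objectives; the specifics of Merlyn’s own model and curation process differ, but the underlying idea of answering from vetted material is the same.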

If AI’s in the classroom, it should make a teacher’s job easier, not harder, right?

General-Purpose LLM Issue 2: Data Privacy

One other big reason large language models feel like a risky proposition for schools is data privacy. These tools draw on a wealth of knowledge, and many are built on a model in which whatever information users enter is shared back with the larger LLM.

In schools, laws like the Children’s Online Privacy Protection Act (COPPA) and the Family Educational Rights and Privacy Act (FERPA) put important guardrails in place to regulate what information is shared with – and collected from – students, as well as how that happens.

Here’s where small, education-specific LLMs offer another huge win for educators. At Merlyn, our Responsible AI team works to ensure that Merlyn solutions built on our domain-specific LLM adhere to the highest standards for safety, in addition to COPPA and FERPA requirements.

And because we prioritize safety and security from the ground up, we’ve taken it a few steps further, adding domain-specific guardrails appropriate for use in education, including:

  • Beta testing with schools to validate our solutions and co-design new ones.
  • Applying a safety model to user prompts and model-generated responses to identify and filter harmful, biased, or inappropriate content (see the sketch after this list).
  • Internal and external third-party red teaming – asking our own team and partners outside the organization to simulate cyberattacks on our systems, helping us challenge our existing defenses.
  • Aligning the model’s responses with information and guidelines specific to school or district policies for any sensitive topics.
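
As a rough illustration of the second bullet above – not our production safety model – here is how a simple keyword-based filter might screen both the user’s prompt and the model’s response before anything reaches a classroom display. The blocklist and messages are hypothetical:

```python
# Hypothetical blocklist – a real safety model would be far more nuanced than keyword matching.
BLOCKED_TERMS = {"violence", "weapon", "self-harm"}

def is_safe(text: str) -> bool:
    """Flag text containing any blocked term (illustrative stand-in for a safety model)."""
    return not (set(text.lower().split()) & BLOCKED_TERMS)

def moderated_reply(prompt: str, model_call) -> str:
    """Screen the prompt, generate a response, then screen the response before display."""
    if not is_safe(prompt):
        return "This request can't be answered in the classroom."
    response = model_call(prompt)
    if not is_safe(response):
        return "The generated response was filtered for classroom use."
    return response

# Usage with a stand-in for the model:
print(moderated_reply("Explain the water cycle",
                      lambda p: "Water evaporates, condenses, and falls as rain."))
```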

Educators and administrators have enough on their plate to worry about – our mission is to help them leverage the power of generative AI without adding more to their to-do lists.

Recommended Read: Merlyn Mind Looks to Simplify the Teaching Process Through Responsible AI Use, Tech & Learning

Embracing AI: A Balanced Approach for Schools

With the prevalence of technology in classrooms and the rise of AI innovation, it’s inevitable that these conversations will happen in your schools if they haven’t already. And we get why educators are skeptical about blindly adopting new technologies – they should be! With a classroom full of kids to keep engaged and little time to spare, there’s no room for tech that complicates a teacher’s instructional flow.

But intimidation or confusion doesn’t have to stop the conversation. 

By understanding these key factors, school leaders can confidently navigate the AI landscape and unlock its potential to give teachers a much-needed hand.

Want to learn more about responsible AI policies? Explore our Responsible AI Resource Center and look for more blog posts to come on this topic!


