While AI holds immense promise for personalising learning and enhancing educational experiences, it’s crucial to acknowledge its potential downsides in K-12 environments (primary school to high school) around the world. A balanced approach to implementing AI in learning environments will help school leaders, teachers, and parents navigate the exciting yet complex world of AI-powered education.
The Dark Side of the Algorithm: Potential Pitfalls of AI in K-12
While AI promises customised learning paths, several concerns remain as we continue to learn more about AI and the implications of its use in the modern world and the classroom.
Bias and Inequality
AI algorithms can perpetuate existing biases present in the data they’re trained on. This can lead to unequal learning experiences, to the detriment of students from certain backgrounds. Without a discerning approach, AI tools may continue to propagate these biases.
Take, for example, an AI called “Bookworm AI” designed to personalise reading recommendations for students. This AI analyses a student’s reading history, preferences, and grade level to suggest books they might enjoy. However, if the training data primarily consists of books by Western authors, the AI might overlook fantastic works by Asian authors, for example.
This could hinder students from diverse backgrounds from encountering a well-rounded selection of literature. For instance, a student interested in historical fiction might be recommended “The Book Thief” (German setting) or “All the Light We Cannot See” (French setting), but not “Pachinko” (Korean-Japanese diaspora) or “The Kite Runner” (Afghanistan).
This bias not only limits exposure to diverse literary traditions but also reinforces the idea that “great literature” primarily comes from the West, marginalising the contributions of Asian and other non-Western authors.
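To make the risk concrete, here is a minimal sketch of how a naive popularity-based recommender reproduces the skew in its training data. The titles, region labels, and “Bookworm AI”-style logic below are illustrative assumptions for demonstration, not any real product’s algorithm.

```python
from collections import Counter

# Hypothetical sketch of a popularity-based recommender trained on a
# corpus dominated by Western authors. Titles and region labels are
# illustrative only; this is not a real product's algorithm.
TRAINING_HISTORY = [
    ("The Book Thief", "Western"),
    ("All the Light We Cannot See", "Western"),
    ("To Kill a Mockingbird", "Western"),
    ("The Book Thief", "Western"),
    ("All the Light We Cannot See", "Western"),
    ("To Kill a Mockingbird", "Western"),
    ("Pachinko", "Non-Western"),  # under-represented in the training data
]

def recommend(history, n=3):
    """Suggest the n most frequently read titles in the training history."""
    counts = Counter(title for title, _region in history)
    return [title for title, _count in counts.most_common(n)]

print(recommend(TRAINING_HISTORY))
# ['The Book Thief', 'All the Light We Cannot See', 'To Kill a Mockingbird']
# "Pachinko" never surfaces: the model simply reproduces the skew in its data.
```

Nothing in the code is malicious; the exclusion falls straight out of what the training data under-represents, which is exactly why a discerning review of these tools matters.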
Dehumanisation of Learning
Reliance on AI tutors could diminish the vital role of human teachers in fostering critical thinking, social interaction, and emotional development, all of which are crucial aspects of education.
While AI like DreamBox Learning can provide personalised feedback on Maths problems, it can’t offer the same level of encouragement, guidance, and real-time course correction as a human teacher can. In science experiments, for example, a teacher’s presence is essential to ensure student safety, answer questions, and guide them towards deeper understanding through discussions.
The human touch is an exceptionally important aspect of learning, and ensuring that students aren’t relegated to cold, isolated learning environments is an important consideration for educators the world over.
Privacy and Security Threats
Student data security is paramount, especially in digital learning environments, where data is online and relies on the safety standards of multiple parties and software suppliers from around the world. Regional data protection regulations can differ, and ensuring the right standards are adhered to is crucial when assessing options. AI systems that collect and analyse student data raise privacy concerns, and the AI industry at large is consistently battling perceptions about who controls these systems and whether they access information they should not.
Schools need to be transparent about the data collected by AI-powered learning platforms and ensure it’s stored securely in accordance with regional regulations. This is particularly important in jurisdictions with stricter data privacy laws, such as the European Union and South Korea.
Tech Dependence and Reduced Critical Thinking
Overdependence on AI for problem-solving can hinder students’ ability to develop critical thinking and independent learning skills.
An AI homework helper might churn out solutions to complex Maths problems in seconds. However, this deprives students of the opportunity to grapple with the problem themselves, develop logical reasoning skills, and experience the satisfaction of arriving at a solution independently.
For instance, Photomath allows users to take a picture of a Maths problem, and it will provide a step-by-step solution. While this can be helpful for checking answers or understanding difficult concepts, it can also hinder the learning process by preventing students from developing problem-solving skills and critical thinking abilities.
Though access to an answer through such a tool would have revolutionised my own educational experience, I think we can agree that it’s only after grappling with the problem at hand, and then working through the solution step by step, that we internalise and learn from the work.
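As a toy illustration of how instantly such tools can hand over a finished answer, here is a hypothetical “homework helper” that emits a fully worked solution for a simple linear equation. It is a sketch of the convenience being described, not Photomath’s actual pipeline.

```python
# Hypothetical "homework helper" that hands back a complete worked
# solution in one call -- the convenience that can short-circuit a
# student's own reasoning. Not Photomath's actual pipeline.
def solve_linear(a: float, b: float, c: float) -> list[str]:
    """Solve a*x + b = c and return the worked steps as strings."""
    steps = [f"{a}x + {b} = {c}"]
    steps.append(f"{a}x = {c - b}   (subtract {b} from both sides)")
    steps.append(f"x = {(c - b) / a}   (divide both sides by {a})")
    return steps

for step in solve_linear(3, 4, 19):
    print(step)
# 3x + 4 = 19
# 3x = 15   (subtract 4 from both sides)
# x = 5.0   (divide both sides by 3)
```

Every step the program prints is a step the student no longer has to reason through themselves.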
Job Displacement
AI automation might impact educators’ roles, particularly in areas like grading and individualised learning plans. While AI can automate tasks like grading multiple-choice quizzes, it can’t replicate the nuanced feedback a human teacher can provide on essays or projects.
EssayGrader, for instance, is an AI-powered tool that can assess essays and provide scores, but it falls short of offering the in-depth, personalised feedback that a teacher can give to encourage student growth. Teachers will still be essential for guiding students, providing constructive criticism, and cultivating a love of learning.
Finding Balance: Mitigating the Risks
Despite these challenges, AI can be a valuable tool when implemented thoughtfully.
Here are a few ways to mitigate the risks that come with the use of these tools:
Teacher Training and Oversight
Equipping educators with the skills they need to assess AI tools, identify biases, and curate appropriate learning experiences is crucial.
Teachers should be trained to critically evaluate AI-powered learning platforms for potential biases and ensure they align with the curriculum and learning objectives. They should also be empowered to choose the most effective tools to complement their teaching style and cater to the specific needs of their students.
Furthermore, schools can be provided with shortlists and reviews of suggested AI tools that have been found to adhere to recognised standards of testing and review.
Data Transparency and Security
Prioritising data privacy and security through robust data governance practices is essential.
Schools should clearly communicate to parents and students what data is being collected by AI platforms and how it’s being used. They should also implement robust data security measures to prevent unauthorised access and ensure student information remains protected.
Governmental lists of approved tools, and investigatory bodies that examine AI tools developed for education, are a useful starting point when considering data transparency and security. Equally important is the continued review of tools as they develop, and whether they maintain adherence to regulatory requirements.
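As a sketch of what “robust data security measures” can look like in practice, the snippet below pseudonymises a student record and strips everything an external AI platform does not strictly need. The field names, salt handling, and allow-list are illustrative assumptions, not any specific platform’s requirements.

```python
import hashlib

# Illustrative data-minimisation step before sending a student record to
# an external AI platform. Field names and salt handling are assumptions
# for demonstration, not a specific vendor's API.
ALLOWED_FIELDS = {"grade_level", "reading_score"}  # share only what the tool needs

def pseudonymise(record: dict, salt: str) -> dict:
    """Replace the direct identifier with a salted hash and drop every
    field not on the allow-list."""
    token = hashlib.sha256((salt + record["student_id"]).encode()).hexdigest()
    minimal = {key: value for key, value in record.items() if key in ALLOWED_FIELDS}
    minimal["student_token"] = token  # only the school can map this back
    return minimal

record = {"student_id": "S-1042", "name": "Alice", "grade_level": 7, "reading_score": 82}
print(pseudonymise(record, salt="school-held-secret"))
# {'grade_level': 7, 'reading_score': 82, 'student_token': '...'}
```

The design choice here is deliberate: the external platform receives only what it needs to personalise learning, while the mapping back to a named student stays with the school.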
Focus on Human-AI Collaboration
AI should complement, not replace, teachers. The human touch remains irreplaceable in fostering creativity, critical thinking, and social-emotional learning.
Instead of viewing AI as a replacement, teachers can leverage it as a powerful tool to enhance their teaching experience.
For example, AI tutors like Carnegie Learning’s MATHia software can provide personalised practice for struggling students, adapting to their individual needs and providing targeted support. This frees up the teacher’s time to focus on small group instruction or individual mentoring, allowing for more personalised attention and deeper learning experiences.
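The adaptive loop behind such tutors can be pictured with a simple mastery heuristic. The sketch below is a generic illustration of difficulty adjustment, with invented thresholds and a made-up 1-5 difficulty scale; it is not Carnegie Learning’s actual algorithm.

```python
# Generic mastery-based difficulty adjustment, in the spirit of adaptive
# tutors. Thresholds and the 1-5 difficulty scale are invented for
# illustration; this is not MATHia's actual algorithm.
def next_difficulty(current: int, recent_results: list[bool],
                    mastery: float = 0.8, floor: int = 1, ceiling: int = 5) -> int:
    """Step difficulty up once recent accuracy passes the mastery
    threshold, and step it down when the student is struggling."""
    if not recent_results:
        return current  # no evidence yet, stay put
    accuracy = sum(recent_results) / len(recent_results)
    if accuracy >= mastery:
        return min(current + 1, ceiling)
    if accuracy < 0.5:
        return max(current - 1, floor)
    return current  # keep practising at this level

print(next_difficulty(2, [True, True, True, True, False]))  # -> 3 (mastered)
print(next_difficulty(2, [False, False, True, False]))      # -> 1 (struggling)
```

Notice what the heuristic cannot see: why the student is struggling. That judgment, and the encouragement that follows it, remains the teacher’s.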
Promoting Digital Literacy
Cultivating digital citizenship skills in students empowers them to navigate the online world critically and responsibly.
Students need to be equipped with the skills to critically evaluate information encountered online, identify potential biases in AI-generated content, and protect their privacy in the digital world.
Visit our blog “Empower your Students through Digital Citizenship” for more information on how you can help your students become responsible citizens.
A Measured Approach to AI in Education
By acknowledging the potential downsides of AI, we can develop a more nuanced approach to its implementation in K-12 education. One key aspect to consider is that while AI models can provide personalised learning experiences, they do not fully account for the social, emotional, and economic backgrounds of students. Teachers, on the other hand, are uniquely equipped to understand and respond to these diverse needs, fostering a supportive and inclusive learning environment.
Mobile Guardian understands the impact AI has on student learning, and that responsible intervention and access control are vital aspects of a balanced educational ecosystem. By combining the strengths of AI with the human touch of teachers, we can create a more holistic and equitable approach to education that benefits all students.
For more information on access control, see our previous blog post, “How to Block ChatGPT”.
You can also discover more about “Artificial intelligence in education” on our blog.
Onwards,
Panashe Goteka
Team Mobile Guardian