As the 2023-2024 academic year begins, change is imminent. Not just a technological change, but a societal one. A sea change. Some envision a brave new world, while others fear a new world order.
What’s ahead is difficult to grasp. Like the invisible radio waves that bring words and images to handheld devices, or the atoms and particles students learn about in science class, the pace of artificial intelligence’s innovation in education cannot be seen by the human eye. The next industrial revolution, driven by artificial intelligence, is already underway. It’s too soon to reflect on lessons learned and mistakes made.
The arrival of ChatGPT in late November 2022 prompted a sense of urgency for teachers, administrators and policymakers. On command, generative AI can produce essays in conversational language, solve complicated math problems or write lesson plans for any grade level — in a matter of seconds. Its potential uses haven’t been quantified yet. Education officials are still scrambling to rewrite the rules on plagiarism as technology companies churn out products to differentiate the writings of humans from those of AI.
CALLS FOR A PAUSE
Regulation and bans of ChatGPT and other AI tools vary by state, local school district and even organization: The College Board, which administers high school Advanced Placement courses, prohibits the use of ChatGPT under any circumstances, while the International Baccalaureate program welcomes it.
New York City Public Schools, the largest district in the nation with more than 1,800 schools and roughly 1 million students, initially placed ChatGPT on its list of restricted websites but later allowed it to be used by teachers and students alike.
“The knee-jerk fear and risk overlooked the potential of generative AI to support students and teachers, as well as the reality that our students are participating in and will work in a world where understanding generative AI is crucial,” New York City Public Schools Chancellor David Banks said in a prepared statement in May.
The use of AI in schools is still decided at the school board level. California, the largest state in the country with about 6 million students in K-12 public schools, has set a gold standard in Internet privacy laws that protect school children but has not established any laws or policies for or against AI use, said Niu Gao, a senior fellow with the nonprofit Public Policy Institute of California.
“Curriculum remains a local choice,” she said in an interview with Government Technology, “but access is always a concern.”
Some education leaders in Georgia are trying to set a proactive example for the nation to follow. Under the AI4GA program, middle schools in five counties piloted a nine-week elective called “Living and Working with Artificial Intelligence,” and a statewide AI curriculum for all grade levels is under consideration. Seckinger High in Gwinnett County, which opened in August and serves 1,500 students in grades nine through 12, proclaims itself the first AI-focused school in the nation.
THE ‘FLIP-PHONE PHASE’
During various presentations at this year’s International Society for Technology in Education (ISTE) conference in Philadelphia, where AI often dominated conversations, the show of hands of educators from across the nation who had actually used ChatGPT or other generative tools was not overwhelming, though it appeared most had heard of it. Likewise, those technologies still haven’t cracked a national top 40 list of ed-tech tools most commonly used in U.S. schools. But that could change fast.
“The student/teacher relationship will likely be changed forever as [AI chatbots] mature and greater student agency could be offered with an AI assistant,” the Consortium for School Networking (CoSN) noted in a paper for its membership published earlier this year, ChatGPT – Above the Noise. “There’s been a general recognition that this moment in time is a unique one. It’s the sort of thing you need to tell your friends about when you see it.”
Wendy Hanasky, an instructional technology consultant at the East Central Ohio Educational Service Center, said students in her region of the state, when they return to school this fall, are most likely to encounter chatbots or prompts on their first day of class. She also anticipates a strong demand for the latest Khanmigo tool, which, instead of producing information on command, converses with students and requires them to understand concepts and processes as they work together to complete assignments.
Speaking at the ISTE conference in late June, Hanasky was quick to remind her audience that, even by its inventor’s own admission, ChatGPT is in the “flip-phone phase” of generative AI technology, and she implored fellow educators to look at other major developments and predictions in AI outside the classroom: A man who had lost the use of his legs was able to walk again after he received an implant that allowed the brain and spine to resume communications with the help of AI; a fully paralyzed patient was able to send an email via his thoughts without using his voice or fingers; and a futurist predicted that by 2030 humans could achieve immortality with the use of tiny nanorobots that can “fix us” at the cellular level.
“How does this affect us in education? Greatly — especially for students interested in health-care careers,” Hanasky said.
Hanasky completed an eight-week AI professional development course for teachers last year, then familiarized herself with as many tools as possible and used them in conjunction with each other to develop lesson plans. For example, she used the image generator Midjourney to visualize the solar system while ChatGPT produced 20 true-false questions and homework assignments on the same topic.
For a social studies lesson, Hanasky combined Hello History (AI-generated historical figures) with Play.ht (text to speech), ChatGPT, Blockade Labs (360-degree images) and Thinglink (interactive images). Then she used the video generator D-ID to stitch the pieces together. The finished product was an activity where the user controlled a 10-year-old girl in the early 1800s navigating her way to freedom using the Underground Railroad. It looked and sounded like a newly released video game created by professional developers.
“It was all through prompts,” Hanasky said. “It took me about an hour.”
Using AI to teach various subjects is one thing, but explaining to students how AI works is another, Hanasky said. To illustrate the latter, she demonstrated a tool called Quick, Draw! that challenges users to draw six items within an allotted time period. After completing her last drawing in the exercise — a rabbit — Hanasky scrolled down a list of items organized in alphabetical order, clicked on a rabbit icon, and found that what she sketched just seconds ago was now in a live database of 133,000 other drawings. She was also able to point out how the AI tool categorized drawings by exact pen strokes and starting positions on the page. Those who use AI are also feeding it, making it bigger, smarter and stronger.
“This is where the data comes from,” Hanasky said. “It’s a fun but important way to teach students.”
The power of AI can be frightening, especially to those worried about job security. Hanasky encourages teachers to view ChatGPT as an intern: something to help write lesson plans and emails, but not something or someone one would trust to send out a message or document unchecked with one’s name on it, let alone a tool that could be counted upon to meet one on one with students or parents.
AN ASSISTANT, NOT A LEADER
The U.S. Department of Education Office of Educational Technology, in its May report, Artificial Intelligence and the Future of Teaching and Learning: Insights and Recommendations, identified key differences between technology and human teachers, noting that AI cannot meet learners where they are like teachers can, nor does it exercise commonsense judgment.
“Experts in our listening sessions warned that AI models are narrower than visions for human learning and that designing learning environments with these limits in mind remains very important,” the report said. “The models are also brittle and can’t perform well when contexts change.”
CoSN, in its ChatGPT – Above the Noise membership paper, identified the myths of the AI tool that educators should be aware of.
“AI bots don’t ‘know’ anything. They are simply a knowledge doppelganger, pretending to know in a convincing manner. They simply spit back data but do not have ‘understanding,’ as we think of human understanding,” the paper says. “Another myth that specifically hit K-12 education is that with the ability of chatbots to write code, perhaps we don’t need to teach coding anymore. The truth is that students need to understand the coding to find potential flaws and learn how to use AI-generated code to build upon. We need to teach coding with this in mind and accommodate the addition of a coding assistant; perhaps like a thesaurus or some inspirational prose may help a writer.”
The paper also challenged teachers to reflect on their initial perceptions and experiences of other recent technologies that may have also been considered “disruptions.”
“There have already been many disruptive IT analogs to which we’ve adapted. Mass adoption of the Internet was an information disruption. Napster was a distinct music industry disruption that is now normalized as many of us pay for a streaming version today. Google Docs for Edu [now Google Workspace] disrupted collaborative work and we’ve seen many predictive text adoptions in our docs, email and chats,” the paper says.
TO INNOVATE AND MOTIVATE
Looking ahead, teachers and ed-tech professionals share different visions of the future of AI in the classroom.
Martyn Farrows is the CEO of Dublin, Ireland-based SoapBox Labs. Using AI, the SoapBox speech engine powers tools that teach foundational literacy in more than 100 countries where English is spoken, and it recognizes dialects and accents within nations, states and even specific city neighborhoods. He said student assessments will change from their current linear process of marking periods and final exams. With AI, students are continually assessed, so there may not be a need for tests at the end of the academic year. AI could allow teachers to spend far less time on paperwork and grading, and more time working with students individually.
“It’s more fluid and flexible,” Farrows said in an interview with Government Technology.
Betia Bentley, a former teacher in Fayette County, Ga., who is now self-employed as an AI school curriculum developer, was involved in the AI4GA pilot program. She reflects on her own life experiences to understand the importance of AI and its future in public education. She grew up in a poor, rural farming community and did not pursue computer science in her youth, but took an interest after she was exposed to it. She got her first teaching job instructing broadcasting and communications. With a foot in the door, Bentley initiated computer science courses, which later became mandated statewide. AI education, she added, should be required nationwide.
“AI is in our life not by choice, so it should not be a choice to learn about it, just like English, math and computer science,” she said. “If I stayed in the same place, I would not have been exposed to all of this because it wasn’t relevant. We can’t think like that anymore. The approach to education and inspiring kids has to change.”
Bentley says the future of AI is about that inspiration. Tools will help children discover their gifts and talents, as opposed to faculty members guiding students toward certain academic paths or vocations based on outdated aptitude measurements. Children with different backgrounds can aspire to become cancer researchers, fashion designers, leaders, artists.
“They will be told: ‘Don’t just regurgitate — collaborate. I need for you to fail, and I need for you to get back up,’” Bentley said. “We must allow for students to be creative and collaborate with technology.”
Amber Jones, a middle school teacher in Douglas County, Ga., has been a part of the AI4GA pilot program with the state for two years now and instructs the nine-week “Living and Working with AI” course. The content is vastly different from the coding courses she taught in the past, because the focus is more on theory than application. Still, the kids spend time learning about things they like, including face filters, ChatGPT and self-driving vehicles. The main idea is that AI is already all around them.
“They can see it everywhere, so they should know that it’s impacting their lives,” Jones said. “They’ll learn that all of this is based on algorithms, and they need to understand that their data is tracked. They’ll dip their toes in a whole bunch of concepts.”
In learning about AI, Jones said, students develop a greater interest in English, math and other subjects. Math is applied when the kids work with graphs to understand how a computer’s brain is mapped with an axis to recognize dimensions and other categories. English applies when explaining how to best communicate with voice assistants like Siri.
Bentley said it’s hard to fathom what AI technology has in store for the classroom, but she is certain of this: Curriculums will become more standardized, much deeper concepts in AI will be taught and more applications will be carried over to other subjects. Most of all, school will become a more engaging and interesting place for young people.
Hanasky, the instructional technology consultant from Ohio, offered this advice for teachers: Augment technology into what you’re already doing, but start bringing students along. And think of whatever you presume is the latest and greatest AI educational technology as a bubble that will pop.
“We’re preparing them for careers that don’t exist yet,” she said. “Our students need to be able to bob and weave.”
This story originally appeared in the September issue of Government Technology magazine.