Standardized tests have been a staple of education in the United States for about 100 years. They’ve been used to uniformly assess the aptitude of students from kindergarten through college age. They determine which universities a particular student can attend, whether a child advances to the next grade and, under the No Child Left Behind Act of 2001, how much federal funding a school gets.
Those assessment methods can be beneficial, exposing specific areas where students, and even entire schools, can improve their educational outcomes. However, a major critique of standardized tests is that those same “outcomes,” a.k.a. students’ quantitative scores, take precedence over soft skills. They also diminish the value of nuanced teaching methods that cover a wide variety of subjects, and they fail to cater to students’ different learning styles.
And these critiques are not new. In 1920, the influential education reformer, philosopher and psychologist John Dewey was critical of how much the education system was beginning to rely on standardized student assessments at the time.
“Our mechanical, industrialized civilization is concerned with averages, with percents,” said Dewey. “The mental habit which reflects this social scene subordinates education and social arrangements based on averaged gross inferiorities and superiorities.”
I recently spoke to Amanda Slavin, an education-focused entrepreneur with a master’s in curriculum and instruction and author of “The Seventh Level,” who reaffirmed Dewey’s century-old assessment. “We’re obsessed with outcomes and productivity in Western society. Our traditional education curriculum was built around the industrial revolution. It was meant to create factory workers that were output machines. Productivity was the point.”
What does all this have to do with generative AI, particularly in schools? Everything.
Our obsession with outcomes and productivity has left major aspects of how and what children are taught unchanged for generations. Then ChatGPT came out and upended massive parts of our education system in a week. Students experimented with it, then the headlines poured in about cheating. Some schools, like New York City public schools, rushed to ban it on their networks.
Slavin argues that suppressing gen AI in education is shortsighted and damaging for a number of reasons. First, it keeps the antiquated, outcome-based learning system we’re accustomed to in place. Standardized learning must be replaced with something, and generative AI offers an opportunity for students to engage in a more human-focused, soft-skills-based curriculum centered on collaboration, creativity, problem-solving and more. Banning it also keeps students — who are already more tech savvy than adults — stuck in the past as the rest of the world heads toward a gen-AI-infused future. Lastly, barring this technology discourages teachers from using it to address their pain points around lesson planning, grading and overall time management.
Adjusting education to generative AI is no small feat, but it’s both doable and necessary. Teachers are embracing it and professors are already adapting their curriculums to make room for more group work, oral exams and handwritten work in its wake. Slavin believes that rather than being fearful of this technology, adults should welcome it; we need it to move from our existing outcome-based education system to one where students can use AI to be more human. Below, she dives into just how we can collectively make that happen — and the importance of doing so.
Slavin’s note for educators: “In no way, shape or form is this the fault of teachers. I feel that there’s no more important job than a teacher, and they should be up there with the most highly paid professionals. I’m saying this is a problem with the system as a whole.”
How is generative AI shaking up the education space?
I think anything that's new and innovative is terrifying to an industry that is outdated and needs to be improved. So generative AI is a very glaring, terrifying reminder of that to a system that has needed to be brought into the future for decades. The pandemic was a perfect example. Most school systems were completely distraught because they had no idea how to teach children in digital environments. And that just shows how far behind we are within the way that we think about education and technology and access to that technology.
And I think that the problem we’re facing is that our society, even outside of education, is obsessed with hyper-growth, productivity and outcomes. Many teachers now are cogs in a wheel to reach a standard based on the premise that productivity at all costs is success. Generative AI comes out and it creates a spiral effect for educators. Many thought, “Oh no, this is ruining all the hard work I’ve done. Now, I’m not going to be able to get my kids to do these essays. They’re going to be cheating. Now they’re not going to be able to reach the standards and I’m going to lose funding.” Which is the reality for many educators, as funding is based on outdated models that do not serve the students or teachers.
It is so sad that officials in the New York City school system, which is in the state that spends the most on students, needed to ban something [ChatGPT] that is so innovative because the only way they can get funding is to be dependent on outputs that cannot change.
So all of this means there are much bigger problems than generative AI. There are so many things in education that need to be discussed and transformed. But there's not enough time because teachers are overworked, underpaid and leaving their field at accelerated rates. They can't be innovative or even think critically anymore because they need government approval to determine what they can teach. So in order for us to have conversations on anything, we need to create opportunities for these teachers to not be measured solely on outcomes.
How important is it that the education system actually allows students to learn about and use generative AI instead of banning it?
The NYC Department of Education said that ChatGPT is not teaching critical thinking skills. ChatGPT is a tech tool — it’s not meant to teach children critical thinking skills. We need to teach our children critical thinking skills so they can be ready to use tools like ChatGPT and Jasper and Lensa to enhance the work that they actually want to put into the world.
All of this is showing us the cracks in the veneer of our education system. ChatGPT was out for one week and kids learned how to hack and break the educational system that was designed for them. That’s the problem that we should be looking at, not the tool.
We're no longer in the Industrial Revolution. We're in a technological revolution with very powerful AI tools. Shouldn't we change what we're teaching our kids so that they can work collaboratively with the technology instead of against it, or thinking it will replace them? The solution is not to ban it. The tool is not dangerous, but banning it is. And a lot of our kids are going to use this whether or not we like it. So let's teach them how to use it well versus just abusing it.
Innovation is terrifying to adults. But our kids are not terrified. Technology is just a part of who they are. Adults need to level up and evolve because our kids are going to know how to use the technology but parents are going to be left behind.
What does it mean to prepare children for a new world of technology that involves generative AI?
Saying that we shouldn't be teaching our children these tools is actually saying we want to leave our children behind. The tech is here whether or not we like it. It would be like banning teaching kids how to drive even though cars are here. Are your kids going to have to walk everywhere, which doesn't allow for them to get to their job or transport their families, while everyone else is driving?
And because we are measuring success in all the wrong ways, we are setting our children up to continue being in a mindset that is obsessed with outputs versus understanding the process behind learning so that they can continue to iterate and grow, like AI.
We should be learning how to utilize these tools in a way that supports our students and enhances their humanity in a world of technology. Students should be taught to lean more into foundational soft-skills like compassion, empathy, leadership, being entrepreneurial, critical thinking and creativity. These traits aren’t being taught much right now because we're stuck in the past. And the future is happening every single day.
And the thing with generative AI is, if we don't teach our children to have discernment, empathy, compassion, and to care about other people, we're going to have AI that is racist, sexist and prejudiced. AI is not an isolated problem — human beings are a problem. It’s taught by humans. So if we don't change, we are going to be in trouble. We need to become better and that's a conversation people don't want to have. From there, we can teach our technology to be better.
It seems like certain elements of the education system need to be reworked at a high level. What does that look like? What are some things we should consider?
We need to tell everyone in education, “This is not just about the standardized test scores or grades. We are giving you an opportunity to change the way that we measure your success and students’ success. We're changing engagement, which is a two-way street. We're changing the way that we talk about success along the education journey, not just the outputs. We also want to measure how children learn human skills and the process of critical thinking.”
We need conversations — teacher and student, parent and student, teacher and parent — around how each individual student engages with the world around them. Because there's no such thing as a “standard” student anymore. How can we create personalized learning environments that inspire students, so when a generative AI tool or when a Web3 tool comes out, it's not as scary because we've evolved the process of learning? Then teachers, parents and administrators aren’t as afraid because they can see these are just tools to enhance learning versus destroying the outcomes that we are so dependent on.
How can generative AI help educators?
I would start by determining how we as innovators and technologists can make educators’ lives easier and support them. How can we grow with them versus having it be us versus them? We're all a part of this problem. So how do we come up with solutions in a collaborative way? Then we can go to the NYC Department of Education and ask them, “What problems are you trying to solve that you cannot?”
A lot of issues in education come down to a lack of time, energy and money. And I think what’s really going to give educators time, energy and money back is changing our obsession with standardized scores and outcomes. If they change standardized tests, something has to replace them. To do that, I would work with teachers and say, “Here's a generative AI program designed to make your life so much easier and also address more critical thinking in learning.”
For example, if you incorporate ChatGPT or Jasper into the curriculum, the output is going to change. So rather than having students write a five-paragraph essay about a book like George Orwell’s “1984,” the final project could be creating a presentation about a business that would be accepted in that universe. Students could interview the AI and create content and imagery with it. From that experience, we're going to have multiple ways of assessing critical thinking.
I’ve also worked with a series of teachers who told me they spend dozens of hours a week on curriculum planning. AI can help teachers develop their curriculums using their creative ideas. It can streamline that process to make it more efficient for them. Then they can spend more time personalizing learning for students. Or the tool can help them grade assignments.
We [technologists] can show educators what these tools can do for them once we learn more about their needs. It can open more time and resources to allow them to understand students better and be with them more, which is what teachers enjoy. But in order to do any of that, we need to value the idea of teachers working with their children more and personalizing time with them versus, again, outcome and output. And that is a society thing — it’s not rooted in the child or the board of education.
We need to change the way that we measure success around learning. Period. And we need to use AI to help us do more of the things we love; to give us time back and make us more human.