Advancing Equity through Teaching with Artificial Intelligence

Flower Darby is an associate director of the Teaching for Learning Center at the University of Missouri. In this role she builds on her experience teaching in person and online for over twenty-eight years to empower faculty to teach effective and inclusive classes in all modalities. She is a coauthor of The Norton Guide to Equity-Minded Teaching (2023), along with Isis Artze-Vega (Valencia College), Bryan Dewsbury (Florida International University), and Mays Imad (Connecticut College). 

Flower Darby
Image Credit: Cameron Clark

In this age of Generative Artificial Intelligence (often abbreviated GenAI or, simply, AI), many educators are understandably apprehensive about students’ illicit use of these tools and the possible negative impacts on students’ critical thinking and authentic learning. I frequently hear from dedicated instructors who worry about how to ban or detect their students’ use of AI. While I empathize with and validate these concerns, I’ve concluded that ultimately this is not a productive way to spend our intellectual energy. Instead, I propose that we reframe the opportunity in front of us: we can help our students learn the ethical and responsible use of AI. In doing so, we are advancing equity in teaching, learning, and society more broadly. 

As one of four coauthors of The Norton Guide to Equity-Minded Teaching, I’ve argued that there is much we can do as instructors to promote equitable student engagement and learning in all disciplines and class sizes, whether we teach in person, online, or some combination of both. But does teaching with AI factor into this worthwhile goal? After months of study, exploration of emerging best practices, and conversations with faculty across the country and in varying institutional contexts, I’ve come to believe that it absolutely does. Here’s why. 

First, the use of AI has already become a case of the rich getting richer. Students who know how to use AI tools effectively are experiencing significant gains (as, indeed, are workers in virtually every industry). Students who don’t, possibly because they attended high schools or colleges that banned the use of AI, are already being left behind. Further, AI is transforming jobs. As many have quipped, “You won’t lose your job to an AI. You’ll lose your job to someone who knows how to use AI.” If we aim to prepare our students to be competitive in today’s job market, we should help them develop AI literacy, fluency, and awareness of the ethical considerations involved. 

However, instructors also have ethical concerns about the use of AI. We know that tools such as ChatGPT amplify existing societal biases. As Addy et al. (2023) argue in their essay "Who Benefits and Who is Excluded?", the output these tools provide reflects the voices and experiences of those who have traditionally held the most power, while marginalized voices and experiences are further excluded by the current approach taken by Large Language Models (LLMs). We may struggle with the environmental costs of bringing more tech into our lives, given the enormous amount of energy required for the computing that enables AI. Many of us are also concerned about the exploitation of human labor by companies that pay workers very low wages to help train their systems. Many of the people undertaking this work live in developing countries or hold marginalized identities in the U.S., and they experience genuine mental trauma while flagging offensive material as inappropriate.

We (and our students) may also be concerned about data privacy and security. By choosing to use these tools, we give away a lot of personal information, not to mention intellectual property. If we require students to use AI, we're asking them to give away these things as well. In addition to data privacy concerns, our students are not universally enamored with these tools. Many of them struggle to see the value of using them or worry about accusations of cheating if they do. Last but by no means least, another site of inequity is that some tools offer advanced capabilities only for a price. As with many things in our society, students who can afford to pay are at an advantage. For example, ChatGPT offers a limited free tier, based on GPT-3.5, which does not access the internet in real time. For $20.00 per month, users can access GPT-4, a significantly more powerful model with access to real-time internet data. For students facing housing or food insecurity, shelling out $20 per month for "better" AI may not be an option.

Despite these very real concerns, we should also consider the ethical benefits of teaching with AI. It can level the playing field for students with learning disabilities, neurodivergent students, and multilingual students. Additionally, teaching students how to use these tools appropriately is important for their future economic well-being. Recent research shows that employers are already willing to pay a 47% higher salary for skilled AI users, and that they're facing a shortage of such individuals (AWS & Access Partnership, 2023). But it's not just about money. We can foster the well-being of our future democracy and society as a whole by teaching students to be critical consumers and users of AI. Given the increasingly blurred lines between truth and artificially generated media and other artifacts, we owe it to our future selves to help students learn about the appropriate use of AI and to begin to cultivate critical awareness of its use more broadly.

So, what to do? If you haven’t already begun to incorporate generative AI tools into your teaching and learning activities, it’s not too late. Here are three first steps: simple ways you can get started with AI in your courses. For each of these suggestions, it’s important to talk with students (or engage in asynchronous discussions) about the appropriate use of these tools, their hesitations, and your own learning curve. This kind of transparent and vulnerable communication shows that you’re prioritizing student learning and building student trust—important efforts in our equity journey. 

Add a draft syllabus statement, if you don't already have one, and solicit student input before finalizing it. A quick online search will reveal many examples of syllabus statements that provide guidance on the use (or avoidance) of AI in a course. Better yet, ask an AI tool to generate one for you, then refine it with student input in a real-time discussion or an asynchronous activity using a collaborative document. This is an excellent way to increase transparency and invite student feedback, two key equity-minded strategies.

Assign students tasks that require them to learn more about the kinds of ethical considerations I've outlined above. For example, ask students to investigate one of these topics and build in steps where they use an AI tool of their choice to help conduct their research. Or they can take a deep dive into one particular tool, read and annotate its terms and conditions, and explore issues and concerns they find therein. Consider offering students choices so they can select a task or topic that most interests them. Foregrounding relevance in this way fosters intrinsic motivation, another approach that can advance equitable learning outcomes in your courses.

For an existing assignment, identify steps in the task where students might productively use AI. For an essay assignment, for example, you might suggest students use a generative AI tool to brainstorm ideas for an arguable thesis statement. Better yet, ask students to use the tool to develop a list of options from which they can choose. We all have different strengths, and we each may want to use AI in differing ways to help us be more effective and efficient in our work. Rather than establishing a one-size-fits-all set of assignment instructions, give students choices (another important equity-minded strategy) about where and how they might use AI in a particular assignment.

If, like me, you care deeply about creating inclusive and equity-focused learning environments, you’ll find your own motivation to begin to integrate AI into your teaching and your students’ learning. As we help our students navigate this terrain, we can explore efficiencies such as generating a rubric or creating case studies and scenarios for analysis. When we save time on the teaching-related tasks we all must fulfill, we’ll find we have more time for what really matters: interacting with our students. 

References 

Addy, T., Kang, T., Laquintano, T., & Dietrich, V. (2023). Who benefits and who is excluded? Transformative learning, equity, and generative artificial intelligence. Journal of Transformative Learning, 10(2), 92–103. https://jotl.uco.edu/index.php/jotl/article/view/518

Amazon Web Services (AWS) & Access Partnership. (2023, November). Accelerating AI skills: Preparing the workforce for jobs of the future. https://assets.aboutamazon.com/e1/a0/17842ee148e8af9d55d10d75a213/aws-accelerating-ai-skills-us-en.pdf
