Artificial Intelligence (AI) in College

We’re including this section on “AI in College” because it’s clear that generative AI (GenAI) tools, including platforms such as ChatGPT, are beginning to transform what student workflows and class instruction look like in higher education. This emerging technology is reshaping what it means to be a student and professional, and yet we’re at a rather messy stage right now with no real consensus on how these tools should be used, or whether they should be used at all. Some students and faculty actively avoid GenAI in the classroom. Others are not yet familiar with the wide range of tools and capabilities available to students and instructors. And some are embracing these new tools and actively experimenting with them.

Regardless of whether you are currently using AI in your classroom, all of us are feeling its impact.

The purpose of this section

This guide does not offer suggestions for how students or educators should use large language models (LLMs), such as ChatGPT, Google Gemini, Microsoft Copilot, or Anthropic’s Claude. As faculty ourselves, we understand the need to slow down and consider new tools critically. A deliberate and critical approach to generative AI is particularly important, since tools such as ChatGPT struggle with accuracy and hallucination, foster bias and censorship, and can easily become a substitute for thinking.

However, we do provide a chapter that offers guidance on ethical considerations and how to adhere to your classroom’s academic integrity policy. At this moment, most colleges are developing guidance or policies around AI use in the classroom. In addition, instructors at CWI include careful wording in their course syllabi about what constitutes acceptable versus unacceptable uses of AI. Students should become familiar with their institution’s and instructor’s AI policies as they navigate the AI landscape.

At the College of Western Idaho, school syllabi now include the following language:

Practicing academic integrity includes, but is not limited to, non-participation in the following behaviors: cheating, plagiarism, falsifying information, unauthorized collaboration, facilitating academic dishonesty, collusion with another person or entity to cheat, submission of work created by artificial intelligence tools as one’s own work, and violation of program policies and procedures.

Students must understand how their specific “working with AI” practices relate to this institutional expectation and their classroom’s syllabus policy. For example, how should Quillbot’s paraphrasing and co-writing capabilities be classified? Should students be allowed to use Grammarly (or even Word’s built-in grammar checker) to correct their grammar and syntax? This section will provide an overview of what AI assistance looks like and guidance on how to navigate those kinds of issues.

This section includes the following chapters:

Note about updates to this textbook section

The practice of working with AI is evolving rapidly, and this section will be somewhat outdated as soon as it is published. One affordance of OER, however, is that we can update this textbook more frequently than a traditional textbook. We intend to maintain this section regularly and will update it on a semester-by-semester basis.

License


Pathways to College Success Copyright © by CWI 101 Leaders is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
