Fall classes are coming up quickly, especially for those of us offering or taking them. I’m very curious to see how academic institutions are deciding how to handle emerging AI technologies. That is, are policies around ChatGPT and its kin appearing? How are individual instructors rethinking and redesigning their sections as the collision of student with tech draws nigh?
Surveying the roughly 20,000 academic institutions around the world to see how they are reacting - in real time - is a project too large for a single Substack post or scholar. Today I’d like to offer a kind of schema of ways campuses might respond to AI, illustrating many with cases volunteered by readers. This is a draft framework, which reality will surely modify. I would love to learn of more examples from you, dear readers.
New policies

Here a college or university creates a new regulation concerning AI use. I’ve been looking for examples of this, but have yet to see one.
Now, College Unbound is making a fascinating and possibly unusual effort on this score, going through an iterative policy and practice development process that includes students. We’ll see what emerges from it. (thanks to Lance Eaton)
Revised policies

Some institutions have amended existing policies, usually those around cheating. For example, the American University of Armenia added a note about AI to its cheating policy:
6.4.2. Cheating. Cheating includes but is not limited to:
6.4.2.1. using or referring to notes, books, devices or other sources of information, including advanced Artificial Intelligence (AI) tools, such as ChatGPT, in completing an Academic Evaluation or Assignment, when such use has not been expressly allowed by the faculty member who is conducting the examination;
Hofstra University on Long Island made a similar amendment to its syllabus policies:
Hofstra University places high value upon educating students about academic integrity. At the same time, the University will not tolerate dishonesty, and it will not offer the privileges of the community to the repeat offender... Use of generative artificial intelligence tools (e.g. Chat GPT) must be consistent with the instructor’s stated course policy. Unless indicated otherwise in the instructions for a specific assignment, the use of Chat GPT or similar artificial intelligence tools for work submitted in this course constitutes the receiving of “unauthorized assistance for academic work”, and is a violation of the Hofstra University Honor Code. Students bear the ultimate responsibility for implementing the principles of academic integrity. [emphases added]
Guidelines

One alternative to implementing policies is to publish suggested codes of operation, or guidelines. The University of Illinois Urbana-Champaign has issued such a document, in some detail. According to Ted Underwood, Jamie A. Nelson played a major role in shaping them.
Career services

Academic institutions might ask the departments that prepare students for the labor market to make changes reflecting AI’s potential impact on the work world. On Twitter the excellent Amanda Sturgill remarks that her campus, Elon University, is considering this kind of thing.
Individual classes and faculty members

To the extent that instructors have autonomy over their sections’ policies and operations, they can craft their own AI structures. Lance Eaton has assembled a Google Doc with dozens of examples of this.
Topical resources

Some institutions are setting up professional development resources and opportunities. For example, The Ohio State University has posted a helpful webpage. (thanks to Terry Bradley)
Study

Here an institution commissions a group to analyze AI, get a handle on its implications for the school, and then begin any policy development.
Karl Aho mentions this is happening at Tarleton State University: “We've got a study group working on advising the provost about policy/guidelines. We don't have one in place yet.” Marc Watkins says that a campus-wide task force is working on spreading AI literacy across the University of Mississippi.
AI study can also take the form of formal classes. Auburn University launched a Teaching with AI online class for its faculty and staff. The University of Mississippi has conducted a summer class on AI.
Nothing

I don’t mean this to sound cruel or dismissive, but perhaps some colleges or universities aren’t changing anything on this score. This could occur for a number of reasons: satisfaction with current policies; considering the AI challenge to be minimal; confidence that instructional and support staff can handle any AI issues.
Bear in mind that what I see as “nothing” might mean a campus conducting serious research and planning, just not visible to the outside world.
Now it’s over to you, readers. Are you seeing examples of these categories? Are there other types of responses we should know about?
(thanks to Perry Share, Exhaust Fumes, Marc Watkins, and Brent Anders)
The AI-generated images here are straight out of a storyboard for a sci-fi horror film