This guide aims to help instructors learn about generative artificial intelligence (AI) tools like ChatGPT, their capabilities and uses in teaching, and key considerations for course redesign. It is based primarily on faculty discussions in the spring of 2023, in response to the emergence of generative AI, with ChatGPT as the most prominent example, and its implications for higher education. We have also drawn on discussions with Duke Learning Innovation and on user scenarios from Duke courses. Generative AI tools are constantly evolving, so we will keep updating this guide in line with Duke’s Guidance for Instructors as well as DKU’s Faculty Resource Guide for Teaching and Learning with AI.
Following preliminary discussions, the Language and Culture Center (LCC) formed a working group to discuss guidelines specific to language courses. The group produced an LCC Faculty Guide on Use of AI Tools, which is referenced several times in this guide.
To develop this general guide on AI and teaching, the Center for Teaching and Learning (CTL) facilitated an informal faculty working group that included six faculty members from three divisions and the LCC. The working group meetings led to a draft guide that was shared with and reviewed by administrators overseeing teaching and learning, colleagues at Duke Learning Innovation, and leading experts in the field. This guide reflects the combined efforts of all of these groups.
Informal Faculty Working Group on AI and Teaching
Minghao (Rainie) Zhang, Language and Culture Center
Junyi Li, Language and Culture Center
Daniel Weissglass, Division of Arts and Humanities
Ming-Chun Huang, Division of Natural and Applied Sciences
Ferdinand Kappes, Division of Natural and Applied Sciences
Paul Stanley, Associate Dean of Undergraduate Studies
Since late 2022, the AI app ChatGPT has attracted enormous attention in the higher education community. Like the AI-based tools we already use in everyday life (such as Grammarly and Google Maps), generative AI technologies use machine learning algorithms that draw upon a large language model (LLM) to generate content in response to a user prompt or question. AI tools are no longer limited to text generation; some can create images, videos, or music.
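To make the prompt-and-response mechanic concrete, here is a minimal sketch of calling an LLM programmatically. It assumes the OpenAI Python SDK and an API key stored in the OPENAI_API_KEY environment variable; the model name is a placeholder, and comparable services from other providers follow the same prompt-in, text-out pattern.

```python
# A minimal sketch of prompting an LLM, assuming the OpenAI Python SDK
# (`pip install openai`) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name; substitute your own
    messages=[{"role": "user",
               "content": "Summarize the causes of the French Revolution "
                          "in two sentences."}],
)
print(response.choices[0].message.content)  # the generated text
```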
Some have expressed great concern about the threat to student learning and academic integrity, while others see an opportunity to learn and work with evolving AI. After months of widespread attention and discussion, educators have learned more about the limitations of generative AI tools. Meanwhile, many are using this moment as an opportunity to rethink approaches to higher education and to innovate teaching practices.
Higher education bridges students’ experiences in school to industry or academia, so college students are deeply concerned about how this trend will affect their learning, research, and future careers. Many have discussed and used generative AI tools inside and outside of class. This is an opportunity for instructors to start a conversation with students in the classroom and to update their course syllabi accordingly, including careful attention to assessments that use this technology where appropriate.
Conversation about AI
Both instructors and students can freely share their experiences with AI. Using user scenarios, instructors can facilitate a discussion that analyzes the benefits and risks. Instructors, as disciplinary experts, are also encouraged to bring disciplinary standards surrounding the use of AI into the discussion. This open and transparent conversation will not only raise students’ awareness of AI and clarify expectations, but also empower them to become active, self-directed, and independent learners.
Course Policies and Expectations
In higher education, disruptive technologies like generative AI are sources of understandable anxiety, but like all disruptions, they also herald new opportunities for teaching and learning innovation. Our task is to adapt to this new technology while preserving important norms of the academy. To that end, the standard policy on the DKU syllabus template will be as follows:
“Please include explicit guidelines for what kind of resources, collaborative work, and/or assistance from others is and is not allowed, and we encourage you to include this information at the assignment level, rather than the course level. Precision and clarity are to be preferred. Since AI is a novel technology that combines some of the features of traditional online sources and some of the features of a more responsive actor, be certain to indicate where students have permission to use generative AI tools and when they do not, and be mindful of assessment design that supports your aim.”
At the course level, instructors are strongly encouraged to revisit the course learning objectives and evaluate the course’s position in the major or overall curriculum. In particular, instructors may wish to explain and expand the DKU policy with course-specific examples, such as potential uses of AI tools that students and faculty have discussed. Read this post for more tips before adding the policy to the syllabus.
Expectations for the use of AI tools may vary across disciplines and courses. Examples of statements about the role or use of AI are listed in the “Emphasize Academic Integrity” section below. Some see ChatGPT as analogous to a calculator: just as students are expected to learn basic arithmetic before using calculators, they are expected to master basic writing skills regardless of genre or discipline. In a writing-focused course, instructors may refer to the LCC Faculty Guide on Use of AI Tools produced by the LCC working group.
Ethical and Responsible Use of AI
If instructors allow the use of AI in coursework, it is important to educate students about AI ethics. Students should be encouraged to:
Cite appropriately if generative AI is used.
Indicate their learning progress and how their own work builds on AI-generated content.
Be aware of the hallucinations, biases, and discrimination that AI can amplify.
Generative AI tools can also be used to improve teaching efficiency and promote teaching innovation. Faculty members have experimented with these tools and shared examples that go beyond basic fact-finding and information organization:
Create course materials, such as draft syllabi with essential components, lab report templates, sample essays, grading rubric templates, draft teaching slides, and class discussion content.
Design activities and games, including but not limited to asking generative AI tools to provide a template of activity/game rules, identify missing pieces in the activity design, or suggest simplifications for beginners.
Engage AI bots in the classroom to generate different examples or perspectives and enrich the discussion. Some have created bots for interactive learning opportunities (a minimal sketch follows this list).
Teach ethics with AI, by inviting students to analyze AI-generated content and the ethical use of information.
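For instructors curious about what building such a bot involves, the sketch below wraps an LLM in a discussion-facilitator role. It assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable; the model name and system prompt are illustrative, and other LLM providers offer similar APIs.

```python
# A classroom discussion-bot sketch, assuming the OpenAI Python SDK
# (`pip install openai`) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A system message frames the bot's role; this wording is illustrative.
history = [{
    "role": "system",
    "content": ("You are a discussion facilitator in an undergraduate seminar. "
                "Offer alternative perspectives and ask follow-up questions; "
                "do not write students' answers for them."),
}]

def ask(question: str) -> str:
    """Send one student question and keep the running conversation."""
    history.append({"role": "user", "content": question})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name; substitute your own
        messages=history,
    )
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

print(ask("What is one counterargument to today's reading?"))
```

Resending the full message history with each request is what lets the bot respond in context as a class discussion unfolds.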
Faculty serve as role models here. If they use AI to generate assignments, ideas, drafts, or templates, instructors should cite the source. It is also beneficial to invite students to use AI tools for learning, to strengthen their independent learning, questioning, and critical thinking skills. For example, students may use AI tools as peer tutors that provide basic information and immediate feedback. However, AI-generated content may be outdated, incorrect, or completely fabricated. Both instructors and students need to be aware of these limitations, experiment in small steps, and use the content critically.
Some faculty have expressed great concern about generative AI, especially about how to prevent students from making unauthorized use of AI in their coursework. In addition to clear course policies, effective assessment design is equally important. Instructors can take this opportunity to rethink each assessment (e.g., type, prompt, grading criteria), how well AI can perform on it, and what students are expected to learn from it.
Add authentic assessment, such as asking students to relate to their personal experiences, respond to recent events, refer to specific texts discussed in class, or apply course concepts to a real-world case.
Relate to the most recent information or recently published articles, as ChatGPT’s knowledge cutoff, as of this writing, is September 2021.
Assess higher-order thinking through open-ended questions that require students to integrate pieces of information, even if they can get some of that information from AI tools.
Here is one way to revise a reading-response prompt:
“Look at the readings and analyze how the texts contradict or extend the discussion we’ve had.”
This assignment guides students to approach the readings as scholars.
Set benchmarks to demonstrate student learning in a major assessment or throughout the course, such as requiring multiple drafts.
To discourage students from taking shortcuts, instructors can scaffold a major project by requiring exploratory writing along the way. Throughout the project, students have ample conversations with the instructor and with peers, which scaffold and motivate them to learn deeply.
Experiment with alternative assessments, such as oral exams and in-class writing, to emphasize fundamental skills. This blog post from the Chronicle of Higher Education illustrates the future of undergraduate writing.
Emphasize Academic Integrity
Concerns about academic integrity are a direct result of LLM-based tools. This has prompted universities and faculty to review their policies and clarify their expectations around the use of AI tools.
Academic Integrity Policy Applied to DKU Undergraduate Courses
The current syllabus template, approved by the Undergraduate Studies Office, has a blanket policy that has been revised in light of generative AI tools.
“You are responsible for knowing and adhering to academic policy and procedures as published in the University Bulletin and Student Handbook. Please note, an incident of behavioral infraction or academic dishonesty (cheating on a test, plagiarizing, use of online tools prohibited by the instructor at the course or assignment level, etc.) will result in immediate action from me, in consultation with university administration (e.g., Dean or Associate Dean of Undergraduate Studies, Student Conduct, Academic Advising). Please visit the Undergraduate Studies website for additional guidance related to academic policy and procedures. Academic integrity is everyone’s responsibility.”
Educators have mixed feelings about AI detection tools and features. It should be noted that no AI detection tool is perfect; at most, its results can be used to start a conversation with students. Rather than relying on detectors, instructors are encouraged to watch for familiar warning signs, such as a mismatch in style and language, long passages without citations, and students who cannot explain their own work.
Statements Allowing Appropriate Uses of AI Tools
AI tools may be allowed in some courses. It is important to make this clear in advance to avoid misuse or abuse of AI tools. Instructors may refer to example statements from peer institutions and adapt them for the syllabus.
Equity is another challenge that AI can magnify. If any generative AI tools are allowed in a course, it is the instructor’s responsibility to ensure that all students have access to the same or similar tools; such access should not be taken for granted.
Many tools also collect information about their users, so instructors should encourage students to read each tool’s privacy policy and terms of use, and give students the option to opt out if they are not comfortable.
Additional resources from peer institutions and ongoing discussions in the higher education community can be found in the resource collection (NetID log-in required) maintained by the CTL.