AI technologies raise novel questions around data security, attribution, and ethics. Many of Carleton’s existing policies still apply to AI, but in some cases this technology requires new policies or new interpretations of existing ones.
This page will provide links to policies that apply directly to AI use at Carleton and highlight any additions and changes as they’re made.
Data Security
Carleton employees should not enter sensitive data or proprietary content into generative AI tools, as it’s unclear how this data might be used or reproduced elsewhere. This includes platforms like ChatGPT, Claude, and Perplexity. The one exception is Google Gemini, which may be used as described below.
Carleton has an ongoing agreement with Google regarding data privacy, and Google has stated that it will not use information entered into Google Gemini to train its AI models. Given these stipulations, Carleton community members are welcome to submit low- and medium-risk data (e.g. grades or financial information) into Gemini when they’re logged into the system through their Carleton accounts. Under no circumstances should individuals share high-risk data (e.g. Social Security numbers or credit card information) with Gemini.
Before sharing medium-risk data with outside vendors, including other generative AI tools (e.g. ChatGPT), please contact infosec@carleton.edu.
Carleton’s existing policies apply:
Academic Integrity
Carleton recently reworded its Academic Integrity policy to better articulate the guidelines around AI use in students’ academic work. The full policy is in the Campus Handbook, but the crucial phrase is that students are expected to complete all assignments “in a manner ensuring that the core intellectual substance of the assignment is carried out by the student themselves.”
Ultimately, the responsibility lies with each student to ensure that they identify and complete the core intellectual substance of their work. But faculty are encouraged to discuss this with students, both generally and with regard to AI specifically, to help students understand what the work of their courses is meant to be, how it’s meant to benefit their learning, and how AI tools might assist or hinder that learning.