AI and Liberal Arts in the Classroom


Artificial intelligence tools have become a problem at Dartmouth. We can, and should, change with the times and learn how best to adapt to their demands. It would be foolish to disregard the technology’s benefits, particularly because everything is going to change with or without us. Right now, however, generative AI is being used to the detriment of students’ education.

While AI technologies like ChatGPT surely make menial tasks much more efficient and help us solve challenging problems, there are certain academic contexts in which these tools are completely inappropriate. They should not be used to draft paper outlines or produce arguments, let alone write essays outright. Summaries produced by AI should not be relied upon in lieu of actually reading books or articles. Particularly in the humanities, AI has become a cheat code to avoid the actual labor of undergraduate work. We have effectively outsourced our creativity and innovation to a machine that is incapable of original thought.

Generative AI technologies rely entirely on what has already been produced. When we ask one to write a history paper, for example, it will serve up argumentative frameworks that already exist. But the entire craft of history depends upon breaking new ground by looking at the past with a fresh set of eyes.

AI as it’s currently conceived and utilized by students (and faculty) poses a threat to the very nature of liberal arts. The point of our education here is not to learn how to be efficient. Maybe AI would be perfect in countries with planned economies and political repression, where innovative thought is forbidden. But here in America, and at Dartmouth, thinking is our primary strength. We study here because we learn how to think. If we ask AI to do that for us, why even attend Dartmouth?

All too often, I see people I know skip the readings in favor of a one-page summary on their computer screens. When it comes time to write the paper, they again use AI to plan its argument and organization. Doing this misses the point. Professors don’t really care whether you can produce some mediocre, six-page essay on racial injustice in To Kill a Mockingbird. You won’t remember what you write, and neither will your professor. They do care whether you have the skills to independently analyze a text and form your own conclusions, and so should you.

The use of AI is connected to the broader problem at Dartmouth of students finding shortcuts in their classes and not taking their work seriously. It’s not that these students are necessarily anti-intellectual or bad students, but they are part of a culture that doesn’t value learning for learning’s sake. They get good grades to get a good consulting job. They do just enough to please their professors without actually pushing themselves to grow as thinkers.

In the humanities and social sciences at least, I think it is destructive to allow the use of generative AI. I cannot think of a single scenario where it would be appropriate. You must be able to read a text closely and then organize your thoughts around it, and that includes the planning and drafting processes. Labor is what enables you to think.

Thankfully, I haven’t taken many classes where AI is allowed or encouraged, because much of the history department still values the liberal arts pedagogy. I cannot think of a single department, save for maybe computer science, where generative AI should be commonplace.

Despite some restrictive rules, students still find ways around them. They have gamed the system by becoming adept with ChatGPT, using AI in all its meaningful ways while covering their tracks. Detection software helps, but it isn’t foolproof. It’s a shame that paper writing, a true art form, has become diluted.

To combat this, unfortunately, a number of new practices should be encouraged. In-person blue-book exams should make a comeback. Computers should be all but banned in lectures and discussions unless absolutely necessary. Students should cite their sources exhaustively.

Ultimately, it’s up to the students to actually care about what we learn. Many of us are smart enough to figure out how to use AI discreetly. We should instead redirect those efforts toward getting as much out of our classes as possible. If you cheat on the readings, so be it, but it’s your loss at the end of the day. You get out of Dartmouth what you give to it; you can choose to learn something, or you can spend your time plotting the most streamlined path to graduation. Save generative AI for your pencil-pushing careers after graduation, where taking shortcuts doesn’t matter. For now, seize the chance to learn and grow, just as you came here to do.
