As technology like ChatGPT evolves, Northwestern weighs the implications

Members of the Northwestern community gathered recently for the Winter Faculty Webinar to discuss generative artificial intelligence—specifically ChatGPT—its growing popularity and its potential impact on university culture.

Hosted in early March by the Office of the Provost, the webinar featured several guest speakers and attracted approximately 200 participants.

“Generative artificial intelligence is becoming increasingly available and will impact teaching and learning in many ways,” Provost Kathleen Hagerty said in her introduction.

To help Northwestern address the questions, concerns and opportunities associated with the emergence of these technologies, Hagerty announced the creation of the Generative AI Advisory Committee, a multidisciplinary group of experts charged with advising on an institutional approach to AI and coordinating best practices across Northwestern’s schools and units.

The Office of the Provost has compiled a list of tools, upcoming events and answers to frequently asked questions.

During the Winter Faculty Webinar, several committee members offered perspectives on ChatGPT and discussed its implications for academia, faculty resources and institutional policy.

Here are four takeaways:

“The idea that I can take a set of data and make it understandable to everyone in many different ways is incredibly exciting.”

– Kristian Hammond, Professor of Computer Science

We live in a world, Hammond said, surrounded by data, numbers and symbols that are impenetrable to most people. But now we have the ability to take a dataset, distill it into raw facts, feed it to a language model and ask it to generate a story tailored to the needs of a particular community.

“AI can help speed up the generation of certain types of communications, but it cannot replace lawyers.”

– Sarah Lawsky, Professor of Law

In recent months, AI tools have passed law exams at major U.S. universities. What does this mean for law students? Think of generative AI as a co-pilot for lawyers. Routine tasks such as generating deposition questions, producing a first draft of a contract or summarizing a case are within reach for ChatGPT. Still, a person has to stay involved, Lawsky said. For a lawyer, “80 percent right is still very wrong.”

“Writing is much more than just a product.”

– Elizabeth Lenaghan, Director and Associate Professor of Instruction in the Cook Family Writing Program; and Assistant Director of The Writing Place

Lenaghan described a human approach to writing: fully appreciating the quality of a work requires feeling, an essence AI cannot tap into because it lacks the capacity to understand its own product. Still, she said she is confident that artificial intelligence can be useful in streamlining the writing process, helping not only to improve grammar and syntax but to explain why a particular word choice works or doesn’t, thus serving as a teacher. To that end, she recommended asking generative AI to do more than one thing, as this can encourage users to think about how that feedback can be incorporated into their creative process.

“This represents a unique opportunity to co-discover with our students, a chance to highlight the value of lifelong learning.”

– Jennifer Keys, Senior Director of the Searle Center

Highlighting how educators can engage students in conversation about this topic, Keys suggested that students need critical thinking skills to be workplace-ready. The best thing educators can do, she said, is help students understand the limits of generative AI so they can make informed decisions about when these tools add value.
