Purdue makes 'AI working competency' a graduation requirement

Purdue University last week said it will require incoming undergraduate students to meet an "AI working competency" requirement to graduate.

The yet-to-be-defined requirement, part of a broader AI strategy, will apply to freshmen at the university's campuses in West Lafayette and Indianapolis, Indiana, starting in the fall of 2026.

"The reach and pace of AI's impact on society, including many dimensions of higher education, means that we at Purdue must lean in and lean forward and do so across different functions at the university," said Purdue President Mung Chiang in a statement.

At a university senate meeting last month, Chiang, a professor of electrical and computer engineering, framed the new requirement as meeting the school's responsibility to ensure that graduates have ample employment opportunities.

"As you may have read in daily news articles and on traditional and social media, many companies, large and small, have stopped recruiting and have announced layoffs as well, sometimes in large numbers," Chiang said, according to the meeting minutes [PDF].

"This poses a substantial issue for universities to think about what kind of jobs are going to be displaced by AI or by those experienced with AI. So, we have a task in front of all of us to do everything we can," he added.

Purdue's AI strategy, referred to as AI@Purdue, covers five distinct areas: Learning about AI, Learning with AI, Research AI, Using AI, and Partnering in AI.

Learning about AI focuses on ensuring that Purdue students can use and think critically about AI. The university has tasked the deans of its academic colleges with developing discipline-specific criteria and proficiency standards for the AI competency requirement.

Learning with AI involves a pending revision of the university's policies about the use of generative AI in teaching and learning, the beginnings of which were published in November.

During last month's senate meeting, Mark Zimpfer, chair of the university senate and assistant professor of practice in Purdue Polytechnic's school of construction management technology, spoke about the need to reach a coherent and consistent set of rules for when using AI is appropriate.

"During my open office hours this morning in our department," he said, "I had a student come in and ask about clarification on our AI rules and compare that to a class that they had on another part of campus, and some conflicting language between the syllabi in the two classes and what was allowed and not allowed."

Research AI refers to efforts to incorporate and utilize AI in university research groups, including the Purdue Institute for Physical AI (IPAI), the Institute for Digital and Advanced Agricultural Systems (IDAAS), the Transportation and Autonomous Systems Institute (TASI), and various other R&D initiatives.

Using AI covers university efforts to equip staff and students with AI tools. It includes, for example, Purdue's 2024 deal with Microsoft to provide access to OpenAI's GPT-4 via Microsoft Copilot with Data Protection, which does not interact with M365 data and does not save chat data or use it for model training.

Finally, there's Partnering in AI, which addresses deals that the university has made with technology partners such as Google, Apple, and Arm.

A Purdue spokesperson did not immediately respond to a request for information about the terms of the deals the university has made with tech partners like Google.

The Register reached out to several members of Purdue's faculty and heard back from one scholar who agreed to share some thoughts on the AI initiative on condition of anonymity.

Our source said that AI has become a major discussion topic across campus. There's strong momentum for integrating it into classrooms and the university has supported efforts to understand the technology and use it effectively, we're told.

"From the faculty perspective, the goal is to treat AI as an enhancement to education, not a replacement," our source said.

Over the summer, we're told, Purdue hosted an AI Academy and faculty from all the colleges were invited to participate. Students, as noted by Zimpfer, want clear guidance on how AI can be used in a way that's consistent with academic integrity requirements.

Faculty members, our source said, have mixed feelings about the AI competency requirement. They appreciate Purdue's commitment to preparing students for a world where AI literacy looks likely to be essential. But they're also concerned about how the directive will be implemented.

"Many programs already integrate AI into coursework, so it's not clear whether students will need additional credits or if existing classes will suffice," our source said. "The university has stated that no extra credits will be required, which is reassuring, but the details remain vague. Different majors use AI in very different ways, and a uniform requirement risks being either too broad to be meaningful or too rigid to fit diverse disciplines."

In short, we're told, faculty hope students will benefit from the AI requirement, but they worry it could become more of a bureaucratic hurdle than an educational asset. ®

Source: The Register
