
AI at University: Raising Standards Is (Not Yet) a Solution

Dominik Herrmann

This is a translation of the German original.

A BR24 article discusses our Bamberg AI guidelines for teaching. It also quotes a user: “You would have to … raise the standards so that the effort in post-processing is also evaluated.” A fair impulse, but I think it falls short, for three reasons.

  1. Tool ≠ Access. Today o3 is the best tool; tomorrow it may be Gemini ULTRA. Several times per semester a new “calculator model” comes to market, and whoever switches may work less and get a better grade. Most educators cannot keep up with this pace, not for lack of will, but because this topic competes with subject matter and human supervision for limited time.

  2. Competency gaps. Many, but not all, students use AI for homework, and those who do use it at very different levels. One could argue it is the universities’ job to bring all students to the same level, i.e., to train them in using certain private-sector offerings. Whew. Training for specific products is difficult. Doesn’t the state interfere with the market when it recommends specific products, advocates for specific tools (by training only those), or even mandates certain tools in teaching? Prompt engineering is highly tool-dependent. Beyond that: good teaching needs time to mature. Teaching concepts and content are typically more or less fixed before the semester starts, which is far too slow for the current pace of change.

  3. Fairness and inclusion. If we raise the bar across the board, we primarily reward those who can afford the expensive professional “calculator.” What do we do about those who cannot spend 20 euros per month? Do we need AI financial aid? And who pays for it?

I believe banning AI makes no sense as long as we cannot objectively detect AI use and thus enforce the ban. It is not enough to check the final text; the thought process leading up to it could also have been AI-guided. We therefore cannot rely on AI detectors: state action must be comprehensible and free from arbitrariness.

At the University of Bamberg we therefore say: there are no simple answers to dealing with AI in teaching. It requires transparency and education. Our AI guidelines, together with our AI policy generator, help educators reflect on their own stance toward AI use and design sensible course rules.

» Link to the Bamberg Guidelines for AI Use in Teaching


This post first appeared on LinkedIn.