2.3 Examination law
What to do if ChatGPT "helps" with an assignment? New challenges for examination processes
The use of AI in examination processes requires particular care and compliance with legal requirements. This is because examinations are legally significant procedures whose results determine qualifications, academic progress or access to further educational opportunities.
Legal framework and principles
In Germany, the legal framework governing the conduct, assessment, repeatability and contestability of examinations is primarily established by the state higher education laws and the examination regulations of individual universities. In addition, from 2 August 2026, universities must also comply with the provisions of the AI Act (KI-VO), particularly where high-risk AI systems are concerned.
A number of fundamental principles and requirements apply to higher education institutions regarding the use of AI systems in examinations:
- Transparency and traceability: Students must be fully informed about the use of AI-based procedures. This includes how the system works, the assessment criteria and the influence of the AI system on the overall mark.
- Accountability: Ultimate responsibility for the assessment must always lie with a natural person. Under examination law, AI systems must never autonomously decide on pass or fail, as this would violate the principle of the right to a fair hearing (Art. 103(1) of the German Basic Law) and the principle of proportionality.
- Challengeability: AI-based assessments must be justified and verifiable. Only if students can objectively understand the assessment is an effective legal remedy (such as an appeal or legal action) possible. Black-box models without explainability are therefore problematic.
- Equal treatment: AI systems must not result in unlawful discrimination (Art. 3(1) of the German Basic Law).
- Data protection and purpose limitation: All data processed during the examination process is subject to the GDPR. Processing must be necessary, limited to the specified purpose and transparent.
Examples of AI-supported examination applications
AI can be used in examinations in various ways, for example:
- Automated marking of multiple-choice examinations
- Analysis of free-text answers using NLP tools
- Plagiarism detection through database comparison, covering both "traditional" plagiarism (copying from existing sources) and AI-generated content (which, under examination law, counts as attempted deception rather than plagiarism)
- Generation of individual exam questions using language models
- Behavioural analysis in online exams for fraud detection
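The first item in the list above, automated marking of multiple-choice examinations, essentially reduces to comparing an answer sheet against an answer key. The sketch below (function and parameter names are illustrative assumptions, not drawn from any specific examination system) also flags borderline scores for review by a human examiner, in keeping with the accountability principle described earlier:

```python
# Minimal sketch of automated multiple-choice marking (hypothetical names).
# A natural person must still confirm the result; the function therefore
# only *proposes* a mark and flags borderline cases for human review.

def mark_mc_exam(answers, key, pass_threshold=0.5, review_margin=0.05):
    """Compare a student's answers against the answer key.

    Returns the score (fraction correct), a proposed pass/fail result,
    and whether the case should go to a human examiner.
    """
    if len(answers) != len(key):
        raise ValueError("answer sheet and key differ in length")
    score = sum(a == k for a, k in zip(answers, key)) / len(key)
    proposed_pass = score >= pass_threshold
    # Scores close to the pass threshold are flagged: the final decision
    # must rest with a natural person (accountability principle).
    needs_review = abs(score - pass_threshold) < review_margin
    return score, proposed_pass, needs_review

score, proposed, review = mark_mc_exam(["A", "C", "B", "D"], ["A", "C", "B", "A"])
print(score, proposed, review)  # 0.75 True False
```

Even this trivial marker illustrates why human oversight matters: the thresholds themselves are assessment decisions that must be set, documented and defended by the examination authority, not by the tool.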
High-risk AI systems and their consequences
Certain AI applications in the examination sector are classified by the AI Act as high-risk AI systems (Art. 6(2) AI Regulation, Annex III).
These include:
- AI systems used to determine access or admission to general and vocational education and training institutions at all levels, or to assign natural persons to such institutions
- AI systems intended for the assessment of learning outcomes
- AI systems intended for the assessment of the appropriate level of education that a person will receive or to which they will have access
- AI systems intended to be used for the monitoring and detection of unauthorised behaviour by students during examinations
Consequences of the AI Act: High-risk AI systems in the examination sector (effective from 2 August 2026)
For these high-risk systems, higher education institutions face specific and comprehensive obligations:
- Conducting appropriate risk assessments
- Using high-quality datasets to minimise discriminatory outcomes
- Logging activities to ensure traceability
- Producing detailed technical documentation
- Providing clear information to deployers
- Implementing appropriate human oversight measures
- Ensuring high levels of robustness, cybersecurity and accuracy
- Ensuring that input data are fit for purpose and correct
- Monitoring and reporting
- Conducting a data protection impact assessment (Art. 35 GDPR)
- Affixing CE marking
- Registering the system in the EU database for high-risk AI systems
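The logging and documentation duties in this list imply that every AI-assisted assessment step should leave an auditable trace linking the system's proposal to the human decision. A minimal sketch of such an audit record follows; the field names and schema are illustrative assumptions for demonstration, not a format prescribed by the AI Act:

```python
# Illustrative audit-log record for one AI-assisted assessment step.
# The AI Act requires logging for traceability; the concrete schema
# below is an assumption for demonstration, not a legal template.

import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class AssessmentLogEntry:
    exam_id: str         # pseudonymised exam identifier (GDPR: data minimisation)
    system: str          # name/version of the AI system involved
    ai_proposal: str     # what the system suggested (e.g. a provisional mark)
    human_decision: str  # final decision taken by a natural person
    reviewer: str        # role, not name, of the responsible examiner
    timestamp: str       # UTC timestamp of the decision

def log_entry(entry: AssessmentLogEntry) -> str:
    """Serialise an entry as one JSON line for an append-only audit log."""
    return json.dumps(asdict(entry), ensure_ascii=False)

entry = AssessmentLogEntry(
    exam_id="EX-2026-0042",
    system="mc-marker v1.3",
    ai_proposal="score 0.75, proposed: pass",
    human_decision="pass",
    reviewer="second examiner",
    timestamp=datetime.now(timezone.utc).isoformat(),
)
print(log_entry(entry))
```

Recording the AI proposal and the human decision side by side is what makes the assessment challengeable later: an appeal body can reconstruct whether a natural person actually exercised oversight.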
Exceptions to high-risk AI systems
A provider who considers that an AI system listed in Annex III does not pose a high risk must document this assessment before the system is placed on the market or put into service. Such providers remain subject to the registration requirement under Article 49(2) of the AI Act and must submit the documented assessment to the competent national authorities on request.
Recommendations for higher education institutions
Higher education institutions should develop and implement their own regulations based on the state higher education laws and the AI Act; as a result, guidelines and requirements may differ between individual institutions. Recommendations include:
- Inclusion of regulations on the use of AI in examination regulations and degree programme documents
- Involvement of legal departments, data protection officers and student representatives in the introduction of new systems
- Development of guidelines on assessment transparency and the documentation of results
- Establishment of an interdisciplinary committee to review the use of AI in examinations
- Conducting pilot projects with academic supervision to evaluate acceptance, fairness and impact
Conclusion: Comply with legal requirements, but safeguard students’ rights
The use of AI in examinations requires careful consideration of the opportunities and risks. Higher education institutions must comply with legal requirements, in particular those of the AI Act for high-risk AI systems, and safeguard students’ rights. Transparency, traceability and ensuring accountability by natural persons are key.
Further information (in German): "KI und Prüfungsrecht" (AI and examination law), Jens O. Brelle (Multimedia Kontor Hamburg), 12 March 2025: https://vimeo.com/1065804840/440e91c74e
Presentation (PDF) accompanying the video: https://www.mmkh.de/fileadmin/veranstaltungen/netzwerk_landesinitiativen/KI-Verordnung/2025-03-12_KI-VO_pruefungsrecht_brelle.pdf
💡 Learning Summary Chapter 2.3: Examination Law
- Examinations are subject to strict legal requirements: The use of AI in examinations must be transparent, traceable and overseen by natural persons. Students have a right to comprehensible assessments and a fair hearing.
- High-risk AI systems in the examination sector: According to the EU AI Act, certain examination applications are classified as high-risk. From August 2026, strict requirements such as documentation, human oversight and data protection impact assessments will be mandatory.
- Action required at universities: Examination regulations and study materials should clearly govern the use of AI. Universities should establish interdisciplinary committees, evaluate systems and involve students at an early stage.
References
Baresel, Kira; Horn, Janine & Schorer, Susanne (2025). Der Einsatz von KI-Detektoren zur Überprüfung von Prüfungsleistungen – Eine Stellungnahme.
Published by the Digitale Lehre Hub Niedersachsen (DLHN).
DOI: https://doi.org/10.57961/fjg9-jr89
Licensed under CC BY-SA 4.0
KI-Verordnung – was nun? Herausforderungen des AI Acts für Hochschulen / online event on 12 March 2025 / organised by MMKH and elan e.V. in cooperation with HND-BW, VCRP, vhb and NeL; funded by StIL https://www.mmkh.de/digitale-lehre/netzwerk-landesinitiativen/ki-verordnung