
Generative AI

GenAI & Assessment

Generative AI technologies are rapidly transforming how we create, learn, and assess knowledge across higher education. Assessment design at both module and programme level requires thoughtful consideration to ensure validity and integrity. Rather than blanket prohibition or uncritical embrace of GenAI in assessment design, UCD is committed to evidence-based approaches that harness AI's potential while preserving the core values of academic excellence and intellectual growth.

Assessment design in the age of GenAI presents both challenges and opportunities. Common challenges include ethical concerns, privacy, security, academic integrity, and the risk of hallucinations. However, opportunities have also been identified, including GenAI's ability to support personalised learning, provide immediate feedback to students, and enable the development of more complex assessments (Mao et al., 2024).

Cartoon image referencing Hamlet. A man holds a skull in his hand. The image is captioned 'To use or not to use, that is the question'.

There are many ways to approach GenAI in higher education assessment and programme design. JISC (2025) identifies three main strategies: avoiding its use, outrunning it, or embracing and adapting to it. Regardless of the approach chosen, the key is being transparent with students about what AI use is permitted. As with many changes in higher education over the years, the core elements of assessment design remain central to any change in assessment.

In your disciplinary context: who, why, what, when and how are you assessing? (UCD T&L, 2025a).

Recent thinking advocates that the key to success in the era of GenAI is a return to basic good assessment design. Corbin et al. (2025) advocate thinking about design in terms of what they describe as ‘structural changes’, i.e. changes made to the nature, format, or mechanics of the assessment itself. This approach emphasises more ‘live’, process-driven assessment that captures students’ attainment over time, ‘constraining’ the use of GenAI. They emphasise how all assessment tasks should come together as a whole in a module to ensure validity. This approach strengthens the integrity of the assessment, but may be hard to implement in some contexts, such as large class settings with limited tutor/demonstrator support. Examples include: ‘supervising the parts of a generation for the essay; discussing random questions in an interactive oral assessment; checkpoint in live assessment requiring a tutor sign off on lab work’ (Corbin et al., 2025, p. 7).

The other common approach they describe is ‘discursive changes’: modifications to assessment that rely primarily on communicating to students what is permissible in the assessment, where, they warn, students ‘remain essentially free to follow or ignore’ (Corbin et al., 2025). Examples include: ‘telling students to use GenAI for editing but not generating text; displaying a warning on quiz pages advising students not to use AI; raising the importance of not fabricating data with AI’ (Corbin et al., 2025, p. 7).

It is worth considering when and how these two broad approaches might best suit your assessment, keeping an open mind on which to use in your context and with your students. The following are broad steps in an assessment design strategy for GenAI, drawing on existing good practices in assessment design (UCD T&L, 2025).

Why and how do I assess?

First, consider: ‘why are you assessing?’

Is your assessment primarily assessment OF learning, assessment FOR learning, or assessment AS learning (National Forum, 2017)?

Highly weighted, graded assessment demonstrating achievement (assessment OF learning) is high-stakes assessment (National Forum, 2017) and requires more robust approaches to how GenAI is used. Therefore, for these assessments, you could consider the ‘structural change’ approach, which usually limits the use of GenAI.

Next, consider the advice around how to assess. In particular, what are the learning outcomes you want to achieve in the module?

Research shows GenAI is being used by students for ‘explaining concepts, summarising an article, suggesting research ideas, and structuring thoughts’ (Freeman, 2025). Assessment is currently more robust against the unauthorised use of GenAI when supporting learning outcomes that require ‘audience-tailored, observation by learner and reflection on work practice’ (Hardie et al., 2024, p. 8). This aligns with the idea of ‘authentic assessment’ (National Forum, 2017).

Consider the higher order cognitive outcomes, which are less likely to be accurately generated by GenAI. This table provides ideas about its use at the different cognitive levels in Bloom's taxonomy. 

Other important considerations are: 

  • Programme Stage: What stage of the programme are the students at: first year, final year? What should they be able to do at this stage? Have they used GenAI before? Using a programme approach to assessment, a programme team can support a more scaffolded approach to GenAI use, thoughtfully navigating the tension between validity and integrity. When is it best to design in more ‘live’ (GenAI-constrained) assessment, and when to permit GenAI use (with advice on what is allowed)?
  • Assessment for Inclusion: Are there assessment for inclusion aspects that should be considered in your decision? For example, is the assessment diverse, manageable, and scaffolded? For further details, see this presentation about using GenAI through an assessment for inclusion lens (O'Neill, Wolf and Hyland, 2025).

What assessment methods/types should I use? 

UCD has 14 broad umbrella assessment types, including exams, assignments, portfolios, choice of assessment, quizzes, projects, group work, etc. By their nature, some assessment types support the ‘structural approach’ to GenAI use (the In-Person Exam, some in-class Participation in Learning Activities and the Viva/Oral). Corbin et al. (2025) in particular advocate for these types of activities to be built into assessment design that connects with other assessments, focuses on the assessment process and gives some checkpoints on students’ progress.

Some of these UCD assessment types, such as Reflective Assignments, Portfolios and Practical Skills, are often more authentic and minimise the use of GenAI (Hardie et al., 2024). Authentic assessments (Villarroel et al., 2018) are relevant, support students’ identity, are often collaborative, and link to the ‘real world of practice’. See some showcases of authentic assessment in UCD and nationally in Ireland (National Forum, 2017).

However, in the rapidly changing times of GenAI development, we need to develop our own and our students’ literacies to make these informed choices. We often don’t know how ‘vulnerable’ an assessment is to GenAI’s completion of the task until we try it out.

Cartoon image of a woman standing at a crossroads. Written on the cloud behind her head is 'literacies and knowledge to make the choice'

 

When you have your initial idea for an assessment type, it can be informative to do a ‘vulnerability check’ on it: one step in the Assessment Redesign Process developed by Dr Dhanushi Abeygunawardena as part of the Nexus Program at UNSW (Wijenayake, 2025). The suggested steps are:

  • Test the assessment in a few different common GenAI tools by adding the assessment’s instructions, and the associated grading rubric if available, into the prompt
  • Use your usual grading scale (rubric) to assess the GenAI outputs. The higher the grades earned by the GenAI outputs, the more vulnerable the assessment is to GenAI ‘misuse’
  • For these highly scored assessments, you will need to either modify the assessment or ‘embrace and adapt’ it (JISC, 2025) so that GenAI is used ‘ethically’ and ‘transparently’
  • Review these assessment types with colleagues from your discipline
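If you are checking several assessments across a module or programme, the tallying step of this check can be sketched in code. Below is a minimal, hypothetical Python helper (the function name, data shape and pass threshold are illustrative assumptions, not part of the UNSW process): you grade the GenAI outputs against your rubric yourself, and the script simply flags any assessment where an AI-generated attempt reached a passing grade.

```python
# Hypothetical sketch: tally the rubric grades YOU awarded to GenAI-produced
# attempts at each assessment, and flag assessments where an AI output passed.
# The names and the pass threshold are illustrative, not UCD/UNSW policy.

PASS_MARK = 40  # assumed pass threshold on a 0-100 rubric scale


def vulnerability_check(grades_by_assessment):
    """grades_by_assessment maps an assessment name to the list of rubric
    grades (0-100) awarded to outputs from several GenAI tools."""
    report = {}
    for assessment, grades in grades_by_assessment.items():
        best = max(grades)  # the strongest AI attempt drives the risk rating
        report[assessment] = {
            "best_ai_grade": best,
            "vulnerable": best >= PASS_MARK,  # a passing AI output = vulnerable
        }
    return report


# Example: grades awarded to outputs from three common GenAI tools
results = vulnerability_check({
    "Essay (2,000 words)": [68, 55, 72],
    "Interactive oral": [20, 15, 30],
})
print(results["Essay (2,000 words)"]["vulnerable"])  # True: modify or adapt
print(results["Interactive oral"]["vulnerable"])     # False: robust as designed
```

The design choice here is deliberate: the grading judgement stays with the human marker, and the code only aggregates those judgements so a programme team can see at a glance which assessments need to be modified or adapted.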

How do I implement the approach? 

‘Being transparent’ with students around what is expected and allowed in your assessment has always been a key assessment principle (UCD T&L, 2025). However, this is particularly important when you have made changes that now allow GenAI use in your assessment (discursive changes). Some examples of scales/levels that you can use to help clarify what students are permitted to use:

  • UCD College of Arts and Humanities has implemented a Traffic Light System 
  • Perkins et al. (2024) suggest five ways to use GenAI: No AI, AI Planning, AI Collaboration, Full AI and AI Exploration.
  • TEQSA in Australia (2024, p. 51) highlights a two-lane approach:
    • a Secure Lane (lane 1): graded, assessment OF learning, secured, in-person, supervised; GenAI may or may not be allowed by the examiner.
    • an Open Lane (lane 2): assessment FOR and AS learning. Emphasis is on the process of learning instead of the product; not secured. Where relevant, use of AI is supported and scaffolded so that students learn how to productively and responsibly engage with AI.
  • Wolf and O’Neill (2025a) in UCD Teaching and Learning suggest giving students a choice in the way they use GenAI, including ‘not to use it’. They also suggest using GenAI ‘as author’ (to produce an output that is reflected on) or as a ‘thought partner’ (used in an iterative, transparent, and dynamic way to help develop your thoughts).

Cartoon image showing 3 people. Underneath the 1st person it says ‘no AI’; underneath the 2nd it says ‘AI as author’; and underneath the 3rd it says ‘AI as thought partner’.

Students need to be informed of these approaches in the module descriptor and through ongoing conversation led by the module coordinator/lecturer with their students. The UCD Academic Integrity Policy (2024) states that: ‘It should be indicated clearly in the module descriptor whether generative AI will form any part of the learning experience’.

Some useful points to consider in your GenAI module statement are: 

  • Rationale for the decision in the module
  • How it should be used, including any restrictions. Are there differences between particular assessments within the module (if used)?
  • Student transparency around how they used it and what was produced (if used)

As students can receive conflicting permissions around GenAI use across modules within the same year or trimester, it is important to allow time in class for discussion around your decisions and permissions. Developing short video advice is also useful.

Developing skills in the use and critique of GenAI is key for lifelong learning. This is most efficiently done as a programme team, across the assessments in the programme.

Drawing of a digger in a sandbox. A flag in the sand reads 'GenAI Sandbox'. Words such as  clarity, accuracy, bias, prompt engineering appear in the sand.

Students should be scaffolded in this use of GenAI.

See, for example, how students were allowed to ‘play’ with GenAI in class in a UCD School of English, Drama and Film module (Follett, 2025, Case 7, ‘Setting Bad Examples: Using AI-Generated Essays to Teach English Composition’, in Costello et al., 2025).

In summary, the different GenAI approaches can vary in how they weight the validity and/or integrity of the assessments across your programme, so keep an open mind on which approach(es) to use in your context and with your students.

References