Generative AI

Overview

The rapidly evolving developments in GenAI continue to have an impact on education, with the potential to change the way we teach and learn. The capabilities of these technologies to create text, images, code, audio and video, if used responsibly and with clear intention, can support and enhance learning. However, as educators we must be cognisant of key concerns across the higher education teaching and learning landscape, particularly those relating to academic integrity and plagiarism. The concerns raised about the impact of over-reliance on GenAI on students’ critical thinking, writing abilities and acquisition of foundational knowledge also remain valid.

This set of web pages serves as a continually evolving resource, providing context, support for understanding GenAI tools broadly, an overview of the GenAI landscape at UCD, and guidance on GenAI and assessment, helping academics navigate the use (or non-use) of GenAI.

Resources

Generative AI: Overview and Context in Higher Education

GenAI in Higher Education

The rapid adoption of GenAI tools across university campuses worldwide has created both unprecedented opportunities and complex challenges for teaching and learning. The tools raise profound questions about academic integrity, the development of critical thinking skills, and the very purpose of assessment in higher education.

Understanding GenAI requires more than technical familiarity with specific tools. It demands comprehensive awareness of how these systems function, their inherent limitations, and their broader implications for knowledge work, creativity, and human expertise. Institutions must navigate between technological integration and educational integrity, between innovation and the preservation of essential academic skills. 

Seventeen multicoloured post-it notes are roughly positioned in a strip on a whiteboard. Each has a hand-drawn pen sketch answering the prompt written on one of the notes.

Image credit: Rick Payne and team / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/

What is GenAI?  

Artificial intelligence (AI) and Generative Artificial Intelligence (GenAI) are often used interchangeably, but it is important to be precise. AI is a field of computer science focused on creating computer systems that can perform tasks that typically require human intelligence, such as understanding natural language, recognising patterns, making decisions, and learning from data. GenAI is a type of artificial intelligence that can create new content, such as text, images, or music, by learning from existing data and mimicking its patterns. UCD College of Arts and Humanities has a very helpful Glossary of Terms, created as part of the AI Futures SATLE project. An ever-increasing range of GenAI tools is publicly available; popular examples used in higher education include Google Gemini, NotebookLM, Microsoft Copilot, ChatGPT, Claude, Perplexity, DALL-E and Midjourney. Other GenAI tools, listed by type, can be found in the UCD Library guide.

GenAI across Irish HEIs

Irish higher education institutions are collaborating extensively to share knowledge, develop common approaches, and learn from each other's experiences with generative AI implementation. This collective effort has produced outputs that reflect the specific context and priorities of the Irish higher education sector,  for example:

GenAI in International Higher Education Contexts

International organisations such as UNESCO and the European Commission have emerged as key voices in establishing GenAI guidelines and competency frameworks. These cross-border initiatives provide valuable perspectives on balancing innovation with educational integrity, offering tested approaches that can inform local policy development and implementation strategies.

 

Generative AI and UCD

UCD Policy & Procedures 

The University Management Team (UMT) recently (2025) approved AI Governance Principles on the recommendation of a University Working Group on the Development, Governance and Management of AI. Currently approved and licensed AI tools include Google Gemini and NotebookLM. UCD IT Services now maintains an AI Services webpage listing tools that are approved for use with University data and that support safe, responsible, and innovative use of AI.

In February 2025, UMT approved the Terms of Reference for the Working Group on Artificial Intelligence and Teaching and Learning. The purpose of the working group is to investigate and recommend strategies for incorporating AI into teaching and learning in UCD.

Teaching and Learning Resources Created at UCD

Digital Literacy

Digital literacy and GenAI

Digital literacy in the 21st century means understanding not just how to use technology, but how technology shapes us (Gilster, 1997). The understanding of GenAI tools is situated within the broader scope of digital literacy, asking students and educators to move beyond surface-level tool proficiency to develop deeper comprehension of artificial intelligence's role in society, education, and human decision-making.

The SATLE-funded open Digital Literacy and Technological Transformations course (University College Dublin, 2025) aims to equip learners with key digital literacy skills that are essential for study and work in rapidly evolving online environments. Building on Belshaw's (2011) foundational work identifying eight essential elements of digital literacy (cultural, cognitive, constructive, confident, communicative, creative, critical, and civic), Opened Culture (2025) has developed what they term "Dimensions of AI Literacies", a framework that extends existing digital literacy scholarship into the realm of artificial intelligence.

Fluency with GenAI requires grappling with complex interconnections between technology and pedagogy, what Tim Fawns (2022) calls entanglements. One must understand how algorithmic systems can perpetuate existing inequalities, who controls these technologies, and how design choices embed particular values and worldviews. This includes examining questions of environmental sustainability, labor displacement, and the concentration of AI capabilities within a handful of global corporations.

Developing sensitivity to the subtle ways AI systems influence human behavior and cognition is crucially important. From recommendation algorithms that shape our information consumption to emotion recognition technologies that claim to read our inner states, AI increasingly mediates, in often hidden ways, our relationship with knowledge, creativity, and even our understanding of ourselves.

Image saying 'Ethically engaging with an unethical tool'

Image Credit: Ethically engaging by Visual Thinkery

AI fluency demands practical skills in evaluation and verification. We must learn to navigate copyright complexities, privacy implications, and the often-invisible data collection that powers these systems. This infographic on AI ethics, sourced from the UCD AI Futures SATLE project, is a useful visual overview of some of the key ethical concerns related to generative AI. Its creator, Leon Furze, studies the implications of AI for writing instruction and education, and his blog series on AI ethics is an excellent resource for both students and staff. The blog post has been updated since publication, and Furze has expanded upon the nine points covered here and linked below:

  1. Bias and discrimination
  2. Environmental concerns
  3. Truth and academic integrity
  4. Copyright
  5. Privacy
  6. Datafication
  7. Emotion recognition
  8. Human labour
  9. Power

The academic community has created several supports for assessing when AI-generated content is appropriate, reliable, or ethically sourced:

  • ALT’s Framework for Ethical Learning Technology provides structured guidance for educational institutions to evaluate and implement learning technologies while considering ethical implications and potential impacts on students and educators.
  • A Framework for the Learning and Teaching of Critical AI Literacy Skills (Open University, 2025) outlines a framework for teaching critical AI literacy skills, emphasising the need for educators and students to critically engage with AI technologies and their societal implications.
  • This comprehensive Generative AI toolkit (2024) from University College Cork has been specifically designed to assist staff in considering the responsible integration of GenAI into their learning and teaching practices.
  • The Civics of Technology Project develops research, curriculum, and professional development to encourage teachers and students to critically inquire into the effects of technology on individual and collective lives, working to advance democratic, ethical, and just uses of technology in schools and society.
  • Charles Logan, a PhD candidate in learning sciences at Northwestern University, curates Against AI and Its Environmental Harms, an open library of work related to AI and its impacts on the environment.

Ultimately, fluency is about cultivating critical citizenship in a world where artificial intelligence is not neutral but deeply political, requiring informed participation from all members of the university community in shaping its development and deployment.

Assessment Design

GenAI and Assessment Design 

Generative AI technologies are rapidly transforming how we create, learn, and assess knowledge across higher education. Design of assessment at both module and programme level requires thoughtful consideration to ensure both its validity and its integrity. Rather than blanket prohibition or uncritical embrace of GenAI in assessment design, UCD is committed to evidence-based approaches that harness AI's potential while preserving the core values of academic excellence and intellectual growth.

Assessment design in the age of GenAI presents both challenges and opportunities. Common challenges include ethical concerns, privacy, security, academic integrity and the risk of hallucinations. However, opportunities have also been identified, including GenAI's ability to support personalised learning, immediate student feedback and the development of more complex assessments (Mao et al., 2024).

Cartoon image referencing Hamlet. A man holds a skull in his hand. The image is captioned 'To use or not to use, that is the question'.

There are many ways to approach GenAI in higher education assessment and programme design. JISC (2025) identifies three main strategies: avoiding its use, outrunning it, or embracing and adapting to it. Regardless of the approach chosen, the key is being transparent with students about what AI use is permitted. As with many changes to higher education over the years, central to any assessment change are the core elements in assessment design.

In your disciplinary context: who, why, what, when and how are you assessing? (UCD T&L, 2025a)

Recent thinking about GenAI advocates that the key to success is to return to basic good assessment design. Corbin et al. (2025) advocate thinking about design in terms of what they describe as ‘structural changes’, i.e. changes made to the nature, format, or mechanics of the assessment itself. This approach emphasises more ‘live’, process-driven assessment that captures students’ attainment over time, ‘constraining’ the use of GenAI. They emphasise how all assessment tasks should come together as a whole in a module to ensure validity. This approach strengthens the integrity of the assessment, but may be hard to implement in some contexts, such as large class settings with limited tutor/demonstrator support. Examples include: ‘supervising the parts of a generation for the essay; discussing random questions in an interactive oral assessment; checkpoint in live assessment requiring a tutor sign off on lab work‘ (Corbin et al., 2025, p. 7).

The other common approach they describe is ‘discursive changes’: modifications to assessment that rely primarily on communicating to students what is permissible in the assessment, guidance which, they warn, students ‘remain essentially free to follow or ignore’ (Corbin et al., 2025). Examples include: ‘telling students to use GenAI for editing but not generating text; displaying a warning on quiz pages advising students not to use AI; raising the importance of not fabricating data with AI‘ (Corbin et al., 2025, p. 7).

It is worth considering when and how these two broad approaches might best suit your assessment, keeping an open mind on which approach to use in your context and with your students. The following are broad steps in an assessment design strategy for GenAI, drawing on existing good practices in assessment design (UCD T&L, 2025).

Why and how do I assess?

First, consider ‘why are you assessing’? 

Is your assessment primarily for demonstrating achievement (assessment OF learning), or for supporting learning (assessment FOR or AS learning)?

Highly weighted, graded assessment demonstrating achievement (assessment OF learning) is high-stakes assessment (National Forum, 2017) and requires more robust approaches regarding how GenAI is used. Therefore, for these assessments, you could consider the ‘structural change’ approach, which usually limits the use of GenAI.

Next, consider the advice around how to assess. In particular, what are the learning outcomes you want to achieve in the module?

Research shows GenAI is being used by students for ‘explaining concepts, summarising an article, suggesting research ideas, and structuring thoughts’ (Freeman, 2025). Assessment is currently more robust against the unauthorised use of GenAI when supporting learning outcomes that require ‘audience-tailored, observation by learner and reflection on work practice’ (Hardie et al., 2024, p. 8). This aligns with the idea of ‘authentic assessment’ (National Forum, 2017).

Consider the higher-order cognitive outcomes, which GenAI is less likely to generate accurately. This table provides ideas about GenAI use at the different cognitive levels of Bloom's taxonomy.

Other important considerations are: 

  • Programme Stage: At what stage in the programme are the students: first year, final year? What should they be able to do at this stage? Have they used GenAI before? Using a programme approach to assessment, a programme team can support a more scaffolded approach to its use, thoughtfully navigating the tension between validity and integrity: when is it best to design in more ‘live’ (GenAI-constrained) assessment, and when to permit GenAI (with advice on what is allowed)?
  • Assessment for Inclusion: Are there assessment for inclusion aspects that should be considered in your decision? For example, is the assessment diverse, manageable, and scaffolded? For further details, see this presentation about using GenAI through an assessment for inclusion lens (O'Neill, Wolf & Hyland, 2025).

What assessment methods/types should I use? 

UCD has 14 broad umbrella assessment types, including exams, assignments, portfolios, choice of assessment, quizzes, projects, group work, etc. By their nature, some assessment types support the ‘structural approach’ to GenAI use (the In-Person Exam, some in-class Participation in Learning Activities and the Viva/Oral). Corbin et al. (2025) in particular advocate for these types of activities to be built into assessment design that connects with other assessments, focuses on the assessment process and gives some checkpoints on students’ progress.

Some of these UCD assessment types, such as the Reflective Assignment, Portfolios and Practical Skills, are often more authentic and minimise the use of GenAI (Hardie et al., 2024). Authentic assessments (Villarroel et al., 2018) are relevant, support students’ identity, are often collaborative, and link to the ‘real world of practice’. See some showcases of authentic assessment in UCD and nationally in Ireland (National Forum, 2017).

However, in the rapidly changing times of GenAI development, we need to develop our own and our students' literacies to make these informed choices. We often don't know how ‘vulnerable’ an assessment is to GenAI completing the task until we try it out.

Cartoon image of a woman standing at a crossroads. Written on the cloud behind her head is 'literacies and knowledge to make the choice'

 

When you have your initial idea for an assessment type, it can be informative to do a ‘vulnerability check’ on it: one step in the Assessment Redesign Process developed by Dr Dhanushi Abeygunawardena as part of the Nexus Program at UNSW (Wijenayake, 2025). They suggest the following (a scripted sketch of this check appears after the list):

  • Test out the assessment in a few different common GenAI tools by adding the assessment's instructions and the associated grading rubric, if available, into the prompt
  • Use your usual grading scale (rubric) to assess the GenAI outputs: the higher the grades achieved by the GenAI outputs, the more vulnerable the assessment is to GenAI ‘misuse’
  • For assessments that score highly, you will need to either modify the assessment or embrace and adapt it (JISC, 2025) so that GenAI is used ‘ethically’ and ‘transparently’
  • Review the assessment with colleagues from your discipline
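
For colleagues who would rather script this check than paste prompts in by hand, a minimal sketch of how it might be batched is shown below. It is illustrative only: the input file names (assessment_brief.txt, grading_rubric.txt), the tool list, and the query_genai() helper are hypothetical placeholders rather than any UCD-approved workflow or named API, and grading the saved outputs against your rubric remains a manual step.

```python
# Minimal sketch of a batched 'vulnerability check' (illustrative assumptions only).
# query_genai() is a HYPOTHETICAL placeholder: connect it to whichever GenAI tools
# you intend to test, or simply paste the generated prompt into each tool manually.
from pathlib import Path

# Assumed input files containing the assessment instructions and grading rubric.
ASSESSMENT_BRIEF = Path("assessment_brief.txt").read_text(encoding="utf-8")
GRADING_RUBRIC = Path("grading_rubric.txt").read_text(encoding="utf-8")

TOOLS = ["tool_a", "tool_b", "tool_c"]  # the common GenAI tools you want to test


def query_genai(tool: str, prompt: str) -> str:
    """HYPOTHETICAL helper: replace with a call to (or a manual run in) the named tool."""
    raise NotImplementedError(f"Connect '{tool}' here, or run the prompt in it manually.")


def build_prompt(brief: str, rubric: str) -> str:
    # Combine the assessment instructions and rubric, as the check suggests.
    return (
        "Complete the following assessment task as a student would.\n\n"
        f"TASK INSTRUCTIONS:\n{brief}\n\n"
        f"GRADING RUBRIC (aim for the highest band):\n{rubric}\n"
    )


if __name__ == "__main__":
    prompt = build_prompt(ASSESSMENT_BRIEF, GRADING_RUBRIC)
    out_dir = Path("genai_outputs")
    out_dir.mkdir(exist_ok=True)
    for tool in TOOLS:
        try:
            response = query_genai(tool, prompt)
        except NotImplementedError as note:
            print(note)  # reminds you which tools still need wiring up or a manual run
            continue
        # Save each output so it can be graded against your usual rubric;
        # high-scoring outputs flag the task as more vulnerable to GenAI misuse.
        (out_dir / f"{tool}.txt").write_text(response, encoding="utf-8")
```

Grading each saved output with your normal rubric, ideally alongside a disciplinary colleague, then gives a rough indication of how ‘vulnerable’ the task is.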

How do I implement the approach? 

‘Being transparent’ with students around what is expected and allowed in your assessment has always been a key assessment principle (UCD T&L, 2025). However, this is particularly important when you have made changes that now allow GenAI use in your assessment (discursive changes). Some examples of scales/levels that you can use to help clarify what students are permitted to use:

  • UCD College of Arts and Humanities has implemented a Traffic Light System
  • Perkins et al. (2024) suggest five levels of GenAI use: No AI, AI Planning, AI Collaboration, Full AI and AI Exploration.
  • TEQSA in Australia (2024, p. 51) highlights a two-lane approach:
    • A Secure Lane (Lane 1): graded assessment OF learning; secured, in-person and supervised; GenAI may or may not be allowed by the examiner.
    • An Open Lane (Lane 2): assessment FOR and AS learning; the emphasis is on the process of learning rather than the product; not secured; where relevant, use of AI is supported and scaffolded so that students learn how to engage with it productively and responsibly.
  • Wolf and O'Neill (2025), in UCD Teaching and Learning, suggest giving students a choice in the way they use GenAI, including the choice not to use it. They also suggest using GenAI ‘as author’ (to produce an output that is then reflected on) or as a ‘thought partner’ (used in an iterative, transparent, and dynamic way to help develop your thoughts).

Cartoon image showing 3 people. Underneath the 1st person it says 'no AI'; underneath the 2nd it says 'AI as author'; and under the 3rd it says 'AI as thought partner'.

Students need to be informed of these approaches through the UCD module descriptor and through ongoing conversation in class. The UCD Academic Integrity Policy (2024) states that ‘It should be indicated clearly in the module descriptor whether generative AI will form any part of the learning experience’.

Some useful points to consider in your GenAI module statement are: 

  • Rationale for the decision in the module
  • How it may be used, including any restrictions: are there differences between particular assessments within the module (if used)?
  • Student transparency around how they used it and what was produced (if used)

As students can receive conflicting permissions around its use across modules within the same year or trimester, it is important to allow time in class for discussion of your decisions and permissions. Developing short video advice is also useful.

Developing skills in the use and critique of GenAI is key to lifelong learning. This is most efficiently done as a programme team, across the assessments in the programme.

Drawing of a digger in a sandbox. A flag in the sand reads 'GenAI Sandbox'. Words such as clarity, accuracy, bias and prompt engineering appear in the sand.

Students should be scaffolded in, for example, how to use, prompt and critique GenAI.

See, for example, how students were allowed to ‘play’ with GenAI in class in a UCD School of English, Drama and Film module (Follett, 2025: Case 7, ‘Setting Bad Examples: Using AI-Generated Essays to Teach English Composition’, in Costello et al., 2025).

In summary, the different GenAI approaches can vary in how they weight the validity and/or the integrity of the assessments across your programme, so keep an open mind on which approach(es) to use in your context and with your students.

 

References

Digital Literacy and Generative AI

Association for Learning Technology. (n.d.). ALT's framework for ethical learning technology. https://www.alt.ac.uk/about-alt/what-we-do/alts-framework-ethical-learning-technology

Belshaw, D. (2011). What is digital literacy? A pragmatic investigation [Doctoral dissertation, Durham University]. http://dougbelshaw.com/thesis/

Civics of Technology Project. (n.d.). Home. https://www.civicsoftechnology.org/

Fawns, T. (2022). An entangled pedagogy: Looking beyond the pedagogy—technology dichotomy. Postdigital Science and Education, 4(3), 711–728. https://doi.org/10.1007/s42438-022-00302-7

Furze, L. (2023, January 26). Teaching AI ethics [Blog post series]. Leon Furze. https://leonfurze.com/2023/01/26/teaching-ai-ethics/

Furze, L. (n.d.). AI ethics. https://leonfurze.com/ai-ethics/

Gilster, P. (1997). Digital literacy. John Wiley & Sons. https://openlibrary.org/works/OL2627594W/Digital_literacy

Logan, C. (n.d.). Against AI and its environmental harms [Open library]. https://pad.riseup.net/p/Against_AI_and_Its_Environmental_Harms-keep

Open University. (2025). A framework for the learning and teaching of critical AI literacy skills. https://about.open.ac.uk/sites/about.open.ac.uk/files/files/OU%20Critical-AI-Literacy-framework-2025.pdf

Opened Culture. (2025). Dimensions of AI literacies. https://openedculture.org/projects/dimensions-of-ai-literacies/

University College Cork. (2024). Generative AI toolkit. https://www.ucc.ie/en/ethical-use-of-generative-ai-toolkit/

University College Dublin. (2025). Digital literacy and technological transformations. https://www.ucd.ie/digitalliteracy/

GenAI and Assessment Design

Corbin, T., Dawson, P., & Liu, D. (2025, May 15). Talk is cheap: Why structural assessment changes are needed for a time of GenAI. Assessment & Evaluation in Higher Education. https://doi.org/10.1080/02602938.2025.2503964

Costello, J., Doljanin, Z., McAreavey, N., & Walsh, F. (Eds.). (2025). Teaching humanities and social sciences in the era of generative AI: Case studies from around Ireland. UCD College of Arts and Humanities.

Freeman, J. (2025). Student generative AI survey. Higher Education Policy Institute.

Hardie, L., Lowe, J., Pride, M., Waugh, K., Hauck, M., Ryan, F., Gooch, D., Patent, V., McCartney, K., Maguire, C., Richards, M., & Richardson, H. (2024). Developing robust assessment in the light of generative AI developments: Evaluation report. NCFE, Open University.

Liu, D., & Bridgeman, A. (2023). What to do about assessments if we can't out-design or out-run AI? Teaching@Sydney. https://educational-innovation.sydney.edu.au/teaching%40sydney/what-to-do-about-assessments-if-we-cant-out-design-or-out-run-ai/

Mao, J., Chen, B., & Liu, J. C. (2024). Generative artificial intelligence in education and its implications for assessment. TechTrends, 68, 58–66. https://doi.org/10.1007/s11528-023-00911-4

O'Neill, G., Wolf, L. G., & Hyland, S. (2025, June). Insights on using GenAI: Through the lens of an assessment for inclusion framework [Conference paper]. EDEN Conference, Bologna, Italy.

Perkins, M., Roe, J., & Furze, L. (2024). The AI assessment scale revisited: A framework for educational assessment. arXiv Preprint arXiv:2412.09029. https://aiassessmentscale.com/

Smolansky, A., Cram, A., Raduescu, C., Zeivots, S., Huber, E., & Kizilcec, R. F. (2023). Educator and student perspectives on the impact of generative AI on assessments in higher education. In Proceedings of the 10th ACM Conference on Learning @ Scale (L@S 2023).

Tertiary Education Quality and Standards Agency. (2024). Gen AI strategies for Australian higher education: Emerging practice. https://www.teqsa.gov.au/sites/default/files/2024-11/Gen-AI-strategies-emerging-practice-toolkit.pdf

Villarroel, V., Bloxham, S., Bruna, D., Bruna, C., & Herrera-Seda, C. (2018). Authentic assessment: Creating a blueprint for course design. Assessment & Evaluation in Higher Education, 43(5), 840–854. https://doi.org/10.1080/02602938.2017.1412396

Wijenayake, N. (2025, June). Fostering AI literacy in higher education: Lessons from assessment redesign [Conference presentation]. EDEN 2025, Bologna, Italy.

Wolf, L., & O'Neill, G. (2025, June). In the sandbox with GenAI: Faculty practices and reflections [Poster presentation]. EDEN Conference, Bologna, Italy.

Xia, Q., Weng, X., Ouyang, F., Lin, T. J., & Chiu, T. K. F. (2024). A scoping review on how generative artificial intelligence transforms assessment in higher education. International Journal of Educational Technology in Higher Education, 21, Article 40.
