

Effective assessment examples: Information, digital and AI literacies

This page showcases practical assessment ideas that intentionally embed elements of the Information and Digital Literacies Graduate Learning Outcome (GLO), covering information, digital, and AI literacies. The examples are designed to help students locate, evaluate, create, and communicate information ethically and effectively, while building confidence with contemporary digital and AI-enabled tools.

Literature review tasks: Why they matter and how to design them

Literature reviews are powerful vehicles for the Information and Digital Literacies GLO: they require students to define and refine a question, search strategically, evaluate sources, synthesise evidence, and reference ethically, using contemporary digital tools along the way. You can also build AI literacy into the task through scaffolded steps. For example, ask students to use an AI tool to iterate on their research question or keywords, then reflect on what they kept, what they discarded, and why, before validating the results in scholarly databases.

Designing a literature review assessment? Watch the video below for practical tips on task design and clear guidance to support your students.

Integrate information and digital literacies into your subject: Task starters

Below are task ideas you can adapt to integrate the Information & Digital Literacies GLO into your subject.

Some of the above content was adapted from Drew University’s Designing assignments to develop information literacy.

Integrating AI literacy: Assessment task ideas

Below you will find task starter examples that explicitly involve AI tools. These examples are grouped separately because this is an emerging, fast-evolving area: they require additional guardrails (a clear statement of allowable use, transparency and disclosure, verification of outputs, privacy and consent, and equity of access), and suitability may vary by discipline and policy. Use or adapt them where AI adds genuine learning value, and include an AI-use statement plus a verification step.

Acknowledging AI use in assessments (why & how)

Why add this? Requiring students to disclose and reflect on AI use builds workplace-ready habits: transparency, reproducibility, ethical judgement, and risk awareness. It also helps protect academic integrity (no hidden assistance), improves verification skills, and mirrors professional expectations where AI-assisted work may need to be logged and defensible.

How to build it into tasks (quick options)

  • Include an AI-use statement (short paragraph) on the cover page or methods section.
  • Require a prompt/log appendix (tool, date, purpose, key prompts/settings).
  • Add a verification step (what was checked against which sources; fixes made).
  • Set guardrails (permitted/forbidden uses, privacy & licensing reminders).
  • Assess disclosure (allocate 5–10% for quality of transparency and verification).

Integrating AI literacy into non-AI assessments

As an alternative to setting fully AI-focused assessments, build AI literacy by weaving small, clearly disclosed AI steps into existing tasks: for brainstorming, scoping, drafting, or quality checks. Require students to disclose how AI was used, verify any AI-influenced elements, and reflect on their choices. This mirrors workplace practice: AI can support parts of a workflow, but people remain accountable for the final product.

How this helps

Small, well-scoped AI elements develop transparent, ethical, and verifiable habits; improve search and revision strategies; and build workplace-ready judgement about when and how AI adds value.

Easy ways to weave AI in: 

Guidance notes for students: Examples 

  • Permitted AI use (scoped): “AI may be used for idea generation, outline variants, search-term discovery, and targeted feedback on clarity/structure. Do not use AI to write analysis, generate citations, or summarise articles.” 
  • Disclosure requirement: “Include a 50–100 word AI-use statement and a one-page prompt log (tool/model/date/purpose, key prompts).” 
  • Verification step: “List 2–3 elements influenced by AI and how each was verified or corrected with credible sources.” 
  • Privacy & licensing: “Do not input personal/confidential data or licensed full text into AI tools; use synthetic/redacted examples only.” 
  • Assessment weight: “AI transparency & verification = 5–10% of the grade.” 

AI-use statement: Example template 

“I used [tool/model] on [date] for [purpose: brainstorming terms/outline/clarity feedback]. I verified AI suggestions against [databases/guidelines] and edited where inaccurate or biased. No AI wrote analysis/findings or summaries. I accept responsibility for the accuracy, originality, and ethics of this submission.” 

