A student guide to using generative AI in university studies

The following is a summary of some excellent guidance for students produced by King's College London. The link to the original article is in the first comment below.

Generative AI is a type of AI in which applications learn from vast amounts of training data in order to improve their performance at tasks such as the creation of text, computer code, images, video and audio.

Types of generative AI include:

Text generation

This includes Microsoft Copilot, ChatGPT, Google Gemini and Claude. These are AI systems that can generate human-like text by predicting probable next words after being given a prompt. Producing prompts in order to achieve the desired output is referred to as ‘prompt engineering’.

Tools like these are NOT databases of knowledge. They simply ‘predict’ combinations of plausible words; you should never use them to replace conventional approaches to finding information, but they may be used alongside conventional search tools such as search engines.
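
To illustrate what 'being given a prompt' looks like in practice, here is a minimal, hypothetical Python sketch using the openai package; the package choice, model name and prompt are assumptions for illustration only, and most students will simply type prompts into a chat interface instead.

```python
# Illustrative sketch only: sending a prompt to a text-generation model.
# Assumes the `openai` Python package is installed and an API key is set
# in the OPENAI_API_KEY environment variable; the model name is a placeholder.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "user",
            "content": "Explain the difference between mitosis and meiosis in three sentences.",
        }
    ],
)

# The reply is predicted text, not a verified fact: check it against your course materials.
print(response.choices[0].message.content)
```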

Other media and code generation

DALL-E, Midjourney and Stable Diffusion are examples of AI systems that create images from text prompts by analysing vast datasets of image-text pairs and producing novel images.

Descript and Murf.ai are examples of AI tools that can generate human-like audio narration and conversation from text.

GitHub Copilot and TabNine are examples of AI coding assistants that suggest completions for code based on analysing large codebases.
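
To give a feel for what 'suggesting completions' means, here is a small, hypothetical Python example: the function name and docstring are what you might type, and the body is the kind of completion an assistant such as GitHub Copilot could propose, which you should still review before accepting.

```python
# You type a descriptive function name and docstring...
def celsius_to_fahrenheit(celsius: float) -> float:
    """Convert a temperature in degrees Celsius to degrees Fahrenheit."""
    # ...and the coding assistant typically suggests a body like this,
    # which you accept, edit or reject in your editor.
    return celsius * 9 / 5 + 32
```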

How can generative AI help me with my studies?

The key distinction between generative AI and, for example, a search engine is its ability to create completely novel content at scale, as opposed to just retrieving existing data. However, where outputs are fully AI-generated, they lack human understanding, critical engagement and precision, and they often exhibit a bland style of expression that, whilst technically accurate and impressive at first glance, is often flawed. The tools may generate biased, incorrect or misleading information, and it is essential that you check information against other, conventional sources.

Used properly, generative AI tools can:

  • Help generate ideas and frameworks for study and assessments
  • Upscale and organise rough typed notes
  • Provide explanations of concepts you are struggling to understand through “teacher dialogue” (where the AI assumes the role of teacher and responds to your questions)
  • Paraphrase and summarise sources as a study aid
  • Provide feedback on your ideas and work, and help you improve it
  • Re-format text (e.g. from full prose to tables) - this can be very helpful for neurodivergent students; a short sketch of this follows the list
  • Suggest code completions or assist with debugging for programming work
  • Translate texts between languages for multilingual research or for communicating with researchers across the globe
  • Summarise meeting, video or podcast transcripts
  • ‘Clean’ and punctuate automated transcripts (e.g. those produced by YouTube)
  • Help you create effective presentations
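
As an example of the re-formatting use mentioned above, here is a small, hypothetical Python sketch that asks a model to turn rough notes into a table. The openai package, model name and prompt wording are assumptions for illustration; the same result can be achieved by pasting the notes and instruction into a chat interface.

```python
# Illustrative sketch only: re-formatting rough notes into a table.
# Assumes the `openai` package and an OPENAI_API_KEY environment variable;
# the model name and prompt wording are placeholders.
from openai import OpenAI

client = OpenAI()

rough_notes = (
    "mitochondria - energy, ATP; ribosomes - protein synthesis; "
    "nucleus - stores DNA, controls the cell"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": "Reformat the user's notes as a two-column table: organelle | function.",
        },
        {"role": "user", "content": rough_notes},
    ],
)

# Always check the generated table against your original notes.
print(response.choices[0].message.content)
```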

What should I be alert to?

  • For most university courses, students will not be required or even recommended to use generative AI tools if they do not want to. It is unlikely that copying and pasting from a generative AI tool will ever be appropriate practice in a summative assessment.
  • You should exercise caution when using any tools that require additional logins or account creation, as these will almost certainly collect personal data.
  • You should not input personal data, sensitive information or content created by others into any AI platform.
  • Outputs from generative AI can seem convincing and plausible but, due to the way these systems work, could be factually incorrect, contain ‘sources’ that do not actually exist and/or reflect biases and prejudices derived from the data the models were trained on.

The above post is a summary of the following document:


This is great advice!