
Generative AI & AI Meeting Assistants

Best Practices & Privacy Guidelines for Staff


This guidance sets out the University's expectations for staff when using Artificial Intelligence (AI) tools in their day-to-day working practice. It is in addition to the University's specific Data Protection and AI policies, and complements the guidance on using GenAI tools in teaching, learning and study.

Important considerations when using AI

Cyber security is an important consideration when using Generative AI (GenAI) and Large Language Model (LLM) tools. We understand that staff can, in theory, access GenAI and LLM tools freely, and we know staff are using various AI tools. However, these tools, including paid-for business versions, may not currently meet the requirements of data protection law and therefore must not be used with University personal data or sensitive personal data.

For detailed best practices, privacy guidelines and a list of the tools currently approved for use, please read the Guidance Document (link). If you have any concerns or queries, please contact DTS.

Because the landscape around AI is constantly shifting, this guidance represents a point-in-time position; the University will monitor developments and adapt it in line with the technology.

It is essential that staff are aware of and apply this guidance when considering the use of such tools, to ensure relevant University data is kept safe.


Contact us
  • IT Self Service Portal
  • Telephone (Internal): 6262
  • Telephone (External): 0118 378 6262
  • Email: dts@reading.ac.uk