AI tools can enhance efficiency and productivity in research by streamlining critical analysis, synthesis, design, and writing tasks, such as preparing grant proposals, ethics applications, project plans, and publications. In some cases, they can also foster critical or creative thinking by offering fresh insights and perspectives. When used responsibly, AI tools can help researchers analyse large volumes of non-sensitive data, highlight key findings, and significantly reduce the time spent on manual data analysis.
Charles Sturt University is currently developing guidelines for the responsible use of generative AI (genAI) in research. Until those guidelines are finalised, the existing Australian Code for the Responsible Conduct of Research (2018) provides a useful framework, outlining eight key principles:
- Honesty in the development, undertaking and reporting of research
- Rigour in the development, undertaking and reporting of research
- Transparency in declaring interests and reporting research methodology, data and findings
- Fairness in the treatment of others
- Respect for research participants, the wider community, animals and the environment
- Recognition of the right of Aboriginal and Torres Strait Islander peoples to be engaged in research that affects or is of particular significance to them
- Accountability for the development, undertaking and reporting of research
- Promotion of responsible research practices
Researchers must consider these principles, as well as guidelines around authorship, data privacy, and data security, when choosing to use AI tools in their work.
It is critical that researchers exercise considerable care when uploading information into commercial genAI tools. Certain categories of data must never be submitted, including:
- Third-party copyrighted materials
- Confidential or sensitive data
- Human research data
- Private or personal information
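As a purely illustrative sketch, and not a Charles Sturt University tool or requirement, the snippet below shows the kind of lightweight pre-upload screen a researcher might run over a draft prompt before pasting it into a commercial genAI tool. All names and patterns are hypothetical; simple pattern matching can only catch obvious identifiers such as email addresses or phone numbers, and cannot detect copyrighted, confidential, or human research data, so it supplements rather than replaces the judgement described above.

```python
import re

# Purely illustrative: crude patterns for two kinds of obvious personal identifiers.
# Hypothetical example only; pattern matching cannot detect copyright status,
# confidentiality, or human research data, so it supplements human judgement.
PATTERNS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone number": re.compile(r"\b\d[\d\s-]{7,}\d\b"),
}

def flag_possible_personal_info(text: str) -> list[str]:
    """Return the names of any identifier patterns found in the draft prompt."""
    return [name for name, pattern in PATTERNS.items() if pattern.search(text)]

if __name__ == "__main__":
    draft_prompt = "Summarise the feedback from j.citizen@example.edu.au, ph 0400 123 456."
    hits = flag_possible_personal_info(draft_prompt)
    if hits:
        print("Do not upload - possible personal information detected:", ", ".join(hits))
    else:
        print("No obvious identifiers found - still check against the categories above.")
```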
Responsible use of AI in research requires thoughtful adherence to ethical and legal obligations.