| At the end of this section, you should be able to explain why Large Language Models (LLMs) aren't reliable for finding trustworthy information and know how to clearly disclose your use of AI in academic work. |
Even with well-crafted prompts, AI tools can still produce responses that are inaccurate, biased, or entirely fabricated. Large language models (LLMs) do not inherently “look up” answers online; they generate text based on patterns learned from training data. Unless connected to a live search or database, they cannot access real-time information. While newer models have improved accuracy and guardrails, they may still generate content that sounds convincing but is incorrect or incomplete. In practice, this means you should treat an LLM's output as a draft to be verified, not as a source of established fact.
Search engines, though more up-to-date, have their own limitations. Personalized filters, algorithmic bias, and sponsored content can all influence what appears first, and not always in the most accurate or impartial way.
That’s why fact-checking matters. Whether using AI or a search engine, it's important to critically evaluate online information and verify it with credible, authoritative sources.
| Generative AI and other digital tools can support your learning in meaningful ways, but they should never be your sole method of finding reliable, up-to-date, and verifiable information. If you choose to use them for academic purposes, it’s important to understand the risks and adopt responsible practices, such as fact-checking, proper attribution, and transparent disclosure, to ensure ethical and responsible use. |
By now, you understand that AI generates content based on patterns, not verified facts, and that search engines can also present biased or sponsored results. To ensure your academic work is accurate, trustworthy, and well-supported, it’s essential to go beyond the surface and verify the information you find, whether it comes from an AI tool or a search engine.
Here are some practical ways to do that:
The SIFT method (Stop; Investigate the source; Find better coverage; Trace claims to their original context), developed by Mike Caulfield, is a simple, reliable approach for evaluating online information. Originally designed for news and social media, it can easily be adapted to help you critically assess AI-generated content as well.
Whether brainstorming, organizing ideas, or summarizing content, it's important to clearly distinguish your own work from anything generated by AI. If you use AI tools to support your learning, be transparent. There should be no doubt about what you created and what was generated by a tool.
| ⚠️ Never use AI to complete graded assignments without explicit permission from your instructor or supervisor. If you do use AI and attribution is required, make sure it is clearly cited and disclosed. |
AI-generated content may sound authoritative, but it is not created by accountable human authors. In other words, AI cannot be held responsible for the accuracy or ethical implications of its output. For this reason, major citation styles, including APA, MLA, and Chicago, recommend not listing AI tools as authors. Instead, verify the information with credible human sources and cite those reliable sources directly. Think of AI as a helpful tool, similar to a calculator or spellchecker—useful for support, but not something you credit as an original source.
If you must include AI-generated content in your work, cite it according to your required citation style and disclose how the tool was used.
| 📝 Not sure how to cite AI in an assignment? Check with your instructor and refer to the Library’s AI Citation Guide. |
Though related, citation and disclosure serve different purposes:
| Citation |
Citations show your reader when words, ideas, or visuals in your work come from another source, including content generated by AI tools. Just like citing a book or an article, citing AI-generated content is essential to give proper credit and avoid plagiarism. Use a citation whenever AI-generated words, ideas, or visuals appear directly in your work.
Example (APA 7): Microsoft. (2025). Copilot (GPT-4) [Large language model]. https://copilot.microsoft.com/
| Disclosure |
A disclosure statement explains how you used AI in your academic work, even if you didn’t include any AI-generated content directly. Disclosure promotes transparency and helps your instructor understand your research and writing process. Provide a disclosure statement whenever an AI tool shaped your process, for example during brainstorming, organizing ideas, or summarizing content.
As a rule of thumb, an AI disclosure statement should clearly communicate which AI tool you used, how you used it, and what role it played in completing the work. Including a list of sample prompts and a statement of your responsibility for the final product is also recommended.
Example: I used Microsoft Copilot (April 2025) to brainstorm and outline the initial structure of this essay. The types of prompts I used include: “Summarize the key arguments in this article for a first-year university student” and “Suggest a clear structure for organizing this content into three sections.” I used the output from these prompts as a starting point for drafting my paper. I reviewed, fact-checked, and revised the content to ensure accuracy. This assignment reflects my own analysis and interpretation, and I take full responsibility for the final submission.
| Test Your Understanding |
The University of Saskatchewan's main campus is situated on Treaty 6 Territory and the Homeland of the Métis.
© University of Saskatchewan