AI and ChatGPT Information

What is Generative AI?

Artificial intelligence (AI) is:

"The capacity of machines to mimic human cognitive functions such as learning, problem-solving, and pattern recognition, enabling them to perform tasks that normally require human intelligence. It includes various subfields, such as machine learning and natural language processing.

Example: Virtual assistants like Siri or Alexa use AI to understand and respond to voice commands."

Source: Artificial Intelligence Definitions - University of North Florida


Generative AI is a type of AI that can create new content based on the data that was used to "train" it. When you type in a "prompt" (meaning what you're asking for), generative AI tools like ChatGPT create content to respond to that prompt. They basically create content that is similar to the content already provided to them.

What You Should Know Before Using ChatGPT and Other AI for Research

Generative AI and Research

You might be thinking, “What’s wrong with using AI tools like ChatGPT, or other research-focused AI tools like Elicit, to help locate research?”

The short response is: tools like ChatGPT can make up information, including citations. Call them hallucinated citations or fake citations; either way, the research they cite sometimes does not exist. Research tools like Elicit, which combine the language model (LM) behind ChatGPT with tools that connect it to outside information sources, are still evolving. Right now, many of these tools (like Elicit) are limited to open access articles (research articles freely available online) and aren’t yet able to pull article data from subscription-based databases like the ones students use at Highline College. This means these tools are limited in the information they can provide and will not give you all the resources you may need when researching.

The longer version includes concepts each of us should be aware of before using ChatGPT and tools like it to find research. 

The Longer (More Technical) Version

Some definitions:

  • Language Model (LM): Think of these as turbo-powered autocomplete. Language models alone, like the GPT (Generative Pre-trained Transformer) algorithm, cannot collate existing information sources.
  • ChatGPT:  This is a product built on top of the GPT algorithm. It uses the GPT algorithm, but GPT is just part of it. Another part of ChatGPT is its ability to use "tools" (or "plugins") that connect it to outside information sources. In the future, ChatGPT’s connection to these “tools” might make it a better source for research, but so much is unknown at this time.
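The “turbo-powered autocomplete” idea above can be sketched with a toy example. This is a simplified illustration, not how GPT actually works internally: real language models use neural networks trained on enormous datasets, but the core idea of predicting the next word from the words that came before is the same.

```python
from collections import Counter, defaultdict

# A toy "language model": learn which word tends to follow which,
# then generate text by repeatedly predicting the most likely next word.
training_text = (
    "the library offers research help "
    "the library offers citation help "
    "the library offers research databases"
)

# Count next-word frequencies for each word (a simple "bigram" model).
next_words = defaultdict(Counter)
words = training_text.split()
for current, following in zip(words, words[1:]):
    next_words[current][following] += 1

def autocomplete(prompt, length=4):
    """Extend the prompt one most-likely word at a time."""
    output = prompt.split()
    for _ in range(length):
        candidates = next_words.get(output[-1])
        if not candidates:  # no data on what follows this word
            break
        output.append(candidates.most_common(1)[0][0])
    return " ".join(output)

print(autocomplete("the", length=3))  # → "the library offers research"
```

Notice that the model only ever recombines patterns it has already seen; it has no notion of whether the sentence it produces is true. That is why a model like this can confidently “complete” a citation for a source that does not exist.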

Key Concepts about ChatGPT and Research 

The purpose of ChatGPT and other similar products is to create textual (and code) passages, not to collate and evaluate existing information sources. 

  • Information sources are already collated in places like the internet and library databases.
  • AI technologies to find sources do exist, but ChatGPT isn’t one of them. (See below for more on AI research tools). 

You might get a citation like this from ChatGPT: 

"Climate Change and Agriculture: Impacts and Adaptation" by K. Boote, J. Hatfield, and P. Thorburn 

After 45 minutes of searching, we could find no evidence that this source exists (we can’t even tell whether it’s supposed to be a book or an article). If you get citations from ChatGPT, you will need to verify that they exist and obtain the proper citation to include in your paper.

As mentioned above, AI tools that help find research do exist and will likely continue to be created and improve over time. Here’s a partial list of research-focused generative AI tools:

  • Elicit - self-described “AI research assistant.” Searching a research topic returns summaries of potentially relevant research. When available, Elicit lets you search and compare research components (including methodology, intervention, findings, etc.); exporting citations requires the paid version
  • Research Rabbit – markets itself as “Spotify for Research”; allows you to create collections of research and to visualize network connections between authors and articles
  • Semantic Scholar - AI-powered scholarly literature search tool providing TLDR (Too Long; Didn’t Read) automated summaries for open access articles; developed by the Allen Institute for AI 
  • We know there is a wide range of other tools; we’ve listed these as representative resources

Note that while these companies currently offer free tools, many of them are also actively generating hype; as users, we don’t need to believe everything we see about AI.

In summary:

  • If you’re using ChatGPT to do research, we recommend you stop doing that because you’re likely getting information that isn’t real or accurate. 
  • If you’re using AI research tools like Elicit, remember that you’re not finding all of the resources available on the topic; only what that AI tool has access to. We suggest you use additional resources as well, such as: library databases, internet searching, etc. 
  • Most importantly, we librarians are happy to help with any research and citation questions. 

Ultimately, the key takeaway is that AI tools like ChatGPT are not designed for locating research on a topic.

See the other sections of this AI and ChatGPT Information library guide for more helpful information. Of course, if you have any questions, please reach out to us in whatever communication method you prefer. 

In Person - Highline Library, Building 25

By Phone - 206.592.3232

By Email - refhelp@highline.edu

By Chat - Ask Us 24/7!

By Consultation - Schedule a Consultation!

Acknowledgements

Thanks to the friends of the Highline Library who provided feedback on multiple versions of this draft message.

References

Leffer, Lauren. “Too Much Trust in AI Poses Unexpected Threats to the Scientific Process.” Scientific American, March 18, 2024, https://www.scientificamerican.com/article/trust-ai-science-risks/. Accessed April 18, 2024.

            (Note: a sentence from the article: “One big risk is less obvious, though potentially very consequential: humans tend to automatically attribute a great deal of authority and trust to machines. This misplaced faith could cause serious problems when AI systems are used for research, according to a paper published in early March in Nature.”)

Webster, Peter. “Artificial Intelligence for Scholarly Literature Searching: Magic Bullet Or Missing the Mark?” Online presentation at: AI and Libraries II: More Applications, Implications, and Possibilities. April 18, 2024.