
Artificial Intelligence: Library Database AI Research Tools

A Note for Faculty from Your Librarians

Librarians have tested the tools below and created this guide to provide transparency about how they work and about our decisions to enable or disable them. We may highlight potential risks and drawbacks; these are our own opinions and should not be construed as policy. We welcome conversation about these tools, our decisions to enable or disable them, and AI generally. Please contact Director of Library Services Dawn Emsellem or your liaison if you'd like to discuss this guide or any of the tools below.

JSTOR's AI Research Assistant and Semantic Results (enabled for users logged into their JSTOR accounts)

JSTOR's AI Research Tool

To use the AI Research Assistant, users must access JSTOR through the library homepage (you should see the "access provided by" phrase at the top of the page, as seen below) and be logged into a JSTOR account (click the "register" link, as seen below, to create one).

This tool encourages active interaction with content: users are encouraged to ask it questions about specific articles. Because it requires users to be logged into a JSTOR account, students are unlikely to find and use it without guidance. Faculty who would like their students to use the tool can guide them in its appropriate use, either in collaboration with a librarian or on their own.

Use Cases

AI-literacy assignments and discussion

Students can download their conversation history as a PDF, which can serve as an artifact for submission and an opener for discussion. Students can annotate both the PDF of the conversation and the text itself. The tool does not generate definitive article summaries; it uses pattern recognition to describe what an article is about. JSTOR notes that interaction with the tool is meant to be a starting point for engagement with a source, rather than an end point.

Using AI tools in specific stages of existing assignments

Traditionally in the humanities, work is completed independently, outside of class time. The tool can be used to break assignments down into discrete tasks with both in-class and out-of-class deliverables. For example, students can submit citations and their conversations with the tool during class.


JSTOR's Guide to the research tool

Some features and facts about the tool:

  • The tool cannot answer whether an article is peer reviewed.
  • The citation tool within the AI may contain errors.
  • Users should download their conversation with the tool before leaving the page; it is kept for only 3 hours.
  • The tool will not respond to a request to generate an essay. It will not generate new content, only respond to questions about a specific work.
  • Every response includes in-line references; the intent is for users to understand concepts in the author's own words.
  • The research tool works only with the JSTOR item the user is viewing, not with other articles or the wider internet. This reduces the potential for hallucination as well as the processing power required and the associated environmental impact.
  • The tool uses Retrieval Augmented Generation (RAG): an LLM supplies the language knowledge needed to generate responses, but the user's query and JSTOR's content are kept separate from the model's training data, so user content is used only to generate a response, never for training.
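To make the RAG idea above concrete, here is a minimal sketch of the pattern. This is not JSTOR's actual implementation; the function names, passages, and scoring method are hypothetical stand-ins. The key point it illustrates: the user's query is used only to retrieve relevant passages and assemble a one-off prompt, and the model answers from that supplied context rather than being retrained on it.

```python
# Minimal, illustrative sketch of Retrieval Augmented Generation (RAG).
# NOT JSTOR's implementation; names and data here are hypothetical.

def retrieve(query: str, passages: list[str], top_k: int = 2) -> list[str]:
    """Rank passages by word overlap with the query (a simple stand-in
    for the vector-similarity search a real system would use)."""
    query_words = set(query.lower().split())
    scored = sorted(
        passages,
        key=lambda p: len(query_words & set(p.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, passages: list[str]) -> str:
    """Combine retrieved passages and the query into a single prompt.
    The model answers only from this context, limiting hallucination;
    nothing here is added to the model's training data."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer using ONLY the passages below, citing them in-line.\n"
        f"Passages:\n{context}\n"
        f"Question: {query}"
    )

# Hypothetical passages from a single article the user is viewing.
article_passages = [
    "The study surveyed 400 undergraduates about research habits.",
    "Results show library instruction improves source evaluation.",
    "The weather in Newport is mild in autumn.",
]
query = "What did the study find about library instruction?"
prompt = build_prompt(query, retrieve(query, article_passages))
print(prompt)
```

Because retrieval is restricted to the one item being viewed, the prompt can only ever contain that item's text, which is why JSTOR's design limits both hallucination and compute cost.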

According to JSTOR's data, research tool users perform more searches per day, visit JSTOR more often, and view more articles and books. JSTOR believes this demonstrates higher engagement with the research process and may help alleviate anxiety about students disengaging from the research process because of AI use. 

JSTOR's Semantic Results

Returns the 25 most relevant results based on contextual similarities among items. It uses vector embeddings, a machine-learning process that represents each JSTOR item's concepts as numerical values so that similarly relevant articles can be grouped together.
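The grouping idea behind vector embeddings can be sketched in a few lines. This toy example is illustrative only, not JSTOR's model: the three-number vectors and item titles are made up (real embeddings have hundreds of dimensions and are learned from text). It shows how items whose vectors point in similar directions score as "close," so a search can rank the most semantically similar items first.

```python
# Toy illustration of vector-embedding similarity (not JSTOR's model).
# Each item is represented as a vector; items whose vectors point in
# similar directions are treated as related.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical 3-dimensional embeddings (real ones are much larger
# and are produced by a machine-learning model).
items = {
    "Monetary policy and inflation": [0.9, 0.1, 0.0],
    "Central bank interest rates":   [0.8, 0.2, 0.1],
    "Coral reef ecology":            [0.1, 0.9, 0.3],
}
query_vec = [0.85, 0.15, 0.05]  # embedding of the user's search phrase

# Rank items by similarity to the query, most similar first.
ranked = sorted(
    items,
    key=lambda title: cosine_similarity(query_vec, items[title]),
    reverse=True,
)
print(ranked)
```

Running the sketch ranks the two economics items above the ecology item, which is the behavior Semantic Results relies on to surface contextually similar articles.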

EBSCO's Natural Language Search Option (enabled) and AI Insights (not enabled)

EBSCO provides two AI-powered tools. Natural Language Search uses AI to parse queries, adding synonyms and related terms to return more results. This eases early researchers' frustration at getting zero results for natural-language searches, but it may also surface more irrelevant results.
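The trade-off described above can be seen in a tiny sketch of synonym-based query expansion. This is illustrative only and not EBSCO's actual implementation; the thesaurus entries are hypothetical. Expanding each query word with synonyms matches more records, which avoids zero-result searches but can also pull in less relevant ones.

```python
# Toy sketch of synonym-based query expansion (illustrative only;
# not EBSCO's implementation). The thesaurus below is hypothetical.
SYNONYMS = {
    "teens": ["adolescents", "teenagers", "youth"],
    "phones": ["smartphones", "mobile devices"],
}

def expand_query(query: str) -> list[str]:
    """Return each query word plus any known synonyms, so the search
    matches records that use different wording for the same concept."""
    terms = []
    for word in query.lower().split():
        terms.append(word)
        terms.extend(SYNONYMS.get(word, []))
    return terms

print(expand_query("teens phones"))
# -> ['teens', 'adolescents', 'teenagers', 'youth', 'phones', 'smartphones', 'mobile devices']
```

Each added synonym widens the net: more matching records, but also more chances that a record is relevant to the synonym rather than to the student's actual question.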

The other tool is AI Insights, which summarizes articles and generates related terms. The library team tested the tool over the summer and decided to disable it for this academic year (25/26) for the following reasons:

  • The tool is in a beta phase.
    • It acknowledges that results may be inaccurate and that users should evaluate them for accuracy.
    • Because most undergraduate students do not yet have the content area knowledge to evaluate scholarly articles for accuracy, and because we strive to have library tools meet high standards of credibility and trustworthiness, we determined it wasn't appropriate to enable it at this time. 
  • Librarians are not yet sure if there is consensus among faculty about whether AI tools that generate summaries of scholarly work are a net benefit or if their risk to student acquisition of critical reading and thinking skills outweighs their benefit.
  • We hesitate to implicitly sanction use of an AI tool before the Salve community has identified a strategy and way forward on use of generative AI tools. 

Statista's Research AI (enabled)

Statista's Research AI tool is an option on the database's homepage menu. The tool was released in May 2024 and has been heavily used by many graduate and undergraduate business faculty and students.

Students can choose between Research AI and the default Statista search. The main difference is that Research AI returns answers to prompts, rather than sources on a topic that users read to form their own answers. Research AI does not search the entirety of Statista's corpus (as of May 2025, it does not include Consumer or Company Insights), so users should also run Statista's standard search to ensure they see all of Statista's data and reports. Research AI's responses include a button that copies the entire answer to the clipboard, which may encourage students to plagiarize the response. Statista notes that the tool may make errors and recommends that users confirm its assertions. Checking AI assertions against original sources is an important skill for students to learn and build into their research workflows.

Faculty may consider whether allowing students to use Research AI to answer questions will bypass student acquisition of skills such as evaluating, interpreting, and synthesizing data and writing their own analysis, which will be essential skills in graduate school or the workplace.


Proquest's AI Research Assistant (not enabled)

Proquest's AI Research Assistant provides article summaries and generates related terms. The library team tested the tool over the summer and decided to disable it for this academic year (25/26) for the following reasons:

  • The tool is in a beta phase.
    • It acknowledges that results may be inaccurate and that users should evaluate them for accuracy.
    • Because most undergraduate students do not yet have the content area knowledge to evaluate scholarly articles for accuracy, and because we strive to have library tools meet high standards of credibility and trustworthiness, we determined it wasn't appropriate to enable it at this time. 
  • Librarians are not yet sure if there is consensus among faculty about whether AI tools that generate summaries of scholarly work are a net benefit or if their risk to student acquisition of critical reading and thinking skills outweighs their benefit.
  • We hesitate to implicitly sanction use of an AI tool before the Salve community has identified a strategy and way forward on use of generative AI tools. 

FAQ from Proquest here