Internal Knowledge Search

Motivation

With plain ChatGPT (or any other publicly available Large Language Model), you can draw on knowledge from the training set, which includes a vast amount of publicly available information from the internet. However, internal company documents are excluded from this training for privacy and confidentiality reasons. The Internal Knowledge Search assistant bridges this gap by allowing your employees to access and query your internal documents through a chat interface, making ChatGPT-like functionality available on top of your own proprietary knowledge base.

Goal

The goal of the Internal Knowledge Search assistant is to make your company's internal information accessible through a chat interface. By integrating internal documents and other sources of proprietary information, employees can obtain precise and relevant answers to their queries based on the company's internal knowledge.

Structure and Logic of Assistant

  1. Document Search Module:

    • The assistant includes a document search module, which is essential for accessing internal documents.

    • When a question about internal knowledge is asked, the system triggers a background search.

    • The LLM responds with the most relevant answer by retrieving information from internal documents.

    • Relevant documents are appended to the answer as clickable references.

  2. Search Options:

    • Semantic Search: This option searches for the semantic meaning of the query across internal documents.

    • Full Text Search: This option performs a keyword-based search within the documents.

    • Both search methods can be combined to provide more accurate and comprehensive results.

  3. Additional functionalities:

    • Email generation with content of conversation history

    • Translation of previous answer
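To make the two search options above concrete, here is a toy sketch of how a semantic score and a full-text keyword score could be combined into one ranking. This is an illustrative assumption, not Unique's actual implementation: term-frequency cosine similarity stands in for embedding-based semantic search, and the keyword-overlap fraction stands in for full-text search.

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_search(query: str, documents: list[str], alpha: float = 0.5) -> list[str]:
    """Rank documents by a weighted mix of a semantic-style score
    (cosine over term frequencies, a stand-in for embedding similarity)
    and a full-text score (fraction of query keywords found in the document)."""
    q_terms = query.lower().split()
    q_vec = Counter(q_terms)
    scored = []
    for doc in documents:
        d_terms = doc.lower().split()
        semantic = cosine(q_vec, Counter(d_terms))
        full_text = sum(1 for t in set(q_terms) if t in d_terms) / len(set(q_terms))
        scored.append((alpha * semantic + (1 - alpha) * full_text, doc))
    # Highest combined score first.
    return [doc for score, doc in sorted(scored, reverse=True)]
```

The weight `alpha` controls the balance: `alpha=1.0` is purely semantic, `alpha=0.0` purely keyword-based, mirroring the idea that the two methods can be combined for more comprehensive results.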

Possible Adaptations of the Assistant

The Internal Knowledge Search assistant can be adapted and customised in several ways:

  • Enhanced Prompting: The system can be tailored by enhancing the prompts used for querying the internal knowledge base.

  • Scoped Searches: You can limit the scope of the search to specific departments or document types. For instance, the HR Department scope includes only HR-related documents.
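A scoped search can be pictured as a metadata filter applied before retrieval. The following is a minimal sketch under assumed field names (`metadata`, `department` are illustrative, not the product's actual schema):

```python
def apply_scope(documents: list[dict], scope: dict) -> list[dict]:
    """Keep only documents whose metadata matches every key/value in `scope`.
    Retrieval then runs over this filtered subset only."""
    return [
        doc for doc in documents
        if all(doc.get("metadata", {}).get(k) == v for k, v in scope.items())
    ]

# Example: an HR Department scope that includes only HR-related documents.
hr_scope = {"department": "HR"}
```

A scope for document types would work the same way, e.g. `{"doc_type": "policy"}`.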

Required and Optional Modules

The following modules are required or optional for this assistant:

Required

  • Document Search

Optional

  • Context Memory Search

  • Email Writer

  • Translate

  • https://unique-ch.atlassian.net/wiki/spaces/SD/pages/469368864

  • Chat with GPT

Example AI Assistant Configuration:

Download the provided TXT file and upload it into a new space as an AI Assistant configuration. This will create an internal knowledge search assistant using GPT-4:

If most of your documents and the user's question are in German, please use the TXT file provided below:

Prompt Engineering Guide

By following these guidelines, you can enhance the effectiveness of your prompts and queries, leading to more accurate and useful responses from the LLM.

Modify Search Terms for Improved Results

If your initial search leads to no results, consider adjusting your search terms. Here are some strategies:

  • Replace certain words with synonyms or related terms.

  • Include specific words that you expect to be in the document.

  • Expand abbreviations to their full forms, as the system might not recognize abbreviations.

For example:

  • Original Prompt: "What has to be considered about EB for people with L-status?"

  • Modified Prompt: "What has to be considered about e-banking for people with L-status, crossing borders?"

Be Specific in Your Queries

Provide clear and detailed context to avoid assumptions. Specify all relevant details to get accurate responses.

For example:

  • Original Query: "My client with US citizenship moved to Singapore. Can she still use e-banking?"

  • Specific Query: "My client, who is a US citizen, has moved to Singapore permanently. Can she still use e-banking while living in Singapore?"

Avoid Leading the LLM

Do not suggest an answer within your question. Instead, ask open-ended questions to allow the LLM to provide a comprehensive response.

For example:

  • Leading Query: "Is it true that we have to document every client's interaction?"

  • Open-ended Query: "What are the requirements for documenting client interactions? Which systems should be used for different types of interactions?"

Utilize Follow-up Questions Effectively

In a chat, follow-up questions retain the context of the previous three messages. Use this to refine or expand on your queries without repeating all previous information.

For example:

  • Initial Query: "Can my client who moved to Singapore still use her e-banking services?"

  • Follow-up Query: "She is no longer living in the US. Does this affect her ability to maintain her e-banking account?"

Note on Priming

Be aware that follow-up questions might prime the system to give similar responses if the initial query did not yield the desired answer. If necessary, open a new chat and ask more specifically as defined above.

For example:

  • Initial Query Result: "I did not find the answer."

  • New Chat Query: "Given that my client is residing in Singapore permanently, what regulations impact her use of e-banking services?"


Author

@Pascal Hauri

© 2024 Unique AG. All rights reserved.