Assistants and modules are created and modified via API calls. This guide walks you through the calls you have to make and the settings and configurations that can be applied.
Prerequisites

- Token of a user / service user that has an admin role assigned → <yourToken>
- The base URL of your application → <baseUrl>

To get the token and the URL, see: How to get a Token for our APIs
Assistant
Create a new assistant
This can be done with the following cURL. Just replace these placeholders:

- <baseUrl> / <yourToken>
- <assistantDefinition> → definition of your assistant with the corresponding modules (see example below)
```bash
curl --location --globoff 'https://gateway.<baseUrl>/chat/graphql' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer <yourToken>' \
--data '{"query":"mutation CreateAssistant($input: AssistantCreateInput!) {\n createAssistant(input: $input) {\n id\n name\n fallbackModule\n languageModel\n chatUpload\n settings\n modules{\n id\n name\n configuration\n weight\n isExternal\n }\n }\n}","variables":<assistantDefinition>}'
```
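On success, the created assistant is returned in a standard GraphQL response envelope containing the fields requested in the selection set above. The snippet below is an illustrative sketch only; the IDs and field values are hypothetical and the exact shape of `settings` depends on your configuration.

```json
{
  "data": {
    "createAssistant": {
      "id": "assistant_123",
      "name": "Internal Knowledge",
      "fallbackModule": "SearchInVectorDB",
      "languageModel": "AZURE_GPT_35_TURBO_0613",
      "chatUpload": "Disabled",
      "settings": { "showPdfHighlighting": true },
      "modules": [
        {
          "id": "module_456",
          "name": "SearchInVectorDB",
          "configuration": {},
          "weight": 10000,
          "isExternal": false
        }
      ]
    }
  }
}
```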
Update an assistant
To update some settings of an assistant, use the following cURL. Replace these placeholders:

- <baseUrl> / <yourToken>
- <assistantId>
- <updateAssistant> → contains the variables that should be updated. Variables that are not contained in <updateAssistant> remain unchanged.
```bash
curl --location --globoff 'https://gateway.<baseUrl>/chat/graphql' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer <yourToken>' \
--data '{"query":"mutation UpdateAssistant($updateAssistantId: String!, $input: AssistantUpdateInput!) {\n updateAssistant(id: $updateAssistantId, input: $input) {\n name\n id\n languageModel\n settings\n chatUpload\n modules {\n id\n name\n configuration\n }\n }\n}","variables":{"updateAssistantId":"<assistantId>","input":{<updateAssistant>}}}'
```
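As a sketch of what <updateAssistant> could contain, the snippet below renames the assistant and switches its language model. The field names are assumed to mirror the create input shown further below; check which fields AssistantUpdateInput supports in your deployment before using them.

```json
"name": "Internal Knowledge (updated)",
"languageModel": "AZURE_GPT_4_TURBO_2024_0409",
"chatUpload": "Disabled"
```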
Delete an assistant
To delete an assistant, use the following cURL. Replace these placeholders:

- <baseUrl> / <yourToken>
- <assistantId>
```bash
curl --location --globoff 'https://gateway.<baseUrl>/chat/graphql' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer <yourToken>' \
--data '{"query":"mutation DeleteAssistant($deleteAssistantId: String!) {\n deleteAssistant(id: $deleteAssistantId) {\n name\n }\n}","variables":{"deleteAssistantId":"<assistantId>"}}'
```
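The mutation returns the name of the deleted assistant. A successful response therefore looks roughly like this (illustrative sketch based on the selection set above; the name is taken from the example further below):

```json
{
  "data": {
    "deleteAssistant": {
      "name": "Internal Knowledge"
    }
  }
}
```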
Assistant variables and settings
The following table lists the parameters that can be set when an assistant is created or updated.
| Variable | Description | Options | optional / required |
|---|---|---|---|
| `name` | Name of the assistant | | required |
| `fallbackModule` | Fallback module used if module selection could not find a suitable module | | required |
| `languageModel` | Language model used for module selection | see the section below with available GPT models | optional |
| `chatUpload` | Enable/disable the upload of documents into the chat | | optional |
| `settings` | Collection of various settings, e.g. `showPdfHighlighting`, `modelChoosing`, `isPinned` (see example below) | | optional |
| `modules` | List of modules to create. See Module dependent configurations for details about the configuration of a module | | required |
Example input for an Internal Knowledge assistant containing two modules (`SearchInVectorDB`, `Translate`):
{ "input": { "name": "Internal Knowledge", "fallbackModule": "SearchInVectorDB", "languageModel": "AZURE_GPT_35_TURBO_0613", "chatUpload": "Disabled", "settings": { "showPdfHighlighting": true, "modelChoosing": "BY_FUNCTION_CALL", "isPinned": true }, "modules": { "create": [ { "name": "SearchInVectorDB", "configuration": { }, "description": null, "isExternal": false, "weight": 10000 }, { "name": "Translate", "configuration": { }, "description": null, "isExternal": false, "weight": 6000 } ] } } }
You can find more about assistants here: Assistants
Module
Create a module
This can be done with the following cURL. Just replace these placeholders:

- <baseUrl> / <yourToken>
- <assistantId> → ID of the assistant where the module should be created
- <moduleDefinition> → definition of the new module (see example below)
```bash
curl --location --globoff 'https://gateway.<baseUrl>/chat/graphql' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer <yourToken>' \
--data '{"query":"mutation CreateModule($assistantId: String!, $input: ModuleCreateInput!) {\n createModule(assistantId: $assistantId, input: $input) {\n id\n name\n assistantId\n }\n}","variables":{"assistantId":"<assistantId>","input":<moduleDefinition>}}'
```
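A minimal <moduleDefinition>, sketched from the module entries in the assistant example above, could look like the snippet below. The weight and configuration values are illustrative; see Module variables and configuration further below for the full set of parameters, including `toolDefinition`.

```json
{
  "name": "Translate",
  "configuration": {},
  "isExternal": false,
  "weight": 6000
}
```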
Update a module
This can be done with the following cURL. Just replace these placeholders:

- <baseUrl> / <yourToken>
- <moduleId>
- <updatesModule> → new set of module parameters
```bash
curl --location --globoff 'https://gateway.<baseUrl>/chat/graphql' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer <yourToken>' \
--data '{"query":"mutation UpdateModule($input: ModuleUpdateInput!, $moduleId: String!) {\n updateModule(input: $input, moduleId: $moduleId) {\n id\n name\n configuration\n weight\n }\n}","variables":{"input":{<updatesModule>},"moduleId":"<moduleId>"}}'
```
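For example, <updatesModule> could contain the following to change a module's weight and language model. The field names are assumed to match the module parameters described below; treat this as a sketch, not a definitive ModuleUpdateInput payload.

```json
"weight": 8000,
"configuration": { "languageModel": "AZURE_GPT_4_TURBO_2024_0409" }
```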
Delete a module
This can be done with the following cURL. Just replace these placeholders:

- <baseUrl> / <yourToken>
- <moduleId>
```bash
curl --location --globoff 'https://gateway.<baseUrl>/chat/graphql' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer <yourToken>' \
--data '{"query":"mutation DeleteModule($moduleId: String!) {\n deleteModule(moduleId: $moduleId)\n}","variables":{"moduleId":"<moduleId>"}}'
```
Module variables and configuration
The most important parameters of a module are:

- `configuration` → depends on the module. See the section Module dependent configurations for more details
- `isExternal` → flag indicating whether the module is external (i.e. developed with the SDK)
- `weight` → defines the order of the modules in the module selection prompt
- `toolDefinition` → description of the module used for function calling. The same structure as OpenAI is used here (which is why this parameter is called toolDefinition and not moduleDefinition)

Below is an example of the parameters for a module:
{ "name": "SearchInVectorDB", "configuration": { <collectionOfParameters> }, "isExternal": false, "weight": 10000, "toolDefinition": { "type": "function", "function": { "name": "SearchInVectorDB", "description": "Search information in the employee knowledge base for a specific question or assignment, e.g. explain, elaborate or describe. If the employee mentions a specific document and you do not know it, ALWAYS use this function. The employee can ask specific formats. Some examples: 'summarise directive 76 in bullet points', 'how to export data to third-parties', 'what are employee benefits'.", "parameters": { "type": "object", "properties": { "instruction": { "type": "string", "description": "The question to search in the knowledge base, e.g. Was gibt es zu essen?" } } } } } }
Module dependent configurations
| Module | Example | Description | Parameter | Options |
|---|---|---|---|---|
| | `"configuration": { "languageModel": "AZURE_GPT_4_TURBO_1106", "scopeIds": ["scope_1", "scope_2"], "searchType": "COMBINED", "chunkedSources": true, "scopeToChatOnUpload": true, "historyIncluded": false, "maxTokens": 7000, "ftsSearchLanguage": "english" }` | GPT model to be used | `languageModel` | see the section below with available GPT models |
| | | Scopes that the module can access | `scopeIds` | |
| | | RAG approach used to search for chunks | `searchType` | |
| | | Defines whether chunks of the same document are appended as individual sources to the GPT content or merged into one source | `chunkedSources` | |
| | | Scope restriction to documents that are uploaded. If no documents are uploaded, the scopes in `scopeIds` are relevant. | `scopeToChatOnUpload` | |
| | | Flag that allows including the previous chat conversation in GPT calls only if the new user input is a follow-up question | `historyIncluded` | |
| | | Max tokens used by sources and the previous conversation | `maxTokens` | Default value depends on the used model |
| | | Specifies the primary language used for full-text search. This should match the predominant language of the documents in the knowledge centre. | `ftsSearchLanguage` | |
| | `"configuration": { "languageModel": "AZURE_GPT_4_TURBO_1106", "chunkedSources": true }` | GPT model to be used | `languageModel` | |
| | | Defines whether chunks of the same document are appended as individual sources to the GPT content or merged into one source | `chunkedSources` | |
| | `"configuration": { "languageModel": "AZURE_GPT_4_TURBO_1106" }` | GPT model to be used | `languageModel` | |
| | `"configuration": { "languageModel": "AZURE_GPT_4_TURBO_1106" }` | GPT model to be used | `languageModel` | |
| | `"configuration": { "languageModel": "AZURE_GPT_4_TURBO_1106", "temperature": 0.5, "systemPromptExternalKnowledge": "Example system prompt", "maxHistoryInteraction": 2 }` | GPT model to be used | `languageModel` | |
| | | Temperature (ChatGPT) | `temperature` | Range: 0-1, Default: 0.5 |
| | | System prompt | `systemPromptExternalKnowledge` | The default system prompt is (depending on the used model): "You are ChatGPT, a large language model trained by OpenAI, based on the GPT-4 architecture.\nKnowledge cutoff: 2023-04.\nCurrent date: DAYDATE." |
| | | Maximum number of user-assistant interactions taken into account in the history | `maxHistoryInteraction` | Default: 2 |
| | `"configuration": { "languageModel": "AZURE_GPT_4_TURBO_1106" }` | GPT model to be used | `languageModel` | |
| | `"configuration": { "languageModel": "AZURE_GPT_4_TURBO_1106", "tableConfig": TableConfig[], "searchExamples": ChatCompletionRequestMessage[], "showTableReference": boolean }` | GPT model to be used | `languageModel` | |
| | | tbd | `tableConfig` | |
| | | tbd | `searchExamples` | |
| | | tbd | `showTableReference` | |
| | `"configuration": { "languageModel": "AZURE_GPT_4_TURBO_1106", "scopeId": "scope_1", "maxTokens": 7000, "templateName": "template.xlsx" }` | GPT model to be used | `languageModel` | |
| | | Scope that the module can access | `scopeId` | |
| | | Max tokens used by sources and the previous conversation | `maxTokens` | |
| | | Name of the Excel template file that will be filled with extracted values. It needs to be uploaded to the same `scopeId`. | `templateName` | |
| | `"configuration": { "languageModel": "AZURE_GPT_4_TURBO_1106" }` | GPT model to be used | `languageModel` | |
| | `"configuration": { "languageModel": "AZURE_GPT_4_TURBO_1106" }` | GPT model to be used | `languageModel` | |
Available GPT models
The table below contains the available GPT models with the corresponding key that has to be used in the configurations for assistants and modules:
Model | Key |
---|---|
GPT-35-turbo (0301) | AZURE_GPT_35_TURBO |
GPT-35-turbo (0613) | AZURE_GPT_35_TURBO_0613 |
GPT-35-turbo-16K (0613) | AZURE_GPT_35_TURBO_16K |
GPT-4 (0613) | AZURE_GPT_4_0613 |
GPT-4-32K (0613) | AZURE_GPT_4_32K_0613 |
GPT-4-turbo (0409) | AZURE_GPT_4_TURBO_2024_0409 |
GPT versions available in preview mode (not recommended for use in production applications):
Model | Key |
---|---|
GPT-4-turbo (1106) | AZURE_GPT_4_TURBO_1106 |
You can find more about modules here: Modules