Tool Definition

Motivation

Tool definitions are essential for managing multiple modules within a single assistant. They help determine which module the assistant should utilize based on the user's request. This capability is crucial for efficiently handling various tasks and providing accurate responses.

What is a Tool Definition?

A tool definition is a structured format used to define custom modules that a language model can call to perform specific tasks. This allows for a more versatile and powerful interaction with the assistant.

Structure of a Tool Definition

A typical tool definition includes the following components:

  1. type: Specifies that the entity is a function.

  2. function: Contains the function's definition.

    • name: The name of the function.

    • description: Provides a brief overview of when and how the function should be used.

    • parameters: Defines the parameters required for the function.

      • type: Specifies the type of the parameters container (always "object").

      • required: Lists mandatory parameters.

      • properties: Details each parameter's type and description.

        • type: The data type of the parameter (e.g., "string").

        • description: A brief description of what the parameter is and how it should be used.

Example JSON

{
  "type": "function",
  "function": {
    "name": "EmailWriter",
    "description": "Useful if the task is ONLY asking to write an email e.g., 'Write the above message as an email to Christian' or 'write this as an email'. The employee is not asking a question.",
    "parameters": {
      "type": "object",
      "properties": {
        "instruction": {
          "type": "string",
          "description": "The instruction given by the employee e.g., 'Write this as an email'."
        }
      }
    }
  }
}

Handling Tool Definitions by LLM Type

The handling of tool definitions varies depending on the model and the advanced settings chosen.

BY_FUNCTION_CALL: Module selection uses OpenAI's native function-calling mechanism. For more information, refer to the OpenAI Platform documentation. This approach is only available when OpenAI models are used.
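As a minimal sketch of the BY_FUNCTION_CALL path, the snippet below builds the request payload that would carry the EmailWriter tool definition to OpenAI's chat-completions endpoint. The model name and user message are illustrative assumptions, and the actual API call is shown only as a comment so the example stays self-contained.

```python
import json

# Tool definition from the Example JSON section, reused verbatim.
email_writer_tool = {
    "type": "function",
    "function": {
        "name": "EmailWriter",
        "description": (
            "Useful if the task is ONLY asking to write an email e.g., "
            "'Write the above message as an email to Christian' or "
            "'write this as an email'. The employee is not asking a question."
        ),
        "parameters": {
            "type": "object",
            "properties": {
                "instruction": {
                    "type": "string",
                    "description": "The instruction given by the employee "
                                   "e.g., 'Write this as an email'.",
                }
            },
        },
    },
}

# With BY_FUNCTION_CALL, the definitions travel in the `tools` parameter
# of a chat-completions request, e.g. (not executed here):
#
#   from openai import OpenAI
#   client = OpenAI()
#   response = client.chat.completions.create(
#       model="gpt-4o",  # any model that supports function calling
#       messages=[{"role": "user", "content": "Write this as an email"}],
#       tools=[email_writer_tool],
#   )

# The payload below is the JSON body such a request would carry.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Write this as an email"}],
    "tools": [email_writer_tool],
}
print(json.dumps(payload, indent=2))
```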

BY_PROMPT: This setting works with all large language models, including OpenAI models. In this approach, the module descriptions are embedded in the system message of the LLM call, and the model is asked to select the most appropriate module. This option is necessary for LLMs that do not support OpenAI-style function calling.
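The BY_PROMPT approach can be sketched as follows. The module names, descriptions, and prompt wording here are invented for illustration and do not reflect the platform's actual internal prompt; the point is only that every module description is folded into one system message and the model is asked to answer with a single module name.

```python
# Hypothetical module registry: names and descriptions as they would
# appear in each module's tool definition.
modules = {
    "EmailWriter": "Useful if the task is ONLY asking to write an email.",
    "DocumentSearch": "Useful when the employee asks a question that "
                      "requires looking up company documents.",
}

def build_selection_prompt(modules: dict) -> str:
    """Embed all module descriptions in one system message and instruct
    the LLM to reply with exactly one module name."""
    lines = [f"- {name}: {description}" for name, description in modules.items()]
    return (
        "You are a module router. Pick exactly one module for the "
        "user's request and reply with its name only.\n"
        "Available modules:\n" + "\n".join(lines)
    )

system_message = build_selection_prompt(modules)
print(system_message)
```

The resulting string would be sent as the system message of the LLM call, with the user's request as the user message; the model's one-word reply names the chosen module.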

For more details on configuring these settings, visit Chat Interface - Advanced Settings.

Parallel Function Calling

Unlike OpenAI's API, which can return multiple function calls in parallel, our current implementation allows only one module to be chosen and executed at a time. This ensures clarity and precision in task handling.
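The one-module-at-a-time rule can be illustrated with the sketch below. The shape of `tool_calls` is a simplified stand-in, not the platform's actual internal format: even if the model proposes several calls, only the first is kept for execution.

```python
from typing import Optional

# Hypothetical, simplified shape of the tool calls a model might return.
tool_calls = [
    {"name": "EmailWriter", "arguments": {"instruction": "Write this as an email"}},
    {"name": "DocumentSearch", "arguments": {"query": "vacation policy"}},
]

def pick_single_module(tool_calls: list) -> Optional[dict]:
    """Enforce single-module execution: keep the first proposed call
    and discard the rest."""
    if not tool_calls:
        return None
    return tool_calls[0]

chosen = pick_single_module(tool_calls)
```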

Adjusting Tool Definitions

Adjust a tool definition only if the wrong module is chosen for a significant share of user requests. Tune the definition against the descriptions and observed behavior of the other modules in the assistant; this fine-tunes the accuracy and efficiency of module selection.

Conclusion

Tool definitions are vital for efficiently managing multiple modules in a single assistant, ensuring accurate task handling and response generation. Whether using OpenAI or other LLMs, understanding and configuring tool definitions appropriately enhances the assistant's functionality.


Author

@Pascal Hauri

 

© 2024 Unique AG. All rights reserved.