ChatKinetica()

Kinetica LLM Chat Model API.
Prerequisites for using this API:
* The gpudb and typeguard packages installed.
* Kinetica connection credentials set in the environment variables KINETICA_URL, KINETICA_USER, and KINETICA_PASSWD.

This API is intended to interact with the Kinetica SqlAssist LLM that supports
generation of SQL from natural language.
In the Kinetica LLM workflow you create an LLM context in the database that provides
the information needed for inferencing, including tables, annotations, rules, and
samples. Invoking load_messages_from_context() will retrieve the context
information from the database so that it can be used to create a chat prompt.
The chat prompt consists of a SystemMessage and pairs of
HumanMessage/AIMessage that contain the samples, which are question/SQL
pairs. You can append additional pairs of samples to this list, but it is not intended to
facilitate a typical natural language conversation.
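For illustration, a minimal sketch of building a chat prompt from a context follows. It assumes that kinetica_llm is an already connected ChatKinetica instance (see the connection examples below) and that an LLM context named demo.test_llm_ctx exists in the database; the context name is illustrative.

.. code-block:: python

    from langchain_core.prompts import ChatPromptTemplate

    # Retrieve the SystemMessage and the HumanMessage/AIMessage sample pairs
    # stored in the LLM context. (Context name is illustrative.)
    ctx_messages = kinetica_llm.load_messages_from_context("demo.test_llm_ctx")

    # Append a placeholder for the question that will be asked at run time.
    ctx_messages.append(("human", "{input}"))

    # Build the chat prompt used to generate SQL.
    prompt_template = ChatPromptTemplate.from_messages(ctx_messages)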
When you create a chain from the chat prompt and execute it, the Kinetica LLM will
generate SQL from the input. Optionally you can use KineticaSqlOutputParser to
execute the SQL and return the result as a dataframe.
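Continuing the sketch above, a chain that generates and then executes SQL might look like the following; prompt_template and kinetica_llm come from the previous sketch, and the question text is illustrative.

.. code-block:: python

    from langchain_community.chat_models.kinetica import (
        KineticaSqlOutputParser,
        KineticaSqlResponse,
    )

    # The optional output parser executes the generated SQL on the same
    # connection and returns the result as a dataframe.
    chain = (
        prompt_template
        | kinetica_llm
        | KineticaSqlOutputParser(kdbc=kinetica_llm.kdbc)
    )

    # Illustrative question; any request covered by the LLM context works.
    response: KineticaSqlResponse = chain.invoke(
        {"input": "What are the female users ordered by username?"}
    )

    print(response.sql)        # the generated SQL
    print(response.dataframe)  # the result of executing it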
The following example creates an LLM using the environment variables for the Kinetica connection. This will fail if the API is unable to connect to the database.
Example:
.. code-block:: python
    from langchain_community.chat_models.kinetica import ChatKinetica

    kinetica_llm = ChatKinetica()
If you prefer to pass connection information directly, you can create a
connection using KineticaUtil.create_kdbc().
Example:
.. code-block:: python
    from langchain_community.chat_models.kinetica import (
        ChatKinetica,
        KineticaUtil,
    )

    kdbc = KineticaUtil.create_kdbc(url=url, user=user, passwd=passwd)
    kinetica_llm = ChatKinetica(kdbc=kdbc)