Callback Handler that records transcripts to the Context service.
Chat Example:
    >>> from langchain_openai import ChatOpenAI
    >>> from langchain_core.messages import HumanMessage, SystemMessage
    >>> from langchain_community.callbacks import ContextCallbackHandler
    >>> context_callback = ContextCallbackHandler(
    ...     token="<CONTEXT_TOKEN_HERE>",
    ... )
    >>> chat = ChatOpenAI(
    ...     temperature=0,
    ...     headers={"user_id": "123"},
    ...     callbacks=[context_callback],
    ...     openai_api_key="API_KEY_HERE",
    ... )
    >>> messages = [
    ...     SystemMessage(content="You translate English to French."),
    ...     HumanMessage(content="I love programming with LangChain."),
    ... ]
    >>> chat.invoke(messages)
Chain Example:
    >>> from langchain_classic.chains import LLMChain
    >>> from langchain_openai import ChatOpenAI
    >>> from langchain_core.prompts import (
    ...     ChatPromptTemplate,
    ...     HumanMessagePromptTemplate,
    ...     PromptTemplate,
    ... )
    >>> from langchain_community.callbacks import ContextCallbackHandler
    >>> human_message_prompt = HumanMessagePromptTemplate(
    ...     prompt=PromptTemplate(
    ...         template="What is a good name for a company that makes {product}?",
    ...         input_variables=["product"],
    ...     ),
    ... )
    >>> chat_prompt_template = ChatPromptTemplate.from_messages(
    ...     [human_message_prompt]
    ... )
    >>> callback = ContextCallbackHandler(token="<CONTEXT_TOKEN_HERE>")
Note: the same callback object must be shared between the LLM and the chain.
    >>> chat = ChatOpenAI(temperature=0.9, callbacks=[callback])
    >>> chain = LLMChain(
    ...     llm=chat,
    ...     prompt=chat_prompt_template,
    ...     callbacks=[callback],
    ... )
    >>> chain.run("colorful socks")
Run when the chat model starts.
Run when the LLM ends.
Run when the chain starts.
Run when the chain ends.
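The four hooks above are the points where the handler records transcript events. The following is a minimal, dependency-free sketch of that recording pattern; it is an illustration only, with a hypothetical `TranscriptRecorder` class and simplified method signatures — the real handler subclasses `BaseCallbackHandler` from `langchain_core.callbacks` and forwards these events to the Context service.

```python
class TranscriptRecorder:
    """Hypothetical sketch: collects events from the four hooks above."""

    def __init__(self):
        self.events = []

    def on_chat_model_start(self, messages):
        # Run when the chat model starts: record the input messages.
        self.events.append(("chat_model_start", list(messages)))

    def on_llm_end(self, response):
        # Run when the LLM ends: record the model's reply.
        self.events.append(("llm_end", response))

    def on_chain_start(self, inputs):
        # Run when the chain starts: record its inputs.
        self.events.append(("chain_start", inputs))

    def on_chain_end(self, outputs):
        # Run when the chain ends: record its outputs.
        self.events.append(("chain_end", outputs))


recorder = TranscriptRecorder()
recorder.on_chat_model_start(["You translate English to French."])
recorder.on_llm_end("J'adore programmer.")
assert [name for name, _ in recorder.events] == ["chat_model_start", "llm_end"]
```

Because the framework invokes each hook in order as a run progresses, a single shared recorder object sees the whole transcript — which is why the note above says the same callback object must be passed to both the LLM and the chain.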