# ContextCallbackHandler

> **Class** in `langchain_community`

📖 [View in docs](https://reference.langchain.com/python/langchain-community/callbacks/context_callback/ContextCallbackHandler)

Callback handler that records chat transcripts to the [Context](https://context.ai) service.

## Signature

```python
ContextCallbackHandler(
    self,
    token: str = '',
    verbose: bool = False,
    **kwargs: Any,
)
```

## Description

**Chat Example:**

```python
from langchain_core.messages import HumanMessage, SystemMessage
from langchain_openai import ChatOpenAI
from langchain_community.callbacks import ContextCallbackHandler

context_callback = ContextCallbackHandler(token="<CONTEXT_TOKEN_HERE>")
chat = ChatOpenAI(
    temperature=0,
    headers={"user_id": "123"},
    callbacks=[context_callback],
    openai_api_key="API_KEY_HERE",
)
messages = [
    SystemMessage(content="You translate English to French."),
    HumanMessage(content="I love programming with LangChain."),
]
chat.invoke(messages)
```

**Chain Example:**

```python
from langchain_classic.chains import LLMChain
from langchain_core.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    PromptTemplate,
)
from langchain_openai import ChatOpenAI
from langchain_community.callbacks import ContextCallbackHandler

context_callback = ContextCallbackHandler(token="<CONTEXT_TOKEN_HERE>")
human_message_prompt = HumanMessagePromptTemplate(
    prompt=PromptTemplate(
        template="What is a good name for a company that makes {product}?",
        input_variables=["product"],
    ),
)
chat_prompt_template = ChatPromptTemplate.from_messages([human_message_prompt])

# Note: the same callback object must be shared between the LLM and the chain.
chat = ChatOpenAI(temperature=0.9, callbacks=[context_callback])
chain = LLMChain(
    llm=chat,
    prompt=chat_prompt_template,
    callbacks=[context_callback],
)
chain.run("colorful socks")
```

## Extends

- `BaseCallbackHandler`

## Constructors

```python
__init__(
    self,
    token: str = '',
    verbose: bool = False,
    **kwargs: Any,
) -> None
```

| Name | Type | Default |
|------|------|---------|
| `token` | `str` | `''` |
| `verbose` | `bool` | `False` |


## Properties

- `client`
- `chain_run_id`
- `llm_model`
- `messages`
- `metadata`

## Methods

- [`on_chat_model_start()`](https://reference.langchain.com/python/langchain-community/callbacks/context_callback/ContextCallbackHandler/on_chat_model_start)
- [`on_llm_end()`](https://reference.langchain.com/python/langchain-community/callbacks/context_callback/ContextCallbackHandler/on_llm_end)
- [`on_chain_start()`](https://reference.langchain.com/python/langchain-community/callbacks/context_callback/ContextCallbackHandler/on_chain_start)
- [`on_chain_end()`](https://reference.langchain.com/python/langchain-community/callbacks/context_callback/ContextCallbackHandler/on_chain_end)
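These hooks fire at different points of a run: `on_chat_model_start` sees the prompt messages sent to the chat model, and `on_llm_end` sees the completed generation. To illustrate that lifecycle, here is a minimal, dependency-free sketch of a transcript recorder. The `TranscriptRecorder` class and its simplified method signatures are hypothetical stand-ins for illustration only, not the actual `ContextCallbackHandler` implementation (which talks to the Context API):

```python
class TranscriptRecorder:
    """Hypothetical stand-in mimicking the handler's hook lifecycle:
    buffer messages on model start, flush a transcript on LLM end."""

    def __init__(self, token: str = "") -> None:
        self.token = token
        self.messages: list[dict] = []           # conversation buffered so far
        self.transcripts: list[list[dict]] = []  # completed transcript entries

    def on_chat_model_start(self, messages: list[dict]) -> None:
        # Buffer the prompt messages sent to the chat model.
        self.messages.extend(messages)

    def on_llm_end(self, generation: str) -> None:
        # Append the model's reply, then flush the buffered
        # conversation as one transcript entry.
        self.messages.append({"role": "assistant", "content": generation})
        self.transcripts.append(list(self.messages))
        self.messages.clear()


recorder = TranscriptRecorder(token="<CONTEXT_TOKEN_HERE>")
recorder.on_chat_model_start([{"role": "user", "content": "Hello"}])
recorder.on_llm_end("Bonjour")
```

The real handler follows the same shape but uploads the flushed transcript to Context rather than keeping it in memory, which is why a single handler instance must observe the whole conversation.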

---

[View source on GitHub](https://github.com/langchain-ai/langchain-community/blob/4b280287bd55b99b44db2dd849f02d66c89534d5/libs/community/langchain_community/callbacks/context_callback.py#L29)