PromptLayerChatOpenAI()
Method summaries:

- Get the namespace of the langchain object.
- Build extra kwargs from additional params that were passed in.
- Validate that the API key and Python package are present in the environment.
- Use tenacity to retry the completion call.
- Get the tokens present in the text with the tiktoken package.
- Calculate the number of tokens for gpt-3.5-turbo and gpt-4 with the tiktoken package (see the sketch after this list).
- Bind functions (and other objects) to this chat model.
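For instance, the tiktoken-backed token helpers can be used to estimate prompt size before a request is sent. A minimal sketch, assuming the ``openai``, ``promptlayer``, and ``tiktoken`` packages are installed and the required API keys are already set in the environment:

.. code-block:: python

    from langchain_community.chat_models import PromptLayerChatOpenAI
    from langchain_core.messages import HumanMessage

    llm = PromptLayerChatOpenAI(model="gpt-3.5-turbo")

    # Token ids for a raw string (computed with tiktoken).
    token_ids = llm.get_token_ids("Hello, world!")

    # Token count for a full message list, as counted for gpt-3.5-turbo and gpt-4.
    num_tokens = llm.get_num_tokens_from_messages([HumanMessage(content="Hello, world!")])
    print(len(token_ids), num_tokens)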
| Name | Type | Description |
|---|---|---|
| ``pl_tags`` | ``Optional[List[str]]`` | List of strings to tag the request with. |
| ``return_pl_id`` | ``Optional[bool]`` | If True, the PromptLayer request ID will be returned in the ``generation_info`` field of the ``Generation`` object. |
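A minimal sketch of how these two parameters are typically used; the ``pl_request_id`` key read below follows the PromptLayer integration examples and should be treated as an assumption for your installed version:

.. code-block:: python

    from langchain_community.chat_models import PromptLayerChatOpenAI
    from langchain_core.messages import HumanMessage

    chat = PromptLayerChatOpenAI(
        model="gpt-3.5-turbo",
        pl_tags=["langchain", "example"],  # tags shown in the PromptLayer dashboard
        return_pl_id=True,                 # attach the PromptLayer request ID to each generation
    )

    result = chat.generate([[HumanMessage(content="Tell me a joke.")]])

    # With return_pl_id=True, each generation's generation_info carries the request ID
    # (key name assumed to be "pl_request_id").
    pl_request_id = result.generations[0][0].generation_info["pl_request_id"]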
PromptLayer and OpenAI Chat large language models API.

To use, you should have the ``openai`` and ``promptlayer`` Python packages installed, and the environment variables ``OPENAI_API_KEY`` and ``PROMPTLAYER_API_KEY`` set to your OpenAI API key and PromptLayer API key, respectively.
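For example, the two keys can be set from Python before the model is constructed (a minimal sketch; the values below are placeholders, not real keys):

.. code-block:: python

    import os

    # Placeholders only; substitute your actual keys.
    os.environ["OPENAI_API_KEY"] = "sk-..."
    os.environ["PROMPTLAYER_API_KEY"] = "pl_..."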
All parameters that can be passed to the OpenAI LLM can also be passed here. PromptLayerChatOpenAI adds two optional parameters, ``pl_tags`` and ``return_pl_id``, described in the table above.
Example:

.. code-block:: python

    from langchain_community.chat_models import PromptLayerChatOpenAI

    openai = PromptLayerChatOpenAI(model="gpt-3.5-turbo")
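The resulting instance behaves like any other LangChain chat model; a brief follow-up sketch (assuming the environment variables above are set):

.. code-block:: python

    from langchain_core.messages import HumanMessage

    response = openai.invoke([HumanMessage(content="Say hello in French.")])
    print(response.content)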