ChatSparkLLM: IFLYTEK Spark chat model integration.
Setup:
To use, you should have the environment variables ``IFLYTEK_SPARK_API_KEY``,
``IFLYTEK_SPARK_API_SECRET`` and ``IFLYTEK_SPARK_APP_ID`` set.
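The credentials can be exported in the shell or set from Python before the model is constructed; a minimal sketch using placeholder values (replace them with the credentials from your IFLYTEK console):
.. code-block:: python

    import os

    # Placeholder values; substitute your real IFLYTEK credentials.
    os.environ["IFLYTEK_SPARK_APP_ID"] = "your-app-id"
    os.environ["IFLYTEK_SPARK_API_KEY"] = "your-api-key"
    os.environ["IFLYTEK_SPARK_API_SECRET"] = "your-api-secret"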
Key init args — completion params:
    model: Optional[str]
        Name of IFLYTEK SPARK model to use.
    temperature: Optional[float]
        Sampling temperature.
    top_k: Optional[float]
        Top-k sampling: the number of candidate tokens the model samples from.
    streaming: Optional[bool]
        Whether to stream the results or not.
Key init args — client params:
    api_key: Optional[str]
        IFLYTEK SPARK API KEY. If not passed in, will be read from env var IFLYTEK_SPARK_API_KEY.
    api_secret: Optional[str]
        IFLYTEK SPARK API SECRET. If not passed in, will be read from env var IFLYTEK_SPARK_API_SECRET.
    api_url: Optional[str]
        Base URL for API requests.
    timeout: Optional[int]
        Timeout for requests.
See full list of supported init args and their descriptions in the params section.
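If the environment variables above are set, the client params can be omitted and only the completion params supplied at construction; a hedged sketch (the model name is taken from the example below, the other values are illustrative):
.. code-block:: python

    from langchain_community.chat_models import ChatSparkLLM

    # Credentials are read from the IFLYTEK_SPARK_* environment variables.
    chat = ChatSparkLLM(
        model="Spark4.0 Ultra",  # illustrative model name
        temperature=0.5,         # illustrative sampling temperature
        streaming=False,
    )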
Instantiate:
.. code-block:: python
    from langchain_community.chat_models import ChatSparkLLM

    chat = ChatSparkLLM(
        api_key="your-api-key",
        api_secret="your-api-secret",
        model='Spark4.0 Ultra',
        # temperature=...,
        # other params...
    )
Invoke:
.. code-block:: python
    messages = [
        # System prompt: "You are a professional translator; translate the user's Chinese into English."
        ("system", "你是一名专业的翻译家,可以将用户的中文翻译为英文。"),
        # Human message: "I like programming."
        ("human", "我喜欢编程。"),
    ]
    chat.invoke(messages)
.. code-block:: python
    AIMessage(
        content='I like programming.',
        response_metadata={
            'token_usage': {
                'question_tokens': 3,
                'prompt_tokens': 16,
                'completion_tokens': 4,
                'total_tokens': 20
            }
        },
        id='run-af8b3531-7bf7-47f0-bfe8-9262cb2a9d47-0'
    )
Stream:
.. code-block:: python
    for chunk in chat.stream(messages):
        print(chunk)
.. code-block:: python
    content='I' id='run-fdbb57c2-2d32-4516-b894-6c5a67605d83'
    content=' like programming' id='run-fdbb57c2-2d32-4516-b894-6c5a67605d83'
    content='.' id='run-fdbb57c2-2d32-4516-b894-6c5a67605d83'
.. code-block:: python
    stream = chat.stream(messages)
    full = next(stream)
    for chunk in stream:
        full += chunk
    full
.. code-block:: python
    AIMessageChunk(
        content='I like programming.',
        id='run-aca2fa82-c2e4-4835-b7e2-865ddd3c46cb'
    )
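LangChain chat models also expose async counterparts (``ainvoke``, ``astream``) inherited from the base chat model interface; a minimal sketch reusing the ``chat`` and ``messages`` defined above (if the integration has no native async client, these fall back to the sync implementation):
.. code-block:: python

    import asyncio

    async def main():
        # Async single call.
        ai_msg = await chat.ainvoke(messages)
        print(ai_msg.content)

        # Async streaming of chunks.
        async for chunk in chat.astream(messages):
            print(chunk.content, end="")

    asyncio.run(main())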
Response metadata:
.. code-block:: python
    ai_msg = chat.invoke(messages)
    ai_msg.response_metadata
.. code-block:: python
    {
        'token_usage': {
            'question_tokens': 3,
            'prompt_tokens': 16,
            'completion_tokens': 4,
            'total_tokens': 20
        }
    }
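The usage figures can be read directly from that dictionary; for example, using the values shown above:
.. code-block:: python

    usage = ai_msg.response_metadata["token_usage"]
    usage["total_tokens"]       # 20
    usage["completion_tokens"]  # 4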