Parser for JSON output.
Exception that output parsers should raise to signify a parsing error.
This exists to differentiate parsing errors from other code or execution errors that may also arise inside the output parser. An OutputParserException can be caught and handled in ways that attempt to fix the parsing error, while other errors propagate as usual.
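For example, a caller might catch the exception and retry or re-prompt the model. A minimal sketch (JsonOutputParser is used here only as an illustration of a parser that can fail):

    from langchain_core.exceptions import OutputParserException
    from langchain_core.output_parsers import JsonOutputParser

    parser = JsonOutputParser()
    try:
        parser.parse("this is not valid JSON")
    except OutputParserException as exc:
        # Parsing failures surface here, so the caller can retry,
        # re-prompt the model, or fall back to a default value.
        print(f"Could not parse model output: {exc}")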
Base class for an output parser that can handle streaming input.
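A minimal sketch of a custom subclass (the UppercaseParser name and behavior are illustrative, not part of the library): implement parse for a single complete output, and the base class handles applying it to streamed chunks.

    from langchain_core.output_parsers import BaseTransformOutputParser

    class UppercaseParser(BaseTransformOutputParser[str]):
        """Toy parser that upper-cases the model's text output."""

        def parse(self, text: str) -> str:
            return text.upper()

    parser = UppercaseParser()
    print(parser.parse("hello"))                        # one-shot parsing
    print(list(parser.transform(iter(["he", "llo"]))))  # streamed chunks, parsed one by one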
A single text generation output.
Generation represents the response from an "old-fashioned" LLM (string-in, string-out) that generates regular text (not chat messages).
This model is used internally by chat models and will eventually be mapped to a more general LLMResult object, and then projected into an AIMessage object.
LangChain users working with chat models will usually access information via
AIMessage (returned from runnable interfaces) or LLMResult (available via
callbacks). Please refer to AIMessage and LLMResult for more information.
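A minimal sketch of constructing one directly (the field values are illustrative):

    from langchain_core.outputs import Generation

    # Wraps the raw string returned by a text-completion LLM, plus optional
    # provider-specific metadata.
    gen = Generation(
        text="Paris is the capital of France.",
        generation_info={"finish_reason": "stop"},
    )
    print(gen.text)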
Parse the output of an LLM call to a JSON object.
It is probably the most reliable output parser for extracting structured data without using function calling.
When used in streaming mode, it will yield partial JSON objects containing all the keys that have been returned so far.
In streaming mode, if diff is set to True, it instead yields JSONPatch operations describing the difference between the previous and the current partial object.
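A minimal sketch of both behaviors, feeding hand-written chunks through the parser's transform method to simulate a token stream (the example JSON and chunk boundaries are illustrative):

    from langchain_core.output_parsers import JsonOutputParser

    parser = JsonOutputParser()

    # One-shot parsing of a complete model response.
    print(parser.parse('{"name": "Ada", "age": 36}'))

    # Streaming: each yielded value is the partial JSON object built from
    # everything received so far.
    chunks = ['{"na', 'me": "A', 'da", "ag', 'e": 36}']
    for partial in parser.transform(iter(chunks)):
        print(partial)

    # With diff=True, JSONPatch operations are yielded instead, describing
    # the change between consecutive partial objects.
    diff_parser = JsonOutputParser(diff=True)
    for patch in diff_parser.transform(iter(chunks)):
        print(patch)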