langchain.js

    Class TaskCreationChain

    Chain to generate tasks.

    Hierarchy

        LLMChain
            TaskCreationChain


    Properties

    lc_serializable: boolean = true
    llm: any

    LLM Wrapper to use

    llmKwargs?: any

    Kwargs to pass to LLM

    memory?: any
    outputKey: string = "text"

    Key to use for output. Defaults to "text".

    outputParser?: any

    OutputParser to use

    prompt: BasePromptTemplate

    Prompt object to use
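
    As a rough illustration of how these properties fit together, here is a minimal sketch that constructs a plain LLMChain with an explicit llm, prompt, and outputKey. The model, the prompt text, and the input key ("adjective") are placeholders for illustration, not part of TaskCreationChain itself, and import paths may differ by langchain.js version.

    import { OpenAI } from "langchain/llms/openai";
    import { PromptTemplate } from "langchain/prompts";
    import { LLMChain } from "langchain/chains";

    // Placeholder prompt; the real TaskCreationChain builds its own
    // task creation prompt internally.
    const prompt = PromptTemplate.fromTemplate("Tell me a {adjective} joke.");

    const chain = new LLMChain({
      llm: new OpenAI({ temperature: 0 }), // LLM Wrapper to use
      prompt,                              // Prompt object to use
      outputKey: "text",                   // Key to use for output (default)
    });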

    Accessors

    • get inputKeys(): any

      Returns any

    • get lc_namespace(): string[]

      Returns string[]

    • get outputKeys(): string[]

      Returns string[]

    Methods

    • Return the string type key uniquely identifying this class of chain.

      Returns "llm"

    • Parameters

      • values: any

      Returns Promise<any>

    • Parameters

      • text: string

      Returns Promise<number>

    • Call the chain on all inputs in the list.

      Deprecated: Use .batch() instead. Will be removed in 0.2.0.

      Parameters

      • inputs: ChainValues[]
      • Optional config: any[]

      Returns Promise<ChainValues[]>
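
      Since this method is deprecated, here is a hedged sketch of the recommended .batch() replacement, reusing the placeholder chain and input key from the properties example above:

      // .batch() runs the chain over a list of inputs and resolves to one
      // ChainValues object per input.
      const results = await chain.batch([
        { adjective: "funny" },
        { adjective: "sad" },
      ]);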

    • Run the core logic of this chain and add to output if desired.

      Wraps _call and handles memory.

      Parameters

      • values: any
      • Optional config: any

      Returns Promise<ChainValues>
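
      A short usage sketch, again reusing the placeholder chain from the properties example (the "adjective" key belongs to that placeholder prompt):

      // call() wraps _call, handles memory if configured, and resolves to a
      // ChainValues object keyed by outputKey ("text" by default).
      const res = await chain.call({ adjective: "funny" });
      console.log(res.text);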

    • Invokes the chain with the provided input and returns the output.

      Parameters

      • input: ChainValues

        Input values for the chain run.

      • Optional options: any

      Returns Promise<ChainValues>

      Promise that resolves with the output of the chain run.
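
      A minimal usage sketch, assuming the placeholder chain and input key from the properties example above:

      // invoke() is the preferred entry point; the result is keyed by outputKey.
      const output = await chain.invoke({ adjective: "funny" });
      console.log(output.text);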

    • Format prompt with values and pass to LLM

      Parameters

      • values: any

        keys to pass to prompt template

      • Optional callbackManager: any

        CallbackManager to use

      Returns Promise<string>

      Completion from LLM.

      llm.predict({ adjective: "funny" })
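
      Expanding that one-liner into a fuller sketch, using the placeholder chain and prompt from the properties example (the "adjective" key belongs to that placeholder prompt):

      // predict() formats the prompt with the given values, calls the LLM,
      // and resolves to the raw string completion rather than a ChainValues object.
      const completion = await chain.predict({ adjective: "funny" });
      console.log(completion);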
      
    • Parameters

      • inputs: Record<string, unknown>
      • outputs: Record<string, unknown>
      • returnOnlyOutputs: boolean = false

      Returns Promise<Record<string, unknown>>

    • Parameters

      • input: any
      • Optional config: any

      Returns Promise<string>

      Deprecated: Use .invoke() instead. Will be removed in 0.2.0.
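
      For completeness, a hedged sketch of this deprecated form, assuming a chain with exactly one input variable (as in the placeholder prompt above); prefer .invoke() in new code:

      // run() accepts the single input value directly and resolves to the
      // output string rather than a ChainValues object.
      const text = await chain.run("funny");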

    • Creates a new TaskCreationChain instance. It takes an object of type LLMChainInput with the 'prompt' field omitted, uses the PromptTemplate class to build a prompt from the task creation template and the input variables, and then constructs the new TaskCreationChain from this prompt and the remaining fields of the input object.

      Parameters

      • fields: Omit<LLMChainInput, "prompt">

        An object of type LLMChainInput, omitting the 'prompt' field.

      Returns LLMChain

      A new instance of TaskCreationChain.
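
      A hedged sketch of typical usage, assuming the static factory described above is exposed as fromLLM (as in the upstream BabyAGI example) and that only the llm field is required, since the prompt is built internally; the import path is an assumption and may differ by langchain.js version:

      import { OpenAI } from "langchain/llms/openai";
      // Import path is an assumption; adjust to wherever TaskCreationChain
      // is exported in your setup.
      import { TaskCreationChain } from "langchain/experimental/babyagi";

      const taskCreationChain = TaskCreationChain.fromLLM({
        llm: new OpenAI({ temperature: 0 }),
      });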

    • Returns string