Interface for the input parameters of the LLMChain class.

llm: LLM Wrapper to use
prompt: Prompt object to use
callbackManager (optional, deprecated): Use callbacks instead. This feature is deprecated and will be removed in the future. It is not recommended for use.
callbacks (optional)
llmKwargs (optional): Kwargs to pass to the LLM
memory (optional)
metadata (optional)
outputKey (optional): Key to use for the output, defaults to "text"
outputParser (optional): OutputParser to use
tags (optional)
verbose (optional)
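A minimal sketch of constructing an LLMChain from these inputs, assuming the classic langchain package layout and an OpenAI LLM wrapper (import paths vary by version); the template string and input values are placeholders:

```typescript
import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
import { LLMChain } from "langchain/chains";

// llm: LLM Wrapper to use
const llm = new OpenAI({ temperature: 0 });

// prompt: Prompt object to use
const prompt = PromptTemplate.fromTemplate(
  "What is a good name for a company that makes {product}?"
);

// Build the chain from LLMChainInput fields
const chain = new LLMChain({
  llm,
  prompt,
  outputKey: "text", // defaults to "text"
  verbose: true,     // log intermediate steps
});

// The result object is keyed by outputKey (assumes an ES module context for top-level await)
const result = await chain.call({ product: "colorful socks" });
console.log(result.text);
```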