Generation arguments.
Prompt file name.
Optional variables?: Record<string, string | number | boolean>. Variables to interpolate.
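As an illustration of what variable interpolation might look like, here is a minimal sketch. The `{{name}}` placeholder syntax and the `interpolate` helper are assumptions for illustration only; the source does not specify the template syntax the prompt files use.

```typescript
// Hedged sketch: interpolate variables of type string | number | boolean
// into a prompt template. Assumes a hypothetical {{key}} placeholder syntax.
function interpolate(
  template: string,
  variables: Record<string, string | number | boolean>
): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    // Replace known keys; leave unknown placeholders untouched.
    key in variables ? String(variables[key]) : match
  );
}
```

A call such as `interpolate("Hi {{name}}", { name: "Ada" })` would yield `"Hi Ada"` under these assumptions.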
Optional onChunk. Callback for each stream chunk.
Optional onFinish?: WrappedStreamTextOnFinishCallback<Tools>. Callback when the stream finishes; receives a WrappedStreamTextOnFinishEvent (cost optional).
Optional onError. Callback when the stream errors.
Returns: AI SDK stream result with textStream, fullStream, and metadata promises.
Use an LLM to stream text generation.
This function is a wrapper over the AI SDK's streamText. The prompt file sets model, messages, temperature, maxTokens, and providerOptions. Additional AI SDK options (tools, onChunk, onFinish, onError, etc.) can be passed through.
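To show how a caller might consume the returned stream result, here is a minimal sketch. The wrapper name `streamFromPrompt` and the stub implementation are assumptions (the source does not name the exported function); only the `textStream` async-iterable shape follows the AI SDK's streamText result.

```typescript
// Hedged sketch: consuming a stream result shaped like the AI SDK's
// streamText return value (an async-iterable textStream).
type StreamResult = { textStream: AsyncIterable<string> };

// Stub standing in for the real prompt-file-driven wrapper
// (hypothetical name; the source does not specify it).
function streamFromPrompt(_promptFile: string): StreamResult {
  async function* chunks(): AsyncGenerator<string> {
    yield "Hello, ";
    yield "world!";
  }
  return { textStream: chunks() };
}

async function main(): Promise<string> {
  const result = streamFromPrompt("greeting.prompt");
  let text = "";
  for await (const chunk of result.textStream) {
    text += chunk; // per-chunk handling, analogous to an onChunk callback
  }
  return text;
}
```

In real use, the result would also expose fullStream and metadata promises to await alongside the text.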