sunpeak API Documentation Index
Fetch the complete documentation index at: https://sunpeak.ai/docs/llms.txt
Use this file to discover all available pages before exploring further.
Overview
Returns a function to request LLM completions from the host via MCP sampling/createMessage. The host decides which model to use and may modify or reject the request.
Check getHostCapabilities()?.sampling before calling. For tool-augmented completions, check ?.sampling?.tools.
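As a sketch of that capability gating, assuming getHostCapabilities() returns an object with an optional sampling capability (the return shape and stub below are assumptions for illustration, not sunpeak's actual types):

```typescript
// Hedged sketch: gate sampling calls on host capabilities.
type HostCapabilities = {
  sampling?: {
    tools?: unknown; // present when the host supports tool-augmented sampling
  };
};

// Stub standing in for sunpeak's getHostCapabilities(): this host
// supports plain sampling but not tool use.
function getHostCapabilities(): HostCapabilities | undefined {
  return { sampling: {} };
}

const caps = getHostCapabilities();
const canSample = caps?.sampling !== undefined; // safe to call createSamplingMessage
const canUseTools = caps?.sampling?.tools !== undefined; // safe to pass tools
```

Optional chaining makes both checks safe even when the host reports no capabilities at all (getHostCapabilities() returning undefined).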
Import
Signature
CreateSamplingMessageParams
Standard MCP CreateMessageRequest params:
messages: Conversation messages to send to the model.
maxTokens: Maximum tokens in the response.
systemPrompt: System prompt for the model.
temperature: Sampling temperature.
tools: Tools the model can call. When provided, the result may include tool_use blocks.
Returns
createSamplingMessage
(params) => Promise<CreateMessageResult | CreateMessageResultWithTools | undefined>
Function to request an LLM completion.
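A minimal sketch of building the request params, following the MCP CreateMessageRequest fields described above. The type definitions here are illustrative assumptions, not sunpeak's actual exports:

```typescript
// Illustrative param types modeled on the MCP sampling spec.
type SamplingMessage = {
  role: "user" | "assistant";
  content: { type: "text"; text: string };
};

type CreateSamplingMessageParams = {
  messages: SamplingMessage[]; // conversation to send to the model
  maxTokens: number;           // cap on response length
  systemPrompt?: string;       // optional system prompt
  temperature?: number;        // optional sampling temperature
};

const params: CreateSamplingMessageParams = {
  messages: [
    { role: "user", content: { type: "text", text: "Summarize this chart." } },
  ],
  maxTokens: 256,
  systemPrompt: "You are a concise analyst.",
  temperature: 0.2,
};
```

An object like this would be passed to the returned createSamplingMessage function; the host may still modify or reject it.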
CreateMessageResult
model: Model used by the host.
role: Response role (typically "assistant").
content: Response content block.
stopReason: Why the model stopped ("endTurn", "toolUse", etc.).
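A sketch of handling the result. Since the host may reject the request, the promise can resolve to undefined, and stopReason distinguishes a finished turn from a tool-use request. The result type and the mock below are assumptions standing in for the real hook:

```typescript
// Illustrative result type modeled on the MCP sampling spec.
type CreateMessageResult = {
  model: string;                                         // model the host chose
  role: "assistant" | "user";                            // typically "assistant"
  content: { type: "text"; text: string };               // response content block
  stopReason?: "endTurn" | "toolUse" | "maxTokens" | string;
};

// Mock standing in for the real createSamplingMessage function.
async function createSamplingMessage(): Promise<CreateMessageResult | undefined> {
  return {
    model: "example-model",
    role: "assistant",
    content: { type: "text", text: "Hello!" },
    stopReason: "endTurn",
  };
}

async function run(): Promise<string> {
  const result = await createSamplingMessage();
  if (!result) return "request rejected by host"; // host declined sampling
  if (result.stopReason === "toolUse") {
    return "model requested a tool call"; // handle tool_use blocks here
  }
  return result.content.text;
}
```

Checking for undefined first keeps rejected requests from being mistaken for empty completions.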