SummarizeArticle generates an LLM summary with provider selection and fallback support.
SummarizeArticleRequest specifies parameters for LLM article summarization.
LLM provider: "ollama", "groq", or "openrouter".
Headlines to summarize (max 8 used).
Summarization mode: "brief", "analysis", "translate", "" (default).
Geographic signal context to include in the prompt.
Variant: "full", "tech", or target language for translate mode.
Output language code, default "en".
Optional system prompt append for analytical framework instructions.
Optional article bodies paired 1:1 with headlines. When bodies[i] is
non-empty, the prompt interleaves it as grounding context under
headlines[i]; when empty, behavior is identical to the headline-only case.
Callers may supply a shorter array; missing entries are treated as empty.
Each body undergoes the same sanitization as headlines before reaching
the LLM prompt.
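The request rules above (max 8 headlines, bodies padded with empty entries when the caller supplies a shorter array) can be sketched as a small helper. The snake_case field names here are assumptions inferred from the field descriptions, not confirmed wire names:

```python
def build_request(headlines, bodies=None, provider="ollama", mode="", lang="en"):
    """Assemble a SummarizeArticleRequest-like payload (hypothetical field names)."""
    headlines = list(headlines)[:8]  # only the first 8 headlines are used
    bodies = list(bodies or [])
    # Missing body entries are treated as empty strings, keeping the 1:1 pairing.
    bodies += [""] * (len(headlines) - len(bodies))
    return {
        "provider": provider,        # "ollama", "groq", or "openrouter"
        "headlines": headlines,
        "bodies": bodies[:len(headlines)],
        "mode": mode,                # "brief", "analysis", "translate", or ""
        "lang": lang,                # output language code, default "en"
    }
```

A caller passing three headlines and one body gets the remaining two body slots filled with empty strings, matching the headline-only behavior for those entries.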
Successful response
SummarizeArticleResponse contains the LLM summarization result.
The generated summary text.
Model identifier used for generation.
Provider that produced the result (or "cache").
Token count from the LLM response.
Whether the client should try the next provider in the fallback chain.
Error message if the request failed.
Error type/name (e.g. "TypeError").
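The fallback flag in the response drives client-side provider selection: keep trying the next provider only while the server says a retry is worthwhile. A minimal sketch of that loop, assuming a transport function `call` and dict-shaped responses with hypothetical `error` / `should_fallback` keys:

```python
def summarize_with_fallback(call, request, providers=("ollama", "groq", "openrouter")):
    """Try each provider in order; stop early on success or a non-retryable error."""
    last_error = None
    for provider in providers:
        resp = call({**request, "provider": provider})
        if not resp.get("error"):
            return resp                  # success (provider may also report "cache")
        last_error = resp
        if not resp.get("should_fallback"):
            break                        # hard failure: do not try the next provider
    return last_error
```

If every provider fails, the last error response is returned so the caller can surface its error message and type.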
SummarizeStatus indicates the outcome of a summarization request.
SUMMARIZE_STATUS_UNSPECIFIED, SUMMARIZE_STATUS_SUCCESS, SUMMARIZE_STATUS_CACHED, SUMMARIZE_STATUS_SKIPPED, SUMMARIZE_STATUS_ERROR
Human-readable detail for non-success statuses (skip reason, etc.).
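Clients typically treat SUCCESS and CACHED alike, since both carry a usable summary, while SKIPPED and ERROR warrant surfacing the detail string. A sketch of that distinction; the numeric values are assumptions in the usual proto enum ordering, not confirmed by the source:

```python
from enum import Enum

class SummarizeStatus(Enum):
    # Numeric values assumed; SUMMARIZE_STATUS_ prefix dropped for brevity.
    UNSPECIFIED = 0
    SUCCESS = 1
    CACHED = 2
    SKIPPED = 3
    ERROR = 4

def is_usable(status):
    """SUCCESS and CACHED both deliver a summary; everything else does not."""
    return status in (SummarizeStatus.SUCCESS, SummarizeStatus.CACHED)
```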