Completions
You can get controlled text completions from any of the available models using the /completions endpoint or the Completion class (similar to the OpenAI API).
Generate a text completion
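As a minimal sketch, the request can be sent directly to the /completions endpoint over HTTP. The base URL, bearer-token header, and model name below are placeholders rather than values from this documentation; substitute the ones for your account.

```python
# Minimal sketch of calling the /completions endpoint with the requests library.
# The base URL, auth header, and model name are placeholders -- replace them
# with the values from your own account and model list.
import requests

API_BASE = "https://api.example.com/v1"  # placeholder base URL
API_KEY = "YOUR_API_KEY"                 # placeholder API key

response = requests.post(
    f"{API_BASE}/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "YOUR_MODEL",  # placeholder model name
        "prompt": "Write a one-line product tagline for a coffee shop.",
    },
)
response.raise_for_status()
print(response.json())
```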
The configuration of the output field/argument controls the structure and type of the LLM's output. This field/argument is optional, but using it is recommended. You can explore the options for configuring outputs here, and see the sketch below for how it fits into a request.
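For illustration only, here is how an output configuration might be passed alongside the prompt. The shape of the output value below (a simple field-to-type mapping) is a hypothetical example, not the documented schema, so consult the outputs reference for the actual options.

```python
# Illustrative sketch: the structure of the "output" configuration is a
# hypothetical field-to-type mapping, not the documented schema.
import requests

API_BASE = "https://api.example.com/v1"  # placeholder base URL
API_KEY = "YOUR_API_KEY"                 # placeholder API key

payload = {
    "model": "YOUR_MODEL",  # placeholder model name
    "prompt": "Summarize the review and rate it from 1 to 5.",
    "output": {             # hypothetical output configuration
        "summary": "string",
        "rating": "int",
    },
}

response = requests.post(
    f"{API_BASE}/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
)
response.raise_for_status()
print(response.json())
```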