You can get controlled text completions from any of the available models using the /completions endpoint or the Completion class (similar to the OpenAI API).
The output field/argument controls the structure and type of the LLM's output. It is optional, but recommended. You can explore the options for configuring outputs here.
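As a rough sketch, a structured-completion request would carry the output specification alongside the prompt. The endpoint path, model name, prompt, and the JSON-schema shape below are illustrative assumptions, not the library's exact API:

```python
import json

# Hypothetical /completions request payload (names and schema shape
# are assumptions for illustration, not the exact API).
payload = {
    "model": "example-model",
    "prompt": "Extract the user's name and age from: 'Alice is 30.'",
    # Optional "output" field: constrains the structure and type of
    # the model's response (here, an object with two typed fields).
    "output": {
        "type": "object",
        "properties": {
            "name": {"type": "string"},
            "age": {"type": "integer"},
        },
    },
}

# Serialize the payload as the JSON request body.
body = json.dumps(payload)
print(body)
```

With a payload like this, the model's completion would be expected to conform to the declared schema rather than free-form text; omitting the output field would fall back to an unconstrained completion.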