Toxicity

You can get toxicity scores from the /toxicity endpoint or the Toxicity class in the Python client. Both take a single text parameter, which should contain the candidate text to be scored for toxicity. The output includes a score ranging from 0.0 to 1.0; the higher the score, the more toxic the text.

Generate a toxicity score

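Below is a minimal sketch of generating a score with the Python client. It assumes the client is installed as predictionguard, that the access token is supplied via a PREDICTIONGUARD_TOKEN environment variable, and that the Toxicity class exposes a check method accepting the text parameter described above; verify these names against the client you have installed.

```python
import os
import json

import predictionguard as pg

# Assumption: the client reads the access token from this
# environment variable.
os.environ["PREDICTIONGUARD_TOKEN"] = "<your access token>"

# Score a candidate text for toxicity. The check method name
# is an assumption based on the Toxicity class described above.
result = pg.Toxicity.check(
    text="This is a perfectly wonderful day."
)

# Pretty-print the returned toxicity check.
print(json.dumps(result, sort_keys=True, indent=4))
```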
The output will look something like:

```
{'checks': [{'score': 0.0009843118023127317, 'index': 0, 'status': 'success'}],
 'created': 1691966144,
 'id': 'toxi-KjfoOYRZlXraSXOdllOWh7nSsxJOk',
 'object': 'toxicity_check'}
```
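Each element of checks corresponds to one scored input: score is the toxicity score on the 0.0 to 1.0 scale described above, index identifies which input the score belongs to, and status reports whether the check succeeded.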