Agents
(Run this example in Google Colab here)
Prompts, chaining, and prompt engineering are important. However, you might not always know which chain or prompts you need to execute before receiving user input or new data. This is where automation and agents can help. This is an active area of development, but some very useful tooling is available.
In the following, we will explore using LangChain agents with Prediction Guard LLMs to decide on and automate LLM actions.
We will use LangChain again, but we will also use a Google search API called SerpAPI. You can get a free API key for SerpAPI here.
We will use Python to show an example:
Dependencies and Imports
You will need to install the Prediction Guard, LangChain, and Google Search Results dependencies in your Python environment.
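The exact package names can change between releases, but a typical install (assuming the `predictionguard`, `langchain`, `langchain-community`, and `google-search-results` PyPI packages) looks something like this:

```bash
pip install predictionguard langchain langchain-community google-search-results
```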
Now import PredictionGuard and the other dependencies, set up your API key, and create the client.
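A minimal sketch of that setup is shown below, assuming the Prediction Guard integration shipped with `langchain_community` and the `PREDICTIONGUARD_API_KEY` / `SERPAPI_API_KEY` environment variable names; the model name is only a placeholder, so substitute one available on your account:

```python
import os

# The Prediction Guard LLM wrapper ships with the LangChain community package.
from langchain_community.llms import PredictionGuard

# API keys are read from the environment (variable names assumed here).
os.environ["PREDICTIONGUARD_API_KEY"] = "<your Prediction Guard API key>"
os.environ["SERPAPI_API_KEY"] = "<your SerpAPI key>"

# Create the LLM client; swap in any model your Prediction Guard account supports.
llm = PredictionGuard(model="Hermes-2-Pro-Llama-3-8B")
```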
Agents
To set up an agent that will search the internet on the fly and use the LLM to generate a response:
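One way to wire this up is with LangChain's classic agent helpers, as in the sketch below (it reuses the `llm` client from the previous snippet, and the question is just an illustrative example):

```python
from langchain.agents import AgentType, initialize_agent, load_tools

# Give the agent a web search tool backed by SerpAPI.
tools = load_tools(["serpapi"], llm=llm)

# Build a ReAct-style agent that decides when to search and when to answer.
agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
)

# Ask a question that needs up-to-date information from the web.
agent.run("Which team won the most recent FIFA World Cup?")
```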
This will verbosely log the agent's activities until it reaches a final answer and generates the response:
Using the SDKs
You can also try these examples using the other official SDKs: