Ollama Model node
The Ollama Model node allows you to use local Llama 2 models.
On this page, you'll find the node parameters for the Ollama Model node, and links to more resources.
This node lacks tools support, so it won't work with the AI Agent node. Instead, connect it to the Basic LLM Chain node.
Credentials: You can find authentication information for this node here.
Node parameters
- Model: Select the model that generates the completion. Choose from:
- Llama2
- Llama2 13B
- Llama2 70B
- Llama2 Uncensored
Refer to the Ollama Models Library documentation (opens in a new tab) for more information about available models.
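The node talks to a locally running Ollama server, so any model you select must already be pulled on that machine. As a quick sanity check, you can list the models your local server knows about. The following is a minimal sketch, not part of the node itself: it assumes Ollama is running on its default port (11434) and uses Python's requests library.

```python
import requests

# List the models installed on the local Ollama server.
# Assumes Ollama is running at its default address (http://localhost:11434);
# the /api/tags endpoint returns the locally available models.
response = requests.get("http://localhost:11434/api/tags")
response.raise_for_status()

for model in response.json().get("models", []):
    print(model["name"])  # e.g. "llama2:latest", "llama2:13b"
```

If the model you want to use in the node doesn't appear in the list, pull it with the Ollama CLI first.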
Node options
- Sampling Temperature: Use this option to control the randomness of the sampling process. A higher temperature creates more diverse sampling, but increases the risk of hallucinations.
- Top K: Enter the number of token choices the model uses to generate the next token.
- Top P: Use this option to set the cumulative probability threshold for sampling. The model samples from the smallest set of tokens whose combined probability reaches this value, so a lower value ignores less probable options.
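These options correspond to the temperature, top_k, and top_p fields in Ollama's generate API. The sketch below shows that mapping directly against the API; the model name and option values are illustrative assumptions, not node defaults, and it again assumes a local server on the default port.

```python
import requests

# Illustrative mapping of the node options onto Ollama's /api/generate request.
payload = {
    "model": "llama2",  # example model; must already be pulled locally
    "prompt": "Explain sampling temperature in one sentence.",
    "stream": False,    # return a single JSON response instead of a stream
    "options": {
        "temperature": 0.7,  # Sampling Temperature: higher = more diverse output
        "top_k": 40,         # Top K: consider only the 40 most likely next tokens
        "top_p": 0.9,        # Top P: sample from tokens covering 90% cumulative probability
    },
}
response = requests.post("http://localhost:11434/api/generate", json=payload)
response.raise_for_status()
print(response.json()["response"])
```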
Related resources
Refer to LangChain's Ollama documentation (opens in a new tab) for more information about the service.
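To experiment with the same model outside n8n, LangChain's community integration exposes Ollama as an LLM class. This is a minimal sketch, assuming the langchain-community package is installed and Ollama is serving locally; the exact import path can differ between LangChain versions.

```python
from langchain_community.llms import Ollama

# Point LangChain at the local Ollama server (default: http://localhost:11434).
# "llama2" is an example model name; it must already be pulled locally.
llm = Ollama(model="llama2", temperature=0.7)

print(llm.invoke("Why is the sky blue?"))
```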
Common issues
For common questions or issues and suggested solutions, refer to Common issues.