AWS Bedrock Chat Model node
The AWS Bedrock Chat Model node allows you to use LLM models through the AWS Bedrock platform.
On this page, you'll find the node parameters for the AWS Bedrock Chat Model node, and links to more resources.
Credentials: You can find authentication information for this node here.
Node parameters
- Model: Select the model that generates the completion.
Learn more about available models in the Amazon Bedrock model documentation (opens in a new tab).
Node options
- Maximum Number of Tokens: Enter the maximum number of tokens to use, which sets the maximum completion length.
- Sampling Temperature: Use this option to control the randomness of the sampling process. A higher temperature creates more diverse sampling, but increases the risk of hallucinations.
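Under the hood, these options correspond to the inference parameters that AWS Bedrock's Converse API accepts. The sketch below illustrates how the node options map onto that request shape; the model ID and values are placeholders, and actually calling Bedrock requires AWS credentials configured for boto3.

```python
def build_inference_config(max_tokens: int, temperature: float) -> dict:
    """Translate the node options into Bedrock's inferenceConfig shape.

    maxTokens caps the completion length; temperature controls sampling
    randomness (higher values produce more diverse output).
    """
    return {"maxTokens": max_tokens, "temperature": temperature}


config = build_inference_config(512, 0.7)

# With credentials configured, the request would look roughly like this:
# import boto3
# client = boto3.client("bedrock-runtime")
# response = client.converse(
#     modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
#     messages=[{"role": "user", "content": [{"text": "Hello"}]}],
#     inferenceConfig=config,
# )
```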
Proxy limitations
This node doesn't support the NO_PROXY environment variable.
Related resources
Refer to LangChain's AWS Bedrock Chat Model documentation (opens in a new tab) for more information about the service.