| Reading time: | 10 min |
| Last updated: | 22 Apr 2026 |
This guide shows you how to install and use the tool with the most common configuration. For advanced options and complete reference information, see the official documentation. Some install guides also include optional next steps to help you explore related workflows or integrations.
Bedrust is a command-line program that you can use to invoke models on Amazon Bedrock. Amazon Bedrock is a managed service on Amazon Web Services (AWS) that allows developers to build and scale generative AI applications using foundation models (FMs) from leading AI model providers.
Bedrust is available as Rust source code, and you can build and run it on an Arm Linux computer.
You’ll need an AWS account to access Bedrock. To learn how to create an AWS account, see Create an AWS account.
To use Bedrust, you need to connect to Bedrock. Install the AWS CLI, generate an access key ID and secret access key, and then use the aws configure command to enter your credentials.
For more information about configuring credentials, see AWS Credentials.
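If you want to check what aws configure wrote, it stores your credentials and default region in two files in your home directory. A minimal sketch of both files, using a placeholder key pair and an example region (your values will differ):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIA-EXAMPLE-KEY-ID
aws_secret_access_key = example-secret-access-key

# ~/.aws/config
[default]
region = us-east-1
```

Bedrust reads credentials through the standard AWS SDK chain, so any profile that works with the AWS CLI should work here as well.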
To use Bedrock models, you need to request access to specific foundation models through the Amazon Bedrock console.
In your AWS account, navigate to Model access in the Bedrock console and select the models you want to use.
For more information, see Access Amazon Bedrock foundation models .
One way to install Bedrust is by using Cargo, the Rust package manager.
Ensure you have Rust and Cargo installed on your computer. If not, install them using the following commands:
sudo apt install curl gcc -y
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
source "$HOME/.cargo/env"
For more information, see the Rust install guide .
Get the Bedrust source code:
git clone https://github.com/darko-mesaros/bedrust.git
cd bedrust
With Rust and Cargo installed, you can install Bedrust:
cargo install bedrust
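By default, cargo install places the compiled binary in $HOME/.cargo/bin. If your shell can't find bedrust after installation, a quick sketch for adding that directory to your PATH (the path shown is Cargo's default and may differ if you customized your Rust installation):

```shell
# Add Cargo's binary directory to the current shell's PATH
export PATH="$HOME/.cargo/bin:$PATH"

# Verify the shell can now resolve the binary
command -v bedrust || echo "bedrust not found; check the install step"
```

To make the change permanent, add the export line to your shell profile (for example, ~/.bashrc).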
After installation, confirm that Bedrust is installed and available in your search path by checking the version:
bedrust --version
The output is similar to:
bedrust 0.8.8
You can set the default foundation model you want to use:
bedrust --init
Use the menu to select the default model:
📜 | Initializing Bedrust configuration.
? Select a default model to use press <enter> to skip ›
meta.llama2-70b-chat-v1
meta.llama3-1-405b-instruct-v1:0
meta.llama3-1-70b-instruct-v1:0
meta.llama3-1-8b-instruct-v1:0
cohere.command-text-v14
anthropic.claude-v2
anthropic.claude-v2:1
anthropic.claude-3-opus-20240229-v1:0
anthropic.claude-3-sonnet-20240229-v1:0
anthropic.claude-3-haiku-20240307-v1:0
anthropic.claude-3-5-sonnet-20240620-v1:0
anthropic.claude-3-5-sonnet-20241022-v2:0
us.anthropic.claude-3-7-sonnet-20250219-v1:0
anthropic.claude-3-5-haiku-20241022-v1:0
ai21.j2-ultra-v1
us.deepseek.r1-v1:0
amazon.titan-text-express-v1
mistral.mixtral-8x7b-instruct-v0:1
mistral.mistral-7b-instruct-v0:2
mistral.mistral-large-2402-v1:0
mistral.mistral-large-2407-v1:0
us.amazon.nova-micro-v1:0
us.amazon.nova-lite-v1:0
us.amazon.nova-pro-v1:0
Run bedrust to invoke the CLI with the default model.
bedrust
At the prompt, you can start asking questions, such as "how do I install the AWS CLI?", to see how it works.
bedrust
██████╗ ███████╗██████╗ ██████╗ ██╗ ██╗███████╗████████╗
██╔══██╗██╔════╝██╔══██╗██╔══██╗██║ ██║██╔════╝╚══██╔══╝
██████╔╝█████╗ ██║ ██║██████╔╝██║ ██║███████╗ ██║
██╔══██╗██╔══╝ ██║ ██║██╔══██╗██║ ██║╚════██║ ██║
██████╔╝███████╗██████╔╝██║ ██║╚██████╔╝███████║ ██║
╚═════╝ ╚══════╝╚═════╝ ╚═╝ ╚═╝ ╚═════╝ ╚══════╝ ╚═╝
----------------------------------------
Currently supported chat commands:
/c - Clear current chat history
/s - (BETA) Save chat history
/r - (BETA) Recall and load a chat history
/h - (BETA) Export history as HTML(saves in current dir)
/q - Quit
----------------------------------------
----------------------------------------
🤖 | What would you like to know today?
😎 | Human:
You can use -m to change the model:
bedrust -m nova-micro
Your queries are now sent to the Amazon Nova Micro model.
Use --help to see all options, including the available model values:
bedrust --help
The output is similar to:
A command-line tool to invoke and work with Large Language models on AWS, using Amazon Bedrock
Usage: bedrust [OPTIONS]
Options:
--init
-m, --model-id <MODEL_ID> [possible values: llama270b, llama31405b-instruct, llama3170b-instruct, llama318b-instruct, cohere-command, claude-v2, claude-v21, claude-v3-opus, claude-v3-sonnet, claude-v3-haiku, claude-v35-sonnet, claude-v352-sonnet, claude-v37-sonnet, claude-v35-haiku, jurrasic2-ultra, deep-seek-r1, titan-text-express-v1, mixtral8x7b-instruct, mistral7b-instruct, mistral-large, mistral-large2, nova-micro, nova-lite, nova-pro]
-c, --caption <CAPTION>
-s, --source <SOURCE>
-x
-h, --help Print help
-V, --version Print version
The output lists the model strings you can pass to -m. Make sure you have enabled access to those models in the Bedrock console.
You are now ready to use Bedrust as a quick way to explore many Bedrock models and compare them.