This guide walks through configuring Jupyter AI (the JupyterLab AI chat assistant) to route queries through Harvard's HUIT Bedrock proxy on Windows.
After the workshop, see Setting Up Anthropic API Access for Harvard Users for instructions on obtaining your own key.
Open Windows Terminal (PowerShell) from the Start menu and create a directory for this work:
mkdir "$env:USERPROFILE\Desktop\GAI\exercises"
cd "$env:USERPROFILE\Desktop\GAI\exercises"
You will run all subsequent commands from this directory.
Ensure Python 3.10 or later is installed:
python --version
# Expected: Python 3.13.x (or 3.10+)
If Python is not installed or the version is too old, download it from python.org.
Jupyter AI v3 is beta software that can pull in dependency versions incompatible with your existing Python packages. A virtual environment keeps everything isolated so nothing else on your system is affected.
python -m venv jupyter-ai-env
Activate it:
.\jupyter-ai-env\Scripts\Activate.ps1
Your terminal prompt should now show (jupyter-ai-env) at the beginning,
confirming the virtual environment is active.
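If you'd like a second confirmation from inside Python itself, this small sketch (standard library only) reports whether the interpreter is running inside a virtual environment:

```python
# Sketch: confirm Python is running inside a virtual environment.
import sys

def in_virtualenv() -> bool:
    # Inside a venv, sys.prefix points at the environment directory,
    # while sys.base_prefix still points at the base interpreter.
    return sys.prefix != sys.base_prefix

if __name__ == "__main__":
    print("venv active" if in_virtualenv() else "venv NOT active")
```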
If activation fails with an error that running scripts is disabled on this system, run:
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
and type Y to confirm, then try activating again:
cd "$env:USERPROFILE\Desktop\GAI\exercises"
.\jupyter-ai-env\Scripts\Activate.ps1
With the virtual environment active, install JupyterLab:
pip install notebook jupyterlab
Verify the installation and launch JupyterLab to confirm it works:
jupyter lab
Close it (Ctrl+C in the terminal) once you've confirmed it launches, then continue to the next step.
The HUIT proxy speaks the AWS Bedrock Converse API format. Jupyter AI v3 uses LiteLLM as its backend, which has native support for Bedrock endpoints with Bearer token authentication. This is why v3 (beta) is required instead of the stable v2 release.
pip install "jupyter-ai==3.0.0b9" boto3
Verify:
pip show jupyter-ai
# Expected: Version: 3.0.0b9
Set the following environment variables in your terminal.
Replace your-huit-api-key-here with the API key from the link at the top of this page.
$env:ANTHROPIC_API_KEY = "your-huit-api-key-here"
$env:ANTHROPIC_BEDROCK_BASE_URL = "https://apis.huit.harvard.edu/ais-bedrock-llm/v2"
$env:AWS_BEARER_TOKEN_BEDROCK = "your-huit-api-key-here"
Verify they are set:
echo $env:AWS_BEARER_TOKEN_BEDROCK
echo $env:ANTHROPIC_API_KEY
# Both should print your API key
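As an optional extra check, this Python sketch confirms all three variables are present and that the two key variables hold the same value (variable names as set above):

```python
# Sketch: verify the three HUIT-related environment variables.
import os

REQUIRED = (
    "ANTHROPIC_API_KEY",
    "ANTHROPIC_BEDROCK_BASE_URL",
    "AWS_BEARER_TOKEN_BEDROCK",
)

def check_env(env) -> list:
    """Return a list of problems; an empty list means the setup looks right."""
    problems = [f"missing {name}" for name in REQUIRED if not env.get(name)]
    if not problems and env["ANTHROPIC_API_KEY"] != env["AWS_BEARER_TOKEN_BEDROCK"]:
        problems.append("ANTHROPIC_API_KEY and AWS_BEARER_TOKEN_BEDROCK differ")
    return problems

if __name__ == "__main__":
    print(check_env(os.environ) or "environment looks good")
```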
Why AWS_BEARER_TOKEN_BEDROCK? LiteLLM's Bedrock handler checks this variable to decide whether to use Bearer token authentication (your HUIT API key) or standard AWS SigV4 signing (which requires IAM credentials you don't have). Without it, you'll get an "Unable to locate credentials" error.
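The credential-selection order described above can be pictured with a small sketch (illustrative only; this is not LiteLLM's actual code):

```python
# Illustrative sketch of the credential-selection logic described above.
# NOT LiteLLM's real implementation, just the decision order it implies.
def pick_auth(env: dict) -> str:
    if env.get("AWS_BEARER_TOKEN_BEDROCK"):
        return "bearer"  # HUIT API key path
    if env.get("AWS_ACCESS_KEY_ID") and env.get("AWS_SECRET_ACCESS_KEY"):
        return "sigv4"   # standard IAM credentials
    raise RuntimeError("Unable to locate credentials")
```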
With the environment variables set and the virtual environment active, launch JupyterLab:
jupyter lab
If you opened a new terminal, you need to set the environment variables and activate the venv again first:
cd "$env:USERPROFILE\Desktop\GAI\exercises"
.\jupyter-ai-env\Scripts\Activate.ps1
$env:ANTHROPIC_API_KEY = "your-huit-api-key-here"
$env:ANTHROPIC_BEDROCK_BASE_URL = "https://apis.huit.harvard.edu/ais-bedrock-llm/v2"
$env:AWS_BEARER_TOKEN_BEDROCK = "your-huit-api-key-here"
jupyter lab
In the JupyterLab interface, open the Jupyternaut (AI assistant) settings panel; in v3, look for the settings (gear) icon in the chat panel.
In the Chat model ID field, enter:
bedrock/converse/us.anthropic.claude-opus-4-5-20251101-v1:0
The bedrock/converse/ prefix tells LiteLLM to use the Bedrock Converse API handler, which correctly constructs the endpoint URL and supports Bearer token auth. Do not use bedrock/converse_like/, which has URL construction issues with custom endpoints.
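A one-line sanity check for the prefix rule can be sketched as follows (a hypothetical helper for illustration, not part of Jupyter AI or LiteLLM):

```python
# Hypothetical helper: check that a model ID will route through the
# Bedrock Converse handler (required for Bearer token auth here).
def uses_converse_handler(model_id: str) -> bool:
    return model_id.startswith("bedrock/converse/")
```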
In the Model parameters section, add:
| Parameter Name | Type | Value |
|---|---|---|
| api_base | string | https://apis.huit.harvard.edu/ais-bedrock-llm/v2 |
In the Secrets and API keys section, confirm that
AWS_BEARER_TOKEN_BEDROCK appears as a static (read-only) secret
imported from your environment. If it doesn't appear, add it using the "Add secret" button.
Leave the Embedding model blank. It is only needed for the /learn
document indexing feature and is not required for chat.
Click Update chat model, then send a test message in the Jupyternaut chat:
Hello, can you tell me what model you are?
Change the model ID in Jupyternaut settings to use different models:
| Model | Chat Model ID |
|---|---|
| Claude Opus 4.5 | bedrock/converse/us.anthropic.claude-opus-4-5-20251101-v1:0 |
| Claude Opus 4.6 | bedrock/converse/us.anthropic.claude-opus-4-6-v1:0 |
| Claude Sonnet 4 | bedrock/converse/us.anthropic.claude-sonnet-4-20250514-v1:0 |
| Claude Haiku 3.5 | bedrock/converse/us.anthropic.claude-3-5-haiku-20241022-v1:0 |
If you see "Unable to locate credentials," LiteLLM is trying AWS SigV4 auth instead of Bearer token auth:
- Ensure AWS_BEARER_TOKEN_BEDROCK is set in the same terminal session where you launched JupyterLab.
- Verify that AWS_BEARER_TOKEN_BEDROCK is set to the correct HUIT API key (the same value as ANTHROPIC_API_KEY).
If PowerShell refuses to run the activation script, run this once to allow scripts for your user account:
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
Type Y and press Enter to confirm.
Check the terminal where you launched jupyter lab for error tracebacks. Common causes:
- A typo in the model ID (it must start with bedrock/converse/)
- A missing or incorrect api_base model parameter in Jupyternaut settings

To avoid re-entering environment variables every time you open a terminal, you can add them to your PowerShell profile:
# Create the profile file if it doesn't exist
if (!(Test-Path -Path $PROFILE)) { New-Item -Type File -Path $PROFILE -Force }
# Open it in Notepad
notepad $PROFILE
Add these lines to the profile file (with your actual API key), save, and close:
$env:ANTHROPIC_API_KEY = "your-huit-api-key-here"
$env:ANTHROPIC_BEDROCK_BASE_URL = "https://apis.huit.harvard.edu/ais-bedrock-llm/v2"
$env:AWS_BEARER_TOKEN_BEDROCK = "your-huit-api-key-here"
The variables will be set automatically every time you open a terminal.