AWS Bedrock with LangChain

Published: December 11, 2025 at 09:44 AM EST
2 min read
Source: Dev.to

Prerequisites / Project Structure

  • requirements.txt
  • Dockerfile
  • AWS config local setup: ~/.aws/config and ~/.aws/credentials
  • AWS Bedrock access
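
The two AWS files referenced above typically look like the sketch below (profile name, region, and key values are placeholders; use your own):

```ini
# ~/.aws/config
[default]
region = ap-south-1
output = json

# ~/.aws/credentials
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```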

Code Base

Main Python file (main.py)

import streamlit as st
import boto3
from botocore.exceptions import ClientError
from langchain_aws import ChatBedrock
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

st.set_page_config(page_title="AWS Bedrock Docker", layout="wide")

st.title("🐳 AWS Bedrock + Docker + LangChain + Streamlit")
st.caption("Connected to AWS Region: `ap-south-1` via local AWS Config")
# ------------------------------------------------------------------
try:
    boto_session = boto3.Session()
    if not boto_session.get_credentials():
        st.error("❌ No AWS Credentials found. Did you mount ~/.aws in docker-compose?")
        st.stop()

    st.sidebar.success(f"AWS Profile Loaded: {boto_session.profile_name or 'default'}")

except Exception as e:
    st.error(f"AWS Config Error: {e}")
    st.stop()

model_id = st.sidebar.selectbox(
    "Select Model",
    [
        "anthropic.claude-3-sonnet-20240229-v1:0",
        "anthropic.claude-v2:1",
        "amazon.titan-text-express-v1",
    ],
)

llm = ChatBedrock(
    model_id=model_id,
    region_name="ap-south-1",
    model_kwargs={"temperature": 0.5, "max_tokens": 512},
)

# ----------------------------------------------
user_input = st.text_area(
    "Enter your prompt:",
    "Explain how Docker containers work in 3 sentences.",
)

if st.button("Generate Response"):
    if not user_input:
        st.warning("Please enter a prompt.")
    else:
        try:
            with st.spinner("Calling AWS Bedrock API..."):
                prompt = ChatPromptTemplate.from_messages(
                    [
                        ("system", "You are a helpful AI assistant."),
                        ("user", "{input}"),
                    ]
                )
                output_parser = StrOutputParser()

                chain = prompt | llm | output_parser

                response = chain.invoke({"input": user_input})

                st.subheader("AI Response:")
                st.write(response)

        except ClientError as e:
            st.error(f"AWS API Error: {e}")
            if "AccessDenied" in str(e):
                st.warning(
                    "👉 Hint: Did you enable this specific Model ID in the AWS Console > Bedrock > Model Access?"
                )
        except Exception as e:
            st.error(f"An unexpected error occurred: {e}")
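
The AccessDenied hint above matters because Bedrock model access is granted per provider and per model in the console. As a quick illustration of the model-ID naming scheme (this helper is hypothetical, not part of the app), the provider is the segment before the first dot:

```python
def bedrock_provider(model_id: str) -> str:
    """Return the provider prefix of a Bedrock model ID.

    Bedrock model IDs follow "<provider>.<model>[:<version>]",
    e.g. "anthropic.claude-v2:1" -> "anthropic".
    """
    return model_id.split(".", 1)[0]

# The three models offered in the sidebar span two providers,
# so both may need to be enabled under Bedrock > Model Access:
for mid in [
    "anthropic.claude-3-sonnet-20240229-v1:0",
    "anthropic.claude-v2:1",
    "amazon.titan-text-express-v1",
]:
    print(mid, "->", bedrock_provider(mid))
```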

Dockerfile

FROM python:3.11-slim
LABEL authors="naush"

WORKDIR /chatgpt-bedrock-langchain-demo
COPY requirements.txt .
RUN pip install --upgrade pip \
    && pip install torch --index-url https://download.pytorch.org/whl/cpu \
    && pip install -r requirements.txt
COPY main.py .
# Streamlit serves on 8501 by default; bind 0.0.0.0 so the host can reach it
EXPOSE 8501
CMD ["streamlit", "run", "main.py", "--server.port=8501", "--server.address=0.0.0.0"]

Requirements File (requirements.txt)

streamlit
boto3
langchain-aws
langchain-community
langchain-core

Setup Instructions

  1. Clone the repository (or create a new one) and navigate into it.

    git clone 
    cd repository_name
  2. Install the required packages

    pip install -r requirements.txt
    # or
    pip install streamlit boto3 langchain-aws langchain-community langchain-core
  3. Configure your AWS credentials
    Ensure ~/.aws/config and ~/.aws/credentials are correctly set up for the profile you intend to use.

  4. Run the Streamlit app

    streamlit run main.py

    Access the app at http://localhost:8501.

Docker Setup

To run the application inside Docker:

  1. Build the Docker image

    docker build -t bedrock-langchain-streamlit .
  2. Run the container (mount your AWS credentials if needed)

    docker run -p 8501:8501 -v ~/.aws:/root/.aws bedrock-langchain-streamlit
  3. Open http://localhost:8501 in your browser, enter a prompt, and interact with the model you selected in the sidebar, powered by AWS Bedrock.
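
The in-app error message ("Did you mount ~/.aws in docker-compose?") assumes a Compose setup. A minimal docker-compose.yml that mounts the credentials read-only might look like this (the service name and AWS_PROFILE value are illustrative assumptions):

```yaml
services:
  bedrock-app:
    build: .
    ports:
      - "8501:8501"
    volumes:
      - ~/.aws:/root/.aws:ro
    environment:
      - AWS_PROFILE=default  # assumption: the profile configured locally
```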
