AWS Bedrock with LangChain

Published: December 11, 2025, 22:44 GMT+8
3 min read

Source: Dev.to

Prerequisites / Project Structure

  • requirements.txt
  • Dockerfile
  • Local AWS configuration: ~/.aws/config and ~/.aws/credentials (example below)
  • AWS Bedrock access
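
For reference, with the default profile and the ap-south-1 region used in this article, the two local AWS files look roughly like the sketch below (placeholder values only; running aws configure can generate them for you):

# ~/.aws/config
[default]
region = ap-south-1
output = json

# ~/.aws/credentials
[default]
aws_access_key_id = <YOUR_ACCESS_KEY_ID>
aws_secret_access_key = <YOUR_SECRET_ACCESS_KEY>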

Codebase

Main Python file (main.py)

import streamlit as st
import boto3
from botocore.exceptions import ClientError
from langchain_aws import ChatBedrock
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

st.set_page_config(page_title="AWS Bedrock Docker", layout="wide")

st.title("🐳 AWS Bedrock + Docker + LangChain + Streamlit")
st.caption("Connected to AWS Region: `ap-south-1` via local AWS Config")
# ------------------------------------------------------------------
try:
    boto_session = boto3.Session()
    if not boto_session.get_credentials():
        st.error("❌ No AWS Credentials found. Did you mount ~/.aws in docker-compose?")
        st.stop()

    st.sidebar.success(f"AWS Profile Loaded: {boto_session.profile_name or 'default'}")

except Exception as e:
    st.error(f"AWS Config Error: {e}")
    st.stop()

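# Sidebar: choose which Bedrock model ID to call (it must be enabled under Bedrock > Model Access)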
model_id = st.sidebar.selectbox(
    "Select Model",
    [
        "anthropic.claude-3-sonnet-20240229-v1:0",
        "anthropic.claude-v2:1",
        "amazon.titan-text-express-v1",
    ],
)

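# LangChain chat model backed by Bedrock in the ap-south-1 region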
llm = ChatBedrock(
    model_id=model_id,
    region_name="ap-south-1",
    model_kwargs={"temperature": 0.5, "max_tokens": 512},
)

# ----------------------------------------------
user_input = st.text_area(
    "Enter your prompt:",
    "Explain how Docker containers work in 3 sentences.",
)

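# On click, build a prompt | model | output-parser chain and run it on the user's input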
if st.button("Generate Response"):
    if not user_input:
        st.warning("Please enter a prompt.")
    else:
        try:
            with st.spinner("Calling AWS Bedrock API..."):
                prompt = ChatPromptTemplate.from_messages(
                    [
                        ("system", "You are a helpful AI assistant."),
                        ("user", "{input}"),
                    ]
                )
                output_parser = StrOutputParser()

                chain = prompt | llm | output_parser

                response = chain.invoke({"input": user_input})

                st.subheader("AI Response:")
                st.write(response)

        except ClientError as e:
            st.error(f"AWS API Error: {e}")
            if "AccessDenied" in str(e):
                st.warning(
                    "👉 Hint: Did you enable this specific Model ID in the AWS Console > Bedrock > Model Access?"
                )
        except Exception as e:
            st.error(f"An unexpected error occurred: {e}")

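If the app fails with access errors, a small standalone script can help confirm that the local credentials resolve and that Bedrock is reachable in the region before Streamlit or Docker enter the picture. This is an optional sketch (the file name check_access.py is just a suggestion); note that listing foundation models only proves the API is reachable — invocation access for a specific model is still granted separately under Bedrock > Model Access in the console.

# check_access.py -- optional pre-flight check
import boto3

session = boto3.Session()

# Confirm that the credentials resolve to a real identity
identity = session.client("sts").get_caller_identity()
print("Caller ARN:", identity["Arn"])

# List the foundation models visible in ap-south-1
bedrock = session.client("bedrock", region_name="ap-south-1")
models = bedrock.list_foundation_models()["modelSummaries"]
target = "anthropic.claude-3-sonnet-20240229-v1:0"
print("Target model listed:", any(m["modelId"] == target for m in models))
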
Dockerfile

FROM python:3.11-slim
LABEL authors="naush"

WORKDIR /chatgpt-bedrock-langchain-demo
COPY requirements.txt .
RUN pip install --upgrade pip \
    && pip install torch --index-url https://download.pytorch.org/whl/cpu \
    && pip install -r requirements.txt
COPY main.py .
# Streamlit serves on its default port 8501 (the CMD below does not override it)
EXPOSE 8501
CMD ["streamlit", "run", "main.py"]

Dependencies (requirements.txt)

streamlit
boto3
langchain-aws
langchain-community
langchain-core

Setup Steps

  1. Clone the repository (or create a new one) and change into it.

    git clone 
    cd repository_name
  2. Install the required dependencies

    pip install -r requirements.txt
    # or
    pip install streamlit boto3 langchain-aws langchain-community langchain-core
  3. Configure AWS credentials
    Make sure ~/.aws/config and ~/.aws/credentials are set up for the profile you want to use.

  4. Run the Streamlit app

    streamlit run main.py

    Open http://localhost:8501 in your browser

Docker Setup

To run the app in Docker:

  1. Build the Docker image

    docker build -t bedrock-langchain-streamlit .
  2. Start the container, mounting your AWS credentials if needed (a docker-compose equivalent is sketched after this list)

    docker run -p 8501:8501 -v ~/.aws:/root/.aws bedrock-langchain-streamlit
  3. Open http://localhost:8501 in your browser, enter a prompt, and interact with the selected model powered by AWS Bedrock.
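
The error message in main.py asks whether ~/.aws was mounted in docker-compose. For readers who prefer Compose over the raw docker run command in step 2, a minimal equivalent could look like this (the file name docker-compose.yml and the service name bedrock-app are placeholders):

# docker-compose.yml (sketch, equivalent to the docker run command above)
services:
  bedrock-app:
    build: .
    ports:
      - "8501:8501"
    volumes:
      - ~/.aws:/root/.aws:ro   # read-only mount of the local AWS credentials

Start it with docker compose up --build and open the same URL.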
