AWS Bedrock and LangChain

Published: December 11, 2025, 11:44 PM GMT+9
3 min read
Source: Dev.to

Prerequisites / Project Structure

  • requirements.txt
  • Dockerfile
  • Local AWS configuration: ~/.aws/config and ~/.aws/credentials
  • AWS Bedrock access permissions
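
For reference, the two local AWS files might look like this (the profile name, region, and key values below are placeholders, not values from the article):

```ini
# ~/.aws/config
[default]
region = ap-south-1
output = json

# ~/.aws/credentials
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```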

Code Base

Main Python file (main.py)

import streamlit as st
import boto3
from botocore.exceptions import ClientError
from langchain_aws import ChatBedrock
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

st.set_page_config(page_title="AWS Bedrock Docker", layout="wide")

st.title("🐳 AWS Bedrock + Docker + LangChain + Streamlit")
st.caption("Connected to AWS Region: `ap-south-1` via local AWS Config")
# ------------------------------------------------------------------
try:
    boto_session = boto3.Session()
    if not boto_session.get_credentials():
        st.error("❌ No AWS Credentials found. Did you mount ~/.aws in docker-compose?")
        st.stop()

    st.sidebar.success(f"AWS Profile Loaded: {boto_session.profile_name or 'default'}")

except Exception as e:
    st.error(f"AWS Config Error: {e}")
    st.stop()

model_id = st.sidebar.selectbox(
    "Select Model",
    [
        "anthropic.claude-3-sonnet-20240229-v1:0",
        "anthropic.claude-v2:1",
        "amazon.titan-text-express-v1",
    ],
)

llm = ChatBedrock(
    model_id=model_id,
    region_name="ap-south-1",
    model_kwargs={"temperature": 0.5, "max_tokens": 512},
)

# ----------------------------------------------
user_input = st.text_area(
    "Enter your prompt:",
    "Explain how Docker containers work in 3 sentences.",
)

if st.button("Generate Response"):
    if not user_input:
        st.warning("Please enter a prompt.")
    else:
        try:
            with st.spinner("Calling AWS Bedrock API..."):
                prompt = ChatPromptTemplate.from_messages(
                    [
                        ("system", "You are a helpful AI assistant."),
                        ("user", "{input}"),
                    ]
                )
                output_parser = StrOutputParser()

                chain = prompt | llm | output_parser

                response = chain.invoke({"input": user_input})

                st.subheader("AI Response:")
                st.write(response)

        except ClientError as e:
            st.error(f"AWS API Error: {e}")
            if "AccessDenied" in str(e):
                st.warning(
                    "👉 Hint: Did you enable this specific Model ID in the AWS Console > Bedrock > Model Access?"
                )
        except Exception as e:
            st.error(f"An unexpected error occurred: {e}")
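
The `prompt | llm | output_parser` line above builds an LCEL chain. As a rough intuition for what the pipe operator does, here is a minimal stdlib-only sketch with stand-in stages (the `Stage` class and the canned model reply are illustrative, not LangChain's real implementation):

```python
# Illustrative stand-ins for LCEL composition: each stage wraps a callable,
# and `|` chains them so data flows left-to-right through the stages.
class Stage:
    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # Compose: run self first, then feed its output to the next stage.
        return Stage(lambda x: other.fn(self.fn(x)))

    def invoke(self, x):
        return self.fn(x)

# prompt: fill the template; llm: return a canned reply; parser: extract text
prompt = Stage(lambda d: f"You are a helpful AI assistant.\nUser: {d['input']}")
llm = Stage(lambda p: {"content": f"[model reply to: {p.splitlines()[-1]}]"})
parser = Stage(lambda msg: msg["content"])

chain = prompt | llm | parser
print(chain.invoke({"input": "hello"}))
```

In the real app, `ChatPromptTemplate`, `ChatBedrock`, and `StrOutputParser` play these three roles, and `chain.invoke({"input": user_input})` drives the same left-to-right flow.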

Dockerfile

FROM python:3.11-slim
LABEL authors="naush"

WORKDIR /chatgpt-bedrock-langchain-demo
COPY requirements.txt .
RUN pip install --upgrade pip \
    && pip install torch --index-url https://download.pytorch.org/whl/cpu \
    && pip install -r requirements.txt
COPY main.py .
EXPOSE 8501
CMD ["streamlit", "run", "main.py", "--server.port=8501", "--server.address=0.0.0.0"]

Requirements File (requirements.txt)

streamlit
boto3
langchain-aws
langchain-community
langchain-core

Setup Instructions

  1. Clone the repository (or create a new one) and move into its directory.

    git clone 
    cd repository_name
  2. Install the required packages

    pip install -r requirements.txt
    # or
    pip install streamlit boto3 langchain-aws langchain-community langchain-core
  3. Configure your AWS credentials
    Make sure ~/.aws/config and ~/.aws/credentials are set up correctly for the profile you intend to use.

  4. Run the Streamlit app

    streamlit run main.py

    Access the app at http://localhost:8501.

Docker Setup

To run the application inside Docker:

  1. Build the Docker image

    docker build -t bedrock-langchain-streamlit .
  2. Run the container (mounting your AWS credentials if needed)

    docker run -p 8501:8501 -v ~/.aws:/root/.aws bedrock-langchain-streamlit
  3. Open http://localhost:8501 in your browser, enter a prompt, and interact with the Titan model served by AWS Bedrock.
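
The credential check in main.py hints at docker-compose ("Did you mount ~/.aws in docker-compose?"). An equivalent docker-compose.yml might look like this (the service name and environment variable are assumptions; the article itself only shows the plain docker run command):

```yaml
# docker-compose.yml (illustrative)
services:
  app:
    build: .
    ports:
      - "8501:8501"
    volumes:
      - ~/.aws:/root/.aws:ro   # mount local AWS credentials read-only
    environment:
      - AWS_PROFILE=default    # profile to use inside the container
```

With this file in place, `docker compose up --build` replaces the manual build and run steps above.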
