
## Overview

LangGraph Platform is LangChain’s fully managed hosting service for LangGraph applications. Deploy your agent to production with a single command.
**Zero Infrastructure**: No servers, no containers, no configuration. Just deploy and go.

### Why LangGraph Platform?

- **Serverless**: No infrastructure management. Auto-scaling from zero to thousands of requests.
- **Integrated Observability**: Built-in LangSmith tracing. Every request is automatically traced.
- **One-Command Deploy**: `langgraph deploy` - that's it. Deployment in seconds.
- **Versioning**: Automatic versioning and instant rollbacks.
- **Secrets Management**: Secure secrets via LangSmith. No `.env` files in production.
- **Global CDN**: Edge deployment for low latency worldwide.

### Quick Start

#### 1. Install CLI

```bash
uv tool install langgraph-cli
```

#### 2. Log in to LangChain

```bash
langgraph login
```

This will prompt for your LangSmith API key (get it from [smith.langchain.com/settings](https://smith.langchain.com/settings)).

#### 3. Deploy

```bash
langgraph deploy
```

That's it! Your agent is now live on LangGraph Platform.

Deployment completes in under 2 minutes!

### Configuration

#### langgraph.json

The `langgraph.json` file configures your deployment:
```json langgraph.json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./langgraph_platform/agent.py:graph"
  },
  "env": {
    "ANTHROPIC_API_KEY": "",
    "OPENAI_API_KEY": "",
    "GOOGLE_API_KEY": "",
    "LANGSMITH_API_KEY": "",
    "LANGSMITH_TRACING": "true",
    "LANGSMITH_PROJECT": "mcp-server-langgraph",
    "JWT_SECRET_KEY": "",
    "OPENFGA_API_URL": "http://localhost:8080",
    "OPENFGA_STORE_ID": "",
    "OPENFGA_MODEL_ID": ""
  },
  "python_version": "3.12"
}
```
| Field | Type | Description |
|-------|------|-------------|
| `dependencies` | array | List of dependency sources. `["."]` means the current directory. |
| `graphs` | object | Map of graph names to module paths. Format: `"name": "path/to/file.py:variable"` |
| `env` | object | Environment variables. Use LangSmith secrets for sensitive values. |
| `python_version` | string | Python version: `3.10`, `3.11`, or `3.12` |
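
Each entry under `graphs` must resolve to a compiled graph object in a Python module. As a rough sketch only (the real `langgraph_platform/agent.py` in this project is more involved), a module exporting a `graph` variable might look like this:

```python
# Hypothetical minimal module for "agent": "./langgraph_platform/agent.py:graph".
from langgraph.graph import StateGraph, MessagesState, START, END
from langchain_anthropic import ChatAnthropic  # reads ANTHROPIC_API_KEY from the environment

llm = ChatAnthropic(model="claude-3-5-sonnet-latest")

def call_model(state: MessagesState) -> dict:
    """Single LLM step: append the model's reply to the running message list."""
    return {"messages": [llm.invoke(state["messages"])]}

builder = StateGraph(MessagesState)
builder.add_node("call_model", call_model)
builder.add_edge(START, "call_model")
builder.add_edge("call_model", END)

# LangGraph Platform imports this compiled graph under the name "agent".
graph = builder.compile()
```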

#### Setting Secrets

Store API keys securely in LangSmith:
```bash
# Set LLM API keys
langsmith secret set ANTHROPIC_API_KEY "your-key"
langsmith secret set OPENAI_API_KEY "your-key"

# Set authentication secret
langsmith secret set JWT_SECRET_KEY "your-secret"

# View secrets
langsmith secret list
```
Never hardcode API keys in code or commit them to git. Always use LangSmith secrets.
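
Secrets set this way are surfaced to the running deployment as environment variables matching the names declared in `langgraph.json`, so agent code reads them with `os.environ`. A small illustrative sketch of a fail-fast startup check (the variable list here is an example, not the project's actual requirement):

```python
import os

# Example subset of the variables declared in langgraph.json; adjust to your config.
REQUIRED = ["ANTHROPIC_API_KEY", "JWT_SECRET_KEY", "OPENFGA_STORE_ID"]

missing = [name for name in REQUIRED if not os.environ.get(name)]
if missing:
    # Failing at import time surfaces misconfiguration in the deployment logs
    # instead of as confusing errors on the first request.
    raise RuntimeError(f"Missing required environment variables: {', '.join(missing)}")
```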

### Deployment Commands

#### Deploy

```bash
# Deploy with automatic name
langgraph deploy

# Deploy with specific name
langgraph deploy my-agent-prod

# Deploy with environment tag
langgraph deploy my-agent-prod --tag production
```

#### Test Locally First

```bash
# Start local dev server
langgraph dev

# Test in another terminal
langgraph deployment invoke --local \
  --input '{"messages": [{"role": "user", "content": "test"}]}'
```

#### Invoke Deployed Graph

```bash
# Basic invocation
langgraph deployment invoke my-agent-prod \
  --input '{"messages": [{"role": "user", "content": "Hello!"}]}'

# With configuration
langgraph deployment invoke my-agent-prod \
  --input '{"messages": [{"role": "user", "content": "Analyze"}]}' \
  --config '{"configurable": {"user_id": "alice"}}'

# Stream responses
langgraph deployment invoke my-agent-prod \
  --input '{"messages": [{"role": "user", "content": "Story"}]}' \
  --stream
```
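
The same `langgraph_sdk` client works against the hosted deployment. The URL below is a placeholder (use your deployment's endpoint, for example as reported by `langgraph deployment get`), and the client authenticates with your LangSmith API key:

```python
import os
from langgraph_sdk import get_sync_client

client = get_sync_client(
    url="https://<your-deployment-url>",   # placeholder: your deployment endpoint
    api_key=os.environ["LANGSMITH_API_KEY"],
)

# Stream state snapshots as the graph runs.
for chunk in client.runs.stream(
    None,       # stateless run
    "agent",    # graph name from langgraph.json
    input={"messages": [{"role": "user", "content": "Hello!"}]},
    stream_mode="values",
):
    print(chunk.event, chunk.data)
```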

#### Manage Deployments

```bash
# List all deployments
langgraph deployment list

# Get deployment details
langgraph deployment get my-agent-prod

# View logs
langgraph deployment logs my-agent-prod --follow

# Rollback to previous version
langgraph deployment rollback my-agent-prod --revision 4

# Delete deployment
langgraph deployment delete my-agent-prod
```
### Environment Management

#### Multiple Environments

Create separate deployments for each environment:

```bash
# Staging
langgraph deploy my-agent-staging --tag staging

# Production
langgraph deploy my-agent-prod --tag production
```

#### Environment Variables

Set environment-specific variables:

```bash
# Staging uses different LangSmith project
LANGSMITH_PROJECT=my-agent-staging langgraph deploy my-agent-staging

# Production
LANGSMITH_PROJECT=my-agent-prod langgraph deploy my-agent-prod
```

### Monitoring

#### LangSmith Integration

All deployments automatically integrate with LangSmith:

<Steps>
  <Step title="View Traces">
    Go to [smith.langchain.com](https://smith.langchain.com/) and select your project
  </Step>
  <Step title="Monitor Performance">
    See request latency, token usage, and error rates in real-time
  </Step>
  <Step title="Debug Issues">
    Click on any trace to see full LLM interactions and intermediate steps
  </Step>
</Steps>

#### Metrics Available

- **Request Count**: Total invocations
- **Latency (P50, P95, P99)**: Response time percentiles
- **Success Rate**: Percentage of successful requests
- **Token Usage**: Tokens consumed per request
- **Cost**: Estimated costs by LLM provider
- **Error Rate**: Failed requests over time
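
These metrics can also be pulled programmatically. A hedged sketch with the `langsmith` Python client that lists recent root runs for the project configured in `langgraph.json` and derives a rough request count, error rate, and average latency:

```python
from datetime import datetime, timedelta, timezone
from langsmith import Client

client = Client()  # uses LANGSMITH_API_KEY from the environment

# Root runs from the last 24 hours in the project from langgraph.json.
runs = list(client.list_runs(
    project_name="mcp-server-langgraph",
    is_root=True,
    start_time=datetime.now(timezone.utc) - timedelta(days=1),
))

errors = [r for r in runs if r.error]
latencies = [
    (r.end_time - r.start_time).total_seconds()
    for r in runs
    if r.start_time and r.end_time
]

print(f"requests: {len(runs)}")
if runs:
    print(f"error rate: {len(errors) / len(runs):.1%}")
if latencies:
    print(f"avg latency: {sum(latencies) / len(latencies):.2f}s")
```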

#### Set Up Alerts

Configure alerts in LangSmith:

1. Go to **Project Settings > Alerts**
2. Create alert rules:
   - High error rate (>5%)
   - High latency (P95 >5s)
   - Budget exceeded

### CI/CD Integration

#### GitHub Actions

```yaml .github/workflows/deploy.yml
name: Deploy to LangGraph Platform

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install LangGraph CLI
        run: |
          curl -LsSf https://astral.sh/uv/install.sh | sh
          uv tool install langgraph-cli

      - name: Deploy
        env:
          LANGCHAIN_API_KEY: ${{ secrets.LANGCHAIN_API_KEY }}
        run: langgraph deploy my-agent-prod --tag production
```

See the complete workflow in `.github/workflows/deploy-langgraph-platform.yml`.

#### GitLab CI

```yaml .gitlab-ci.yml
deploy:
  stage: deploy
  image: python:3.12
  script:
    - curl -LsSf https://astral.sh/uv/install.sh | sh
    - uv tool install langgraph-cli
    - langgraph deploy my-agent-prod --tag production
  only:
    - main
```

### Best Practices

**Test locally first.** Always test with `langgraph dev` before deploying:

```bash
# Start local server
langgraph dev

# Test thoroughly
# Then deploy
langgraph deploy
```

**Use staging before production.** Deploy to staging first:

```bash
# Deploy to staging
langgraph deploy my-agent-staging

# Test staging thoroughly
langgraph deployment invoke my-agent-staging --input '...'

# If good, deploy to production
langgraph deploy my-agent-prod
```

**Tag deployments with versions:**

```bash
langgraph deploy my-agent-prod --tag v1.2.0

# Or use git commit
langgraph deploy my-agent-prod --tag "$(git rev-parse --short HEAD)"
```

**Watch logs after deployment** and check LangSmith for errors and latency spikes:

```bash
langgraph deployment logs my-agent-prod --follow
```

**Never put API keys in `langgraph.json`:**

```bash
# Good: Use LangSmith secrets
langsmith secret set ANTHROPIC_API_KEY "..."

# Bad: In langgraph.json
# "env": {"ANTHROPIC_API_KEY": "sk-ant-..."}  ❌
```

### Troubleshooting

**Symptom**: `Error: Could not find graph 'agent'`

**Solution**:
- Verify `langgraph.json` has the correct graph path
- Ensure the file exists at `./langgraph/agent.py`
- Check that the variable name is `graph` (not `agent_graph`)

```json
{
  "graphs": {
    "agent": "./langgraph/agent.py:graph"  // ✅ Correct
  }
}
```

**Symptom**: `401 Unauthorized`

**Solution**:

```bash
# Re-login
langgraph logout
langgraph login

# Verify
langgraph whoami
```

**Symptom**: `ModuleNotFoundError` in logs

**Solution**:
- Add the missing package to `langgraph/requirements.txt`
- Redeploy: `langgraph deploy`

**Symptom**: Agent fails with a missing API key

**Solution**:

```bash
# Set secret
langsmith secret set ANTHROPIC_API_KEY "your-key"

# Verify
langsmith secret list

# Redeploy
langgraph deploy
```

### Comparison with Other Platforms

| Feature | LangGraph Platform | Cloud Run | Kubernetes |
|---------|--------------------|-----------|------------|
| Setup Time | 2 minutes | 15 minutes | 1+ hours |
| Infrastructure | ✅ None | ⚠️ Minimal | ❌ Complex |
| Scaling | ✅ Automatic | ✅ Automatic | ⚠️ Manual config |
| LangSmith Integration | ✅ Built-in | ⚠️ Manual | ⚠️ Manual |
| Versioning | ✅ Built-in | ⚠️ Manual | ⚠️ Manual |
| Cost | Pay-per-use | Pay-per-use | Fixed + usage |
| Best For | Quick production deployments | GCP-native apps | Enterprise, self-hosted |

### Pricing

LangGraph Platform uses pay-per-use pricing:
- **Free Tier**: 1M requests/month
- **Compute**: Charged per execution time
- **No Minimum**: Pay only for what you use

**Cost-Effective**: For most applications, LangGraph Platform is more cost-effective than running dedicated servers.

### Next Steps


Ready to deploy? Run `langgraph login` and `langgraph deploy` to get your agent live in minutes!