Beam is a fast, open-source runtime for serverless AI workloads. It gives you a Pythonic interface to deploy and scale AI applications with zero infrastructure overhead.
## ✨ Features
- Fast Image Builds: Launch containers in under a second using a custom container runtime
- Parallelization and Concurrency: Fan out workloads to hundreds of containers (see the sketch after this list)
- First-Class Developer Experience: Hot-reloading, webhooks, and scheduled jobs
- Scale-to-Zero: Workloads are serverless by default
- Volume Storage: Mount distributed storage volumes
- GPU Support: Run on our cloud (4090s, H100s, and more) or bring your own GPUs
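The fan-out model is easiest to see in code. Here is a minimal sketch, assuming the `function` decorator and its `.map()` helper from the `beam` SDK (neither appears elsewhere in this README):

```python
from beam import Image, function


@function(image=Image(python_version="python3.11"), cpu=1)
def square(x: int) -> int:
    # Each invocation runs in its own remote container
    return x * x


if __name__ == "__main__":
    # .map() fans the inputs out across containers in parallel
    # and yields results as they complete
    for result in square.map(range(100)):
        print(result)
```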
## 📦 Installation
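Install the Python SDK and CLI from PyPI (the package name below assumes Beam's published `beam-client` distribution):

```shell
pip install beam-client
```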
## ⚡️ Quickstart
- Create an account here
- Follow our Getting Started Guide
### Create a sandbox
Spin up isolated containers to run LLM-generated code:
```python
from beam import Image, Sandbox

sandbox = Sandbox(image=Image()).create()
response = sandbox.process.run_code("print('I am running remotely')")
print(response.result)
```
### Deploy a serverless inference endpoint
Create an autoscaling endpoint for your custom model:
```python
from beam import Image, QueueDepthAutoscaler, endpoint


@endpoint(
    image=Image(python_version="python3.11"),
    gpu="A10G",
    cpu=2,
    memory="16Gi",
    autoscaler=QueueDepthAutoscaler(max_containers=5, tasks_per_container=30),
)
def handler():
    return {"label": "cat", "confidence": 0.97}
```
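To put this live behind a versioned URL, deploy it with the Beam CLI. A minimal sketch, assuming the code above is saved as `app.py` (the command form mirrors the deploy comment in the task-queue example below; the deployment name is illustrative):

```shell
beam deploy app.py:handler --name inference-endpoint
```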
### Run background tasks
Schedule resilient background tasks (or replace your Celery queue) by adding a simple decorator:
```python
from beam import Image, TaskPolicy, schema, task_queue


class Input(schema.Schema):
    image_url = schema.String()


@task_queue(
    name="image-processor",
    image=Image(python_version="python3.11"),
    cpu=1,
    memory=1024,
    inputs=Input,
    task_policy=TaskPolicy(max_retries=3),
)
def my_background_task(input: Input, *, context):
    image_url = input.image_url
    print(f"Processing image: {image_url}")
    return {"image_url": image_url}


if __name__ == "__main__":
    # Invoke a background task from your app (without deploying it)
    my_background_task.put(image_url="https://example.com/image.jpg")

    # You can also deploy this behind a versioned endpoint with:
    # beam deploy app.py:my_background_task --name image-processor
```
## Self-Hosting vs Cloud
Beta9 is the open-source engine powering Beam, our fully-managed cloud platform. You can self-host Beta9 for free or choose managed cloud hosting through Beam.
## 👋 Contributing
We welcome contributions, big or small. These are the most helpful things for us:
- Submit a feature request or bug report
- Open a PR with a new feature or improvement
