Run workloads on the cloud, your laptop, or bare metal, all with the same simple interface
Job Submit API
Daestro provides a simple-to-use Job Submit API so you can submit jobs to Daestro from within your own application. Read more
curl --request POST \
  --url https://api.daestro.com/v1/client/job/submit \
  --header 'authorization: Bearer your_api_key_here' \
  --header 'content-type: application/json' \
  --data '{
    "job_definition": "my-project::hello-world:1",
    "job_queue": "my-project::first-job-queue",
    "container_cmd": ["python", "main.py"]
  }'
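The same call can be made from application code. Here is a minimal Python sketch equivalent to the curl command above, using only the standard library; the endpoint, field names, and example values are taken from the command, while the API key is a placeholder you would replace with your own:

```python
import json
import urllib.request

API_KEY = "your_api_key_here"  # replace with your real Daestro API key

# Same payload as the curl example above.
payload = {
    "job_definition": "my-project::hello-world:1",
    "job_queue": "my-project::first-job-queue",
    "container_cmd": ["python", "main.py"],
}

request = urllib.request.Request(
    "https://api.daestro.com/v1/client/job/submit",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "authorization": f"Bearer {API_KEY}",
        "content-type": "application/json",
    },
    method="POST",
)

# Uncomment to actually submit the job:
# with urllib.request.urlopen(request) as response:
#     print(response.read().decode())
```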
Self-hosted Compute
Bring your own compute to run jobs on. Daestro ships its agent as a Docker image, so you can run it on any platform that supports Docker. Read more
docker run --name daestro-agent \
  -e DAESTRO_AUTH_TOKEN="<token>" \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v daestro_agent_data:/var/lib/daestro-agent \
  --network host \
  daestro/daestro-agent:latest
Multi Cloud
Run your batch jobs on any cloud provider and on your own account.
Self-hosted Compute
Daestro lets you run jobs on your own servers, whether that's your laptop or enterprise hardware.
Monitoring & Logging
Monitor your jobs in real time with logs and metrics.
Cost Optimization
Optimize costs by running your jobs on whichever cloud provider is most cost-effective for your needs.
Control and Limits
Set fine-grained controls on concurrency, priority, and resource usage for your jobs.
Job Queues
Use job queues to make sure high priority jobs get processed first.
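Priority ordering of this kind can be illustrated in a few lines of Python; this is a generic sketch of the concept using a heap, not Daestro's implementation, and the job names are made up:

```python
import heapq

# Lower number = higher priority; heapq always pops the smallest tuple first.
queue = []
heapq.heappush(queue, (2, "nightly-report"))
heapq.heappush(queue, (1, "urgent-backfill"))
heapq.heappush(queue, (3, "cleanup"))

order = [heapq.heappop(queue)[1] for _ in range(len(queue))]
# order == ["urgent-backfill", "nightly-report", "cleanup"]
```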
Cron
Run recurring jobs on a specified schedule.
Schedule Jobs
Schedule jobs to run at a later time or date.
Resource Control
Set custom CPU and memory quotas per job definition.
How to deploy a workload on Daestro
Deploying a workload on Daestro is a simple process that can be completed in just a few steps.
Cloud Auth
Cloud Auths are the credentials for accessing cloud providers.
Compute Environment
Compute Environments define the compute type and location for jobs.
Job Queue
Job Queues define how and when jobs are executed. They can be used to manage the concurrency and priority of jobs.
Job Definition
Job Definitions are the blueprint for creating jobs. They define the job's behavior and parameters.
Jobs
A runnable instance of a Job Definition.
Steps
Cloud Auth: Add your cloud provider credentials (api keys / auth tokens) to Daestro.
Compute Environment: Create a compute environment to run your workload.
Job Queue: Create a job queue and attach a compute environment to run jobs on.
Job Definition: Create a job definition, which describes how a workload should be executed using a Docker image.
Run Jobs: You can now run your workload by submitting a job to the job queue with the job definition, using either the Job Submit form or the API.
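Once the console-side setup (steps 1 through 4) is done, step 5 can be driven from code. Below is a hypothetical Python helper for the final step, assuming the Job Submit API endpoint shown earlier on this page; `submit_job` and its parameters are illustrative names, not part of a Daestro SDK:

```python
import json
import urllib.request

def submit_job(api_key, job_definition, job_queue, container_cmd):
    """Build a POST request for Daestro's Job Submit API (step 5)."""
    payload = {
        "job_definition": job_definition,  # created in step 4
        "job_queue": job_queue,            # created in step 3
        "container_cmd": container_cmd,
    }
    return urllib.request.Request(
        "https://api.daestro.com/v1/client/job/submit",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "authorization": f"Bearer {api_key}",
            "content-type": "application/json",
        },
        method="POST",
    )

req = submit_job(
    "your_api_key_here",
    "my-project::hello-world:1",
    "my-project::first-job-queue",
    ["python", "main.py"],
)
# Send with: urllib.request.urlopen(req)
```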
Comparing With Other Solutions
Other Solutions
- Vendor lock-in
- Setup varies for each cloud provider
- No on-prem option
- Locked into their infrastructure
- Limited by the services they offer
Compute Workloads with Daestro
- Run your jobs on any cloud provider
- Easy and intuitive setup process
- Run it on your own infrastructure
- Run jobs on most suitable infrastructure for your job type
- Scale up and down as needed
Frequently asked questions
Check out all the FAQs here
Daestro is a versatile cloud orchestration platform that enables you to run your computing tasks across various cloud services like AWS, DigitalOcean, Linode and Vultr, as well as on your own self-hosted servers. It acts as a central control panel to manage all your compute workloads regardless of where they are physically located.
Currently, Daestro supports AWS, DigitalOcean, Linode, and Vultr. Support for more cloud providers is coming soon.
Daestro offers both free and paid plans. You can upgrade to a paid plan to get access to more features. See pricing
You can contact us via email at hello [at] daestro.com, or join our Discord server from our website.