Burla is the simplest way to run Python on lots of computers in the cloud.
Burla only has one function:

```python
from burla import remote_parallel_map

my_inputs = [1, 2, 3]

def my_function(my_input):
    print("I'm running on my own separate computer in the cloud!")
    return my_input

return_values = remote_parallel_map(my_function, my_inputs)
```
Running code in the cloud feels the same as running code locally:
- Anything you print appears in your local terminal.
- Exceptions thrown in your code are thrown on your local machine.
- Responses are quick: you can call a million simple functions in a couple of seconds.
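For example, here's a minimal sketch of both behaviors (the function `might_fail` and its inputs are hypothetical):

```python
from burla import remote_parallel_map

def might_fail(x):
    print(f"processing {x}")  # appears in your local terminal
    if x == 2:
        raise ValueError("bad input!")  # raised remotely, thrown on your local machine
    return x * 2

try:
    return_values = remote_parallel_map(might_fail, [1, 2, 3])
except ValueError as e:
    print(f"caught locally: {e}")
```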
Zero config files, just simple arguments like `func_cpu` & `func_ram`.
```python
from burla import remote_parallel_map
from xgboost import XGBClassifier

# training_inputs, training_targets, and parameter_grid are defined elsewhere
def train_model(hyper_parameters):
    model = XGBClassifier(n_jobs=64, **hyper_parameters)
    model.fit(training_inputs, training_targets)

# every call to train_model gets its own 64 CPUs and 256G of RAM
remote_parallel_map(train_model, parameter_grid, func_cpu=64, func_ram=256)
```
Queue up 10 million function calls, and run them across thousands of containers.
Our custom distributed task queue is extremely fast, keeping hardware utilization high.
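Here's a minimal sketch of what that looks like (`square` stands in for any simple function):

```python
from burla import remote_parallel_map

def square(x):
    return x * x

# queue up 10 million function calls with one line of code
results = remote_parallel_map(square, list(range(10_000_000)))
```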
Nest `remote_parallel_map` calls to build simple, massively parallel pipelines.
Use `background=True` to fire and forget code, then monitor progress from the dashboard.
```python
from burla import remote_parallel_map

def process_record(record):
    # Pretend this does some math per-record!
    result = record
    return result

def process_file(file):
    results = remote_parallel_map(process_record, split_into_records(file))
    upload_results(results)

def process_files(files):
    remote_parallel_map(process_file, files, func_ram=16)

# `files`, `split_into_records`, and `upload_results` are defined elsewhere
remote_parallel_map(process_files, [files], background=True)
```
Public or private, just paste a link to your container image and hit start.
Scale to 10,000 CPUs, terabytes of RAM, or 1,000 H100s; everything stays in your cloud.
(Burla is currently Google Cloud only!)
```bash
pip install burla
burla install
```
See the Getting Started guide for more info.
Questions?
Schedule a call or email jake@burla.dev. We're always happy to talk.