Setting up Google Cloud Storage

William Silversmith edited this page Sep 26, 2018 · 29 revisions

Google Cloud Storage (GCS) is a popular object storage backend that is supported by CloudVolume. Follow these step-by-step directions to get started using CloudVolume and Neuroglancer with GCS.

Setting up a CloudVolume Bucket

In order to use CloudVolume with GCS, you'll need to create a bucket, a namespace within Google's system that holds your files. Then you'll make a service account with read/write permissions on the bucket so that you can access it programmatically with CloudVolume. You can also access the bucket from your command line using the gsutil tool and through the web interface.

1. Creating Your Account

If you don't already have a GCS account, you'll need to set one up and link a credit or debit card to it. Be advised that as of this writing (Sept. 2018), the default storage cost is $0.026/GiB/mo, or in connectomics terms, roughly $320 per TiB per year. There are cheaper options, Nearline and Coldline, but they come with strings attached. Transferring data out of GCS (termed "egress") to your local system or another provider costs between $0.08 and $0.12 per GB, so you may wish to consider the vendor lock-in implications. AWS S3 has a similar cost structure, though its actual prices may vary significantly.

Once you've decided to move forward, follow the steps here.

2. Create a GCS Bucket
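You can create a bucket from the web console, or from the command line once gsutil is installed. A minimal sketch, assuming a hypothetical bucket name `my-connectomics-bucket` and the `us-east1` region (bucket names are globally unique, so substitute your own):

```shell
# Create a regional bucket; replace the name and region with your own.
gsutil mb -l us-east1 gs://my-connectomics-bucket/
```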

3. Create Service Account Keys
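Service accounts and their JSON keys can be created in the console under IAM & Admin, or with the `gcloud` CLI. A sketch, assuming a hypothetical service account name `cloudvolume-access` and project id `my-project`:

```shell
# Create the service account.
gcloud iam service-accounts create cloudvolume-access \
    --display-name "CloudVolume read/write access"

# Download a JSON key for it; this file is the secret CloudVolume will use.
gcloud iam service-accounts keys create google-secret.json \
    --iam-account cloudvolume-access@my-project.iam.gserviceaccount.com
```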

4. Grant Service Account Permissions
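Permissions can be granted per bucket from the console, or with `gsutil iam ch`. A sketch granting read/write access on the hypothetical bucket to the hypothetical service account from the previous step:

```shell
# Grant the service account read/write access to objects in the bucket.
gsutil iam ch \
    serviceAccount:cloudvolume-access@my-project.iam.gserviceaccount.com:roles/storage.objectAdmin \
    gs://my-connectomics-bucket
```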

5. Configure CloudVolume with Secrets
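CloudVolume reads GCS credentials from `~/.cloudvolume/secrets/google-secret.json` (per the CloudVolume README), so copy the JSON key you downloaded to that path. A small sketch that sanity-checks the file looks like a service account key before you try to use it; the helper function name is ours, not part of CloudVolume:

```python
import json
import os

# Path where CloudVolume expects GCS credentials.
SECRET_PATH = os.path.expanduser("~/.cloudvolume/secrets/google-secret.json")

def looks_like_service_account_key(key):
    """Return True if the parsed JSON resembles a GCS service account key."""
    required = {"type", "project_id", "private_key", "client_email"}
    return key.get("type") == "service_account" and required <= set(key.keys())

if os.path.exists(SECRET_PATH):
    with open(SECRET_PATH) as f:
        assert looks_like_service_account_key(json.load(f)), "Not a service account key?"
```

With the secret in place, opening a layer such as `CloudVolume('gs://my-connectomics-bucket/dataset/layer')` should authenticate automatically.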

Configuring the Bucket for Neuroglancer

1. Download gsutil
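gsutil ships with the Google Cloud SDK; a standalone version is also distributed via pip. A sketch of the pip route:

```shell
# Install the standalone gsutil tool and confirm it runs.
pip install gsutil
gsutil version
```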

2. Set Bucket to Public Read
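Neuroglancer fetches chunks directly from the bucket in the browser, so the objects must be publicly readable. A sketch using IAM, for the hypothetical bucket above:

```shell
# Allow anonymous read access to objects in the bucket.
gsutil iam ch allUsers:objectViewer gs://my-connectomics-bucket
```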

3. Set CORS Headers
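Browsers will block Neuroglancer's cross-origin requests unless the bucket sends CORS headers. A sketch of a permissive CORS policy applied with `gsutil cors set`; in production you may want to tighten `origin` to the host serving Neuroglancer:

```shell
# Write a CORS policy and apply it to the bucket.
cat > cors.json <<'EOF'
[
  {
    "origin": ["*"],
    "method": ["GET", "HEAD"],
    "responseHeader": ["Content-Type", "Range"],
    "maxAgeSeconds": 3600
  }
]
EOF
gsutil cors set cors.json gs://my-connectomics-bucket
```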