Backup NextCloud to Amazon S3 or BackBlaze B2

When you host your own NextCloud instance, you are responsible for backing up your data, ideally following the 3-2-1 rule. The 3-2-1 rule states that you should always keep three copies of your data: two local copies on separate devices and one copy off-site. Many users running a RAID array or a network or redundant file system such as NFS or ZFS mistakenly believe they don’t need a backup. That couldn’t be further from the truth, because RAID is not backup. While RAID can usually save you when one or two drives in your array fail, it cannot prevent data loss from other calamities, such as a failed RAID controller or a natural disaster.

One of the easiest ways to back up NextCloud to an off-site location is with an object storage service, such as Amazon S3 or BackBlaze B2. Both services provide a command line interface with an rsync-like sync command that can incrementally back up your data to cloud storage.


Amazon S3 vs BackBlaze B2 for Backup Storage

Some of the benefits of using an object storage service for offsite backup include:

  • Reliability: Amazon S3 is engineered for 99.999999999% (eleven nines) durability, and BackBlaze B2 for 99.999999% (eight nines). Both services’ infrastructure is designed with multiple redundancies, making it highly unlikely that you will lose your backups.
  • Data Life Cycle Management: Both Amazon S3 and BackBlaze B2 let you specify after how many days files should be purged from the system, depending on your retention requirements and how much storage cost you are willing to incur. S3 also offers multiple storage classes, from Standard and Standard-Infrequent Access (IA) to Glacier, allowing you to manage your costs by archiving older data to less expensive storage.
  • Retrieve Your Data by Physical Disk: If you have a large volume of data that would take days or months to download over your Internet connection, BackBlaze (Data by Mail) or Amazon (AWS Import/Export Disk) will ship you a hard drive with your data for a fee. Should you ever be in a disaster recovery scenario where you need to rebuild your NextCloud users’ data directories, this service could be a lifesaver.
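As an illustration of life cycle management on the S3 side, here is a sketch of a lifecycle configuration that archives objects to Glacier after 30 days and purges them after a year. It would be applied with `aws s3api put-bucket-lifecycle-configuration`; the rule ID and the day counts are assumptions you should adapt to your own retention policy.

```json
{
  "Rules": [
    {
      "ID": "expire-old-backups",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 30, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}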

So why would you choose Amazon S3 versus BackBlaze B2? Amazon S3 is a much more mature service available across Amazon’s regions worldwide, whereas B2 storage is solely located in the BackBlaze datacenters in Sacramento, CA. Here are some of the other differences:

                            Amazon S3                          BackBlaze B2
Regions                     Worldwide                          Sacramento, CA
Durability                  99.999999999% (11 nines)           99.999999% (8 nines)
Single File Size Limit      Unlimited                          10 TB
Encryption                  Server-Side and Client-Side        Client-Side
Command Line Interface      AWS CLI or s3cmd                   B2 Command Line Tool
Physical Disk Retrieval     $80.00 + $2.49/data-loading hour   $99 up to 128 GB (Flash Drive);
                            + return shipping (outside US)     $189 up to 4 TB (Hard Drive)
Upload (Ingress) Pricing    Free                               Free
Download (Egress) Pricing   $0.05 – $0.09/GB                   $0.02/GB
API Calls                   $0.005/1,000 requests              $0.004/1,000 requests
Storage Pricing             $0.021 – $0.023/GB/month           $0.005/GB/month

For backup, most of the costs you incur will be for storage. You might not need to download the data for months, or even years, so choosing the storage service with the most favorable monthly storage pricing is vital.

As you can see, BackBlaze B2 is about 76% less costly for storing the same data month over month. You can mitigate some of the Amazon S3 storage costs by switching to a lower storage tier, such as Amazon Glacier, but data retrieval may then be delayed by 3 to 12 hours, and you must pay a retrieval fee in addition to the egress and per-request rates.

The main upside of Amazon S3 is its geographically distributed regions, which result in lower latency to your NextCloud server, especially if you are located outside of the US. Backup times may be substantially quicker, especially if your users add a large volume of data regularly. Don’t forget that Amazon S3 also supports server-side encryption, which means data can be automatically encrypted at rest and transparently decrypted when you download it. With BackBlaze B2, you can employ client-side encryption to protect your data from prying eyes.

Regardless of which option you choose, it’s imperative to have a backup solution for your NextCloud instance. We have prepared two shell scripts for backing up NextCloud that you can download and use with Amazon S3 or BackBlaze B2, respectively. Either script can be scheduled as a cron job for daily or weekly backups.

Amazon S3 Backup Script for NextCloud

This script creates an incremental backup of your NextCloud instance to Amazon S3. Amazon S3 is a highly durable object storage service with versioning and lifecycle management features.

Prerequisites


  • NextCloud 11 or 12 running on Ubuntu 16.04+
  • Amazon AWS Account and IAM User with AmazonS3FullAccess privilege
  • Python 2.x and Python PIP – sudo apt-get install python && wget && sudo python
  • s3cmd installed from PyPI – sudo pip install s3cmd
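Putting those pieces together, the flow such a script automates can be sketched as a dry run. The bucket name, database name, and data directory below are assumptions you must replace with your own values; the sync command and cache exclusion mirror the s3cmd invocation discussed in the comments below.

```shell
#!/bin/sh
# Dry-run sketch of an incremental NextCloud-to-S3 backup.
# run() only prints each command; remove the echo to execute for real.
run() { echo "+ $*"; }

s3_bucket="my-nextcloud-backup"          # assumption: your S3 bucket name
data_dir="/media/external/CloudDATA/"    # assumption: your NextCloud data directory

# 1. Dump the NextCloud database (MySQL/MariaDB assumed).
run "mysqldump --single-transaction nextcloud > nextcloud.sql"

# 2. Upload the dump, then incrementally sync the data directory,
#    excluding the cache that was moved out of the user directories.
run s3cmd put nextcloud.sql "s3://$s3_bucket/NextCloudDB/nextcloud.sql"
run s3cmd sync --recursive --preserve --exclude '*/cache/*' "$data_dir" "s3://$s3_bucket/"
```

The dry-run wrapper lets you inspect exactly what would be uploaded before pointing the script at a live bucket.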

BackBlaze B2 Backup Script for NextCloud

This script creates an incremental backup of your NextCloud instance at BackBlaze’s off-site location.

BackBlaze B2 is an object storage service that is much less expensive than using Amazon S3 for the same purpose, with similar versioning and lifecycle management features.

Uploads are free, and storage costs only $0.005/GB/month compared to S3’s $0.022/GB/month.
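To see what that price difference means in practice, here is a quick back-of-the-envelope calculation for a hypothetical 500 GB backup at the rates quoted above:

```shell
# Monthly storage cost for a hypothetical 500 GB backup,
# at $0.022/GB (S3 Standard) versus $0.005/GB (B2).
gb=500
awk -v gb="$gb" 'BEGIN { printf "S3: $%.2f/month  B2: $%.2f/month\n", gb * 0.022, gb * 0.005 }'
# → S3: $11.00/month  B2: $2.50/month
```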

Prerequisites
  • NextCloud 11 or 12 running on Ubuntu 16.04+
  • BackBlaze B2 account (10 GB Free) – Create one at
  • Python 3.x and Python PIP – sudo apt-get install python3 && wget && sudo python3
  • BackBlaze B2 CLI installed from PyPI – sudo pip install b2

Setup Instructions
1. Insert the following line in your NextCloud config.php file, above the closing );, to move the file cache out of each user’s data directory. If /media/external/CloudDATA is not your data directory, substitute your data directory path before /cache.

'cache_path' => '/media/external/CloudDATA/cache',
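In context, the finished file would look something like this. Only the cache_path line is new; the surrounding entries are placeholders for whatever your config.php already contains, and the datadirectory value assumes the path from the step above.

```php
<?php
$CONFIG = array (
  // ... your existing settings ...
  'datadirectory' => '/media/external/CloudDATA',
  'cache_path' => '/media/external/CloudDATA/cache',
);
```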

2. Create a bucket and obtain your Account ID and Application Key from your B2 account.

3. Authenticate your CLI using the b2 authorize_account command.
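Steps 2 and 3 can be sketched as a dry run with the B2 CLI. The bucket name and the credential placeholders are assumptions; substitute the Account ID and Application Key from your B2 account.

```shell
#!/bin/sh
# Dry-run sketch of B2 bucket creation and CLI authentication.
# run() only prints each command; remove the echo to execute for real.
run() { echo "+ $*"; }

run b2 create_bucket my-nextcloud-backup allPrivate   # step 2: a private bucket
run b2 authorize_account ACCOUNT_ID APPLICATION_KEY   # step 3: log the CLI in
```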

4. Save this script to a safe directory such as /srv/ and make it executable with the following command.

sudo chmod +x

5. This script must be run as root. To run a backup now:

sudo ./

6. Set up a cron job to run this backup on a predefined schedule (optional).

sudo crontab -u root -e

Add the following line to the crontab to conduct a weekly backup every Saturday at 2:00am.

0 2 * * sat sh /srv/ > /srv/backupToB2.log 2>&1

Save, quit and check that the crontab has been installed using the following command.

sudo crontab -u root -l



  • Hi,
    great Tutorial – thank you. I just want to let you know that regarding a backup to S3 I had issues with s3cmd – it did not sync any files and folders that include umlauts [ä,ö,ü,ß] or spaces.
    So after changing from s3cmd sync to the AWS CLI’s native s3 sync, the issues disappeared.

    So instead of:
    s3cmd put nextcloud.sql s3://$s3_bucket/NextCloudDB/nextcloud.sql
    I used:
    /usr/bin/aws s3api put-object --bucket $s3_bucket --key NextCloudDB/nextcloud.sql --body nextcloud.sql

    For syncing the data folder instead of:
    s3cmd sync --recursive --preserve --exclude '*/cache/*' $data_dir s3://$s3_bucket/
    I used:
    /usr/bin/aws s3 sync --storage-class STANDARD_IA --exclude '*/cache/*' $data_dir s3://$s3_bucket/

    All the best!

  • Drew Morrison
    May 11, 2018 7:16 pm

    I had to make a couple small changes to your script for it to run for me (backing up to B2):

    *The “b2” command wasn’t part of the PATH when the script was called from Cron, so I had to change the two instances there to instead call /usr/local/bin/b2

    *That was particularly hard to track down, because the crontab isn’t outputting stderr anywhere. I had to add “2>&1” to the end of the crontab line for that to capture.

    Thanks for the rest of the info!

  • Hi,
    very nice and useful tutorial. Thanks to the author and to Michael Grey for the changes.
    With this script and the S3 versioning enabled, the versions will grow forever.
    Do you have any suggestion (I’m not an S3 expert at all!!!) on how to limit these versions within the script?


    You can upload changes to all files and keep previous versions for a set number of days.

    For example, to make the destination match the source but retain previous versions for 30 days, call "b2 sync --keepDays 30 --replaceNewer". You’ll still upload new file versions, but the older versions will now be set to be removed in 30 days.

  • Hi, is there a way to encrypt the data before uploading it?
    Great tutorial btw! Thank you very much!

  • Duplicity will do that for you
