Rclone is a command line tool that lets you synchronize files and directories to and from anywhere. It aims to help you save and copy your data to and from any cloud storage provider or local filesystem in a simple, fast and scalable way. In its latest version, 1.36, Rclone supports the SFTP protocol and can be used with C14 to manage the files and directories you want to archive.
Thanks to @njcw, who has worked on Rclone's SFTP support over the last few weeks, you can now add remote files from AWS S3, Backblaze B2, Dropbox, Google Cloud Storage and many more to your C14 archives with ease.
This blog post shows you how easy it is to use Rclone with C14 to add and manage remote files in your C14 archives.
To use Rclone with C14, you need to have Rclone installed. You can get it from the official Rclone website.
Create a new C14 archive
Create a new C14 archive from the Online control panel. Select a storage type and make sure to select SSH as the file transfer protocol. Click the submit button to create the archive.
Back on the Archive list page, your freshly created archive appears. Click the "see details" link to get your archive information.
You land on the archive page, which displays the archive URI and credentials.
As a showcase, we'll sync an AWS S3 bucket to C14 using the Rclone CLI. The first step is to create two new remote profiles in Rclone so that we can sync files from AWS S3 to C14.
Run the rclone config command in your terminal. An interactive prompt will guide you through configuring your C14 archive in Rclone.
$ rclone config
n) New remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
n/r/c/s/q> n
name> C14
Type of storage to configure.
Choose a number from below, or type in your own value
 1 / Amazon Drive
   \ "amazon cloud drive"
...
12 / SSH/SFTP Connection
   \ "sftp"
Storage> 12
SSH host to connect to
Choose a number from below, or type in your own value
 1 / Connect to example.com
   \ "example.com"
host> xxxxxxxx-yyy-zzzz-aaww-qqeewwssddffa.buffer.c14.io
SSH username, leave blank for current username, ebonlieu
user> c14ssh
SSH port
port> 55321
SSH password, leave blank to use ssh-agent
y) Yes type in my own password
g) Generate random password
n) No leave this optional password blank
y/g/n> n
Remote config
--------------------
[C14]
host = 329d943b-742d-4119-83d2-8539bef53a97.buffer.c14.io
user = c14ssh
port = 55321
pass =
--------------------
y) Yes this is OK
e) Edit this remote
d) Delete this remote
y/e/d> y
Current remotes:

Name                 Type
====                 ====
C14                  sftp

e) Edit existing remote
n) New remote
d) Delete remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
e/n/d/r/c/s/q> n
When the configuration is done, add another new remote for AWS S3 and fill in your credentials. When done, quit the config mode.
We now have two remotes: One for AWS S3 and one for C14:
Name                 Type
====                 ====
C14                  sftp
S3                   s3
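For reference, the two remotes above correspond to entries like the following in Rclone's configuration file (~/.rclone.conf in version 1.36). This is only a sketch: the host value, AWS keys and region below are placeholders, so substitute your own archive URI and credentials.

```ini
; Sketch of ~/.rclone.conf after both remotes are configured.
; <your-archive-id>, <access-key> and <secret-key> are placeholders.
[C14]
type = sftp
host = <your-archive-id>.buffer.c14.io
user = c14ssh
port = 55321

[S3]
type = s3
env_auth = false
access_key_id = <access-key>
secret_access_key = <secret-key>
region = eu-west-1
```

Editing this file by hand is equivalent to answering the rclone config prompts interactively.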
Run the command rclone lsd S3:. This command lists the buckets available in AWS S3. We will archive all the content of the bucket s3bucket2archive to C14.
$ rclone lsd S3:
          -1 2017-03-22 11:28:06        -1 s3bucket2archive
Let's sync the content of s3bucket2archive to our C14 archive. First, create a new folder matching the AWS S3 bucket name:
$ rclone mkdir C14:s3bucket2archive
$ rclone lsd C14:
          -1 2017-03-22 12:51:05        -1 s3bucket2archive
At this stage, we can perform the
sync action that will copy our AWS S3 bucket data to our C14 archive:
$ rclone sync S3:s3bucket2archive C14:s3bucket2archive
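If you want to preview what a sync will do before it touches the archive, Rclone's --dry-run and -v flags can be combined with the same command. A minimal sketch, assuming the remote and bucket names configured above:

```shell
# Preview the transfer without copying anything (--dry-run),
# with verbose output (-v) so each file that would be copied is listed.
rclone sync --dry-run -v S3:s3bucket2archive C14:s3bucket2archive
```

Once the output looks right, rerun the command without --dry-run to perform the actual transfer.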
Once the sync command has finished, you can check that everything ran fine by listing the content of the C14 archive:
$ rclone ls C14:s3bucket2archive
    39879 docs/doc01.pdf
    33090 docs/doc02.pdf
    91989 docs/doc03.pdf
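Beyond listing files, Rclone can also compare the source and destination for you. The check command verifies that the files match between the two remotes and reports any differences; a sketch using the same remote names as above:

```shell
# Compare the S3 bucket against the C14 archive; reports any
# files that differ or are missing on either side.
rclone check S3:s3bucket2archive C14:s3bucket2archive
```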
Et voilà! Our C14 archive contains our AWS S3 bucket content and is ready for archiving. Back in the Online control panel, click the "Archive my files" button and your data will be stored safely in C14.
The rclone documentation is available here: http://rclone.org/.
Sign up now to try the Rclone CLI with C14!
Happy data archiving!