Using s3cmd and s3fs with the Storage API (Ubuntu 11.04)

Here’s how to use GreenQloud’s Storage API with s3cmd and s3fs on Ubuntu 11.04:

Step 1

Log into the Management Console: http://manage.greenqloud.com.

Step 2

Click on “Account information” in the dropdown menu under your username.

Next, download the tar.gz install script pack. This contains install scripts and config files.

Step 3

Unpack the archive and save the folder to a location of your choice, e.g. “gq_install_scripts”.
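
For example, assuming the downloaded archive is called “gq_install_scripts.tar.gz” (the actual filename may differ), you can unpack it from the terminal with:

tar -xzf gq_install_scripts.tar.gz

cd gq_install_scripts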

Step 4

The install scripts “s3cmd_install_ubuntu” and “s3fs_install_ubuntu” will install the two programs for you, put the config files into the correct locations, and back up the default config files.

Step 5

To run these scripts you have to make them executable. Use the command:

chmod +x s3cmd_install_ubuntu s3fs_install_ubuntu

Step 6

To run the scripts you can use the commands:

./s3cmd_install_ubuntu

and

./s3fs_install_ubuntu

Step 7

After running those scripts, both programs should be installed correctly.
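
To double-check, you can verify that both binaries are on your path and ask s3cmd for its version (the exact output depends on the version the script installed):

which s3cmd s3fs

s3cmd --version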

Step 8

By running the commands “s3cmd --help” and “s3fs --help” you can see the options available for each program.
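
As a quick illustration of s3fs itself, a bucket can be mounted as a local directory. The mount point and the endpoint URL below are placeholders, and the example assumes your access keys are stored in ~/.passwd-s3fs, which s3fs checks by default (the install script may already have set this up):

mkdir -p ~/testbucket-mount

s3fs testbucket ~/testbucket-mount -o url=https://<storage-endpoint>

fusermount -u ~/testbucket-mount

The last command unmounts the bucket again when you are done.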

s3cmd usage examples

The command:

s3cmd la

lists all your buckets along with their contents, while

s3cmd ls

lists only the buckets themselves.
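
You can also list the contents of a single bucket by giving its name, for example:

s3cmd ls s3://testbucket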

The command:

s3cmd mb s3://testbucket

creates the bucket “testbucket”, and you can upload files into the bucket with

s3cmd put somefile s3://testbucket

(this command uploads the file “somefile” into the bucket “testbucket”).
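
The reverse operations work the same way; for example, to download the file again and then remove it from the bucket:

s3cmd get s3://testbucket/somefile

s3cmd del s3://testbucket/somefile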

s3cmd sync usage examples

This is an extremely useful command which can sync an entire directory to a bucket in GreenQloud’s storage.

In a nutshell it works like this:

s3cmd sync      FROM     TO

To sync a directory /home/user/data/ on your computer to a bucket named testbucket on GreenQloud’s storage, use:

s3cmd sync   /home/user/data/   s3://testbucket

To sync a bucket named testbucket on GreenQloud’s storage to a directory /home/user/data/ on your computer, use:

s3cmd sync  s3://testbucket     /home/user/data/
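
One detail worth noting in the local-to-bucket direction (behavior may vary slightly between s3cmd versions): the trailing slash on the local directory matters. With the slash, the contents of /home/user/data/ are synced straight into the bucket; omitting it, as below, uploads the directory itself and creates a “data/” prefix inside the bucket:

s3cmd sync   /home/user/data   s3://testbucket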

s3cmd optional parameters

--dry-run

s3cmd sync  --dry-run /home/user/data/   s3://testbucket

When using --dry-run, s3cmd doesn’t upload or download any files; it only shows you what would be done if you ran the command without the parameter.

--delete-removed

s3cmd sync  --delete-removed /home/user/data/   s3://testbucket

When using --delete-removed, files in the target which are no longer present in the source are deleted.

--skip-existing

s3cmd sync  --skip-existing /home/user/data/   s3://testbucket

When using --skip-existing, s3cmd only checks whether the filenames are the same and doesn’t check whether the files themselves have changed.

Of course you can use all these parameters combined:

s3cmd sync --dry-run --skip-existing --delete-removed   /home/user/data/   s3://testbucket

And you can sync the other way around as well with these parameters:

s3cmd sync --dry-run --skip-existing --delete-removed   s3://testbucket    /home/user/data/
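
If you use sync for regular backups, it can be scheduled with cron. A hypothetical crontab entry (the schedule, directory, and bucket are just examples) might look like this:

0 2 * * * s3cmd sync --delete-removed /home/user/data/ s3://testbucket

This would sync the directory to the bucket every night at 02:00.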

Notes:

Installing on Red Hat

yum install s3cmd
s3cmd --configure
s3cmd ls
s3cmd sync  --dry-run --delete-removed /root s3://farhantesting
s3cmd del s3://farhantesting/baculaaitr.sql
s3cmd del s3://farhantesting/*

Installing s3cmd on Ubuntu 10.04

You can copy and paste the following commands into your terminal.
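
For example, assuming the s3cmd package in the standard Ubuntu repositories is recent enough, and that you then point ~/.s3cfg at GreenQloud’s storage endpoint (via its host_base and host_bucket settings):

sudo apt-get update

sudo apt-get install s3cmd

s3cmd --configure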
