In this post I explain the AWS CLI and how to access an Amazon Web Services account from the command line. We can access an AWS account from both Windows and Linux machines.
Windows
To access an AWS account from Windows, we first need to install the AWS CLI. We can download it from https://aws.amazon.com/cli/ or use https://s3.amazonaws.com/aws-cli/AWSCLI64.msi directly. Running this .msi file installs the AWS CLI on the local Windows machine.
Then open a command prompt and type "aws configure".
It will ask for the following configuration values:
AWS Access Key ID [None]: xxxxxxxxxxxx
AWS Secret Access Key [None]: ###########################
Default region name [None]: @@@@@
Default output format [None]: (press ENTER for the default)
With that, the AWS CLI is installed and configured on your Windows machine.
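Behind the scenes, "aws configure" stores these values in two plain-text files under your user profile (C:\Users\<username>\.aws\ on Windows, ~/.aws/ on Linux). A minimal sketch of their contents, with placeholder values (the region shown is only an example):
~/.aws/credentials
[default]
aws_access_key_id = xxxxxxxxxxxx
aws_secret_access_key = ###########################
~/.aws/config
[default]
region = us-east-1
output = json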
For testing, you can run "aws s3 ls". It will list all of your S3 buckets.
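If "aws s3 ls" returns an error, it helps to double-check which credentials and region the CLI actually picked up. Two standard checks, using only the configuration above:
$ aws configure list
$ aws sts get-caller-identity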
AWS EC2 Machine
On an AWS EC2 machine, the AWS CLI is available by default at /usr/bin/aws, so there is no need to install it again. You can use it directly from anywhere.
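To confirm the pre-installed copy and its version before using it, you can run:
$ which aws
/usr/bin/aws
$ aws --version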
Below are some commands I use frequently; they help speed up day-to-day work.
"aws s3 ls"
It will list all of our buckets.
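If you only need the bucket names, for example to feed into a script, the lower-level s3api equivalent with a --query filter is a handy alternative (a sketch, assuming the default profile):
$ aws s3api list-buckets --query 'Buckets[].Name' --output text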
"aws s3 ls s3://somutest/"
It will list all documents and folders in this bucket.
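To list everything in the bucket, including objects inside folders, with a size summary at the end, the same command takes a few extra options (somutest is the example bucket from above):
"aws s3 ls s3://somutest/ --recursive --human-readable --summarize"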
"aws s3 cp . s3://somutest/ --recursive"
It will transfer all documents from the current local folder (the dot stands for the current location) to S3; copying a whole folder requires the --recursive option. To transfer only one file, mention the file name instead, for example "aws s3 cp abc.txt s3://somutest/".
Following are some commands that are commonly used in real-life work.
// Download the whole bucket to the current directory
aws s3 cp s3://somutest . --recursive
// Sync the current directory to the bucket and make the uploaded objects publicly readable
$ aws s3 sync . s3://my-bucket/MyFolder --acl public-read
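If the goal of --acl public-read is only to share a file with someone, an alternative that avoids making objects public is a presigned URL that expires after a set time (the bucket and key below are placeholders; 3600 seconds is one hour):
$ aws s3 presign s3://my-bucket/MyFolder/MyFile.txt --expires-in 3600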
Creating Buckets
$ aws s3 mb s3://bucket-name
Removing Buckets
$ aws s3 rb s3://bucket-name
$ aws s3 rb s3://bucket-name --force   (for a non-empty bucket)
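Because rb refuses to delete a bucket that still contains objects unless --force is given, another common pattern is to empty the bucket first and then remove it:
$ aws s3 rm s3://bucket-name --recursive
$ aws s3 rb s3://bucket-name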
Listing Buckets
$ aws s3 ls
$ aws s3 ls s3://bucket-name/MyFolder
Managing Objects
// Move (download and delete) everything in the bucket to the current directory
aws s3 mv s3://mybucket . --recursive
// Copy a file and grant read access to all users plus full control to a specific account
$ aws s3 cp file.txt s3://bucket-name/ --grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers full=emailaddress=user@example.com
// General form of sync
$ aws s3 sync <source> <target> [--options]
aws s3 sync . s3://somutest   (uploads all documents from the current local folder to the S3 bucket)
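Before running a sync against an important bucket, the --dryrun option shows what would be transferred without actually copying anything (same example bucket as above):
$ aws s3 sync . s3://somutest --dryrun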
// Delete local file
$ rm ./MyFile1.txt
// Attempt sync without --delete option - nothing happens
$ aws s3 sync . s3://my-bucket/MyFolder
// Sync with deletion - object is deleted from bucket
$ aws s3 sync . s3://my-bucket/MyFolder --delete
delete: s3://my-bucket/MyFolder/MyFile1.txt
// Delete object from bucket
$ aws s3 rm s3://my-bucket/MyFolder/MySubdirectory/MyFile3.txt
delete: s3://my-bucket/MyFolder/MySubdirectory/MyFile3.txt
// Sync with deletion - local file is deleted
$ aws s3 sync s3://my-bucket/MyFolder . --delete
delete: MySubdirectory\MyFile3.txt
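Note that sync only transfers files that are new or changed; by default it compares file size and last-modified time. If timestamps are unreliable, the comparison can be limited to size alone:
$ aws s3 sync . s3://my-bucket/MyFolder --size-only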
Local directory contains 3 files:
MyFile1.txt
MyFile2.rtf
MyFile88.txt
'''
$ aws s3 sync . s3://my-bucket/MyFolder --exclude '*.txt'
upload: MyFile2.rtf to s3://my-bucket/MyFolder/MyFile2.rtf
'''
$ aws s3 sync . s3://my-bucket/MyFolder --exclude '*.txt' --include 'MyFile*.txt'
upload: MyFile1.txt to s3://my-bucket/MyFolder/MyFile1.txt
upload: MyFile88.txt to s3://my-bucket/MyFolder/MyFile88.txt
upload: MyFile2.rtf to s3://my-bucket/MyFolder/MyFile2.rtf
'''
Filters are applied in the order given, so the final --exclude 'MyFile?.txt' (one character after "MyFile") overrides the earlier --include for MyFile1.txt, while MyFile88.txt still matches only the --include:
$ aws s3 sync . s3://my-bucket/MyFolder --exclude '*.txt' --include 'MyFile*.txt' --exclude 'MyFile?.txt'
upload: MyFile2.rtf to s3://my-bucket/MyFolder/MyFile2.rtf
upload: MyFile88.txt to s3://my-bucket/MyFolder/MyFile88.txt
// Copy MyFile.txt in current directory to s3://my-bucket/MyFolder
$ aws s3 cp MyFile.txt s3://my-bucket/MyFolder/
// Move all .jpg files in s3://my-bucket/MyFolder to ./MyDirectory
$ aws s3 mv s3://my-bucket/MyFolder ./MyDirectory --exclude '*' --include '*.jpg' --recursive
// List the contents of my-bucket
$ aws s3 ls s3://my-bucket
// List the contents of MyFolder in my-bucket
$ aws s3 ls s3://my-bucket/MyFolder
// Delete s3://my-bucket/MyFolder/MyFile.txt
$ aws s3 rm s3://my-bucket/MyFolder/MyFile.txt
// Delete s3://my-bucket/MyFolder and all of its contents
$ aws s3 rm s3://my-bucket/MyFolder --recursive
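The high-level s3 commands above cover most day-to-day work; for object metadata such as size, content type, or ETag, the lower-level s3api commands are useful. A sketch using the same placeholder bucket and key as above:
$ aws s3api head-object --bucket my-bucket --key MyFolder/MyFile.txt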
AWS EMR Cluster
The AWS CLI can manage services beyond S3 as well. For example, the following command creates a single-node EMR cluster:
aws emr create-cluster \
  --ami-version 3.8.0 \
  --instance-type m1.xlarge \
  --instance-count 1 \
  --name "cascading-kinesis-example" \
  --visible-to-all-users \
  --enable-debugging \
  --auto-terminate \
  --no-termination-protected \
  --log-uri s3n://quanttestbucket/logs/ \
  --service-role EMR_DefaultRole \
  --ec2-attributes InstanceProfile=EMR_EC2_DefaultRole \