What is AWS CLI?
The AWS Command Line Interface (CLI) is a unified tool to manage your AWS services from the command line
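Every call follows the pattern "aws <service> <operation> [options]"; a minimal illustration (assuming the CLI is already installed and configured as described below):
C:\> aws s3 ls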
What is the latest version of AWS CLI?
AWS CLI v2 is the latest major version. Note that it is not backward compatible with v1, so it may require changes to your existing scripts, if any
What are the prerequisites to install AWS CLI on Windows?
A 64-bit version of Windows XP or later
Administrator rights to install software
How to install AWS CLI in Windows?
Step 1: Download the installer "AWSCLIV2.msi" from the following URL and run it: https://awscli.amazonaws.com/AWSCLIV2.msi
Step 2: After installation, check the AWS CLI version from the command prompt
C:\> aws --version
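If the installation succeeded, the command prints the installed version; the exact numbers depend on your machine, but the output looks roughly like:
aws-cli/2.x.x Python/3.x.x Windows/10 exe/AMD64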
How to install AWS CLI in Linux?
Step 1: $ curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
Step 2: $ unzip awscliv2.zip
Step 3: $ sudo ./aws/install
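Step 4: (optional, mirroring the Windows steps above) verify the install
$ aws --version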
How to connect to your AWS account from the CLI?
Step 1: In AWS console,
1.1) go to "user account" => "My Security Credentials" => generate "Access keys" (access key ID and secret access key)
1.2) store the generated key file in a secure place
Step 2: In command prompt,
C:\> aws configure
AWS Access Key ID [None]: <<key ID from step 1.1>>
AWS Secret Access Key [None]: <<secret key from step 1.1>>
Default region name [None]: <<region name>>
Default output format [None]: <<json | table | yaml | yaml-stream | text>>
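Optional: if you work with more than one account, "aws configure" also accepts a --profile option so the keys are stored under a named profile instead of the default one ("devprofile" below is just an example name):
C:\> aws configure --profile devprofile
Subsequent commands can then pick that profile, e.g. aws s3 ls --profile devprofile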
Note: these credentials are stored in the following files on Windows and Linux/macOS
Windows : %USERPROFILE%\.aws\config
%USERPROFILE%\.aws\credentials
Linux/mac : ~/.aws/config
~/.aws/credentials
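Both are plain INI-style text files; a rough sketch of their contents after running "aws configure" (values are placeholders):
credentials file:
[default]
aws_access_key_id = <<key ID>>
aws_secret_access_key = <<secret key>>
config file:
[default]
region = <<region name>>
output = json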
How to access an AWS resource (for example, an S3 bucket) through the AWS CLI?
Go to the command prompt and execute the following AWS CLI commands
1) List all buckets
Command : aws s3 ls
Brief : To list out the existing S3 buckets
2) Create/make a new bucket
Command : aws s3 mb s3://test123
Brief : creates a new bucket named "test123". Note that S3 bucket names live in a global namespace, so the name must be globally unique
3) Delete a bucket
Command : aws s3 rb s3://dastest10feb21
Brief : deletes the bucket (it must be empty unless the --force option is used)
4) Move an object into an S3 bucket (the local copy is removed after upload)
Command : aws s3 mv fifth.txt s3://dastest
Brief : moves the object (fifth.txt) to the "dastest" bucket
5) Delete an object from S3 bucket
Command : aws s3 rm s3://dastest/sixth.txt
Brief : delete the object "sixth.txt"
6) Upload/copy a folder (or a single file):
Command : aws s3 cp <<directory_name>> s3://<<bucket_name>>/ --recursive
Examples : aws s3 cp D:\Raja\aws s3://dastest/aws --recursive (copies the whole folder)
aws s3 cp D:\DBBackup\sdsdm.zip s3://dastest/aws/DBBackup (copies a single file)
(copying in the other direction, from S3 to a local folder, is shown in item 9 below)
7) Upload/copy an object with metadata:
Command: aws s3 cp two.txt s3://dastest/two.txt --metadata="mdate=05/02/2021 08:30 pm"
8) Retrieve an object's metadata:
Command : aws s3api head-object --bucket dastest --key two.txt
Output :
{
    "AcceptRanges": "bytes",
    "LastModified": "2021-02-11T10:58:24+00:00",
    "ContentLength": 46,
    "ETag": "\"970e96a6af8d4b8376df6a68bc714d08\"",
    "ContentType": "text/plain",
    "Metadata": {
        "mdate": "05/02/2021 08:30 pm"
    }
}
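9) Download/copy an object from S3 to a local folder (the cp command works in both directions; this example reuses the "dastest" bucket and "two.txt" object from the items above)
Command : aws s3 cp s3://dastest/two.txt two.txt
Brief : copies the object "two.txt" from the bucket into the current local folder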
For more details: https://docs.aws.amazon.com/cli/latest/userguide/welcome-versions.html
How to programmatically access AWS resources using Python?
AWS provides an SDK for Python named "Boto3" to access its resources
It enables Python developers to create, configure, and manage AWS services, such as EC2 and S3
Boto3 provides an easy-to-use, object-oriented API, as well as low-level access to AWS services
It requires Python 3.7 or later
How to install Boto3 and access AWS resources in Python?
Step 1: pip install boto3
Step 2: configure AWS credentials (see Step 2 in the "How to connect" section above)
Step 3: program in python "listbucket.py"
import boto3
# Let's use Amazon S3
s3 = boto3.resource('s3')
# Print out bucket names
for bucket in s3.buckets.all():
    print(bucket.name)
Step 4: run the program "listbucket.py"
py listbucket.py
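The metadata upload and head-object calls from items 7 and 8 above can also be done from Python with Boto3. A minimal sketch, assuming the bucket "dastest" and the local file "two.txt" used in the earlier examples (they are just the names from this document's examples):
import boto3

# Low-level S3 client gives direct access to upload and head-object operations
s3_client = boto3.client('s3')

# Upload a local file with user-defined metadata (mirrors "aws s3 cp ... --metadata")
s3_client.upload_file(
    'two.txt', 'dastest', 'two.txt',
    ExtraArgs={'Metadata': {'mdate': '05/02/2021 08:30 pm'}}
)

# Read the object's metadata back (mirrors "aws s3api head-object")
response = s3_client.head_object(Bucket='dastest', Key='two.txt')
print(response['Metadata'])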