AWS Assignment – 2

AWS S3 for Linux Learners

Basic Questions

  1. Create an S3 bucket from the AWS Console with a globally unique name.
  2. Create an S3 bucket using the AWS CLI (aws s3 mb).
  3. Upload a text file to your S3 bucket using the AWS Console.
  4. Upload a text file using the AWS CLI (aws s3 cp).
  5. Download an object from your S3 bucket to your Linux machine.
  6. List all buckets in your account using AWS CLI.
  7. List all objects in your bucket using AWS CLI.
  8. Enable Bucket Versioning on your S3 bucket.
  9. Upload two versions of the same file and verify versioning.
  10. Delete one version of an object and restore the previous version.
  11. Enable Default Encryption (SSE-S3) for your S3 bucket.
  12. Upload a file and verify it is encrypted.
  13. Create an IAM user with S3 access permissions only.
  14. Configure AWS CLI on Linux using the IAM user credentials.
  15. Upload an image file to S3 using the IAM user profile.
  16. Enable Block Public Access on your S3 bucket.
  17. Change object permissions using ACLs and make one object public.
  18. Access the public object using its URL in a browser.
  19. Configure Storage Class for an object to Standard-IA.
  20. Upload a file with the GLACIER storage class and verify the object's storage class.
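Most of the basic tasks can be done from a Linux shell. The sketch below strings the main CLI calls together; the bucket and file names are hypothetical placeholders (bucket names must be globally unique), and the `run` wrapper only prints each command so the script is safe to paste as-is — remove it to execute the steps for real.

```shell
#!/usr/bin/env bash
# Sketch of the CLI steps above. BUCKET and the file names are
# placeholders; replace them with your own.
BUCKET="s3-linux-learner-demo-001"
CMDS=""

# print each command instead of executing it; drop this wrapper
# (and the CMDS bookkeeping) to run the steps for real
run() { echo "+ $*"; CMDS="$CMDS $*"; }

run aws s3 mb "s3://$BUCKET"                              # Q2: create bucket
run aws s3 cp hello.txt "s3://$BUCKET/hello.txt"          # Q4: upload
run aws s3 cp "s3://$BUCKET/hello.txt" ./hello-copy.txt   # Q5: download
run aws s3 ls                                             # Q6: list buckets
run aws s3 ls "s3://$BUCKET" --recursive                  # Q7: list objects
run aws s3api put-bucket-versioning --bucket "$BUCKET" \
    --versioning-configuration Status=Enabled             # Q8: versioning
run aws s3api put-bucket-encryption --bucket "$BUCKET" \
    --server-side-encryption-configuration \
    '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"AES256"}}]}'   # Q11: SSE-S3
run aws s3 cp app.log "s3://$BUCKET/app.log" --storage-class STANDARD_IA   # Q19
run aws s3 cp old.log "s3://$BUCKET/old.log" --storage-class GLACIER       # Q20
```

Note that `aws s3` covers the everyday file operations, while bucket-level settings such as versioning and default encryption live under the lower-level `aws s3api` commands.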

Intermediate Questions

  1. Enable Cross-Region Replication (CRR) between two buckets in different regions (versioning must be enabled on both buckets).
  2. Upload a file to the source bucket and verify replication to the target bucket.
  3. Create a Lifecycle rule to move objects older than 30 days to Glacier.
  4. Create a Lifecycle rule to delete objects after 90 days.
  5. Enable Access Logs for your S3 bucket and write them to another bucket.
  6. Explore the generated access logs and identify object access events.
  7. Enable CloudWatch metrics for your S3 bucket.
  8. Create a CloudWatch Alarm for “NumberOfObjects” in the bucket.
  9. Configure SSE-KMS encryption for your bucket.
  10. Upload a file with SSE-KMS and verify the KMS key used.
  11. Configure SSE-C encryption and upload a file with a custom key.
  12. Create a Bucket Policy that allows only your IAM user to access objects.
  13. Create a policy that denies public access to all objects.
  14. Host a static website using S3 (index.html + error.html).
  15. Access your S3 static website URL in a browser.
  16. Configure a CORS policy for your bucket to allow cross-origin requests.
  17. Create a folder in S3 and upload multiple files at once.
  18. Sync a local Linux folder with your S3 bucket using aws s3 sync.
  19. Use aws s3 ls --recursive to list objects with sizes and timestamps.
  20. Document differences between S3 storage classes (Standard, IA, Glacier).
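For the lifecycle, sync, and website-hosting tasks, the sketch below shows one plausible shape of the CLI work. The bucket name is a placeholder, the JSON is a minimal lifecycle configuration combining the 30-day Glacier transition and 90-day expiration from Q3–Q4, and the `run` wrapper prints commands instead of executing them.

```shell
#!/usr/bin/env bash
# Sketch for the intermediate tasks above; BUCKET is a placeholder.
BUCKET="s3-linux-learner-demo-001"
CMDS=""
run() { echo "+ $*"; CMDS="$CMDS $*"; }   # print, don't execute

# Minimal lifecycle config: Glacier after 30 days, delete after 90
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "archive-then-expire",
      "Status": "Enabled",
      "Filter": {"Prefix": ""},
      "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
      "Expiration": {"Days": 90}
    }
  ]
}
EOF

run aws s3api put-bucket-lifecycle-configuration \
    --bucket "$BUCKET" --lifecycle-configuration file://lifecycle.json   # Q3-Q4
run aws s3 sync ./local-dir "s3://$BUCKET/backup/"                       # Q18
run aws s3 ls "s3://$BUCKET" --recursive --human-readable                # Q19
run aws s3 website "s3://$BUCKET" \
    --index-document index.html --error-document error.html              # Q14
```

For the website task, remember that serving pages publicly also requires relaxing Block Public Access and adding a bucket policy that allows `s3:GetObject` on the site's objects.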

Advanced Questions

  1. Create a bucket with Versioning + Lifecycle + Encryption enabled by default.
  2. Configure Cross-Account Access to allow another AWS account to access your bucket.
  3. Upload logs from an EC2 instance to S3 automatically using a cron job + AWS CLI.
  4. Configure S3 to store Terraform state files for remote backend.
  5. Configure a CodePipeline to use S3 as a storage location for build artifacts.
  6. Store Docker images in ECR with backups stored in S3.
  7. Write a Lambda function that triggers on file upload to S3 and processes metadata.
  8. Integrate S3 bucket logs with ELK/EFK stack for analysis.
  9. Write a shell script to automate daily backups from Linux /home to S3.
  10. Deliver a final hands-on project:
    • Create 2 buckets (source + destination)
    • Enable Versioning, CRR, Lifecycle policies
    • Encrypt objects with SSE-KMS
    • Enable Access Logs + CloudWatch metrics
    • Host a static website in one bucket
    • Automate Linux backups to the source bucket
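The backup-automation piece (Advanced Q9, reused in the final project) can be sketched as a small script scheduled by cron. The bucket name and paths below are placeholders, and the `run` wrapper prints the command instead of executing it; remove it to perform a real backup.

```shell
#!/usr/bin/env bash
# Daily backup sketch for Advanced Q9. Bucket and paths are
# placeholders; the run() wrapper prints instead of executing.
set -eu

SRC="$HOME"                              # or /home for all users (needs root)
BUCKET="s3-linux-learner-demo-001"       # hypothetical bucket
DEST="s3://$BUCKET/backups/$(date +%F)"  # one dated prefix per day
CMDS=""

run() { echo "+ $*"; CMDS="$CMDS $*"; }

echo "backing up $SRC to $DEST"
# sync only transfers new or changed files, so daily runs stay cheap
run aws s3 sync "$SRC" "$DEST" --storage-class STANDARD_IA

# schedule with cron, e.g. every day at 02:00:
#   0 2 * * * /usr/local/bin/s3-home-backup.sh >> /var/log/s3-backup.log 2>&1
```

Pointing `DEST` at the final project's source bucket ties this script into the CRR setup: each backed-up object is then versioned, encrypted, and replicated to the destination bucket automatically.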