AWS Assignment – 2

Storage & Artifact Management for DevOps Learners

Basic Questions

  1. Create an S3 bucket named devops-artifacts to store build artifacts (a sketch covering this along with questions 4 and 8 follows this list).
  2. Configure a CodeBuild project to upload build output to the devops-artifacts bucket.
  3. Set up CodePipeline with S3 as the source stage for deployment.
  4. Configure S3 bucket versioning for artifact version control.
  5. Enable S3 event notifications to trigger a CodePipeline execution when a new artifact is uploaded.
  6. Store a Terraform state file in S3 and configure state locking with DynamoDB.
  7. Create an S3 bucket lifecycle policy to transition old build artifacts to Glacier (see the lifecycle sketch after this list).
  8. Configure S3 server-side encryption with SSE-KMS for all uploaded artifacts.
  9. Push a Docker image to ECR and store its metadata logs in S3.
  10. Set up S3 server access logging to capture access logs for artifact tracking.
  11. Attach an EBS volume to a Jenkins server EC2 instance for persistent job data.
  12. Configure Jenkins to store its workspace on the attached EBS volume.
  13. Automate daily EBS snapshots of the Jenkins server using the AWS CLI (a boto3 equivalent is sketched after this list).
  14. Configure EFS as a shared storage for multiple Jenkins agents.
  15. Mount EFS on two Jenkins worker nodes to share job workspaces.
  16. Store Ansible playbooks and roles inside an EFS mount shared across DevOps teams.
  17. Enable CloudWatch monitoring for S3 bucket metrics (NumberOfObjects, BucketSizeBytes).
  18. Configure EBS volume encryption for a CI server to secure build data.
  19. Create an S3 bucket policy that allows only CodeBuild and CodePipeline access.
  20. Document how S3, EBS, and EFS can be used together in a CI/CD workflow.
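
The sketches below illustrate a few of these tasks in Python with boto3; they are minimal examples under assumed names and regions, not prescribed solutions. This first one covers questions 1, 4, and 8 together: it creates the bucket, turns on versioning, and sets SSE-KMS as the default encryption. The region, bucket suffix, and KMS key alias are assumptions.

```python
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

# Bucket names are globally unique, so a suffix is usually needed in practice
bucket = "devops-artifacts-example-123"

# Question 1: create the artifact bucket (us-east-1 needs no LocationConstraint)
s3.create_bucket(Bucket=bucket)

# Question 4: enable versioning so every upload gets its own version ID
s3.put_bucket_versioning(
    Bucket=bucket,
    VersioningConfiguration={"Status": "Enabled"},
)

# Question 8: default server-side encryption with a customer-managed KMS key
# (the key alias here is hypothetical)
s3.put_bucket_encryption(
    Bucket=bucket,
    ServerSideEncryptionConfiguration={
        "Rules": [{
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID": "alias/devops-artifacts",
            }
        }]
    },
)
```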
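
For question 7, a lifecycle rule can move old build artifacts to Glacier and eventually expire them; the prefix and the 30/365-day thresholds are assumptions.

```python
import boto3

s3 = boto3.client("s3")

# Archive artifacts under build-output/ after 30 days, delete after a year
s3.put_bucket_lifecycle_configuration(
    Bucket="devops-artifacts-example-123",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-old-artifacts",
            "Status": "Enabled",
            "Filter": {"Prefix": "build-output/"},
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 365},
        }]
    },
)
```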
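
For question 13, the boto3 call below mirrors `aws ec2 create-snapshot`; running it daily (via cron or an EventBridge schedule) is the rest of the exercise. The volume ID is a placeholder.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Snapshot the Jenkins data volume and tag it for later cleanup
resp = ec2.create_snapshot(
    VolumeId="vol-0123456789abcdef0",  # placeholder Jenkins data volume
    Description="Daily Jenkins data backup",
    TagSpecifications=[{
        "ResourceType": "snapshot",
        "Tags": [{"Key": "app", "Value": "jenkins"}],
    }],
)
print("Created snapshot:", resp["SnapshotId"])
```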

Intermediate Questions

  1. Automate the creation of S3 buckets for environments (dev, test, prod) using Terraform.
  2. Configure CodePipeline to pull Terraform code from GitHub and store state in S3.
  3. Set up CodeBuild to compile source code and upload artifacts to S3 with a unique version ID (see the upload sketch after this list).
  4. Write an IAM policy for a CI/CD role that allows S3 read/write but denies delete (see the policy sketch after this list).
  5. Configure a Jenkins pipeline to archive build artifacts in S3 after every run.
  6. Enable cross-region replication on an artifact bucket to replicate builds globally.
  7. Configure lifecycle rules to expire old Docker image logs stored in S3.
  8. Mount an EBS volume to an EC2 instance running GitLab Runner to persist CI data.
  9. Use Ansible to mount EBS volumes automatically on new EC2 instances.
  10. Write a Terraform script to provision an EC2 with an attached EBS volume.
  11. Set up an Auto Scaling Group of EC2 instances using Launch Templates with persistent EBS volumes.
  12. Configure EFS to be mounted by ECS tasks running in Fargate.
  13. Mount EFS volumes into EKS Pods for shared config storage.
  14. Create a backup strategy to copy Jenkins job data from EBS to S3 daily.
  15. Store central logs from multiple CI/CD jobs in an EFS file system.
  16. Set up CloudWatch alarms for S3 bucket size exceeding a defined threshold (see the alarm sketch after this list).
  17. Integrate S3 with AWS CloudTrail to log all artifact-related API calls.
  18. Use CodePipeline to trigger deployment automatically when new artifacts arrive in S3.
  19. Configure a CodePipeline manual approval step before deploying artifacts from S3.
  20. Document best practices for using S3, EBS, and EFS in production CI/CD pipelines.
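
Again, a few boto3 sketches for selected questions, with placeholder names throughout. For question 3, the upload step can record the version ID that a versioned bucket returns, giving each build a unique, traceable identifier.

```python
import boto3

s3 = boto3.client("s3")

# With bucket versioning enabled, put_object returns a unique VersionId
with open("app.zip", "rb") as artifact:
    resp = s3.put_object(
        Bucket="devops-artifacts",
        Key="builds/app.zip",
        Body=artifact,
    )
print("Stored artifact version:", resp["VersionId"])
```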
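
For question 4, one way to express the policy is shown below, created via the IAM API; the bucket and policy names are assumptions. The explicit Deny always wins over the Allow, so a role with this policy cannot delete artifacts.

```python
import json
import boto3

iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {  # read/write access to the artifact bucket
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::devops-artifacts",
                "arn:aws:s3:::devops-artifacts/*",
            ],
        },
        {  # explicit deny on deletes overrides any allow
            "Effect": "Deny",
            "Action": ["s3:DeleteObject", "s3:DeleteObjectVersion"],
            "Resource": "arn:aws:s3:::devops-artifacts/*",
        },
    ],
}

iam.create_policy(
    PolicyName="cicd-s3-no-delete",
    PolicyDocument=json.dumps(policy),
)
```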
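
For question 16, an alarm on BucketSizeBytes might look like the following; note that S3 storage metrics are reported roughly once per day, hence the one-day period. The 50 GB threshold and the SNS topic ARN are assumptions.

```python
import boto3

cw = boto3.client("cloudwatch")

cw.put_metric_alarm(
    AlarmName="devops-artifacts-size",
    Namespace="AWS/S3",
    MetricName="BucketSizeBytes",
    Dimensions=[
        {"Name": "BucketName", "Value": "devops-artifacts"},
        {"Name": "StorageType", "Value": "StandardStorage"},
    ],
    Statistic="Average",
    Period=86400,              # S3 storage metrics are daily
    EvaluationPeriods=1,
    Threshold=50 * 1024**3,    # ~50 GB, an assumed threshold
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:storage-alerts"],  # placeholder topic
)
```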

Advanced Questions

  1. Write a Terraform configuration that provisions an S3 bucket, a DynamoDB table (for state locking), and a CodePipeline to deploy infrastructure (a backend bootstrap sketch follows this list).
  2. Configure a Jenkins pipeline to build Docker images, push them to ECR, and upload logs to S3.
  3. Automate artifact promotion across environments (dev → staging → prod) using S3 and CodePipeline (see the promotion sketch after this list).
  4. Use EFS as a shared persistent volume for multiple Jenkins agents in Kubernetes.
  5. Implement a solution where EBS is used for build caching and S3 for long-term artifact storage.
  6. Configure GitLab CI to upload Terraform plan files to S3 and apply only after manual approval.
  7. Set up automated EFS-to-S3 backup synchronization for disaster recovery.
  8. Integrate S3 with an ELK stack for centralized CI/CD log analysis.
  9. Configure a blue-green deployment pipeline using CodePipeline with artifacts stored in S3.
  10. Final Hands-on Project:
    • Store Terraform state files in S3 with DynamoDB locking
    • Use CodePipeline with S3 + CodeBuild stages to build and deploy a sample app
    • Store build artifacts and Docker logs in S3
    • Use EBS for Jenkins persistence, EFS for shared build data
    • Monitor all storage via CloudWatch + lifecycle policies for cost optimization
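
For question 1, the state bucket and lock table are typically bootstrapped once, outside the Terraform configuration that will use them; a minimal boto3 sketch follows, with hypothetical names. Terraform's S3 backend expects the lock table's hash key to be a string attribute named LockID.

```python
import boto3

s3 = boto3.client("s3", region_name="us-east-1")
ddb = boto3.client("dynamodb", region_name="us-east-1")

# Versioned state bucket, so earlier state files stay recoverable
s3.create_bucket(Bucket="tf-state-example-123")
s3.put_bucket_versioning(
    Bucket="tf-state-example-123",
    VersioningConfiguration={"Status": "Enabled"},
)

# Lock table: the S3 backend requires a string hash key named LockID
ddb.create_table(
    TableName="tf-locks",
    AttributeDefinitions=[{"AttributeName": "LockID", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "LockID", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)
```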
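
For question 3, the promotion itself can be a server-side copy between per-environment buckets, usually gated by a pipeline approval stage; the bucket names and object key are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Promote a build from dev to staging without re-downloading the artifact
s3.copy_object(
    Bucket="devops-artifacts-staging",
    Key="builds/app.zip",
    CopySource={"Bucket": "devops-artifacts-dev", "Key": "builds/app.zip"},
)
```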