
Well Architected S3

Applying the AWS Well-Architected Framework Principles to Storage

Well-Architected Framework

“The AWS Well-Architected Framework has six pillars, and each pillar includes best practices and a set of questions that you should consider when you architect cloud solutions.”

This section highlights best practices from the pillars most relevant to storage:

  • Security
  • Reliability
  • Performance Efficiency
  • Cost Optimization

Best Practice Approach: Data Protection – Protecting Data at Rest


Enforce Encryption at Rest

Maintain the confidentiality of sensitive data in the event of unauthorized access or accidental disclosure; a minimal configuration sketch follows the list below

  • Private data should be encrypted by default when at rest
  • Data that is encrypted cannot be read without first being decrypted
  • Any data stored unencrypted should be inventoried and controlled
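
As an illustration, here is a minimal boto3 sketch that enforces default encryption at rest on an existing bucket. The bucket name is a placeholder, the call assumes credentials with permission to change bucket encryption, and the choice of SSE-KMS over SSE-S3 is only one option.

import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-bucket"  # placeholder bucket name

# Apply SSE-KMS as the default encryption for all new objects in the bucket.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    # Omit KMSMasterKeyID to use the AWS managed key (aws/s3).
                },
                "BucketKeyEnabled": True,  # reduces KMS request volume and cost
            }
        ]
    },
)

# Confirm the configuration that is now in effect.
print(s3.get_bucket_encryption(Bucket=BUCKET)["ServerSideEncryptionConfiguration"])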

Enforce Access Control

Use mechanisms such as isolation and versioning, and apply the principle of least privilege; see the sketch after this list

  • Prevent granting of public access to your data
  • Verify that only authorized users can access data on a need-to-know basis
  • Protect data with regular backups and versioning
  • Isolate critical data from other data to protect confidentiality and integrity
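
A minimal sketch of these controls with boto3, assuming a hypothetical bucket: it turns on Block Public Access and versioning. Least-privilege IAM and bucket policies for the specific principals that need access would be defined separately.

import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-bucket"  # placeholder bucket name

# Block all forms of public access at the bucket level.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Enable versioning so overwritten or deleted objects can be recovered.
s3.put_bucket_versioning(
    Bucket=BUCKET,
    VersioningConfiguration={"Status": "Enabled"},
)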

“Before architecting any workload, foundational practices that influence security should be in place. The methods described for protecting data are important because they support objectives such as preventing mishandling or complying with regulatory obligations.”

Amazon S3 Security Features Supporting Data Protection

  • Default Encryption: S3 buckets and objects encrypted by default
  • Private by Default: Newly created S3 buckets and objects are private and protected
  • Block Public Access: This feature makes buckets inaccessible to the public by default
  • Access Controls and Versioning: “You can protect data in Amazon S3 by limiting access through IAM policies and enabling versioning”
  • Backup Capability: Use Amazon S3 for backing up data to improve failure management
  • Access Management: Multiple tools are available for controlling access to buckets and objects; a bucket policy sketch follows this list
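
To illustrate the access-management tooling, here is a hedged boto3 sketch that applies a bucket policy denying any request not sent over TLS. The bucket name is a placeholder, and a real policy would also grant only the specific actions each principal needs.

import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-bucket"  # placeholder bucket name

# Deny all S3 actions on the bucket and its objects when TLS is not used.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        }
    ],
}

s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))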

Best Practice Approach: Architecture Selection


Performance Efficiency Pillar Best Practices


Learn Available Services

Understand cloud services and features

  • Learn about available cloud services and features
  • Understand what’s available for optimal solution selection
  • Well-Architected workloads use multiple solutions

Factor Cost into Decisions

Consider cost in architectural decisions

  • Factor cost into architectural decisions
  • Solutions often combine multiple approaches
  • Allow different features to improve performance

“The optimal solution for a particular workload varies, and solutions often combine multiple approaches. Well-Architected workloads use multiple solutions and allow different features to improve performance.”

  • Massive Storage: “Amazon S3 is a good choice for object storage and can store massive amounts of unstructured data”
  • Unstructured Data: Optimized for storing and retrieving unstructured data at scale
  • S3 Transfer Acceleration: “For moving files over a long distance between a client and a bucket”
  • Multipart Upload: “Improves throughput when uploading very large files”
  • Multiple Storage Classes: “Amazon S3 provides different storage classes that you can use to store objects based on expected access patterns”
  • S3 Intelligent-Tiering: “Automatically moves objects based on access patterns to the storage tier that is most cost-effective”
  • Performance and Cost: These features support both performance efficiency and cost optimization best practices; a short upload sketch follows this list
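
One way these options can be combined is sketched below with boto3: enable Transfer Acceleration on a bucket, then upload a large file through the accelerated endpoint with automatic multipart upload. Bucket and file names are placeholders, and the part size and concurrency values are assumptions to tune for your own network and object sizes.

import boto3
from botocore.config import Config
from boto3.s3.transfer import TransferConfig

BUCKET = "example-data-bucket"   # placeholder; acceleration requires a bucket name without dots
LARGE_FILE = "backup-2024.tar"   # placeholder local file

# Enable S3 Transfer Acceleration on the bucket (one-time configuration).
boto3.client("s3").put_bucket_accelerate_configuration(
    Bucket=BUCKET,
    AccelerateConfiguration={"Status": "Enabled"},
)

# Client that sends requests through the accelerated endpoint.
s3_accel = boto3.client("s3", config=Config(s3={"use_accelerate_endpoint": True}))

# Multipart upload is used automatically above the threshold.
transfer_config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,  # switch to multipart above 64 MiB
    multipart_chunksize=64 * 1024 * 1024,  # 64 MiB parts
    max_concurrency=8,                     # parallel part uploads
)

s3_accel.upload_file(LARGE_FILE, BUCKET, "backups/backup-2024.tar", Config=transfer_config)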

Best Practice Approach: Cost-Effective Resources


“Using the appropriate services, resources, and configurations for your workloads is key to cost savings. Workloads can change over time. Some services or features are more cost effective at different usage levels.”

  • Automatic Transitions: “Amazon S3 provides different storage classes, and you can create lifecycle rules to automatically move data to a more cost-effective class”
  • Intelligent Tiering: “S3 Intelligent-Tiering automatically moves objects based on access patterns to the storage tier that is most cost-effective”
  • Dual Benefits: Note how these features support both performance and cost-optimization best practices
  • S3 Inventory: “Use Amazon S3 Inventory to audit how Amazon S3 is being used to help make cost-effective choices about how your organization is using Amazon S3”
  • Cost Monitoring: Regular auditing helps identify optimization opportunities
  • Organizational Insights: Provides visibility into storage usage patterns across the organization

“By performing the analysis on each component over time and at projected usage, the workload remains cost-effective over its lifetime.”
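
As a sketch of how lifecycle rules can automate such transitions, the boto3 snippet below uses an illustrative bucket, prefix, and set of day counts; the values are assumptions, not recommendations.

import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-bucket"  # placeholder bucket name

# Move objects under logs/ to cheaper storage as they age, then expire them.
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-logs",
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "INTELLIGENT_TIERING"},
                    {"Days": 180, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": 730},
                # Clean up old versions if versioning is enabled on the bucket.
                "NoncurrentVersionExpiration": {"NoncurrentDays": 90},
            }
        ]
    },
)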

Best Practice Approach: Failure Management


Use Fault Isolation

Protect your workload through fault isolation

  • Select appropriate locations for multi-location deployment
  • Build layers of defense for resilience
  • Deploy workload components to multiple Availability Zones when possible

Expect Failures

Failures are inevitable

  • “Failures are a given, and everything will eventually fail over time” - Werner Vogels, CTO of Amazon.com
  • Use approach that builds layers of defense
  • Deploy to multiple Availability Zones whenever possible for high availability
  • 11 Nines Durability: “Amazon S3 is designed for 11 nines of durability to help ensure that data is not lost”
  • 4 Nines Availability: “Designed for 4 nines of availability so that you can rely on your data being accessible when needed”
  • Multi-AZ Storage: “Amazon S3 redundantly stores your objects on multiple Availability Zones in the Amazon S3 Region you designate”
  • Checksum Verification: “Amazon S3 regularly verifies the integrity of your data by using checksums”
  • Automatic Repair: Amazon S3 automatically detects and repairs any lost redundancy
  • Concurrent Failures: “Amazon S3 is designed to sustain concurrent device failures by quickly detecting and repairing any lost redundancy”
  • Backup Solution: Use Amazon S3 for backing up data to improve failure management of applications and data
  • Cross-Region Replication: Additional protection through replication across Regions; a replication sketch follows this list
  • Versioning: Protects against accidental deletion or modification
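
To make the replication point concrete, here is a minimal boto3 sketch of a cross-Region replication rule. It assumes hypothetical source and destination buckets that both already have versioning enabled and an existing IAM role that Amazon S3 can assume for replication.

import boto3

s3 = boto3.client("s3")
SOURCE_BUCKET = "example-data-bucket"                    # placeholder, versioning enabled
DEST_BUCKET_ARN = "arn:aws:s3:::example-data-bucket-dr"  # placeholder, in another Region
REPLICATION_ROLE_ARN = "arn:aws:iam::111122223333:role/example-s3-replication"  # placeholder

# Replicate every new object to the destination bucket in another Region.
s3.put_bucket_replication(
    Bucket=SOURCE_BUCKET,
    ReplicationConfiguration={
        "Role": REPLICATION_ROLE_ARN,
        "Rules": [
            {
                "ID": "replicate-all-to-dr",
                "Priority": 1,
                "Filter": {"Prefix": ""},  # empty prefix = apply to all objects
                "Status": "Enabled",
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {
                    "Bucket": DEST_BUCKET_ARN,
                    "StorageClass": "STANDARD_IA",  # cheaper class for the replica copy
                },
            }
        ],
    },
)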

Key Takeaways: Applying Well-Architected Framework to Storage

  • Data Protection: “Protecting data is a security best practice that Amazon S3 supports through these default configurations: encrypting objects, making objects private, blocking public access”
  • Access Control: “You can protect data in Amazon S3 by limiting access through IAM policies and enabling versioning”
  • Architecture Selection: “Selecting an architecture is a performance efficiency best practice that Amazon S3 supports through its ability to store massive amounts of unstructured data”
  • Performance Features: “Amazon S3 includes performance-improving options such as S3 Transfer Acceleration and multipart upload”
  • Resource Selection: “Selecting cost-effective resources is a cost-optimization best practice that Amazon S3 supports through features such as lifecycle policies, intelligent tiering, and Amazon S3 Inventory”
  • Ongoing Optimization: Regular analysis and adjustment of storage configurations maintain cost-effectiveness
  • Failure Management: “Failure management is a reliability best practice that Amazon S3 has been designed for through its durability and availability features”
  • Backup Strategy: “You can use Amazon S3 for backing up data to improve failure management of your applications and data”
  • Security actions: Enable default encryption on all buckets, implement the principle of least privilege for access controls, use IAM policies and bucket policies appropriately, enable versioning for critical data, and run regular security audits (for example, with AWS Trusted Advisor)
  • Performance actions: Choose appropriate storage classes based on access patterns, implement S3 Transfer Acceleration for global users, and use multipart upload for large files
  • Cost actions: Leverage S3 Intelligent-Tiering for unknown access patterns, set up lifecycle policies to transition data to cheaper storage classes, use S3 Inventory to monitor and analyze storage usage, run regular cost analysis and optimization reviews, and consider data transfer costs in architecture decisions
  • Reliability actions: Leverage built-in S3 durability and availability features, implement cross-region replication for critical data, use Amazon S3 as part of a backup and disaster recovery strategy, and regularly test data recovery procedures

The AWS Well-Architected Framework provides a comprehensive approach to designing storage solutions that are secure, performant, cost-effective, and reliable, with Amazon S3 offering built-in features that support all these pillars.