AWS Certified Solutions Architect Associate SAA-C03 Practice Question
A company needs to store thousands of data files generated daily by hundreds of sensors. Each sensor sends a small file roughly every minute. To minimize its Amazon S3 storage costs, what is the BEST strategy for the company to follow?
Batch multiple files together before uploading to reduce the number of PUT requests to S3.
Configure S3 Intelligent-Tiering to automatically move the files to the most cost-efficient tier.
Enable S3 Transfer Acceleration on the bucket to optimize the upload speed of the files.
Upload each file individually to ensure immediate availability and processing in S3.
Batching multiple files into a single object before uploading is the most cost-effective strategy when dealing with a large number of small files. Amazon S3 charges per request in addition to per gigabyte stored, so with thousands of tiny files arriving every day, the PUT request charges can easily exceed the cost of the storage itself. Batching sharply reduces the number of PUT requests and therefore the request charges. Uploading each file individually maximizes the request count, increasing cost disproportionately compared to the actual storage used. S3 Transfer Acceleration only speeds up transfers over long distances and adds its own per-gigabyte charge, and S3 Intelligent-Tiering optimizes the storage class over time; neither addresses the dominant cost driver here, which is the sheer number of PUT requests.
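To see why request charges dominate, consider an illustrative (assumed) workload: 300 sensors each writing one file per minute generate roughly 13 million PUT requests per month; at the S3 Standard rate of about $0.005 per 1,000 PUT requests, that is on the order of $65 per month in request charges alone, likely far more than the storage cost of the small files themselves. Below is a minimal sketch of the batching approach, assuming the sensor files land in a local staging directory and using boto3; the bucket name, paths, and tar/gzip packaging are illustrative assumptions, not a prescribed implementation.

```python
# Sketch: batch many small sensor files into one archive and upload it
# with a single PUT request, instead of one PUT per file.
# The bucket name, directory paths, and .json file pattern are assumptions.
import tarfile
import time
from pathlib import Path

import boto3

BUCKET = "example-sensor-data"          # hypothetical bucket name
INCOMING_DIR = Path("/data/incoming")   # hypothetical local staging directory


def batch_and_upload(s3_client, batch_files):
    """Pack a list of small files into one .tar.gz and upload it with a single PUT."""
    archive_name = f"sensor-batch-{int(time.time())}.tar.gz"
    archive_path = Path("/tmp") / archive_name

    # Combine all pending files into one compressed archive.
    with tarfile.open(archive_path, "w:gz") as archive:
        for file_path in batch_files:
            archive.add(str(file_path), arcname=file_path.name)

    # One PUT request for the whole batch instead of len(batch_files) PUTs.
    s3_client.upload_file(str(archive_path), BUCKET, f"batches/{archive_name}")

    # Remove local copies once the batch is safely stored in S3.
    for file_path in batch_files:
        file_path.unlink()
    archive_path.unlink()


if __name__ == "__main__":
    s3 = boto3.client("s3")
    pending = sorted(INCOMING_DIR.glob("*.json"))  # assume sensors write JSON files
    if pending:
        batch_and_upload(s3, pending)
```

The length of the batching window is a trade-off: a longer window means fewer PUT requests and larger objects, but the data becomes available in S3 later.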