AWS S3 Bucket Misconfigurations and Exploitation

Akash Venky
6 min read · Oct 13, 2022



What are Amazon S3 (Simple Storage Service) Buckets?

An Amazon S3 bucket is a public cloud storage resource available in Amazon Web Services’ (AWS) Simple Storage Service (S3), an object storage offering. Amazon S3 buckets, which are similar to file folders, store objects, which consist of data and its descriptive metadata.

S3 Storage Classes in AWS

S3 Storage Classes can be configured at the object level, and a single bucket can contain objects stored across multiple storage classes, including:

S3 Standard

S3 Intelligent-Tiering

S3 Standard-IA

S3 One Zone-IA

How to find S3 Buckets:

You can use many open-source tools available on GitHub to find the S3 buckets of a website. A few of them are listed below:

  1. Lazy S3
  2. bucket_finder
  3. AWS Cred Scanner
  4. sandcastle
  5. Mass3
  6. Dumpster Diver
  7. S3 Bucket Finder

Some methods to identify S3-buckets are:

  • Look at the HTTP response for a Server header that says AmazonS3.
  • Request a random URL that doesn’t exist and see whether it returns an S3-style 404 (with or without static website hosting enabled) containing AccessDenied or NoSuchKey.
  • The DNS entry of the domain might reveal the bucket name directly if the host points straight to S3.
  • Try accessing the root URL. If index listing is enabled (public READ on the bucket ACL), you will see the bucket name in the <Name> element.
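The header and error-page checks above can be expressed as a small helper. The sketch below inspects a response's headers and body offline; the marker strings are the ones named in the bullets above, and any real usage would first fetch the response with an HTTP client of your choice.

```python
def looks_like_s3(headers, body):
    """Heuristically decide whether an HTTP response came from S3.

    headers: dict of response headers; body: response body text.
    """
    # S3 identifies itself in the Server header.
    if headers.get("Server", "") == "AmazonS3":
        return True
    # S3-style 403/404 bodies contain these error codes.
    s3_markers = ("<Code>NoSuchKey</Code>",
                  "<Code>AccessDenied</Code>",
                  "<Code>NoSuchBucket</Code>")
    return any(marker in body for marker in s3_markers)

# Example: a response with the S3 Server header
print(looks_like_s3({"Server": "AmazonS3"}, ""))  # True
```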

======================Next Method========================

If you find a domain that is pointing to a bucket but cannot get the bucket name, try the actual fully qualified domain name (FQDN) as the bucket name; it is a common setup to name the bucket after the domain that points to it.

If this doesn’t work, try to:

  • Google the domain and see if any history of it exposes the bucket name.
  • Look at the response headers of objects in the bucket to see if they have metadata that reveals the bucket name.
  • Look at the content and see if it refers to any bucket. We’ve seen instances where assets are tagged with the bucket name and the date they were deployed.
  • Brute-force. Be nice here: don’t shoot thousands of requests against S3 just to find a bucket. Try to be clever, based on the name of the domain pointing to it and the actual reason the bucket exists. If the bucket contains audio files for ACME on the domain media.acme.edu, try media.acme.edu, acme-edu-media, acme-audio or acme-media.
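The naming heuristics above can be turned into a small candidate generator. The permutation patterns below (the FQDN itself, hyphen-joined labels, organisation label plus purpose keywords) are assumptions extrapolated from the acme.edu example, not an exhaustive list.

```python
def candidate_bucket_names(fqdn, keywords=()):
    """Generate likely bucket names for a domain such as media.acme.edu."""
    labels = fqdn.split(".")                    # ["media", "acme", "edu"]
    candidates = [fqdn]                         # the FQDN itself is common
    candidates.append("-".join(labels))         # media-acme-edu
    candidates.append("-".join(labels[1:] + labels[:1]))  # acme-edu-media
    # Combine the organisation label with purpose keywords (audio, media, ...)
    org = labels[1] if len(labels) > 1 else labels[0]
    for kw in keywords:
        candidates.append(f"{org}-{kw}")        # acme-audio, acme-media
    # Deduplicate while preserving order
    return list(dict.fromkeys(candidates))

print(candidate_bucket_names("media.acme.edu", ["audio", "media"]))
```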

If the response for $bucket.s3.amazonaws.com shows NoSuchBucket, you know the bucket doesn’t exist. An existing bucket will give you either ListBucketResult or AccessDenied.
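That three-way check can be written as a tiny classifier over the response body. This is a sketch that keys only on the code strings named above; the return labels are made up for illustration.

```python
def classify_bucket_response(body):
    """Classify an S3 response body for $bucket.s3.amazonaws.com."""
    if "NoSuchBucket" in body:
        return "nonexistent"        # the bucket does not exist
    if "ListBucketResult" in body:
        return "exists-listable"    # bucket exists and listing is public
    if "AccessDenied" in body:
        return "exists-denied"      # bucket exists but is not listable
    return "unknown"

print(classify_bucket_response("<Error><Code>NoSuchBucket</Code></Error>"))
```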

======================Next Method========================

1. HTML inspection

Inspect the HTML code of the web application under test. You might find S3 URLs directly in the HTML code or in the resources loaded by the page, saving you the trouble of hunting for the buckets.

2. Brute-force

A brute-force approach, possibly based on a wordlist of common words along with specific words coming from the domain you’re testing, might also do the trick.

For example, we can use Burp Intruder to perform a series of requests to the URL http://s3.amazonaws.com/[bucketname]. This endpoint responds with a convenient PermanentRedirect message when the bucket exists and a NoSuchBucket message otherwise.

In the Intruder tab, set http://s3.amazonaws.com as the target host, then move to the Positions tab, set up a simple GET request, and place the payload position right after the / character of the request path.

Proceed to the Payloads section and load your wordlist; finally, move to Options and in the Grep - Match panel add a single match for the word PermanentRedirect. This will help in identifying and sorting the results of the attack.

Now press the Start attack button and the intruder will start performing requests and collecting results of possible buckets.
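The same brute-force can be scripted without Burp. The sketch below uses only the Python standard library and keys on the PermanentRedirect marker described above; the wordlist and endpoint behaviour are as stated in the text, and per the earlier advice, run it sparingly.

```python
import urllib.error
import urllib.request

def is_hit(body):
    """A bucket exists if the path-style URL returns PermanentRedirect."""
    return "PermanentRedirect" in body

def brute_force(wordlist):
    """Probe http://s3.amazonaws.com/<name> for each candidate name."""
    hits = []
    for name in wordlist:
        url = f"http://s3.amazonaws.com/{name}"
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                body = resp.read().decode("utf-8", "replace")
        except urllib.error.HTTPError as e:
            # S3 answers with an XML error body we still want to inspect
            body = e.read().decode("utf-8", "replace")
        if is_hit(body):
            hits.append(name)
    return hits

# Usage (network access required; hypothetical names):
# print(brute_force(["acme-media", "acme-backup"]))
```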

3. Google Dork

Google always comes to the rescue when it comes to searching for URLs. You can use the convenient inurl: directive to search for potentially interesting AWS S3 buckets. The following Google dorks can be used to retrieve potentially juicy AWS S3 buckets.

inurl:s3.amazonaws.com/legacy/

inurl:s3.amazonaws.com/uploads/

inurl:s3.amazonaws.com/backup/

inurl:s3.amazonaws.com/mp3/

inurl:s3.amazonaws.com/movie/

inurl:s3.amazonaws.com/video/

inurl:s3.amazonaws.com

4. DNS Caching

There are many services out there maintaining some sort of DNS caching that can be queried by users. By taking advantage of such services it is possible to hunt down AWS S3 buckets.

Few examples are below:

https://findsubdomains.com/

https://www.robtex.com/

https://buckets.grayhatwarfare.com/ (created specifically to collect AWS S3 buckets)

5. Bing reverse IP

Microsoft’s Bing search engine can be very helpful in identifying AWS S3 buckets, given its ability to search for domains by IP address. Given the IP address of a known AWS S3 bucket, Bing’s “ip:[IP]” feature makes it possible to retrieve many other AWS S3 buckets resolving to the same IP.

E.g., in the Bing search bar, type: ip:x.x.x.x

===============================================================

Types of exploitation attacks possible on an S3 bucket

  1. Full anonymous access (via the --no-sign-request flag)
  2. Arbitrary file listing (ls)
  3. Malicious file uploads
  4. Copying, moving, or deleting the bucket’s sensitive data
  5. Arbitrary reads/writes of objects
  6. The bucket reveals its ACP/ACL

How to Exploit a Misconfigured AWS S3 Bucket

  1. Check for exposed private/public S3 buckets (you can use Burp extensions, Keyfinder, KeyHunter, TokenFinder, etc.)
  2. Install and configure the AWS Command Line Interface (AWS CLI): pip install awscli, then aws --version to verify the installation, then e.g. aws s3 ls s3://bucket-name --no-sign-request
  3. The AWS CLI has object operations such as cp (copy), mv (move), and rm (delete)
  4. Directory and S3 prefix operations: sync, mb, rb, ls, website, presign
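The anonymous-access steps above boil down to unauthenticated requests against the bucket. The sketch below performs the equivalent of aws s3 ls --no-sign-request using only the standard library, with the XML parsing split out so it can be checked offline; the bucket name in the usage note is a placeholder.

```python
import urllib.request
import xml.etree.ElementTree as ET

# S3 list responses use this XML namespace
S3_NS = "{http://s3.amazonaws.com/doc/2006-03-01/}"

def parse_listing(xml_text):
    """Extract (bucket name, object keys) from a ListBucketResult document."""
    root = ET.fromstring(xml_text)
    name = root.findtext(f"{S3_NS}Name")
    keys = [el.text for el in root.iter(f"{S3_NS}Key")]
    return name, keys

def list_bucket(bucket):
    """Unauthenticated listing, like `aws s3 ls s3://bucket --no-sign-request`."""
    url = f"https://{bucket}.s3.amazonaws.com/"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return parse_listing(resp.read().decode("utf-8"))

# Usage (only works against a misconfigured, publicly listable bucket):
# print(list_bucket("some-public-bucket"))
```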

Boom! The S3 bucket has now been exploited — but only if the target private/public bucket is misconfigured.

======================Few Key Notes on AWS S3 Buckets======================

All S3 bucket URLs look like this: https://S3_Bucket_NAME.s3.amazonaws.com

Buckets can also be identified with https://github.com/FishermansEnemy/bucket_finder

What is S3 lifecycle?

An S3 Lifecycle configuration is an XML file consisting of a set of rules with predefined actions that you want Amazon S3 to perform on objects during their lifetime. You can also configure the lifecycle using the Amazon S3 console, REST API, AWS SDKs, and the AWS Command Line Interface (AWS CLI).
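As an illustration, a minimal lifecycle configuration might look like the following sketch; the rule ID, prefix, and day counts are placeholders.

```xml
<LifecycleConfiguration>
  <Rule>
    <ID>archive-logs</ID>
    <Filter>
      <Prefix>logs/</Prefix>
    </Filter>
    <Status>Enabled</Status>
    <!-- Move objects to a cheaper storage class after 30 days -->
    <Transition>
      <Days>30</Days>
      <StorageClass>STANDARD_IA</StorageClass>
    </Transition>
    <!-- Delete objects after a year -->
    <Expiration>
      <Days>365</Days>
    </Expiration>
  </Rule>
</LifecycleConfiguration>
```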

Is S3 global or region specific?

Amazon S3 supports global buckets, which means that each bucket name must be unique across all AWS accounts in all the AWS Regions within a partition.

Who is S3 bucket owner?

By default, an S3 object is owned by the AWS Account that uploaded the object. S3 Object Ownership gives you a simple bucket setting that changes this default behavior, so that new objects uploaded with the bucket-owner-full-control access control list (ACL) will instead be owned by you.

How do I transfer data between S3 buckets?

To copy objects from one S3 bucket to another, follow these steps:

  1. Create a new S3 bucket.
  2. Install and configure the AWS Command Line Interface (AWS CLI).
  3. Copy the objects between the S3 buckets. …
  4. Verify that the objects were copied.
  5. Update existing API calls to the target bucket name.

Suggestions are most welcome.

Please write a mail to Akash.venky091@gmail.com. You can also follow me here (Akash Venky) for more updates on security and ethical hacking, or contact me at https://www.linkedin.com/in/akash-h-c-4a4090a7/
