[Django]-Boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden

114👍

I’m using Amazon IAM for the key ID and access key in question and bumped into the same 403 Forbidden. It turns out you need to grant permissions that target both the bucket root and its sub-objects:

{
  "Statement": [
    {
      "Principal": {
          "AWS": "*"
      },
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": ["arn:aws:s3:::bucket-name/*", "arn:aws:s3:::bucket-name"]
    }
  ]
}
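
If you are attaching this as a bucket policy (rather than as an IAM user policy), boto can apply it directly. A minimal sketch, assuming the connecting credentials are allowed to call s3:PutBucketPolicy and that bucket-name stands in for your real bucket:

import json
import boto

policy = {
    "Statement": [
        {
            "Principal": {"AWS": "*"},
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::bucket-name/*",
                "arn:aws:s3:::bucket-name",
            ],
        }
    ]
}

conn = boto.connect_s3()                               # uses the default credential chain
bucket = conn.get_bucket('bucket-name', validate=False)
bucket.set_policy(json.dumps(policy))                  # requires s3:PutBucketPolicy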
👤AKX

51👍

I would recommend testing your AWS credentials separately to verify whether they actually have permission to read and write data to the S3 bucket. The following should work:

>>> import boto
>>> s3 = boto.connect_s3('<access_key>', '<secret_key>')
>>> bucket = s3.lookup('donebox-static')
>>> key = bucket.new_key('testkey')
>>> key.set_contents_from_string('This is a test')
>>> key.exists()
>>> key.delete()

You should try the same test with the other bucket (‘donebox-media’). If this works, the permissions are correct and the problem lies in the Django storages code or configuration. If this fails with a 403 then either:

  • The access_key/secret_key strings are incorrect
  • The access_key/secret_key are correct but that account doesn’t have the necessary permissions to write to the bucket

I hope that helps. Please report back your findings.
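
To make that check repeatable, here is a rough sketch that runs the same round-trip against both buckets and reports the HTTP status on failure (the bucket names are the ones from the question; adjust them to yours):

import boto
from boto.exception import S3ResponseError

s3 = boto.connect_s3('<access_key>', '<secret_key>')
for name in ('donebox-static', 'donebox-media'):
    try:
        bucket = s3.get_bucket(name)                      # raises S3ResponseError on 403/404
        key = bucket.new_key('testkey')
        key.set_contents_from_string('This is a test')    # write
        print('%s: write OK, exists=%s' % (name, key.exists()))
        key.delete()                                       # clean up the test object
    except S3ResponseError as e:
        # A 403 here points at the credentials or their permissions,
        # not at django-storages.
        print('%s: failed with %s %s' % (name, e.status, e.reason))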

46👍

I had the same problem and finally discovered that the real cause was the server time: it was misconfigured, and AWS rejects requests signed with a badly skewed clock with a 403 Forbidden.

On Debian you can set the clock automatically using NTP:

ntpdate 0.pool.ntp.org
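
If you want to confirm that clock skew is the culprit before changing anything, here is a quick sketch in Python (assumes outbound HTTPS access to the S3 endpoint; AWS rejects signatures when the clock is more than about 15 minutes off, reporting RequestTimeTooSkewed as a 403):

import time
import httplib   # Python 2 stdlib, matching boto; use http.client on Python 3
from email.utils import parsedate_tz, mktime_tz

# Compare the local clock with the Date header returned by the S3 endpoint.
conn = httplib.HTTPSConnection('s3.amazonaws.com')
conn.request('HEAD', '/')
server_time = mktime_tz(parsedate_tz(conn.getresponse().getheader('Date')))
print('clock skew: %.1f seconds' % abs(time.time() - server_time))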

8👍

This will also happen if your machine’s time settings are incorrect.

3👍

In case this helps anyone, I had to add the following configuration entry for collectstatic to work and not return 403:

AWS_DEFAULT_ACL = ''
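
For context, a minimal sketch of where that entry sits among the usual django-storages (s3boto backend) settings; the other names shown are the standard ones and only AWS_DEFAULT_ACL is the fix here, presumably because the empty ACL stops the upload from sending the x-amz-acl header that was being denied:

# settings.py
AWS_ACCESS_KEY_ID = '<access_key>'
AWS_SECRET_ACCESS_KEY = '<secret_key>'
AWS_STORAGE_BUCKET_NAME = '<bucket>'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_DEFAULT_ACL = ''   # empty ACL so collectstatic uploads stop returning 403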
👤Danra

3👍

It is also possible that the wrong credentials are being used. To verify:

import boto
s3 = boto.connect_s3('<your access key>', '<your secret key>')
bucket = s3.get_bucket('<your bucket>') # does this work?
s3 = boto.connect_s3()
s3.aws_access_key_id  # is the same key being used by default?

If not, take a look at ~/.boto, ~/.aws/config and ~/.aws/credentials.
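
If you are unsure which of those files is in play, a small diagnostic sketch (Python 2 stdlib, matching boto; the file names are the ones listed above):

import os
from ConfigParser import ConfigParser

# Report which local credential files exist.
for path in ('~/.boto', '~/.aws/config', '~/.aws/credentials'):
    full = os.path.expanduser(path)
    print('%s: %s' % (path, 'present' if os.path.exists(full) else 'missing'))

# ~/.boto uses an INI [Credentials] section; show the key id it would supply.
cfg = ConfigParser()
cfg.read(os.path.expanduser('~/.boto'))
if cfg.has_option('Credentials', 'aws_access_key_id'):
    print(cfg.get('Credentials', 'aws_access_key_id'))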

👤kat

1👍

Here is a refinement with minimal permissions.
In all cases, as discussed elsewhere, s3:ListAllMyBuckets is required on all buckets.

In its default configuration django-storages uploads files to S3 with public-read permissions – see the django-storages Amazon S3 backend documentation.

Trial and error revealed that in this default configuration the only two permissions required are s3:PutObject, to upload a file in the first place, and s3:PutObjectAcl, to set that object’s permissions to public.

No additional actions are required because, from that point on, the object is publicly readable anyway.

IAM User Policy – public-read (default):

{
   "Version": "2012-10-17",
   "Statement": [
       {
           "Effect": "Allow",
           "Action": "s3:ListAllMyBuckets",
           "Resource": "arn:aws:s3:::*"
       },
       {
           "Effect": "Allow",
           "Action": [
               "s3:PutObject",
               "s3:PutObjectAcl"
           ],
           "Resource": "arn:aws:s3:::bucketname/*"
       }
   ]
}
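
For reference, the default django-storages upload amounts to roughly the following boto calls (a sketch of the equivalent operations, not the actual django-storages code; bucket and key names are placeholders), which is why those two actions are the only ones needed:

import boto

conn = boto.connect_s3()
bucket = conn.get_bucket('bucketname', validate=False)
key = bucket.new_key('static/example.css')
key.set_contents_from_string('body {}', policy='public-read')  # s3:PutObject + s3:PutObjectAcl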

It is not always desirable to have objects publicly readable; private objects are configured by setting the relevant property in the settings file.

Django settings.py:

...
AWS_DEFAULT_ACL = "private"
...

With that setting, s3:PutObjectAcl is no longer required and the minimal permissions are as follows:

IAM User Policy – private:

{
   "Version": "2012-10-17",
   "Statement": [
       {
           "Effect": "Allow",
           "Action": "s3:ListAllMyBuckets",
           "Resource": "arn:aws:s3:::*"
       },
       {
           "Effect": "Allow",
           "Action": [
               "s3:PutObject",
               "s3:GetObject"
           ],
           "Resource": "arn:aws:s3:::bucketname/*"
       }
   ]
}
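
Again as a rough sketch of the equivalent boto calls (not the actual django-storages code; names are placeholders): a private upload sets no ACL, and reading the object back is what needs s3:GetObject.

import boto

conn = boto.connect_s3()
bucket = conn.get_bucket('bucketname', validate=False)
key = bucket.new_key('media/report.txt')
key.set_contents_from_string('example contents')   # s3:PutObject, no ACL header
content = key.get_contents_as_string()             # s3:GetObject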

0👍

Another solution that avoids custom policies by using one of AWS’s predefined (managed) policies:

  • Add S3 full access permissions to your S3 user.

    • Go to IAM / Users / Permissions and choose Attach Policy
    • Attach the policy “AmazonS3FullAccess”

0👍

Maybe you actually don’t have access to the bucket you’re trying to look up/get/create.

Remember: bucket names have to be unique across the entire S3 ecosystem, so if you try to access (look up/get/create) a bucket named ‘test’, it almost certainly already belongs to someone else and you will have no access to it.
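
You can tell the two cases apart from the status code; a quick sketch (‘test’ is just an example of a name that is almost certainly taken):

import boto
from boto.exception import S3ResponseError

s3 = boto.connect_s3()
try:
    s3.get_bucket('test')
    print('Bucket exists and your credentials can access it')
except S3ResponseError as e:
    if e.status == 403:
        print('Bucket exists but belongs to another account (hence the 403)')
    elif e.status == 404:
        print('Bucket name is free')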
