[Fixed] Django fixture fails, stating "DatabaseError: value too long for type character varying(50)"

8👍

✅

Well, what makes the difference is the encoding of the template databases. On the production server they had ASCII encoding, while on the dev box it was UTF-8.

By default, Postgres creates a database using template1. My understanding is that if template1's encoding is not UTF-8, then the database you create will have this issue, even if you create it with UTF-8 encoding.

Therefore I dropped template1 and recreated it with its encoding set to UTF-8. The snippet below does it (taken from here):

psql -U postgres

-- allow connections to template0 so we can work from it while template1 is dropped
UPDATE pg_database SET datallowconn = TRUE WHERE datname = 'template0';
\c template0
-- unmark, drop and recreate template1 with UTF-8 encoding
UPDATE pg_database SET datistemplate = FALSE WHERE datname = 'template1';
DROP DATABASE template1;
CREATE DATABASE template1 WITH TEMPLATE = template0 ENCODING = 'UNICODE';
UPDATE pg_database SET datistemplate = TRUE WHERE datname = 'template1';
\c template1
-- lock template0 down again
UPDATE pg_database SET datallowconn = FALSE WHERE datname = 'template0';

Now the fixture loads smoothly.

👤shanyu

10👍

Update: the 50-character limit was raised to 255 in Django 1.8.

—

Original answer:

I encountered this just this afternoon, too, and I have a fix (of sorts).

This post here implied it’s a Django bug to do with the length of the value allowed in auth_permission. Further digging backs up that idea, as does this Django ticket (even though it’s initially MySQL-related).

It’s basically that a permission name is created based on the verbose_name of a model plus a descriptive permission string, and that can overflow to more than the 50 chars allowed in auth.models.Permission.name.

To quote a comment on the Django ticket:

The longest prefixes for the string value in the column auth_permission.name are “Can change ” and “Can delete ”, both with 11 characters. The column maximum length is 50, so the maximum length of Meta.verbose_name is 39.
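For example, a quick check like the one below (the verbose_name is invented) shows how the generated permission names can blow past 50 characters:

# Sketch: Django's default permission names are "Can add/change/delete <verbose_name>".
verbose_name = "a really long and descriptive name for my model"  # 47 characters

for action in ("add", "change", "delete"):
    permission_name = "Can %s %s" % (action, verbose_name)
    print(len(permission_name), permission_name)  # "Can change ..." comes to 58, over the varchar(50) limit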

One solution would be to hack that column to support > 50 characters (ideally via a South migration, I say, so that it’s easily repeatable) but the quickest, most reliable fix I could think of was simply to make my extra-long verbose_name definition a lot shorter (from 47 chars in the verbose_name to around 20). All works fine now.
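A minimal sketch of that shortening (the model and names here are made up):

from django.db import models

class ProjectMembershipAuditRecord(models.Model):
    class Meta:
        # Keep verbose_name at 39 characters or fewer so the longest default
        # permission name, "Can change <verbose_name>", still fits in varchar(50).
        verbose_name = "membership audit record"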

👤Steve Jalim

3👍

Get the real SQL query on both systems and see what is different.
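One way to see Django's side of it (a sketch; query logging only fires with DEBUG = True) is to turn on logging for django.db.backends in settings.py and rerun loaddata on both systems:

# settings.py sketch: echo every SQL statement Django sends to the console
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {'class': 'logging.StreamHandler'},
    },
    'loggers': {
        'django.db.backends': {
            'handlers': ['console'],
            'level': 'DEBUG',
        },
    },
}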

👤Frank Heikens

2👍

Just for information: I also had this error:

DatabaseError: value too long for type character varying(10)

It seems I was writing data over the 10-character limit of a field. I fixed it by increasing the CharField's max_length from 10 to 20.
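For example (a sketch; the model and field names are made up):

from django.db import models

class Voucher(models.Model):
    # Widened from max_length=10, which the incoming data was overflowing
    code = models.CharField(max_length=20)

Remember that the underlying column has to be widened too (via a schema migration or an ALTER TABLE), since the error is raised by the database itself.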

I hope it helps

👤luc

1👍

As @stevejalim says, it’s quite possible that the column auth_permission.name, with its length of 50, is the problem; you can verify this with \d+ auth_permission in Postgres’s shell. In my case this was indeed the problem, so when I loaded the Django model fixtures I got “DatabaseError: value too long for type character varying(50)”. Changing django.contrib.auth’s Permission model is complicated, so the simplest fix was to alter the column directly by running ALTER TABLE auth_permission ALTER COLUMN name TYPE VARCHAR(100); in Postgres’s shell. This worked for me.

Credit to this comment.
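If you would rather keep that ALTER TABLE in version control, roughly the same thing can be written as a RunSQL migration (a sketch; the app label, dependency, and file name are assumptions):

# yourapp/migrations/0002_widen_permission_name.py (hypothetical path)
from django.db import migrations

class Migration(migrations.Migration):

    dependencies = [
        ('yourapp', '0001_initial'),  # assumption: point this at your app's latest migration
    ]

    operations = [
        migrations.RunSQL(
            sql="ALTER TABLE auth_permission ALTER COLUMN name TYPE VARCHAR(100);",
            reverse_sql="ALTER TABLE auth_permission ALTER COLUMN name TYPE VARCHAR(50);",
        ),
    ]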

👤geoom

1👍

You can make Django use longer fields for this model by monkey-patching the model prior to using it to create the database tables. In “manage.py”, change:

if __name__ == "__main__":
    execute_manager(settings)

to:

from django.contrib.auth.models import Permission
if __name__ == "__main__":
    # Patch the field width to allow for our long model names
    Permission._meta.get_field('name').max_length = 200
    Permission._meta.get_field('codename').max_length = 200
    execute_manager(settings)

This modifies the options on the field before (say) manage.py syncdb is run, so the database table has nice wide varchar() fields. You don’t need to do this when invoking your app, as you never attempt to modify the Permissions table while running.
