[Django]-Django.db.utils.DatabaseError: out of shared memory

I have had a quick google and there are a couple of resources to look at:

  1. postgresql: out of shared memory?
  2. http://www.databasesoup.com/2012/06/postgresqlconf-maxlockspertransaction.html

This quote is taken from http://www.postgresql.org/docs/9.1/static/runtime-config-locks.html:

max_locks_per_transaction (integer)

The shared lock table tracks locks on max_locks_per_transaction * (max_connections + max_prepared_transactions) objects (e.g., tables);
hence, no more than this many distinct objects can be locked at any
one time. This parameter controls the average number of object locks
allocated for each transaction; individual transactions can lock more
objects as long as the locks of all transactions fit in the lock
table. This is not the number of rows that can be locked; that value
is unlimited. The default, 64, has historically proven sufficient, but
you might need to raise this value if you have clients that touch many
different tables in a single transaction. This parameter can only be
set at server start.

Increasing this parameter might cause PostgreSQL to request more System V shared memory than your operating system’s default
configuration allows. See Section 17.4.1 for information on how to
adjust those parameters, if necessary.

When running a standby server, you must set this parameter to the same or higher value than on the master server. Otherwise, queries
will not be allowed in the standby server.
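The formula quoted above can be worked through with the stock defaults to see how big the shared lock table actually is. A minimal sketch, assuming the standard PostgreSQL default values for all three parameters:

```python
# Lock-table capacity formula from the PostgreSQL docs:
#   max_locks_per_transaction * (max_connections + max_prepared_transactions)
# The values below are the PostgreSQL defaults, used here purely for illustration.
max_locks_per_transaction = 64   # default
max_connections = 100            # default
max_prepared_transactions = 0    # default

# Total number of distinct objects (tables, indexes, sequences, ...)
# that can hold locks at once across the whole cluster:
lock_table_slots = max_locks_per_transaction * (
    max_connections + max_prepared_transactions
)
print(lock_table_slots)  # 6400
```

So with the defaults there are 6400 slots shared by all sessions; a single migration or query touching thousands of tables (or many partitions) can exhaust them and trigger the "out of shared memory" error.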

So, it looks like you should check the current value of max_locks_per_transaction to make sure it is set to a sensible level.
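For example, you can inspect the current settings from psql and, if needed, raise the limit in postgresql.conf (the value 128 below is just an illustrative choice, not a recommendation; the change requires a server restart):

```
-- In psql, check the current values:
SHOW max_locks_per_transaction;
SHOW max_connections;

# In postgresql.conf, raise the limit and restart the server:
max_locks_per_transaction = 128
```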

Likewise, if you are running a default install of Postgres, it is worth reviewing the other defaults as well, since in some cases they are set too low for production workloads.
