27 votes
I think the

for key in x: cache.delete(key)

approach is pretty good and concise. delete really wants one key at a time, so you have to loop.

Otherwise, this previous question and answer point you to a Lua-based solution.
62 votes
Use SCAN iterators: https://pypi.python.org/pypi/redis

for key in r.scan_iter("prefix:*"):
    r.delete(key)
40 votes
Here is a full working example using py-redis:

from redis import StrictRedis
cache = StrictRedis()

def clear_ns(ns):
    """
    Clears a namespace.
    :param ns: str, namespace, e.g. your:prefix
    :return: int, number of cleared keys
    """
    count = 0
    ns_keys = ns + '*'
    for key in cache.scan_iter(ns_keys):
        cache.delete(key)
        count += 1
    return count
You could also use scan_iter to collect all the keys into memory first and then pass them to delete in one bulk call, but that may take a good chunk of memory for larger namespaces, so it is probably best to run a delete per key.

Cheers!
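The bulk-delete variant described above could be sketched like this (clear_ns_bulk is a hypothetical helper name, assuming a redis-py-style client; it materializes all matching keys in memory before one DEL, which is the memory trade-off mentioned):

```python
def clear_ns_bulk(client, ns):
    """Delete every key matching ns + '*' with a single DEL call.

    :param client: a redis-py client, e.g. StrictRedis()
    :param ns: str, namespace prefix, e.g. 'your:prefix'
    :return: int, number of keys deleted
    """
    # scan_iter walks the keyspace lazily, but list() pulls every
    # matching key into memory before the single bulk delete.
    keys = list(client.scan_iter(ns + '*'))
    if not keys:  # DEL with zero arguments is an error in Redis
        return 0
    return client.delete(*keys)
```

The empty-list guard matters because `DEL` requires at least one argument, a pitfall a later answer here also points out.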
UPDATE:
Since writing this answer, I started using the pipelining feature of Redis to send all commands in one request and avoid network latency:
from redis import StrictRedis
cache = StrictRedis()

def clear_cache_ns(ns):
    """
    Clears a namespace in the Redis cache.
    This may be very time consuming.
    :param ns: str, namespace, e.g. your:prefix*
    :return: int, number of cleared keys
    """
    count = 0
    pipe = cache.pipeline()
    for key in cache.scan_iter(ns):
        pipe.delete(key)
        count += 1
    pipe.execute()
    return count
UPDATE2 (Best Performing):
If you use scan instead of scan_iter, you can control the chunk size and iterate through the cursor with your own logic. This also seems to be a lot faster, especially when dealing with many keys. If you add pipelining on top, you get a further 10-25% performance boost, depending on chunk size, at the cost of memory usage, since you do not send the execute command to Redis until everything is queued. So I stuck with scan:
from redis import StrictRedis
cache = StrictRedis()
CHUNK_SIZE = 5000

def clear_ns(ns):
    """
    Clears a namespace.
    :param ns: str, namespace, e.g. your:prefix
    :return: True when done
    """
    cursor = '0'
    ns_keys = ns + '*'
    while cursor != 0:
        cursor, keys = cache.scan(cursor=cursor, match=ns_keys, count=CHUNK_SIZE)
        if keys:
            cache.delete(*keys)
    return True
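The scan-plus-pipeline combination mentioned above might look roughly like this (a sketch with a made-up name, clear_ns_pipe, not the author's exact code; deletes are only queued during the scan and sent in one round trip at the end):

```python
def clear_ns_pipe(client, ns, chunk_size=5000):
    """Scan the namespace in chunks, queueing one DEL per chunk on a pipeline.

    :param client: a redis-py client, e.g. StrictRedis()
    :param ns: str, namespace prefix, e.g. 'your:prefix'
    :return: int, number of keys queued for deletion
    """
    pipe = client.pipeline()
    cursor = 0
    count = 0
    while True:
        cursor, keys = client.scan(cursor=cursor, match=ns + '*',
                                   count=chunk_size)
        if keys:
            pipe.delete(*keys)  # queued locally, nothing sent yet
            count += len(keys)
        if cursor == 0:  # SCAN signals completion with cursor 0
            break
    pipe.execute()  # one network round trip for all queued DELs
    return count
```

Because nothing is deleted until execute(), the scan itself iterates over a keyspace that is not shrinking underneath it, which also sidesteps the delete-while-scanning issue described in a later answer.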
Here are some benchmarks:
5k chunks using a busy Redis cluster:
Done removing using scan in 4.49929285049
Done removing using scan_iter in 98.4856731892
Done removing using scan_iter & pipe in 66.8833789825
Done removing using scan & pipe in 3.20298910141
5k chunks and a small idle dev redis (localhost):
Done removing using scan in 1.26654982567
Done removing using scan_iter in 13.5976779461
Done removing using scan_iter & pipe in 4.66061878204
Done removing using scan & pipe in 1.13942599297
11 votes
From the documentation:

delete(*names)
    Delete one or more keys specified by names

This just wants an argument per key to delete, and then it tells you how many of them were found and deleted.

In the case of your code above, I believe you can just do:

redis.delete(*x)

But I will admit I am new to Python and I just do:

deleted_count = redis.delete('key1', 'key2')
9 votes
Btw, with django-redis you can use the following (from https://niwinz.github.io/django-redis/latest/):
from django.core.cache import cache
cache.delete_pattern("foo_*")
7 votes
The cache.delete(*keys) solution of Dirk works fine, but make sure keys isn't empty, to avoid a redis.exceptions.ResponseError: wrong number of arguments for 'del' command.

If you are sure that you will always get a result:

cache.delete(*cache.keys('prefix:*'))
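If you are not sure, a guarded wrapper avoids the error (delete_by_pattern is a hypothetical helper name; like the one-liner above it relies on KEYS, which blocks the server while it walks the whole keyspace, so it suits small or dev instances rather than production):

```python
def delete_by_pattern(client, pattern):
    """KEYS-based delete that tolerates an empty match set.

    :param client: a redis-py client, e.g. StrictRedis()
    :param pattern: str, glob pattern, e.g. 'prefix:*'
    :return: int, number of keys deleted (0 when nothing matched)
    """
    keys = client.keys(pattern)
    if not keys:  # DEL requires at least one key argument
        return 0
    return client.delete(*keys)
```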
5 votes
You can use a specific pattern to match all keys and delete them:

import redis

client = redis.Redis(host='192.168.1.106', port=6379,
                     password='pass', decode_responses=True)
for key in client.keys('prefix:*'):
    client.delete(key)
4 votes
According to my test, it costs too much time if I use the scan_iter solution (as Alex Toderita wrote).

Therefore, I prefer to use:

from redis.exceptions import ResponseError

try:
    redis_obj.eval('''return redis.call('del', unpack(redis.call('keys', ARGV[1])))''', 0, 'prefix:*')
except ResponseError:
    pass

The prefix:* is the pattern.

Refers to: https://stackoverflow.com/a/16974060
0 votes
Use delete_pattern: https://niwinz.github.io/django-redis/latest/
from django.core.cache import cache
cache.delete_pattern("prefix:*")
0 votes
The answer suggested by @radtek is not working for me, since the keys are getting deleted while iterating, which leads to unexpected behavior. Here's an example:

from redis import StrictRedis
cache = StrictRedis()

for i in range(0, 10000):
    cache.set(f'test_{i}', 1)

cursor = '0'
SCAN_BATCH_SIZE = 5000
while cursor != 0:
    cursor, keys = cache.scan(cursor=cursor, match='test_*', count=SCAN_BATCH_SIZE)
    if keys:
        cache.delete(*keys)

## Iteration 1
# cursor=5000, keys=['test_0', ..., 'test_4999']
# keys will get deleted

## Iteration 2
# cursor=0, keys=[]
# No remaining keys are found. The reason: only 5000 entries are
# left after the deletion, and the cursor position is already at
# 5000, so no keys are returned.
You can use a Redis pipeline to solve this issue, as mentioned below:

from redis import StrictRedis
cache = StrictRedis()

for i in range(0, 10000):
    cache.set(f'test_{i}', 1)

pipe = cache.pipeline()
cursor = None
SCAN_BATCH_SIZE = 5000
while cursor != 0:
    cursor, keys = cache.scan(cursor=cursor or 0, match='test_*', count=SCAN_BATCH_SIZE)
    if keys:
        pipe.delete(*keys)
pipe.execute()