
Should we auto reconnect to redis? #167

Open

botzill opened this issue May 1, 2020 · 2 comments


botzill commented May 1, 2020

I get the following error while the consumer spider is running:

RedisMixin.spider_idle of <MySpider 'my_spider' at 0x7f5c2cb335c0>>
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/scrapy/utils/signal.py", line 32, in send_catch_log
    *arguments, **named)
  File "/usr/local/lib/python3.6/dist-packages/pydispatch/robustapply.py", line 55, in robustApply
    return receiver(*arguments, **named)
  File "/usr/local/lib/python3.6/dist-packages/scrapy_redis/spiders.py", line 121, in spider_idle
    self.schedule_next_requests()
  File "/usr/local/lib/python3.6/dist-packages/scrapy_redis/spiders.py", line 116, in schedule_next_requests
    self.crawler.engine.crawl(req, spider=self)
  File "/usr/local/lib/python3.6/dist-packages/scrapy/core/engine.py", line 216, in crawl
    self.schedule(request, spider)
  File "/usr/local/lib/python3.6/dist-packages/scrapy/core/engine.py", line 222, in schedule
    if not self.slot.scheduler.enqueue_request(request):
  File "/usr/local/lib/python3.6/dist-packages/scrapy_redis/scheduler.py", line 167, in enqueue_request
    self.queue.push(request)
  File "/usr/local/lib/python3.6/dist-packages/scrapy_redis/queue.py", line 104, in push
    self.server.execute_command('ZADD', self.key, score, data)
  File "/usr/local/lib/python3.6/dist-packages/redis/client.py", line 901, in execute_command
    return self.parse_response(conn, command_name, **options)
  File "/usr/local/lib/python3.6/dist-packages/redis/client.py", line 915, in parse_response
    response = connection.read_response()
  File "/usr/local/lib/python3.6/dist-packages/redis/connection.py", line 730, in read_response
    response = self._parser.read_response()
  File "/usr/local/lib/python3.6/dist-packages/redis/connection.py", line 321, in read_response
    raw = self._buffer.readline()
  File "/usr/local/lib/python3.6/dist-packages/redis/connection.py", line 256, in readline
    self._read_from_socket()
  File "/usr/local/lib/python3.6/dist-packages/redis/connection.py", line 201, in _read_from_socket
    raise ConnectionError(SERVER_CLOSED_CONNECTION_ERROR)
redis.exceptions.ConnectionError: Connection closed by server.
2020-05-01 07:38:27 [scrapy.core.engine] INFO: Closing spider (finished)

I think we need to reconnect to redis if we have cases like this.
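One possible workaround until scrapy-redis handles this itself is to tune the underlying redis-py client so stale connections are detected before a command is sent. This is a sketch, not a confirmed fix; the keys below are redis-py connection options (health_check_interval requires redis-py 3.3+) and can be passed through scrapy-redis's REDIS_PARAMS setting:

```python
# Sketch: redis-py client options aimed at surviving idle disconnects.
# Put this in the Scrapy project's settings.py; scrapy-redis forwards
# REDIS_PARAMS as keyword arguments to the redis client.
REDIS_PARAMS = {
    "socket_keepalive": True,       # enable TCP keepalives on the connection
    "health_check_interval": 30,    # PING before reuse if idle for > 30s
    "socket_timeout": 30,           # fail fast instead of hanging on a dead socket
    "retry_on_timeout": True,       # retry a command once on timeout
}
```

This does not retry on a `ConnectionError` that surfaces mid-command, but the health check makes it far less likely that a long-idle connection is handed back to the scheduler.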

@LuckyPigeon (Collaborator)

@botzill
How often does this occur?

rmax (Owner) commented Oct 27, 2020

I think it makes sense to add some configurable retry behaviour for known cases.
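A minimal sketch of what such configurable retrying might look like; the helper name and parameters are hypothetical, not existing scrapy-redis API. The idea is to wrap a Redis operation, retry it on a configurable set of exceptions with exponential backoff, and re-raise once the retry budget is exhausted:

```python
import time


def call_with_retries(func, exceptions, retries=3, base_delay=0.5):
    """Invoke func(), retrying on the given exception types.

    Sleeps base_delay * 2**attempt between attempts and re-raises
    the last error once the retry budget is exhausted.
    """
    for attempt in range(retries + 1):
        try:
            return func()
        except exceptions:
            if attempt == retries:
                raise
            time.sleep(base_delay * 2 ** attempt)


# Hypothetical usage inside the scheduler: the failing call from the
# traceback, self.queue.push(request), would become something like
#     call_with_retries(lambda: self.queue.push(request),
#                       (redis.exceptions.ConnectionError,))
```

Since redis-py re-establishes the connection on the next command after a `ConnectionError`, a simple retry like this is usually enough to recover from a server-side disconnect.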
