SeaweedFS periodically becomes unreachable on port 8888, and crawler files go missing #1451
Labels: bug (Something isn't working)
Comments
Did you persist the master node's data?
This problem is solved: I found that everything works once I switch the Docker image version to 0.6.3. However, sometimes I have to `git pull` twice before the code is fetched correctly; the first pull loses files, and the second pull brings them all back.
To be specific: switch the Crawlab Docker image to version 0.6.3, and do not use the latest image.
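The version pin described above can be expressed in a compose file. This is a minimal sketch; the image name `crawlabteam/crawlab` and the service name `master` are assumptions for illustration, not taken from this thread — use whatever names your existing compose file already has:

```yaml
# Sketch: pin the Crawlab image to a fixed tag instead of :latest.
# Image and service names below are assumptions, not from the thread.
services:
  master:
    image: crawlabteam/crawlab:0.6.3   # pinned tag, not :latest
```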
No matter how I sync via git, it doesn't help.
I found a workaround: mount a shared directory, so git only needs to sync once, which is also less work. Example setup: volumes:
Then prefix the crawler's command with root/***.
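A minimal sketch of the shared-volume workaround above, assuming a docker-compose deployment; the host path `./spiders` and the container path `/root/spiders` are illustrative choices, not taken from the thread:

```yaml
# Sketch: mount one shared host directory into both the master and
# worker containers so git only has to sync the code once.
# Service names and paths below are assumptions for illustration.
services:
  master:
    volumes:
      - ./spiders:/root/spiders   # shared spider code
  worker:
    volumes:
      - ./spiders:/root/spiders   # same directory, same mount point
```

The crawler's execute command then refers to the code by its in-container path (the thread prefixes it with root/***).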
Bug description
The spider code is cloned via Git with auto-pull enabled. Roughly every 2 days, the logs start showing "File not Found". Checking the Docker logs shows "Connection Refused" on SeaweedFS's port 8888.
Steps to reproduce
The steps to reproduce this bug are as follows
Expected result
SeaweedFS works normally, without the periodic "Connection Refused" errors.