
Logs from "files" processor are capped at 256000 bytes #1202

Open
jimmyw opened this issue Oct 13, 2022 · 3 comments

@jimmyw commented Oct 13, 2022

Describe the bug
Logs from "files" processor are capped at 256000 bytes

To Reproduce
Steps to reproduce the behavior:

  1. Create a job that uses the "files" processor and produces more than 256000 bytes of output

Expected behavior
There is plenty of disk available for logs; there should be no cap.

**Specifications:**

  • OS: Linux
  • Version: 3.2.1


@vcastellm (Member)

This is a limitation of the executor (https://github.com/distribworks/dkron/blob/master/builtin/bins/dkron-executor-shell/shell.go#L27), not the processor. There are several ways of fixing this; off the top of my head:

  • Change the architecture so that processors run on the executor's streaming output instead of on a buffered result (see the sketch below)
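
A minimal sketch of that streaming idea in Go, assuming the processor can be modeled as a per-line callback; `processChunk`, `runStreaming`, and the example command are hypothetical names for illustration, not dkron's actual plugin API:

```go
package main

import (
	"bufio"
	"fmt"
	"os/exec"
)

// processChunk stands in for a log processor invocation; here the processor is
// a plain callback rather than dkron's real plugin interface (hypothetical).
type processChunk func(line []byte) error

// runStreaming executes a command and hands each output line to the processor
// as it is produced, so memory use is bounded by the line size rather than by
// a fixed 256000-byte capture buffer.
func runStreaming(command string, process processChunk) error {
	cmd := exec.Command("/bin/sh", "-c", command)
	stdout, err := cmd.StdoutPipe()
	if err != nil {
		return err
	}
	if err := cmd.Start(); err != nil {
		return err
	}
	// Note: bufio.Scanner's default 64 KiB per-line limit would need raising
	// via scanner.Buffer for jobs that emit very long lines.
	scanner := bufio.NewScanner(stdout)
	for scanner.Scan() {
		if err := process(scanner.Bytes()); err != nil {
			return err
		}
	}
	if err := scanner.Err(); err != nil {
		return err
	}
	return cmd.Wait()
}

func main() {
	// Example: a command whose output would exceed the current cap.
	_ = runStreaming("seq 1 100000", func(line []byte) error {
		fmt.Printf("processor got %d bytes\n", len(line))
		return nil
	})
}
```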

@cobolbaby (Contributor)

> Change the architecture so that processors run on the executor's streaming output instead of on a buffered result

The design should make this easier.

@jimmyw (Author) commented Apr 24, 2023

Streaming out the logs is the correct way; buffering in RAM is wrong in so many ways...
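
For reference, a minimal sketch of streaming straight to disk, assuming the executor is free to write output to a file as it arrives; `streamToFile` and the `/tmp/job-output.log` path are made up for illustration and are not dkron's actual code:

```go
package main

import (
	"log"
	"os"
	"os/exec"
)

// streamToFile runs a command with stdout and stderr wired directly to a file
// on disk, so the log grows on disk instead of in a bounded in-memory buffer.
func streamToFile(command, path string) error {
	logFile, err := os.Create(path)
	if err != nil {
		return err
	}
	defer logFile.Close()

	cmd := exec.Command("/bin/sh", "-c", command)
	cmd.Stdout = logFile
	cmd.Stderr = logFile
	return cmd.Run()
}

func main() {
	// Example: capture far more than 256000 bytes with constant memory use.
	if err := streamToFile("seq 1 1000000", "/tmp/job-output.log"); err != nil {
		log.Fatal(err)
	}
}
```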
