
Change default chunk payload string length #1039

Merged: 1 commit into master, Dec 9, 2024

Conversation

smallhive (Contributor):

Closes #1032.

We should try it out.

```diff
@@ -37,7 +40,7 @@ var (
 // The chunkedReader returns io.EOF when the final 0-length chunk is read.
 func NewChunkedReader(r io.ReadCloser, streamSigner *ChunkSigner) io.ReadCloser {
 	return &chunkedReader{
-		r: bufio.NewReader(r),
+		r: bufio.NewReaderSize(r, MaxChunkedPayloadLineLength),
```
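For context on the change above: bufio.NewReader always allocates the default 4096-byte buffer, while bufio.NewReaderSize lets the caller pick the size. A minimal sketch of the difference; the MaxChunkedPayloadLineLength value below is a stand-in constant, since the real value is defined elsewhere in the gateway and not shown in this diff:

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// Stand-in for the constant this PR references; the real value is defined in
// the gateway code and is not shown in this diff.
const MaxChunkedPayloadLineLength = 64 * 1024

func main() {
	def := bufio.NewReader(strings.NewReader("x")) // default 4096-byte buffer
	big := bufio.NewReaderSize(strings.NewReader("x"), MaxChunkedPayloadLineLength)

	fmt.Println(def.Size(), big.Size()) // 4096 65536
}
```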
Member:

I don't think it's a fix. I can always create something that is larger than this limit and make the gateway panic, which is unacceptable.

smallhive (Contributor Author):

I'm not sure either, but we can try.

```diff
@@ -199,7 +202,7 @@ func readChunkLine(b *bufio.Reader) ([]byte, []byte, error) {
 		}
 		return nil, nil, err
 	}
-	if len(p) >= maxLineLength {
+	if len(p) >= MaxChunkedPayloadLineLength {
```
Member:

And here it's too late; the panic happens earlier than that. The real question is why ReadSlice doesn't return ErrBufferFull above.
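For reference, the documented behavior of bufio.Reader.ReadSlice is to return bufio.ErrBufferFull, not to panic, when the buffer fills before the delimiter is found, which is what makes the panic surprising. A small standalone check, unrelated to the gateway code:

```go
package main

import (
	"bufio"
	"errors"
	"fmt"
	"strings"
)

func main() {
	// 16-byte buffer (the minimum bufio allows) and a line far longer than that.
	r := bufio.NewReaderSize(strings.NewReader(strings.Repeat("a", 64)+"\n"), 16)

	p, err := r.ReadSlice('\n')
	// ReadSlice hands back the full buffer plus ErrBufferFull; it does not panic.
	fmt.Println(len(p), errors.Is(err, bufio.ErrBufferFull)) // 16 true
}
```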

smallhive (Contributor Author):

Moving these lines up is not a solution either, because the panic happens inside bufio.ReadSlice, so we don't get a chance to check the length.

Member:

I know, but we also need to know why it panics. It shouldn't.

@roman-khimov (Member):

It could in fact be related to multithreading. Why does it happen in a separate goroutine? You can use https://pkg.go.dev/io#TeeReader instead of wrapReader.
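A minimal sketch of that suggestion, assuming wrapReader only exists to observe the bytes flowing through the reader; the hashCollector sink below is a hypothetical stand-in, not the gateway's real signer type:

```go
package main

import (
	"bufio"
	"io"
	"strings"
)

// hashCollector is a hypothetical stand-in for whatever wrapReader feeds
// (a signer, a hash, a counter); it is not the gateway's real type.
type hashCollector struct{ n int }

func (h *hashCollector) Write(p []byte) (int, error) {
	h.n += len(p)
	return len(p), nil
}

func main() {
	src := strings.NewReader("chunked payload bytes")
	sink := &hashCollector{}

	// Every byte read through tee is also written to sink, synchronously and
	// on the same goroutine, so no separate wrapper reader is needed.
	tee := io.TeeReader(src, sink)
	_, _ = io.Copy(io.Discard, bufio.NewReader(tee))
}
```

Because io.TeeReader writes to the sink on the reading goroutine, it avoids handing the stream to a separate routine at all.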

Signed-off-by: Evgenii Baidakov <evgenii@nspcc.io>
@smallhive force-pushed the 1032-panic-under-high-load branch from 327cf91 to 19fb4cc on December 6, 2024, 11:29.
@smallhive (Contributor Author):

Regarding wrapReader: I don't think this fix will change the situation, but it looks better.

Maybe this panic is connected to golang/go#22330.
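One way a panic like this can appear only under high load is a data race: bufio.Reader is not safe for concurrent use, and if two goroutines ever touch the same reader its internal offsets can be corrupted. A deliberately incorrect sketch (hypothetical, not the gateway's actual structure) that go run -race flags:

```go
package main

import (
	"bufio"
	"strings"
	"sync"
)

func main() {
	// Deliberately wrong: one *bufio.Reader shared by two goroutines.
	// bufio.Reader is not safe for concurrent use, and this kind of sharing
	// can corrupt its internal offsets.
	r := bufio.NewReader(strings.NewReader(strings.Repeat("line\n", 1000)))

	var wg sync.WaitGroup
	for i := 0; i < 2; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := 0; j < 1000; j++ {
				if _, err := r.ReadSlice('\n'); err != nil {
					return
				}
			}
		}()
	}
	wg.Wait()
}
```

The race detector reports the unsynchronized access here; whether it actually panics depends on timing, which would be consistent with a bug that only shows up under load.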

@roman-khimov merged commit 4e9ea88 into master on Dec 9, 2024.
15 of 18 checks passed
@roman-khimov deleted the 1032-panic-under-high-load branch on December 9, 2024, 14:31.
Successfully merging this pull request may close these issues: Panic under high load (#1032).