
Always export batch when limit reached #519

Merged
merged 1 commit into main from always-export-batch-when-limit-reached on Apr 14, 2021

Conversation

jtescher (Member)

Currently spans will begin to drop if more than the max queue size are generated within a given batch window. This change causes the batch to export when full, allowing more spans to be exported within a given window even if the scheduled export time has not yet been reached.

Resolves #468
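
To make the intent concrete, here is a minimal sketch of the behavior this change introduces. The names (`SpanData`, `spans`, `max_export_batch_size`, `on_span_end`, `export_batch`) are placeholders for illustration, not the actual items in `span_processor.rs`:

```rust
// Illustrative sketch only; names are hypothetical, not the actual
// fields/functions in opentelemetry/src/sdk/trace/span_processor.rs.
struct SpanData;

struct BatchProcessor {
    spans: Vec<SpanData>,
    max_export_batch_size: usize,
}

impl BatchProcessor {
    fn on_span_end(&mut self, span: SpanData) {
        self.spans.push(span);
        // Previously the buffer was only drained on the scheduled export
        // tick, so spans beyond the queue limit were dropped within a
        // window. With this change, a full batch is exported immediately.
        if self.spans.len() >= self.max_export_batch_size {
            self.export_batch();
        }
    }

    fn export_batch(&mut self) {
        let batch: Vec<SpanData> = self.spans.drain(..).collect();
        // Hand `batch` to the configured exporter here.
        let _ = batch;
    }
}
```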

@jtescher jtescher requested a review from a team as a code owner April 14, 2021 01:43
@codecov

codecov bot commented Apr 14, 2021

Codecov Report

Merging #519 (faec528) into main (607b32a) will decrease coverage by 0.1%.
The diff coverage is 72.2%.


@@           Coverage Diff           @@
##            main    #519     +/-   ##
=======================================
- Coverage   51.5%   51.3%   -0.2%     
=======================================
  Files         95      95             
  Lines       8531    8522      -9     
=======================================
- Hits        4394    4380     -14     
- Misses      4137    4142      +5     
Impacted Files                                           Coverage Δ
opentelemetry/src/sdk/trace/span_processor.rs            80.7% <72.2%> (-1.5%) ⬇️
opentelemetry/src/global/error_handler.rs                44.4% <0.0%> (-5.6%) ⬇️
...ntelemetry/src/sdk/metrics/aggregators/ddsketch.rs    79.4% <0.0%> (-0.2%) ⬇️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 607b32a...faec528. Read the comment docs.

@TommyCpp TommyCpp (Contributor) left a comment


Overall looks good 👍. I wonder if it makes sense to move the background job into a dedicated function and add some comments to explain how it works.

@jtescher (Member Author)

I wonder if it makes sense to move the background job into a dedicated function and add some comments to explain how it works

Open to suggestions. It's relatively simple (a loop with 3 branches based on message type), so moving the closure into a function doesn't bring all that much clarity. It might be easier to reason about if the three inner blocks were functions, but most of what they do is error handling, and the function calls would add some indirection, making the code a bit harder to follow. Happy to do that, though, if others find it easier to read and reason about that way. A rough sketch of the loop shape is below.
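
For reference, here is a rough sketch of that three-branch loop over a plain channel. The message variant names (`ExportSpan`, `Flush`, `Shutdown`) and the helper names are assumptions for illustration, not the crate's actual types:

```rust
use std::sync::mpsc::Receiver;

// Hypothetical placeholder for the span payload type.
struct SpanData;

// Assumed message variants; the crate's actual enum may differ.
enum BatchMessage {
    ExportSpan(SpanData),
    Flush,    // scheduled timer tick
    Shutdown,
}

fn background_loop(rx: Receiver<BatchMessage>, max_export_batch_size: usize) {
    let mut buffer: Vec<SpanData> = Vec::new();
    // One loop, three branches, one per message type.
    while let Ok(message) = rx.recv() {
        match message {
            BatchMessage::ExportSpan(span) => {
                buffer.push(span);
                // New behavior: export as soon as the batch is full.
                if buffer.len() >= max_export_batch_size {
                    export(buffer.drain(..).collect());
                }
            }
            BatchMessage::Flush => {
                // Scheduled tick: export whatever has accumulated.
                export(buffer.drain(..).collect());
            }
            BatchMessage::Shutdown => {
                // Final export, then stop the worker.
                export(buffer.drain(..).collect());
                break;
            }
        }
    }
}

fn export(batch: Vec<SpanData>) {
    // In the real processor this calls the configured exporter and
    // reports failures through the global error handler.
    let _ = batch;
}
```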

@jtescher (Member Author)

OK, I'll merge this for now; we can think about cleanup in a subsequent issue or PR, since it shouldn't be a breaking change.

@jtescher jtescher merged commit 1cdf725 into main Apr 14, 2021
@jtescher jtescher deleted the always-export-batch-when-limit-reached branch April 14, 2021 16:57
Development

Successfully merging this pull request may close these issues.

batchexporter doesnt export spans when batch size is reached