
prune or re-encode activesets from previous epochs #5114

Closed
dshulyak opened this issue Sep 30, 2023 · 0 comments
dshulyak (Contributor)

Stored activesets take ~530MB, while the total db size is 1.4GB.

They contain a lot of duplicate data, so there is an opportunity to dedup them, either at the gossip or the database level:

  • delta encoding: reference an original activeset plus the new/removed ids
  • store a single collection of ids per epoch, ordered by receive time; every new activeset stores a bitfield with a 1 for each ATX that is present in its own activeset
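The bitfield option above can be sketched roughly as follows. This is a minimal illustration, not go-spacemesh code; the function names and the use of plain strings as ATX ids are assumptions for the example:

```go
package main

import "fmt"

// encodeBitfield maps an activeset onto the epoch's receive-ordered ATX list,
// setting bit i when the i-th ATX of the epoch is part of the activeset.
// Hypothetical helper; not an actual go-spacemesh API.
func encodeBitfield(epochATXs []string, activeset map[string]bool) []byte {
	bits := make([]byte, (len(epochATXs)+7)/8)
	for i, id := range epochATXs {
		if activeset[id] {
			bits[i/8] |= 1 << (i % 8)
		}
	}
	return bits
}

// decodeBitfield reconstructs the activeset from the bitfield, given the same
// receive-ordered epoch collection that was used for encoding.
func decodeBitfield(epochATXs []string, bits []byte) []string {
	var out []string
	for i, id := range epochATXs {
		if bits[i/8]&(1<<(i%8)) != 0 {
			out = append(out, id)
		}
	}
	return out
}

func main() {
	epoch := []string{"atx1", "atx2", "atx3", "atx4"}
	activeset := map[string]bool{"atx1": true, "atx3": true}
	bits := encodeBitfield(epoch, activeset)
	// One byte instead of two full 32-byte ids: the dedup win grows with
	// the number of activesets that share the same epoch collection.
	fmt.Println(decodeBitfield(epoch, bits))
}
```

The storage win comes from amortization: the epoch's id collection is stored once, and each activeset shrinks from a list of 32-byte ids to roughly one bit per ATX in the epoch.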

The simplest option would be to prune them; Iddo shared that we may want to hold off on that:

About pruning, the only reason it might not be ok is if we have a protocol rule that measures the median observed total weight. What Tal described in the call about not rewarding if the active set weight is too small (but above the hardcoded minimal weight) was about ephemeral data, but we also have an ongoing discussion about how to prevent a DoS attack (regarding your own weight being too low in the far future after genesis), and the issue of observed weight is relevant in that context too. So we should be careful before deciding to prune active sets (deduplication is benign and we should do it asap).

The duplication is caused by unreliable propagation of ATXs, which is expected to improve with #5097, so it needs to be rechecked next epoch.

@dshulyak dshulyak self-assigned this Oct 17, 2023
bors bot pushed a commit that referenced this issue Oct 19, 2023
closes: #5114

Pruning activesets will significantly reduce space (as of now ~1.5GB of 2.9GB).
@lrettig lrettig changed the title prune or re-encode acitvesets from previous epochs prune or re-encode activesets from previous epochs Oct 19, 2023
@bors bors bot closed this as completed in 21c602e Oct 19, 2023