
Track performance metrics of child processes #171

Open

infogulch opened this issue Apr 15, 2023 · 2 comments
infogulch commented Apr 15, 2023

It would be nice to track and report various metrics for child processes created with assert_cmd: the kind of process metrics you might collect with /usr/bin/time or perf, such as instruction count, page faults, maximum resident memory, I/O counts, and context switches.
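For what it's worth, on Unix most of these numbers come from the `rusage` struct that `wait4(2)` fills in when the child exits; that's where /usr/bin/time gets them. Here's a minimal Unix-only sketch using the libc crate, entirely outside assert_cmd. (Instruction counts are notably absent from `rusage`; those need something like `perf_event_open` on Linux.)

```rust
// Unix-only sketch: reap a child with wait4(2) and read its rusage
// counters, the same source /usr/bin/time uses. Requires the libc crate.
// Note: ru_maxrss is reported in kilobytes on Linux, bytes on macOS.
use std::process::Command;

fn main() -> std::io::Result<()> {
    let child = Command::new("ls").arg("-l").spawn()?;
    let pid = child.id() as libc::pid_t;

    let mut status: libc::c_int = 0;
    // wait4 reaps the child and fills `usage` with its resource counters.
    let mut usage: libc::rusage = unsafe { std::mem::zeroed() };
    let reaped = unsafe { libc::wait4(pid, &mut status, 0, &mut usage) };
    assert_eq!(reaped, pid, "wait4 failed to reap the child");

    println!("max RSS:              {}", usage.ru_maxrss);
    println!("minor page faults:    {}", usage.ru_minflt);
    println!("major page faults:    {}", usage.ru_majflt);
    println!("block input ops:      {}", usage.ru_inblock);
    println!("block output ops:     {}", usage.ru_oublock);
    println!("voluntary ctx switches:   {}", usage.ru_nvcsw);
    println!("involuntary ctx switches: {}", usage.ru_nivcsw);
    Ok(())
}
```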

Some resources I found:

Does anyone have a strong disposition for or against tracking performance metrics of processes started with assert_cmd?


epage commented Apr 17, 2023

Could we do it in a cross-platform way?

What would we then do with these metrics? At this time, libtest doesn't really let us report extra information like this so it can be bubbled up to the user. Reporting it directly gets a bit weird because of parallel test runs.


infogulch commented Apr 17, 2023

I guess the question of what to do with the metrics is up to the user, since this crate doesn't integrate directly with libtest (other than panicking on failed assertions, which is what libtest expects).

I see that libtest has the experimental Metric and MetricMap types, which could be useful here:

Maybe Command could expose a function for specifying a list of metrics to track, plus a way to export them after the command completes. Then the user can do whatever they want with them: if they're using the experimental Metrics API they could export them there, otherwise write them to a file or whatever.
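Something like this, maybe? All the names below are hypothetical; nothing like this exists in assert_cmd today:

```rust
// Hypothetical API sketch only; none of these names exist in assert_cmd.
use std::collections::BTreeMap;

/// Metrics the caller would *like* to have; availability is per-platform.
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Debug)]
pub enum Metric {
    MaxRss,
    MajorPageFaults,
    ContextSwitches,
    Instructions,
}

/// Whichever subset of the requested metrics the platform could provide.
pub type Metrics = BTreeMap<Metric, u64>;

/// Hypothetical extension trait for assert_cmd's Command.
pub trait TrackMetrics {
    /// Declare which metrics to collect before spawning the child.
    fn track_metrics(&mut self, wanted: &[Metric]) -> &mut Self;
    /// After the command completes, return the metrics that were
    /// actually collected on this platform.
    fn collected_metrics(&self) -> Metrics;
}
```

From there the user could insert the values into libtest's experimental MetricMap, serialize them to a file for a separate reporting step, and so on.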


I guess metrics will be platform-specific enough that it would be impractical to make the collection system generic across platforms. Maybe the user would provide a list of metrics they would "like" to track, and whichever of those are available on the current platform would be tracked.
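That negotiation could be as simple as filtering the wish list by compile-time platform support (again hypothetical, reusing the Metric enum sketched above):

```rust
// Hypothetical negotiation: filter the wish list down to what this
// platform can actually collect.
fn supported(wanted: &[Metric]) -> Vec<Metric> {
    wanted
        .iter()
        .copied()
        .filter(|m| match m {
            // rusage-backed counters are available on any Unix via wait4(2).
            Metric::MaxRss | Metric::MajorPageFaults | Metric::ContextSwitches => cfg!(unix),
            // Instruction counts need perf_event_open, which is Linux-only.
            Metric::Instructions => cfg!(target_os = "linux"),
        })
        .collect()
}
```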
