
ci: extracting files in scaling test takes too much time #8374

Closed
BugenZhao opened this issue Mar 6, 2023 · 6 comments · Fixed by #8431
Labels: component/ci, type/bug

Comments

BugenZhao commented Mar 6, 2023

Before #8326, on "scaling test (deterministic simulation)":

Extracted 63 files to /tmp/nextest-archive-oA0Tga in 7.60s

After it:

2023-03-06 12:00:11 UTC | Extracted 63 files to /tmp/nextest-archive-0zQhiA in 7m 36.09s

Not sure whether it's related to the debug info... 😟 cc @fuyufjh

It seems not relevant, though: ci-sim is based on the dev profile, and the artifacts are not built with build.sh.

@huangjw806

@liurenjie1024 has increased the timeout on main (#8409). If this turns out to be a bug, we need to change it back after we fix it.
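For context, a Buildkite step timeout is set with the `timeout_in_minutes` step attribute; the step label, script path, and value below are illustrative only, not the actual change in #8409:

```yaml
steps:
  - label: "scaling test (deterministic simulation)"
    command: "ci/scripts/run-scaling-test.sh"  # hypothetical script name
    # Raised to absorb the slow archive extraction;
    # should be reverted once the root cause is fixed.
    timeout_in_minutes: 25
```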

@BugenZhao

I believe it's a bug.


BugenZhao commented Mar 8, 2023

cc @xxchan @wangrunji0408 🥺


xxchan commented Mar 8, 2023

Can't reproduce locally.


xxchan commented Mar 8, 2023

I found the reason: every `cargo nextest run --archive-file` invocation extracts the archive again. When run in parallel, it OOMs, so it's naturally slow.
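The failure mode can be illustrated with plain `tar` standing in for nextest's zstd-compressed test archive (directory names and the shard count below are illustrative; only the 63-file count comes from the CI log):

```shell
set -eu
work=$(mktemp -d)

# Build a dummy archive with 63 files, matching the count in the CI log.
mkdir "$work/files"
for i in $(seq 1 63); do printf 'x' > "$work/files/f$i"; done
tar -cf "$work/archive.tar" -C "$work" files

# Slow pattern: every parallel test invocation re-extracts the whole
# archive into its own temporary directory, multiplying the I/O cost.
for shard in 1 2 3 4; do
  mkdir "$work/shard$shard"
  tar -xf "$work/archive.tar" -C "$work/shard$shard" &
done
wait

# Cheaper pattern: extract once up front and share the result.
mkdir "$work/shared"
tar -xf "$work/archive.tar" -C "$work/shared"
echo "extracted $(ls "$work/shared/files" | wc -l | tr -d ' ') files once"
```

With the real archive, the per-shard extractions above are what each concurrent `cargo nextest run --archive-file` performs implicitly.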


xxchan commented Mar 8, 2023

Upstream: nextest-rs/nextest#830
