
Conversation

@adisbladis
Member

Things done

See #155060 (comment) and onward for motivation.
Ideally these tests would get executed differently instead of being removed outright, but that's a task for the home-assistant maintainers.

  • Built on platform(s)
    • x86_64-linux
    • aarch64-linux
    • x86_64-darwin
    • aarch64-darwin
  • For non-Linux: Is sandboxing enabled in nix.conf? (See Nix manual)
    • sandbox = relaxed
    • sandbox = true
  • Tested, as applicable:
  • Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage
  • Tested basic functionality of all binary files (usually in ./result/bin/)
  • 25.05 Release Notes (or backporting 24.11 and 25.05 Release notes)
    • (Package updates) Added a release notes entry if the change is major or breaking
    • (Module updates) Added a release notes entry if the change is significant
    • (Module addition) Added a release notes entry if adding a new NixOS module
  • Fits CONTRIBUTING.md.

Add a 👍 reaction to pull requests you find important.

@github-actions github-actions bot added the labels 10.rebuild-darwin: 0 (This PR does not cause any packages to rebuild on Darwin) and 10.rebuild-linux: 0 (This PR does not cause any packages to rebuild on Linux) on Dec 11, 2024
@ofborg ofborg bot added the label 8.has: clean-up (This PR removes packages or removes other cruft) on Dec 11, 2024
@mweinelt
Member

mweinelt commented Dec 15, 2024

We created this workaround because people complained that building home-assistant in their nixpkgs-review runs was too expensive. With the current setup, only the rebuild count for home-assistant's non-optional dependencies is inflated by all of its test runs.

If we folded all tests back into home-assistant itself, then we'd be fixing the rebuild count but at the same time regressing the review experience for lots more pull requests than just uv.

Additionally, we'd lose the ability to granularly bisect component breakages.

I agree that what we have right now is a hack, but the status quo really has the smallest impact on everyone involved.

@mweinelt mweinelt requested review from Mic92 and dotlambda December 15, 2024 22:43
@adisbladis
Member Author

If we folded all tests back into home-assistant itself

That isn't the only option though. It's possible to have granularity without exposing it to the rest of the world. Afaik CI rebuild counts only include things exposed as their own attrs.

Additionally, we'd lose the ability to granularly bisect component breakages.

Even if home-assistant-component-tests is removed you can still access individual tests through home-assistant.tests.components.
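For example (a minimal sketch; mqtt is just an example component name), an individual component test can still be built directly from that attrset:

```nix
# Minimal sketch: build one component test straight from the passthru
# attrset ("mqtt" is only an example component name).
let
  pkgs = import <nixpkgs> { };
in
pkgs.home-assistant.tests.components.mqtt
```

Running nix-build on a file like the above builds just that component's test derivation.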

@mweinelt
Member

mweinelt commented Dec 17, 2024

Even if home-assistant-component-tests is removed you can still access individual tests through home-assistant.tests.components.

They wouldn't get run by Hydra that way, and Hydra's results provide a helpful good/bad starting point for bisects.

That isn't the only option though. It's possible to have granularity without exposing it to the rest of the world. Afaik CI rebuild counts only include things exposed as their own attrs.

That way they aren't run by nixpkgs-review, the de facto standard review tool right now. A proposal to run passthru tests is stuck: Mic92/nixpkgs-review#397.

@adisbladis
Member Author

adisbladis commented Jan 9, 2025

That way they aren't run by nixpkgs-review, the de facto standard review tool right now. A proposal to run passthru tests is stuck: Mic92/nixpkgs-review#397.

There are mechanisms other than passthru tests. You could create a meta derivation that depends on all of the components (see the sketch below).
There are possibly other ways to deal with it; I'm leaving that up to someone who is invested in the HA ecosystem.
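A minimal sketch of what such a meta derivation could look like, assuming home-assistant.tests.components is an attrset of test derivations (the attribute name and the use of linkFarm are illustrative, not a concrete proposal):

```nix
{ lib, linkFarm, home-assistant }:

# Aggregate all component tests behind a single attribute: building this
# one derivation forces every component test to be built, so Hydra and
# nixpkgs-review still exercise them, while only one extra attribute
# shows up in rebuild counts and package search.
linkFarm "home-assistant-component-tests" (lib.mapAttrsToList
  (name: drv: { inherit name; path = drv; })
  home-assistant.tests.components)
```

Exposed through callPackage as a single top-level attribute, this keeps the per-component granularity for bisecting while adding only one attribute to rebuild counts and package search.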

I agree that what we have right now is a hack, but the status quo really has the smallest impact on everyone involved

That's easy to say, but I don't think it's true. Having anything go through the staging workflow sucks big time, and this is nudging more packages to take that path.

@Mic92
Member

Mic92 commented Jan 11, 2025

@mweinelt so would just having passthru.tests built in nixpkgs-review be enough for your use case? Would it be fine if this only happened when home-assistant was explicitly selected?

@dotlambda
Member

Would it be fine if this only happened when home-assistant was explicitly selected?

No. We want to make sure the respective component test still passes when a Python library used by Home Assistant is updated. Adding the component tests to every library's passthru.tests would be a ton of work (there are hundreds of them and they keep changing).
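For context, the per-library wiring would look roughly like this (the paho-mqtt / mqtt pairing is only an illustrative example), and the mapping would have to be maintained by hand across hundreds of libraries:

```nix
# Hypothetical fragment of a Python library's package expression
# (paho-mqtt / mqtt is only an illustrative pairing): the library
# re-exports the Home Assistant component tests that exercise it,
# so nixpkgs-review rebuilds them whenever the library changes.
{ buildPythonPackage, home-assistant, ... }:

buildPythonPackage {
  pname = "paho-mqtt";
  # ... version, src, dependencies elided ...

  passthru.tests = {
    inherit (home-assistant.tests.components) mqtt;
  };
}
```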

@adisbladis
Member Author

adisbladis commented Feb 8, 2025

Unless there is some willingness from the HA maintainers to address this, I think this PR should get merged.
There doesn't seem to be any willingness to find a solution, so I suggest these hacks be removed.

As I pointed out in my comment, this isn't just inflating build results; it also makes package searches return packages that are 100% irrelevant to end users.

@adisbladis adisbladis closed this Feb 8, 2025
