
BUG: noarch package in multi-output recipe changes CPU architecture mid-build #5349

Open
h-vetinari opened this issue May 21, 2024 · 8 comments
Labels
type::bug describes erroneous operation, use severity::* to classify the type

Comments

@h-vetinari (Contributor)

Checklist

  • I added a descriptive title
  • I searched open reports and couldn't find a duplicate

What happened?

We recently split the pyarrow package into more bits and pieces (e.g. to distinguish a minimal, small-footprint variant from the full-featured "give me everything" one), and now we have a sandwich: the first and last packages in a telescopic chain of dependencies are per Python version, whereas the two empty metapackages in the middle could easily be noarch:

pyarrow-core    # depends on libarrow*;     per python version resp. CUDA/non-CUDA
pyarrow         # depends on pyarrow-core;  could be noarch
pyarrow-all     # depends on pyarrow;       could be noarch
pyarrow-tests   # depends on pyarrow-all;   per python version
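
For illustration, the output structure looks roughly like the following simplified sketch (this is not the actual feedstock recipe; versions, build strings, selectors and the CUDA split are omitted, and the use of noarch: generic plus plain pin_subpackage pins is just my shorthand here):

outputs:
  - name: pyarrow-core           # compiled bits; per Python version resp. CUDA/non-CUDA
    requirements:
      host:
        - python
        - libarrow
      run:
        - python
  - name: pyarrow                # empty metapackage, could be noarch
    build:
      noarch: generic
    requirements:
      run:
        - {{ pin_subpackage("pyarrow-core") }}
  - name: pyarrow-all            # empty metapackage, could be noarch
    build:
      noarch: generic
    requirements:
      run:
        - {{ pin_subpackage("pyarrow") }}
  - name: pyarrow-tests          # per Python version
    requirements:
      host:
        - python
      run:
        - python
        - {{ pin_subpackage("pyarrow-all") }}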

I've tried doing that in conda-forge/pyarrow-feedstock#119, and it works fine in native builds, but fails with nonsensical errors on aarch/ppc:

TEST END: /home/conda/feedstock_root/build_artifacts/linux-aarch64/pyarrow-core-16.0.0-py39ha95f412_1_cpu.conda
TEST START: /home/conda/feedstock_root/build_artifacts/noarch/pyarrow-16.0.0-hd8ed1ab_1.conda
WARNING: Multiple meta files found. The meta.yaml file in the base directory (/tmp/tmpnher07k4/info/recipe) will be used.
Reloading output folder (local): ...working... done
Solving environment (_test_env): ...working... failed
WARNING: failed to get package records, retrying.  exception was: Unsatisfiable dependencies for platform linux-64: {MatchSpec("16.0.0=*_1_"), MatchSpec("pyarrow==16.0.0=hd8ed1ab_1"), MatchSpec("pyarrow-core=16.0.0[build=*_1_*]")}
Encountered problems while solving:
  - nothing provides requested pyarrow-core 16.0.0.* *_1_*
  - nothing provides pyarrow-core 16.0.0 *_1_* needed by pyarrow-16.0.0-hd8ed1ab_1

When I say nonsensical, I mean that the TEST END: at the top just confirmed the local existence of an artefact that should match the requested pattern. Looking closer, what's happening is

TEST END: /home/conda/feedstock_root/build_artifacts/linux-aarch64/pyarrow-core-16.0.0-py39ha95f412_1_cpu.conda
[...]                                                ^^^^^^^^^^^^^
Unsatisfiable dependencies for platform linux-64: ...
                                        ~~~~~~~~

i.e. the build switches architecture when getting to the noarch output.

Note that all jobs in that PR want to build:

noarch/pyarrow-16.0.0-hd8ed1ab_1.conda
noarch/pyarrow-all-16.0.0-hd8ed1ab_1.conda

which is the goal: we need to build these in each job in order to successfully build the per-Python pyarrow-tests on top, but only one build would be uploaded in the end across all jobs, due to the (intentional!) hash collision.

Conda Info

No response

Conda Config

No response

Conda list

No response

Additional Context

No response

h-vetinari added the type::bug label on May 21, 2024
github-project-automation bot moved this to 🆕 New in 🧭 Planning on May 21, 2024
@h-vetinari (Contributor, Author)

The old logs are lost (so it's harder to analyze what changed), but it seems that this now works...? 🤔

@jaimergp (Contributor)

Intriguing. I would have assumed that #5350 was responsible for the fix, but that's not released yet :/

@h-vetinari (Contributor, Author)

aaaaand, it broke again (the only difference was whether or not sparse was included as a test dependency). I'll recheck after #5350 is released.

@h-vetinari (Contributor, Author)

Can confirm that with conda-build installed from main, this now passes 🥳

@h-vetinari (Contributor, Author)

Closing this as fixed by #5350

github-project-automation bot moved this from 🆕 New to 🏁 Done in 🧭 Planning on Jul 23, 2024
@h-vetinari (Contributor, Author)

Spoke too soon, it seems; at least when installing conda-build from the main branch here, conda-forge/pyarrow-feedstock#119 broke again.

h-vetinari reopened this on Jul 23, 2024
github-project-automation bot moved this from 🏁 Done to 🏗️ In Progress in 🧭 Planning on Jul 23, 2024
@h-vetinari (Contributor, Author)

Now confirmed: unfortunately this is still an issue with conda-build 24.7.1

@h-vetinari (Contributor, Author)

Most likely this is simply the confusion between the build and host architectures coming through; I guess the code paths for building a noarch package in a cross-compiling recipe are pretty much completely untested. Perhaps that was obvious to everyone else already, but I just realized it 😆
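
For context: the aarch64/ppc64le jobs on that feedstock are cross-compiled from linux-64 agents, which is presumably where the linux-64 in the solver error above comes from. A rough sketch of the relevant conda-forge.yml setting (assumed here; the actual feedstock configuration may differ):

build_platform:
  linux_aarch64: linux_64    # build platform is linux-64, host/target is linux-aarch64
  linux_ppc64le: linux_64    # build platform is linux-64, host/target is linux-ppc64le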
