
Switch Linux release builds to Qt6 #1908

Merged
merged 9 commits into master on Jan 20, 2025

Conversation

matterhorn103
Contributor

Developer Certificate of Origin
Version 1.1

Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
1 Letterman Drive
Suite D4700
San Francisco, CA, 94129

Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.

Developer's Certificate of Origin 1.1

By making a contribution to this project, I certify that:

(a) The contribution was created in whole or in part by me and I
have the right to submit it under the open source license
indicated in the file; or

(b) The contribution is based upon previous work that, to the best
of my knowledge, is covered under an appropriate open source
license and I have the right under that license to submit that
work with modifications, whether created in whole or in part
by me, under the same open source license (unless I am
permitted to submit under a different license), as indicated
in the file; or

(c) The contribution was provided directly to me by some other
person who certified (a), (b) or (c) and I have not modified
it.

(d) I understand and agree that this project and the contribution
are public and that a record of the contribution (including all
personal information I submit with it, including my sign-off) is
maintained indefinitely and may be redistributed consistent with
this project or the open source license(s) involved.

@matterhorn103
Contributor Author

This is a small subset of the changes in #1869, essentially just adding a Linux workflow (that builds against Qt6) and removing the Linux builds from the old "CMake" and Qt6 workflows, while keeping everything else the same.

@ghutchis
Member

ghutchis commented Jan 6, 2025

This is failing because the tests directory hasn't been ported to Qt6 yet (i.e., the tests run by the ubsan and asan builds)

@avo-bot

avo-bot commented Jan 6, 2025

This pull request has been mentioned on Avogadro Discussion. There might be relevant details there:

https://discuss.avogadro.cc/t/upcoming-1-100-release/6629/12

@ghutchis
Member

ghutchis commented Jan 7, 2025

@matterhorn103 - try merging HEAD - I think #1923 should fix the tests here.

Signed-off-by: Matthew J. Milner <matterhorn103@proton.me>
@matterhorn103 matterhorn103 reopened this Jan 13, 2025
Contributor

Here are the build results
macOS.dmg
Win64.exe
Artifacts will only be retained for 90 days.

@matterhorn103
Contributor Author

@matterhorn103 - try merging HEAD - I think #1923 should fix the tests here.

Looks like it does :)

@ghutchis
Member

Is this ready to merge?

@matterhorn103
Contributor Author

I believe so? I rebased on latest HEAD as suggested and now everything passes.

@ghutchis
Member

I don't see a new AppImage build action. I'd also like to have something running the test suite on Ubuntu (e.g., with the sanitizer builds).

Am I missing new actions? They don't seem to be in this commit. Thanks.

@matterhorn103
Contributor Author

Yes, well spotted, thanks. I must have messed something up with the rebase; there is meant to be a build_linux.yml that includes all of that.

Signed-off-by: Matthew J. Milner <matterhorn103@proton.me>
@matterhorn103
Contributor Author

matterhorn103 commented Jan 16, 2025

Added the missing file; assuming all the checks pass, it's now definitely ready, promise! Sorry for the dumb oversight.

…ith test builds

Signed-off-by: Matthew J. Milner <matterhorn103@proton.me>
@matterhorn103
Contributor Author

I added the testing builds on Linux with Qt5 back in, so that we have some Qt5 tests going on, and also with the idea that even if the Qt6 test builds continue to fail we can still merge this for now.

@ghutchis
Member

Thanks. I think #1941 looks pretty good (in terms of updating the few Qt tests to Qt6), but appreciate having the older Qt5 tests too.

@ghutchis
Member

The AppImage doesn't seem to actually produce an AppImage artifact. I'm not sure if I can edit your pull request, but we need to fix that before this is merged. Thanks.

@matterhorn103 matterhorn103 marked this pull request as draft January 17, 2025 15:03
Signed-off-by: Matthew J. Milner <matterhorn103@proton.me>
@matterhorn103
Contributor Author

Hmm, for some reason the AppImage packaging step was missing, sorry. I could swear I put that in; I think I really botched the rebase. I've gone over it again with a finer-toothed comb, and I think it's all sound now.

I've still converted this to a draft because I have a few quick questions for us to make decisions on:

  1. Since they aren't uploaded to the releases, I assume the Ubuntu binaries (not the AppImage) are built just to test the build process on Linux? If so, do we need to bother uploading the artifacts? Does anyone ever download or test them? (I personally grab either the AppImage or the Flatpak.) Can I drop those artifacts while keeping the build?

  2. Now that Linux ARM runners are available to open-source projects, I've added an Ubuntu ARM matrix entry, for testing as much as anything. Unless you feel differently, I'll apply the same policy to the ARM entry as to the x86 one regarding whether or not to upload an artifact, etc.

  3. As of the 10th of December we are building and distributing the beta Flatpak for ARM too, and when 1.100 is released the stable branch will have an ARM release. As it stands there's no test for that as part of the CI, so bugs could be introduced to avogadrolibs/master without us being aware of them until I try to update the Flatpak. So I think I'd like to make the Flatpak manifest a matrix with both x86 and ARM runners, if that's ok? (It would be a separate PR.)

  4. If we run a Flatpak ARM workflow as part of the CI as suggested in (3), we are testing a form of ARM build already, so is it then unnecessary to do (2) too?

  5. Should we build an ARM AppImage? Originally I thought why not, but we already have an ARM Flatpak and ARM builds available in distros' repos, so ARM has options. I worry that offering two AppImage links might confuse non-technical users.

  6. In this section:

    - name: Configure
      run: |
        if [ ! -d "${{ runner.workspace }}/build" ]; then mkdir "${{ runner.workspace }}/build"; fi
        cd "${{ runner.workspace }}/build"
        # won't have any effect except on Mac
        CC=${{matrix.config.cc}} CXX=${{matrix.config.cxx}} cmake $GITHUB_WORKSPACE/openchemistry ${{env.FEATURES}} -DCMAKE_BUILD_TYPE=${{matrix.config.build_type}} ${{matrix.config.cmake_flags}}
      shell: bash

Which part is Mac-specific, and can I take it out of the Linux manifest?

  7. Do you still want these tmate sessions in the up-to-date manifests? Maybe I misunderstand how they work and what they are meant to do, but I don't feel like they've ever done anything in the PRs I've had.

  8. Does the cleanup step (still) make sense? I thought anything in the container image that isn't explicitly cached gets discarded?

@ghutchis
Member

Can I drop those artifacts while keeping the build?

Sure.

I've added an Ubuntu ARM matrix entry, for testing as much as anything

Sounds great. Probably worth building an AppImage for Ubuntu ARM too. I don't know if that's a big problem for users. Mac users (not very technical) have figured it out for a while.

make the Flatpak manifest a matrix with both x86 and ARM runners, if that's ok?

Sure. I think the only caveat is that the "regular" runner is a lot faster than Flatpak, so there's some marginal benefit to seeing an issue before Flatpak-ARM hits an error.

Which part is Mac-specific, and can I take it out of the Linux manifest?

There was an environment variable being set, MACOSX_DEPLOYMENT_TARGET; I'm not sure why the comment is there in the Linux script.
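If that's the only macOS-specific bit, one option would be to drop it from the Linux manifest and scope it explicitly in the macOS workflow instead; a sketch (the version value below is a placeholder, not the project's actual deployment target):

```yaml
# In the macOS workflow only; the value shown is a placeholder,
# not the project's actual deployment target.
env:
  MACOSX_DEPLOYMENT_TARGET: "10.13"
```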

Do you still want these tmate sessions in the up-to-date manifests?

Yes. It lets you ssh into the build if there's a failure. You have ~60 minutes after the failure.
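The tmate hook described here is usually wired up roughly like this (mxschmitt/action-tmate is the commonly used action; the version pin and the failure-only condition are assumptions about how this project configures it):

```yaml
# Opens an SSH-able tmate session only when an earlier step failed,
# so you can inspect the runner interactively after a build breaks.
- name: Setup tmate session on failure
  if: ${{ failure() }}
  uses: mxschmitt/action-tmate@v3
```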

The cleanup step isn't really needed on the GitHub runners - only on self-hosted.

@matterhorn103
Contributor Author

Sure. I think the only caveat is that the "regular" runner is a lot faster than Flatpak, so there's some marginal benefit to seeing an issue before Flatpak-ARM hits an error.

That's true. In that case let's leave the Flatpak runner as x86 only, any ARM-specific issues should get flagged by the other ARM builds anyway.

Signed-off-by: Matthew J. Milner <matterhorn103@proton.me>
@matterhorn103 matterhorn103 marked this pull request as ready for review January 17, 2025 19:40
@ghutchis
Member

I think for now we should drop linux-arm builds and deal with those after release.

The AppImage is failing because you also need to install libfuse2: https://github.com/appimage/appimagekit/wiki/fuse
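A minimal sketch of the extra step, assuming the standard Ubuntu runner image (package name taken from the linked wiki page; step name and placement are assumptions):

```yaml
# AppImage tooling needs FUSE 2 at runtime; install it before packaging.
- name: Install libfuse2
  run: sudo apt-get update && sudo apt-get install -y libfuse2
```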

@matterhorn103
Contributor Author

I hoped they'd pass immediately since the ARM Flatpak works fine. Happy to postpone.

Signed-off-by: Matthew J. Milner <matterhorn103@proton.me>
Signed-off-by: Matthew J. Milner <matterhorn103@proton.me>
@ghutchis
Member

I'll have to check. There seems to be a slightly different path, so it couldn't find the custom scripts/AppRun.sh to install into the AppImage.

@ghutchis
Member

Okay, the final bits. First, the line should be:

cp ../avogadrolibs/openchemistry/avogadrolibs/scripts/AppImage.sh appdir/AppRun

Next up, the .AppImage is called Avogadro-x86_64.AppImage so we'd need to change the Upload step if we want to keep that exact name.

Personally, I'd go with Avogadro2-x86_64.AppImage since that's what we've been using, but 🤷

Signed-off-by: Matthew J. Milner <matterhorn103@proton.me>
@matterhorn103
Contributor Author

Okay, the final bits. First, the line should be:

Thanks, changed as appropriate.

I forgot that I made that change, which is to check out the source code under openchemistry rather than in the root of the workspace. This was to align with the advice in the Avogadro build guide (yours, or whoever's it was) that Avogadro is best built outside of the build tree. That means this workflow is currently inconsistent with the others, but we can adjust the others later to match.

Next up, the .AppImage is called Avogadro-x86_64.AppImage so we'd need to change the Upload step if we want to keep that exact name.

Personally, I'd go with Avogadro2-x86_64.AppImage since that's what we've been using, but 🤷

Here I don't follow. I was planning to use your suggested name; my only change was to add the architecture, so that it's Avogadro2-x86_64.AppImage rather than just Avogadro2.AppImage. The artifact is specified in the matrix configuration as Avogadro2-x86_64.AppImage, and the AppImage packaging step uses globs throughout as far as I can tell. So how does the name end up not including the 2?

@ghutchis
Member

I'm not sure why it's picking up Avogadro instead of Avogadro2 (from the .desktop file?).

I just know that if we want to keep Avogadro2* we'll need to change the mv line at the end to rename it, because the file I saw on disk via tmate was named Avogadro-x86_64.AppImage, not Avogadro2*

Also, it looks like 0e22312 has confused some paths. For now, I'd probably suggest dropping that commit.
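A sketch of what that rename could look like (filenames taken from the discussion; the mktemp/touch lines only simulate appimagetool's output for the demo):

```shell
# Rename appimagetool's output to the Avogadro2-* name the Upload step
# expects. The first two lines just set up a stand-in file for the demo.
cd "$(mktemp -d)"
touch Avogadro-x86_64.AppImage
for f in Avogadro-*.AppImage; do
  mv "$f" "Avogadro2-${f#Avogadro-}"   # Avogadro-... -> Avogadro2-...
done
```

The glob-plus-prefix-strip form keeps the architecture suffix intact, whatever appimagetool happens to produce.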

@matterhorn103
Contributor Author

Sure, I can revert it.

Would you mind running tree from the initial working directory (which I believe is /__w/avogadrolibs) so I can understand the structure of the container better?

My understanding is that the container should look like this:

__w
└── avogadrolibs
    └── avogadrolibs = $GITHUB_WORKSPACE and ${{ runner.workspace }}
        ├── openchemistry
        │   ├── avogadroapp
        │   ├── avogadrolibs
        │   ├── cmake
        │   └── thirdparty
        └── build
             ├── avogadroapp <--- packaged AppImage gets put in here
             ├── avogadroapp-prefix
             ├── avogadrolibs
             │   ├── avogadroapp
             │   ├── avogadrolibs
             │   ├── cmake
             │   ├── openchemistry <--- definitely not created when built normally
             │   │   └── avogadrolibs
             │   │       └── scripts
             │   │           └── AppImage.sh <--- script to copy is supposedly here?
             │   └── thirdparty
             ├── avogadrolibs-prefix
             ├── appdir
             │    └── usr <--- prefix gets moved to here
             └── prefix -----> gets moved to usr
                  ├── bin
                  ├── include
                  ├── lib
                  ├── lib64
                  └── share

Signed-off-by: Matthew J. Milner <matterhorn103@proton.me>
@matterhorn103
Contributor Author

Ok, I guess it should finally all work.

Assuming it does, I won't touch this PR any further so that we can finally get it merged and release.

But it looks like there's a bug that means the "workspace" paths are not consistent, see e.g. here and here.

The GitHub help page for the runner context doesn't even list runner.workspace as a property, but I guess it maybe points to __w/avogadrolibs? The path necessary in the AppImage step (../avogadrolibs/openchemistry/avogadrolibs/scripts/AppImage.sh) seems to indicate that the build directory is actually at __w/avogadrolibs/build.

The checkout action claims to checkout repos relative to $GITHUB_WORKSPACE. Which makes sense given the third line of the cmake config step:

    - name: Configure
      run: |
        if [ ! -d "${{ runner.workspace }}/build" ]; then mkdir "${{ runner.workspace }}/build"; fi
        cd "${{ runner.workspace }}/build"
        CC=${{matrix.config.cc}} CXX=${{matrix.config.cxx}} cmake $GITHUB_WORKSPACE/openchemistry ${{env.FEATURES}} -DCMAKE_BUILD_TYPE=${{matrix.config.build_type}} ${{matrix.config.cmake_flags}}
      shell: bash

So when post-1.100 I work on making all the builds consistent, wouldn't it make most sense to do everything relative to $GITHUB_WORKSPACE rather than having a mix of environment variables?

@ghutchis
Member

So when post-1.100 I work on making all the builds consistent, wouldn't it make most sense to do everything relative to $GITHUB_WORKSPACE rather than having a mix of environment variables?

Sure, that would be fine. I don't remember why I didn't use that before. Certainly some of the GH action environment features have changed over the years.
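A rough sketch of what a GITHUB_WORKSPACE-only configure step might look like (flags abbreviated; the matrix variables are the ones already used in the existing workflows, and cmake's -S/-B options remove the need to cd into the build directory at all):

```yaml
# Everything keyed off $GITHUB_WORKSPACE; no runner.workspace anywhere.
- name: Configure
  run: |
    cmake -S "$GITHUB_WORKSPACE/openchemistry" \
          -B "$GITHUB_WORKSPACE/build" \
          -DCMAKE_BUILD_TYPE=${{ matrix.config.build_type }} \
          ${{ matrix.config.cmake_flags }}
  shell: bash
```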

The GitHub help page for the runner context doesn't even list runner.workspace as a property, but I guess it maybe points to __w/avogadrolibs? The path necessary in the AppImage step (../avogadrolibs/openchemistry/avogadrolibs/scripts/AppImage.sh) seems to indicate that the build directory is actually at __w/avogadrolibs/build.

Yes, that's right. You get …/{project name} for runner.workspace, and the build directory has usually been at …/{project name}/build.

This is one reason I had all the platforms in one action - it made it easier to update / tweak.

OTOH, the Windows code-signing needs to be in a completely separate action, so it's worth separating each of the platforms (and retiring the self-hosted action).

@ghutchis
Member

🎉

@ghutchis ghutchis merged commit e408198 into OpenChemistry:master Jan 20, 2025
23 checks passed
@matterhorn103 matterhorn103 deleted the linux-qt6 branch January 20, 2025 12:16
@avo-bot

avo-bot commented Jan 20, 2025

This pull request has been mentioned on Avogadro Discussion. There might be relevant details there:

https://discuss.avogadro.cc/t/qt6-tracking-thread/5592/23
