Revert innoextract 6.3.0+ patching to use OS package when support for 6.3.0 is added. #85
Comments
Out of interest, when you pulled in the PR patch, did it actually build? I've no idea what happened in that PR, but it looks a little odd. Of course, dscharrer's latest mainline branch doesn't support 6.3.0, and I tried the patch provider's mainline, but it fails to build (at least for me), and their repo doesn't accept issues. I completely understand the choice to wait for Alpine's version once 6.3.0 is supported; there's just no indication of a timeline for that.

I would prefer to build the image from scratch, though I do note you've kindly pushed your image to Docker Hub. So I pulled your image for testing... and ran into the "sharing violation" issue - nothing a simple I'm using only M.2 drives, so I'm not completely convinced the drive speed is causing the problem, but a manual workaround is a manual workaround.

Thanks for all you've done to bootstrap this. When I find time, I'll have a look at Kalifornia909's comment on the DCS forum about getting SRS to work (I was in his stream today, and that's how I found this project). If I get SRS working, I'll happily give you a PR with another Dockerfile for people to use (though, of course, the innoextract issue needs to be resolved for anyone to build).

EDIT: I've just seen your dotnet462 image; I'll check it out and give feedback.
Yes, see:
See:
I specifically pulled the content of that PR because it did not get merged; the PR was closed and the originating feature branch is gone: https://github.com/crackedmind/innoextract/tree/feat/inno_6.3

I expect that they never brought their feature branch into their fork's main branch, which is potentially why you would see a build failure.

Edit: they did merge it, but they also merged a bunch of extra stuff that causes the CI to fail: dscharrer/innoextract@master...crackedmind:innoextract:dev
There's certainly nothing stopping you from building it yourself as per the README. The patching for innoextract is part of the Dockerfile, so it will just work.
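For illustration only, patching and building innoextract inside a Docker build stage boils down to shell commands along the lines of the sketch below. This is not the actual Dockerfile from this repository; the package list, build flags, and the PR number are placeholders.

```sh
# Sketch of the build-from-source-with-patch approach; package names and the
# PR number are illustrative, not the values used in this repo's Dockerfile.
apk add --no-cache build-base cmake boost-dev xz-dev git curl

git clone https://github.com/dscharrer/innoextract.git /tmp/innoextract
cd /tmp/innoextract

# GitHub serves every PR as a plain patch file at <PR URL>.patch;
# <PR_NUMBER> is a placeholder here.
curl -fsSL "https://github.com/dscharrer/innoextract/pull/<PR_NUMBER>.patch" -o /tmp/inno63.patch
git apply /tmp/inno63.patch

cmake -B build -DCMAKE_BUILD_TYPE=Release
cmake --build build
cmake --install build
```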
I'm not fully convinced it is an issue with file performance either - it could just be a new bug in Eagle Dynamics' installer. I'm unclear on how they actually avoid the file locking issue in general, since a running program retains the lock on itself, so it cannot trivially rewrite itself and would need a sidecar/shim of some kind to break the lock. In any case, I'm not going to be breaking apart their code.
Happy to take contributions around SRS etc., but I just don't really have the time or motivation to go deep into scripting and testing it all out at the moment.
Thanks for the reply 😄
It doesn't. The innoextract build failed - literally all I did was a I speculate it passes for you because you're leveraging local cache and not actually rebuilding innoextract since the PR branch was nuked. Worst case there, we can both pull the compiled innoextract from your existing images.

At first, I commented it out, then realised why it was used... spent a little time shopping around, tried crackedmind's mainline, and that similarly failed to build innoextract. I also tried dscharrer's latest mainline, which didn't work as it doesn't support 6.3.0. So I pulled your images from Docker Hub, which work just fine - both
While that's a precious resource for me too, I'll take a look over it in the next few days... and if I get it working, I'll raise the appropriate PR here for you (no rush on the testing/accepting side). If I don't, I'll at least document my journey with it for you, for when you do get the time to look into it.
No, I am building this in GitHub CI and the process is fairly explicit and clear in the logs. See: The new test build has just finished successfully as expected. The patch I am pulling still exists, and line 14 in the current Dockerfile works fine. See specifically: https://github.com/Aterfax/DCS-World-Dedicated-Server-Docker/actions/runs/11468709893/job/31914403396#step:6:372
Those were gone before I implemented the current patching method, so it isn't that. I also know it isn't that because it isn't the branch I am actually using. The patch branch I selected is the PR branch automatically created against the main repo when the PR is opened, not any branch of the fork or any of the main repo's usual branches. That auto-created PR branch still exists and works fine, as can be seen in the CI above.
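For anyone unfamiliar with that mechanism: GitHub keeps a read-only ref for every PR on the base repository (refs/pull/&lt;id&gt;/head), and it survives the contributor deleting their fork or feature branch. A minimal illustration follows; the PR number is a placeholder and the local branch name is arbitrary.

```sh
# Fetch the auto-created PR ref from the base repo; <PR_NUMBER> is a placeholder.
git clone https://github.com/dscharrer/innoextract.git
cd innoextract
git fetch origin "pull/<PR_NUMBER>/head:pr-inno-6.3"
git checkout pr-inno-6.3
```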
I don't know exactly which Dockerfile lines you're trying, but they're not right, and it sounds like that is because you are targeting the wrong thing. Since the CI is working here, I would suggest purging your Git clone of this repo and recreating it, because it sounds like you've made changes to the main branch and broken it, where it clearly does work on GitHub.

Edit: Just did a local build with
After a little head scratching, I think I found the issue - line 10 of the Dockerfile, I'd removed Something about line 14 failed on my first attempt, which led me down the path I went on, but I'm at a loss to explain what that was now, as the build is working, and the first error I saw was at line 14, not 10. But ho hum, it's working.

Incidentally, nothing in my comments was intended as accusatory - I apologise if it came across that way. Again, thanks for taking the time you have so far in putting this together.
No worries, I didn't see it like that. If anything, I am probably just being a bit too brusque as the hour gets late and I get tired 😩
DCS-World-Dedicated-Server-Docker/docker/Dockerfile, lines 12 to 16 at commit 893c6e2:
Pulling in custom patch code from a PR is icky. This should be reverted to using the Alpine-provided package once it supports 6.3.0+ type executables.
See:
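Once an Alpine release ships an innoextract build that understands 6.3.0+ installers, the revert described above should reduce the build to installing the distro package, roughly as below (assuming the package keeps its current name in the Alpine repositories):

```sh
# Replace the build-from-source stage with the distro package once a new
# enough innoextract lands in Alpine's repositories.
apk add --no-cache innoextract
```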