Lovely topic, one close to my heart!
I've written about this on and off, but probably with equal parts buried good ideas mixed with lengthy rants. Here's a fresh round of scripture.
My desire for some time now has been to get away from snapshots and the tight coupling of engine versions <=> "stable releases". The dynamic builders are a large piece of the puzzle in moving forward. Honestly, the only reason stable release 50 == engine version 0.50.0 right now is that we bump the version manually, and I happen to do so when I've done a big round of testing and consider most of our stuff stable. It isn't a formal thing; I just figured I'd make the numbers match until we make them more formal.
What I think we want instead is to bump the patch level (x.y.z) much more frequently, after merging a few PRs that don't break backwards compatibility. Since we're still below 1.0.0 the rigid rules of SemVer technically don't apply yet, but I think we can use it anyway, with minor = breaking and patch = compatible, until we go 1.0.0. The same goes for modules.
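To make that concrete, here's a minimal sketch of the bump logic in Groovy - the function name and shape are mine, nothing like this exists in the build setup yet:

```groovy
// Minimal sketch of the scope-based bump, assuming plain x.y.z version strings.
// Pre-1.0.0 we'd treat "minor" as the breaking scope and "patch" as the compatible one.
String bumpVersion(String version, String scope) {
    def (major, minor, patch) = version.tokenize('.').collect { it as int }
    switch (scope) {
        case 'major': return "${major + 1}.0.0"
        case 'minor': return "${major}.${minor + 1}.0"
        case 'patch': return "${major}.${minor}.${patch + 1}"
        default: throw new IllegalArgumentException("Unknown scope: $scope")
    }
}

assert bumpVersion('0.50.0', 'patch') == '0.50.1'
assert bumpVersion('1.5.3', 'major') == '2.0.0'
```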
The key for me is to do the versioning separately from any PRs / direct commits, as it would be a pain trying to bump the version manually in that fashion (imagine multiple outstanding PRs, or an oops commit that breaks something). Instead, just merge to the develop branch, and when an author is happy to do a component release (not a game release) they do so by running a job in Jenkins with a single scope parameter: major, minor, or patch. Or they ask @Gooey to do it for them on IRC or Slack. More on that later!
Right now I've got the scope parameter set up in Jenkins on the stable engine job and a test module job. Sadly it doesn't do anything yet, other than publish to a non-snapshot repo in Artifactory, and I still manually commit the engine version bump when I do a stable game release. The release job in Jenkins is naturally meant to do the version bump for you, and IMHO it should also do the Git push from develop -> master. We should never need to push to master manually, and everything should depend on release builds that have come out of master.
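Purely for illustration, the release job could be wired up something like this - sketched in Jenkins Pipeline syntax even though our actual jobs are freestyle, and with a hypothetical `release` Gradle task and `scope` property standing in for the script that doesn't exist yet:

```groovy
// Illustrative only: the current jobs are freestyle, not Pipeline, and the 'release'
// Gradle task is hypothetical - the point is the single SCOPE parameter.
pipeline {
    agent any
    parameters {
        choice(name: 'SCOPE', choices: ['patch', 'minor', 'major'],
               description: 'SemVer scope for this component release')
    }
    stages {
        stage('Promote develop to master') {
            steps {
                sh 'git checkout master && git merge --ff-only origin/develop'
            }
        }
        stage('Bump version, build, publish') {
            steps {
                sh "./gradlew release -Pscope=${params.SCOPE}"
            }
        }
        stage('Push master back') {
            steps {
                sh 'git push origin master'
            }
        }
    }
}
```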
So say we get that in place for the engine and all modules (and libs too where it makes sense). Handling module compatibility is a fun topic. I would also like to follow SemVer to the letter so we don't need to worry about max versions. Here's a scenario touching on multiple points:
- Module x is currently at 1.5.3 and its author decides to rename a class that's part of its public API with multiple users. It needs to be bumped to 2.0.0.
- Author makes a PR with just the change (no version tweak) or commits directly to the module's develop branch. Jenkins builds a snapshot, posts stats
- Author confirms the change works as expected and runs the release job in Jenkins with scope "Major" - Jenkins pushes develop to master, bumps the version to 2.0.0, builds, bumps the version to 2.0.1, and pushes to develop for the next snapshot (sketched below, after this list)
- This also gives us a chance to catch changes given the wrong scope. Had the author intended a minor release but noticed during testing that the change breaks backwards compatibility, they simply run the release build with the scope set higher (or outright revert it, which is fine while it is just a snapshot)
- Nothing breaks anywhere - the modules using 1.5.3 know they are not allowed to use 2.0.0 (no max version set, just the rule that next major may break compatibility)
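Under the hood, the steps for that scenario would look roughly like the following sketch - the assumption that the version lives in module.txt as JSON and the exact commands are mine, not a finished script:

```groovy
import groovy.json.JsonOutput
import groovy.json.JsonSlurper

// Sketch of the underlying steps for the scenario above (module x, "Major" scope).
def run = { List<String> cmd ->
    def proc = cmd.execute()
    proc.waitForProcessOutput(System.out, System.err)
    assert proc.exitValue() == 0, "Failed: ${cmd.join(' ')}"
}

def moduleTxt = new File('module.txt')
def info = new JsonSlurper().parse(moduleTxt)

run(['git', 'checkout', 'master'])
run(['git', 'merge', '--ff-only', 'origin/develop'])   // promote develop to master

info.version = '2.0.0'                                 // 1.5.3 -> 2.0.0 for the major scope
moduleTxt.text = JsonOutput.prettyPrint(JsonOutput.toJson(info))
run(['git', 'commit', '-am', "Release ${info.version}"])
run(['git', 'tag', "v${info.version}"])
// ... Gradle build + publish to the non-snapshot repo in Artifactory would happen here ...

run(['git', 'checkout', 'develop'])
run(['git', 'merge', 'master'])
info.version = '2.0.1'                                 // next snapshot so develop moves past the release
moduleTxt.text = JsonOutput.prettyPrint(JsonOutput.toJson(info))
run(['git', 'commit', '-am', "Prepare next snapshot ${info.version}"])
run(['git', 'push', 'origin', 'master', 'develop', '--tags'])
```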
Now at this point we have two challenges:
- A lot of changes may require a major version bump yet some dependent modules may work perfectly fine. How to find that out easily?
- How do we determine for a game release what module versions to include?
This is where I see potential for the automatic pull request testing which is now live (yay @msteiger!) plus expanded options in the module index/manager. I want to build every dependent project we know of whenever anything upstream changes at all. This is why we needed the dynamic builder droplets - that's a lot of building.
The exact details escape me a little, along with the setup in Jenkins, but it goes something like this:
- Release jobs (master branch) only ever build on manual request - that may sound like work but I want to use @Gooey to make releasing easy. Just tell him on IRC or Slack "Promote module x as major release" or so and Jenkins will take care of the rest.
- Snapshot builds (develop branch) run on commit just like now (this is all we have had so far)
- Pull request builds run on creation of a PR (if whitelisted / approved by admin) - report the usual code metrics (we have this now!)
- On completion of a PR build, any immediate downstream module dependent on the module (or engine) that was built will itself run a throwaway build to do a compile and code metric test against the update (see the sketch after this list). This will help catch situations where we break things without having to rely on a mega-workspace (I'm at 67 modules and rising - that will become unrealistic soon; it already takes longer to build the Gradle project tree than to actually execute tasks)
- Additionally or alternatively, run these throwaway builds as a consequence of a release build finishing (but at that point the genie is already out of the bottle)
- Should another series of throwaway builds run in response to snapshot builds? If every change goes in via PR we wouldn't need this (would just repeat the PR-triggered builds)
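As a rough idea of how the downstream selection could work, here's a sketch that reads the dependencies out of each known module.txt and inverts them to find which throwaway builds to kick off - the local 'modules/' checkout and the trigger mechanism are placeholders:

```groovy
import groovy.json.JsonSlurper

// Sketch of picking the downstream throwaway builds to trigger. Assumes a local checkout of
// every known module under 'modules/' - the paths and the trigger mechanism are placeholders.
def slurper = new JsonSlurper()
def modules = new File('modules').listFiles()
        .findAll { new File(it, 'module.txt').exists() }
        .collectEntries { dir ->
            def info = slurper.parse(new File(dir, 'module.txt'))
            [(info.id): info.dependencies?.collect { it.id } ?: []]
        }

// Invert the dependency map: for a given upstream id, which modules depend on it?
def downstreamOf = { String upstream ->
    modules.findAll { id, deps -> upstream in deps }.keySet()
}

// When a PR build of the engine (or a module) finishes, kick off these throwaway builds
downstreamOf('engine').each { id ->
    println "Would trigger a throwaway build for module $id"
    // e.g. poke the corresponding Jenkins job via its remote trigger URL
}
```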
Why do the throwaway builds? First to know when we break stuff, which is valuable information, especially
before we release a breaking change. Even if we are fully aware that we're committing a change that's incompatible with some stuff, and are guarding against it with a major release bump, knowing exactly what breaks allows us to prepare updates faster to get everything working again.
Secondly, and this is where it gets geekily interesting to me, the throwaway build tells us whether a downstream project
may work unchanged with the next major version of something. But we can't just go on that and automatically include the newer build in a game release (the current problem). And testing every module individually to see if it does indeed work, then doing a re-release just to flag it as compatible with a newer engine (or upstream module), would both suck and litter our repos. Instead:
- On successful compilation against a newer upstream major release mark the module as "Compiles OK!"
- On successful unit tests + other code metrics mark as "Compiles with good code health!"
- On successful integration tests / other deeper automated tests (TWOVANA) mark as "Passed automated game tests!" (automated acceptance test level I guess)
- On successful manual acceptance tests (somebody actually played it and reported it as fully functional) mark as "Play tests OK!"
- Finally, if somebody actually changes the module, consider formally assigning upwards compatibility in the version file. No Git commit is needed until this point.
The first four stages would be recorded outside of the repo and I'm not sure exactly how. Jenkins could annotate build jobs somehow, update a central database, edit module threads in the forum, or even batch-update the
Module Index (that could get Git-spammy so batch the changes). The data could then be presented on a module tracker site, shown in the launcher when browsing modules, or viewed otherwise when we're considering doing a full game release including all Omega modules.
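Whatever the storage ends up being, the record each throwaway build emits could be as simple as this - field names and the job URL are placeholders, not a settled format:

```groovy
import groovy.json.JsonOutput

// Rough idea of the record a throwaway build could emit, to be batched into the Module Index
// or a central database later. Field names and the job URL are placeholders.
def result = [
    module       : 'x',
    moduleVersion: '1.5.3',
    against      : [engine: '2.0.0'],
    compilesOk   : true,    // "Compiles OK!"
    codeHealthOk : true,    // unit tests + other code metrics pass
    gameTestsOk  : false,   // automated game tests (TWOVANA) not run yet
    playTestsOk  : false,   // nobody has manually play tested this combination yet
    build        : 'https://jenkins.example.org/job/XThrowaway/42/'
]

new File('compat-report.json').text = JsonOutput.prettyPrint(JsonOutput.toJson(result))
```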
Naturally the above is a whole lot of work and not at all something we'll have any time soon. But we could start small and move toward that future. First off we should just get to where we release things and follow SemVer. Perhaps we should even make hitting Alpha (architecturally stable but not necessarily gameplay ready) == going 1.0.0, so we can go full SemVer?
On the technical side this involves some needed improvements:
- Script for the release jobs to do some Git pushing and updating of version files. Next on my list!
- Probably a single central release orchestrator job is needed to do the Git push before the release job actually starts, so that it can correctly report the actual changes present in that build via Git history. Only admins could run this.
- Authors that are not Jenkins admins can use Gooey commands instead (which in turn run the release orchestrator). There is a Hubot role system we can use to assign modules to users so they can run releases for only a subset of modules
- Groovy script to update module jobs to have job dependencies matching their module.txt dependencies (I've got a proof of concept for this working) so the correct downstream jobs can be triggered when needed
- Central point for storing results from the fancy throwaway builds that we can display/use later in the process (fairly easy to use the Index for this, which was my intent - although so far just for releases).
- Determine how we want to define compatibility bands for modules when both v1.0.0 and v2.0.0 of a dependency will work fine for a module just declaring a dependency on min version 0.1.+. Set the max version to 2.+.+? (See the sketch after this list.)
- How to handle changes affecting multiple modules at once?
- Launcher still isn't updated for handling the new Distros, let alone beginning to show module release info (@Skaldarnar + @shartte ping!)
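On the compatibility band question specifically, the rule we'd get for free from SemVer could look like this sketch (ignoring the `+` wildcard syntax for simplicity - the function names are mine): a dependency declaring only a minimum version is assumed safe up to, but excluding, the next major, unless we explicitly widen the ceiling after the compatibility checks described above.

```groovy
// Sketch of the default compatibility band: a dependency declaring only a minVersion is
// assumed safe up to (but excluding) the next major, unless a wider max has been granted.
int compareVersions(String a, String b) {
    def (x, y) = [a, b].collect { v -> v.tokenize('.').collect { part -> part as int } }
    for (i in 0..2) {
        if (x[i] != y[i]) {
            return x[i] <=> y[i]
        }
    }
    return 0
}

boolean satisfies(String candidate, String minVersion, String maxVersion = null) {
    // Default ceiling: the next major above minVersion, per "the next major may break you"
    def ceiling = maxVersion ?: "${(minVersion.tokenize('.')[0] as int) + 1}.0.0"
    compareVersions(candidate, minVersion) >= 0 && compareVersions(candidate, ceiling) < 0
}

assert  satisfies('1.5.3', '1.0.0')            // same major: allowed
assert !satisfies('2.0.0', '1.0.0')            // next major: excluded by default
assert  satisfies('2.0.0', '1.0.0', '3.0.0')   // explicitly widened after compatibility testing
```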
If we can get the module manager/browser working (intended to work either inside the game
or in the launcher) the need for an actual game release goes down substantially. You just grab the base game client and pick your "mod pack" akin to FTB/Technic (gameplay template for us, really) and get the appropriate modules downloaded automatically.
Mainly we have the stable game release zips because we don't yet have a more granular way to auto-download the appropriate modules. And for any offline play situations, of course.
If anybody managed to read all the way through this mega-post attempting to show purpose in my madness give yourself a cookie. You deserve it!