We’ve spent the past few days learning about Full updates, Delta updates, and Express updates. All that was just background information! Finally we can talk about the thing I actually wanted to talk about: The Quality update, which obsoletes them all.
The Quality update takes a different approach to patching: Not only does it patch a file forward to the latest version, it also patches the latest version of the file backward to the original version.
Update | Full file | Patch base M0 | Patch base M1 | Patch base M2 | Patch base M3 | Patch base M4 | Reverse patch |
---|---|---|---|---|---|---|---|
M1 | M1 | M0 to M1 | | | | | M1 to M0 |
M2 | M2 | M0 to M2 | M1 to M2 | | | | M2 to M0 |
M3 | | | | | | | |
M4 | M4 | M0 to M4 | M1 to M4 | M2 to M4 | | | M4 to M0 |
M5 | M5 | M0 to M5 | M1 to M5 | M2 to M5 | | M4 to M5 | M5 to M0 |
The Quality update includes only two sets of patches: one to get from the initial version to the latest, and one to get from the latest version back to the original.
Quality update | Contents |
---|---|
M1 | M0 to M1, M1 to M0 |
M2 | M0 to M2, M2 to M0 |
M3 | M0 to M2, M2 to M0 |
M4 | M0 to M4, M4 to M0 |
M5 | M0 to M5, M5 to M0 |
Note that the M3 Quality update is the same as the M2 Quality update, since the file F did not change between M2 and M3.
The secret to the Quality update is that the client retains the patches necessary to bring its files back to the M0 version. At the release of M0, this is vacuous: The files are already at their M0 version, so no patches are needed. We’ll see how this invariant is maintained at each subsequent update.
Applying a Quality update consists of downloading the update, and then for each file in the update, applying two sets of patches: First patch the current file backward to the original M0 version using the patch cached on the client. Second, patch the M0 file forward to the version targeted by the Quality update. The resulting fully-patched file goes onto the system, and the backward patch included in the Quality update is saved on the system in preparation for the next Quality update.
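Here's a minimal, self-contained sketch of that per-file sequence, using a toy patch format (just a list of offset-plus-replacement-bytes edits). The names and the format are invented for illustration; they are not the actual servicing machinery.

```python
# Toy patch format: a list of (offset, replacement bytes) edits.
# Everything here is illustrative, not the real Windows servicing format.

def apply_patch(data: bytes, patch: list[tuple[int, bytes]]) -> bytes:
    """Apply simple in-place byte replacements to produce a new file image."""
    out = bytearray(data)
    for offset, replacement in patch:
        out[offset:offset + len(replacement)] = replacement
    return bytes(out)

def apply_quality_update(current: bytes,
                         cached_reverse: list[tuple[int, bytes]],
                         update_forward: list[tuple[int, bytes]],
                         update_reverse: list[tuple[int, bytes]]):
    """Return (new file contents, reverse patch to cache for the next update)."""
    m0 = apply_patch(current, cached_reverse)   # 1. roll the file back to its M0 version
    latest = apply_patch(m0, update_forward)    # 2. patch M0 forward to the update's target
    return latest, update_reverse               # install `latest`, cache the new reverse patch

# At M0 the cached reverse patch is empty: the file is already the M0 version.
m0_file = b"AAAABBBBCCCC"
m1_file, cached = apply_quality_update(m0_file, [], [(4, b"XXXX")], [(4, b"BBBB")])
m2_file, cached = apply_quality_update(m1_file, cached,
                                       [(4, b"XXXX"), (8, b"YYYY")],
                                       [(4, b"BBBB"), (8, b"CCCC")])
assert m2_file == b"AAAAXXXXYYYY"
assert apply_patch(m2_file, cached) == m0_file  # invariant: we can always get back to M0
```

The final assertion is the invariant from above: after every update, the cached reverse patch still takes the installed file back to its M0 version.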
By analogy, it would be as if you wanted to meet with a bunch of friends, but instead of having to give different directions to each friend, you tell everybody, “Okay, start at the library, and then…” You trust that everybody knows how to get to the library, and you give one set of directions that tells how to get to the final destination from the library. You also give directions from the meeting place back to the library, so they are ready for the next time you need to meet somewhere. (Okay, so that’s not really a good analogy, because your friends probably want to go home, not to the library.)
The total disk space required on the server is (eyeballs the graph in the blog post) roughly 250MB for the pair of patches. This is the smallest server footprint of all the patches we’ve been looking at this week.
Note that the download size of a Quality update is less than double the size of an Express update download. I suspect this is because the reverse patch can take advantage of the bytes in the M0 file that were calculated as part of applying the Quality update. For example, if the reverse patch would have said “Replace bytes 2000 through 3999 with these following 2000 bytes,” the information downloaded from the server could say “Replace bytes 2000 through 3999 with bytes 5000 through 6999 of the M0 file you already have.” This removes 2000 bytes from the download, and the client can get the 2000 bytes from the M0 file that it had temporarily created as part of applying the Quality update. In that way, what the client really downloads is not so much a reverse patch as it is a template for a reverse patch.
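To make that concrete, here is a rough sketch with an invented record format (not the real one): each downloaded record either carries literal bytes or merely points at a range of the M0 image the client reconstructed while applying the update, and the client expands the template into an ordinary reverse patch locally.

```python
# Invented template format for illustration only. A record replaces a target
# range either with literal bytes shipped in the download, or with bytes copied
# from the M0 image the client already rebuilt while applying the update.

def expand_reverse_template(template, m0_image: bytes):
    """Turn a downloaded reverse-patch template into a concrete reverse patch."""
    concrete = []
    for record in template:
        if record["kind"] == "literal":
            data = record["bytes"]                 # bytes that came down the wire
        else:  # "copy-from-M0"
            start, length = record["src_offset"], record["length"]
            data = m0_image[start:start + length]  # bytes recovered locally, not downloaded
        concrete.append((record["dst_offset"], data))
    return concrete

template = [
    # The example from the article: "Replace bytes 2000 through 3999 with bytes
    # 5000 through 6999 of the M0 file you already have" -- 2000 bytes saved.
    {"kind": "copy-from-M0", "dst_offset": 2000, "src_offset": 5000, "length": 2000},
]
```

The expanded result can then be applied like any other patch (for instance, with the `apply_patch` helper in the earlier sketch).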
Feature summary of Quality updates:
- Quality updates can successfully update all customers, since every client knows how to roll back to M0, at which point they can apply the patch in the Quality update to move forward.
- Quality updates are about a third the size of a Full update.
- Quality updates require very little negotiation with the server. Every customer downloads the same update.
- Quality updates are cache-friendly, because every customer downloads the same update. Therefore, caching features like caching proxies, BranchCache, and peer-to-peer delivery are effective.
- Quality updates do not require significant server support. Once the package is negotiated, it is delivered in its entirety.
The blog article that announced the change to Quality updates reports a 40% improvement in memory usage on the client compared to Express updates, since the client doesn’t need to do an inventory of all the files on the system.
Why does the update need to include the reverse patch at all? Wouldn’t the client be able to construct it by itself?
In theory, you could make the client calculate the reverse patch, but that would slow down the update process because calculating an efficient reverse patch can be very CPU-intensive. We’ll see more about this later.
Ah, we finally know what the “this Cumulative Update does not include any new code and is designed to help us test our servicing pipeline” stuff in all the recent WIP new build blogposts is about.
In other words, Quality updates are a good upgrade from what there was before.
(Note that your Express link uses http: instead of https: like the Full and Delta links do. This triggered me because it didn’t show up as visited in my feed reader…)
How about the time required for installing the updates on the client side? I would imagine that patching a large number of files twice might take significantly longer than simply applying a full update, i.e. overwriting all files. Or in other words: Compared to other operating systems that tend to apply full updates, I find Windows client updates take quite a long time to install…
It's a tradeoff between bandwidth and execution time, and enough paying customers are still bandwidth-constrained that these solutions matter, especially with more sites working entirely off mobile. Of course, Microsoft now also releases a straight full update without patches every six months, which also keeps the patches from accumulating to the ludicrous numbers that XP's and 7's did. Not sure why the rollup concept was so rarely used, instead of a quarterly thing from...
When cumulative updates were introduced, there was much consternation from users who lost the ability to pick and choose which updates to install. It won't take much work to find people who declared the loss of "pick and choose" the end of the world. This guy even called it communist.
With my mathematician’s hat on, I’d call this “reducing to an earlier problem”. Rather than write lots of proofs for tiny differences in starting point, you transform each problem into a known, already solved, version – on the assumption that transformation is much easier than the proof itself.
A joke we always used to tell was:
What does a mathematician do when they see a house on fire? Get some water, put the fire out.
What...
I knew the new updates were all cumulative, but doesn’t “quality” in a monthly cumulative update refer to it including non-security bug fixes (hotfix content), as opposed to the “security-only” updates?
Does this mean now hexpatching Windows files will completely prevent Quality Update from functioning (and leave only Full Update)?
I thought hexpatching Windows files wasn’t even possible thanks to the fact that every binary that Microsoft ships these days is digitally signed.
Default configurations of Windows don’t stop unsigned or invalidly signed userland code from running. Try it yourself; modify a byte in Notepad, or something, and try to run it.
There are policy configurations you can use to enforce signing, I believe, but having those on by default would break so much stuff it’s not even funny.
It would depend on where your hex edits are. If you edit something that hasn’t been modified since the original version, it would keep your changes (with potentially-dangerous results). If you edit something that’s been modified, it would overwrite the changes (I think). If your edits straddle changed and unchanged sections, they’d be partially-undone. If you change the size of the file, it would probably trash everything.
And if you changed something that wasn’t itself modified, but the patch said “Copy bytes X through Y to location Z, but add 2 to the second byte”, then things get even weirder.
Before any of this happens, CBS checks the file's cryptographic hash to confirm that it is safe to roll back. If the check fails, the file is marked as corrupt. During scheduled Windows maintenance, CBS will repair the corruption, downloading a fresh copy from Microsoft's CDN if necessary.
Of course, none of this is bulletproof.
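Roughly speaking, the check amounts to something like the following sketch (hypothetical names, not the actual CBS interfaces):

```python
import hashlib

def safe_to_roll_back(installed_bytes: bytes, expected_sha256_hex: str) -> bool:
    # Illustrative only: trust the cached reverse patch only if the installed
    # file still hashes to the value recorded for its current version.
    return hashlib.sha256(installed_bytes).hexdigest() == expected_sha256_hex
```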
I thought Windows was doing that since long ago. Or was it Office? I remember back in 2013-ish one of our users had trouble installing an update and we found out it was because they "cleaned up" C:\Windows\Installer. Turns out update uninstallers are there and they are needed because before installing a new update you have to uninstall the current one to bring the product to the appropriate baseline (uninstall a CU to get back...
Most MSI update servicing is either full installs, a series of small MSP patches laid on top of each other, like the Express update model, or a cached full installer at major versions and caching new full patches while removing the old, akin to Quality updates. But that's all up to the developer to pick and implement the model they want.
Deleting the Installer folder is a whole other thing, though, since you've also deleted all...
No, a very different update model.
The Windows Component Based Servicing (CBS) model is governed by DISM and XML files. There are no .msi files involved. The thing that Windows was doing since long ago was storing multiple versions of the file in WinSxS.
Of course what this meant was that people were deleting the WinSxS directory because it was too big and apparently useless and they were breaking their systems that way.
Update Formats Week was interesting. I wonder if there's anything for Friday, or if we're back to regular posts.
How is the tradeoff in CPU usage between Express and Quality updates? The Express updates have to do the inventory, but the Quality updates have to apply more patches.
It's a little disappointing that the Quality updates are so much larger for the client than the Express updates. I understand that it really improves things on the server...
On the other hand, the Quality update works much better with network caching. If you're updating 1000 machines in a corporate network, you can set up a caching proxy and everybody gets a 100% cache hit. Even on personal networks, you get this benefit if you have more than one machine, since the second machine can get the update from the first machine, even if they have different starting points. So it's a trade-off that...
On a personal network, if you have more than one Win10 PC, you can also benefit from this when the “PCs from my local network” source is selected in Windows Update settings.