Getting started with Universal Packages
At the end of last sprint we flipped the switch on a new feature for Azure Artifacts called Universal Packages.
With Universal Packages teams can store artifacts that don’t neatly fit into the other kinds of package types that we support. A Universal Package is just a collection of files that you’ve uploaded to our service and labelled with a name and version.
Universal Packages can be huge (we’ve tested up to 4TB), and our deduplication and compression technology can dramatically improve efficiency. We routinely see 10:1 size savings, which translates into reduced network transfer and storage costs.
You can store anything you like in a Universal Package but here are some of the things that we’ve seen them used for so far:
- Configuration scripts and templates (e.g. ARM templates)
- Database snapshots for integration testing
- Machine learning training data and models
- Developer tools and SDKs
- 3D models and textures
To get started with Universal Packages, download and install the VSTS CLI. Once you’ve authenticated, you can publish a package with the following command:
```
$ vsts package universal publish `
    --feed SDKs `
    --name windows-sdk `
    --version 10.0.17134 `
    --path . `
    --instance https://dev.azure.com/mseng
```
Downloading a package is just as easy: replace the publish command with the download command, and you can pull down the files that you just published.
```
$ vsts package universal download `
    --feed SDKs `
    --name windows-sdk `
    --version 10.0.17134 `
    --path . `
    --instance https://dev.azure.com/mseng
```
For those of you not familiar with the Windows SDK: the default installation on Windows takes up about 2.2GB of space on disk. That is a lot of content to put into a package, but Universal Packages handle it because of the way we process and transmit content to the service. In fact, some customers are publishing packages today that exceed hundreds of gigabytes.
Optimizing upload & download
For Universal Packages we’ve focused a lot on optimizing the amount of data that is transferred over the network. Before uploading any file, we first check whether the service is already storing it; if it is, we don’t bother transferring it. This also works at the sub-file level, so if you have a large file (potentially many GB in size), we will upload only the subset of that file’s contents that the service doesn’t already have.
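The idea of sub-file deduplication can be sketched in a few lines of Python. This is a simplified illustration, not the actual algorithm the service uses: it assumes fixed-size chunks and a hypothetical set of chunk hashes the service already holds, whereas the real implementation uses its own chunking and storage scheme.

```python
import hashlib

# Illustrative fixed chunk size; the real service chooses its own chunking.
CHUNK_SIZE = 64 * 1024

def chunk_hashes(path):
    """Split a file into fixed-size chunks and hash each one."""
    hashes = []
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            hashes.append(hashlib.sha256(chunk).hexdigest())
    return hashes

def chunks_to_upload(path, already_stored):
    """Return indices of chunks the (hypothetical) service doesn't yet have.

    `already_stored` stands in for the server-side check described above:
    any chunk whose hash is known can be skipped entirely.
    """
    return [i for i, h in enumerate(chunk_hashes(path))
            if h not in already_stored]
```

With a scheme like this, re-publishing a slightly modified multi-gigabyte file only transfers the chunks that actually changed.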
Both uploads and downloads transmit the contents of files in parallel to maximize the use of the available network connection and reduce the time it takes to transfer the package. We are also working on a cache that can reduce the amount of content you download by reusing the bytes that were transferred previously.
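The parallel-transfer pattern can be sketched with Python’s standard thread pool. Again, this is a hedged illustration rather than the client’s real code: `send` is a hypothetical callable that pushes one chunk over the wire and returns the number of bytes sent.

```python
from concurrent.futures import ThreadPoolExecutor

def transfer_chunks(chunks, send, max_workers=8):
    """Transfer chunks concurrently and return the total bytes sent.

    `send` is a stand-in for a per-chunk network call; running several
    of them at once keeps the connection saturated, which is the point
    of parallelizing uploads and downloads.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return sum(pool.map(send, chunks))
```

Because the work is I/O-bound, threads (rather than processes) are enough to overlap the network round-trips.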
This is just the start for Universal Packages. We have a bunch of new features and improvements lined up which will make them even more powerful. In the meantime I would encourage you to take a look at our getting started documentation and try it out for yourself! You can also use Universal Packages with Azure Pipelines.