Today we are announcing the Public Preview of the Azure Command Launcher for Java, a new tool that helps developers, SREs, and infrastructure teams standardize and automate JVM configuration on Azure. The goal is to simplify tuning practices and reduce resource waste across Java workloads.
JVM Tuning in a Cloud-Native World
Before the rise of microservices, Java applications were typically deployed as Java EE artifacts (WARs or EARs) on managed application servers. Ops teams were responsible for configuring and tuning the JVM, often on powerful servers that hosted multiple applications on a single Java EE application server instance.
With the move to cloud-native microservices, every service now runs independently with its own JVM and in its own dedicated container or virtual machine. Each service defines its own CPU and memory boundaries, and with that, its JVM tuning parameters. This shift transferred tuning responsibilities from centralized Ops teams to individual developer teams, creating complexity and inconsistency across environments.
Bradesco Bank is one example among the thousands of customers that have gone through this shift. One of the five largest banks in Latin America, with over $300 billion (USD) in assets, Bradesco has built much of its backend on Java and the JVM and now runs significant backend operations on Azure Red Hat OpenShift (ARO). Bradesco processes billions of transactions every day, supported by tens of thousands of JVMs running critical Java applications at scale.
“In our proof of concept, Azure Command Launcher for Java delivered exactly the kind of operational standardization we needed as we prepared to scale Java workloads on Azure. Early tests showed strong potential for reducing waste and simplifying performance tuning.” – Thiago Mendes, Solution Architect at Bradesco Bank
Without proper JVM tuning, development and operations teams like those at Bradesco Bank risk running into:
- Resource waste due to low utilization in dedicated cloud environments
- JVM tuning configuration drifts
- Inconsistent behavior across deployments
- Higher operational costs
- Increased mean time to resolution
Introducing Azure Command Launcher for Java
Azure Command Launcher for Java, in Private Preview since May 2025, simplifies and automates JVM configuration for cloud workloads. It works as a drop-in replacement for the standard java command and is compatible with any Azure-supported JDK, version 8 or later.
Throughout the Private Preview of Azure Command Launcher for Java, we met with several customers and found that about 20% of containerized Java workloads were manually misconfigured in production. This led to significant resource waste: JVMs were tuned to values far below the resource limits granted to their Kubernetes deployments, which forced unnecessary horizontal scaling to keep up with growing processing demands.
DevOps teams want consistent, battle-tested, and worry-free JVM tuning today. That’s where Azure Command Launcher for Java comes in. Without changing your code or adopting a new runtime, teams simply replace their usual “java -jar” command with Azure Command Launcher for Java and gain smarter defaults and standardized tuning across services. It’s a practical option for teams that want to preserve their existing JVM investments while bringing them under stronger operational control.
No code changes, no lock-in. Just replace:
java -Xmx1024m -jar myapp.jar
with:
jaz -jar myapp.jar
And Azure Command Launcher for Java manages the JVM configuration automatically.
Easy Onboarding and Rollback
By default, the tool respects any tuning flags the user provides. If it detects manual JVM settings, like -Xmx, it steps aside and does not apply its own tuning. For workloads with no tuning flags, the tool automatically uses its recommended configuration.
If operators want the tool to override manual tuning, they can enforce this behavior with:
JAZ_IGNORE_USER_TUNING=1
To return control to user-defined flags, set the variable back to:
JAZ_IGNORE_USER_TUNING=0
This approach keeps adoption safe, gradual, and fully reversible.
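For illustration, here is a minimal command-line sketch of the three modes described above, using the jaz command and environment variable from this section (the -Xmx value is only an example):

# Default: an explicit flag such as -Xmx is detected, so the launcher steps aside
jaz -Xmx1024m -jar myapp.jar

# Let the launcher override the manual tuning for this invocation
JAZ_IGNORE_USER_TUNING=1 jaz -Xmx1024m -jar myapp.jar

# Return control to the user-defined flags
JAZ_IGNORE_USER_TUNING=0 jaz -Xmx1024m -jar myapp.jar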
Smarter Defaults for Cloud Workloads
Out of the box, Azure Command Launcher for Java applies sensible JVM defaults that are optimized for dedicated containerized and virtualized environments. These defaults are based on widely accepted best practices and insights gathered from real-world Java workloads on Azure.
This allows teams to start with a configuration that is more closely aligned with modern cloud deployment models, helping reduce manual setup and the risk of configuration drift across services.
To understand the reasoning behind these defaults, see this article from our JVM Performance Architect, Monica Beckwith.
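Independently of the launcher, you can always verify the configuration a running service actually ended up with by using standard JDK tooling. For example, jcmd ships with every JDK and is not specific to Azure Command Launcher for Java:

# List running Java processes and their PIDs
jcmd -l

# Print the flags the JVM is actually running with (heap sizing, GC selection, and so on)
jcmd <pid> VM.flags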
Adaptive Learning and AI-Assisted Tuning
While the Public Preview focuses on standardization and improved default configurations for immediate benefits, the roadmap includes more advanced and intelligent capabilities for the future.
Planned features include adaptive learning, where the tool gradually analyzes JVM telemetry over time and suggests further optimizations. This capability will be introduced in later releases, after further validation with customers and partners.
We also plan to incorporate features such as Application Class-Data Sharing (JEP 310) so users benefit from it automatically. Longer term, we plan to enable Project Leyden.
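For context on the AppCDS item above, this is roughly what the manual workflow looks like today on recent JDKs (JDK 13 and later, using dynamic archives that build on JEP 310); the archive file name is illustrative, and the point of the roadmap item is that the launcher would take care of these steps for you:

# Training run: record loaded classes and write an archive when the application exits
java -XX:ArchiveClassesAtExit=myapp.jsa -jar myapp.jar

# Subsequent runs: start from the archive for faster class loading
java -XX:SharedArchiveFile=myapp.jsa -jar myapp.jar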
Built for Azure
Azure Command Launcher for Java works across Microsoft compute and developer services, including but not limited to:
- Azure Kubernetes Service
- Azure Container Apps
- Azure App Service
- Azure Functions
- Azure Red Hat OpenShift
- Azure Virtual Machines
- Azure DevOps
- GitHub Codespaces
- GitHub Actions
Linux binaries are available for x64 and ARM64 architectures, and the tool also comes pre-bundled in the latest Microsoft Build of OpenJDK container images.
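As a hypothetical sketch of running a workload from one of those pre-bundled images, the command below uses a Microsoft Build of OpenJDK image; the image tag, resource limits, and paths are assumptions and may differ in your environment, and not every tag may include the launcher:

# Launch the app through jaz, letting it derive tuning from the container's CPU and memory limits
docker run --rm --cpus 2 --memory 2g \
  -v "$PWD/myapp.jar:/app/myapp.jar" \
  mcr.microsoft.com/openjdk/jdk:21-ubuntu \
  jaz -jar /app/myapp.jar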
Get Started
The Public Preview is now available to all customers. To get started, visit the documentation page for more information on how to configure and use the tool.