vRealize Automation (vRA) Audit Logging Solution

Solution Overview


The Audit Log within the vRealize Automation (vRA) User Interface (UI) provides a concise view of the lifecycle of Virtual Machines (VMs). However, the Audit Logs are stored within the IaaS SQL database and displayed through an .aspx page within the vRA UI. The data logged is not available in any logs written to disk on the vRA appliance or the Infrastructure as a Service (IaaS) machine. As such, an alternative solution is required: to facilitate tracking and reporting on vRA provisioning, the data must be written to a vRA or vRealize Orchestrator (vRO) log on disk that can then be forwarded to a log aggregator for dashboarding.

Event Broker Solution

To satisfy the requirement, a solution was created: a vRA Event Broker subscription that triggers a vRO workflow to write lifecycle events to the vRO server log.

Implementation Steps

Create the vRO Workflow

A vRO workflow will log the events as they fire. Within vRO, create a new workflow, preferably in a folder where custom workflows are stored.

The General page should have attributes as follows:

General page of vRO workflow

The attributes are outlined in the table below:

| Name | Type | Value |
| --- | --- | --- |
| delimiterString | String | Any unique string value. It is added to the beginning and end of the log entry to allow easy identification within the target log. |

The only input to the workflow will be a Properties object type labeled payload. This value will be passed from the Event Broker each time the event fires.

input to the workflow
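When the event fires, the Properties payload can be round-tripped through JSON to get a plain object whose nested fields are readable with dot notation, which is what the workflow script does. A minimal standalone sketch of that round-trip, using an invented payload shape (the machine and lifecycleState values below are illustrative, not a complete schema):

```javascript
// Invented example payload; real Event Broker payloads carry many more fields.
var examplePayload = {
    machine: { id: "abc-123", name: "devvm001" },
    lifecycleState: {
        phase: "POST",
        state: "VMPSMasterWorkflow32.MachineProvisioned",
        event: "placeholder-event-name"
    }
};

// Serialize and re-parse, as the workflow script does with its payload input.
var payloadJsonString = JSON.stringify(examplePayload, null, 4);
var payloadJSON = JSON.parse(payloadJsonString);

console.log(payloadJSON.machine.name);         // "devvm001"
console.log(payloadJSON.lifecycleState.phase); // "POST"
```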

The workflow contains a single element: a Scriptable task labeled printProperties.

script block in workflow

The script block has two inputs: delimiterString and payload.

script block inputs

The script is below:

var systemContextJSON = getSystemContextValues();

if (payload != null) {
	System.debug("Payload not null");
	// Round-trip the Properties object through JSON to get a plain object
	var payloadJsonString = JSON.stringify(payload, null, 4);
	var payloadJSON = JSON.parse(payloadJsonString);
	System.debug("=============  Payload ============");
	System.debug(JSON.stringify(payloadJSON, null, 4));
	System.debug("=============  Payload ============");

	// Single Info-level line, wrapped in the delimiter string for easy filtering
	System.log(delimiterString + "Machine " + payloadJSON.machine.name
		+ " changed state to " + payloadJSON.lifecycleState.phase + " "
		+ (getState(payloadJSON.lifecycleState.state) == "VMPSMasterWorkflow32"
			? getState(payloadJSON.lifecycleState.event)
			: getState(payloadJSON.lifecycleState.state))
		+ " time: " + systemContextJSON.__asd_requestInstanceTimestamp
		+ " machine id: " + payloadJSON.machine.id + delimiterString);
} else {
	System.debug("Payload is null");
}

// Return the last segment of a dot-delimited state name
function getState(inState) {
	var stateArray = inState.split(".");
	return stateArray[stateArray.length - 1];
}

// Collect the workflow's system context parameters into a plain object
function getSystemContextValues() {
	var systemContext = System.getContext();
	var buildSystemContextJSON = new Object();

	for each (var param in systemContext.parameterNames().sort()) {
		buildSystemContextJSON[param] = systemContext.getParameter(param).replace(/\"/g, "");
	}

	System.debug("=============  System Context ============");
	System.debug(JSON.stringify(buildSystemContextJSON, null, 4));
	System.debug("=============  System Context ============");
	return buildSystemContextJSON;
}
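The getState helper simply returns the last dot-delimited segment of a state name. The System.* calls are vRO-only, but this string logic can be sanity-checked as plain JavaScript outside vRO:

```javascript
// Standalone copy of the getState helper from the workflow script:
// returns the last segment of a dot-delimited state name.
function getState(inState) {
    var stateArray = inState.split(".");
    return stateArray[stateArray.length - 1];
}

console.log(getState("VMPSMasterWorkflow32.MachineProvisioned")); // "MachineProvisioned"
console.log(getState("On")); // "On" (no dots: the whole string is returned)
```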

Notes about the script

1. Logging within the vRO UI will only show the desired output when the logging view is Info. However, all values within the payload and system context data are displayed when the logging view is Debug.

Info logging in vRO

Debug logging in vRO

2. To change what appears in the output, adjust the System.log line of the script above. The primary data sources are detailed below:

| Data Point | Script Variable | What's in There? |
| --- | --- | --- |
| System Context | systemContextJSON | System context around the Event Broker workflow run, i.e. provisionTracking in this solution document. |
| Payload | payloadJSON | Data from the VM in the vRA workflow run, either provisioning or destroying the machine in vRA. |

3. The number of log entries is controlled by the event broker subscription. To reduce them, i.e. only show PRE, POST, or EVENT, add that limitation to the subscription. However, review runs of both provisioning and destruction to ensure that the desired states are captured before eliminating events. It's simple to filter data in a log aggregation solution, as long as you have the data to filter.

Create the Property Group

To pass data from the event broker subscription to vRO, custom properties are added to the blueprint, specifying which values to pass. As a number of events are being logged, it is simpler to add all values as a property group and then attach the property group to the blueprint.

Property groups are created under Administration > Property Dictionary > Property Groups. Click + New to create a new property group.

Name the group stub workflow values and click + New to add each property. The properties to be added are in the table below:

| Name | Value | Encrypted | Show in Request |
| --- | --- | --- | --- |
| Extensibility.Lifecycle.Properties.CloneWorkflow.CustomizeMachine | * | No | No |
| Extensibility.Lifecycle.Properties.CloneWorkflow.InitialPowerOn | * | No | No |
| Extensibility.Lifecycle.Properties.VMPSMasterWorkflow32.Requested | * | No | No |
| Extensibility.Lifecycle.Properties.VMPSMasterWorkflow32.MachineActivated | * | No | No |
| Extensibility.Lifecycle.Properties.VMPSMasterWorkflow32.UnprovisionMachine | * | No | No |
| Extensibility.Lifecycle.Properties.VMPSMasterWorkflow32.Expired | * | No | No |
| Extensibility.Lifecycle.Properties.VMPSMasterWorkflow32.MachineProvisioned | * | No | No |
| Extensibility.Lifecycle.Properties.VMPSMasterWorkflow32.Disposing | * | No | No |
| Extensibility.Lifecycle.Properties.VMPSMasterWorkflow32.Off | * | No | No |
| Extensibility.Lifecycle.Properties.VMPSMasterWorkflow32.On | * | No | No |

Once created, add the property group to the VM in the blueprint to be logged.

Create the Event Broker Subscription

Within vRA, the event broker subscription is created under Administration > Events > Subscriptions. Click + New to add a new subscription.

On the first screen, scroll through the list and choose Machine Provisioning. This will fire on the events in which we are interested.

event subscription selection

Click Next and leave the choice as Run for all events. After implementing the solution, changes may be made to this page to limit which events and phases are triggered. Click Next.

event subscription conditions

Choose the workflow created in the previous step. Note that the input property passed to the workflow is the payload that we specified in the vRO workflow.

event subscription workflow choice

Click Next. Set the Priority and Timeout to 2 and, most importantly, click the checkbox to make this a blocking subscription. This is critical. If the subscription is non-blocking, data is not passed through the payload to the workflow.

As this is a logging solution, a priority of 2 simply makes this less important than priority 1 subscriptions, normally used for critical provisioning processes.

Click Finish.

configure blocking subscription

At this point, the subscription is created, but not active. To have it actually fire, we must publish it. To do so, highlight the subscription and click Publish.

publishing the subscription

Test the solution

To test the solution, open the vRO client and focus on the workflow. In vRA, provision a machine from a blueprint with the property group assigned. You should see the workflow execution within the vRO client.

How to Use the Logging

The data that is logged in the vRO UI is also captured in the server.log for vRO. On the vRA appliance, one path is /storage/log/vmware/vco/app-server. To utilize the logging:

  1. Forward the server.log or locate it within your log aggregation solution.
  2. Filter the output using the delimiter string specified on the logging workflow.
  3. Focus on an individual machine’s log by also filtering on the machine id within the log entry. This value will remain consistent over the life of the machine.
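As a rough illustration of steps 2 and 3, any log tooling can apply these filters; the JavaScript sketch below does so over invented sample lines (the "####" delimiter, timestamps, names, and machine ids are illustrative, not captured from a live system):

```javascript
// Invented sample server.log lines for illustration.
var logLines = [
    "2019-01-01 10:00:01 INFO ####Machine devvm001 changed state to POST MachineProvisioned time: 2019-01-01T10:00:01 machine id: abc-123####",
    "2019-01-01 10:00:02 INFO unrelated log noise",
    "2019-01-01 10:05:00 INFO ####Machine devvm002 changed state to POST MachineProvisioned time: 2019-01-01T10:05:00 machine id: def-456####"
];

var delimiter = "####";   // must match delimiterString on the workflow
var machineId = "abc-123";

// Step 2: keep only lines wrapped in the delimiter string.
var auditLines = logLines.filter(function (line) {
    return line.indexOf(delimiter) !== -1;
});

// Step 3: narrow to a single machine by its id.
var machineLines = auditLines.filter(function (line) {
    return line.indexOf("machine id: " + machineId) !== -1;
});

console.log(auditLines.length);   // 2
console.log(machineLines.length); // 1
```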

