Azure Automation and logs in OMS
In this installment, we will take a closer look at some aspects of automation, especially in the era of Azure Automation. I am a monitoring person at heart, and firmly believe that every piece of software running out there should adhere to the principle "the more information, the better". By that, I mean that every piece of running code should, as a minimum, log what it does logically. For example, if you are using the New-ADUser cmdlet to create a new user, you should at least log what parameters were used when creating that user. I am also a firm believer in using multiple log targets, and one of those targets should be a monitoring solution. When creating a new user, this is roughly what my logging typically looks like. For me this is good practice, but your mileage may vary.
```powershell
If($oErr) {
    Write-EventLog -EventId 501 -LogName 'Windows PowerShell' -Source 'Workflows' `
        -EntryType Error `
        -Message "Function New-UserAD: Failed to create AD user: $($User.DisplayName) with error: $oErr"
    Write-Error -Message "Function New-UserAD: Failed to create AD user: $($User.DisplayName)" -ErrorAction Continue
    $oErr = $Null
}
Else {
    $msg = "Function New-UserAD: Created AD User with manager set.`n" +
        "Attributes: `n" +
        "Name: $($ADName), GivenName: $($User.givenName), Surname: $($User.surname)," +
        "EmployeeNumber: $($User.employeeNumber), Manager: $($Manager.Name)`n" +
        "UserPrincipalName: $($User.SamAccountName + '@' + $DomainInfo.Domain)," +
        "SamAccountName: $($User.SamAccountName), EmailAddress: $email`n" +
        "Title: $($User.Title), employeeType: $($User.employeeType), " +
        "Company: $($User.Company), businessCategory: $($User.businessCategory)`n" +
        "HomePhone: $($User.homePhone), MobilePhone: $($User.MobilePhone), " +
        "OfficePhone: $($User.telephoneNumber), otherMobile: $($User.otherMobile), " +
        "IpPhone: $($User.ipPhone), OtherTelephone: $($User.otherTelephone)`n" +
        "proxyAddresses: $("smtp:" + $User.SamAccountName + '@' + $DomainInfo.Domain) & $("SMTP:" + $email)"
    Write-EventLog -EventId 501 -LogName 'Windows PowerShell' -Source 'Workflows' `
        -EntryType Information -Message $msg
    Write-Verbose -Message $msg
}
```
So the logical aspect of it, creating the user and the values for the attributes, is logged to an outside source; in this case, both the Automation database and the local computer's event log. The event log is used so a monitoring system like SCOM (System Center Operations Manager) can pick up the information and make it available through views for administrators to check. Monitoring systems are usually very powerful when it comes to visualizing what is happening in the different logical layers of the infrastructure, so it is good practice to feed them with additional data, giving them the possibility of drawing a more complete picture of what is going on. Not just for problems, but also to verify that there is a match between what the business logic dictates and what the actual code logic does.
Transitioning to the cloud and Azure Automation, I found these possibilities a bit lacking, so I have been a bit wary of recommending that customers leave SMA (Service Management Automation) behind.
Now for some good news looking ahead: with the release of OMS (Operations Management Suite), you can pull logs from Azure and centralize the information to better visualize what is happening in your Azure environment. My colleague Stanislav has some excellent blog posts about OMS here; go have a read.
Now let us get on with the main event.
For this, you need an Azure Automation account and an Azure Active Directory user that has access to the storage account you want to use for logging. Next, we will look at how to create a new storage account for this test, how to set up the Automation Runbook, and how to import the needed Integration Module.
First, let us create a storage account; in the classic portal, navigate to Storage and click New. Just a heads up: what we discuss here does not support ARM (Azure Resource Manager) storage.
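If you prefer scripting to clicking, the same classic (non-ARM) storage account can also be created with the classic Azure PowerShell module. This is only a sketch; the account name, label, and location below are placeholders for your own values:

```powershell
# Assumes the classic Azure PowerShell module is loaded and you have
# authenticated with Add-AzureAccount. All names here are placeholders.
New-AzureStorageAccount -StorageAccountName 'omslogdemo' `
    -Label 'OMS logging demo' `
    -Location 'North Europe'

# Confirm the account was created before continuing
Get-AzureStorageAccount -StorageAccountName 'omslogdemo'
```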
Check that the new storage account is online, and navigate to the Operational Insights tab.
Click the active account and navigate to Storage.
Click Add and choose the storage account we created earlier.
Check that it connects.
Now on to the next part, in Azure Automation. For this part, we need to go to the new preview portal. It can be accessed by clicking your logged-on user name in the top right corner of the classic portal and choosing to switch to the Preview Portal.
In this portal, find the Automation Accounts tab. Choose an active account and navigate to the Modules tab.
Click add a module, and add the zip file included in this blog post.
Clicking "Ok" will upload the integration module. It will take some time for it to be activated, so be patient. When everything has finished unpacking, you should be able to see the following when clicking the module name.
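As an alternative to uploading through the portal, the classic Azure module also exposes Automation cmdlets for this. The sketch below assumes the module zip has been published at a URL the service can reach; the account name and URL are placeholders:

```powershell
# Placeholder values - replace with your Automation account name and
# a URL where the AutomationWADTools zip can be downloaded from.
New-AzureAutomationModule -AutomationAccountName 'MyAutomationAccount' `
    -Name 'AutomationWADTools' `
    -ContentLink 'https://example.com/AutomationWADTools.zip'
```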
Make sure you can see the two functions at the bottom. Next, we need to create a connection entry for this module. This can be done in the Asset tab, and by clicking Connections.
I have already created these for a different storage account, but to create a new one, click "Add a connection" and choose the correct type, as shown in the picture below.
We are not quite finished yet; we need to create a credential asset with a user that has access to the subscription (usually the same one as above).
Go into Credentials and add a new one. Enter a user that has access to the subscription where the storage account is.
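This step can also be done from PowerShell with the classic Azure module; for example (the account and asset names here are placeholders):

```powershell
# Prompts for the user name and password to store in the asset.
# 'MyAutomationAccount' and 'AzureAutomation' are placeholder names.
$cred = Get-Credential
New-AzureAutomationCredential -AutomationAccountName 'MyAutomationAccount' `
    -Name 'AzureAutomation' `
    -Value $cred
```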
Now we can create the Runbook.
Choose to “Add a runbook”, and give it a name.
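For completeness, the runbook itself can likewise be created with the classic Automation cmdlets, along these lines (names are placeholders):

```powershell
# Placeholder names - adjust to your environment.
New-AzureAutomationRunbook -AutomationAccountName 'MyAutomationAccount' `
    -Name 'TestOMSlog' `
    -Description 'Writes a test entry to the WAD table for OMS pickup'
```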
Edit the new Runbook.
Check that the AutomationWADTools module is available.
Then enter the following code in the Runbook.
```powershell
$AzureCred = Get-AutomationPSCredential -Name 'AzureAutomation'
Add-AzureAccount -Credential $AzureCred

$AzureStorageConnect = Get-AutomationConnection -Name "AzureLogConnect"

$TableValues = @{
    'TableName'        = "WADWindowsEventLogsTable"
    'ProviderName'     = "Runbook TestOMSlog"
    'ProviderGuid'     = "00000000-1111-2222-1111-000000000000"
    'DeploymentId'     = "b52232c898c94c8fb72638c9320579f4"
    'Role'             = "Automation"
    'RoleInstance'     = "TestOMSlogging"
    'Channel'          = "Azure Automation"
    'Level'            = "3"
    'EventId'          = "10"
    'RawXml'           = "Hello world from Azure Automation"
    'Description'      = "Come as you are"
    'PreciseTimeStamp' = ((Get-Date).ToUniversalTime()).ToString('dd.MM.yyyy HH:mm:ss')
    'TIMESTAMP'        = ((Get-Date).ToUniversalTime()).ToString('dd.MM.yyyy HH:mm:ss')
}
$TableValues

Add-EntityAzureTable -TableValues $TableValues -AzureStorageConnect $AzureStorageConnect -Verbose
```
Now you can go to the test pane and run the code one time. The output should look something like what is shown below.
Once this has completed, you probably need to wait a bit before OMS retrieves the log. To check with OMS, go to Operational Insights in the other portal and click the Manage button.
In OMS, go and get the events and verify that the content of the TableValues variable from the previously created Runbook has made its way into the system.
That wraps up this demonstration of how to get more out of Azure Automation and OMS. And, as I always like to do in my posts, a little reminder:
Happy tinkering!