# Integrating CycleCloud Server Logs with Azure Log Analytics for Enhanced Monitoring on Azure Dashboard

# Introduction

In environments where direct access to the CycleCloud Server Portal is restricted, organizations need alternative methods to monitor and analyze server activity. This blog outlines a solution for transferring logs from the CycleCloud Server to a Log Analytics Workspace, enabling visualization on the Azure Portal Dashboard.

The VM core usage is displayed in the upper-left corner of the CycleCloud Server Portal, as shown below. We will export this data from the CycleCloud Server logs and transfer it to the Log Analytics Workspace for visualization.


# Prerequisites

Before implementing this solution, ensure you have the following:

  1. A Python environment (Python 3.12 is used in this example).
  2. The Log Analytics Workspace ID and Shared Key, which are required to send data to the Log Analytics Workspace. You can obtain these details from the Azure Portal, or via the Azure CLI as sketched below.

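As an alternative to the portal, here is a minimal Azure CLI sketch for retrieving both values. The resource group and workspace names are placeholders; replace them with your own:

```bash
# Placeholders -- replace with your own resource group and workspace names
RG=my-resource-group
WS=my-law-workspace

# Workspace ID (the workspace's "customer ID")
az monitor log-analytics workspace show \
    --resource-group "$RG" --workspace-name "$WS" \
    --query customerId -o tsv

# Primary shared key
az monitor log-analytics workspace get-shared-keys \
    --resource-group "$RG" --workspace-name "$WS" \
    --query primarySharedKey -o tsv
```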

# Implementation

  1. Verify CycleCloud Server Command

First, confirm that you can retrieve the necessary information from the CycleCloud Server using the following command (replace CLUSTERNAME with your cluster name):

```bash
sudo /opt/cycle_server/cycle_server execute --format json 'select MachineType, count(*) as MachineCount, sum(CoreCount) as CoreCount from cloud.instancesession where ClusterName == "CLUSTERNAME" group by MachineType'
```

Sample output is shown below:

```json
[ {
  "MachineType" : "Standard_D4ads_v5",
  "MachineCount" : 1,
  "CoreCount" : 4
}, {
  "MachineType" : "Standard_D2ds_v5",
  "MachineCount" : 3,
  "CoreCount" : 6
}, {
  "MachineType" : "Standard_D4ds_v5",
  "MachineCount" : 1,
  "CoreCount" : 4
}, {
  "MachineType" : "Standard_D2ads_v5",
  "MachineCount" : 2,
  "CoreCount" : 4
} ]
```
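The query groups the cluster's active instances by machine type. As a quick sanity check, you can total the cores across machine types with jq (assuming jq is installed on the server):

```bash
sudo /opt/cycle_server/cycle_server execute --format json \
    'select MachineType, count(*) as MachineCount, sum(CoreCount) as CoreCount from cloud.instancesession where ClusterName == "CLUSTERNAME" group by MachineType' \
    | jq 'map(.CoreCount) | add'
```

For the sample output above, this prints 18.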
  2. Python Script to Transfer Logs

Next, write a Python script to send the CycleCloud Server data to the Log Analytics Workspace via the HTTP Data Collector API. In this example, the script is saved to /home/azureuser/vm_core_usage_to_law.py.

```python
import base64
from datetime import datetime, timezone
import hashlib
import hmac
import json
import requests
import subprocess

# Replace with your Log Analytics workspace ID
workspace_id = 'YourWorkspaceId'

# Replace with your Log Analytics workspace key
shared_key = 'YourWorkspaceSharedKey'

# Data Collector API URL
log_analytics_url = f'https://{workspace_id}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01'

# Log type (the name of the custom log table)
log_type = 'YourTableName'  # 'c1' in my case

# Variable for cluster name
cyclecloud_cluster_name = 'YourCycleCloudClusterName'

def execute_command(cluster_name):
    # Query the CycleCloud Server for per-machine-type VM counts and core counts
    command = f'sudo /opt/cycle_server/cycle_server execute --format json \'select MachineType, count(*) as MachineCount, sum(CoreCount) as CoreCount from cloud.instancesession where ClusterName == "{cluster_name}" group by MachineType\''
    result = subprocess.run(command, shell=True, check=True, capture_output=True, text=True)
    return json.loads(result.stdout)

# Current usage data from the CycleCloud Server
data = execute_command(cyclecloud_cluster_name)

# If data is empty, skip the subsequent code
if not data:
    print("No data available, skipping.")
else:
    # Convert JSON data to a string
    body = json.dumps(data)

    # Generate x-ms-date in RFC 1123 format (must be UTC for a valid signature)
    rfc1123date = datetime.now(timezone.utc).strftime('%a, %d %b %Y %H:%M:%S GMT')

    # Build the HMAC-SHA256 signature required by the Data Collector API
    string_to_hash = f"POST\n{len(body)}\napplication/json\nx-ms-date:{rfc1123date}\n/api/logs"
    bytes_to_hash = bytes(string_to_hash, 'utf-8')
    decoded_key = base64.b64decode(shared_key)
    encoded_hash = base64.b64encode(hmac.new(decoded_key, bytes_to_hash, digestmod=hashlib.sha256).digest()).decode('utf-8')
    signature = f"SharedKey {workspace_id}:{encoded_hash}"

    # Headers
    headers = {
        'Content-Type': 'application/json',
        'Authorization': signature,
        'Log-Type': log_type,
        'x-ms-date': rfc1123date
    }

    # Send the request
    response = requests.post(log_analytics_url, headers=headers, data=body)

    # Check the response
    if response.status_code == 200:
        print('Data posted successfully.')
    else:
        print(f'Failed to post data: {response.status_code}')
        print(response.text)
```
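Before scheduling the script, it is worth running it once by hand. The only third-party dependency is requests; everything else is in the standard library:

```bash
# Install the one third-party dependency, then do a manual test run
python3 -m pip install requests
python3 /home/azureuser/vm_core_usage_to_law.py
```

A successful run prints "Data posted successfully." Note that records posted through the Data Collector API can take a few minutes to appear in the workspace, and the first post to a new custom table can take longer while the table is created.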
  3. Schedule the Python Script

Finally, schedule the Python script to run at regular intervals using cron. For example, to run the script every minute, add the following line to the crontab. Note that the script invokes sudo, so the cron user needs passwordless sudo rights for the cycle_server command (or the job should go in root's crontab):

```bash
* * * * * python3 /home/azureuser/vm_core_usage_to_law.py
```
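If the cron job runs but no data arrives, cron's minimal environment is a frequent culprit (for example, python3 may not be on cron's PATH). A variant that uses an absolute interpreter path and appends the script's output to a log file makes troubleshooting easier; the interpreter path and log location below are assumptions, so adjust them for your system:

```bash
* * * * * /usr/bin/python3 /home/azureuser/vm_core_usage_to_law.py >> /home/azureuser/vm_core_usage.log 2>&1
```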
  4. Visualize Data on the Azure Portal

After the script has run successfully, you can visualize the data in the Azure Portal. Navigate to the Log Analytics Workspace and query the custom table with the following Kusto query. The Data Collector API appends _CL to custom table names and type suffixes to field names, which is why the query references YourTableName_CL, CoreCount_d (a double), and MachineType_s (a string):

```kusto
YourTableName_CL
| summarize TotalCoreCount = sum(CoreCount_d) by bin(TimeGenerated, 1m), MachineType_s
| order by TimeGenerated asc
| render areachart
```

The result is an area chart of total core count over time, broken down by machine type.


We can pin this Kusto query to the Azure Portal Dashboard for real-time monitoring.


# Conclusion

By integrating CycleCloud Server logs with Azure Log Analytics, organizations can monitor server activity and visualize the data on the Azure Portal Dashboard even when direct access to the CycleCloud Server Portal is restricted. This solution provides a convenient way to track server performance and resource utilization in real time.


Special thanks to the Azure CycleCloud Engineering Team for providing the inspiration for this blog post.


# Reference

  • Cost and Usage Tracking - Azure CycleCloud