We are starting to rely on the Log Analytics add-on a bit more these days, and I noticed that the value field is just sort of hard-coded to a simple string. But often it's actually JSON itself. And since the whole event is built as a JSON object, I wanted the values to effectively be nested JSON in those cases.
That said, I've modified input_module_log_analytics.py in my test environment, and it seems to be working. I wanted to share it here in case anyone has feedback or is interested... and of course to let @jkat54 know.
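For context, here is roughly what the difference looks like for a row whose raw value is the string {"a": 1} (a standalone sketch, not code from the add-on):

```python
import json

raw = '{"a": 1}'

# Old behavior: the JSON payload is flattened into a quoted string
# (double quotes swapped for singles), so Splunk can't treat it as
# structured data.
old = '"value":"%s"' % raw.replace('"', "'")

# Desired behavior: valid JSON is embedded as nested JSON in the event.
new = '"value":%s' % json.dumps(json.loads(raw))
```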
So I changed this (line 89):
for n in range(len(data["tables"][0]["rows"][i])):
    field = str(data["tables"][0]["columns"][n]["name"])
    value = str(data["tables"][0]["rows"][i][n]).replace('"',"'").replace("\\", "\\\\").replace("None", "").replace("\r\n","")
    if value == "":
        continue
    else:
        data1 += '"%s":"%s",' % (field, value)
To the following. Here I try to parse the value field as JSON. If that succeeds, we use it as-is; if it fails, I fall back to the original logic. Note that I put the surrounding double quotes directly into the value assignment instead of into the data1 assignment, because they don't belong there when the value itself is JSON.
for n in range(len(data["tables"][0]["rows"][i])):
    field = str(data["tables"][0]["columns"][n]["name"])
    value = str(data["tables"][0]["rows"][i][n])
    try:
        value = json.dumps(json.loads(value))
    except ValueError:
        value = '"' + value.replace('"',"'").replace("\\", "\\\\").replace("None", "").replace("\r\n","") + '"'
    if value == "":
        continue
    else:
        data1 += '"%s":%s,' % (field, value)
I also don't think I need the empty value check anymore, since after the try/except the value is always either valid JSON or a quoted string, but I left it in just in case... it's not hurting anything.
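If it helps anyone test the idea in isolation, the per-field logic above can be pulled out into a small standalone function (format_field is a hypothetical helper name, not something in the add-on):

```python
import json

def format_field(field, value):
    """Render one field for the event JSON string. If the raw value is
    itself valid JSON, embed it as nested JSON; otherwise fall back to
    the original sanitize-and-quote logic from the add-on."""
    value = str(value)
    try:
        # Round-trip through the parser: succeeds only for valid JSON,
        # and normalizes it back to a double-quoted JSON string.
        value = json.dumps(json.loads(value))
    except ValueError:
        # Not JSON: sanitize and wrap in quotes, as the add-on did before.
        value = '"' + value.replace('"', "'").replace("\\", "\\\\").replace("None", "").replace("\r\n", "") + '"'
    return '"%s":%s,' % (field, value)
```

A JSON-looking value comes through as nested JSON, while plain text still ends up as a quoted string.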
Thanks @maciep I’ll add this to the list for the next revision!
You’re awesome for sharing!
I’d love to hear more about how you’re relying on the TA some day, maybe I can pull a testimonial from you on LinkedIn?
Thanks again,
Michael “JKat54”
One day we hope that Splunk will build a fully functional/reliable Azure add-on like they have for AWS. We were using the Azure Monitor add-on to get data from an event hub, but like most Azure add-ons out there, it stopped working too often. So we are now pushing data to Log Analytics so that we can consume it in Splunk.
As long as the add-on can keep up with the load, we should be good....