All Posts

As I always caution people in this forum, do not treat structured data such as JSON as text.  Regex is usually not the right tool. Is the illustrated JSON the raw event?  If so, Splunk should have given you a field named correlation_id of value ['321e2253-443a-41f1-8af3-81dbdb8bcc77'].  If it is part of a raw event that is compliant JSON, you need to show the full raw event - and Splunk should have given you a field named some_path.correlation_id.  If it is part of a raw event that is not JSON, you need to show the raw event so we can help you extract the JSON part, then you can use spath on the JSON part.  This is much more robust and maintainable than using regex on structured data.
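A minimal sketch of the spath approach (assuming the event is exactly the JSON fragment shown, with a top-level correlation_id key; a nested key would instead surface as some_path.correlation_id):

```spl
| makeresults
| eval _raw = "{\"correlation_id\": \"321e2253-443a-41f1-8af3-81dbdb8bcc77\"}"
| spath
| table correlation_id
```

If auto-extraction did not run (for example because the JSON is embedded in a larger event), spath on the isolated JSON part gives the same field without any regex.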
Illustrating raw data is an improvement.  Now could you describe the desired outcome, perhaps with a mock table, and describe the logic between the sample data and desired result?
I must filter which Host get which Risk (Hosts can have multiple Risk values) and what risk is falling away on which date and what risk is new  You need to first refine your requirement to a point where you can mathematically, perhaps even visually, represent the desired outcome. (This is really not about Splunk, but about data analytics.)  I cannot think of a single table that represents the above sentence.  Can you illustrate with a mock results table, and illustrate some mock data from which to derive that mock table?  Or are you looking for multiple charts to represent each element of that sentence?
Let me see if I can restate the requirement correctly: if IP values for a given Hostname overlap in the two lookups, that's a match, unless lookup A contains only one single IP.  A mismatch happens if there is zero overlap of IPs for a Hostname in the two, or if lookup A contains a single IP for that Hostname.  Mathematically, this translates into a test of unique values, because if there is any overlap, the total number of unique IPs must be smaller than the sum of unique IPs in each lookup.  Hence

| inputlookup lookup_A
| stats values(IP) as IP_A by Hostname
| inputlookup lookup_B append=true
| stats values(IP) as IP_B by Hostname
| eval IP = coalesce(IP_A, IP_B)
| stats values(IP_A) as IP_A values(IP_B) as IP_B values(IP) as IP by Hostname
| eval match = if(mvcount(IP_A) == 1 OR mvcount(IP) == mvcount(IP_A) + mvcount(IP_B), "no", "yes")

Your sample data gives

Hostname  IP_A                    IP_B                   IP                      match
Host A    10.10.10.1 10.10.10.2   10.10.10.1             10.10.10.1 10.10.10.2   yes
Host B    172.1.1.1               172.1.1.1 172.1.1.2    172.1.1.1 172.1.1.2     no

Below is an emulation that you can play with and compare with real data

| makeresults
| eval _raw = "Hostname,IP
Host A,10.10.10.1
Host A,10.10.10.2
Host B,172.1.1.1"
| multikv forceheader=1
| fields - _* linecount
| stats values(IP) as IP_A by Hostname
``` above emulates | inputlookup lookup_A | stats values(IP) as IP_A by Hostname ```
| append
    [| makeresults
    | eval _raw = "Hostname,IP
Host A,10.10.10.1
Host B,172.1.1.1
Host B,172.1.1.2"
    | multikv forceheader=1
    | fields - _* linecount
    | stats values(IP) as IP_B by Hostname]
``` subsearch emulates | inputlookup lookup_B append=true | stats values(IP) as IP_B by Hostname ```
| eval IP = coalesce(IP_A, IP_B)
| stats values(IP_A) as IP_A values(IP_B) as IP_B values(IP) as IP by Hostname
| eval match = if(mvcount(IP_A) == 1 OR mvcount(IP) == mvcount(IP_A) + mvcount(IP_B), "no", "yes")
Specifically, listing them using GET is proving troublesome. When I search the returned results, I don't find all alerts, but I do find all reports. The POST to create an alert is not an issue.
Hello, sorry for the missing information. I am really new to Splunk and it is complicated with all the parameters. I get one event per host per risk. That means the host with the IP 10.10.10.10 gets scanned with a vulnerability tool, and afterwards I get a log with 20 different vulnerability events: for example, 2 with the risk classification Critical, 10 with the risk classification High, and 8 with Medium. Every risk is one event for this host, meaning I get 20 different events for the same host.
Hello inventsekar, let me try to explain this. We started to monitor our Docker containers in Splunk. Unfortunately we cannot install Splunk forwarders in the containers, so we use Fluent Bit to collect all logs from the containers and send them via the HTTP Event Collector. So we cannot use a Splunk forwarder in this scenario. When containers crash, they spam exactly the same logs (only the timestamp differs) like there is no tomorrow.
Hi @ikulcsar, have you found a solution to the problem? I currently face a similar issue in Splunk 9.0.5 with an accelerated datamodel: it completes at 100% but with a size of 0 bytes and no results, while having 30 buckets, and the base search returns a million events with no errors.
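For what it's worth, one way to probe whether the acceleration summary is actually usable is to compare summary-only results against the full dataset. A sketch, where My_DataModel is a placeholder for the datamodel's name:

```spl
| tstats summariesonly=true count from datamodel=My_DataModel where earliest=-24h
```

If this returns 0 while the same search with summariesonly=false returns events, the summary itself is empty despite reporting 100% completion.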
Hi, I would like to implement the following behavior in Dashboard Studio: when a user clicks on a line chart showing the trend of a flow in terms of error count, I would like a drilldown graph to show the trend of all the errors for that specific flow. I tried with the following token: flow=click.value2, but it does not work. Any hint? Thank you. Best regards
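For reference, a hedged sketch of what the chart's event handler might look like in the Dashboard Studio source JSON (the token name flow is an assumption, and the exact key depends on which part of the chart is clicked, e.g. "name" vs "value"):

```json
"eventHandlers": [
  {
    "type": "drilldown.setToken",
    "options": {
      "tokens": [
        { "token": "flow", "key": "name" }
      ]
    }
  }
]
```

The drilldown panel's search would then reference $flow$.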
Hi team, I have configured my OTel Collector to send trace data from the adservice (OTel demo service) to AppDynamics over a proxy. My problem is that AppDynamics doesn't show any ingested data in the OTel section ("No Data available"). The collector logs show no errors. This is my collector config:

config: |
  receivers:
    otlp:
      protocols:
        grpc:
        http:
  processors:
    resource:
      attributes:
        - key: appdynamics.controller.account
          action: upsert
          value: "company"
        - key: appdynamics.controller.host
          action: upsert
          value: "company.saas.appdynamics.com"
        - key: appdynamics.controller.port
          action: upsert
          value: 443
    batch:
      send_batch_size: 90
      timeout: 30s
  exporters:
    otlphttp:
      endpoint: "https://some-agent-api.saas.appdynamics.com"
      headers: {"x-api-key": "<some-api-key>"}
    logging:
      verbosity: detailed
      sampling_initial: 10
      sampling_thereafter: 5
  extensions:
    zpages:
  service:
    telemetry:
      logs:
        level: debug
    extensions: [zpages]
    pipelines:
      traces:
        receivers: [otlp]
        processors: [resource, batch]
        exporters: [logging, otlphttp]
env:
  - name: HTTPS_PROXY
    value: proxy.company.com:8080
Good day, what screen do users get when they attempt to reply to a poll after clicking on its link, if the poll has already reached its maximum number of replies (100)? Will they still have the opportunity to see the replies chart that illustrates how everyone else answered the questions? This is the outcome that I am hoping for. Many thanks.
Hi, how are you? Thank you for the community! I have tried to search logs using the API as per "Creating searches using the REST API" in the Splunk Documentation. This seems complex but should be possible; in my experience, however, it has been impossible for me so far. How do I search in Splunk using the API? Here is what I found: https://community.splunk.com/t5/Building-for-the-Splunk-Platform/How-to-collect-debug-logs-for-apps-on-Splunk-Cloud/m-p/586144 . Kind regards, Tiago
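The basic flow from that documentation is: create a search job, note the SID returned in the response, then fetch the job's results. A minimal sketch, where the host, port, and credentials are placeholders for your own instance:

```
# 1. Create a search job; the response contains the job's SID
curl -k -u admin:changeme https://localhost:8089/services/search/jobs \
     -d search="search index=_internal | head 5" -d output_mode=json

# 2. Once the job is done, fetch its results using that SID
curl -k -u admin:changeme \
     "https://localhost:8089/services/search/jobs/<sid>/results?output_mode=json"
```

Note that the search string passed to the endpoint must start with the `search` keyword.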
how to make splunk rest api sid remains unchanged
Hi @_JP, thanks for the reply, but that's not what I'm looking for. I want the ability to list and create alerts, not view triggered alerts.
Hi @_JP, I conceptually agree with you, but the customer already has logs in Logstash and wants to use Enterprise Security, which uses CIM. For this reason I have to ingest and parse the Logstash data, while trying to persuade the customer to move to Universal Forwarders. I asked the Community whether someone has already addressed this problem, to get some hints or points of attention. Anyway, working by myself, I have already mapped some data flows to standard add-ons. Thank you for your answer. Ciao. Giuseppe
Morning, thank you for the response. The dashboard is a Classic Dashboard; I definitely don't see as many options for inputs as you have above. I'll have a look at the lookup option to speed up the search; that's something I never thought of, thank you. Kind regards, Paula
Hi @BoldKnowsNothin, yes, Splunk detects the "pattern" in the log lines. May we know more details from you? From the subject, it looks like you want to drop (i.e. not ingest) a portion of the logs. If that's correct, please provide us more details: 1) the sample logs; 2) whether you have a Heavy Forwarder or not; 3) the props.conf / transforms.conf, which may be needed later. Thanks.
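In case it helps while the details come in: the usual pattern for dropping a portion of events is a props/transforms pair on the indexer or Heavy Forwarder that routes matching events to the nullQueue. A sketch, where the sourcetype and regex are placeholders:

```
# props.conf
[your_sourcetype]
TRANSFORMS-drop_noise = drop_noise

# transforms.conf
[drop_noise]
REGEX = pattern_to_drop
DEST_KEY = queue
FORMAT = nullQueue
```

Events matching REGEX are discarded before indexing; everything else is indexed normally.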
logger:integration-fabrics-exp-api.put:\orders\submit\(storeid).Exception    message: [10-12 05:36:03] INFO Exception [[MuleRuntime].uber.12973: [integration-fabrics-exp-api-prod].util:logger-exception/processors/0.ps.BLOCKING @8989]: { "correlationId" : "787979-50ac-4b6f-90bd-64f1b6f79985", "message" : "Exception", "tracePoint" : "EXCEPTION", "priority" : "INFO", "category" : "kfc-integration-fabrics-exp-api.put:\\orders\\submit\\(storeid).Exception", "elapsed" : 3806, "locationInfo" : { "lineInFile" : "69", "component" : "json-logger:logger", "fileName" : "common/common-logger-flow.xml", "rootContainer" : "util:logger-exception" }, "timestamp" : "2023-10-12T05:36:03.317Z", "content" : { "payload" : { "api" : "integration-fabrics-exp-api-prod", "message" : "{\n \"externalOrderId\": \"275769403\",\n \"instruction\": \"275769403\",\n \"items\": [\n {\n \"id\": \"I-30995\",\n \"name\": \"Regular Chips\",\n \"unitPrice\": 445,\n \"quantity\": 1,\n \"subItems\": []\n },\n {\n \"id\": \"I-30057\",\n \"name\": \"Regular Potato \\u0026 Gravy\",\n \"unitPrice\": 545,\n \"quantity\": 1,\n \"subItems\": []\n },\n {\n \"id\": \"I-30017\",\n \"name\": \"3 Wicked Wings®\",\n \"unitPrice\": 695,\n \"quantity\": 1,\n \"subItems\": []\n },\n {\n \"id\": \"I-898-0\",\n \"name\": \"Kids Meal with Nuggets\",\n \"unitPrice\": 875,\n \"quantity\": 1,\n \"subItems\": [\n {\n \"id\": \"M-41687-0\",\n \"name\": \"4 Nuggets\",\n \"unitPrice\": 0,\n \"quantity\": 1,\n \"subItems\": []\n },\n {\n \"id\": \"M-40976-0\",\n \"name\": \"Regular Chips\",\n \"unitPrice\": 0,\n \"quantity\": 1,\n \"subItems\": []\n },\n {\n \"id\": \"M-40931-0\",\n \"name\": \"Regular 7Up\",\n \"unitPrice\": 0,\n \"quantity\": 1,\n \"subItems\": []\n }\n ]\n },\n {\n \"id\": \"I-32368-0\",\n \"name\": \"Kids Meal with Nuggets\",\n \"unitPrice\": 875,\n \"quantity\": 1,\n \"subItems\": [\n {\n \"id\": \"M-41687-0\",\n \"name\": \"4 Nuggets\",\n \"unitPrice\": 0,\n \"quantity\": 1,\n \"subItems\": []\n },\n {\n 
\"id\": \"M-40976-0\",\n \"name\": \"Regular Chips\",\n \"unitPrice\": 0,\n \"quantity\": 1,\n \"subItems\": []\n },\n {\n \"id\": \"M-40931-0\",\n \"name\": \"Regular 7Up\",\n \"unitPrice\": 0,\n \"quantity\": 1,\n \"subItems\": []\n }\n ]\n }\n ],\n \"customer\": {\n \"firstName\": \"9403\",\n \"lastName\": \"ML\",\n \"email\": \"ghgjhgj@hotmail.com\",\n \"phoneNumber\": \"897987\"\n },\n \"tenders\": [\n {\n \"type\": \"credit-card\",\n \"amount\": 3435\n }\n ],\n \"discountLines\": []\n}", "description" : "HTTP PUT on resource 'http://mule-worker-internal-order-sys-api-prod.au-s1.cloudhub.io:8091/orders/submit/716' failed: bad request (400).", "correlationId" : "1cb22ac0-50ac-4b6f-0988-64f1b6f79985", "category" : "integration-fabrics-exp-api.put:\\orders\\submit\\(storeid)", "timeStamp" : "2023-10-12T16:36:03:316000Z", "incomingMessage" : { "externalOrderId" : "9898", "instruction" : "275769403", "items" : [ { "id" : "I-30995", "name" : "Regular Chips", "unitPrice" : 445, "quantity" : 1, "subItems" : [ ] }, { "id" : "I-30057", "name" : "Regular Potato & Gravy", "unitPrice" : 545, "quantity" : 1, "subItems" : [ ] }, { "id" : "I-30017", "name" : "3 Wicked Wings®", "unitPrice" : 695, "quantity" : 1, "subItems" : [ ] }, { "id" : "I-32368-0", "name" : "Kids Meal with Nuggets", "unitPrice" : 875, "quantity" : 1, "subItems" : [ { "id" : "M-41687-0", "name" : "4 Nuggets", "unitPrice" : 0, "quantity" : 1, "subItems" : [ ] }, { "id" : "M-40976-0", "name" : "Regular Chips", "unitPrice" : 0, "quantity" : 1, "subItems" : [ ] }, { "id" : "M-40931-0", "name" : "Regular 7Up", "unitPrice" : 0, "quantity" : 1, "subItems" : [ ] } ] }, { "id" : "I-32368-0", "name" : "Kids Meal with Nuggets", "unitPrice" : 875, "quantity" : 1, "subItems" : [ { "id" : "M-41687-0", "name" : "4 Nuggets", "unitPrice" : 0, "quantity" : 1, "subItems" : [ ] }, { "id" : "M-40976-0", "name" : "Regular Chips", "unitPrice" : 0, "quantity" : 1, "subItems" : [ ] }, { "id" : "M-40931-0", "name" : "Regular 7Up", 
"unitPrice" : 0, "quantity" : 1, "subItems" : [ ] } ] } ], "customer" : { "firstName" : "9403", "lastName" : "ML", "email" : "ns@hotmail.com", "phoneNumber" : "98908" }, "tenders" : [ { "type" : "credit-card", "amount" : 3435 } ], "discountLines" : [ ] }, "errorMetadata" : { "errorType" : { "parentErrorType" : { "identifier" : "ANY", "namespace" : "MULE" }, "identifier" : "BAD_REQUEST", "namespace" : "HTTP" }, "description" : "HTTP PUT on resource 'http://mule-worker-internal-order-sys-api-prod.au-s1.cloudhub.io:8091/orders/submit/898' failed: bad request (400).", "additionalDetails" : "HTTP PUT on resource 'http://mule-worker-internal-order-sys-api-prod.au-s1.cloudhub.io:8091/orders/submit/716' failed: bad request (400).", "exception" : { "correlationId" : "1cb22ac0-50ac-4b6f-90bd-78979", "timestamp" : "2023-10-12T16:36:03:273000Z", "errorType" : "400 HTTP:BAD_REQUEST", "description" : "{\"code\":\"ghgj\",\"message\":\"CTT failed items. ModifierRequirementNotMet - 4 Nuggets,ModifierRequirementNotMet - 4 Nuggets\"}" } } } }, "applicationName" : "integration-fabrics-exp-api-prod", "applicationVersion" : "", "environment" : "prod", "threadName" : "[MuleRuntime].uber.12973: [integration-fabrics-exp-api-prod].util:logger-exception/processors/0.ps.BLOCKING @64c03d54" }

Here is the sample logger from which we need to group only the error message, "description" : "{\"code\":\"ghgj\",\"message\":\"CTT failed items. ModifierRequirementNotMet - 4 Nuggets,ModifierRequirementNotMet - 4 Nuggets...", and create an alert checking whether we are getting more than 3 such continuous errors within an hour.
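As a starting point (sketch only; the index and sourcetype are placeholders, and the nested spath path follows the sample above, where the exception description is itself a JSON string), you could extract the inner error message and alert on more than 3 occurrences in the past hour with a scheduled search like:

```spl
index=your_index sourcetype=your_sourcetype "BAD_REQUEST"
| spath output=exception_desc path=content.payload.errorMetadata.exception.description
| spath input=exception_desc output=err_message path=message
| search err_message="CTT failed items*"
| stats count
| where count > 3
```

Scheduled hourly over earliest=-1h, this fires only when the threshold is exceeded.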
JSON is data-oriented.  Everything is treated as data.  But just like a comment is useless and harmless code in "normal" programming languages, you can think of a comment as useless and harmless data in JSON.  The trick is to embed useless data in keys that the application does not reject yet does not utilize.  One trick I found on Stack Overflow (there are many like it) is to place an unusual character at the beginning of the key, but you can design your own pattern as long as Dashboard Studio doesn't find it objectionable and doesn't act on it.  This example uses "_comment":

{
  "visualizations": {
    "viz_OQMhku6K": {
      "type": "splunk.ellipse",
      "_comment": "about vizualization"
    }
  },
  "dataSources": {
    "_comment": [
      "datasource comment 1",
      "source comment 2"
    ]
  },
  "defaults": {
    "dataSources": {
      "ds.search": {
        "options": {
          "queryParameters": {
            "latest": "$global_time.latest$",
            "earliest": "$global_time.earliest$"
          }
        }
      }
    }
  },
  "inputs": {
    "input_global_trp": {
      "type": "input.timerange",
      "options": {
        "token": "global_time",
        "defaultValue": "-24h@h,now"
      },
      "title": "Global Time Range"
    }
  },
  "layout": {
    "type": "absolute",
    "_comment": "something about layout",
    "options": {
      "display": "auto-scale",
      "backgroundImage": {
        "sizeType": "contain",
        "x": 0,
        "y": 0,
        "src": "splunk-enterprise-kvstore://649ab2cf9e8252528a4843f1"
      }
    },
    "structure": [
      {
        "item": "viz_OQMhku6K",
        "type": "block",
        "position": { "x": 130, "y": 60, "w": 130, "h": 130 },
        "_commment": "structure comment here"
      }
    ],
    "globalInputs": [ "input_global_trp" ]
  },
  "_comment": "general comments go here",
  "description": "",
  "title": "Test Dashboard Studio comment"
}

Hope this helps.
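A quick way to convince yourself this is safe (plain Python, using a trimmed-down version of the dashboard definition above) is that a JSON parser treats "_comment" keys as ordinary data, which a consuming application can simply ignore:

```python
import json

# Trimmed-down dashboard definition: "_comment" keys are ordinary data
# that survive parsing untouched alongside the real keys.
source = """
{
  "_comment": "general comments go here",
  "title": "Test Dashboard Studio comment",
  "visualizations": {
    "viz_OQMhku6K": {
      "type": "splunk.ellipse",
      "_comment": "about vizualization"
    }
  }
}
"""

dashboard = json.loads(source)
print(dashboard["title"])     # Test Dashboard Studio comment
print(dashboard["_comment"])  # general comments go here
```

The "comments" only disappear if the application rejects unknown keys, which is exactly the condition stated above.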