Hi, maybe you could use several rows and panels with some reports and/or base and post-process searches? See more: https://docs.splunk.com/Documentation/Splunk/9.2.0/Viz/Savedsearches r. Ismo
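To illustrate the base/post-process search idea, here is a minimal Simple XML sketch (the query and panel contents are placeholders, not taken from the original question): one base search with an id, and a panel search that post-processes it via base=.

```xml
<dashboard>
  <label>Example</label>
  <!-- base search, run once -->
  <search id="base">
    <query>index=_internal | stats count by sourcetype</query>
  </search>
  <row>
    <panel>
      <table>
        <!-- post-process search: starts with a pipe, reuses the base results -->
        <search base="base">
          <query>| sort - count</query>
        </search>
      </table>
    </panel>
  </row>
</dashboard>
```

A post-process query must start with a pipe and can only use fields that the base search actually returns.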
Hi @Fadil.CK, after inquiring about your question, someone from our Ops team ran an SSL scan for Agent traffic on both of your instances, and they didn't see anything reporting TLS 1.0 or 1.1.
Was this just a hypothetical question?
Hi guys, can you please help me? I'm trying to use a space as the thousands separator and I can't; the best I could get is a comma, with this:
eval value= if(value!="N/A",printf("%'d",value),value)
Result = 123,456, so I guess I can change it with a replace. But then we have problem number 2: when I try to sort by value using the column arrow, the sort isn't correct, and the bigger numbers are treated as strings. Can you guys help me solve this, please? Tell me if you need anything else.
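One possible approach (a sketch; the display field name is illustrative): keep the original numeric field for sorting and build a separate display string, swapping printf's comma for a space with replace():

```
| eval display=if(value!="N/A", replace(printf("%'d", tonumber(value)), ",", " "), value)
| sort - value
```

Sorting on the untouched numeric value avoids the string-sort problem; only the display field carries the space separator.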
Hi, until you give us some sample data, it's hard to tell exactly how to do it. Here are some ideas on how to proceed with this case ...
| rex "....(?<yourXML>....until it ends)...."
| fields _time yourXML
| xmlkv maxinputs=99999 yourXML
.... r. Ismo
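If the event contains several XML fragments, a hedged variant of the same idea (the regex is only a placeholder and must be adapted to the real markup) is to capture every fragment with max_match and expand them before xmlkv:

```
| rex max_match=0 "(?s)(?<yourXML><\w+>.*?</\w+>)"
| mvexpand yourXML
| xmlkv maxinputs=99999 yourXML
```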
Hi, this could be the reason. I'm just hunting this kind of issue myself and still working on it. You should look at limits.conf and its [kv] stanza. Also check the TRUNCATE value in your sourcetype (and/or host and source) definition. Both of those affect how many fields Splunk automatically finds. r. Ismo
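For reference, a sketch of the settings meant here (the values are illustrative and the stanza name is a placeholder; check limits.conf.spec and props.conf.spec for the authoritative defaults):

```
# limits.conf
[kv]
limit = 200        # max fields auto-extracted per event
maxchars = 20480   # max characters scanned for KV extraction

# props.conf
[your_sourcetype]
TRUNCATE = 100000  # max event length in bytes before truncation
```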
Hello, I have some Linux systems that run this line in cron every day:
/usr/bin/nmon -f -t -s 300 -c 288 -m /var/log/nmon/
As a result, I have one file per day with nmon metrics. These servers don't have communication with Splunk. My question is: is it possible to ingest these files into the NMON Splunk App to analyze them? I suppose I could manually load the files into the nmon index, but I'm not sure whether I have to do something else first. Thank you in advance.
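One way this is often done (a sketch, not tested against the NMON app; the file name is a placeholder and the sourcetype is an assumption, so check what the app actually expects) is to copy the daily files to a Splunk host and load them with a one-shot upload:

```
splunk add oneshot /var/log/nmon/server1_240101.nmon -index nmon -sourcetype nmon_processing
```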
How big are the events? Splunk defaults to 200 field extractions, IIRC. Also, what type of data is it? I've seen problems extracting JSON data, especially nested JSON.
Hi @gcusello, thanks for your inputs on this. Yes, we have validated that "All fields" is selected in the fields drop-down. We are running the search in verbose mode, but nothing helped. Our event is very big, and I am wondering if there is any limitation Splunk is hitting in showing the fields.
Hi @richgalloway, thanks for your inputs on this. We are running the search in verbose mode only, but this did not help us. The event we are dealing with is very big, and we are wondering whether Splunk is hitting any limitation in showing all the fields in the left-side panel's selected and interesting fields.
Yes, I tried it with all the options already: with quotes, without quotes, and with double quotes. All give the same error.
Error in 'mstats' command:
Invalid token: sum(eval(if(calc:service.thaa_stress_requests_count_lr_tags>0
Try without quotes around the field names (perhaps there is something significant about the colon?):
| mstats sum(eval(if(calc:service.thaa_stress_requests_count_lr_tags>0, calc:service.thaa_stress_requests_count_lr_tags, null()))) As "Count", avg(eval(if(calc:service.thaa_stress_requests_lr_tags>0, calc:service.thaa_stress_requests_lr_tags, null()))) As "Response" where index=itsi_im_metrics by Dimension.id
I am trying to get values from the XML part of an event. The event starts with a few lines, then it has an XML part; after that, a few more lines and another XML part. I want to extract, at first, only the parts of the event that are in XML format.
We need more information. Are you trying to extract at search time or index time? Are you trying to keep the XML or discard it? Please share a sanitized sample event or two.
Thank you, that did the trick. I was trying to do everything through the field extraction page, but using 'sed' worked better. It kept all the context I needed and eliminated all the slashes.
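For later readers, a hedged sketch of the kind of sed-style cleanup described (the pattern is illustrative only; this one strips forward slashes from the raw event):

```
| rex field=_raw mode=sed "s/\///g"
```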