it looks like
this is the first line
this is the second line <!DOCTYPE....> ......... the rest of the xml is here
this is the third line <!DOCTYPE...> .......rest of the xml is here
this is the fourth line
this is the fifth line
I was looking into the Splunk integration with Hadoop and saw that it's scheduled for EOL (Jan 2025, per https://docs.splunk.com/Documentation/Splunk/9.2.0/HadoopAnalytics/MeetSplunkAnalyticsforHadoop). I know it has changed around a few times; there used to be a "Hadoop Connect" app before "Splunk Analytics for Hadoop". Is this happening again, where it's just moving somewhere else, or is it totally gone now, with nothing to substitute for it?
Hi, it's hard to help you without more information about your environment and queries. You could check whether this helps: https://conf.splunk.com/files/2020/slides/TRU1761C.pdf There are many more presentations that could help, too. r. Ismo
Hi SanDeep, thanks for watching the episode. The product you use to monitor your K8s cluster determines which agent you need. If you had AppD CSaaS, for example, then you would be correct: we would need to install a cluster agent. However, this video is on CCO (Cisco Cloud Observability) and COP (Cisco Observability Platform). The agents in CCO are called collectors because they make use of OpenTelemetry. I have written a guide on Cisco U that might help you: https://ondemandelearning.cisco.com/apollo-alpha/tc-cnao-app-auto-instrumentation/pages/1 There is also a guide for CSaaS: https://ondemandelearning.cisco.com/apollo-alpha/tc-appd-auto-apm-instrumentation/pages/1
Hi, maybe you could use several rows and panels with some reports and/or base and post-process searches? See more: https://docs.splunk.com/Documentation/Splunk/9.2.0/Viz/Savedsearches r. Ismo
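As a minimal sketch of the base/post-process pattern in Simple XML (the index, sourcetype, and field names here are placeholders, not from your dashboard):

```xml
<dashboard>
  <!-- Base search: runs once and is shared by the panels below -->
  <search id="base">
    <query>index=main sourcetype=access_combined | stats count by host</query>
  </search>
  <row>
    <panel>
      <table>
        <!-- Post-process search: refines the base results without re-running them -->
        <search base="base">
          <query>| sort - count | head 10</query>
        </search>
      </table>
    </panel>
  </row>
</dashboard>
```

Each extra panel can reference the same base="base", so the expensive search runs only once per dashboard load.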
Hi @Fadil.CK, after following up on your question, someone from our Ops team ran an SSL scan of agent traffic on both of your instances, and they didn't see anything reporting TLS 1.0 or 1.1.
Was this just a hypothetical question?
Hi guys, can you please help me? I'm trying to use a space as the thousands separator and I can't; the best I could get is a comma, with this:
eval value = if(value!="N/A", printf("%'d", value), value)
The result is 123,456, so I guess I could change it with a replace, maybe. But then there's problem number 2: when I try to sort by value using the column arrow, the sort isn't correct, because the bigger numbers are treated as strings. Can you help me solve this, please? Tell me if you need more information.
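One approach, as a sketch against the field name in your snippet (not tested against your data): keep a numeric copy of the field for sorting, and build a separate display string where printf's comma is swapped for a space:

```
| eval value_num = if(value=="N/A", null(), tonumber(value))
| eval value_disp = if(isnull(value_num), value, replace(printf("%'d", value_num), ",", " "))
| sort 0 - value_num
```

Sorting on value_num keeps numeric order (no string comparison), while value_disp carries the space-separated formatting, e.g. 123456 becomes 123 456. In a table you would show value_disp and sort on value_num.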
Hi, until you give us some sample data, it's hard to tell exactly how to do it. Here are some ideas for how to proceed with this case:
| rex "....(?<yourXML>....until it ends)...."
| fields _time yourXML
| xmlkv maxinputs=99999 yourXML
.... r. Ismo
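As a concrete illustration only (the closing tag </record> and the pattern are assumptions; adjust the rex to wherever your XML block actually begins and ends):

```
| rex max_match=0 "(?s)(?<yourXML><!DOCTYPE.*?</record>)"
| fields _time yourXML
| xmlkv maxinputs=99999 yourXML
```

The (?s) flag lets . match newlines so a multi-line XML block is captured, and max_match=0 captures every XML block in the event, not just the first.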
Hi, this could be the reason. I've been hunting this kind of issue myself and am still working on it. You should look at limits.conf and its [kv] stanza. Also check the TRUNCATE value on your sourcetype (and/or host and source) definition. Both of these affect how many fields Splunk automatically finds. r. Ismo
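For reference, these are the settings involved; the values shown are the usual defaults, so verify them against the limits.conf and props.conf specs for your version:

```
# limits.conf
[kv]
limit = 100        # max fields extracted by automatic key-value extraction
maxcols = 512      # kv stops creating new columns beyond this point
maxchars = 10240   # kv only inspects this many characters of an event

# props.conf (per sourcetype, host, or source)
[your_sourcetype]
TRUNCATE = 10000   # events longer than this are truncated at parse time
```

A very large event can hit TRUNCATE before auto-kv ever sees the later fields, so it is worth checking both files.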
Hello, I have some Linux systems that run this line from cron every day: /usr/bin/nmon -f -t -s 300 -c 288 -m /var/log/nmon/ As a result, I have one file per day with nmon metrics. These servers don't have connectivity to Splunk. My question is: is it possible to ingest these files into the NMON Splunk App to analyze them? I suppose I could manually load the files into the nmon index, but I'm not sure whether I have to do something first. Thank you in advance.
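If the files can be copied to a machine that does have Splunk, one generic way to load a single file is a one-shot upload from the CLI. This is only a sketch: the file name and the sourcetype below are placeholders, and the NMON app may need its own sourcetype or preprocessing, so check its documentation first:

```
# One-shot a copied nmon file into the "nmon" index.
# "nmon_data" is an assumption -- use the sourcetype the NMON app documents.
$SPLUNK_HOME/bin/splunk add oneshot /tmp/server01_240101.nmon \
    -index nmon -sourcetype nmon_data
```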
How big are the events? Splunk defaults to 200 field extractions, IIRC. Also, what type of data is it? I've seen problems extracting JSON data, especially nested JSON.
Hi @gcusello, thanks for your input on this. Yes, we have validated that "All Fields" is selected in the fields drop-down, and we are running the search in verbose mode. But nothing has helped. Our event is very big, and I am wondering whether there is a limitation Splunk is hitting in showing the fields.
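One quick way to count how many fields the search actually extracted (a sketch to append to your base search, run in verbose mode):

```
| fieldsummary
| stats count AS extracted_fields
```

If extracted_fields sits exactly at a round number like 100, that points at an auto-extraction cap in limits.conf rather than a problem with the event itself.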