All Posts


Hi @Yogesh.Joshi, We have quite a bit of existing content on Machine Agent. A lot of it is here in the Community. Here are the search results for "Machine Agent" in the Knowledge Base AppD Docs: https://docs.appdynamics.com/appd/onprem/latest/en/infrastructure-visibility/machine-agent/administer-the-machine-agent/faqs-and-troubleshooting-for-the-machine-agent
Hi @Yogesh.Joshi, Have you seen this Knowledge Base Article? https://community.appdynamics.com/t5/Knowledge-Base/Why-is-the-Machine-Agent-not-reporting-properly/ta-p/13983 Let me know if it helps!
I have an Enterprise free trial system that I installed on an Ubuntu Server. In the GUI I went to Settings > Forwarding and receiving > Receiving > Add new, because I am going to try to set up a forwarder. On the Add New page I entered 514 in the "Listen on this port" field, and I get the error after I click Save. I want to use this for gathering syslog data from my OPNsense router and then build a dashboard for it. I also keep getting this message when trying to change settings: CSRF validation failed
We need more information.  How exactly are you trying to add a receiver port?  What command are you issuing and where are you entering it?
@gcusello I do not see an option to upload an asset in Splunk Cloud version 9.x. How do I upload an image in Splunk Cloud through the UI? Or, if that is not possible, how do I reference an external image using href? My image isn't loading, though the SharePoint URL in the href works properly. There are just the two options below. Sorry for digging up an old post.
I get the following error when I try to add a receiver on port 9997 or 514: SyntaxError: Unexpected token '<', " <p class=""... is not valid JSON. I get the same error no matter what port I enter. This is a new installation and this is the first thing I tried to do. I am somewhat of a novice with Splunk.
@scelikok No luck. I have attached an outcome screenshot for your reference. | search job_name=*Group06* OR job_name=*Group01* produces 2 events: the first belongs to Group06, the second to Group01. | search job_name=*Group06* OR job_name=rerunGroup produces only 1 event, belonging to Group06.
message=* OR city=* | eval Field2=coalesce(Field2, FieldA) | stats values(*) as * by Field2 | where isnotnull(Field1)
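The coalesce-then-group idea in the SPL above can be sketched in Python; the row data and field names are the hypothetical ones from this thread, not real output:

```python
# Rows from the two unioned searches; None marks a field absent from that event.
rows = [
    {"Field1": "John", "Field2": "Blue", "FieldA": None, "FieldB": None},
    {"Field1": None, "Field2": None, "FieldA": "Blue", "FieldB": "Ohio"},
    {"Field1": None, "Field2": None, "FieldA": "Yellow", "FieldB": "Wyoming"},
]

# eval Field2=coalesce(Field2, FieldA): fall back to FieldA when Field2 is missing.
for r in rows:
    r["Field2"] = r["Field2"] if r["Field2"] is not None else r["FieldA"]

# stats values(*) as * by Field2: merge rows that share the same Field2 key.
merged = {}
for r in rows:
    m = merged.setdefault(r["Field2"], {})
    for k, v in r.items():
        if k != "Field2" and v is not None:
            m[k] = v

# where isnotnull(Field1): keep only groups that gained a Field1 from the merge.
linked = {k: v for k, v in merged.items() if "Field1" in v}
```

The grouping key is the coalesced Field2, so the "Blue" rows from both searches collapse into one linked record while the unmatched "Yellow" row is dropped by the final filter.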
Thank you both. Is there any other approach to get this result? If so, please do help me on this. Thanks
Our networking team needs to get ASNs from public IP addresses. We found the TA-asngen add-on. I put it through splunk-appinspect and fixed the failures, added a MaxMind license key in default/asngen.conf, and installed it in our Splunk Cloud instance. When I try to run the `asngen` command, it gives this error message:

Exception at "/opt/splunk/etc/apps/TA-asngen/bin/asngen.py", line 55 : maxmind license_key is required

Just wondering if anyone has tried this TA in the cloud. Any thoughts would be much appreciated.
No, just says null now on the x-axis.
Hi, I have a unioned search where I want to link different events based on fields that have matching values. My search looks like this:

| union [search message=* | spath Field1 | spath Field2] [search city=* | spath FieldA | spath FieldB] | table Field1 Field2 FieldA FieldB

My current output looks like this:

Field1  Field2  FieldA   FieldB
John    Blue
                Blue     Ohio
                Yellow   Wyoming

However I need a way to link Field1 to FieldB if Field2=FieldA, where the output would look something like this:

Field1  Field2  FieldA   FieldB
John    Blue    Blue     Ohio
                Yellow   Wyoming

If there is a way to do something like this, please let me know, even if I need to create new fields. The excess FieldA and FieldB values are unimportant if there is no matching Field2. Please help!
There can be various reasons for this issue, but here are the common ways to troubleshoot this error. First and foremost:

- track down the automatic lookup definition
- record the lookup definition name being referenced
- find the lookup definition and record the lookup table name

Then check the following:

1. Check that your lookup file exists. You can use the Lookup Editor app, or go to Settings > Lookups > Lookup table files.
2. Check that your lookup definition exists by going to Settings > Lookups > Lookup definitions.
3. If you are using an automatic lookup:
   - Do you have read permission on the lookup definition and the lookup table?
   - If the permissions are correct, check the lookup table size (see step 4).
   If you are using the lookup command:
   - Do you have permission on the lookup table or lookup definition?
   - Does your lookup definition exist?
   - Does your search run fine after adding local=true to the lookup command? If so, your lookup isn't being replicated to the indexer cluster; see step 4.
4. Rare, but check whether the lookup table referenced by the automatic lookup exceeds the size defined under [replicationSettings] in distsearch.conf. If the lookup table exceeds whatever size is defined there, this lookup error comes up.
5. Check whether the lookup is in the deny list in distsearch.conf:

btool distsearch list replicationBlacklist --debug
btool distsearch list replicationDenylist --debug

Update: I installed SSE app v3.7.1 on a new *nix host with Splunk v9.1.1 and I didn't see any lookup errors when running a search. So I recommend you follow the troubleshooting steps above, since I can't replicate the issue with a fresh app and Splunk install.
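As a hedged illustration of the distsearch.conf pieces mentioned above — the stanza names come from distsearch.conf, but the size value and the denylist entry are made-up examples, so check the distsearch.conf spec for your Splunk version before copying anything:

[replicationSettings]
# example value only: replicated files above this size (MB) are flagged
concerningReplicatedFileSize = 500

[replicationDenylist]
# hypothetical entry excluding one large lookup from bundle replication
big_lookup = apps[/\\]search[/\\]lookups[/\\]big_lookup\.csv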
Hi @Thulasiraman, Can you please try the below?

index="jenkins" sourcetype="json:jenkins" job_name="$env$_Group*" event_tag=job_event type=completed
| eval rerunGroup = case("$group$"=="Group06", "*Group01*", "$group$"=="Group07", "*Group02*", "$group$"=="Group08", "*Group03*", "$group$"=="Group09", "*Group04*", "$group$"=="Group10", "*Group05*", 1==1, "???")
``` | table rerunGroup shows Group01 in the table ```
| search job_name=*$group$* OR job_name=rerunGroup
| head 2
| dedup build_number
| stats sum(test_summary.passes) as Pass
| fillnull value="Test Inprogress..." Pass
index="_internal" [| makeresults | addinfo | eval earliest=relative_time(info_min_time, "-7d") | eval latest=relative_time(info_max_time, "-7d") | fields earliest latest]
Hi, I have an image stored in SharePoint and I am trying to show it in a dashboard. Since it is Splunk Cloud, I do not have access to place the image under static/app on the search heads. Below is the code I am using in the dashboard, but the image isn't coming up. I did check the URL and it loads the image. <html> <centre> <img style="padding-top:60px" height="92" href="https://sharepoint.com/:i:/r/sites/Shared%20Documents/Pictures/Untitled%20picture.png?csf=1&amp;web=1&amp;e=CSz2lp" width="272" alt="Terraform "></img> </centre> </html>
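A hedged observation on the snippet above: in HTML, <img> takes a src attribute (href is ignored on img), and the element is <center>, not <centre>. A minimal corrected panel body might look like the following (URL kept from the post; whether Splunk Cloud permits loading an external image can also depend on its content-security settings, so this is a sketch, not a guarantee):

<html>
  <center>
    <!-- img uses "src", not "href" -->
    <img style="padding-top:60px" height="92" width="272"
         src="https://sharepoint.com/:i:/r/sites/Shared%20Documents/Pictures/Untitled%20picture.png?csf=1&amp;web=1&amp;e=CSz2lp"
         alt="Terraform"/>
  </center>
</html>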
Hello, are data transfer costs built into the cost model for Splunk Archive? The customer is concerned about surprises (in the form of a bill, or data caps) associated with freezing their data into the Splunk-managed archive solution.
I want to write a Splunk query that runs over the same time window but on a different date than the one selected in the datetime picker. For example, if I select 8th Aug 10:00am to 8th Aug 10:15am in the date picker, my query should give me results for the window 1st Aug 10:00am to 1st Aug 10:15am.
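The underlying arithmetic here is just shifting both ends of the window by the same offset so the window length is preserved. A Python sketch of that idea (`shift_window` is a hypothetical helper, not anything in Splunk):

```python
from datetime import datetime, timedelta

def shift_window(earliest: datetime, latest: datetime, days: int = -7):
    """Shift an [earliest, latest] window by a fixed number of days,
    keeping the window length unchanged."""
    delta = timedelta(days=days)
    return earliest + delta, latest + delta

# Picker selection: 8 Aug 10:00-10:15 -> query window: 1 Aug 10:00-10:15
e, l = shift_window(datetime(2023, 8, 8, 10, 0), datetime(2023, 8, 8, 10, 15))
```

In SPL, the same shift is what relative_time(info_min_time, "-7d") and relative_time(info_max_time, "-7d") compute once | addinfo has exposed the picker's bounds as info_min_time and info_max_time.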
Hi @PickleRick, I changed something in transforms.conf:

# set host=host.name
[set_hostname_logstash]
REGEX = \"host\":\{\"name\":\"([^\"]+)
FORMAT = host::$1
DEST_KEY = MetaData:Host

# set source=log.file.path for linux_audit and linux_secure
[set_source_logstash_linux]
REGEX = \"log\":\{\"file\":\{\"path\":\"([^\"]+)
FORMAT = source::$1
DEST_KEY = MetaData:Source

[set_sourcetype_logstash_linux_audit]
REGEX = .
#REGEX = \"log\":\{\"file\":\{\"path\":\"(/var/log/audit/audit.log)\"
CLONE_SOURCETYPE = linux_audit

[set_sourcetype_logstash_linux_secure]
#REGEX = \"log\":\{\"file\":\{\"path\":\"(/var/log/secure)\"
REGEX = .
CLONE_SOURCETYPE = linux_secure

[drop_dead_linux_audit]
REGEX = .
#REGEX = sourcetype:linux_audit
DEST_KEY = queue
FORMAT = nullQueue

[drop_dead_linux_secure]
#REGEX = sourcetype:linux_secure
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[drop_dead_all]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[keep_matching_events_linux_audit]
#REGEX = sourcetype:linux_audit
REGEX = \"log\":\{\"file\":\{\"path\":\"(/var/log/audit/audit.log)\"
DEST_KEY = queue
FORMAT = indexQueue

[keep_matching_events_linux_secure]
REGEX = \"log\":\{\"file\":\{\"path\":\"(/var/log/secure)\"
#REGEX = sourcetype:linux_secure
DEST_KEY = queue
FORMAT = indexQueue

[cut_most_of_the_event_linux_audit]
REGEX = (?ms).*\"message\":\"([^\"]+).*
FORMAT = $1
DEST_KEY = _raw
WRITE_META = true

[cut_most_of_the_event_linux_secure]
REGEX = (?ms).*\"message\":\"([^\"]+).*
FORMAT = $1
DEST_KEY = _raw
WRITE_META = true

Now linux_audit is running correctly, whereas linux_secure sometimes misses the extra-content removal. I'm working to understand why. One additional question: when I have other data flows, will they be dropped, or will they remain with the original sourcetype (logstash)? Ciao, and thank you very much for your help. Giuseppe
Try something like this | tstats earliest(_time) as earliest_event where earliest=-6mon latest=now [search index=_internal source=/opt/splunk/var/log/splunk/cloud_monitoring_console.log* TERM(logRes... See more...
Try something like this | tstats earliest(_time) as earliest_event where earliest=-6mon latest=now [search index=_internal source=/opt/splunk/var/log/splunk/cloud_monitoring_console.log* TERM(logResults:splunk-ingestion) | rename data.* as * | fields idx | rename idx as index] by index