All Topics



I'm looking for support on my $xmlRegex blacklist. I have checked as many previous tickets as I can and I'm still stuck. It works when I put the events into regex101, which is why I'm so confused. This is what I have ended up with:

[WinEventLog://Microsoft-Windows-PowerShell/Operational]
disabled = 0
start_from = oldest
renderXml = 1
# 4100 Error Log | 4104 Script Block
whitelist = 4104,4100
blacklist = $xmlRegex= $\<EventID\>(?:4104|4100)\<\/EventID\>.*\<Data\sName='ScriptBlockText'\>[\S\s]*[C-Z]:\\Program(?:\sFiles|Data)(\s\(x86\))?\\(?:qualys|Nexthink|uniFLOW\sSmartClient)\\$
blacklist1 = $xmlRegex= $\<EventID\>(?:4104|4100)\<\/EventID\>.*\<Data\sName='ScriptBlockText'\>[\S\s]*[C-Z]:\\Windows\\ccm\\$

I've had to use [\S\s]* because it's a PowerShell script which has carriage returns in it. Any help would be massively appreciated. Thanks!
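Since regex101 already matches, it can help to separate "is the pattern itself sound" from "is the inputs.conf wrapper syntax right" (the `$xmlRegex= $...$` key format and escaping are worth double-checking against the inputs.conf spec). A minimal sketch of the first half, testing the second blacklist pattern against a hypothetical multi-line event body in Python (whose `re` behaves closely enough to PCRE for this pattern):

```python
import re

# Simplified form of the blacklist1 pattern, without the $xmlRegex wrapper.
# Assumption: Splunk applies it to the full rendered XML of each event.
pattern = re.compile(
    r"<EventID>(?:4104|4100)</EventID>"
    r".*<Data\sName='ScriptBlockText'>"
    r"[\S\s]*[C-Z]:\\Windows\\ccm\\",
    re.DOTALL,  # let .* cross line breaks, like [\S\s]*
)

# Hypothetical multi-line event body (the ScriptBlockText spans lines).
event = (
    "<EventID>4104</EventID><Channel>PowerShell/Operational</Channel>"
    "<Data Name='ScriptBlockText'>line one\r\n"
    "C:\\Windows\\ccm\\script.ps1</Data>"
)

print(bool(pattern.search(event)))  # → True
```

If this matches but Splunk still ingests the events, the problem is more likely the stanza-level key syntax than the regex body.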
Hi Splunkers,

I would like to pass the label value to the macro based on a condition. When a single value is selected, the value is correctly passed to the macro and the search loads results, but when multiple values are selected the search throws an error in the macro.

<input type="multiselect" token="machine" searchWhenChanged="true">
  <label>Machine type</label>
  <choice value="*">All</choice>
  <choice value="VDI">VDI</choice>
  <choice value="Industrial">Industrial</choice>
  <choice value="Standard">Standard</choice>
  <choice value="MacOS">MacOS</choice>
  <choice value="**">DMZ</choice>
  <default>*</default>
  <initialValue>*</initialValue>
  <delimiter>, </delimiter>
  <change>
    <condition match="$label$ == &quot;*DMZ*&quot;">
      <set token="machine_type_dmz">"mcafee_DMZ=DMZ"</set>
    </condition>
    <condition match="$label$ != &quot;*DMZ*&quot;">
      <unset token="machine_type_dmz"></unset>
    </condition>
  </change>
</input>

Thanks in advance!
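One thing worth trying (a sketch, not a verified fix): `==` in a condition does literal comparison, so wildcards like `*DMZ*` are not expanded, and with a multiselect `$label$` becomes a delimited string of all selected labels. The `match` attribute accepts eval expressions, so a substring test with `match()` handles both the single- and multi-value cases:

```
<change>
  <!-- Sketch: match() does a regex substring test on the joined labels -->
  <condition match="match($label$, &quot;DMZ&quot;)">
    <set token="machine_type_dmz">"mcafee_DMZ=DMZ"</set>
  </condition>
  <condition>
    <unset token="machine_type_dmz"></unset>
  </condition>
</change>
```

The bare second `<condition>` acts as the fallback branch whenever the first does not match.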
Hi, I am trying to use a table icon viz like the one below. In the static folder I have put the "table_icons_rangemap.js" and "table_decorations.css" files. I call these files in my XML like this:

<dashboard version="1.1" script="table_icons_rangemap.js" stylesheet="table_decorations.css">

When I run the dashboard nothing happens; I just see "severe", "high" instead of an icon. I am using Splunk Enterprise version 9.1.0.1. Can anybody help, please? Thanks!
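A common cause worth checking (assuming a standard app layout): dashboard scripts and stylesheets are served from the app's appserver/static directory, not a top-level "static" folder, and Splunk Web caches those assets aggressively:

```
# Files belong under the app that hosts the dashboard, e.g.:
$SPLUNK_HOME/etc/apps/<your_app>/appserver/static/table_icons_rangemap.js
$SPLUNK_HOME/etc/apps/<your_app>/appserver/static/table_decorations.css

# After copying, force Splunk Web to re-read static assets
# (or restart Splunk) by visiting:
https://<splunk_host>:8000/en-US/_bump
```

The icon JS also typically keys off specific field values in a specific column, so the table's field names must match what the script expects.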
I have installed the latest Splunk with Splunk Enterprise Security on it. I have worked with Enterprise Security before, and there were some filters available to filter incidents; in this version, 7.3.0, there are no filters.

Is there anything I am doing wrong?
Hello Splunk experts, I would like to know: is there an API which can access all events being generated in Splunk, irrespective of any particular search? Please suggest! Thank you in advance. Regards, Eshwar
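There is no "all events" firehose API as such; access always goes through a search, but a deliberately broad export search comes close. A minimal sketch (hostname, credentials, and time range are placeholders):

```
curl -k -u <user>:'<password>' \
  https://<splunk_host>:8089/services/search/v2/jobs/export \
  -d search='search index=* earliest=-15m' \
  -d output_mode=json
```

The export endpoint streams results as they are found, which is the usual pattern when the result set is large or open-ended.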
We currently use a User service account to bind with Splunk for LDAP authorization. Is there a way to use Active Directory Managed Service Accounts instead to reduce the overhead of maintaining passwords?
Hi all, I'm a Splunk beginner. I want to show and hide corresponding pie charts using a checkbox. Can someone please guide me on how to achieve this? Any help or example queries would be greatly appreciated. Thank you!
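A minimal Simple XML sketch of the usual pattern (token, chart, and search names are placeholders): a checkbox sets a token, and the panel's `depends` attribute shows or hides it based on whether that token exists:

```
<input type="checkbox" token="pie_choice" searchWhenChanged="true">
  <label>Charts</label>
  <choice value="1">Show pie chart</choice>
  <change>
    <condition value="1">
      <set token="show_pie_panel">true</set>
    </condition>
    <condition>
      <unset token="show_pie_panel"></unset>
    </condition>
  </change>
</input>

<panel depends="$show_pie_panel$">
  <chart>
    <search>
      <query>index=_internal | stats count by sourcetype</query>
    </search>
    <option name="charting.chart">pie</option>
  </chart>
</panel>
```

Multiple checkboxes with their own tokens let each pie chart be toggled independently.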
Does anyone know how to invoke a macro on Splunk Cloud using the REST API? I am using the following command but it always returns the output "No matching fields exist.". I am able to run the same macro directly from the Splunk search page and it does return results.

curl -k -u user:"password" https://company.splunkcloud.com:8089/services/search/v2/jobs/export -d exec_mode=oneshot -d search="\`lastLoginStatsByUserProd(userid,7)\`" -d output_mode=json
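One thing worth trying (a common gotcha, though worth verifying against the REST API docs): search strings sent to the jobs endpoints must begin with a command, so a bare macro reference may not be parsed as a search at all. Assuming the macro expands to a plain index search, prefixing the literal word `search` often resolves it:

```
curl -k -u <user>:'<password>' \
  https://company.splunkcloud.com:8089/services/search/v2/jobs/export \
  -d exec_mode=oneshot \
  -d search="search \`lastLoginStatsByUserProd(userid,7)\`" \
  -d output_mode=json
```

Also worth confirming that the macro is shared globally (or at least to the app context the REST call runs in), since a private macro would not resolve for the API user.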
Hi, we have deployed the Cloudflare TA app on one of our search heads. Could anyone help me fix the log parsing issue in Splunk? App link: splunkbase.splunk.com/app/5114. Thanks
Database logs on a dashboard are not showing in Splunk. Is there anything I can do to make it work?
Greetings! We are trying to generate a table after we got output from a Splunk query. We want to pipe (|) this to our query but do not know how to do it. Can someone assist?

This is the output after we ran our Splunk query:

Feb 13 20:36:21 hostname1 sshd[100607]: pam_unix(sshd:session): session opened for user user123 by (uid=0)
Feb 13 20:36:23 hostname2 sshd[100608]: pam_unix(sshd:session): session opened for user user345 by (uid=0)

We want to capture the table in this form:

Time                Hosts        Users
Feb 13 20:36:21     hostname1    user123
Feb 13 20:36:23     hostname2    user345

And so on. How do we do this? Thank you in advance!
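A sketch of one way to do this in SPL (index/sourcetype names are placeholders; the rex assumes the pam_unix line format shown above, and host is assumed to already be the Splunk host field):

```
index=<your_index> sourcetype=<your_sourcetype> "session opened for user"
| rex "session opened for user (?<Users>\S+)"
| eval Time=strftime(_time, "%b %d %H:%M:%S")
| table Time host Users
| rename host AS Hosts
```

If the hostname in the message body differs from the Splunk host field, extend the rex to capture it from the raw text instead.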
I need some help updating the mmdb file for the iplocation command. I've read the other forum questions regarding this, as well as the docs, and I am a bit confused.

I initially uploaded the new mmdb file from MaxMind, GeoLite2-City.mmdb. I uploaded it through the GeoIP panel under the Lookups tab. It uploads, but I can't seem to find the file afterwards. I am looking on the specific server that I uploaded the file to; we have a clustered environment, but that one specific server should have it. I ran locate and find commands, but could not locate it. We still have the original under $SPLUNK_HOME/share/dbip-city-lite.mmdb.

Even though the dropbox for the mmdb file showed a successful upload, I cannot find it anywhere. I don't see any trace of the upload through splunkd, or through /export/opt/splunk/var/run/splunk/upload/, or through any find or locate command.

I wanted to update the file path to include both databases, and I know I needed to change limits.conf to include both paths. But the question is: how do I change limits.conf so that it replicates? We don't have any app named TA-geoisp or anything similar, and that's what these forums and docs reference.

Somewhere I saw that I could update the search app's limits.conf and just push that from the shcluster directory, as that pushes a bundle change out to all search heads in the cluster. Since the search app is the default app, we could just use that app to point to the mmdb files. But we don't have the search app located under $SPLUNK_HOME/etc/shcluster/apps/, and we don't seem to have the search app under our cluster master/deployer shcluster directory. I think I might be missing something. I would basically just like to update limits.conf to point to the new directory path of both of the mmdb files.

I'd like to just edit limits.conf to look like:

[iplocation]
MMDBPaths = /path/to/your/GeoIP2-City.mmdb,/path/to/your/dbip-city-lite.mmdb

The question I'm trying to ask here is: when I upload the file through the GUI, where does the file end up? And if I wanted to push these changes manually to all search heads and indexers from the deployer and deployment server, how do I go about replicating the folder that holds the mmdb files as well as the limits.conf that holds the paths to them?

Thank you for any assistance.
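On the replication half of the question, the usual pattern (a sketch under the assumption that a small custom app is acceptable; the app name here is hypothetical) is to stage an app on the deployer carrying both the limits.conf and the mmdb file, then push the bundle:

```
# On the deployer: stage a custom app with the config and the database
mkdir -p $SPLUNK_HOME/etc/shcluster/apps/geoip_config/local
cp limits.conf        $SPLUNK_HOME/etc/shcluster/apps/geoip_config/local/
cp GeoLite2-City.mmdb $SPLUNK_HOME/etc/shcluster/apps/geoip_config/

# Push to all search head cluster members:
splunk apply shcluster-bundle -target https://<sh_member>:8089 -auth admin:<password>
```

The limits.conf paths then point at the file inside the deployed app directory on every member. For indexers, the same app can be distributed from the cluster manager's manager-apps directory. The exact directory inside the app for the mmdb file is a choice, not a requirement; what matters is that limits.conf points at wherever the deployed copy lands.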
I am relatively new to the Splunk coding space, so bear with me. Currently I am trying to create a table where each row has the _time, host, and a unique field extracted from the entry:

_Time   Host            Field-Type   Field-Value
00:00   Unique_Host_1   F_Type_1     F_Type_1_Value
00:00   Unique_Host_1   F_Type_2     F_Type_2_Value
00:00   Unique_Host_1   F_Type_3     F_Type_3_Value
00:00   Unique_Host_2   F_Type_1     F_Type_1_Value
00:00   Unique_Host_2   F_Type_2     F_Type_2_Value
00:00   Unique_Host_2   F_Type_3     F_Type_3_Value
...

The data given for each server:

Field-Type=F_Type_1,.....,Section=F_Type_1_Value
Field-Type=F_Type_2,.....,Section=F_Type_2_Value
Field-Type=F_Type_3,.....,Section=F_Type_3_Value

I have created 3 field extractions for the F_Type values:

(.|\n)*?\bF_Type_1.*?\b Section=(?<F_Type_1_Value>-?\d+)

This is what I have done so far for the table:

index="nothing" sourcetype="nothing" | first(F_Type_1) by host

I am not sure this is the best approach, and I can also refine the field extraction if needed. Generally, my thought process follows: Source | Obtain first entries for all the hosts | Extract field values | Create table. But I am currently hitting a roadblock in the syntax to create rows for each of the unique Field-Types and their values.
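A sketch of one common approach (index, sourcetype, and the exact regex are assumptions based on the sample lines above): instead of three separate extractions, capture all type/value pairs as multivalue fields in one rex, zip them together, and expand one row per pair:

```
index=<your_index> sourcetype=<your_sourcetype>
| dedup host
| rex max_match=0 "Field-Type=(?<ftype>\w+)[^\r\n]*?Section=(?<fvalue>-?\d+)"
| eval pair=mvzip(ftype, fvalue, "|")
| mvexpand pair
| rex field=pair "(?<FieldType>[^|]+)\|(?<FieldValue>.+)"
| table _time host FieldType FieldValue
```

`mvzip` keeps the Nth type paired with the Nth value, and `mvexpand` turns each pair into its own row, which gives the per-Field-Type rows in the target table.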
Hello! I am trying to send data to Splunk using UDP. I tried to set it up using the documentation and watched a few videos on how to set it up, but can't get it right. I have the data coming into my HF from network devices, and it should then be sent to my indexers. After going through the setup I get this error message:

"Search peer splunk_indexer_02 has the following message: Received event for unconfigured/disabled/deleted index=<index> with source="source::udp:514" host="host::xx.xx.xx.xx" sourcetype="sourcetype::<sourcetype>". So far received events from 2 missing index(es)."

I created a new index during the setup, but there is no data to search.
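That error usually means the index exists on the instance where the input was created (the HF) but not on the indexers that actually store the data, since indexes must be defined on every indexer. A sketch of the two pieces (names and paths are placeholders):

```
# inputs.conf on the heavy forwarder:
[udp://514]
index = network_syslog
sourcetype = syslog
connection_host = ip

# indexes.conf on every indexer (or pushed from the cluster manager):
[network_syslog]
homePath   = $SPLUNK_DB/network_syslog/db
coldPath   = $SPLUNK_DB/network_syslog/colddb
thawedPath = $SPLUNK_DB/network_syslog/thaweddb
```

Once the index exists on the indexers, events that arrived before then are gone (they were dropped), so only new data will appear in search.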
Hi, I am working my way through some of the Splunk courses. I am currently on "Working with Time". In one of the videos the following command is used to find all results within the past day, rounding down:

| eval yesterday = relative_time(now(), "1d@h")

However, when I attempt this command myself, it simply prints the "yesterday" value, and it uses the time specified in my time picker, not in the actual command. I was under the impression that any time specified within a command would automatically overwrite the time picker. Was I mistaken in this? Or am I perhaps using the command incorrectly? Any help would be greatly appreciated.
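For anyone hitting the same confusion: `relative_time()` only computes a timestamp value in a field; it never changes which events are searched. The search window is set by the time picker or by `earliest`/`latest` terms in the search itself. Also note that going back a day needs a leading minus ("-1d@h"); "1d@h" moves forward. A self-contained sketch:

```
| makeresults
| eval yesterday = relative_time(now(), "-1d@h")
| eval yesterday_readable = strftime(yesterday, "%F %T")
```

To actually restrict results to that window, use the time picker or something like `earliest=-1d@h` in the search string.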
My company is transitioning from an on-premise MFA setup within ADFS to the Azure MFA setup.  What's the best approach to getting those MFA events into Splunk?  Does the Splunk Addon for Microsoft Azure (splunkbase 3757) meet that goal?  
Been struggling for a while on this one. On-prem Splunk Enterprise, v9.1.2, running on CentOS 7.9.

Just trying to find a consistent way to upload log files through HTTP Event Collector (HEC) tokens. I found the whole RAW vs JSON thing confusing at first and thought the only way to specify/override values like host, sourcetype, etc. was to package up my log file in the JSON format. I discovered today that you can specify those values in the RAW URL, like so:

https://mysplunkinstance.com:8088/services/collector/raw?host=myserver&sourcetype=linux_server

which was encouraging. It seemed to work, and I think I've gotten further ahead. I now have this, effectively, as my curl command running in a bash script:

curl -k https://mysplunkinstance.com:8088/services/collector/raw?host=myserver&sourcetype=linux_server -H "Authorization: Splunk <hec_token>" -H "Content-type: plain/text" -X 'POST' -d "@${file}"

Happy to report that I now see the log data. However, it only seems happy if it's a single-line log. When I give it a log file with more lines, it just jumbles it all together. I thought it would honour the configuration rules we have programmed for sourcetype=linux_secure (from community add-ons and our own updates), but it doesn't. Loading the same file through Settings -> Add Data has no problem properly line-breaking per the configuration. I'm guessing there is something I am missing in how one is meant to send RAW log files through HEC?
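Two likely culprits in that curl command are worth checking. First, `curl -d @file` strips carriage returns and newlines from the file before sending, so every multi-line file arrives as one long line no matter what the props say; `--data-binary` preserves them. Second, in bash an unquoted `&` in the URL ends the command and runs it in the background, so the sourcetype parameter was probably never being sent. A corrected sketch (hostname, token, and sourcetype are your own values):

```
#!/bin/bash
file="/var/log/secure"   # hypothetical input file

# Quote the URL so & stays part of it; use --data-binary so newlines survive
curl -k "https://mysplunkinstance.com:8088/services/collector/raw?host=myserver&sourcetype=linux_secure" \
  -H "Authorization: Splunk <hec_token>" \
  -H "Content-Type: text/plain" \
  --data-binary "@${file}"
```

With newlines intact, the /services/collector/raw endpoint applies the line-breaking props for the named sourcetype on the HEC-receiving instance, so those configs also need to exist there.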
Does anyone know how, and what path, to query on a Splunk Cloud instance to pull the existing SAML configuration details and certificate? I can view the information by browsing to Settings -> Authentication method -> SAML -> SAML configuration. I want to be able to export that information, if it is captured in a file, as a backup prior to migrating to a different authentication method. Thanks in advance.
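One avenue worth exploring (unverified for Splunk Cloud, where REST access on port 8089 is often gated and may require a support ticket): on-prem, the SAML settings live in authentication.conf, and the generic properties endpoint can read conf stanzas:

```
curl -k -u <user>:'<password>' \
  "https://<stack>.splunkcloud.com:8089/services/properties/authentication"
```

Whether your stack exposes this stanza, and whether the IdP certificate is retrievable this way rather than only via the UI, is worth confirming with Splunk Cloud support before relying on it as a backup.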
I have a distributed environment with 2 independent search heads. I run the same search on both, and one shows a field that the other does not. I can't figure out why. I can't find any data models that mention the index or sourcetype I'm searching. Is there a way to show whether a data model is being used in my search? The logs are coming from an IBM iSeries system using syslog through SC4S.
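Since the search heads are independent, a field present on only one is more often a search-time knowledge object (extraction, calculated field, or lookup) installed in an app on that search head than a data model. Comparing what each search head's config layers produce for the sourcetype usually finds it:

```
# Run on each search head and diff the output
# (sourcetype and field name are placeholders):
$SPLUNK_HOME/bin/splunk btool props list <your_sourcetype> --debug
$SPLUNK_HOME/bin/splunk btool transforms list --debug | grep -i <field_name>
```

The --debug flag prints which app and file each setting comes from, so a missing app or a private-scope extraction on one head stands out quickly.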
Hello all, I have an app where, to perform an action, I can't insert the required parameter as a list, only as a string. This is a big issue because I am using a data value from action results as the parameter to insert, for example:

my_App_action: action_result.data.*.device_id

and as far as I understand, the action_result.data collection is always an array, so I cannot use this returned action-results parameter directly as the parameter for my action. The only workaround I found is to add a code block that gets the datapath parameter as input and outputs value_name[0]. Is there a better workaround for this?
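A slightly more robust variant of that code-block workaround, sketched in plain Python (the function name and the comma-joining convention are assumptions; whether the downstream action accepts a joined string or only a single value depends on that app): flatten the whole list rather than taking only element [0], so multiple device IDs are not silently dropped.

```python
# Minimal sketch of a SOAR custom-code block that flattens a list-type
# datapath result (e.g. action_result.data.*.device_id) into the string
# the action parameter expects.
def flatten_device_ids(device_ids, separator=","):
    """Accept a single value or a list of values and return one string."""
    if isinstance(device_ids, (list, tuple)):
        # Drop None entries that empty datapaths can produce
        return separator.join(str(d) for d in device_ids if d is not None)
    return str(device_ids)

print(flatten_device_ids(["id-1", "id-2", None]))  # → id-1,id-2
print(flatten_device_ids("id-3"))                  # → id-3
```

If the action truly accepts only one value per call, the alternative is to loop the action over the list (one call per element) instead of collapsing it.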