All Topics

Hi all, after the last Windows update (Jan 2022), a Windows TA input blacklist filter for Security log events no longer works; before that it worked fine. The blacklist filter looks like this:

blacklist = EventCode="(4634|4672)" Message="Account\sName:\s+(?i)([\S+]+[\$]|serviceaccount1|serviceaccount2)"

The blacklist should filter out computer accounts and other service accounts for certain event codes. Does anyone have the same problem, or can someone help with this? Thanks a lot.
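As a sanity check, the Message regex can be exercised outside Splunk. A minimal Python sketch follows; the sample event text is invented, and the pattern rewrites `[\S+]+` as `[\S]+` on the assumption it was meant to match one or more non-space characters (inside a character class, `+` is a literal):

```python
import re

# Scoped (?i:...) is used instead of a mid-pattern (?i), which newer Python
# versions reject; Splunk's PCRE accepts either form.
msg_pattern = re.compile(r"Account\sName:\s+(?i:[\S]+[\$]|serviceaccount1|serviceaccount2)")
code_pattern = re.compile(r"(4634|4672)")

# Hypothetical sample event text, for illustration only.
event = "EventCode=4634 ... Account Name: WORKSTATION01$ ..."

# An event is blacklisted only if both the EventCode and Message patterns match.
blacklisted = bool(code_pattern.search("4634")) and bool(msg_pattern.search(event))
print(blacklisted)
```

If the pattern matches here but the blacklist still passes events through after the Windows update, the likely cause is a change in the rendered Message text rather than the regex syntax.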
Hi All, I want to extract the following word from a sentence:

nodeUrl=https://sappbos.aexp.com/odata.svc/v1.0/BlazeoData/UddsSappSupplierPymntAccpts?
nodeUrl=https://merchantcompass.aexp.com/odata.svc/v1.0/BlazeoData/MerchantCompassLookups

Can someone please guide me?
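Assuming the "word" wanted is the last path segment of each URL (e.g. UddsSappSupplierPymntAccpts), a rex along the lines of `rex field=nodeUrl "/(?<word>[^/\?]+)\??$"` would capture it; the same pattern can be verified in Python:

```python
import re

urls = [
    "nodeUrl=https://sappbos.aexp.com/odata.svc/v1.0/BlazeoData/UddsSappSupplierPymntAccpts?",
    "nodeUrl=https://merchantcompass.aexp.com/odata.svc/v1.0/BlazeoData/MerchantCompassLookups",
]

# Capture the last path segment after the final '/', dropping any trailing '?'.
words = [re.search(r"/([^/?]+)\??$", u).group(1) for u in urls]
print(words)
```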
Hi, I want to integrate my website with Splunk to track page views, sessions, etc. Is there any tutorial about this?
I am trying to get data into Splunk to show the members of the local/built-in Windows groups, in particular "Administrators" and "Remote Desktop Users", utilizing the Splunk Universal Forwarder. I am using a WMI (WQL) query to do this via wmi.conf (C:\Program Files\SplunkUniversalForwarder\etc\apps\Splunk_TA_windows\local\wmi.conf).

This stanza currently works (FYI: Fakenameofserver = hostname):

disabled = 0
## Run once per day
interval = 86400
wql = ASSOCIATORS OF {win32_group.Domain="Fakenameofserver",Name="Administrators"} where assocClass=win32_groupuser Role=GroupCompOnent ResultRole=Partcomponent
index = window

I don't want to have to prefill the WQL queries in the wmi.conf file with the server name on each server. How do I use an environment or Splunk variable to replace "Fakenameofserver" with the name of the host the forwarder is running on? I have tried a number of combinations of $host, %host%, %servername%, %computername%, etc. Every time I restart the forwarder to force the query to run, I don't get any data into Splunk, and the log file says:

Error occurred while trying to retrieve results from a WMI query (error="Object cannot be found." HRESULT=80041002) (root\cimv2: ASSOCIATORS OF {win32_group.Domain="%VARIABLENAME%",Name="Remote Desktop Users"} where assocClass=win32_groupuser Role=GroupCompOnent ResultRole=Partcomponent)

Has anyone had success with this? Can you suggest how I can get the stanza to resolve the variable into the value when it queries? Where should I define the variables (if required), and what syntax do I use when writing them in the WQL query? Thanks for any suggestions.
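The error suggests the variable name is passed through as a literal string, i.e. wmi.conf does not appear to expand environment variables inside the wql value. One workaround is to render the hostname into the stanza at deployment time with a small script. A hedged sketch (stanza name and index are placeholders, not Splunk defaults):

```python
import socket

# Hypothetical template rendered by a deployment script, since wmi.conf
# seemingly does not expand variables in the wql value. Doubled braces
# become literal { } in the output.
template = (
    '[WMI:LocalGroupMembers]\n'
    'disabled = 0\n'
    'interval = 86400\n'
    'wql = ASSOCIATORS OF {{win32_group.Domain="{host}",Name="Administrators"}} '
    'where assocClass=win32_groupuser Role=GroupComponent ResultRole=PartComponent\n'
    'index = windows\n'
)

# Substitute the local machine name where "Fakenameofserver" would go.
stanza = template.format(host=socket.gethostname().upper())
print(stanza)
```

The rendered text would then be written to the local wmi.conf before the forwarder starts, so each host carries its own name without manual editing.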
Hi, why doesn't my background-color property work, while the font-size property does? Thanks.

<row>
  <panel>
    <html>
      <style>
        <b>
          <font size="4" face="verdana" color="blue" background-color="white">
            <center> </center>
          </font>
        </b>
      </style>
    </html>
  </panel>
</row>
I am trying to find frequently used search filters from my application log. I have written the query below to extract a JSON object from the log and store it in the search_filter field:

index="*" "Searching records"
| rex field=_raw "(?P<search_filter>\{.*\})"
| eval search_filter=replace(search_filter,"\\\\\"","\"")

The resulting JSON in the search_filter field for one event looks like this:

{ "pageSize": 0, "offset": 0, "criteria":[ { "field": "status", "operator": "equalsIgnoreCase", "values":["REPROCESS"] }, { "field": "id", "operator": "equals", "values":["352353"] }] }

Now I want to convert this JSON into the format below, and then finally sort this array and list it in a table ordered by count:

[status##equalsIgnoreCase,id##equals]

I tried the following:

index="*" "Searching records"
| rex field=_raw "(?P<search_filter>\{.*\})"
| eval search_filter=replace(search_filter,"\\\\\"","\"")
| eval cfo=json_extract(search_filter, "criteria{}.field", "criteria{}.operator")
| eval cf=json_extract(cfo,"{0}")
| eval co=json_extract(cfo,"{1}")
| eval cfos=mvzip(cf, co, "##")

This produces cfos values like the following, which is not what I want; I am not able to use mvzip on the JSON arrays:

["status","id"]##["equalsIgnoreCase","equals"]

Any suggestions on how to go about this, or is there a better way to find frequently used filters in my scenario?
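The target output pairs each criterion's field with its own operator, element by element, rather than zipping two whole JSON arrays. The intended pairing logic, sketched in Python on the sample JSON from the post:

```python
import json

# The sample search_filter JSON from the post.
search_filter = json.loads("""
{ "pageSize": 0, "offset": 0, "criteria": [
    {"field": "status", "operator": "equalsIgnoreCase", "values": ["REPROCESS"]},
    {"field": "id", "operator": "equals", "values": ["352353"]}
]}
""")

# Walk the criteria array and join field/operator per element.
pairs = [f"{c['field']}##{c['operator']}" for c in search_filter["criteria"]]
print(pairs)
```

In SPL the analogous step is to get the two multivalue fields into parallel element-wise form (e.g. via spath or mvexpand over criteria{}) before applying mvzip, so each row carries scalar field/operator values rather than whole arrays.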
Hello, I'm currently working on configuring SSL from a UF sitting on a Windows server to a HF running on RHEL 7. I am using third-party certs that I obtained from my lab Windows PKI environment, which has two tiers of CA (RootCA/SubCA), with the HF having a signed cert (.pem file) and each UF sharing a single cert that was signed by the same subordinate CA. I've managed to successfully confirm that this SSL connection is happening using the "Validate your configuration" doc from Splunk, but the configuration seems to contradict itself, so I would appreciate some insight into where I may have gone wrong.

The contradiction involves the requireClientCert parameter on the Splunk HF and the clientCert parameter on the Universal Forwarder. I have tested several times after restarting the Splunk service on both hosts, and the connection ONLY works when I have both clientCert configured in outputs.conf and requireClientCert = false. If either of these values is changed, or I remove the clientCert parameter (even though it should technically not be required), I get a connection error.

For example, with the configuration below, if I changed requireClientCert = true, the entire connection would fail, even though I believe the clientCert .pem file is configured correctly.
My certificate chains for each host are as follows:

HF:

adcs.pem (shared with clients via deployment server, to be used for the sslRootCAPath parameter):
  <intermediate (subordinate/issuing) CA certificate>
  <root CA certificate>

heavyforwarder01.pem (certificate chain to be used for the serverCert parameter):
  <hf01.pem cert issued by subordinate CA>
  <decrypted RSA key>
  <subca_pub.pem>
  <rootca_pub.pem>

UF:

server.conf:
  sslRootCAPath = C:/path/to/adcs.pem

universalforwarder01.pem (certificate chain to be used for the clientCert parameter):
  <uf01.pem certificate issued by subordinate CA>
  <encrypted RSA key for cert>
  <subca_pub.pem>
  <rootca_pub.pem>

Configuration for outputs.conf on the Universal Forwarder:

[tcpout]
defaultGroup=splhf01

[tcpout:splhf01]
disabled=0
server = splhf01.domain.local:9998
clientCert = C:\path\to\universalforwarder01.pem
sslPassword = <redacted>
useClientSSLCompression = true
sslCommonNameToCheck = splhf01.domain.local
sslVerifyServerCert = true

Configuration for server.conf on the Universal Forwarder:

[sslConfig]
sslRootCAPath = C:\path\to\combined\adcs.pem

Configuration for inputs.conf on the Heavy Forwarder/Indexer:

[default]
host = splhf01

[splunktcp:9997]
disabled = 0

[splunktcp-ssl:9998]
disabled = 0

[SSL]
serverCert = /opt/splunk/etc/auth/ssl/s2s/heavyforwarder01.pem
requireClientCert = false
sslVersions = *,-ssl2
sslCommonNameToCheck = splhf01.domain.local

Configuration for the Heavy Forwarder server.conf:

<..>
[sslConfig]
sslRootCAPath = /path/to/combined/adcs.pem
<...>
Hi guys, I am definitely a Splunk novice. I want to run a search with the Splunk REST API; it is a tstats search on a data model. The issue I am facing is that the results take extremely long to return. When I run the same search on the front end it is extremely fast, but via the REST API, for 3 results, it takes between 7 and 10 minutes, whereas the front end returns the results quickly. My search is structured as follows:

https://<server>:8089/services/search/jobs -d "search= | tstats summariesonly=1 values(<value>) (there are a few values after this) from datamodel=<datamodel name> WHERE (some values for the values option before the from) | head 3" -d earliest=-5m@m -d latest=now -d output_mode=json

When I run an index search on which the data model is built (which is slower on the front end), it returns results as soon as I hit the /results endpoint, but the data model search takes extremely long. Any ideas as to what my problem could be? The search does eventually return the results, but it takes very long for the result size I'm requesting.
Hi guys, I'm trying to run a search against the /jobs endpoint; however, I get a

bash: syntax error near unexpected token `('

error message. My search has quotes in it for a | rex command, and I tried escaping the quotes with \, but I still seem to get the issue. When using \, I get an

<msg type="ERROR">Unparsable URI-encoded request data</msg>

error. My search is structured as follows:

| tstats summariesonly=1 values(<values>) ... (there are a lot of these) from datamodel=<name> WHERE (some values for the previous section) | lookup <lookup> | rex field=<name> "(?<new field name>[^.]{9}$)" ...

There are about 4 lookups in total and 2 rex commands; however, when I try to escape inside the rex command, I get the Unparsable URI error. Has anybody come across this error before?
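Both symptoms point at encoding rather than SPL: the shell chokes on unquoted parentheses, and hand-placed backslashes reach splunkd as malformed URI data. Rather than escaping by hand, the POST body can be percent-encoded programmatically (with curl, `--data-urlencode "search=..."` does the same job). A sketch, with an invented tstats query standing in for the real one:

```python
from urllib.parse import urlencode

# Hypothetical SPL containing the quotes and parentheses that trip up bash.
spl = ('| tstats summariesonly=1 values(host) from datamodel=Network_Traffic '
       '| rex field=src "(?<tail>[^.]{9}$)"')

# urlencode percent-encodes every reserved character, so nothing needs
# shell-level backslash escaping.
body = urlencode({"search": spl, "output_mode": "json"})
print(body)
```

The resulting string is safe to pass as the raw request body; pipes become %7C, parentheses %28/%29, and quotes %22.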
I'm still new, and struggling with the following. I am looking at a set of data from three probes. If all three probes show a success of "down" for the reporting period, then I would like the result to be "down", but if one or more have a success of "up", then it should be considered "up" for that time span. For instance:

12:00 - Probe1 - Up
12:00 - Probe2 - Down
12:00 - Probe3 - Up
13:00 - Probe1 - Down
13:00 - Probe2 - Down
13:00 - Probe3 - Down

So, for 12:00 the value would be Up, and for 13:00 the value would be Down.
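This is an "any-up wins" aggregation per time bucket; in SPL it would typically be a stats/eval step grouped by _time. The logic itself, sketched in Python on the sample data (the duplicate Probe2 at 13:00 in the post is read as Probe3):

```python
from collections import defaultdict

# (time, probe, status) readings adapted from the post.
readings = [
    ("12:00", "Probe1", "Up"), ("12:00", "Probe2", "Down"), ("12:00", "Probe3", "Up"),
    ("13:00", "Probe1", "Down"), ("13:00", "Probe2", "Down"), ("13:00", "Probe3", "Down"),
]

# Group statuses per timestamp.
by_time = defaultdict(list)
for t, probe, status in readings:
    by_time[t].append(status)

# A timespan is Up if at least one probe reported Up; Down only if all did.
result = {t: ("Up" if "Up" in statuses else "Down")
          for t, statuses in sorted(by_time.items())}
print(result)
```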
What I would like to do is take this form from regedit and splash it into Splunk. I have exported data from the \WMI\Autologger level, put a sort of serial number on each "line", and tried to convert it into .csv. When I use Add Data and do an index-once upload of the file, the events do not come out the way I want [screenshots lost]. My main request is to be able to build a table with all that information spread over multiple columns; that would help me a lot. What can I do or modify in my file so Splunk knows what is what and where it goes? Should I leave it like this, or would it be better to have multiple pieces of info in the same "row"? How can I tell Splunk to get the data like this? Thanks to anyone who reads this.
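One common approach is to convert the .reg export into a real CSV with a header row before uploading, so Splunk's CSV sourcetype maps each value to its own column. A hedged sketch (the key path and value names are a hypothetical Autologger fragment, not the poster's actual export):

```python
import csv
import io

# A hypothetical fragment of a .reg export: a key in brackets, then
# name=value pairs belonging to that key.
reg_text = r'''[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\WMI\Autologger\ExampleLogger]
"Start"=dword:00000001
"BufferSize"=dword:00000040
'''

rows = []
key = None
for line in reg_text.splitlines():
    line = line.strip()
    if line.startswith("["):          # a new registry key
        key = line.strip("[]")
    elif "=" in line:                 # a value under the current key
        name, value = line.split("=", 1)
        rows.append({"key": key, "name": name.strip('"'), "value": value})

# Emit a proper CSV with a header row so each field lands in its own column.
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["key", "name", "value"])
writer.writeheader()
writer.writerows(rows)
print(out.getvalue())
```

With a header row present, the index-once upload can use a CSV sourcetype and no manual serial numbering is needed.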
Hi, all! How could I edit my search command to filter this table so that it displays the earliest time for each value of Call_Session_ID? Here is my original search command:

index="hkcivr" source="/appvol/wlp/DIVR01HK-AS01/applogs/progresshk.log*" | table Time Call_Session_ID
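In SPL this is usually a stats step such as `stats earliest(Time) by Call_Session_ID` in place of the table command. The intended grouping logic, sketched in Python with invented sample times (string comparison works here because the timestamps are fixed-width):

```python
# Sample events; Time values are hypothetical HH:MM:SS strings.
events = [
    {"Time": "10:02:11", "Call_Session_ID": "A1"},
    {"Time": "10:00:05", "Call_Session_ID": "A1"},
    {"Time": "10:01:30", "Call_Session_ID": "B2"},
]

# Keep the smallest (earliest) Time seen per session ID.
earliest = {}
for e in events:
    sid = e["Call_Session_ID"]
    if sid not in earliest or e["Time"] < earliest[sid]:
        earliest[sid] = e["Time"]
print(earliest)
```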
Hi, I have two results like this:

REQ
Name            count
Node1.Node2     100
Node3.Node4     500

RSP
Name            count
Node2.Node1     60
Node4.Node3     400

How can I compare them on a timechart? E.g., put them on a timechart so I can see that Node2 received 100 REQs but responded to only 60 of them. I need to put them all on one timechart. Any ideas? Thanks.
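The REQ and RSP names are the same node pair written in opposite directions, so the comparison needs a normalized key (e.g. always flip the RSP name before joining); in SPL that might be an eval over the Name field before charting. The pairing logic, sketched in Python on the sample counts:

```python
# Sample counts from the post.
req = {"Node1.Node2": 100, "Node3.Node4": 500}
rsp = {"Node2.Node1": 60, "Node4.Node3": 400}

def flip(name):
    """Reverse a 'A.B' node pair to 'B.A' so REQ and RSP share a key."""
    a, b = name.split(".")
    return f"{b}.{a}"

# For each request pair, look up the matching (reversed) response count.
paired = {name: (count, rsp.get(flip(name), 0)) for name, count in req.items()}
print(paired)
```

With both series keyed identically, charting requests against responses per pair over time becomes a straightforward split-by.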
Hi, I have two fields that extract send & rec like this:

| rex "S\[(?<SEND>\w+\.\w+)"
| rex "R\[(?<REC>\w+\.\w+)"

Now I have two queries like this:

| table SEND count
SERVER1.HUB    10

| table REC count
HUB.SERVER1    50

I need to combine them, and the expected result is a Sankey diagram like this:

SERVER1 (10) > HUB (50) > SERVER1

FYI: the common value between the result strings is HUB. Any ideas? Thanks.
I've installed Splunk as a standalone instance, and I'm trying to run Splunk commands under /opt/splunk, but they don't work. My question is: what path/folder should I be in to run Splunk commands like the following?

splunk show splunkd-port
splunk show web-port
splunk show servername
Hi Community, I need to move current data from one of my indexes into an S3 bucket. Is that possible? I read about the SmartStore feature; however, I need to move the data to another location after the index reaches a particular size. The problem is that I have an index growing like crazy. I need to have the data available for at least one year and be able to perform searches on it for at least six months. And of course, local storage is too expensive. So, I am getting confused about the best approach to follow, since that's the only index causing me issues. In 60 days, I got around 300 GB of data. Thanks.
Hello, I have been trying to find a way to get internet service provider (ISP) information for IPs collected from a honeypot project I'm working on and have it displayed on a dashboard. I have seen a few Splunk apps that could do this; however, they all seem to be out of date. If anyone could point me in the right direction, that would be much appreciated. Thank you.
I have a raw JSON string from which I have to extract the "Source device","values":[{"ip": key and paired value. Can you please assist? The log line looks like this:

"Source device","values":[{"ip":"10.10.10.10","mac"

I want to extract the IP address: 10.10.10.10
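Since the fragment is not well-formed JSON on its own, a rex capture on the raw text is the usual route, e.g. `rex "\"ip\":\"(?<ip>\d{1,3}(?:\.\d{1,3}){3})\""`. The same pattern, verified in Python on the sample line:

```python
import re

raw = '"Source device","values":[{"ip":"10.10.10.10","mac"'

# Capture the dotted-quad value that follows the "ip" key.
m = re.search(r'"ip":"(?P<ip>\d{1,3}(?:\.\d{1,3}){3})"', raw)
print(m.group("ip"))
```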
Hi, I have registered successfully for Phantom and also got a link to download Phantom, but I could not find the Phantom OVA (blank). Can anyone help me understand why I have not got the OVA file?
Hi Splunkers, I am integrating Jamf Pro logs into Splunk using the Jamf Pro Add-on for Splunk (https://splunkbase.splunk.com/app/4729/). I have defined the inputs as per the documentation; however, I am not getting logs on the indexer. I can see the errors below in splunkd.log and jamf_pro_addon_for_splunk_jamfcomputers.log. Any suggestions regarding this will be greatly appreciated. Thanks.

Error 1: 2022-01-28 13:44:58,590 ERROR pid=10751 tid=MainThread file=base_modinput.py:log_error:309 | Get error when collecting events.

Error 2: