All Posts


Hi All, I am trying to create a dashboard with the total calls to a particular business transaction using an ADQL query. I am able to fetch this from all applications except the Lambda application: although the transactions show up in the application dashboard, I am unable to get the count using the query, and I am also unable to list the BTs of that application using a query. The same query works for other applications and business transactions. PFB the query I used: SELECT count(*) FROM transactions WHERE application = "test-1" AND transactionName = "/test/api/01/" Please check and let me know why I am not able to pull this. Regards, Fadil
Many thanks @jawahir007 
| rex field=raw_msg max_match=0 "(?<=\(|]\\\\;)(?<group>[^:]+:status:[^:]*:pass_condition\[[^\]]*\]:fail_condition\[[^\]]*\]:skip_condition\[[^\]]*)\]"
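For anyone testing this pattern outside Splunk, here is a rough Python equivalent. The sample raw_msg below is invented to match the shape the pattern expects (the real format isn't shown in the thread), and because Python's re only allows fixed-width lookbehinds, the (?<=\(|]\\;) prefix is rewritten as a non-capturing group; each match still captures one "group" value.

```python
import re

# Invented sample matching the pattern's expected shape (an assumption)
raw_msg = (
    "(check1:status:ok:pass_condition[x>0]:fail_condition[x<0]:skip_condition[x=0]]"
    "\\;check2:status:fail:pass_condition[y>1]:fail_condition[y<1]:skip_condition[y=1]]"
)

pattern = re.compile(
    r"(?:\(|\]\\;)"                      # a group starts after "(" or "]\;"
    r"(?P<group>[^:]+:status:[^:]*"
    r":pass_condition\[[^\]]*\]"
    r":fail_condition\[[^\]]*\]"
    r":skip_condition\[[^\]]*)\]"
)

# findall returns the captured "group" value of every match, like max_match=0
groups = pattern.findall(raw_msg)
for g in groups:
    print(g)
```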
Hi @tomjb94 , good for you, see you next time! Let us know if we can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking, Giuseppe P.S.: Karma Points are appreciated by all the contributors
Where are you seeing that?  In what context? Splunk doesn't process audio.  Please tell us more about the use case and the issue.
Thanks @ITWhisperer , the additional backslash seems to be doing the trick for the rex command, but still no luck getting this to work with the transforms.conf mv_add=true setting. Basically I need these fields to be available at search time, hence I'm trying to figure out a way to do that. Also, when you say extract each group of fields as a whole, what do you mean by that? Can you please help me with an example to better understand that approach?
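As a hedged sketch of what the search-time transform route might look like: the stanza name, sourcetype, and SOURCE_KEY below are hypothetical, and the regex mirrors the rex earlier in the thread with the variable-width lookbehind replaced by a non-capturing prefix. MV_ADD = true is the transforms.conf setting that keeps matches beyond the first as a multivalue field.

```ini
# props.conf (hypothetical sourcetype name)
[my_sourcetype]
REPORT-condition_groups = condition_groups

# transforms.conf
[condition_groups]
SOURCE_KEY = raw_msg
REGEX = (?:\(|]\\;)(?<group>[^:]+:status:[^:]*:pass_condition\[[^\]]*\]:fail_condition\[[^\]]*\]:skip_condition\[[^\]]*)\]
MV_ADD = true
```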
Hi Giuseppe, Many thanks for your response, it's greatly appreciated. I need the rex to be dynamic regardless of the particular timestamp in the original message I sent, since it's going to be a saved search. In addition, when I run this I get 0 results, despite running exactly within the timestamp of that particular message in Splunk. I think this search may be quite expensive on our indexers, so for now I'll just get this working with the existing extracted fields. Thanks again, Tom
There is an inbuilt package available within Splunk ES. You can follow the steps below to configure the Enterprise Security specific indexes on the indexers:

1. On the Enterprise Security menu bar, select Configure > General > General Settings.
2. Scroll to Distributed Configuration Management, and click Download Splunk_TA_ForIndexers.
3. Select the contents for the package. You must select at least one of the following options to download the package:
   - (Optional) Select the check box for Include index time properties to include the props.conf and transforms.conf files in the package.
   - (Optional) Select the check box for Include index definitions to include the indexes.conf file in the package.
4. Click Download the Package to create and download the Splunk_TA_ForIndexers.
5. After the add-on downloads, you can modify the contents of the package. For example, modify indexes.conf to conform with site retention settings and other storage options.
6. Use the cluster master to deploy the Splunk_TA_ForIndexers or add-ons to the cluster peers. See Manage common configurations across all peers and Manage app deployment across all peers in Managing Indexers and Clusters of Indexers.
7. When you install a new add-on to use with Enterprise Security, repeat these steps to create an updated version of Splunk_TA_ForIndexers.

Refer to this link for more details: https://docs.splunk.com/Documentation/ES/7.3.2/Install/InstallTechnologyAdd-ons#Create_the_Splunk_TA_ForIndexers_and_manage_deployment_manually
In general, it is better to use entries with xxxxxx.mystack.splunkcloud.com, where mystack is your stack ID. HEC: https://http-inputs.mystack.splunkcloud.com/
Hi, what does session end reason = aged-out mean? We are facing a one-way audio issue. Could this possibly be the reason? Thanks
I was able to fix my issue. I simply added the "rename" function in my main table search.

Before:

| advhunt cred=all renew=True query="DeviceProcessEvents | where Timestamp > ago(30d) | where FileName has 'file.exe' | project DeviceName, FileName, ProcessCommandLine, FolderPath, AccountName"
| spath input=_raw
| stats count by AccountName,DeviceName
| sort -count

After:

| advhunt cred=all renew=True query="DeviceProcessEvents | where Timestamp > ago(30d) | where FileName has 'file.exe' | project DeviceName, FileName, ProcessCommandLine, FolderPath, AccountName"
| spath input=_raw
| rename AccountName as user
| stats count by user,DeviceName
| sort -count
Hello @rukshar , if you have a self-signed certificate in your local network, then you have to add that CA cert chain to the locations below: 1) /opt/splunk/lib/python3.7/site-packages/certifi and 2) /etc/apps/<APP_FOLDER>/lib/certify. Check if this resolves your problem. This documentation can help you understand the error: https://splunk.my.site.com/customer/s/article/Office-365-Add-on-not-ingesting-any-events-and-throwing-SSL . It is about a Splunk built add-on, but the same solution can be applied in your case as well. If this helps you, please mark this as the answer.
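As a minimal sketch of the change itself, appending an internal CA chain to the end of a certifi-style bundle: the paths here are stand-ins under a throwaway directory (on a real install the bundle would be the cacert.pem under /opt/splunk/lib/python3.7/site-packages/certifi), and the certificate content is a placeholder.

```python
from pathlib import Path
import tempfile

# Stand-in locations for illustration only; point these at the real
# certifi bundle and your internal CA chain on an actual install.
work = Path(tempfile.mkdtemp())
bundle = work / "cacert.pem"
ca_chain = work / "internal_ca_chain.pem"

bundle.write_text("# existing public roots\n")
ca_chain.write_text(
    "-----BEGIN CERTIFICATE-----\n(example CA chain)\n-----END CERTIFICATE-----\n"
)

# The actual fix: append the internal CA chain to the end of the bundle
with bundle.open("a") as f:
    f.write(ca_chain.read_text())

print(bundle.read_text().count("BEGIN CERTIFICATE"))
```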
First things first: what does your raw data look like? The sample you pasted, is it one event or are these multiple events? Where from, and how, are you getting this data? Because it looks as if it were XML horribly butchered by splitting it into single lines and sending each line separately. And that is the first thing that should be fixed, instead of trying to do workarounds at search time.
Try it this way around

| spath output=RAM ResourceInfo.RAM
| rex field=RAM max_match=0 "\"(?<tmp>[^\"]+\":[\d\.]+)"
| mvexpand tmp
| rex field=tmp "(?<component>[^\"]+)\":(?<Value>[\d\.]+)"
| table component Value
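To see what that pipeline produces, here is a rough Python rendering; the event shape (ResourceInfo.RAM holding component-to-number pairs) is guessed from the SPL, not taken from the actual data.

```python
import json

# Invented event matching what the spath/rex pipeline seems to expect
event = '{"ResourceInfo": {"RAM": {"used": 12.5, "free": 3.25, "cached": 1.0}}}'

# "| spath output=RAM ResourceInfo.RAM" pulls out the nested RAM object;
# the rex/mvexpand/rex steps then emit one (component, Value) row per pair.
ram = json.loads(event)["ResourceInfo"]["RAM"]
rows = [(component, value) for component, value in ram.items()]
print(rows)
```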
Hi @tomjb94 , yes, obviously you have to extract the fields using regexes. I can help you with the following regex, which extracts all the values except the orderCode, as I don't know which part of the logs it is in; if you want my help with this, please highlight this value in your logs using bold. Anyway, you can use a search like the following (except orderCode):

index=test
| rex "^\[2024-09-10 07:27:46\.424 \(TID:(?<merchantCode>\d+).*\<subState\>(?<subState>\w+).*\<subCountryCode\>(?<subCountryCode>\d+)"
| search merchantCode=MERCHANTCODE1 subCountryCode=* subState=*
| stats count by merchantCode subCountryCode subState

You can test the regex at https://regex101.com/r/KZMUxp/1

Then it isn't so clear to me whether you also need the other fields (SubState, SubCountryCode, SubCity, PFID, SubName, SubID, SubPostalCode, SubTaxID). If yes, you have to extract all of them; if you want my help, please indicate the part of the log for each of them. Ciao. Giuseppe
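As a quick offline check of the extraction idea, this Python sketch matches two lines of the sample event posted in this thread; note it anchors on the XML tags rather than the literal timestamp, so the same pattern keeps working in a saved search, and only two of the fields are shown.

```python
import re

# Two lines taken from the sample event posted in this thread
event = """[2024-09-10 07:27:46.424 (TID:145767627)] <subState>HK</subState>
[2024-09-10 07:27:46.424 (TID:dad4d2e725854048)] <subCountryCode>344</subCountryCode>"""

# Anchoring on the tags instead of the timestamp keeps the extraction dynamic
pattern = re.compile(
    r"<subState>(?P<subState>\w+)</subState>.*?"
    r"<subCountryCode>(?P<subCountryCode>\d+)</subCountryCode>",
    re.S,
)
m = pattern.search(event)
print(m.group("subState"), m.group("subCountryCode"))
```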
Hi, I am currently working with NGINX Plus as the ingress controller for my Kubernetes cluster and using SC4S to forward logs to Splunk Enterprise. However, I notice that SC4S does not forward all of the logs, including the App Protect WAF and DoS logs. Do the WAF and DoS require a special setup to forward logs? I tried with syslog-ng like this example https://github.com/nginxinc/kubernetes-ingress/blob/v3.6.2/examples/ingress-resources/app-protect-dos/README.md but the logs are not showing in Splunk Enterprise. Thanks.
Hi All - I need help with a fairly complex search I am being asked to build by a user. The ask is that the fields below are extracted from this XML sample:

[2024-09-10 07:27:46.424 (TID:14567876)] <subMerchantData>
[2024-09-10 07:27:46.424 (TID:dad4d2e725854048)] <pfId>499072</pfId>
[2024-09-10 07:27:46.424 (TID:145767627)] <subName>testname</subName>
[2024-09-10 07:27:46.424 (TID:dad4d2e725854048)] <subId>123456</subId>
[2024-09-10 07:27:46.424 (TID:145767627)] <subStreet>1 TEST LANE</subStreet>
[2024-09-10 07:27:46.424 (TID:145767627)] <subCity>HongKong</subCity>
[2024-09-10 07:27:46.424 (TID:145767627)] <subState>HK</subState>
[2024-09-10 07:27:46.424 (TID:dad4d2e725854048)] <subCountryCode>344</subCountryCode>
[2024-09-10 07:27:46.424 (TID:dad4d2e725854048)] <subPostalCode>1556677</subPostalCode>
[2024-09-10 07:27:46.424 (TID:dad4d2e725854048)] <subTaxId>-15566777</subTaxId>
[2024-09-10 07:27:46.424 (TID:14567876)] </subMerchantData>

This search doesn't pull anything back, I believe because these are not extracted fields:

index=test merchantCode=MERCHANTCODE1 subCountryCode=* subState=* orderCode=*
| stats count by merchantCode subCountryCode subState orderCode

In addition to those, I need these fields: SubState, SubCountryCode, SubCity, PFID, SubName, SubID, SubPostalCode, SubTaxID. However, I'm not sure how this can be fulfilled; could anyone support with writing a search that would allow me to extract this info within a stats count? Thanks, Tom
Hi splunkers! I'm facing an issue that is going to make me crazy! I've got to set the timestamp in the following logs (the timestamp field is the 11th field, the first one being the insert time by the proxy itself):

2024-09-16T13:12:54+02:00 Logging-Client  "-1","username","1.2.3.4","POST","872","2211","www.facebook.com","/csp/reporting/","OBSERVED","","1726484997","2024-09-16 11:09:57","https","Social Networking","application/x-empty","","Minimal Risk","Remove 'X-Forwarded-For' Header","200","10.97.5.240","","","Firefox","102.0","Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101 Firefox/102.0","firefox.exe","1.2.3.4","443","US","","t","t","t","f","f","computerName","","1.2.3.4","1.2.3.4","8080"

So, I'm using a regex to extract the fields and set the real timestamp in my props.conf:

[mySourcetype]
SHOULD_LINEMERGE = false
EXTRACT-mySourcetype = ^[^,\n]*,"(?P\w+)","(?P[^"]+)","(?P\w+)","(?P[^"]+)[^,\n]*,"(?P[^"]+)[^,\n]*,"(?P[^"]+)","(?P(?=\s*)|[^"]+)","(?P[^"]+)","(?P(?=\s*)|[^"]+)","(?P[^"]+)","(?P[^"]+)","(?P[^"]+)","(?P[^"]+)","(?P(?=\s*)|[^"]+)","(?P(?=\s*)|[^"]+)","(?P[^"]+)","(?P(?=\s*)|[^"]+)","(?P[^"]+)","(?P[^"]+)","(?P(?=\s*)|[^"]+)","(?P(?=\s*)|[^"]+)","(?P[^"]+)","(?P(?=\s*)|[^"]+)","(?P(?=\s*)|[^"]+)","(?P[^"]+)","(?P[^"]+)","(?P[^"]+)","(?P[^"]+)","(?P(?=\s*)|[^"]+)","(?P[^"]+)","(?P[^"]+)","(?P[^"]+)","(?P[^"]+)","(?P[^"]+)","(?P[^"]+)","(?P(?=\s*)|[^"]+)","(?P[^"]+)","(?P[^"]+)","(?P[^"]+)"$
TIME_PREFIX = (?:[^,]+,){11}
TIME_FORMAT = %Y-%m-%d %H:%M:%S

Then, I've got different results based on the source:

- File uploaded directly on the search head: extraction OK, timestamp OK
- File read from a universal forwarder: extraction OK, timestamp failed

There is NO heavy forwarder between the UF and the indexers. The props.conf is deployed only on the search heads. So, something is tricky here! If someone has an idea, I will appreciate it! Cheers.
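The TIME_PREFIX/TIME_FORMAT pair can be checked outside Splunk. This Python sketch replays the skip-eleven-comma-delimited-fields logic against the sample line (abridged to its first twelve fields); the opening quote is consumed explicitly because Python's strptime, unlike Splunk's timestamp processor, is fed an exact slice.

```python
import re
from datetime import datetime

# First twelve comma-delimited fields of the sample line from this post
line = ('2024-09-16T13:12:54+02:00 Logging-Client  "-1","username","1.2.3.4",'
        '"POST","872","2211","www.facebook.com","/csp/reporting/","OBSERVED",'
        '"","1726484997","2024-09-16 11:09:57","https"')

# TIME_PREFIX = (?:[^,]+,){11} skips eleven comma-delimited fields; the
# extra quote here positions us on the first digit of the timestamp
m = re.match(r'(?:[^,]+,){11}"', line)
ts = datetime.strptime(line[m.end():m.end() + 19], "%Y-%m-%d %H:%M:%S")
print(ts)
```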
Hi @Ram2 ... May I know what happens when you try these props:

SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)
TIME_PREFIX=^
Table SPL:

| advhunt cred=all renew=True query="DeviceProcessEvents | where Timestamp > ago(30d) | where FileName has 'file.exe' | project DeviceName, FileName, ProcessCommandLine, FolderPath, AccountName"
| spath input=_raw
| stats count by AccountName,DeviceName
| sort -count

Source code of the panel:

{
  "type": "splunk.table",
  "options": {
    "count": 100,
    "dataOverlayMode": "none",
    "drilldown": "none",
    "showRowNumbers": false,
    "showInternalFields": false
  },
  "dataSources": {
    "primary": "ds_xxxxx"
  },
  "title": "File.exe (Last 30 Days)",
  "eventHandlers": [
    {
      "type": "drilldown.linkToSearch",
      "options": {
        "query": "| inputlookup lookuptable where field1=$row.user.value$\n| table field1, field2",
        "earliest": "auto",
        "latest": "auto",
        "type": "custom",
        "newTab": true
      }
    }
  ],
  "context": {},
  "showProgressBar": false,
  "showLastUpdated": false
}

SPL for the search on click:

| inputlookup lookuptable where field1=$row.user.value$
| table field1, field2