All Posts


When you send a valid JSON object, you can query any data it contains. You can use those keys in your searches, or just search on any of the words the event contains.
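A minimal sketch of querying such an event, assuming the JSON payload sits in a field called Message and that the index and sourcetype names below are placeholders:

index=main sourcetype=mendix:cloud
| spath input=Message
| table level env Module Microflow http_reasonphrase session_id

If the payload is not valid JSON (for example, missing quotes or commas as in the sample posted in this thread), spath will not extract anything and you would need rex or a fix on the sending side.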
Hi, there are a couple of versions you can use: https://splunkbase.splunk.com/app/4355 and https://github.com/paychex/Splunk.Conf19. Or just add a git client and do a regular git add + git commit + git push from a script. This needs more manual work when you need to restore those configurations into a SHC. r. Ismo
Have you created that table with stats ... values()? You should try list() instead of values(); then the lines keep their order and the counts also match. After that you can use the mvzip trick mentioned above to split them into the correct rows.
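A minimal sketch of that list() + mvzip combination, with placeholder field names (order_id, line, amount):

| stats list(line) AS line list(amount) AS amount BY order_id
| eval zipped=mvzip(line, amount, "##")
| mvexpand zipped
| eval line=mvindex(split(zipped, "##"), 0), amount=mvindex(split(zipped, "##"), 1)
| fields - zipped

list() preserves order and duplicates, so the two multivalue fields stay aligned before they are zipped and expanded.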
Sample 1: I sent the logs from Mendix to Splunk, but all the messages are saved within message.

{ level: ERROR env: test Message: {"Module": SplunkTest""Microflow": ACT_Splunk_Create_Test""latesterror_message": "401: Access Denied at SplunkTest.ACT_Omnext_Create_TEST (CallRest : 'Call REST (POST)') Advanced stacktrace:"http_status": "401"http_response_content": "{ "statusCode": 401, "message": "Access denied due to invalid subscription key. Make sure to provide a valid key for an active subscription." }"http_reasonphrase": "Access Denied"session_id": "13314141414141212}

But I would like to extract some data from the message, as below:

{ level: ERROR env: test Module: SplunkTest Microflow: ACT_Splunk_Create_Test http_reasonphrase: Access Denied session_id: 13314141414141212 }

My question is: can this message be adjusted the way I want within Splunk, or do I need to find a way to send the data from Mendix in a structured way?
Hi

You should always have a separate user for running the UF on any box. What this user name should be, and whether it is local or centrally managed, depends on your company's policies. In any case it should be something other than root! Earlier that user was splunk, as in Enterprise. At some point it changed to splunkfwd. I'm not sure if it's currently splunk again or still splunkfwd.

If/when you use your OS's package manager to install the Splunk UF, it creates that user and you usually don't need to take care of it. But when you use the tar.gz package and install it manually or with scripts, you must create that OS-level user yourself. The most important task is to check that this user owns all files under SPLUNK_HOME and that the correct OS user name is used in the boot-start settings! Basically this user name can be whatever you want, but if/when you use something other than the defaults, you must run chown -R every time after you have updated the UF version!

With earlier Splunk versions you had to grant this user access to your monitored log files. Currently this is not needed if/when you are using systemd start scripts. Whether this change is good or not is another story. You can read more here:

https://splunk.my.site.com/customer/s/article/Universal-Forwarder-is-able-to-ingest-files-that-it-does-not-have-permission-to-read
https://community.splunk.com/t5/Installation/Security-issue-Splunk-UF-v9-x-is-re-adding-readall-capability/td-p/649047
https://help.splunk.com/en/splunk-enterprise/forward-and-process-data/universal-forwarder-manual/9.3/working-with-the-universal-forwarder/manage-a-linux-least-privileged-user

r. Ismo
Hi, this is not your university's support site. You must contact their support email or chat directly and ask them to check and fix your access issue. r. Ismo
Have you tried dedup with sortby? And of course you should use bin with a new column, like:

index=main
| bin _time as time span=1month
| dedup time sortby _time
| table bill_date ID Cost _time

That way it should take only one event per month. By modifying the sort order you get either the first or the last event in each month.
Hi @sverdhan,

did you try using the lookup command (https://help.splunk.com/en/splunk-enterprise/search/spl-search-reference/9.4/search-commands/lookup) instead of inputlookup in your search? The lookup command works like a left join.

| tstats count WHERE index=* sourcetype=A4Server by index
| rex field=index max_match=0 "(?<clients>\w+)(?<sensitivity>_private|_public)"
| fields - count
| lookup appserverdomainmapping.csv client OUTPUT NewIndex, Domain, Sourcetype
| eval NewIndex= NewIndex.sensitivity
| table clients, sensitivity, Domain, Sourcetype, NewIndex

Ciao.
Giuseppe
Thanks, but that is still not working. It's only grabbing the very first ID. The data has many IDs for one bill_date, across multiple event times/_time.
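A sketch of one possible adjustment, assuming you want one event per ID per month (the field names ID, bill_date and Cost are taken from this thread):

index=main
| bin _time as month span=1month
| dedup month ID sortby _time
| table bill_date ID Cost month

Adding ID to the dedup field list keeps one event per ID in each monthly bucket instead of one event per bucket overall.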
Hi @avikc100,

please, next time, post the search in text mode (using the Insert/Edit Code Sample button) so you can mask the sensitive data and we can reuse it.

First of all, don't use search or where after the main search; put all the conditions as far left as possible, ideally in the main search:

index="webmethods_prd" source="/apps/WebMethods*/IntegrationServer/instances/default/logs/MISC.log" MISC_dynamicPrice mainframePrice!=discountPrice
| stats count BY mainframePrice discountPrice accountNumber itemId

Otherwise you could add the dc function to identify the different values:

index="webmethods_prd" source="/apps/WebMethods*/IntegrationServer/instances/default/logs/MISC.log" MISC_dynamicPrice
| stats dc(mainframePrice) AS mainframePrice_count dc(discountPrice) AS discountPrice_count first(mainframePrice) AS first_mainframePrice first(discountPrice) AS first_discountPrice last(mainframePrice) AS last_mainframePrice last(discountPrice) AS last_discountPrice BY accountNumber itemId
| where mainframePrice_count>1 OR discountPrice_count>1
| fields - *_count

Ciao.
Giuseppe
Hello, I work in the education field and have been using Splunk Enterprise since May 18, 2025. Yesterday, 16 Jun 2025, I ran into a login problem. I uploaded version 9.4.3 once again and tried to log in, but got the same result. According to the message in the main black box, an admin should be the person who can solve this issue. I need service support from an admin or a responsible POC. For the technical team's information, I am using a VPN, in case that affects logging in. My university email is oguz.unal@ogr.yesevi.edu.tr, which I used while signing up. My alternative email is  Kind regards, Ogz
A4ServerBeta is the first value, so no matter which sourcetype I choose, it only gives the values of A4ServerBeta in Sourcetype, NewIndex and Domain.
Hello team,

please help me modify this query so that it loops through all the values of the CSV file. Although it gives the clients and sensitivity of the selected sourcetype, in the result fields Sourcetype, Domain and NewIndex it only gives the values of the first sourcetype, A4Server. For example, here the selected sourcetype is A4Server, but in the sourcetype column it shows A4ServerBeta, because it is not looping through the entire CSV but only using the first value.

| tstats count WHERE index=* sourcetype=A4Server by index
| rex field=index max_match=0 "(?<clients>\w+)(?<sensitivity>_private|_public)"
| table index, clients, sensitivity
| join type=left client [
    | inputlookup appserverdomainmapping.csv
    | table NewIndex, Domain, Sourcetype ]
| eval NewIndex= NewIndex + sensitivity
| table clients, sensitivity, Domain, Sourcetype, NewIndex
Got the solution:

index="webmethods_prd" source="/apps/WebMethods*/IntegrationServer/instances/default/logs/MISC.log" MISC_dynamicPrice
| where mainframePrice!=discountPrice
| stats count by mainframePrice, discountPrice, accountNumber, itemId
This is my log. I need a report like the one below, where I can see the price differences in a single report. I don't want to include the records that have the same mainframePrice and discountPrice; I only want the records where mainframePrice and discountPrice are different. Here I manually entered the individual values to get the report.
The splunkfwd user is created by default since version 9.1, and I am seeing the warning "User splunkfwd does not exist - using root" during the upgrade. The upgrade guide does not say that creating the splunkfwd user is mandatory for Universal Forwarder installations or upgrades. From "Upgrade the universal forwarder | Splunk Docs": "When you upgrade, the RPM/DEB package installer retrieves the file owner of SPLUNK_HOME/etc/myinstall/splunkd.xml. If a previous user exists, the RPM/DEB package installer will not create a splunkfwd user and instead will reuse the existing user. If you wish to create a least privileged user, that is, the splunkfwd user, you must remove the existing user first." The warning about the missing splunkfwd user appears during the upgrade, but there are no permission issues and the forwarder is functioning properly with the "splunk" user. I would appreciate your guidance on whether it is mandatory to create the splunkfwd user for Universal Forwarder 9.4.0 or higher. Note: in this topic Splunk Enterprise and the Splunk UF are not installed on the same machine.
One more question: since I am new to this platform, I am wondering how I can search for a certain error, warning, or info message, such as how to search for "404 - file not found for file".
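A minimal sketch of such a search, assuming the events are already indexed (the index name below is a placeholder):

index=main "404 - file not found for file"
| table _time host source _raw

Quoting the whole string searches it as a phrase; you can also combine individual terms, for example index=main (ERROR OR WARN).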
Any update on this topic? I am facing the same issue.
@Meett Thanks for responding. I have created an add-on with Add-on Builder called TA-splunk-webhook-alerts and attached it to an alert, so whenever that alert is triggered, the add-on's alert action runs. The add-on contains a Python script which calls an API to push the alert data. The picture above shows part of the Python script. As you can see, there are some log statements in it, like helper.log_info("username={}".format(username)). My question is: when this script is executed, where can I find these logs? I have not done any specific logging configuration; helper.log_info is the default one. FYI: I developed this add-on with Add-on Builder on Splunk Enterprise version 9 and installed it in Splunk Cloud. In Splunk Enterprise I am able to find the location of the logs ($SPLUNK_HOME/var/log/splunk), but not in Splunk Cloud. Please assist in finding the logs in Splunk Cloud.
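A sketch that may help locate them in Splunk Cloud, assuming the helper's log file under $SPLUNK_HOME/var/log/splunk gets indexed into _internal (the source patterns below are assumptions based on the usual <alert_action>_modalert.log naming, not confirmed for this app):

index=_internal (source=*_modalert.log* OR source=*ta_splunk_webhook_alerts*)
| table _time source _raw

If nothing comes back, check whether your Splunk Cloud role has search access to the _internal index.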
I tried the above but it is not showing anything.