All Posts
Hello all, consider an application X that requested onboarding onto Splunk. We created an index for application X, a new role (restricted to the X index), and assigned this role to the X AD group. Likewise we have applications Y and Z coming soon, handled in the same manner. But now the requirement is that the X, Y, and Z applications come under a parent 'A' application, and they want all 'A' team members (probably the X, Y, and Z teams combined) to be able to view the X, Y, and Z applications. How can we achieve this? We can't create a single index for all of X, Y, and Z because the logs should not be mixed.
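One common way to handle this, sketched below as a minimal authorize.conf under hypothetical role and index names: keep the per-application indexes and roles as they are, and add one 'A' role that either inherits the three existing roles or lists their indexes directly, then map that role to the 'A' AD group.

[role_app_a]
# option 1: inherit the existing per-application roles (hypothetical names)
importRoles = role_app_x;role_app_y;role_app_z
# option 2: grant the three indexes directly instead (hypothetical names)
srchIndexesAllowed = idx_app_x;idx_app_y;idx_app_z

Either way the logs stay in separate indexes; only the search authorization is combined.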
We are migrating a Splunk 9.0.3 Search Head from a virtual box to a physical box. Splunk services were up and running on the new physical box, but in Splunk Web UI I was unable to log in using my authorized credentials, and I found the below error in splunkd.log:

01-21-2025 05:18:05.218 -0500 ERROR ExecProcessor [3275615 ExecProcessor] - message from "/apps/splunk/splunk/etc/apps/splunk_app_db_connect/bin/server.sh" action=task_server_start_failed error=com.splunk.HttpException: HTTP 503 -- KV Store initialization failed. Please contact your system administrator
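A minimal first diagnostic sketch, reusing the install path from the error above: check the KV Store state from the CLI and read the KV Store's own log. After a host migration, a changed hostname or a stale server certificate is often the underlying cause.

# check KV Store status (ready / starting / failed)
/apps/splunk/splunk/bin/splunk show kvstore-status
# look for the root cause in the KV Store (mongod) log
tail -n 100 /apps/splunk/splunk/var/log/splunk/mongod.log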
If it's wrong ... then how does it work in search? My results are correct until I use my search in a dashboard.
Hi @woodman2 , as I said, a dollar character at the borders of a word is the way Splunk identifies tokens, so you cannot use this format for your fields. You must modify your searches and your data structure. Ciao. Giuseppe
Door and DoorName are field names that exist in my search results; they have values, and my search works fine with them unless I use them in a dashboard, because it treats them as tokens instead of taking their values from my results.
Hi @woodman2 , at first, don't use the search command after the main search, because it makes your search slower. Then, what are $Door$ and $DoorName$? In Splunk they are tokens defined in a dashboard. If you have a variable or a field with this name, it cannot run in a search. Ciao. Giuseppe
I have such a search and it works fine, but not in a dashboard!

index=unis
| search *sarch*
| eval name = coalesce(C_Name, PersonName)
| eval "DoorName"=if(sourcetype=="ARX:db", $Door$, $DoorName$)

When I use this in a dashboard it looks for Door and DoorName as tokens, while they are values of those fields. What should I do to make it work in Dashboard Studio? The error I get is: Set token value to render visualization $Door$ $DoorName$. Edit: if I remove all the $ characters it still works the same in search, but it still does not work in the dashboard (without any error); it returns results, but the DoorName field is empty.
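For reference, the $...$ wrapping is only meaningful to dashboards, where it marks tokens; in eval, fields are referenced bare, or in single quotes when the name needs quoting. A sketch of the same search without token syntax (the *sarch* term is kept from the original post):

index=unis *sarch*
| eval name = coalesce(C_Name, PersonName)
| eval DoorName = if(sourcetype=="ARX:db", 'Door', 'DoorName')

If a literal dollar sign must appear inside a dashboard search string, it can be escaped as $$ so the dashboard does not interpret it as a token.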
Yes. The result is the same.    
Are you sending the logs directly to Splunk Cloud or through an intermediate forwarder? An app with props.conf and transforms.conf uploaded to Splunk Cloud is run on the Search Head. In my case I had to install the app on the intermediate forwarder that sends on-prem logs to Splunk Cloud; then it worked as it had done before migrating to the cloud.
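To illustrate why placement matters: index-time settings in props.conf and transforms.conf take effect on the first parsing instance in the pipeline, which is why the app had to sit on the intermediate forwarder rather than in the cloud stack. A minimal sketch with hypothetical stanza names, masking part of an event at index time:

# props.conf
[my:sourcetype]
TRANSFORMS-mask = mask_account_number

# transforms.conf
[mask_account_number]
REGEX = (account=)\d+
FORMAT = $1xxxxxxx
DEST_KEY = _raw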
@annielee have you tried using the relevant .whl file for the 2 libs and adding it to the app as a wheel dependency?  --- Hope this helped? Happy SOARing! ---
The original issue was a self-signed cert, which we have now navigated, but you are correct that it is CORS now. We are just figuring out where we need to make the change and ensuring we update it with the correct URL, and then we will see what happens next... Thank you for the input. Regards, Ben
So I want to search and open a lookup file in Splunk using Python. I am able to get to the lookup editing app using Python.
This is too vague - please explain what you are trying to achieve, what you have tried and what results you have so far
| makeresults
| eval start_date = "2024-01-01", end_date = "2024-01-07"
| eval start_timestamp = strptime(start_date, "%Y-%m-%d")
| eval end_timestamp = strptime(end_date, "%Y-%m-%d")
| eval num_days = round((end_timestamp - start_timestamp) / 86400)
| eval all_dates = start_date
| eval range = mvrange(1, num_days + 1)
| foreach mode=multivalue range
    [| eval all_dates=mvappend(all_dates, strftime(relative_time(start_timestamp, "+".<<ITEM>>."d"), "%Y-%m-%d"))]
| fields all_dates
I want to access the lookup editing app using Python; how can I do that?
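If the underlying goal is to read a lookup's contents programmatically, one hedged option is to skip the lookup editing app's UI entirely and run an inputlookup search through the Splunk SDK for Python; host, credentials, and the lookup file name below are placeholders:

import splunklib.client as client
import splunklib.results as results

# connect to splunkd's management port (placeholder credentials)
service = client.connect(host="localhost", port=8089,
                         username="admin", password="changeme")

# read the lookup by running an inputlookup search (placeholder file name)
stream = service.jobs.oneshot("| inputlookup my_lookup.csv",
                              output_mode="json")
for row in results.JSONResultsReader(stream):
    if isinstance(row, dict):  # skip diagnostic messages
        print(row)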
No, the second time I click the button it does not log "CLICKED" or anything at all. It is as if I'm not clicking the button, and it doesn't print errors or warnings.
My requirement is that my start time is January 1, 2024 and my end time is January 7, 2024. In addition to placing the start and end times in a multivalue field, please also include each date in this interval: January 2, 2024, January 3, 2024, January 4, 2024, January 5, 2024, and January 6, 2024. The final field content should be January 1, 2024, January 2, 2024, January 3, 2024, January 4, 2024, January 5, 2024, January 6, 2024, and January 7, 2024. The SPL statement is as follows:

| makeresults
| eval start_date = "2024-01-01", end_date = "2024-01-07"
| eval start_timestamp = strptime(start_date, "%Y-%m-%d")
| eval end_timestamp = strptime(end_date, "%Y-%m-%d")
| eval num_days = round((end_timestamp - start_timestamp) / 86400)
| eval range = mvrange(1, num_days)
| eval intermediate_dates = strftime(relative_time(start_timestamp, "+".tostring(range)."days"), "%Y-%m-%d")
| eval all_dates = mvappend(start_date, intermediate_dates)
| eval all_dates = mvappend(all_dates, end_date)
| fields all_dates
Hi, there is no official API for that. However, you can just open the developer options in your browser and look at the flowmap response JSON for the data, then do with that what you want. What is your use case or end goal in getting this info?
Hi @Lockie , good for you, see you next time! Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated by all the contributors.
Hi, I am now adding a new action "ingest excel" to the existing SOAR app CSV Import. Two dependencies need to be installed for this action: pandas and openpyxl. However, after adding the dependencies in the App Wizard, it still shows me the output ModuleNotFoundError: No module named 'pandas'. I found that in the app JSON my dependencies are only added to "pip_dependencies", but not to "pip39_dependencies". Is that the reason why the dependency is not installed? Please advise. Thank you.
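That is plausible: on a Python 3.9 SOAR install, the wheels listed under pip39_dependencies are the ones used, so pandas and openpyxl would likely need to appear there as well. A sketch of how that block might look; the wheel file names are illustrative only:

"pip39_dependencies": {
    "wheel": [
        {
            "module": "pandas",
            "input_file": "wheels/py39/pandas-2.0.3-cp39-cp39-manylinux_2_17_x86_64.whl"
        },
        {
            "module": "openpyxl",
            "input_file": "wheels/shared/openpyxl-3.1.2-py2.py3-none-any.whl"
        }
    ]
}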