All Posts


The original issue was a self-signed cert, which we have now navigated, but you are correct that it is CORS now. Just figuring out where we need to make the change and ensuring we update with the correct URL, and then we will see what happens next... Thank you for the input. Regards, Ben
So I want to search and open a lookup file in Splunk using Python. I am able to get to the Lookup Editing app using Python.
This is too vague - please explain what you are trying to achieve, what you have tried and what results you have so far
| makeresults
| eval start_date = "2024-01-01", end_date = "2024-01-07"
| eval start_timestamp = strptime(start_date, "%Y-%m-%d")
| eval end_timestamp = strptime(end_date, "%Y-%m-%d")
| eval num_days = round((end_timestamp - start_timestamp) / 86400)
| eval all_dates = start_date
| eval range = mvrange(1, num_days + 1)
| foreach mode=multivalue range
    [| eval all_dates = mvappend(all_dates, strftime(relative_time(start_timestamp, "+".<<ITEM>>."d"), "%Y-%m-%d"))]
| fields all_dates
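For comparison, the same logic — enumerating every date from a start date to an end date inclusive — can be sketched outside Splunk in plain Python (a minimal sketch, not Splunk-specific):

```python
from datetime import date, timedelta

def date_range(start: str, end: str) -> list[str]:
    """Return every date from start to end inclusive, as YYYY-MM-DD strings."""
    start_d = date.fromisoformat(start)
    end_d = date.fromisoformat(end)
    days = (end_d - start_d).days
    return [(start_d + timedelta(days=i)).isoformat() for i in range(days + 1)]

print(date_range("2024-01-01", "2024-01-07"))
```

This mirrors what the `mvrange`/`foreach` pair does in the SPL above: build an index range, then append one formatted date per step.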
I want to access the Lookup Editing app using Python. How can I do that?
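One possible approach, sketched under the assumption that the Splunk management port (8089 by default) is reachable and you have a valid token: rather than driving the Lookup Editor UI, read the lookup's contents through the REST search export endpoint with an `| inputlookup` search. Host, token, and lookup name below are placeholders you would replace.

```python
import csv
import io
import urllib.parse
import urllib.request

# Placeholders -- substitute your own management URL and auth token.
SPLUNK_BASE = "https://localhost:8089"
AUTH_HEADER = {"Authorization": "Bearer <your-token>"}

def export_lookup_search(lookup_name: str) -> str:
    """Build the SPL that dumps a lookup file's contents."""
    return "| inputlookup " + lookup_name

def fetch_lookup_csv(lookup_name: str) -> str:
    """Run the export search over REST and return the raw CSV response."""
    body = urllib.parse.urlencode({
        "search": export_lookup_search(lookup_name),
        "output_mode": "csv",
    }).encode()
    req = urllib.request.Request(
        SPLUNK_BASE + "/services/search/jobs/export",
        data=body,
        headers=AUTH_HEADER,
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

def parse_csv(text: str) -> list[dict]:
    """Turn the exported CSV into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

# Usage (requires a reachable Splunk instance):
# rows = parse_csv(fetch_lookup_csv("my_lookup.csv"))
```

For editing (not just reading), the Splunk SDK for Python or the Lookup Editor app's own REST handlers would be the next things to investigate.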
No, the second time I click the button it does not log "CLICKED" or anything at all. It is as if I'm not clicking the button, and it doesn't print any errors or warnings.
My requirement is that my start time is January 1, 2024 and my end time is January 7, 2024. In addition to placing the start and end times in a multivalue field, please also include each date in this time interval: January 2, 2024, January 3, 2024, January 4, 2024, January 5, 2024, and January 6, 2024. The final field content should be January 1, 2024, January 2, 2024, January 3, 2024, January 4, 2024, January 5, 2024, January 6, 2024, and January 7, 2024. The SPL statement is as follows:

| makeresults
| eval start_date = "2024-01-01", end_date = "2024-01-07"
| eval start_timestamp = strptime(start_date, "%Y-%m-%d")
| eval end_timestamp = strptime(end_date, "%Y-%m-%d")
| eval num_days = round((end_timestamp - start_timestamp) / 86400)
| eval range = mvrange(1, num_days)
| eval intermediate_dates = strftime(relative_time(start_timestamp, "+".tostring(range)."days"), "%Y-%m-%d")
| eval all_dates = mvappend(start_date, intermediate_dates)
| eval all_dates = mvappend(all_dates, end_date)
| fields all_dates
Hi, there is no official API for that. However, you can just open the developer tools in your browser, look at the flowmap response JSON for the data, and then do with it what you want. What is your use case or end goal in getting this info?
Hi @Lockie, good for you, see you next time! Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated by all the contributors.
Hi, I am now adding a new action "ingest excel" to the existing SOAR app CSV Import. Two dependencies are required for this action: pandas and openpyxl. However, after adding the dependencies in the App Wizard, it still shows me the error: ModuleNotFoundError: No module named 'pandas'. I found that in the app JSON, my dependencies are only added to "pip_dependencies", but not "pip39_dependencies". Is that the reason why the dependency is not installed? Please advise. Thank you.
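Following the poster's own observation, an app targeting a Python 3.9 runtime would need the packages listed under "pip39_dependencies" too. A hypothetical fragment of the app JSON is sketched below — the exact structure inside each key should match whatever your existing "pip_dependencies" entry already looks like:

```json
{
    "pip_dependencies": {
        "pypi": [
            {"module": "pandas"},
            {"module": "openpyxl"}
        ]
    },
    "pip39_dependencies": {
        "pypi": [
            {"module": "pandas"},
            {"module": "openpyxl"}
        ]
    }
}
```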
@bowesmana Sorry for the delay in response, I was on vacation. Thanks for sharing the cluster command. I tried it, but it is not giving me the required result, or I am not using it correctly. I shared only one part of the requirement. Actually, the requirement is to compare two days' logs (today's and yesterday's) coming from different apps and trigger an alert whenever there is a new error. There is no specific error pattern or field to identify errors; we need to look for the keywords "Error/Fail/Timeout" in the logs. I am trying to identify similar phrases in the error logs, store the unique error text in a lookup file, and then match it against the next day's data to identify new error logs. Query:

index="a" OR index="b" (ERROR OR TIMEOUT OR FAIL OR EXCEPTION)
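The compare-against-yesterday step described above is essentially a set difference: anything in today's error texts that is absent from the stored lookup is "new". A minimal Python sketch of that logic (the error strings are made up for illustration; in Splunk this would typically be an `inputlookup` plus a NOT-match):

```python
def new_errors(todays_errors, known_errors):
    """Return the error texts seen today that are not in the known-error set."""
    return sorted(set(todays_errors) - set(known_errors))

# Illustrative data standing in for the lookup file and today's search results.
known = ["Timeout connecting to db", "Error: disk full"]
today = ["Timeout connecting to db", "Fail: auth rejected"]
print(new_errors(today, known))  # only the previously unseen error remains
```

Note that exact set difference only works if the error text is normalized first (timestamps, IDs, and hostnames stripped); otherwise every occurrence looks "new", which is the problem `cluster` was suggested for.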
@oO0NeoN0Oo I think the JavaScript Fetch API is not directly available in Splunk dashboard JS. But you can try this JS code, which is working fine for me:

var settings = {
    "url": "https://localhost:8088/services/collector/event",
    "method": "POST",
    "timeout": 0,
    "headers": {
        "Authorization": "Splunk hec_token",
        "Content-Type": "application/json"
    },
    "data": JSON.stringify({
        "sourcetype": "my_sample_data",
        "event": "this is my data!"
    })
};

$.ajax(settings).done(function (response) {
    console.log(response);
});

As you are sending events from the browser, you should go through the link below as well: https://www.splunk.com/en_us/blog/tips-and-tricks/http-event-collector-and-sending-from-the-browser.html I hope this will help you. Thanks, KV. An upvote would be appreciated if any of my replies help you solve the problem or gain knowledge.
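If the browser-side route keeps hitting CORS or certificate issues, the same HEC POST can be made server-side instead, where CORS does not apply. A minimal Python sketch using only the standard library — the URL and token are placeholders, mirroring the JS example above:

```python
import json
import urllib.request

# Placeholders -- substitute your own HEC endpoint and token.
HEC_URL = "https://localhost:8088/services/collector/event"
HEC_TOKEN = "hec_token"

def build_payload(event, sourcetype="my_sample_data"):
    """Shape the event the way the collector event endpoint expects it."""
    return {"sourcetype": sourcetype, "event": event}

def send_event(event, sourcetype="my_sample_data"):
    """POST one event to HEC from server-side code (no CORS involved)."""
    req = urllib.request.Request(
        HEC_URL,
        data=json.dumps(build_payload(event, sourcetype)).encode(),
        headers={
            "Authorization": "Splunk " + HEC_TOKEN,
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Usage (requires a reachable HEC endpoint):
# send_event("this is my data!")
```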
Hello, I have started my journey into more admin activities. I was attempting to add a URL (comment) under "Next Steps" in a notable event, but it is grayed out. I have given my user all the related privileges, so this doesn't seem to be the issue. I also tried to edit this by going through Configure > Content Management and attempting to edit the search (alert) from there, but the notable action is grayed out without the option to edit, showing only the comment "this alert action does not require any user configuration". I realize it is easier to edit that part for correlation searches, but I am attempting to edit alerts, not correlation searches.
Renaming authorize.conf to authorize.conf.old in system/local helped in my case.
Hi @N_K, in a nutshell, I would use an SSH action to create a temp unique folder locally on SOAR, then use the SSH action "put file" to read your files from the vault and write them to this folder one by one. When all files are in the folder, run an SSH command to archive them, and finally upload the archive to Jira directly, or send it to the vault and then on to Jira. After confirming that the Jira action completed, you can remove the temp unique folder, which removes the local files to save space. You can also remove the files from the vault at this point. Have you tried this logic?
Do you get the CLICKED message in the console log, or any other message? I assume you've looked at the dashboard examples for setting tokens on buttons, as your code is similar. If you add logging to the base code, does anything get logged at all?
Hi @silverKi, The maxDataSize for your hot buckets is 1 MB. Your friend's setting appears to be higher (5 MB). To add to what's already been written, you're writing (compressed) data at different rates: Friend: ~720 bytes per second You: ~19 bytes per second This will influence the size of the warm bucket after it rolls from hot when either maxDataSize (1 MB in your case) or the default maxHotSpanSecs value of 90 days has been exceeded. Hot buckets can also roll to warm when Splunk is restarted or when triggered manually. That probably isn't happening here, but it's worth noting.
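The write rates above can be turned into a rough time-to-fill estimate for a 1 MB hot bucket — a back-of-the-envelope sketch, taking 1 MB as 1,048,576 bytes and assuming a steady write rate:

```python
def hours_to_fill(bucket_bytes: int, bytes_per_sec: float) -> float:
    """Rough hours until a hot bucket reaches maxDataSize at a steady write rate."""
    return bucket_bytes / bytes_per_sec / 3600

MB = 1024 * 1024
print(f"friend (720 B/s): {hours_to_fill(MB, 720):.1f} h")  # ~0.4 h
print(f"you     (19 B/s): {hours_to_fill(MB, 19):.1f} h")   # ~15.3 h
```

So at 19 B/s the bucket rolls on maxDataSize after roughly 15 hours of continuous writing, while at 720 B/s it rolls in under half an hour — which is why the two setups produce such different warm-bucket sizes and counts.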
I suppose SHOULD_LINEMERGE=true is that way for some historical, currently unknown reason, and nobody has been brave enough to change its default to false.
My shot in the dark would be that you're trying to use fetch() to push events to a server A from a webpage coming from server B. And you're hitting CORS problems.
As you can check at https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Commandsbytype, this command also moves processing from the indexers to the SH side. And as @PickleRick said, this command uses a lot of memory too.