Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

All Topics

Hello, I have a saved search that pushes data to a summary index. The summary index holds the last two years of data and the data volume is really huge. Suppose I want to add a new field to this data in the summary index; I would need to re-run the search over the last two years. Since the volume is huge, if I try to run the search for all two years of data in one go, the search fails or data gets missed. To avoid this, I push the data in 10-day or 30-day batches. For example, if I have to repopulate my summary index after adding a new field, for the first batch I run the search for data from 1st Aug 2023 to 10th Aug 2023, the next batch from 11th Aug to 20th Aug, and the same has to be done until the past two years of data have been pushed into the summary index. This task is very cumbersome. Is there a way to automate it in Splunk? Can I schedule my search in such a way that, while repushing data, it gets pushed into the summary index in batches without manual intervention?
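One relevant tool here: Splunk ships a backfill helper script, $SPLUNK_HOME/bin/fill_summary_index.py, which re-runs a scheduled saved search once per scheduled interval across a past time range and can skip windows that already contain summary data. A minimal sketch, assuming the saved search is named "my_summary_search" in the search app and the credentials are placeholders:

splunk cmd python fill_summary_index.py -app search -name "my_summary_search" -et -2y@d -lt @d -j 2 -dedup true -auth admin:changeme

Here -et/-lt bound the backfill range, -j caps the number of concurrent searches, and -dedup true avoids re-summarizing intervals that already have data.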
Hi community, I wanted to know whether it is possible, when you have Splunk running locally, to add new machines ("instances") and monitor them alongside the local one. That's all. If it is possible, I would appreciate a link to the procedure to follow to complete this task. Regards and thanks.
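If it helps frame the question: the usual pattern is to install a Universal Forwarder on each new machine and point it at the local Splunk instance. A minimal sketch, assuming the local instance receives on the default port 9997 and the hostname and monitored path are placeholders:

# On the local Splunk instance, open a receiving port:
$SPLUNK_HOME/bin/splunk enable listen 9997

# On each new machine, after installing the Universal Forwarder:
$SPLUNK_HOME/bin/splunk add forward-server my-splunk-host:9997
$SPLUNK_HOME/bin/splunk add monitor /var/log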
I have a lookup file. The lookup has "host", "count", "first_event" and "last_event" fields. I want to run a search hourly that will update all the fields with fresh values and, in the event that a "host" is not found in the search, send an alert. Any guidance would be appreciated.
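A minimal sketch of the refresh half, assuming the events live in index=main and the lookup file is named host_status.csv (both placeholders):

index=main earliest=-1h
| stats count min(_time) as first_event max(_time) as last_event by host
| outputlookup host_status.csv

For the alerting half, one option is a second scheduled search that lists lookup hosts absent from recent events and alerts when the result count is greater than zero:

| inputlookup host_status.csv
| search NOT [ search index=main earliest=-1h | stats count by host | fields host ]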
I have regular traffic passing through my server. The server has the IP 10.41.6.222. My goal is to extract the Rate/sec passing through the server and to be able to see the Rate/sec in a graph, with the x axis showing time and the y axis showing Rate/sec (the extracted values).

-----

Rate 0/sec : Bytes 9815772 : from owa client to vs_owa with address 10.41.6.166:443:10.41.6.222
Rate 402/sec : Bytes 9816135 : from owa client to vs_owa with address 10.41.6.166:443:10.41.6.222
Rate 139587/sec : Bytes 10004146 : from owa client to vs_owa with address 10.41.6.166:443:10.41.6.222
Rate 147636/sec : Bytes 10009645 : from owa client to vs_owa with address 10.41.6.166:443:10.41.6.222
Rate 69967/sec : Bytes 10358668 : from owa client to vs_owa with address 10.41.6.166:443:10.41.6.222
Rate 69967/sec : Bytes 10361672 : from owa client to vs_owa with address 10.41.6.166:443:10.41.6.222
Rate 69967/sec : Bytes 10364579 : from owa client to vs_owa with address 10.41.6.166:443:10.41.6.222
Rate 69967/sec : Bytes 10364667 : from owa client to vs_owa with address 10.41.6.166:443:10.41.6.222
Rate 49661/sec : Bytes 10371887 : from owa client to vs_owa with address 10.41.6.166:443:10.41.6.222
Rate 217793/sec : Bytes 10700517 : from owa client to vs_owa with address 10.41.6.166:443:10.41.6.222
Rate 353829/sec : Bytes 10944230 : from owa client to vs_owa with address 10.41.6.166:443:10.41.6.222
Rate 93689/sec : Bytes 10946290 : from owa client to vs_owa with address 10.41.6.166:443:10.41.6.222
Rate 82030/sec : Bytes 10950753 : from owa client to vs_owa with address
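A minimal sketch of the extraction plus a time chart, assuming the index and sourcetype names are placeholders:

index=my_index sourcetype=my_traffic_log
| rex field=_raw "Rate\s+(?<rate_per_sec>\d+)/sec"
| timechart span=1m avg(rate_per_sec) as rate_per_sec

Rendered as a line or column chart, this gives time on the x axis and the extracted Rate/sec on the y axis.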
Hi All, I need help building an SPL query that would return all available fields mapped to their sourcetypes/sources, looking across all indexers and crawling through all indexes (index=*). I currently use the following to strip out all the fields and their extracted values, but I have no idea where they are coming from, i.e. what their sourcetype and source are:

index=* | fieldsummary
| search values!="[]"
| rex field=values max_match=0 "\{\"value\":\"(?<extracted_values>[^\"]+)\""
| fields field extracted_values

Thank you!
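A minimal sketch of one way to get the field-to-sourcetype mapping, using wildcard expansion in stats and then transposing so each sourcetype becomes a column (the short time range is just to keep the search cheap):

index=* earliest=-15m
| stats count(*) as * by sourcetype
| transpose 0 header_field=sourcetype column_name=field

A non-zero cell then means that field occurred in events of that sourcetype during the window.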
Hi All, I am trying to identify what data source/sourcetype is needed for each individual field when performing Data Model CIM normalization. For example, for the Endpoint->Ports dataset (https://docs.splunk.com/Documentation/CIM/5.2.0/User/Endpoint) there is a table with 5 columns (Dataset name, Field name, Data type, Description, Abbreviated list of example values), but there is no guidance on what data source is needed to start populating each individual field. As an example, I recently found that the Registry dataset needs a WinRegMon stanza (configuring this is another challenge) before Splunk can recognise and start parsing the data. Any help much appreciated!
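For reference, a minimal sketch of the WinRegMon stanza mentioned above, as it would appear in inputs.conf on a Windows forwarder; the stanza name is a placeholder and the deliberately broad regex filters should be narrowed in practice:

[WinRegMon://registry]
hive = .*
proc = .*
type = set|create|delete|rename
disabled = 0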
Hi there!

I need to pass a token from one dashboard to another dashboard when clicking its pie chart.

Input in dashboard 1:

<input type="multiselect" token="choose_office" searchWhenChanged="true">
  <label>Front/Back office</label>
  <choice value="Front Office">Front Office</choice>
  <choice value="Back Office">Back Office</choice>
  <initialValue>Front Office,Back Office</initialValue>
  <default>Front Office,Back Office</default>
  <valuePrefix>"</valuePrefix>
  <valueSuffix>"</valueSuffix>
  <delimiter>, </delimiter>
</input>

One of the searches in dashboard 1:

`compliance_op` | search office IN ($choose_office$) | chart count by $scope$global | sort $scope$global

My link to the next dashboard is:

<drilldown>
  <link target="_blank">/app/SAsh/operational_beautiful?form.choose_office=$choose_office$&amp;form.machine=$machine$&amp;form.origin=$origin$&amp;form.country=$country$&amp;form.cacp=$cacp$&amp;form.scope=$scope$</link>
</drilldown>

Multiselect in dashboard 2:

<input type="multiselect" token="office_filter" searchWhenChanged="true">
  <label>Front/Back Office</label>
  <choice value="Front Office">Front Office</choice>
  <choice value="Back Office">Back Office</choice>
  <choice value="Unknown">Unknown</choice>
  <prefix>office IN (</prefix>
  <suffix>)</suffix>
  <initialValue>Front Office,Back Office,Unknown</initialValue>
  <valuePrefix>"</valuePrefix>
  <valueSuffix>"</valueSuffix>
  <delimiter>, </delimiter>
  <change>
    <eval token="office_filter_drilldown">mvjoin('form.office_filter',"&amp;form.office_filter=")</eval>
  </change>
</input>

Search in dashboard 2:

`compliance_ap` | search office IN ($choose_office$) | chart count by $scope$global | sort $scope$global

I'm facing an error in the search of dashboard 2. Thanks!
Hi there!

I would like to pass multiselect values to a macro; earlier it was a dropdown. The values in the multiselect are themselves macros, which we need to pass as a token to the search:

<input type="checkbox" token="index_scope" searchWhenChanged="true">
  <label>Choose console</label>
  <choice value="1T*">Standard</choice>
  <choice value="2A*">Scada</choice>
  <choice value="2S*">AWS</choice>
  <default>1T*</default>
  <initialValue>1T*</initialValue>
</input>

Here is the search:

`compliance($index_scope$, now(), $timerange$, $scope$, $origin$, $country$, $cacp$)`

It's not working as expected with the multiselect; earlier, with the dropdown, it worked fine. Thanks in advance! Manoj Kumar S
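One detail that often matters here: with multiple selections, the token expands to all chosen values joined by the input's delimiter, so the macro must receive them in a form it can parse as a single argument. A minimal sketch of a multiselect that joins its values with a comma (choices mirrored from the post):

<input type="multiselect" token="index_scope" searchWhenChanged="true">
  <label>Choose console</label>
  <choice value="1T*">Standard</choice>
  <choice value="2A*">Scada</choice>
  <choice value="2S*">AWS</choice>
  <delimiter>,</delimiter>
  <default>1T*</default>
</input>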
Hello, how can I implement this: auto-choosing the category dropdown from the ingredient dropdown? For example, if I choose apple, it will auto-choose fruit. Many thanks @Anonymous
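A minimal sketch of one way to do this in Simple XML, using a <change> handler on the ingredient dropdown to set the category token (the choice values are placeholders):

<input type="dropdown" token="ingredient">
  <label>Ingredient</label>
  <choice value="apple">apple</choice>
  <choice value="carrot">carrot</choice>
  <change>
    <condition value="apple">
      <set token="form.category">fruit</set>
    </condition>
    <condition value="carrot">
      <set token="form.category">vegetable</set>
    </condition>
  </change>
</input>

Setting form.category (rather than just category) is what makes the category dropdown visibly switch to the new value.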
I'm going crazy with this and would appreciate some help. I'm pretty sure the record numbers were not being shown to me before. I'm trying to index a simple JSON file and am unsure where they are coming from or how to disable them!
Looking for help in editing a saved alert query using the API/curl. I would like to change the pod_count threshold from 9 to 12. How can I do that using the API/curl?

Alert name: test_goski-list

Alert query:

index=list-service source="eventhub://sams-jupiter-prod-scus-logs-premium-1.servicebus.windows.net/list-service;" "kubernetes.namespace_name"="list-service"
| stats dc(kubernetes.pod_name) as pod_count
| where pod_count < 9

I am trying to run the curl below to connect to the search head IP and am getting the error response:

curl -k -u admin:changeme -X POST https://10.236.140.2:8089/servicesNS/admin/search/alerts/test_goski-list -d "search=index=list-service"

<?xml version="1.0" encoding="UTF-8"?>
<response>
  <messages>
    <msg type="ERROR">Not Found</msg>
  </messages>
</response>
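For comparison, saved searches (including alerts) normally live under the saved/searches REST endpoint rather than alerts/. A minimal sketch of an update that sends the full SPL, reusing the host and credentials from the post; --data-urlencode keeps the quotes and pipes in the query intact:

curl -k -u admin:changeme https://10.236.140.2:8089/servicesNS/admin/search/saved/searches/test_goski-list --data-urlencode 'search=index=list-service source="eventhub://sams-jupiter-prod-scus-logs-premium-1.servicebus.windows.net/list-service;" "kubernetes.namespace_name"="list-service" | stats dc(kubernetes.pod_name) as pod_count | where pod_count < 12'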
I have this SPL:

index=main state=open
| stats count(state) as open by risk_rating
| rename risk_rating as state
| addtotals col=t row=f labelfield=state
| append
  [ search index=main state=closed
  | stats count(state) as closed by risk_rating_after
  | rename risk_rating_after as state
  | addtotals col=t row=f labelfield=risk ]

I want to create a table like the one below, but the risk_rating_after field only has the Sustainable value, so when I do selfjoin state it only has Sustainable. I tried join but it did not get a result. Is there any way I can achieve this result? Many thanks in advance.

state   Critical   Moderate   Severe   Sustainable   Total
Open    1          2          4        5             12
Close   0          0          0        6             6
Total   1          2          4        11            18
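A minimal sketch of an alternative that avoids the self-join, by normalizing the two rating fields into one before charting (field and value names taken from the post):

index=main (state=open OR state=closed)
| eval rating=if(state="open", risk_rating, risk_rating_after)
| chart count over state by rating
| addtotals col=t row=t labelfield=state

Here row=t adds the per-row Total column and col=t adds the closing Total row.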
I notice that CSV ingestion (from a Splunk Web file upload) sometimes cuts off an event, possibly because one field is extra lengthy. In one example, I see that Splunk only gets roughly 8,160 characters of a column that has 8,615 characters. That field, and any column after it, are not extracted. (Those ~8,100 characters remain in Splunk's raw event.)

When I took the same CSV file to a similarly configured instance, however, ingestion was successful for this event, with no missing fields. Particularly surprising is that I have increased [kv] maxchars in the instance that had this trouble, so I suspect that if I ingest it again on the same instance, it may succeed as well. In other words, this seems rather random. (Even without increasing maxchars, the length of this column is still smaller than the default of 10,240.)

                   Instance 1 (dropped part of event)    Instance 2 (event ingestion complete)
limits.conf [kv]   From local/limits.conf:               From default/limits.conf:
                   indexed_kv_limit = 1000               indexed_kv_limit = 200
                   maxchars = 40960                      maxchars = 10240
RAM                16 GB                                 8 GB

What else should I check? Both instances run Splunk Enterprise 9.1.1.
Hi, I am trying to access the REST API for an on-premises controller. I followed the instructions in this document for my version of AppDynamics: API Clients (appdynamics.com). I created an API Client, generated the client ID and secret, and I am able to generate the token by accessing the 'controller/api/oauth/access_token' endpoint. But when I use the authentication token and/or the username/password to access any other endpoint, I am getting a 500 Internal Server Error response. Any suggestions on how to resolve this?
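For reference, a minimal sketch of the sort of request that should succeed once the token is accepted, assuming the standard applications endpoint and a placeholder controller host:

curl -H "Authorization: Bearer <access_token>" "https://my-controller.example.com/controller/rest/applications?output=JSON"

If this also returns a 500, the controller's server logs are usually the next place to look.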
Hello, I was going through the Custom Visualization tutorial here: https://docs.splunk.com/Documentation/Splunk/9.1.1/AdvancedDev/CustomVizTutorial

When I got to the step to build the visualization, I ran into the following error when trying to build on Windows using npm run build, which is also described in this forum post by @niketn: https://community.splunk.com/t5/Getting-Data-In/Why-does-Splunk-Custom-Visualization-API-result-in-a-build-error/m-p/351681

> $SPLUNK_HOME/bin/splunk cmd node ./node_modules/webpack/bin/webpack.js
'$SPLUNK_HOME' is not recognized as an internal or external command, operable program or batch file.

The workaround described in the post is to change the SPLUNK_HOME environment variable reference in the package.json file from the *NIX format ($SPLUNK_HOME) to the Windows format (%SPLUNK_HOME%), so that the scripts look like below:

"scripts": {
  "build": "%SPLUNK_HOME%/bin/splunk cmd node ./node_modules/webpack/bin/webpack.js",
  "devbuild": "%SPLUNK_HOME%/bin/splunk cmd node ./node_modules/webpack/bin/webpack.js --progress",
  "watch": "%SPLUNK_HOME%/bin/splunk cmd node ./node_modules/webpack/bin/webpack.js -d --watch --progress"
}

However, when I attempt to build the visualization after changing the package.json file, I still get the following error:

> standin@1.0.0 devbuild
> %SPLUNK_HOME%/bin/splunk cmd node ./node_modules/webpack/bin/webpack.js --progress
'C:\Program' is not recognized as an internal or external command, operable program or batch file.

I am assuming this is because the file path is "C:\Program Files\Splunk", which contains a space. According to the original forum post, the only way to build the visualization on Windows is to reinstall Splunk in a different file path that does not contain spaces. I was hoping there is a different solution for Windows that does not require reinstalling Splunk? Or is there any other mistake I am making that would cause the build to fail? Thanks!
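One possibly simpler avenue, untested against this tutorial: cmd.exe generally copes with spaces in a command path when the expanded path is wrapped in quotes, which in package.json means escaped quotes around the splunk binary:

"scripts": {
  "build": "\"%SPLUNK_HOME%\\bin\\splunk\" cmd node ./node_modules/webpack/bin/webpack.js",
  "devbuild": "\"%SPLUNK_HOME%\\bin\\splunk\" cmd node ./node_modules/webpack/bin/webpack.js --progress",
  "watch": "\"%SPLUNK_HOME%\\bin\\splunk\" cmd node ./node_modules/webpack/bin/webpack.js -d --watch --progress"
}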
Hi community! I've tried and exhausted all my brain cells but I still couldn't make this work. Any ideas? The configuration below is deployed to a Windows 11 machine running UF 9.1.1: splunk-playground/TA-powershell_scripting/local/inputs.conf at main · morethanyell/splunk-playground (github.com)
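For context, a minimal sketch of what a PowerShell scripted input stanza usually looks like in inputs.conf on a Windows forwarder; the stanza name, script, and schedule here are placeholders, not the contents of the linked file:

[powershell://GetProcesses]
script = Get-Process | Select-Object -Property Name, CPU | ConvertTo-Csv
schedule = */5 * * * *
sourcetype = powershell:process
disabled = 0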
I signed up for a free trial of Splunk Cloud to test out an app I'm considering buying. However, nowhere was I told that I cannot upload an app file, or that the app cannot be installed into the free trial of Splunk Cloud. Under Apps > Manage Apps, there is no "Upload" button. How can I test and see if this is worth buying, if I can't test and see if it's worth throwing money at?
After migrating to Splunk 9.1.1, all of the controls under splunk/search_mrsparkle/exposed/js/views/shared/controls/ are no longer there. A search found them under the splunk/quarantined_files/share/splunk/search_mrsparkle/exposed/js/views/shared/controls folder. Was there a reason all of the controls were moved? I looked under the release docs but didn't find anything on this topic. Is there a reason all of these are quarantined, or can I move them all back?
Hi, is it possible to fetch the account access key, which is under the license page, using an API call instead of getting it from the controller? Regards, Mohammed Saad
I'm trying to create a visual dashboard (specifically a column graph or bar chart) using:

index=guardium ruleDesc="OS Command Injection" | stats count by dbUser, DBName, serviceName, sql

This is the graph I get: [column chart omitted]

I would like to group these fields into categories on the chart, where one bar would show the count of 1-5, the next 6-10, and so on. Then I could drill down into a specific bar within a count group to view the fields for that bar in a table format. How would I go about doing this? I am new to Splunk and have been stuck finding the best way to represent this data. I was given this search statement and was told to make a visual dashboard of it.
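A minimal sketch of the bucketing step, assuming the ranges from the post; the labels are zero-padded so they sort correctly:

index=guardium ruleDesc="OS Command Injection"
| stats count by dbUser, DBName, serviceName, sql
| eval count_group=case(count<=5, "01-05", count<=10, "06-10", count<=15, "11-15", true(), "16+")
| stats count as combinations by count_group
| sort count_group

Charted as a column chart, each bar is then a count range, and a drilldown on a bar can pass count_group as a token into a table panel filtered to that range.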