All Topics

Hello Splunkers, I'm working with the latest version of Splunk Add-on Builder to index data from a REST API. The TA only pulls the first page of results by calling:

https://mywebpage.com/api/source/v2

At the bottom of the returned data there is a URL for the next page:

"next_url" : "/api/source/v2?last=5431"

How do I configure the TA to iterate through all the pages? I checked the link below, but I don't understand how (or whether it is possible) to pass that variable from the modular input to my endpoint, like this or in some other way:

https://mywebpage.com/api/source/v2?last=${next_url}

https://docs.splunk.com/Documentation/AddonBuilder/4.3.0/UserGuide/ConfigureDataCollection#Pass_values_from_data_input_parameters

Any ideas? Thanks!
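As far as I know, the plain REST input in Add-on Builder does not follow a next_url link by itself, so this usually means switching to a data input with custom Python code and doing the paging there. A rough sketch of that loop, using plain requests with hypothetical response handling (in the real input you would use the Add-on Builder helper functions and write each record as an event instead of printing):

import requests  # assumption: available to the modular input

BASE_URL = "https://mywebpage.com"
url = BASE_URL + "/api/source/v2"

while url:
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    data = response.json()

    for record in data.get("results", []):   # assumption: records live under "results"
        print(record)                         # real input: build and write an event here

    # follow the relative "next_url" returned by the API until it is absent
    next_url = data.get("next_url")
    url = BASE_URL + next_url if next_url else None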
I am looking to replace a sourcetype using props.conf / transforms.conf, so far with no luck.

props.conf

[original_sourcetype]
NO_BINARY_CHECK = 1
SHOULD_LINEMERGE = false
TIME_PREFIX = oldtimeprefix
TIME_FORMAT = oldtimeformat
pulldown_type = 1
TRANSFORMS-set_new = set_new_sourcetype

[new_sourcetype_with_new_timeformat]
NO_BINARY_CHECK = 1
SHOULD_LINEMERGE = false
TIME_PREFIX = newtimeprefix
TIME_FORMAT = newtimeformat
pulldown_type = 1
#rename = original_sourcetype

transforms.conf

[set_new_sourcetype]
SOURCE_KEY = MetaData:Source
REGEX = ^source::var/log/path/tofile.log
FORMAT = sourcetype::new_sourcetype_with_new_timeformat
DEST_KEY = MetaData:Sourcetype

I have tried different REGEXes, including REGEX = var/log/path/tofile.log. I also tried setting it like this in props.conf:

[source::var/log/path/tofile.log]
TRANSFORMS-set_new = set_new_sourcetype

I am also looking at inputs.conf, which has monitor stanzas for all syslog traffic; perhaps some blacklisting/whitelisting based on source can be done there. But I am curious as to what is not working with my props/transforms. Thanks
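A minimal sketch of the usual pattern, assuming the monitored file really is /var/log/path/tofile.log (note the leading slash, which the posted REGEX omits) and that these settings live on the first full Splunk instance that parses the data (indexers or a heavy forwarder), not on a universal forwarder:

props.conf
[original_sourcetype]
TRANSFORMS-set_new = set_new_sourcetype

transforms.conf
[set_new_sourcetype]
SOURCE_KEY = MetaData:Source
REGEX = ^source::/var/log/path/tofile\.log$
FORMAT = sourcetype::new_sourcetype_with_new_timeformat
DEST_KEY = MetaData:Sourcetype

One caveat: this rewrite happens after line breaking and timestamp extraction, so the TIME_PREFIX/TIME_FORMAT under the new sourcetype will not be applied; timestamps are still parsed under original_sourcetype.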
In my environment, Palo Alto (proxy) logs are being stored in Splunk. Using the Palo Alto logs together with Windows event logs, Linux audit logs, or something similar, I want to know what kind of operation on a server caused a high-risk communication to the internet. Is this possible with a Splunk correlation search?
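A rough correlation sketch, assuming (all of these are assumptions to adjust for your environment) that the Palo Alto threat logs sit in index=pan_fw with sourcetype=pan:threat and CIM-style src_ip/dest_ip/severity fields, and that the servers ship Sysmon network-connection events (EventCode 3) with SourceIp/DestinationIp/Image/User fields:

(index=pan_fw sourcetype=pan:threat severity IN ("critical","high"))
OR (index=wineventlog source="XmlWinEventLog:Microsoft-Windows-Sysmon/Operational" EventCode=3)
| eval flow=coalesce(src_ip, SourceIp)."->".coalesce(dest_ip, DestinationIp)
| bin _time span=5m
| stats values(sourcetype) AS data_sources values(severity) AS severity values(Image) AS process values(User) AS user by _time flow
| where mvcount(data_sources) > 1

This flags 5-minute windows where the same source->destination pair appears both as a high-risk firewall hit and as a host-level network connection, so you can see which process and user were responsible. Saved on a schedule, a search like this can be used as a correlation search in Enterprise Security; for Linux hosts you would need equivalent telemetry from auditd.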
I configured a search head cluster, configured a captain, and added the search heads to the indexer cluster. I now want to break up the shcluster and have done the following so far, all from the CLI:

- Removed the member that was not the captain; that went OK.
- Tried to remove the other member; that didn't work, the command just hung for half an hour before I gave up and aborted it.
- Tried to set the captain to static mode and did a clean raft, but still no luck.
- Configured disabled=1 in the shclustering stanza of server.conf, and this time it went OK, I guess; I now get the message that this node is not part of any cluster configuration.

Over to the indexer cluster, where I now want to get rid of the search heads, which still show up in the GUI as up and running. I ran the command splunk remove cluster-search-heads and it reported success, but the search heads are still there in the indexer clustering GUI. Some suggest this will go away after a few minutes, and that after a restart of the manager node it will certainly go away. I have now waited a whole day and restarted, but they are still showing as up and running, with a green checkmark too. Where does the GUI get its information from, and how can I get rid of them?
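To see the list the GUI is drawing from, one thing worth trying (a sketch; the endpoint name and fields vary a little by version, and on 8.1+ there is also a cluster/manager/searchheads form) is querying the manager node's own REST endpoint from a search on the manager:

| rest /services/cluster/master/searchheads splunk_server=local
| table label site status

If the stale search heads still appear there, the manager is still holding their registration rather than the GUI simply caching old data.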
Hello, as an example I have 2 lookups, first.csv and second.csv.

first.csv has a single column named fruit_name with these values:

fruit_name
apple
banana
melon
mango
grapes
guyabano
coconut

second.csv has 2 columns, fruits and remarks, with a multivalue entry under the fruits column:

fruits | remarks
apple, mango, guyabano | visible

How can I check whether all of the values in second.csv (apple, mango, guyabano) are present in the fruit_name column of first.csv, and if so, output the remarks value "visible"? Thanks in advance
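A sketch of one way to do the comparison, assuming both files are usable with inputlookup and that fruits comes back as a multivalue field (if it comes back as a single delimited string, split it first with makemv):

| inputlookup second.csv
| mvexpand fruits
| join type=left fruits
    [| inputlookup first.csv | rename fruit_name AS fruits | eval matched=1]
| eventstats count AS needed sum(matched) AS found
| where needed=found
| stats values(remarks) AS remarks

If every fruit listed in second.csv exists in first.csv, this returns remarks=visible; if even one is missing, it returns nothing.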
Hi Team, due to an SSL cert issue the Database Queries tab is not loading; we are working on that. The customer is asking us to fetch the following data: query, time executed, time taken for completion, etc. Is there any way we can get this data directly from the database? Which database holds the queries data, and what is the path to that DB? Please share the DB and table names so we can export the data from the database. Thanks
Hello Splunkers!! In a scheduled search within Splunk, we have set up email notifications with designated recipients. However, there is an intermittent issue where recipients do not consistently receive the scheduled search email. To address this, we need to determine whether there is a way within Splunk to verify that the recipients successfully received the email notifications. Please help me identify how to check this in Splunk.

index=_internal source=*splunkd.log sendemail

I have tried the search above, but it does not provide information about the recipients' email addresses.
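Depending on the version, the recipient list is often logged by the sendemail command in python.log rather than splunkd.log. A rough check (log locations are an assumption and vary by version):

index=_internal sendemail (source=*python.log* OR source=*splunkd.log*)
| table _time host source _raw

Look for lines like "Sending email. subject=..., recipients=[...]". Keep in mind that Splunk only hands the message to the configured mail server, so a successful send from Splunk does not guarantee delivery; the mail server's own logs are the place to confirm the recipient actually received it.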
Hi, I got an error after completing the Enterprise Security setup in my lab. First I was using Windows, but when setting up Enterprise Security I always got:

Error in 'essinstall' command: (InstallException) "install_apps" stage failed - Splunkd daemon is not responding: ('Error connecting to /services/admin/localapps: The read operation timed out',)

Then I tried installing a fresh Splunk Enterprise in WSL (in my case Ubuntu 22); the install succeeded and everything worked normally. After that, I tried installing Enterprise Security again. This time I got a success notification when setting up Enterprise Security via the web GUI, but unfortunately after the restart I can't open Splunk Enterprise. I cannot see any error in my CLI output (screenshot attached), which is why I'm asking here; maybe somebody can help me?
mvmap has different results on different versions: the left screenshot is version 9.3.1 and the right is 9.0.5. If the field has more than one value, the results should be equal.
Hi, I'm getting a 201 token error on the Splunk Cloud maintenance dashboard. Just wondered if anyone has seen this before.
Hi Team, I am trying to design a query whose result shows the total event count, the sub-event count, and the sub-event percentage. Can you please help with the query? For example, a table like this:

Work_Month_week | total_week_day | work day of week | Number of work hours | percent work hours
1               | 3              | Mon              | 2                    | %
                |                | Tue              | 4                    | %
                |                | Tue              | 4                    | %
2               | 2              | Mon              | 2                    | %
                |                | Tue              | 4                    | %
3               | 3              | Mon              | 3                    | %
                |                | Tue              | 5                    | %
                |                | Thu              | 4                    | %
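A sketch of the kind of query that produces a table like that, assuming (hypothetical field names) each event carries Work_Month_week, work_day and work_hours; adjust to your actual fields:

index=your_index
| stats sum(work_hours) AS day_hours by Work_Month_week work_day
| eventstats sum(day_hours) AS week_hours dc(work_day) AS total_week_day by Work_Month_week
| eval percent_work_hours=round(day_hours/week_hours*100, 1)."%"
| table Work_Month_week total_week_day work_day day_hours percent_work_hours

The eventstats adds the per-week totals alongside each per-day row, which is what lets the percentage be computed per day of the week.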
I have this message field and I need to extract the values from the brackets. The values are C, D, E, F, G: Message.Rogue.AllDskID{} How would I use rex to do this, or would I need to use the eval command?
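If the events are JSON and Message.Rogue.AllDskID{} is the auto-extracted array field, the values are already there as a multivalue field; the {} just needs quoting when you reference it. A sketch:

| rename "Message.Rogue.AllDskID{}" AS AllDskID
| mvexpand AllDskID

Or, if the field is not auto-extracted at search time:

| spath path=Message.Rogue.AllDskID{} output=AllDskID
| mvexpand AllDskID

Neither rex nor eval should be needed for this case.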
Hi community, I have observed an issue with the ingestion of the first line in a log file that, at first glance, seems to have been truncated. Here's a screenshot for reference. My apologies for the poor job of blurring the data, but the first event should look like the second event, with a whole lot of data after the highlighted field. The DistPoint field itself should have a value of "DEPSY.IM2", yet it apparently got truncated at a rather odd point. All other subsequent lines in the log were successfully ingested. There were 3 log files landing on the ingestion point in quick succession, seconds apart, so I am not sure whether that could have been the issue. I was about to update the TRUNCATE value for the sourcetype, but all lines in the logs are 3551 bytes, well within the default. Any ideas as to what the problem could have been? Thank you.
I'm using | iplocation src, and it produces results for City. Next I want to compare each City value and report when the result changes, for example when the City for a given source is Miami and an hour or so later the City in the same field for that source is Boston.
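A sketch using streamstats to compare each City with the previous one seen for the same src (the base search/index is an assumption):

index=your_proxy_index
| iplocation src
| sort 0 src _time
| streamstats current=f last(City) AS previous_City by src
| where isnotnull(previous_City) AND City!=previous_City
| table _time src previous_City City

Each result row is a point where the geolocated City for a given src changed from the previous value, e.g. Miami -> Boston.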
I am creating a panel with an input of type "link". Multiple choices are defined; how do I keep all the choice buttons on one line using Splunk classic (Simple XML)?

<panel id="panel_id_1">
  <input type="link" token="token_tab" searchWhenChanged="true" id="details">
    <label></label>
    <choice value="x">X</choice>
    <choice value="Y">Y</choice>
    <choice value="z">Z</choice>
  </input>
</panel>

I want the choice values to appear side by side as X Y Z, but for me they stack vertically as
X
Y
Z
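One common workaround is an inline style block in a hidden panel; the selectors below are an assumption and depend on your Splunk version, so inspect the rendered DOM and adjust:

<row depends="$alwaysHideCSS$">
  <panel>
    <html>
      <style>
        /* assumption: keep the link-input buttons from stacking; tweak selectors to match your DOM */
        #details button, #details .btn-group {
          display: inline-block !important;
          width: auto !important;
        }
      </style>
    </html>
  </panel>
</row>

Giving the input more horizontal room (for example, moving it into the fieldset instead of a narrow panel) sometimes avoids the stacking without any CSS.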
What would be the proper way to push an authentication.conf from the deployer without leaving the bind password in clear text? Is it possible to push the authentication settings from the deployer without the bind password, and then add another authentication.conf manually to each search head in system/local containing only the bind password in the stanza? After a restart of the search head cluster, I'm thinking the bind password would then be encrypted. Would this be the proper way to do this? I would appreciate any other suggestions.
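A sketch of the split being described, with a hypothetical strategy name (my_ldap) and host values:

# pushed from the deployer (shcluster app): everything except the password
[authentication]
authType = LDAP
authSettings = my_ldap

[my_ldap]
host = ldap.example.com
bindDN = CN=svc_splunk,OU=Service,DC=example,DC=com
... other LDAP settings, no bindDNpassword ...

# added by hand once on each member, in $SPLUNK_HOME/etc/system/local/authentication.conf
[my_ldap]
bindDNpassword = <plaintext on first write; splunkd rewrites it encrypted at the next restart>

Because system/local has the highest precedence and is never touched by deployer pushes, the encrypted password should survive future bundle pushes. The encryption is tied to each host's splunk.secret, which search head cluster members should be sharing anyway.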
I found this very useful search for a dashboard on GoSplunk:

| rest /services/data/indexes
| dedup title
| fields title
| rename title AS index
| map maxsearches=1500 search="| metadata type=sourcetypes index=\"$index$\"
    | eval Retention=tostring(abs(lastTime-firstTime), \"duration\")
    | convert ctime(firstTime) ctime(lastTime)
    | sort lastTime
    | rename totalCount AS \"TotalEvents\" firstTime AS \"FirstEvent\" lastTime AS \"LastEvent\"
    | eval index=\"$index$\""
| fields index sourcetype TotalEvents FirstEvent LastEvent Retention
| sort sourcetype
| stats list(sourcetype) AS SourceTypes list(TotalEvents) AS TotalEvents list(FirstEvent) AS "First Event" by index
| append [| rest /services/data/indexes | dedup title | fields title | rename title AS index]
| dedup index
| fillnull value=null SourceTypes TotalEvents "First Event" "Last Event" Retention
| sort index
| search index=* (SourceTypes=*)

However, when I first ran it, some of the LastEvent values appeared correctly. Ever since then, LastEvent and Retention have always been "null", and I can't figure out why I don't get any return values for these fields. I got an error saying the 100-value limit of the list() function was exceeded, so I tried replacing list() with values() in the search, but the result is the same, just without the error.
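Not a fix for the map-based search itself, but a simpler alternative sketch that avoids map and the 100-value list() cap entirely while returning the same first/last-event information per index and sourcetype:

| tstats count AS TotalEvents min(_time) AS FirstEvent max(_time) AS LastEvent where index=* by index sourcetype
| eval Retention=tostring(LastEvent-FirstEvent, "duration")
| convert ctime(FirstEvent) ctime(LastEvent)
| sort 0 index sourcetype

Run it over All Time (or whatever retention window you care about), since tstats only considers events within the selected time range.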
I have a lookup file saved with a single column containing values of a specific field, and I want to use it in a search so the query matches those values against a field name. Example: lookup name: test.csv, column name: column1, field name: field1
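A sketch of the usual subsearch pattern (the index name is an assumption):

index=your_index
    [| inputlookup test.csv | rename column1 AS field1 | fields field1]

The subsearch expands to ( field1="value1" OR field1="value2" ... ), so the outer search only returns events whose field1 matches a value from the lookup.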
Hi, I am building a dashboard for UPS monitoring and I would like to convert a specific metric, battery age, which tells us something about when the battery was last changed. I would like to see the result in months and days, e.g. an expected outcome of "1 month 20 days"; the current outcome is just the raw value (see the screenshot).

SPL query:

index="ups" indexed_is_service_aggregate=1 kpi=BatteryAge
| lookup service_kpi_lookup _key as itsi_service_id OUTPUT title AS service_name
| search service_name="MainUPS"
| stats latest(alert_value) AS BatteryAge

Can anyone help me with this?
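Continuing from the posted search and assuming BatteryAge comes back as a number of days (if it is seconds, divide by 86400 first; a 30-day month is also an approximation):

... | stats latest(alert_value) AS BatteryAge
| eval months=floor(BatteryAge/30), days=floor(BatteryAge - months*30)
| eval BatteryAge=months." month ".days." days"

For example, a raw value of 50 days would render as "1 month 20 days".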
Thanks for the solution, which worked: when I select the data entity and the time and hit the Submit button, the query below runs. But without selecting the env (TEST or PROD), the search just applies the default dropdown value, i.e. the TEST index "np-ap" with stageToken set to test, so it runs as

index="np-ap" AND source="--a-test"

from the tokenized query

<query>index=$indexToken$ AND source="-a-$stageToken$"

I want the Submit button to also gate the env selection, along with the data entity and date.

<form version="1.1" theme="dark">
  <label> stats</label>
  <fieldset submitButton="true">
    <input type="dropdown" token="indexToken1">
      <label>Environment</label>
      <choice value="pd-ap,prod">PROD</choice>
      <choice value="np-ap,test">TEST</choice>
      <change>
        <eval token="stageToken">mvindex(split($value$,","),1)</eval>
        <eval token="indexToken">mvindex(split($value$,","),0)</eval>
      </change>
      <default>np-ap,test</default>
    </input>
    <input type="dropdown" token="entityToken">
      <label>Data Entity</label>
      <choice value="aa">aa</choice>
      <choice value="bb">bb</choice>
      <choice value="cc">cc</choice>
      <choice value="dd">dd</choice>
      <choice value="ee">ee</choice>
      <choice value="ff">ff</choice>
      <default>aa</default>
    </input>
    <input type="time" token="timeToken" searchWhenChanged="false">
      <label>Time</label>
      <default>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </default>
    </input>
  </fieldset>
  <row>
    <panel>
      <html id="APIStats">
        <style>
          #user{
            text-align:center;
            color:#BFFF00;
          }
        </style>
        <h2 id="user">API</h2>
      </html>
    </panel>
  </row>
  <row>
    <panel>
      <table>
        <title>Unique</title>
        <search>
          <query>index=$indexToken$ AND source="-a-$stageToken$" | stats count </query>
          <earliest>$timeToken.earliest$</earliest>
          <latest>$timeToken.latest$</latest>
        </search>
        <option name="drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </table>
    </panel>
  </row>
</form>
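Tokens set via <change>/<eval> are generally applied immediately rather than being held back by the Submit button, which could explain the behaviour described above. One possible workaround sketch (untested against this environment): derive both pieces inside the query from the submitted dropdown token, so the <change>-derived tokens are not needed at all and only Submit drives the search:

<query>
  [| makeresults
   | eval combo=$indexToken1|s$
   | eval search="index=\"".mvindex(split(combo,","),0)."\" source=\"-a-".mvindex(split(combo,","),1)."\""
   | fields search]
| stats count
</query>

Because only $indexToken1$ appears in the query, the panel re-runs only when Submit pushes a new Environment value, together with the entity and time tokens.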