Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

All Topics

How to load drop down list based on time selector token
I'm using the Python SDK to search and retrieve results in JSON output_mode. The data I'm searching was loaded into Splunk as a CSV file with the first row as the header. Currently I'm getting these keys in the output: "_bkt","_cd","_indextime","_raw","_serial","_si","_sourcetype","_time",host,index,linecount,source,sourcetype,"splunk_server". The _raw field contains a string of comma-separated values (the actual data), but I'm not able to get the header names for those values; the rest of the fields are just metadata. How do I get the CSV header in the JSON output of the search? I even tried CSV output_mode. No luck.
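A minimal sketch of the kind of sourcetype definition that makes header-row columns show up as named fields alongside _raw in search results, assuming the file can be (re)onboarded with it; the sourcetype name is a placeholder, not anything from the original post:

    # props.conf - hypothetical sourcetype for a CSV with a header row
    [my_csv_data]
    INDEXED_EXTRACTIONS = csv
    # treat the first line of the file as the list of field names
    HEADER_FIELD_LINE_NUMBER = 1

With structured extraction like this in place, the column names from the header become fields in the events, so they appear in the JSON returned by the SDK without any change to the search code.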
Hello Splunkers! In the screenshot below, we are getting results in one of our Splunk reports. The issue is that we are also getting blank column results (highlighted in yellow). Please help me understand whether this is an issue with the Python version, or whether there is another workaround available for it. Thanks in advance
Hi Team, can I get the sourcetype to use in Splunk for the DB authentication, authorization, and accounting (AAA) logs below?
I need to know how to Sum(CreatedSD?, CreatedBD, CreatedLOD) as CreatedTotal.

Login, Document and Loan Counts High Level

SERVICE     UserLogins   DocumentUploads   CreatedSD?   CreatedBD   CreatedLOD   CreatedTotal
Prod-310    1
Prod-330    2098         145               17           20          2
Prod-340    1553         184               3            9           6
                                           20           29          8
            3652         329               40           58          16
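A minimal SPL sketch of the calculation being asked for, assuming the column headers above are the actual field names (written here as CreatedSD without the trailing "?", which is an assumption about the real field name); coalesce() treats missing values as 0 so blank cells don't blank out the total:

    ... | eval CreatedTotal = coalesce(CreatedSD, 0) + coalesce(CreatedBD, 0) + coalesce(CreatedLOD, 0)
        | table SERVICE UserLogins DocumentUploads CreatedSD CreatedBD CreatedLOD CreatedTotal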
Our Splunk environment currently runs on a specific VLAN. Our management wants Splunk moved to out-of-band management, or at least to move some of the Splunk management servers to a different VLAN and access them over SSH through out-of-band management. Can this be done? Thank you in advance
Hi, I am forced to set an individual TZ for individual hosts in a ServerClass because the hosts' OS time is not standardized. I have noticed that TZ = US/Eastern, TZ = US/Central, and TZ = US/Pacific all account for Daylight Saving Time automatically. However, I have servers in the following time zones and I am hoping someone can confirm what TZ settings I should use to automatically adjust for DST:

AUS/Eastern <<< using TZ = Australia/Sydney
AWST <<< using TZ = Australia/West
Etc/GMT+12 <<< cannot find an alternate
GB (for UK BST) <<< using TZ = GB (for UK locations with BST)
HKT <<< cannot find an alternate

Hopefully that is correct... I was given these by the host admin. Please refer me to the relevant doc, as I don't find these TZs in the Splunk docs, other than a reference to Wikipedia. Thank you
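A minimal props.conf sketch of per-host TZ overrides of the kind described above; the host name patterns are hypothetical placeholders, while the zone names are standard IANA identifiers that handle DST automatically (Australia/Perth covers AWST, Asia/Hong_Kong covers HKT, Europe/London covers UK BST, and Etc/GMT+12 is itself a valid fixed-offset zone with no DST; note the POSIX sign convention, so Etc/GMT+12 means UTC-12):

    # props.conf in the deployed app - host patterns below are examples only
    [host::sydney-*]
    TZ = Australia/Sydney

    [host::perth-*]
    TZ = Australia/Perth

    [host::hongkong-*]
    TZ = Asia/Hong_Kong

    [host::london-*]
    TZ = Europe/London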
Hi All, I have this error message on the SH in Splunk:

{ Knowledge bundle size=3525MB exceeds max limit=2048MB. Distributed searches are running against an outdated knowledge bundle. Please remove/disable files from knowledge bundle or increase maxBundleSize in distsearch.conf. }

What I did was increase maxBundleSize in distsearch.conf. I ran this command on the server:

/opt/splunk/bin/splunk btool distsearch list --debug | grep maxBundleSize

and the result is:

/opt/splunk/etc/system/default/distsearch.conf                  maxBundleSize = 2048

So inside /opt/splunk/etc/system/local/distsearch.conf I added:

[replicationSettings]
maxBundleSize = 4000

I restarted Splunk and noticed that the first error message is gone, but a new yellow warning appeared:

{ The current bundle directory contains a large lookup file that might cause bundle replication fail. The path to the directory is /opt/splunk/var/run/InvestBank-SH-1-1681121119-1681121612.delta. }

So I went to that path to check what is going on (cd /opt/splunk/var/run) and found two large files and one medium-sized one. Can someone please advise me on what to do past this point?

I also found someone posting to check the search below, but I don't know if it has any relation to the subject:

index=_internal sourcetype=splunkd component=Archiver Archiving large_file=* | stats count latest(size_in_bytes) by large_file

Please note that my Splunk environment is not a cluster.
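A minimal sketch of the distsearch.conf setting commonly used to keep an oversized lookup out of the knowledge bundle instead of growing the bundle limit; the stanza key and the lookup path below are placeholders, and newer Splunk versions may name the stanza [replicationDenylist], so treat this as a starting point to verify against the distsearch.conf spec:

    # distsearch.conf on the search head - lookup name below is hypothetical
    [replicationBlacklist]
    huge_lookup = (.*[/\\])?apps[/\\]search[/\\]lookups[/\\]my_big_lookup\.csv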
Hello All, Currently a certain application is sending data to Splunk via syslog (rsyslog) over TCP. Now the application team wants to try sending the data via syslog (rsyslog) over TCP with TLS encryption. Can anyone please help me with how this can be achieved? It would be really awesome if anybody could provide leads on any Splunk documentation or links available for reference. Thanks.
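A minimal sketch of the two sides typically involved, assuming rsyslog's gtls network stream driver on the sender and a tcp-ssl input on the Splunk receiver; the host name, port, and certificate paths are all placeholders:

    # rsyslog sender (legacy directive syntax) - host and paths are hypothetical
    $DefaultNetstreamDriverCAFile /etc/rsyslog.d/certs/ca.pem
    $DefaultNetstreamDriver gtls
    # mode 1 = require TLS; authenticate the receiver by certificate name
    $ActionSendStreamDriverMode 1
    $ActionSendStreamDriverAuthMode x509/name
    $ActionSendStreamDriverPermittedPeer splunk-hf.example.com
    *.* @@splunk-hf.example.com:6514

    # Splunk receiver - inputs.conf on the indexer or heavy forwarder
    [tcp-ssl://6514]
    sourcetype = syslog

    [SSL]
    serverCert = /opt/splunk/etc/auth/mycerts/server.pem
    sslPassword = <certificate password>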
Hi All, Can you please guide and suggest how to onboard macOS 13 logs into Splunk? Specifically, web browser history and network connection logs from macOS 13 systems.
Hi all, While adding the payload URL on a GitHub webhook for the Splunk HTTP Event Collector, I am getting the error below: {"text":"Query string authorization is not enabled","code":16}. Please provide your valuable suggestions to get it working. Thanks in advance.
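The error indicates the HEC token does not accept credentials passed in the URL query string, which is how a GitHub webhook has to send them. My understanding is that on Splunk Enterprise this is a per-token setting; a minimal sketch, with the token stanza name as a placeholder, to verify against the inputs.conf spec:

    # inputs.conf for the HEC token - stanza name is hypothetical
    [http://github_webhook]
    allowQueryStringAuth = true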
Hi all, I have two fields. I want a Splunk query that excludes events where one field contains the other. For example, field1 is ::ffff:127.0.0.1 and field2 is 127.0.0.1, and I don't want to see events where field1 contains field2. Thank you
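A minimal SPL sketch of that kind of filter, assuming field1 and field2 are the actual field names; the dot operator concatenates strings and like() treats % as a wildcard, which is fine for plain IP strings:

    ... | where NOT like(field1, "%" . field2 . "%")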
Hi all, I am trying to add an HTTP Event Collector and receive events from a GitHub webhook on a Splunk Cloud free trial instance. While adding the GitHub webhook I get the error below: {"text":"Query string authorization is not enabled","code":16}. I checked, and the option to enable Query String Authentication is not available in Splunk Cloud. Please provide your valuable suggestions to get it working. Thanks in advance.
I wrote the search below to create a table in a dashboard that lists the top 20 users who upload the most files to cloud storage services, together with the cloud storage URLs they accessed, and then gets the number of file uploads for each of those 20 users and their accessed URLs. The problem is that the search sometimes shows different results, but when I rerun it, it returns the same result (which is probably the correct one) even though I have not changed anything. Has anyone seen a symptom like this before? Is there anything in my query that could affect the search, or could there be a cache problem? (Time range: last month)

    index=proxy sourcetype="XXX" filter_category="File_Storage/Sharing"
        [ search index=proxy sourcetype="XXX" filter_category="File_Storage/Sharing"
          | eval end_time=strftime(_time, "%Y-%m-%d %H:%M:%S")
          | eval bytes_in=bytes_in/1024/1024/1024
          | eval bytes_in=round(bytes_in, 2)
          | table end_time,user,url,bytes_in
          | sort - bytes_in
          | head 20
          | fields user url ]
    | eval end_time=strftime(_time, "%Y-%m-%d %H:%M:%S")
    | eventstats count(eval(bytes_in>0)) as Number_File_Uploads by user url
    | table end_time,user,src,src_remarks01,url,bytes_in,Number_File_Uploads
    | eval bytes_in=bytes_in/1024/1024/1024
    | eval bytes_in=round(bytes_in, 2)
    | sort - bytes_in
    | head 20
    | rename "end_time" as "Access date and time", "src" as "IP address", "src_remarks01" as "Asset information", "bytes_in" as "BytesIn(GB)"
Hi All, I have a panel "OS" that shows the value of os in a single value visualization. Based on the value of os: if it matches "*windows*", the dashboard should display the "Defender Version" panel and not the "Agent Version" panel; if it is "MAC", "OS X" or "IOS", it should display the "Agent Version" panel and not the "Defender Version" panel. I don't want a dropdown for selecting values from the "OS" panel; the os value itself should determine which panel is shown.

    <form theme="dark">
      <label> ASSET STATUS</label>
      <fieldset submitButton="false" autoRun="true">
        <input type="radio" token="category" searchWhenChanged="true">
          <label>Category</label>
          <choice value="work">Work</choice>
          <choice value="auto">Auto</choice>
          <choice value="server">Server</choice>
          <search>
            <query/>
            <earliest>-24h@h</earliest>
            <latest>now</latest>
          </search>
          <default>work</default>
          <change>
            <condition value="work">
              <set token="Work">"Work"</set>
              <unset token="Auto"></unset>
            </condition>
            <condition value="auto">
              <set token="Auto">"Auto"</set>
              <unset token="Work"></unset>
            </condition>
            <condition value="server">
              <set token="Server">"Server"</set>
              <unset token="Work"></unset>
              <unset token="Auto"></unset>
            </condition>
          </change>
        </input>
        <input type="text" token="src_name" searchWhenChanged="true">
          <label>src_name</label>
          <default>*</default>
        </input>
      </fieldset>
      <row>
        <panel>
          <title>OS</title>
          <single>
            <search>
              <query>| inputlookup $category$_sanity_check_kvstore | fields src_name, os | search src_name IN ($src_name$) | table os</query>
              <earliest>-24h@h</earliest>
              <latest>now</latest>
            </search>
            <option name="colorMode">block</option>
            <option name="drilldown">all</option>
            <option name="rangeColors">["0xdc4e41","0x53a051"]</option>
            <option name="rangeValues">[0]</option>
            <option name="refresh.display">progressbar</option>
            <option name="useColors">1</option>
          </single>
        </panel>
      </row>
      <row>
        <panel depends="$Work$">
          <title>Defender Sig Version</title>
          <single>
            <search>
              <query>| inputlookup $category$_sanity_check_kvstore | fields src_name, defender_sig_version_check | search src_name IN ($src_name$) | table defender_sig_version_check</query>
              <earliest>-24h@h</earliest>
              <latest>now</latest>
            </search>
            <option name="colorMode">block</option>
            <option name="drilldown">none</option>
            <option name="rangeColors">["0xdc4e41","0x53a051"]</option>
            <option name="rangeValues">[0]</option>
            <option name="refresh.display">progressbar</option>
            <option name="useColors">1</option>
          </single>
        </panel>
        <panel>
          <title>Agent Version</title>
          <single>
            <search>
              <query>| inputlookup $category$_sanity_check_kvstore | fields agentVersion_value,base_agentVersion_value, src_name | search src_name IN ($src_name$) | eval edr_mac_check=if(agentVersion_value&gt;=base_agentVersion_value,3,0) | table edr_mac_check</query>
              <earliest>-24h@h</earliest>
              <latest>now</latest>
            </search>
            <option name="colorMode">block</option>
            <option name="drilldown">none</option>
            <option name="rangeColors">["0xdc4e41","0x53a051"]</option>
            <option name="rangeValues">[0]</option>
            <option name="useColors">1</option>
          </single>
        </panel>
      </row>

Thanks in Advance!
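A minimal sketch of the usual pattern for driving panel visibility from a search result rather than from an input: set tokens in the OS search's done handler and make each panel depend on one of them. The token names are placeholders and the match expressions are assumptions about the actual os values, so treat this as a starting point rather than a drop-in fix:

        <search>
          <query>| inputlookup $category$_sanity_check_kvstore | fields src_name, os | search src_name IN ($src_name$) | table os</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
          <done>
            <condition match="match(&quot;$result.os$&quot;, &quot;(?i)windows&quot;)">
              <set token="show_defender">true</set>
              <unset token="show_agent"></unset>
            </condition>
            <condition>
              <set token="show_agent">true</set>
              <unset token="show_defender"></unset>
            </condition>
          </done>
        </search>
        <!-- then use <panel depends="$show_defender$"> for Defender Version and <panel depends="$show_agent$"> for Agent Version -->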
Is it possible to add a border to a pie chart? When the data values are 0, it shows a blank chart. I want to add a border to it so that it is easier to read.
Hi, I created two dashboards, one for pending tickets and one for completed tickets. When pending tickets are completed, the count on the pending dashboard needs to decrease. At the moment, when I close a ticket the completed count increases, but the pending count does not decrease. Any help on this would be appreciated. Thanks & Regards
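A minimal SPL sketch of one common way to handle this, assuming each ticket logs multiple status events and has fields like ticket_id and status (both names are assumptions about the data): count only each ticket's latest status, so a ticket that moves to Completed stops being counted as Pending.

    ... | stats latest(status) as status by ticket_id
        | stats count(eval(status="Pending")) as pending_tickets count(eval(status="Completed")) as completed_tickets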
Hello, I am using a ServiceNow developer instance and I want to integrate Splunk with ServiceNow. I have installed the Splunk Add-on for ServiceNow in Splunk, created an account that authenticates successfully via OAuth 2.0, and successfully collected ServiceNow incidents into Splunk. Now I want to create an incident from Splunk and have it appear in ServiceNow, but I have not been successful. Is there an error in my process? What should I do next? I am eagerly waiting for your reply. Thank you
Can anyone help with these errors? Clicking Configuration brings up this error:

Configuration page failed to load, the server reported internal errors which may indicate you do not have access to this page.
Error: Request failed with status code 500 ERR0002

Then this occurs when I try to configure a new input, selecting either option, O365 Email Groups or O365 email:

Error response received from server: Unexpected error "<class 'splunktaucclib.rest_handler.error.RestError'>" from python handler: "REST Error [500]: Internal Server Error --
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/TA_microsoft_o365_email_add_on_for_splunk/bin/ta_microsoft_o365_email_add_on_for_splunk/aob_py3/splunktaucclib/rest_handler/handler.py", line 124, in wrapper
    for name, data, acl in meth(self, *args, **kwargs):
  File "/opt/splunk/etc/apps/TA_microsoft_o365_email_add_on_for_splunk/bin/ta_microsoft_o365_email_add_on_for_splunk/aob_py3/splunktaucclib/rest_handler/handler.py", line 345, in _format_all_response
    self._encrypt_raw_credentials(cont["entry"])
  File "/opt/splunk/etc/apps/TA_microsoft_o365_email_add_on_for_splunk/bin/ta_microsoft_o365_email_add_on_for_splunk/aob_py3/splunktaucclib/rest_handler/handler.py", line 375, in _encrypt_raw_credentials
    change_list = rest_credentials.decrypt_all(data)
  File "/opt/splunk/etc/apps/TA_microsoft_o365_email_add_on_for_splunk/bin/ta_microsoft_o365_email_add_on_for_splunk/aob_py3/splunktaucclib/rest_handler/credentials.py", line 293, in decrypt_all
    all_passwords = credential_manager._get_all_passwords()
  File "/opt/splunk/etc/apps/TA_microsoft_o365_email_add_on_for_splunk/bin/ta_microsoft_o365_email_add_on_for_splunk/aob_py3/solnlib/utils.py", line 128, in wrapper
    return func(*args, **kwargs)
  File "/opt/splunk/etc/apps/TA_microsoft_o365_email_add_on_for_splunk/bin/ta_microsoft_o365_email_add_on_for_splunk/aob_py3/solnlib/credentials.py", line 281, in _get_all_passwords
    clear_password += field_clear[index]
TypeError: can only concatenate str (not "NoneType") to str
". See splunkd.log/python.log for more details.
Hi, I'm running index-time field extractions for a large TXT report. For this particular regex I'm searching for and capturing 3 fields, and then using the REPEAT_MATCH = true flag to crawl the rest of the TXT file. My goal is to extract the data, but also somehow keep each set of extracted values together and separate from the next set of regex captures. For example:

repeat 0 regex: Title, CCI, FixText
repeat 1 regex: Title, CCI, FixText
repeat 2 regex: Title, CCI, FixText

I need to keep the repeat 0 fields connected, the repeat 1 fields connected, and the repeat 2 fields connected, but also keep the repeat 0 set separate from the repeat 1 set. In this example, I want to ensure that the Title for repeat 0 doesn't end up attached to the CCI in repeat 1.

In my current rendition I get all the data I need, but it lands in giant multivalue fields, where "CCI" might contain 100 items and "FixText" might contain 100 items, and I can't figure out how to divide/expand them so that each group keeps the correctly correlated information. The "FixText" field could include one line or many lines, so I can't easily separate the values from one another after they get grouped.

I would like to note that I'm OK with expanding these at search time as opposed to index time, but I'm thinking it might be easier to reference the fields if they get separated at index time. Maybe I could add a pipe or something to the end of each capture, and then use a delimiter to expand the fields? Any help is appreciated. Thank you

Transforms:

[SCAP_FAIL_INFO]
REGEX = Title\s+\:\s(?<scap_fail_title>V.+)[\s\S]+?NIST\sSP\s800\-53\sRev\s4\:\s(?<scap_cci>.+);[\s\S]+?Fix\sText\s+(?<scap_fix_text>[\S\s]*?)?\nSeverity
LOOKAHEAD = 600000
REPEAT_MATCH = true
WRITE_META = true
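A minimal search-time sketch of the zip-and-expand approach hinted at above, assuming the three multivalue fields arrive as scap_fail_title, scap_cci and scap_fix_text in matching order; the "|" delimiter is an assumption and will break if FixText itself contains that character:

    ... | eval triple = mvzip(mvzip(scap_fail_title, scap_cci, "|"), scap_fix_text, "|")
        | mvexpand triple
        | eval triple = split(triple, "|")
        | eval Title = mvindex(triple, 0), CCI = mvindex(triple, 1), FixText = mvindex(triple, 2)
        | table Title CCI FixText

Each row after mvexpand then carries one correlated Title/CCI/FixText set, which keeps repeat 0 values from mixing with repeat 1 values.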