All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I need to search using input from a CSV and compare the results against the same CSV, which contains two columns, and show the difference between them (account names present and account names absent). EventCode=4768 contains Account_Name in NTID format; EventCode=4769 contains Account_Name in UPN format.

index=<index_name> host=<host_list> EventCode=4768 OR EventCode=4769 [| inputlookup accountname.csv]
| dedup Account_Name
| table Account_Name, Ticket_Encryption_Type, Supplied_Realm_Name, Service_Name, Service_ID

How do I make the results from the above query show the difference? Appreciate the help. Thanks
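One common pattern (a sketch only — it assumes Account_Name is the column name in accountname.csv) is to count matching events per account, then append the lookup itself with a zero count so accounts with no events surface as "absent":

```
index=<index_name> host=<host_list> (EventCode=4768 OR EventCode=4769)
    [| inputlookup accountname.csv | fields Account_Name ]
| stats count AS event_count BY Account_Name
| append
    [| inputlookup accountname.csv | fields Account_Name | eval event_count=0 ]
| stats sum(event_count) AS event_count BY Account_Name
| eval status=if(event_count > 0, "present", "absent")
```

Note that since 4768 carries Account_Name in NTID format and 4769 in UPN format, the values may need normalizing (e.g. with eval/replace) before they will match the lookup consistently.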
Hello everyone, I would like to ask whether the following architecture is feasible to build and would be functional:
- A Windows domain with 200 endpoints, each with a UF installed. The endpoints collect host logs.
- A Heavy Forwarder collects all data from the UFs.
- The same HF acts as an intermediate forwarder and forwards the raw logs it receives to a remote indexer outside the Windows domain.
- The remote indexer is a search peer/deployment client of a search head/deployment server where Splunk ES is installed.
Questions:
1. Is it possible for the Splunk HF to also act as a Deployment Server and manage the UFs on the endpoints?
2. Is an HF a must for collecting data from 200 endpoints and re-forwarding it to the indexer, or can a Splunk UF do the job just as well with a minimal footprint?
3. The HF will not be directly connected to the Splunk License Master (the search head with ES installed). Can I install a license on it and set it as a License Slave?
Thank you in advance. With kind regards, Chris
How do I check the transfer speed from a UF to the indexer? Is it possible to check it by events or by source?
I am trying to use the Splunk Add-on for New Relic v2.2.0, and I keep getting errors. I think the New Relic API has changed slightly, and I tried to edit the modalert Python file, yet I'm still getting errors such as:

02-22-2022 04:52:01.962 -0500 ERROR sendmodalert - action=alerts_to_newrelic STDERR - NameError: name 'basestring' is not defined
02-22-2022 04:52:01.962 -0500 ERROR sendmodalert - action=alerts_to_newrelic STDERR - if isinstance(self.sid, basestring) and 'scheduler' in self.sid:
02-22-2022 04:52:01.962 -0500 ERROR sendmodalert - action=alerts_to_newrelic STDERR - File "/opt/splunk/etc/apps/Splunk_TA_New_Relic/bin/splunk_ta_new_relic/cim_actions.py", line 124, in __init__

We are on Splunk 8.1.7. Thanks! Stephen
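The NameError points at Python 2-only code: `basestring` was removed in Python 3, which Splunk 8.x uses by default, so the add-on's `isinstance(self.sid, basestring)` check fails before the New Relic API is ever reached. A minimal compatibility sketch (the variable name is illustrative; the actual fix would go in the add-on's cim_actions.py):

```python
# `basestring` existed only in Python 2; under Python 3 fall back to `str`.
try:
    string_types = basestring  # noqa: F821  (Python 2)
except NameError:
    string_types = str  # Python 3

sid = "scheduler__admin__search__RMD5abc_at_1645498672"
# The add-on's check `isinstance(self.sid, basestring)` can then become:
print(isinstance(sid, string_types) and "scheduler" in sid)  # True
```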
Hi everyone, there are 10 single-value graphs on my dashboard. I don't want to use a global time range. How can I add a separate time range for each?
Hi everyone, I need help figuring out a way to use my report (table data) in calculations in my dashboard panel. I have a report that runs on a daily basis and calculates the average response time of servers by environment (app names, say ABC, def and xyz). Now I want to use this response time as an input to one of my panel's back-end searches. The report data looks like this:

app name   response time
1) ABC     0.234 sec
2) def     0.113 sec
3) xyz     0.227 sec

I want to use this response time to build a gauge in my dashboard panel. I have added this report to my dashboard panel, which gives a search ref tag, but I don't know how to use it further.
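Since the report already runs on a schedule, one option (a sketch — the owner, app context, saved-search name and field names below are placeholders) is to pull its most recent cached results into the panel's search with loadjob, rather than re-running the search:

```
| loadjob savedsearch="admin:search:Daily Avg Response Time"
| search app_name="ABC"
| table response_time
```

Alternatively, `| savedsearch "Daily Avg Response Time"` re-runs the report's search inline, at the cost of repeating the work each time the panel loads.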
Hello, I have not been able to find any current, useful information about sending logs from HP-UX 11.11 and 11.31 to Splunk. Is it supported at all? Does anyone know if it is possible, and if so, how and for which Splunk version? Thank you
I want to add another title next to "UIP" on the apps bar! Settings --> User Interface --> Navigation Menus -->

<nav search_view="search">
  <view name="search" default='true' />
  <view name="datasets" />
  <view name="reports" />
  <view name="alerts" />
  <view name="dashboards" />
  <collection label="UIP">
    <view name="login_name"/>
    <view name="agent"/>
    <view name="hase_dashboard"/>
    <view name="hase_uip_data_search"/>
  </collection>
  <collection label="CIVR">
    <view name="IVR-CC"/>
    <view name="IVR-CC3"/>
  </collection>
</nav>
I used to use the tokenlinks.js from the Simple XML dashboard examples. I copied the file into appserver/static of my own app, added script="tokenlinks.js", and it all worked. For some reason that no longer works. I have downloaded the Dashboard Examples app, copied the dashboard XML shown in the docs for the Custom Token Links example, and copied the JS shown in the GUI into appserver/static/tokenlinks.js in my own app, but the JS does not seem to get called and the dashboard does not do what it does in the Dashboard Examples app. So, I have a working Dashboard Examples app, but not my own app. I see that the tokenlinks.js in appserver/static for the Dashboard Examples app is now >400K and is very different from the original; it seems to be webpack-related. Any idea what I am doing wrong?
I'm trying to enable SAML SSO for my splunk test instance.  In the "Fully qualified domain name or IP of the load balancer" I given my instance name and I tried to skip the port number  but it taking... See more...
I'm trying to enable SAML SSO for my Splunk test instance. In the "Fully qualified domain name or IP of the load balancer" field I entered my instance name and tried to skip the port number, but it takes "8443" automatically, so my Splunk ACS URL becomes https://<my instance name>:8443/saml/acs. After successful SAML authentication, I land on https://<my instance name>:8443/saml/acs and it says "This site can't be reached". I'm not using any load balancer here; it's a trial version that I'm using for testing purposes. Please suggest how to fix this.
I am on Splunk 8.1 trying to create a dynamic dashboard. I am trying to create a multisearch query whose subsearches are based on the checkboxes the user clicks.

<input type="time" token="field1">
  <label>Time</label>
  <default>
    <earliest>-15m</earliest>
    <latest>now</latest>
  </default>
</input>
<input type="text" token="userinput1">
  <label>User Input 1</label>
</input>
<input type="text" token="userinput2">
  <label>User Input 2</label>
</input>
<input type="checkbox" token="indexesSelected" searchWhenChanged="true">
  <label>Indexes</label>
  <choice value="[search index=index1 $userinput1$ $userinput2$]">Index 1</choice>
  <choice value="[search index=index2 $userinput1$ $userinput2$]">Index 2</choice>
  <default></default>
  <initialValue></initialValue>
  <delimiter> </delimiter>
  <prefix>| multisearch [eval test1="test1"] [eval test2="test2"] </prefix>
</input>

The search part looks like this:

<search>
  <query>$indexesSelected$
| table _time, index, field1, field2, field3, field4
| sort Time
  </query>
  <earliest>$field1.earliest$</earliest>
  <latest>$field1.latest$</latest>
</search>

This works as expected, except that the final query looks like this:

| multisearch [eval test1="test1"] [eval test2="test2"] [search index=index1 $userinput1$ $userinput2$] [search index=index2 $userinput1$ $userinput2$]

How can I make $userinput1$ and $userinput2$ resolve to their token values from the dashboard inputs instead of being passed as literal strings? I have tried to use <change> tags with eval and set based on the <condition> the user selects, but eval does not substitute the token values and inserts the literal strings only.
Something like this:

<change>
  <condition match="like($indexesSelected$,&quot;%index1%&quot;)">
    <eval token="finalQuery">replace($indexesSelected$,"index1", "[search index=index1 $userinput1$ $userinput2$]")</eval>
  </condition>
  <condition match="like($indexesSelected$,&quot;%index2%&quot;)">
    <eval token="finalQuery">replace($indexesSelected$,"index2", "[search index=index2 $userinput1$ $userinput2$]")</eval>
  </condition>
</change>
Hello, sorry for another noob question. Is there a way to set a search result into a token in JS? For example:

<row>
  <panel depends="$panel_show$">
    <single>
      <search>
        <query>|makeresults|eval result=1016</query>
        <done>
          <set token="mytoken">$result.result$</set>
        </done>
      </search>
      <option name="rangeColors">["0x006d9c","0xf8be34","0xf1813f","0xdc4e41"]</option>
      <option name="rangeValues">[100,200,300]</option>
      <option name="underLabel">cool number</option>
      <option name="useColors">1</option>
    </single>
  </panel>
</row>

Is there a way to transfer the "<done> <set token="mytoken">$result.result$</set> </done>" part to JS? Thank you
The input is set to use the file name as the host value. The name of the file that the UF is reading gets changed partway through the file. Which of the following becomes the host name? (1) The name before the change, or (2) the name after the change?
Hi, I am trying to integrate AWS ALB logs using SQS-based S3. However, I am getting the error below. I used the ELB Access Logs decoder and tried different source types.

2022-02-22 02:57:56,763 level=ERROR pid=18045 tid=MainThread logger=splunk_ta_aws.modinputs.sqs_based_s3.handler pos=utils.py:wrapper:72 | datainput="symplistaging1_elb" start_time=1645498672 | message="Data input was interrupted by an unhandled exception."
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunksdc/utils.py", line 70, in wrapper
    return func(*args, **kwargs)
  File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws/modinputs/sqs_based_s3/handler.py", line 668, in run
    decoder = self.create_file_decoder()
  File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws/modinputs/sqs_based_s3/handler.py", line 572, in create_file_decoder
    return factory.create(**vars(args))
  File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws/common/decoder.py", line 164, in create
    return decoder_type(**kwargs)
TypeError: 'NoneType' object is not callable

Any ideas or solutions are highly appreciated. BR, Gayan
It's always mentioned in the docs, but I couldn't find anywhere to download it, including Splunkbase.
I recently upgraded to Splunk 8.2.3 in a test environment. The last step in upgrading our slightly outdated environment is to migrate the KVStore, but I don't understand the directions. On which system do we edit "the server.conf in the $SPLUNK_HOME/etc/system/local/ directory"? Is it the Deployer? The Cluster Master? Any of our search heads? If that file is edited on one system, how is it pushed to the search heads? Also, can we leave storageEngineMigration=true in that file, or does it have to be set back to false when not migrating? Thank you! @https://docs.splunk.com/Documentation/Splunk/8.2.3/Admin/MigrateKVstore
I'm trying to extract a number that may not always be formatted the same way every time. Examples:

OK: Process matching httpd is using 0% CPU
OK: Process matching httpd is using 1.1% CPU
OK: Process matching httpd is using 24.1% CPU

It's the "0%" that is tripping me up. The following works for numbers with a decimal part but not for a percentage that is just "0":

rex "using\s(?<CPU_util_perc>\d+.\d+)\%"

Any help is greatly appreciated.
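Making the decimal part optional fixes this: `\d+(?:\.\d+)?` matches "0", "1.1", and "24.1" alike (the original pattern's unescaped `.` also matches any character, so escaping it is worthwhile). In SPL that would be rex "using\s(?<CPU_util_perc>\d+(?:\.\d+)?)\%"; the same pattern can be checked in Python, since both use PCRE-style syntax:

```python
import re

samples = [
    "OK: Process matching httpd is using 0% CPU",
    "OK: Process matching httpd is using 1.1% CPU",
    "OK: Process matching httpd is using 24.1% CPU",
]

# \d+ matches the integer part; (?:\.\d+)? makes the decimal part optional.
pattern = re.compile(r"using\s(?P<CPU_util_perc>\d+(?:\.\d+)?)%")

values = [pattern.search(s).group("CPU_util_perc") for s in samples]
print(values)  # ['0', '1.1', '24.1']
```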
I want to join services (part of the same index) on a common field and show chosen fields from both searches.

index=test service=serv1 has the fields Name, RecordID, Version
index=test service=serv2 has the fields State, RecordID, Version

I want to combine the two searches by RecordID from serv2 (i.e., to optimize the query, first take RecordID from serv2 and match it against serv1). Note that the field name Version is common to both services, so I want to rename the Version field in serv2 to Version2. The final result should be: Name Version1 Version2.

Equivalent SQL query:
SELECT A.Name, A.Version, B.Version FROM Service1 A, Service2 B WHERE B.RecordID = A.RecordID
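A join-free sketch (field and service names taken from the post; it assumes RecordID pairs the two services) that renames each service's Version and stitches rows together by RecordID with stats, avoiding the subsearch limits of the join command:

```
index=test (service=serv1 OR service=serv2)
| eval Version1=if(service=="serv1", Version, null())
| eval Version2=if(service=="serv2", Version, null())
| stats values(Name) AS Name values(Version1) AS Version1 values(Version2) AS Version2 BY RecordID
| where isnotnull(Version2)
| table Name Version1 Version2
```

The final where clause keeps only RecordIDs that actually appear in serv2, mirroring the SQL query's equi-join driven from Service2.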
Hello, I don't understand why a file coming from a windows based UF does not get indexed properly.  By this I mean that some fields contain newlines that are interpreted by Splunk as event delimiters thus turning a single event into multiple events.  I can index that same file manually or from a folder input on a unix filesystem without issue. I am using Splunk Enterprise 8.0.5 with UF 8.0.5 running on Windows 10 VM.  I am trying to index ServiceNow ticket data contained in an ANSI encoded CSV file. Scenarios that work and do not work: I upload the CSV manually through the "Add Data" wizard specifying the sourcetype and it works I deposit the CSV in a local folder on the same VM Splunk Enterprise is installed, configured with the same sourcetype, and it works I deposit the CSV in a local folder on the same VM the UF is installed, configured with the same sourcetype, and it does not work I have no idea why this last scenario does not work.  Here is a simple diagram outlining the second and third scenarios (the third one doesn't work and is highlighted in red): Here is the file I am trying to index (contains one ticket):     "company","opened_by","number","state","opened_at","short_description","cmdb_ci","description","subcategory","hold_reason","assignment_group","resolved_at","resolved_by","category","u_category","assigned_to","closed_by","priority","sys_updated_by","sys_updated_on","active","business_service","child_incidents","close_notes","close_code","contact_type","sys_created_by","sys_created_on","escalation","incident_state","impact","parent","parent_incident","problem_id","reassignment_count","reopen_count","severity","sys_class_name","urgency","u_steps","closed_at","sys_tags","reopened_by","u_process","u_reference_area","calendar_duration","business_duration" "XXX4-S.A.U.R.O.N","Jane Doe","INC000001","Closed","2021-06-01 08:34:04","Short description","SYS","Dear All, For some reason this data doesn't get indexed incorrectly. 
There are two LF characters between this line and the previous one: THIS ON THE OTHER HAND HAS A SINGLE LF ABOVE Manual upload works, but input from windows does not... Thank you for your help and assitance. Jack","ERP","","ABC-123-ERB-SWD-AB-XXX","2021-06-10 10:19:14","Jack Frost","SOFTWARE","Query","Raja xxxxxx","Jack Frost","3 - Moderate","XXXXXX@DIDI.IT","2021-06-15 23:00:02","false","AWD Services","0","closure confirmed by XXXXX@fox.COM ","Solved (Permanently)","Self-service","XXXXX@fox.COM","2021-06-01 08:34:04","Normal","Closed","3 - Low","","","","1","0","3 - Low","Incident","1 - High","0","2021-06-15 11:00:11","","","DATA","DATA","783910","201600"     here is the props.conf stanza which has been positioned exclusively on the indexer:     [snow_tickets] LINE_BREAKER = ([\r\n])+ DATETIME_CONFIG = NO_BINARY_CHECK = true MAX_EVENTS = 20000 TRUNCATE = 20000 TIME_FORMAT = %Y-%m-%d %H:%M:%S TZ = Europe/Rome category = Structured pulldown_type = 1 disabled = false INDEXED_EXTRACTIONS = csv KV_MODE = SHOULD_LINEMERGE = false TIMESTAMP_FIELDS = sys_updated_on CHARSET = MS-ANSI BREAK_ONLY_BEFORE_DATE =     here is a screenshot of the data coming from the local folder (works): here is what it looks like when it comes from the windows UF: as you can see, it treats a newline as a new event, and does not seem to recognise the sourcetype. Is there any blatantly obvious thing that I've missed?  Any push in the right direction would be great! Thank you and best regards, Andrew
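One likely cause, worth checking: with INDEXED_EXTRACTIONS = csv, the universal forwarder parses the structured file itself and sends already-parsed events, so the props.conf stanza must also exist on the UF — a copy that lives only on the indexer is not consulted for line-breaking that forwarder's data, which matches the symptoms (wrong event boundaries only in the UF scenario). A sketch of the stanza as it might be deployed to the forwarder (the app name is a placeholder; settings are copied from the stanza in the post):

```
# On the UF, e.g. $SPLUNK_HOME/etc/apps/my_inputs_app/local/props.conf
[snow_tickets]
INDEXED_EXTRACTIONS = csv
LINE_BREAKER = ([\r\n])+
SHOULD_LINEMERGE = false
TIME_FORMAT = %Y-%m-%d %H:%M:%S
TIMESTAMP_FIELDS = sys_updated_on
TZ = Europe/Rome
CHARSET = MS-ANSI
```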
Hello, thank you for taking the time to read and consider my question; it's very much appreciated. I'm revamping a legacy Splunk deployment for the mid-size company I work for and have recently deployed IT Essentials Work to monitor the health of both Windows and *nix hosts in our environment. This app has many wonderful features and visualizations, even though some/most are locked behind the ITSI paywall. What I'm wondering (mainly from a security perspective) is whether there are equivalent apps that Splunk (or third parties, or even individuals) have developed to visualize network and authentication data collected from Windows and Unix endpoints. I know network bandwidth is included within the ITE suite, which is terrific, but that doesn't help me identify which processes are linked to remote network connections, or track lateral movement across the network. Do people usually just develop apps internally that take care of this? If that's the case, that's totally fine and I completely understand admins not wanting to share them outside their own organization, but I can't help feeling I'm not the only one in this boat; there must be others with this conundrum as well. As far as I know, this is something that used to be handled rather well by the purpose-built Splunk apps for Windows and *nix systems, but now that these are going to be deprecated this year, I'd like a long-term solution to this problem. If these types of visualizations are typically reserved for EDR/EPP products like CrowdStrike, Cylance, S1, Sophos, etc., I also get that, but I'm not actually sure whether those all have dashboards that let you filter by host, user, process, etc. to identify suspicious remote network connections or authentication attempts across a wide swath of monitored systems. Again, I'd like to reiterate my appreciation for your taking the time to consider my question.
I'm sure there's a simple solution to this that I just have not thought of or stumbled across in my research, but rather than waste another week or two trying to find what everyone else is doing for this I figured I'd just ask the experts myself.  Thanks again!