All Topics
We have recently switched from one proxy to another in our organisation. When we enter the new proxy details into the relevant add-ons (ServiceNow, Cisco Umbrella, etc.), the data feeds stop. The network team tells me we need to use the CA file that they supply. Does anyone know where this needs to be installed in Splunk? I thought in $SPLUNK_HOME/etc/auth/, but I'm not sure how to point the config to it.
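For illustration, here is one possible approach — a hedged sketch that assumes the add-ons are Python-based and honor the requests library's standard REQUESTS_CA_BUNDLE variable (many, but not all, Splunk add-ons do; paths and filenames below are placeholders, and each add-on's own proxy/SSL settings should be checked first):

# Place the CA bundle from the network team under Splunk's auth directory
# (the path is illustrative; any directory readable by the splunk user works):
cp corp-proxy-ca.pem /opt/splunk/etc/auth/corp-proxy-ca.pem
chown splunk:splunk /opt/splunk/etc/auth/corp-proxy-ca.pem

# If the add-on's HTTP calls go through Python requests and do not override
# certificate verification themselves, exporting this in the environment that
# starts Splunk makes requests trust the proxy's CA:
export REQUESTS_CA_BUNDLE=/opt/splunk/etc/auth/corp-proxy-ca.pem
/opt/splunk/bin/splunk restart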
Hello, I would like to ask if there is a way to restore the splunk user password. During the deployment of the UF on a client, a splunk user was created to deploy the UF. Unfortunately, this password is not working anymore. How do I restore the password for this user? What would happen if a new version of UF (9.1.2) were deployed? Would that help create a new user? Thanks in advance.
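If this is the Splunk admin account on the forwarder (rather than the OS-level splunk user), the commonly documented reset procedure is roughly the following sketch (paths assume a default Linux install; verify against the docs for your UF version). Upgrading to 9.1.2 by itself does not reset or recreate credentials.

# Stop the forwarder, move the old password file aside, and seed a new admin password:
/opt/splunkforwarder/bin/splunk stop
mv /opt/splunkforwarder/etc/passwd /opt/splunkforwarder/etc/passwd.bak
cat > /opt/splunkforwarder/etc/system/local/user-seed.conf <<'EOF'
[user_info]
USERNAME = admin
PASSWORD = ChangeMeToAStrongPassword
EOF
/opt/splunkforwarder/bin/splunk start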
Hi there! I would like to find the values of host that were returned by macro 1 but not by macro 2.

Search 1: `macro 1` | fields host
Search 2: `macro 2` | fields host

macro 1 hosts: a, b, c, d
macro 2 hosts: a, b, e, f

Expected result: count = 2, because hosts c and d were not in macro 2. Thanks in advance!
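One possible shape, assuming both macros expand to complete base searches: the subsearch turns macro 2's hosts into exclusion terms via format, and the outer search keeps only the hosts macro 2 never saw (note the usual subsearch result limits apply if the host list is large):

`macro 1`
| fields host
| dedup host
| search NOT [ search `macro 2` | fields host | dedup host | format ]
| stats count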
Hello Splunkers!!

index=messagebus "AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName"="ASR/Hb/*/Entry*" OR "AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName"="ASR/Hb/*/Exit*"
| stats count by "AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName"
| fields - _raw
| fields AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName
| rex field=AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName "(?<location>Aisle\d+)"
| fields - AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName
| strcat "raw" "," location group_name
| stats count by location group_name

[screenshot of the current column chart omitted]

I want to obtain the visualization below. Please guide me on what changes I need to make in my current SPL to obtain it.

[screenshot of the desired visualization omitted]
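Since the screenshots did not survive, this is only a guess at the intent: if the goal is one column per location with a series split per group, chart (rather than stats) produces the split-series table a column chart expects. A sketch of the last two lines replaced:

| rex field=AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName "(?<location>Aisle\d+)"
| chart count over location by group_name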
I am using Splunk 9.0.4 and I need to make a query where I extract data from a main search. I am interested in results from the main search:

stage=it sourcetype=some_type NOT trid="<null>" reqest="POST /as/*/auth *"

But then I need to filter the main-search results using a subsearch that operates on a different data set, using a value from a field in the main search; let's call it trid. trid is a string that might be part of a field called message in the subsearch. There might be more results in the subsearch, but if there is at least one matching result in the subsearch, then the event from the main search stays; if not, it should be excluded. So I am interested only in the results from the main search, and the subsearch is used only to filter out those that do not match.

stage=it sourcetype=some_type NOT trid="<null>" reqest="POST /as/*/auth *"
| fields trid
    [ search stage=it sourcetype=another_type
      | eval matches_found=if(match(message, "ID=PASSLOG_" + trid), 1, 0)
      | stats max(matches_found) as matches_found ]
| where matches_found>0

After a few hours I cannot figure out how to make it work. What is wrong with it? Please advise.
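A hedged sketch of one way to restructure this: subsearches run before the outer search and cannot see outer fields like trid (and SPL concatenates strings with "." rather than "+"). Assuming the trid always appears in message as "ID=PASSLOG_<trid>", you can invert the logic and let the subsearch produce the list of trids to keep (subject to the usual subsearch result limits):

stage=it sourcetype=some_type NOT trid="<null>" reqest="POST /as/*/auth *"
    [ search stage=it sourcetype=another_type "ID=PASSLOG_*"
      | rex field=message "ID=PASSLOG_(?<trid>\S+)"
      | fields trid
      | dedup trid
      | format ]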
Hi all, I have been trying to extract user IDs that contain special characters, but with no luck. For example, let's say a field named uid contains two user IDs: one is "roboticts@gmail.com" and the other is "difficult+1@gmail.com". Now I want to write a query that extracts only the uid values with a + sign in them. Please help with this.
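A minimal sketch, assuming uid is already an extracted field (index and sourcetype are placeholders; the + must be escaped because regex treats a bare + as a quantifier):

index=your_index sourcetype=your_sourcetype
| regex uid="\+"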
Hi, I've built an add-on using Add-on Builder which gathers some data from the user, including an API key (the type of the field is password, so it replaces the API key with asterisks on the input creation page). During the creation of an input I can see that the API key is not encrypted and is passed to the new_input request as plain text in the payload body. It only happens if the API key is valid. Is there any way to remove or hide the API key there?
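Not an authoritative fix, but for context: secrets are stored encrypted only when they go through Splunk's credential store (the storage/passwords endpoint, persisted in passwords.conf), which in Add-on Builder generally means declaring the field as a credential/global-account type rather than a masked text box. A sketch of what the store looks like via REST (app, realm, and secret values are placeholders):

# Create an encrypted credential:
curl -k -u admin:changeme \
     https://localhost:8089/servicesNS/nobody/your_addon/storage/passwords \
     -d name=my_api_key -d password=THE_SECRET -d realm=your_addon_realm

# List stored credentials (the clear_password field is returned only to
# roles with the list_storage_passwords capability):
curl -k -u admin:changeme \
     https://localhost:8089/servicesNS/nobody/your_addon/storage/passwords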
When I use my code, I see this error:

Error in 'where' command: The operator at ',127.542 - 0.001' is invalid.

The problem code is this:

| where time >= $max_value$ - 0.001

When I print max_value in the title, I can see that its value is "315,127.542". I think the problem is the ',' in max_value. How can I remove the ',' from max_value? And if that is not the problem, how else could I solve this?
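One way, a sketch assuming $max_value$ is a dashboard token carrying a comma-formatted number: quote the token, strip the commas, and convert before comparing. (Cleaner still would be to have the search that populates the token emit an unformatted number in the first place.)

| where time >= tonumber(replace("$max_value$", ",", "")) - 0.001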
Hi, I'm trying to create a query which will display events matching the following conditions: 5 or more different destination IPs, one IDS attack name, all within 1 hour. I tried the following:

index=ids
| streamstats count time_window=1h by dest_ip attack_name
| where count (attack_name=1 AND dest_ip>=5)

but it is not accepted by Splunk, so I presume it has to be written differently. Could somebody help me, please?
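A sketch of one common shape for this, using fixed one-hour buckets (a true sliding window would need streamstats with time_window over a distinct-count, which is more involved):

index=ids
| bin _time span=1h
| stats dc(dest_ip) as distinct_dests by _time attack_name
| where distinct_dests >= 5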
I am trying to send data from a client machine with a UF installed to a Heavy Forwarder installed on another machine, but I am getting the errors below.

12-06-2023 10:01:22.626 +0100 INFO  ClientSessionsManager [3779231 TcpChannelThread] - Adding client: ip=10.112.73.20 uts=windows-x64 id=86E862DA-2CDC-4B21-9E37-45DFF4C5EFBE name=86E862DA-2CDC-4B21-9E37-45DFF4C5EFBE
12-06-2023 10:01:22.626 +0100 INFO  ClientSessionsManager [3779231 TcpChannelThread] - ip=10.112.73.20 name=86E862DA-2CDC-4B21-9E37-45DFF4C5EFBE New record for sc=100_IngestAction_AutoGenerated app=splunk_ingest_actions: action=Phonehome result=Ok checksum=0
12-06-2023 10:01:24.551 +0100 INFO  AutoLoadBalancedConnectionStrategy [3778953 TcpOutEloop] - Removing quarantine from idx=3.234.1.140:9997 connid=0
12-06-2023 10:01:24.551 +0100 INFO  AutoLoadBalancedConnectionStrategy [3778953 TcpOutEloop] - Removing quarantine from idx=54.85.90.105:9997 connid=0
12-06-2023 10:01:24.784 +0100 ERROR TcpOutputFd [3778953 TcpOutEloop] - Read error. Connection reset by peer
12-06-2023 10:01:25.028 +0100 ERROR TcpOutputFd [3778953 TcpOutEloop] - Read error. Connection reset by peer
12-06-2023 10:01:28.082 +0100 WARN  TcpOutputProc [3779070 indexerPipe_1] - The TCP output processor has paused the data flow. Forwarding to host_dest=inputs10.align.splunkcloud.com inside output group default-autolb-group from host_src=prdpl2splunk02.aligntech.com has been blocked for blocked_seconds=60. This can stall the data flow towards indexing and other network outputs. Review the receiving system's health in the Splunk Monitoring Console. It is probably not accepting data.
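Since the receiver is resetting the connection, a quick hedged check from the forwarding host is whether the cloud indexer's port 9997 is reachable and completes a TLS handshake — Splunk Cloud inputs expect TLS, and a certificate mismatch (e.g. a missing or stale Splunk Cloud UF credentials app on the sender) commonly surfaces as "Connection reset by peer":

openssl s_client -connect inputs10.align.splunkcloud.com:9997 </dev/null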
Hi, I would like to know if it is possible to filter all the charts in my dashboard by clicking on a portion of another chart in the same dashboard, in order to achieve cross-filter behavior. I am using Dashboard Studio. Thank you. Best regards.
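A sketch of the usual pattern (token and key names here are illustrative; check the Dashboard Studio drilldown docs for the keys available to your chart type): the clicked chart sets a token via a drilldown.setToken event handler, and every other chart's search references that token.

"eventHandlers": [
    {
        "type": "drilldown.setToken",
        "options": {
            "tokens": [
                { "token": "selected_value", "key": "name" }
            ]
        }
    }
]

The other searches then filter on it, for example with | search category="$selected_value$".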
The requirement is to create a time-delta field which holds the time difference between two time fields. Basically, the difference between start_time and receive_time should populate a new field named timediff. I have created the eval logic; can anyone help me turn it into a props config?

index=XXX sourcetype IN(xx:xxx, xxx:xxxxx)
| eval indextime=strftime(_indextime,"%Y-%m-%d %H:%M:%S")
| eval it = strptime(start_time, "%Y/%m/%d %H:%M:%S")
| eval ot = strptime(receive_time, "%Y/%m/%d %H:%M:%S")
| eval diff = tostring((ot - it), "duration")
| table start_time, receive_time, indextime, _time, diff
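A hedged sketch of the equivalent search-time calculated field in props.conf (the stanza names are the placeholders from the search above; note that one EVAL- attribute cannot reference another, so the strptime calls are inlined):

[xx:xxx]
EVAL-timediff = tostring(strptime(receive_time, "%Y/%m/%d %H:%M:%S") - strptime(start_time, "%Y/%m/%d %H:%M:%S"), "duration")

[xxx:xxxxx]
EVAL-timediff = tostring(strptime(receive_time, "%Y/%m/%d %H:%M:%S") - strptime(start_time, "%Y/%m/%d %H:%M:%S"), "duration")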
Hey, I know that such a question has been asked many times, but I still haven't found a relevant answer that works for me. I have a table and I want to color one column based on a different field:

| stats values(interfaceName) as importer
| eval importer_in_csv=if(isnull(max_time),0,1)

I want to color the importer column if importer_in_csv = 0. How do I do it in XML? Thanks!!
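Simple XML cannot natively color column A from column B, so one common workaround (a sketch, not the only way) is to embed the flag into the cell value and key an expression color palette off it; a small JS extension would then be needed if you want to hide the suffix again. In SPL:

| eval importer = importer . "|" . importer_in_csv

And in the table's XML:

<format type="color" field="importer">
  <colorPalette type="expression">if (match(value, "\|0$"), "#DC4E41", "#FFFFFF")</colorPalette>
</format>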
Hey everyone, I have a question about the support Red Hat [ https://www.lenovo.com/de/de/servers-storage/solutions/redhat/ ] provides for integrating Splunk into its ecosystem. Specifically: does Red Hat officially support, or certify compatibility for, running Splunk within its systems? We want a reliable connection between Splunk and Red Hat systems without hitting compatibility issues or unexpected limitations, so knowing Red Hat's official stance is crucial to planning the integration. If anyone in the community has insights or experience integrating Splunk with Red Hat — the official position, challenges encountered, compatibility concerns, or success stories — I'd greatly appreciate hearing about them. Thank you all in advance for your time and contributions.
Hi Splunk gurus, @ITWhisperer, I need your expertise in solving a rather simple issue which is taking me many hours to solve. I'm trying to create a table in Splunk which should display a Grand Total row under the data rows and also display a Completion% under the "Completed" field. I'm able to achieve the grand total using addcoltotals; however, I'm unable to display the completion% under the "Completed" field. Here is how the table should look:

Name          Remaining   Completed
Alice         25          18
Bob           63          42
Claire        10          7
David         45          30
Emma          80          65
Grand Total   223         162
Completion%               42.07%

The percentage calculation is 162/(223+162)*100. I tried using an eval function to calculate the percentage, but it calculates it for each row in a new field. Can you please help me out? Many thanks. Much appreciated!
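One possible sketch, assuming the base search already yields Name, Remaining, and Completed: let addcoltotals build the totals row first, then derive the percentage row from it with appendpipe (the where clause inside the subpipeline keeps only the totals row, which is then rewritten and appended):

| table Name Remaining Completed
| addcoltotals labelfield=Name label="Grand Total"
| appendpipe
    [ where Name="Grand Total"
      | eval Name="Completion%", Completed=round(Completed/(Remaining+Completed)*100, 2)."%"
      | fields Name Completed ]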
Hello friends, I need your help to find matching field values and their total count by comparing two different lookup files.

| inputlookup ABC.csv | fields Firewall_Name | stats count
| inputlookup XYZ.csv | fields Firewall_Hostname | stats count

My goal is to compare the two lookup files, matching the field Firewall_Name against Firewall_Hostname, and get the count of matching values. For example, if the Firewall_Name count in ABC.csv is 1000 and the Firewall_Hostname count in XYZ.csv is 850, then the result should display all matched values with their count, so I can confirm that every firewall in XYZ.csv also appears in ABC.csv and is up and running, with a total matched count of 850.
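A sketch of one way to do this (field and file names taken from the post; the usual append subsearch limits apply to very large lookups): union both lookups under a common field name, flag which file each value came from, and keep only values seen in both.

| inputlookup ABC.csv
| fields Firewall_Name
| rename Firewall_Name as firewall
| eval in_abc=1
| append
    [| inputlookup XYZ.csv
     | fields Firewall_Hostname
     | rename Firewall_Hostname as firewall
     | eval in_xyz=1]
| stats max(in_abc) as in_abc, max(in_xyz) as in_xyz by firewall
| where in_abc=1 AND in_xyz=1
| stats count as matched_count, values(firewall) as matched_firewalls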
Hello! Still very new to Splunk, so hoping to get some clarification. My dashboard currently uses a post-process search as its base and filters data from there. On my dashboard objects, I have a <link></link> which works fine until I add an eval strftime to convert the time to something human-readable. Running the same search manually, with the eval, works fine; however, the link opens a blank search. Removing the eval statement makes the link work again.

Link:
<link target="_blank">
search?q=| inputlookup io_vuln_data_lookup where $severity$
| search last_found &gt;= "$info_min_time$" AND last_found &lt;= "$info_max_time$"
| eval last_found = strftime(last_found, "%c")
| table dns_name, last_found
| where lower(state)!="fixed"
</link>

I was hoping to do this conversion only for a single dashboard object, so I didn't want to convert the entire lookup. It would be amazing if I could get this search to work. Thanks!
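A hedged guess at the cause: the q= value inside a <link> is a URL, so the literal % in "%c" is parsed as the start of a percent-escape and corrupts the query. Encoding the % as %25 usually survives the round trip; only the eval line needs to change:

| eval last_found = strftime(last_found, "%25c")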
Hi brains trust, I'm trying to find the location of a CSV file that was a file input back in 2019, but the file input (Files & directories) has since been removed from the HF. Is there a way to search for the file path? The only info I have is the index and the source file name, but I need the details of the old file input to see if the file in question still exists in that location.

index=nessus source="2019_04_17_CRIT_HIGH.csv"

Thanks in advance!
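A sketch of two things to try, assuming the 2019 events are still within retention. Monitor inputs normally record the full file path as source, so listing the distinct source values may reveal it:

| metadata type=sources index=nessus
| search source="*2019_04_17_CRIT_HIGH*"

Or search the raw events over All Time and group by source and host:

index=nessus source="*2019_04_17_CRIT_HIGH.csv" earliest=0
| stats count by source host

Caveat: the bare filename in your existing search suggests source may have been overridden at input time, in which case the original path is not recoverable from the indexed data.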
I wrote the description of a saved search in Korean. When the search runs and is recorded in scheduler.log, the Korean characters are garbled. It worked fine in version 8.2, but the problem occurs in 9.0.7. What should I do?

[screenshot of the 8.2.3.2 version log entry omitted]
[screenshot of the 9.0.7 version log entry omitted]
Using the following search strangely doesn't return the same result as it does via Postman, a browser, etc. Essentially, we've got a list of IPs joined together that I'm attempting to pass to the Shodan API, whose "net:" search filter supports it. The list of IPs looks like this: "1.2.3.4,1.1.1.1,8.8.8.8", etc. (Yes, the API key is included in the curl but has been removed for the sake of this question.)

index=test_index
| dedup src_ip
| stats values(src_ip) as ip_list
| eval ip_list = mvjoin(ip_list, ",")
| curl method=get uri="https://api.shodan.io/shodan/host/search?query=net:".ip_list."&fields=ip_str,port,timestamp,vulns&minify=false&language=en"

However, we get 0 matches when the response body is returned:

{ "matches": [], "total": 0 }

Example query that returns a response:

api.shodan.io/shodan/host/search?query=net:1.1.1.1,8.8.8.8,9.9.9.9&fields=ip_str,port,timestamp,vulns&minify=false&language=en

Is the literal string expression (".ip_list.") not supported by TA-WebTools? Thanks!
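A hedged sketch of one workaround: arguments to custom commands in SPL are taken literally rather than evaluated, so the "." concatenation never runs and the API receives the literal string. Build the full URI into a field with eval first and, if your TA-Webtools version supports reading the URI from a field (the urifield option name below is an assumption; check the TA's docs), pass that field instead:

index=test_index
| dedup src_ip
| stats values(src_ip) as ip_list
| eval ip_list = mvjoin(ip_list, ",")
| eval uri = "https://api.shodan.io/shodan/host/search?query=net:" . ip_list . "&fields=ip_str,port,timestamp,vulns&minify=false&language=en"
| curl method=get urifield=uri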