All Posts

Hi @judywatson , Splunk defines its compatibility at the kernel level, not at the Operating System level: Splunk 9.1.2 is certified on Linux kernel >= 3. In my experience, I have never had issues running Splunk on Red Hat. You can find more details at https://docs.splunk.com/Documentation/Splunk/9.1.2/Installation/Systemrequirements Pay attention to the considerations and configurations required on the Operating System (ulimit and THP). Ciao. Giuseppe
Hi, I'm trying to create a query which will display events matching the following conditions: 5 or more different destination IPs, one IDS attack name, all within 1h. I tried the following: index=ids | streamstats count time_window=1h by dest_ip attack_name | where count (attack_name=1 AND dest_ip>=5) but it is not accepted by Splunk, so I presume it has to be written differently. Could somebody help me please?
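One way the search above could be restructured (a sketch only: it assumes attack_name and dest_ip are extracted fields, and it buckets events into fixed 1-hour windows with bin rather than using a sliding streamstats window):

```
index=ids
| bin _time span=1h
| stats dc(dest_ip) AS distinct_dest_count BY _time attack_name
| where distinct_dest_count >= 5
```

The dc() aggregation counts distinct destination IPs per attack name per hour, which matches the "5 or more different destination IPs, one attack name, in 1h" requirement more directly than a running streamstats count.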
Hi @Mr_Adate , sorry I forgot a field, please try this:
| inputlookup ABC.csv
| eval lookup="ABC.csv"
| fields Firewall_Name lookup
| append
    [ | inputlookup XYZ.csv
      | eval lookup="XYZ.csv"
      | rename Firewall_Hostname AS Firewall_Name
      | fields Firewall_Name lookup ]
| chart count OVER lookup BY Firewall_Name
Ciao. Giuseppe
I had the same issue with sending emails. I was able to resolve it by replacing the sendemail.py file in $SPLUNK_HOME\etc\apps\search\bin with an older version of sendemail.py. I still have the issue with endless loading on settings pages. How were you able to resolve this?
@ITWhisperer Thank you for the quick reply. It worked!
Hi gcusello, thank you very much for your prompt reply, I appreciate it. I tried your code but I think something is wrong with the last line; I am getting 0 results. Can you please check it again?
In SimpleXML, certain characters must be entered as HTML entities (specifically double quotes, greater than, less than, and so on). More generally, GET URLs are best encoded without special characters. So, replace | eval last_found = strftime(last_found, "%c") with %3D%20strftime(last_found%2C%20%22%25c%22) . Meanwhile, I do not know how the cited URL could have "worked fine till" now. If you are entering these in the source editor, you can try replacing double quotes with &quot;, i.e., | eval last_found = strftime(last_found, &quot;%c&quot;) I recommend using the visual editor, however. There, you can enter SPL as SPL.
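As a concrete sketch of the encoding advice above (the index name and link target here are made up for illustration), a Simple XML drilldown carrying the eval fragment URL-encoded might look like:

```xml
<drilldown>
  <!-- hypothetical example: the SPL
       "search index=main | eval last_found = strftime(last_found, "%c")"
       URL-encoded so the quotes, pipe, equals and percent signs survive the GET URL -->
  <link target="_blank">search?q=search%20index%3Dmain%20%7C%20eval%20last_found%20%3D%20strftime(last_found%2C%20%22%25c%22)</link>
</drilldown>
```

Here %3D, %7C, %2C, %22 and %25 stand for =, |, comma, double quote and % respectively.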
I am trying to send data from a client machine with a Universal Forwarder (UF) installed to a Heavy Forwarder installed on another machine, but I am getting the below error.
12-06-2023 10:01:22.626 +0100 INFO  ClientSessionsManager [3779231 TcpChannelThread] - Adding client: ip=10.112.73.20 uts=windows-x64 id=86E862DA-2CDC-4B21-9E37-45DFF4C5EFBE name=86E862DA-2CDC-4B21-9E37-45DFF4C5EFBE
12-06-2023 10:01:22.626 +0100 INFO  ClientSessionsManager [3779231 TcpChannelThread] - ip=10.112.73.20 name=86E862DA-2CDC-4B21-9E37-45DFF4C5EFBE New record for sc=100_IngestAction_AutoGenerated app=splunk_ingest_actions: action=Phonehome result=Ok checksum=0
12-06-2023 10:01:24.551 +0100 INFO  AutoLoadBalancedConnectionStrategy [3778953 TcpOutEloop] - Removing quarantine from idx=3.234.1.140:9997 connid=0
12-06-2023 10:01:24.551 +0100 INFO  AutoLoadBalancedConnectionStrategy [3778953 TcpOutEloop] - Removing quarantine from idx=54.85.90.105:9997 connid=0
12-06-2023 10:01:24.784 +0100 ERROR TcpOutputFd [3778953 TcpOutEloop] - Read error. Connection reset by peer
12-06-2023 10:01:25.028 +0100 ERROR TcpOutputFd [3778953 TcpOutEloop] - Read error. Connection reset by peer
12-06-2023 10:01:28.082 +0100 WARN  TcpOutputProc [3779070 indexerPipe_1] - The TCP output processor has paused the data flow. Forwarding to host_dest=inputs10.align.splunkcloud.com inside output group default-autolb-group from host_src=prdpl2splunk02.aligntech.com has been blocked for blocked_seconds=60. This can stall the data flow towards indexing and other network outputs. Review the receiving system's health in the Splunk Monitoring Console. It is probably not accepting data.
this is my search:
| search index=****** interfaceName=*
| stats values(interfaceName) as importer
| join type=left
    [| search index=****** Code=*
        [| inputlookup importers.csv
         | table interfaceName]
| lookup importers.csv interfaceName OUTPUTNEW system timeRange
| where like(system, "%")
| stats values(system) as reality values(timeRange) as max_time
| eval importer_in_csv=if(isnull(max_time),0,1)
I want to color the importer column if importer_in_csv = 0. How do I do it in XML? thanks!!
This is what I want to achieve (see attached 3.jpg): the time range from the time range picker; in this case, the day 05.12.2023.
Yes, if I search for any field and value, the events are filtered based on my search, but the fields are not being extracted.
That's... weird. If you search, for example, for UserName=*, you get events, but those events don't show the UserName field?
You can do
| timechart span=1h count
| where count>0
| timewrap 1day
to filter out "empty" results.
Hi, I would like to know if it is possible to filter all the graphs in my dashboard by clicking on a portion of another graph in the same dashboard, in order to achieve cross-filter behavior. I am using Dashboard Studio. Thank you. Best regards
The requirement is to create a time delta field which holds the time difference between the 2 time fields. Basically, the difference between start time & receive time should populate a new field named timediff. I have created the eval conditions; can anyone help me with a props config based on them?
index=XXX sourcetype IN(xx:xxx, xxx:xxxxx)
| eval indextime=strftime(_indextime,"%Y-%m-%d %H:%M:%S")
| eval it = strptime(start_time, "%Y/%m/%d %H:%M:%S")
| eval ot = strptime(receive_time, "%Y/%m/%d %H:%M:%S")
| eval diff = tostring((ot - it), "duration")
| table start_time, receive_time, indextime, _time, diff
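One way the eval above could be expressed as a search-time calculated field in props.conf (a sketch: the sourcetype stanza name is masked in the question, so "xx:xxx" is a placeholder; and since calculated fields cannot reference other calculated fields, the two strptime calls are inlined into a single expression):

```
# props.conf (on the search head) — stanza name is a placeholder from the question
[xx:xxx]
EVAL-timediff = tostring(strptime(receive_time, "%Y/%m/%d %H:%M:%S") - strptime(start_time, "%Y/%m/%d %H:%M:%S"), "duration")
```

With this in place, timediff appears on matching events at search time without any | eval in the search itself.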
This can't be done with this search because there is no field called max_time - please clarify your search
| appendpipe
    [| eval Completed=if(Name="Grand Total:",100*Completed/(Completed + Remaining), null())
     | eval Remaining=null()
     | eval Name=if(Name="Grand Total:","Completion%",null())
     | where isnotnull(Name)]
Hey, I know that such a question has been asked many times, but I still haven't found a relevant answer that works for me. I have a table and I want to color a column based on a different variable:
| stats values(interfaceName) as importer
| eval importer_in_csv=if(isnull(max_time),0,1)
I want to color the importer column if importer_in_csv = 0. How do I do it in XML? thanks!!
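For reference, Simple XML's built-in table color formatting keys off a cell's own value, so the simplest sketch is to color the flag column itself (hex colors below are arbitrary choices); coloring the importer column based on a different field generally requires a custom JavaScript table renderer instead:

```xml
<table>
  <search><query>...the search above...</query></search>
  <format type="color" field="importer_in_csv">
    <colorPalette type="map">{"0":#DC4E41,"1":#53A051}</colorPalette>
  </format>
</table>
```

The map palette assigns a fixed color per exact cell value, which works here because importer_in_csv is only ever 0 or 1.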
1. There is a time range picker object on the dashboard. If I select any range, e.g. the whole day 05.12.2023, this is the time range I would like to have on the x-axis of the area chart.
2. In this case, with
| eval _time=case(row=0,strptime(StartTime,"%Y-%m-%d %H:%M:%S"),row=1,strptime(StartTime,"%Y-%m-%d %H:%M:%S"),row=2,strptime(EndTime,"%Y-%m-%d %H:%M:%S"),row=3,strptime(EndTime,"%Y-%m-%d %H:%M:%S"))
| eval value=case(row=0,0,row=1,1,row=2,1,row=3,0)
the time range of the x-axis in the area chart runs only from the first StartTime (05:30) to the last EndTime (13:30).
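One common trick to stretch the chart to the picker's full range is to append zero-valued boundary rows at the picker's earliest and latest times (a sketch: it assumes the appended subsearches inherit the dashboard's time range, so addinfo reports the picker's bounds as info_min_time / info_max_time; note info_min_time is 0 when "All time" is selected):

```
| append
    [| makeresults
     | addinfo
     | eval _time=info_min_time, value=0
     | table _time value]
| append
    [| makeresults
     | addinfo
     | eval _time=info_max_time, value=0
     | table _time value]
```

With one row pinned at each end of the selected range, the area chart's x-axis spans the whole picker window instead of just the data's first and last timestamps.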
Hey everyone, I'm here with a query regarding the support provided by Red Hat [ https://www.lenovo.com/de/de/servers-storage/solutions/redhat/ ] for integrating Splunk into its ecosystem. Specifically, I'm seeking clarification on whether Red Hat officially supports or provides compatibility for running Splunk, a popular data analytics and monitoring platform, within its systems. We want to establish a smooth and reliable connection between Splunk and Red Hat systems without encountering compatibility issues or unexpected limitations. If anyone in the community has insights or experiences related to integrating Splunk with Red Hat (Red Hat's official stance, potential roadblocks, compatibility concerns, or success stories), I'd greatly appreciate hearing about them. Thank you all in advance for your time and contributions.