All Topics

I have noticed this error coming up often and have searched everywhere to find out what it is and whether there is a fix for it:

09-29-2021 05:41:07.533 -0500 ERROR ScopedAliveProcessToken [2371353 BundleLookupIndexingExecutorWorker-0] - Failed to remove alive token file='/opt/splunk/var/run/searchpeers/DFADFAB9-11E6-4297-97DF-9227BFECA4AE-1632912055/apps/live_edge/lookups/LiveEdge_FatalLogs.csv_1632911454.870215.cs.index.lock'. No such file or directory
source = /opt/splunk/var/log/splunk/splunkd.log    sourcetype = splunkd

How do I locate the missing .index.lock file and fix such issues, please?
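If it helps to gauge how often this happens and on which hosts, a minimal sketch of an internal-log search (index=_internal and sourcetype=splunkd are Splunk's standard internal logging locations):

    index=_internal sourcetype=splunkd log_level=ERROR ScopedAliveProcessToken
    | stats count max(_time) as last_seen by host
    | convert ctime(last_seen)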
I run a search head cluster with Splunk Enterprise. Typically I update apps via the back-end CLI, but I am wondering if I can update via the GUI. My question is: does GUI >> Manage Apps >> Find App >> Click "Update App to #.##" update the app on all of my search heads, or only on the one I am viewing? I've always been told to go through the CLI, so I have never attempted this. Thanks.
I am trying to figure out how to create fields that show the exact count of digits and letters in a result. For example, if I have the result 11tt3yyy1, I want the fields to show me that there are 4 digits and 5 letters. Is there a way to do this? I have tried everything I can think of.
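A minimal sketch using eval's replace() and len() functions; makeresults and the field name value just stand in for your real search and field:

    | makeresults
    | eval value="11tt3yyy1"
    | eval digit_count=len(replace(value, "[^0-9]", ""))
    | eval letter_count=len(replace(value, "[^a-zA-Z]", ""))
    | table value digit_count letter_count

Stripping everything that is not a digit and measuring what is left gives the digit count, and likewise for letters.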
Dear all, kindly help me: all my Splunk forwarders show as missing, and it says they last connected to the indexers on 02/05/2021. What causes this? How can I fix it? Thank you in advance!
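One way to check from the indexer side when each forwarder last connected is Splunk's internal metrics log; a sketch (group=tcpin_connections and hostname are standard fields on those events):

    index=_internal sourcetype=splunkd group=tcpin_connections
    | stats max(_time) as last_connected by hostname
    | convert ctime(last_connected)

Forwarders whose last_connected time is months old usually point at a network, certificate, or stopped-service problem on those hosts.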
Hello everyone, I hope you are all doing well. I have been tasked with updating Splunk Enterprise to version 8.2.1 and the forwarders to 8.1.4. Does anyone know if this upgrade is going to affect compatibility for legacy systems? I am worried that RHEL 6 systems running 7.x forwarders will have issues. Just wondering if anyone has had this problem at all. Thank you!
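To inventory which forwarder versions and operating systems are currently connecting before you upgrade, a hedged sketch against the internal metrics (version, os, and hostname are fields Splunk records on tcpin_connections events):

    index=_internal sourcetype=splunkd group=tcpin_connections fwdType=*
    | stats latest(version) as splunk_version latest(os) as os by hostname
    | sort splunk_version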
Hello all, can anyone help me with this event ingestion problem in Splunk?

Sample data:

122.0.0.2 NOT_AVAILABLE abc Agent= 2021-09-27 11:15:39 5648 WARN xyz NOT_AVAILABLE NOT_AVAILABLE NOT_AVAILABLE NOT_AVAILABLE
2021-09-27 11:16:08 5432 DEBUG Field: xyz - value: ID - unformatted value: vvcsa - formatted value: abcsc - returnType: - boost: 1 - append: False

This should be ingested as two events, each with its respective date/time; a line-breaking sketch follows.
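A minimal props.conf sketch, assuming each record arrives on its own line and starts either with an IPv4 address or with a timestamp like 2021-09-27 11:15:39 (the sourcetype name is hypothetical):

    [my_custom_sourcetype]
    SHOULD_LINEMERGE = false
    # break on newlines followed by a timestamp or an IPv4 address
    LINE_BREAKER = ([\r\n]+)(?=\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}|\d{1,3}(?:\.\d{1,3}){3}\s)
    TIME_FORMAT = %Y-%m-%d %H:%M:%S
    # the first event's timestamp sits after a prefix, so look deeper into the event
    MAX_TIMESTAMP_LOOKAHEAD = 100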
I am trying to upgrade many UFs and HFs to 8.2.2. Are there any issues to watch for? Also, what should the order be: should the Splunk servers (instances) be upgraded first or last? Thanks a million.
I am new to Splunk and have done some fundamentals courses to understand the platform. I have a question and would like to know if this is possible: I want to monitor a Linux server (CPU usage, disk usage, RAM usage, and network metrics) with Splunk. I know there are a lot of apps available on Splunkbase, but I want to know if there is a way to accomplish this with Splunk alone, without any apps from Splunkbase.
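One app-free approach is a scripted input on the forwarder that shells out to standard Linux tools; a minimal inputs.conf sketch (the script path, interval, sourcetype, and index are all hypothetical, and cpu_metrics.sh would be a script you write yourself, e.g. wrapping vmstat or reading /proc):

    [script://$SPLUNK_HOME/etc/apps/search/bin/cpu_metrics.sh]
    interval = 60
    sourcetype = linux:metrics
    index = os_metrics
    disabled = false

The trade-off versus the Splunkbase TAs is that you also have to write your own field extractions and dashboards for whatever the script emits.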
Hi,

I have a table created by:

    eval Actor=actor
    | eval "Total Time (max/avg/p50/p99)"=maxT + ", " + avgT + ", " + p50T + ", " + p99T
    | eval "Thread Execution Time (max/avg/p50/p99)"=maxE + ", " + avgE + ", " + p50E + ", " + p99E
    | eval "Time On Queue (max/avg/p50/p99)"=maxOnQ + ", " + avgOnQ + ", " + p50OnQ + ", " + p99OnQ
    | eval "Queue Depth (max/avg/p50/p99)"=maxqUsed + ", " + avgqUsed + ", " + p50qUsed + ", " + p99qUsed
    | eval "TPS (max/avg/p50/p99)"=maxTPS + ", " + avgTPS + ", " + p50TPS + ", " + p99TPS
    | table Actor, "Total Time (max/avg/p50/p99)", "Thread Execution Time (max/avg/p50/p99)", "Time On Queue (max/avg/p50/p99)", "Queue Depth (max/avg/p50/p99)", "TPS (max/avg/p50/p99)"

which produces a table with one row per Actor and those five comma-separated columns.

I want to change the color of an entire cell based on the max value: say, if the max value is greater than 10000, color the cell red, else some other color.

I've tried the following: https://community.splunk.com/t5/Dashboards-Visualizations/change-the-color-of-row-based-on-cell-value-in-splunk-without/m-p/525075

But I can't seem to get it to work with numbers. Any help is appreciated, thanks!
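A hedged idea: because each cell holds a comma-separated string rather than a number, a numeric test like value > 10000 in a color palette never fires; a regex on the leading max value can approximate it. A Simple XML format sketch for one column (if() and match() are expression functions documented for color palettes; five-or-more leading digits approximates "max >= 10000"):

    <format type="color" field="Total Time (max/avg/p50/p99)">
      <colorPalette type="expression">if(match(value, "^\d{5,}"), "#DC4E41", "#53A051")</colorPalette>
    </format>

The same block, repeated per column, would color the other cells the same way.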
Hi all, I am using Splunk again after a while and have lost touch with SPL; please help me with the below. I have about 40 fields to extract with an SPL query, and I can get all the required fields from the interesting-fields list. The issue I am facing is that I am getting duplicate records in my result set (possibly due to the multiple sourcetypes I am using in the query). I am wondering what the correct way is to write the SPL so that all the records I retrieve are unique. I don't think writing dedup across all 40 fields is a good idea. Also, if I use the stats function, do I have to write values(empno) as empno, values(empstartdate) as startdate, and so on for all 40 fields? (Assume my data set holds employee details, as an example.) Thanks in advance!
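A hedged sketch: stats accepts wildcards, so you do not have to spell out all 40 fields; the index, sourcetypes, and the empno key below are placeholders taken from your example:

    index=hr_data sourcetype IN (sourcetype_a, sourcetype_b)
    | stats values(*) as * by empno

This collapses the duplicates to one row per employee while keeping every other field, at the cost of turning fields with conflicting values into multivalue fields.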
After failing over from the active cluster master to the redundant node (which holds the same configuration), 15 buckets now report:

slave bucket=XXXXXX has unexpected mask=11 expected=0

This results in the search factor not being met for the corresponding indexes. I can see the cluster master running the CMChangeMasksJob at regular intervals, but it looks to me like it just can't handle those 15 buckets. I am looking for any hints on how to tackle this. First and foremost: what is a bucket mask? Are my buckets corrupted? Can I try to update the mask manually?
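To at least enumerate the affected buckets from the cluster master, a hedged sketch against the clustering REST endpoint (the title filter is a placeholder for your bucket IDs):

    | rest splunk_server=local /services/cluster/master/buckets
    | search title="*XXXXXX*"
    | table title *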
Hi there, I have a problem splitting transactions on request data while using a custom expression on HttpRequest. My application is .NET Core running in an ECS task. The custom expression described here (https://docs.appdynamics.com/21.9/en/application-monitoring/configure-instrumentation/transaction-detection-rules/uri-based-entry-points) does not work: with "${Url.ToString().Split(Char[]/-).[2]}-${UserAgent}", no transactions get instrumented at all, while "${Url.ToString()}" merely returns "Microsoft.AspNetCore.Http.DefaultHttpRequest". Any ideas? Thank you.
Hi, I've got a lookup with a number of records, and not all of them have every column populated. Is there a way to append only those columns which are not empty? Something similar to:

    | lookup mylookup lookup_key OUTPUTNEW <list of columns to append>
    | <some SPL here to hide columns which are empty>

I'd be grateful for any tips. I was experimenting with foreach but with no results. Regards
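A hedged foreach sketch that turns empty lookup results into true nulls (col1, col2, and col3 stand in for your output columns):

    | lookup mylookup lookup_key OUTPUTNEW col1 col2 col3
    | foreach col1 col2 col3 [ eval <<FIELD>> = if('<<FIELD>>'=="", null(), '<<FIELD>>') ]

Once a value is a true null rather than an empty string, the field no longer exists on that event, so a wildcard table or fields clause skips columns that came back empty everywhere.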
Is there any way to transfer log files using the Universal Forwarder? I have to use a Heavy Forwarder to extract fields from complicated log text, so I need to send the logs as whole files from the machines that generate them to the Heavy Forwarder. If this is possible, could you tell me how, and which directory I should check on the Heavy Forwarder machine? The setup is shown in the attached photo.
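For what it's worth, the usual pattern is not to copy whole files but to have the UF tail them and stream the contents to the HF, which then does the parsing and field extraction; a minimal sketch of the UF side (the monitored path, sourcetype, and HF address are hypothetical):

    # inputs.conf on the universal forwarder
    [monitor:///var/log/myapp/]
    sourcetype = myapp_raw

    # outputs.conf on the universal forwarder
    [tcpout]
    defaultGroup = hf_group

    [tcpout:hf_group]
    server = hf.example.com:9997

In this setup the data arrives at the HF over port 9997 rather than landing in a directory, so there is no file path to check there; the HF applies its props.conf/transforms.conf and forwards onward.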
Hello all, I have a search query that performs lookups against a CSV file and outputs only those hosts that are in the CSV file. The CSV file has four columns; note that the "IP Address" column has a white space in its name. I have verified that the following command correctly displays all hosts with their IPs in a table:

    | inputlookup linux_servers.csv
    | table host "IP Address"

Now, if I put the same thing in a tstats command, it shows no results. Any ideas why it does not accept "IP Address" even though I have wrapped it in double quotes?

    | tstats max(_time) as lastSeen_epoch WHERE index=linux [| inputlookup linux_servers.csv | table host "IP Address" ] by host

The following search works fine if I take out "IP Address"; it displays the table with the host column:

    | tstats max(_time) as lastSeen_epoch WHERE index=linux [| inputlookup linux_servers.csv | table host ] by host
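A hedged explanation and sketch: the subsearch expands into host="..." "IP Address"="..." filter pairs, and tstats can only filter on indexed fields; "IP Address" is almost certainly a lookup-only field, so every clause fails to match. One workaround is to filter on host alone and join the IP back in afterwards:

    | tstats max(_time) as lastSeen_epoch WHERE index=linux [| inputlookup linux_servers.csv | fields host ] by host
    | lookup linux_servers.csv host OUTPUT "IP Address"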
Hi all, I have logs in Splunk, and I need to create field values from them and build a table of those values. Example:

Caused by: org.apache.kafka.connect.errors.ConnectException: Failed to start new JMS session connection 1: JMSWMQ2013: The security authentication was not valid that was supplied for queue manager 'EVT302' with connection mode 'Client' and host name '10.37.84.12,10.37.100.13(1442)'.

The above is an example log; I need to extract the value after "Caused by" as a description, plus the queue manager name and the host name. Can anyone help me with this? Thanks in advance.
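A minimal rex sketch keyed to that single sample (the field names are mine, and the pattern assumes the queue manager and host name are always single-quoted):

    | rex "Caused by:\s+(?<description>.+?) for queue manager '(?<queue_manager>[^']+)'.+?host name '(?<host_name>[^']+)'"
    | table description queue_manager host_name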
I would like to install the github3 module for use in a Phantom custom function, using pip. How do I do it?
Hi Splunkers! I hope you are all doing well.

My problem is that the COLD volume filled up. (My indexes.conf and the df output were attached as screenshots; the COLD volume's filesystem is xfs.)

Do you know whether the total maxVolumeDataSizeMB of the COLD and splunk_summaries volumes together must not exceed the total disk space, or whether setting just the COLD volume is enough because the splunk_summaries volume is part of it? In other words, does Splunk treat the sum of volume:COLD and volume:_splunk_summaries as the total space available for storing buckets, or only the cap configured on volume:COLD?

Thanks in advance for any advice.

PS: I know Splunk recommends that summaries be stored on the HOT volume!
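For what it's worth, each volume's maxVolumeDataSizeMB is enforced independently; Splunk does not know when two volumes share a filesystem, so when they do, their caps should sum to less than its capacity. A hedged indexes.conf sketch (paths and sizes are placeholders):

    [volume:COLD]
    path = /data/cold
    # cap for cold buckets only
    maxVolumeDataSizeMB = 3500000

    [volume:_splunk_summaries]
    path = /data/cold/summaries
    # enforced separately: this cap plus COLD's must stay under the filesystem size
    maxVolumeDataSizeMB = 400000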
I need assistance configuring and forwarding McAfee DLP logs to Splunk. I already tried to send the logs to Splunk on port 8089, but the logs are encrypted. I intend to forward the logs to Splunk on port 6514, but the port is not responding. Can anyone help us with this? Thank you.
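Two hedged observations and a sketch: 8089 is Splunk's management (REST) port, which speaks TLS and is not a data input, which would explain the "encrypted" traffic; and 6514 will only respond once an input is defined to listen on it. A minimal inputs.conf sketch for a TLS syslog listener (the sourcetype, index, and certificate path are placeholders):

    [tcp-ssl://6514]
    sourcetype = mcafee:dlp
    index = mcafee

    [SSL]
    serverCert = $SPLUNK_HOME/etc/auth/server.pem

For plain (non-TLS) syslog, a [tcp://514] or [udp://514] stanza would be the equivalent.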