Hi, I would like to group events in a timeline as a count until a different event occurs. Example: so basically I want to achieve the following:
A user account was locked out (count 13)
A process has exited (count 1)
A new process has been created (count 1)
Permissions on an object were changed (count 2)
A process has exited (count 1)
And so on.
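A minimal streamstats sketch, assuming the event description lives in a field such as EventDescription and that the index/sourcetype below are placeholders, not names from the post:

index=wineventlog sourcetype=WinEventLog:Security
| sort 0 _time
| streamstats current=f last(EventDescription) as prev_event
| eval boundary=if(isnull(prev_event) OR EventDescription!=prev_event, 1, 0)
| streamstats sum(boundary) as group_id
| stats earliest(_time) as start_time count by group_id EventDescription
| sort start_time

Each time the description changes, boundary flips to 1, so group_id increments and the final stats returns one row per consecutive run of identical events.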
I'm looking for a Splunk query to filter out an event if the "Attachment" field contains only the extensions .txt, .html, .jpg, or .png. Only when the field has nothing but those extensions should the event be filtered out; if any other file extension appears alongside them, we need to keep that event. Below are the Attachment field values from some of the events. The event enclosed in red should be filtered out, as its "Attachment" field has only ".txt", ".html", and ".png". The other two events have ".txt", ".html", ".docx" and ".txt", ".ics", so those two should not be filtered out.
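A minimal sketch, assuming Attachment is a single field listing one or more file names and that extension matching should be case-insensitive (the ext and other_ext field names are just illustrative):

index=my_index Attachment=*
| rex field=Attachment max_match=0 "\.(?<ext>[A-Za-z0-9]+)\b"
| eval other_ext=mvfilter(NOT match(ext, "(?i)^(txt|html|jpg|png)$"))
| where isnotnull(other_ext)

Events whose only extensions are .txt/.html/.jpg/.png end up with other_ext empty and are dropped by the where clause; anything carrying an extra extension such as .docx or .ics is kept.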
I need to calculate the size of a clustered index, and I used this API for it: /services/cluster/manager/indexes (https://docs.splunk.com/Documentation/Splunk/9.0.2/RESTREF/RESTcluster#:~:text=conte%3E%0A%20%20%3C/entry%3E%0A%3C/feed%3E-,cluster/manager/indexes,-https%3A//%3Chost%3E%3A%3CmPort). But the index_size returned in the response is different (much less) than the total I get if I use the dbinspect command on a particular index and add up the sizes of the db_<bucket> directories, i.e. the originating buckets. Is index_size supposed to denote something else? If so, it is not clearly mentioned in the API documentation.
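For reference, a rough sketch of the manual calculation from the search side, assuming dbinspect's sizeOnDiskMB is the size being compared, Unix-style bucket paths, and that only originating db_ buckets (not rb_ replicas) should be summed:

| dbinspect index=my_index
| regex path="/db_"
| stats sum(sizeOnDiskMB) as originating_size_mb

This only reproduces the dbinspect total; it does not by itself explain what the cluster endpoint's index_size covers.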
I want to make a dashboard of the last 3 months of average CPU load and max CPU load. For example:
dec=320, dec=10, dec=40, dec=90
nov=347, nov=150, nov=60
oct=300, oct=320
and so on. For December that would be (320+10+40+90)/31, and the same for November and October. So for that I need to calculate the last 3 months' values and the last month's values in the same query. Please suggest.
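A minimal timechart sketch, assuming the per-event load sits in a field such as cpu_load and that the index and sourcetype below are placeholders for your own data:

index=os sourcetype=cpu earliest=-3mon@mon latest=@mon
| timechart span=1mon avg(cpu_load) as avg_cpu_load max(cpu_load) as max_cpu_load

If month labels such as "dec"/"nov" are needed instead of timestamps, bin _time span=1mon followed by eval month=strftime(_time, "%b") and stats avg/max by month gives the same result with friendlier labels.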
I need to index into Splunk only the lines that contain .pl in the source file (highlighted in the data below). The regex expression works as expected (tested in a rex tool). I am now using the props.conf and transforms.conf below to index only the data captured by the regex, but either my data is not getting indexed at all or the complete log file gets indexed. Please assist; where am I going wrong?

props.conf
[phone_access]
TRANSFORMS-set = phone_access_extraction

transforms.conf
[phone_access_extraction]
REGEX = ^(\d{1,2}\.\d\.\d\.\d - - \[\w+\/\w+\/\w+:\d+:\d+:\d+ -\d+\] .\w+ \/\w+.+\.pl.+)
DEST_KEY = queue
FORMAT = indexQueue

Log file:
11.7.1.0 - - [27/Nov/2022:00:00:00 -0600] "GET /cgi-bin/phonedata.pl?pq=a1%3oGHK9416&names=a1%7Ca2&&attrs=a1a2&delim=%09 HTTP/1.1" 302 -
11.7.1.0 - - [27/Nov/2022:00:00:04 -0600] "-" 408 -
11.7.1.0 - - [27/Nov/2022:00:00:21 -0600] "-" 408 -
11.7.1.0 - - [27/Nov/2022:00:00:22 -0600] "GET / HTTP/1.1" 20 14497
11.7.1.0 - - [27/Nov/2022:00:00:23 -0600] "GET /mobile.html HTTP/1.1" 200 1001
11.7.1.0 - - [27/Nov/2022:00:00:24 -0600] "GET /PhoneOrgiChart/ HTTP/1.1" 302 -
11.7.1.0 - - [27/Nov/2022:00:01:15 -0600] "GET /cgibiWn/xml.pl?vk236e HTTP/1.1" 20
11.7.1.0 - - [27/Nov/2022:00:01:15 -0600] "GET /cgi-bFin/xml.pl?hv163t HTTP/1.1" 20
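A minimal sketch of the usual "route everything to nullQueue, then send matches back to indexQueue" pattern, assuming the sourcetype stanza from the post and a simplified \.pl regex (both are assumptions to adapt):

props.conf
[phone_access]
TRANSFORMS-set = setnull, keep_pl

transforms.conf
# first transform: drop every event for this sourcetype
[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

# second transform: send lines containing .pl back to the index queue
[keep_pl]
REGEX = \.pl
DEST_KEY = queue
FORMAT = indexQueue

Order matters in TRANSFORMS-set: the last matching transform decides the queue, so the catch-all nullQueue rule has to come before the keep rule. With only a single indexQueue transform, non-matching lines are never sent to nullQueue, which is why the whole file keeps getting indexed.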
Hi all, can someone help with a Splunk eval condition for the scenario below, using the fields Actualstarttime and job_start_by?

if job_start_by <= Actualstarttime: return "GREEN / STARTED ON TIME"
else: return "AMBER / STARTED LATE"
else:
if now <= Actualstarttime: return "EARLY / NO DATA"
else: return "RED / START SLA BREACH"
if now > Actualstarttime: return "RED / END SLA BREACH"
else: return "BLUE / RUNNING"
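A minimal case() sketch; the branching below is only a guess at the pseudocode above (it assumes Actualstarttime is null until the job actually starts and that both fields are epoch times), so the conditions will need adjusting to the real SLA rules:

| eval status=case(
    isnotnull(Actualstarttime) AND job_start_by <= Actualstarttime, "GREEN / STARTED ON TIME",
    isnotnull(Actualstarttime) AND job_start_by > Actualstarttime, "AMBER / STARTED LATE",
    isnull(Actualstarttime) AND now() <= job_start_by, "EARLY / NO DATA",
    isnull(Actualstarttime) AND now() > job_start_by, "RED / START SLA BREACH",
    true(), "BLUE / RUNNING")

case() evaluates its condition/value pairs in order and returns the first match, so it maps naturally onto nested if/else pseudocode.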
Hi all, I would like to use a custom app to contain all the custom correlation searches I'm creating in ES. I need the correlation searches contained in this custom app to be visible in Enterprise Security. I knew that to be visible in ES the custom app must have a name starting with "SA-", but that isn't sufficient and doesn't work. Does anyone know what I'm forgetting? Thank you in advance. Ciao. Giuseppe
I want to update a text box based on the selection from a dropdown list.
Hi, I am looking for a Splunk add-on that will allow us to ingest RSS feeds into our Splunk instance. I downloaded and installed this app, "https://splunkbase.splunk.com/app/5844", but I cannot see where to set it up. I know there are other RSS add-ons, but we prefer an add-on that was built by Splunk.
Hello, Splunk lovers! I have some questions. What I want:
1. I want to make a table from the search history showing searches whose time preset was all_time or another very long range.
2. I want to find other searches that use the "outputlookup" command.
Please help, thank you!
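A minimal sketch against the audit index for the outputlookup part, assuming your role can read index=_audit (the exact fields recorded for the requested time range vary by version, so treat anything beyond user and search as an assumption):

index=_audit action=search info=completed search=*
| search search="*outputlookup*"
| table _time user search

The same audit events also record the time range each search requested, so the all_time case is the same query with a filter on that recorded range instead of the outputlookup string.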
I am trying to plot my geostats results on a custom map tile server while designing a dashboard. I found a problem when I kept zooming: the map suddenly became blank when the zoom level reached 10, but what I want is a zoom range of 9-16. However, I can zoom in to that level of detail if I use the visualization directly after running a search. This discrepancy really confuses me. Does anyone have a suggestion for extending the zoom range of the map on a dashboard?
Hi Team, I created a notable in Splunk ES. When I received the notable and analyzed it, I could see 130 events in the raw logs. But when I analyze the same notable some time later, I can see that the count of events has increased. Can I know what the issue is regarding the increase in the event count? Thanks & Regards, Umesh
Hi, I want to turn the log below into the table below. What should I do in SPL?

[log example]
14:39:19.857 INF [md_system_user] remove success [user id:kimkimkim] by [id:tom]

[table]
user id | id
kimkimkim | tom
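A minimal rex sketch, assuming every relevant event follows the bracketed pattern above (user_id and by_id are illustrative output field names; index and sourcetype are placeholders):

index=my_index sourcetype=my_sourcetype "remove success"
| rex "\[user id:(?<user_id>[^\]]+)\] by \[id:(?<by_id>[^\]]+)\]"
| table user_id by_id

Renaming the columns afterwards (e.g. | rename user_id AS "user id", by_id AS id) reproduces the exact headers from the table above.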
Is it possible to implement classification using the Splunk MLTK assistant? If yes, how do I implement it?

Regards,
Balaji TK
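A minimal sketch of the MLTK fit/apply workflow that sits behind the Predict Categorical Fields assistant, assuming MLTK is installed and that my_label and the feature fields are placeholders for your own data:

Train and save a model:
| inputlookup my_training_data.csv
| fit LogisticRegression my_label from feature1 feature2 feature3 into my_classifier

Apply the saved model to new events:
index=my_index
| apply my_classifier
| table feature1 feature2 feature3 "predicted(my_label)"

The guided assistant in the MLTK UI builds essentially this search for you and adds evaluation panels such as a confusion matrix.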
My objective is to make a search that compares the dest_ip field value of outbound traffic with the ip values in a lookup table of malicious IPs and returns any matches. The current search is something simple like index=NetworkTraffic dest_zone="Internet" NOT src_zone="Internet" to view the outbound traffic; the output includes a dest_ip field. If I have a lookup table called maliciousIPs.csv, which contains a field called "ip", how do I compare that to the dest_ip field? For example: if the dest_ip field value of one of the NetworkTraffic logs is 1.2.3.4 and the IP address 1.2.3.4 exists within maliciousIPs.csv, then the search should return that event.
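A minimal sketch using the lookup command, assuming maliciousIPs.csv is uploaded as a lookup table file visible to the search (matched_ip is just an illustrative field name):

index=NetworkTraffic dest_zone="Internet" NOT src_zone="Internet"
| lookup maliciousIPs.csv ip AS dest_ip OUTPUT ip AS matched_ip
| where isnotnull(matched_ip)
| stats count by dest_ip

For a small lookup, a subsearch works too: append [ | inputlookup maliciousIPs.csv | rename ip AS dest_ip | fields dest_ip ] to the base search so only events whose dest_ip appears in the CSV are returned.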
I found an older discussion post that answered this question, but I'm wanting to see if things have changed. Does Splunk offer any sort of discount or exam voucher to active duty military members or veterans? Thanks
Hi folks, I am new to Splunk and trying to get the dynamic source value from the response. Here is my query:

index="itestData" AND source="/opt/ABC/DEF/GHI/KLM/LOG*" AND "error"

Please note that the * after LOG is a dynamic value (like LOG-A.log, LOG-B.log, LOG-C.log) and there are at least 70 servers like this. When I get any error, I want to know which log the error is coming from (A or B or C and so on). Let me know if there is any other way to get this (I do not want to individually put in the names of the sources, as servers go up and down). Thanks in advance.
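A minimal sketch that pulls the variable part out of the source path with rex, assuming the files always follow the LOG-<name>.log pattern from the examples (log_name is an illustrative field name):

index="itestData" source="/opt/ABC/DEF/GHI/KLM/LOG*" "error"
| rex field=source "LOG-(?<log_name>[^/]+)\.log$"
| stats count by log_name

Every new LOG-*.log file is picked up automatically by the wildcard, so nothing needs to be listed per server.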
Hi, I've been learning Splunk in my free time and I'm at the part of my lesson that teaches how to add a Splunk index via the CLI. I think I made a mistake with either the stanza or the key values; can someone possibly help me out with this one?

Splunk> 4TW
Checking prerequisites...
Checking http port [8000]: open
Checking mgmt port [8089]: open
Checking appserver port [127.0.0.1:8065]: open
Checking kvstore port [8191]: open
Checking configuration... Done.
Checking critical directories... Done
Checking indexes...
Problem parsing indexes.conf: Cannot load IndexConfig: stanza=security Required parameter=homePath not configured
Validating databases (splunkd validatedb) failed with code '1'.
If you cannot resolve the issue(s) above after consulting documentation, please file a case online at http://www.splunk.com/page/submit_issue
$
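A minimal indexes.conf sketch for the security stanza named in the error, assuming the default $SPLUNK_DB location is fine (adjust the paths if you use dedicated volumes):

[security]
homePath   = $SPLUNK_DB/security/db
coldPath   = $SPLUNK_DB/security/colddb
thawedPath = $SPLUNK_DB/security/thawedb

homePath, coldPath, and thawedPath are the path parameters the validator requires for every index stanza; after adding them, splunk btool indexes list security shows the merged stanza so you can confirm it before restarting.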
New to Splunk. I'm attempting to have Splunk monitor and index logs from a NAS. The logs are being centrally stored on the NAS from multiple clients, and I need Splunk to look at the network drive and index the logs in the shared folder, using a UNC path (\\192.168.xxx.xxx\sharefolder\filepath). Under the monitored files and directories in Splunk, I can see the number of log files in the selected directory, and changes show instantly if log files are added or removed. The Splunk account was given administrator access to rule out privilege issues, and I logged into the Splunk service account and confirmed it can access the network location. If I take the same files, place them in a local folder, and attempt to index them using the same method, they are indexed instantly. The problem appears to be that Splunk is not ingesting and indexing the logs when pointed to the network location. Any help would be very much appreciated.
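A minimal inputs.conf sketch for the UNC monitor, assuming the Splunk service runs as the account that can read the share and that the index and sourcetype below are placeholders:

[monitor://\\192.168.xxx.xxx\sharefolder\filepath]
disabled = false
index = main
sourcetype = nas_logs

If the files still don't get ingested, splunk list inputstatus (run from the Splunk bin directory) reports per-file tailing status for the monitor, which usually shows whether the share can actually be read.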
I want to report success and failure per interface (DFOINTERFACE).

A success log looks like the one below, where "completed successfully" is the main keyword:
2022-12-06 14:43:21:064 EST| INFO |dfo_.allocation DFOINTERFACE=dfo_.allocation START -- dfo_.allocation Execution accountNumber=%productValidationRequest/accountNumber/accountBase%%productValidationRequest/accountNumber/accountDest% completed successfully MFRESPONSETIME=96 millisec 176 microsec 997 nanosec MFPROGRAMEID=OMCRCAL1 Service Name : LowCodePlatform.RESTService.allocation:_post

An error log looks like the one below, where "completed with Error" is the main keyword:
2022-12-06 13:52:38:233 EST| ERROR |dfo_.productValidation DFOINTERFACE=dfo_.productValidation START -- dfo_.productValidation Execution accountNumber=076732008 completed with Error 20120014 - CICS ECI Connection: Transformation error on reply: Invalid decimal digit: MFRESPONSETIME=411 millisec 753 microsec 627 nanosec MFPROGRAMEID=OECDFB21 Service Name : LowCodePlatform.RESTService.ProductValidation

I want a report like the one in the attachment; please see the attachment.
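A minimal sketch that classifies each event by its keyword and counts per interface, assuming DFOINTERFACE is not already auto-extracted (the rex pulls it from the raw text; index and sourcetype are placeholders):

index=my_index sourcetype=my_sourcetype ("completed successfully" OR "completed with Error")
| rex "DFOINTERFACE=(?<DFOINTERFACE>\S+)"
| eval status=if(searchmatch("completed successfully"), "success", "failure")
| chart count over DFOINTERFACE by status

The result is one row per DFOINTERFACE with a success column and a failure column.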