All Topics

Here is my search: index=abc Status=FAILED | eval exception=if(bucket_name=s3-abc, "yes","no") | stats count by bucket_name exception. If my bucket name is s3-abc, it prints bucket_name=s3-abc and exception=yes, and all other buckets fall under exception=no. Now I need to do this task through a lookup. I have a lookup, buckets.csv, whose field is bucket_name. If the bucket is in that lookup it should print exception=yes, otherwise exception=no. I am doing it like this but not getting anything: index=abc Status=FAILED | eval exception =if(|search [|inputlookup bucket.csv |fields bucket_name], "yes","no") | stats count by bucket_name exception
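One possible approach (a sketch, untested, assuming the lookup file is buckets.csv with a single bucket_name column): a subsearch cannot be embedded inside eval's if(), but the lookup command can do the membership test directly, and you can check whether a match came back:

```spl
index=abc Status=FAILED
| lookup buckets.csv bucket_name OUTPUT bucket_name AS matched_bucket
| eval exception=if(isnotnull(matched_bucket), "yes", "no")
| stats count by bucket_name exception
```

Note that referencing the lookup by file name assumes the CSV is shared with the app you search from; if a lookup definition exists, use its name instead.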
Hi there, how can I modify the CSS file to make the table resize automatically? Right now I use width:100% for each table, but it turns out that it can't adapt to different screen sizes. Table table-assets is defined in JS.      #table-assets { width:100% !important; }     Thank you!
How do I document whether Splunk Core / ES covers NIST controls in my DR document?
Hi, I have a table like the one below and I am looking to add a tooltip to several of its columns. I have looked at some solutions in other questions, but nothing seems to answer this one correctly. Any ideas? Regards, Robert Lynch
Hello everyone, I am a Telefonica Global Solution employee, and we have a company account for Splunk. I requested a personalized Dev/Test license for Splunk Enterprise and didn't get it, probably because of the wrong company name in my profile (instead of Telefonica Global Solution I have Telefinica Global Solution). Could you please help solve this problem? I got the following answer from you (see below):   Hello Oljeg, Thank you for your interest in a personalized Dev/Test license for Splunk Enterprise. These licenses are available to any paying Splunk Enterprise or Splunk Cloud customer; however, we were unable to verify your eligibility. If you feel that there has been an error in the verification process, please check the information on your splunk.com profile and try registering again. One common cause is that you provided a personal email address (e.g., Gmail, Yahoo, Hotmail). Another could be that the email domain on your account does not match that on the corporate account (e.g., xyz.com vs. xyzcorp.com). Please contact your Splunk admin, account executive or Splunk authorized partner if you have any other questions.
Simplest things first: I downloaded the MSI and installed Splunk on Windows (Enterprise/free version), and then the Splunk Universal Forwarder as well. I restarted the local host and attempted to get to 127.0.0.1, but cannot connect. I opened the directory in the C:... /splunk path and selected splunk.exe. Will this fix the issue?
Hello, I was reading this: https://docs.splunk.com/Documentation/SCS/current/Search/Timemodifiers, but it is not very clear to me how to use the time modifiers properly. index=blah sourcetype=blah fields _time index sourcetype GB | timechart span=1d sum(GB) as Gigabytes. How would I draw my timechart up to the end of the previous day, over a 7-day period, using a time modifier? Would it be:   index=blah sourcetype=blah _index_earliest=-7d@d index_latest=-1d@d? Please advise, thank you.
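For comparison, here is a sketch of what I would expect to work, assuming you want seven complete days of events ending at the end of the previous day, and that you mean event time rather than index time (the index-time modifiers are spelled _index_earliest and _index_latest, both with a leading underscore):

```spl
index=blah sourcetype=blah earliest=-7d@d latest=@d
| timechart span=1d sum(GB) as Gigabytes
```

latest=@d snaps to midnight today, i.e. the end of the previous day, and earliest=-7d@d gives the seven full days before that.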
Hi, I'm trying to exclude events that have an old timestamp in a URL. They look like this:  {"timestamp": 1626739199.964, "c-ip": "178.245.92.14", "time-to-first-byte": 0.002, "sc-status": 404, "sc-bytes": 467, "cs-method": "GET", "cs-protocol": "https", "cs-host": "xxxxxxx", "cs-uri-stem": "/out/v1/bac5ea7d5e06476598d34ba48b3f1bd1/index_8_0.m3u8?start=2021-07-16T16:40:07+00:00",   Here the timestamp and the start date are different. Is there any way to keep only the events whose start date is the current day? Thanks in advance.
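One possible search-time sketch (untested, assuming the start value always appears as start=YYYY-MM-DD in the URL; excluding these events from indexing entirely would instead need an index-time props/transforms rule):

```spl
index=your_index
| rex field=_raw "start=(?<start_date>\d{4}-\d{2}-\d{2})"
| where start_date=strftime(now(), "%Y-%m-%d")
```

The index and field names here are placeholders; adjust them to your data.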
Hi all, in the Monitoring Console (MC) --> Search Activity: Instance, there is a "Top 20 memory-consuming searches" panel, which searches index=_introspection. When I run that search, it does not recognize saved (scheduled) searches.  Why doesn't the search on index=_introspection recognize saved (scheduled) searches? It seems it does not return results for all searches. How do I find the memory consumption of all searches, including saved (scheduled) searches? Do I have to join index=_introspection and index=_audit?
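A sketch of one way to look at this, assuming the resource-usage field names from the introspection data (data.search_props.* and data.mem_used, in MB) are present in your environment:

```spl
index=_introspection sourcetype=splunk_resource_usage component=PerProcess data.search_props.sid=*
| stats max(data.mem_used) AS max_mem_used_mb BY data.search_props.sid data.search_props.type data.search_props.user
| sort - max_mem_used_mb
| head 20
```

Scheduled searches should show data.search_props.type=scheduled and SIDs beginning with scheduler__, so a join with index=_audit should not be needed just for memory figures; if scheduled SIDs are absent entirely, the panel's base search may be filtering them out.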
Hi, I have some application logs in the following format:   ERROR | 2021-07-20 06:55:54 EDT | Field1 = Value1 | Field2 = Value2 | Long Error String - Another long error string | Field3 = Value3 | ... | ...     Most of the tokens are in Field=Value format and Splunk is able to extract them just fine, except the portion where no field name is listed: just two different error strings separated by a " - ". (These strings may contain other special characters as part of the error.) Is there a way I can extract both of them separately, e.g. signature_1 and signature_2, without disturbing the rest of the extractions? I would prefer doing this with props/transforms. I was thinking of using the "DELIMS" option, but I'm not sure how to target just that particular part of the log.
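A hedged rex sketch of one way to target just that segment, assuming the unnamed portion is the only pipe-delimited segment that contains " - " (with surrounding spaces) and no "=" character:

```spl
... | rex "\|\s*(?<signature_1>[^|=]+?)\s+-\s+(?<signature_2>[^|=]+?)\s*\|"
```

The [^|=] classes keep the match from spilling into the Field = Value segments, and the spaces around the hyphen keep it from matching the date. If this regex holds up against your data, the same pattern could be moved into props.conf as an EXTRACT- setting so it runs alongside your existing extractions.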
Hello there, does anyone know how Splunk add-on applications can easily be tracked, from a health-check point of view, for when versions are due to go out of date / become unsupported, and which versions are supported? At present I cannot see a central repository or area that provides this information, or a health-check plug-in that provides it as a dashboard in Splunk. How do you manage your vulnerability and health checking on Splunk add-on applications?
Suppose I have a process that produces input and output counts, from which we calculate a rejection percentage. If the rejection percentage is less than 20 I need to display "Good"; if it is more than 20, "Bad"; and if the corresponding input/output is not available, "In Progress". Please help me implement this. The input and output values come from a data model.
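A minimal sketch of the classification logic, assuming the data model supplies fields named input_count and output_count (rename to match yours), and that rejection percentage is rejected rows over input:

```spl
... | eval rejection_pct=round((input_count - output_count) / input_count * 100, 2)
| eval status=case(isnull(input_count) OR isnull(output_count), "In Progress",
                   rejection_pct < 20, "Good",
                   rejection_pct >= 20, "Bad")
```

case() evaluates its conditions in order, so the null check must come first; otherwise the arithmetic comparisons would silently fail on missing values.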
Hi Splunk experts, below is a sample event. I have the spath msg.message.details, and I am trying to extract certain fields from that details path. How can I extract 'msg.message.details' into fields? I am still a newbie, learning on the go in the Splunk world. I am guessing I should use rex, but is there a way using spath? Our index has other structured JSON paths, e.g. msg.message.header.correlationId, etc.

{ [-]
cf_app_id: test123
cf_app_name: test
event_type: LogMessage
job_index: ebcf8d13
message_type: OUT
msg: { [-]
level: INFO
logger: UpdateContact
message: { [-]
details: Data{SystemId='null', language='English', parentSourceSystemAction='null', contactId='cf4cae75-28b3', status='Active', birthDate='1991-01-15', eventAction='Create', Accounts=[CustomerAccounts{ Case='000899', accountid='4DA4F29E', contactRelationship=ContactRelationship{expiryDate='', contactType='owner', endDate=''}}],workContact=WorkContact{faxNumber='null', mobileNumber='null', emailAddress='null', phoneNumber='null'},homeContact=HomeContact{faxNumber='null', mobileNumber='null', emailAddress='', phoneNumber='null'},businessAddress=null,personalAddress=[PersonalAddress{addressId='9205', locality='PARK', internationalPostCode='null', internationalState='null', additionalInfo='null', isPrimary='Y', streetNumberStart='null', addressType='null', status='CO', streetNumberStartSuffix='null', postalCode='765', streetNumberEnd='null', streetName='null', country='null', streetNumberEndSuffix='null', streetType='null', state='null', subAddress=SubAddress{buildingName='null', numberStart='null', addressLines=[MIL PDE,], details=[Details{value='null', detailType='null'}, Details{value='null', detailType='null'}]}}],idv=Identification{doc=License{state='null', number='null'}}}
header: { [-]
correlationId: 707000J-52f6-10df-00f3-f859-1c5ed
entityId: cf75-2b3-cb38-cef-a72ad88
entityName: test
errorCode: null
errorMessage: null
eventName: testevent
processName: process1
processStatus: SUCCESS
serviceName: testservice
serviceType: Dispatch
}
}
timestamp: 2021-07-20
}
origin: rep
timestamp: 1626764261880766200
}

Any help is much appreciated. Thanks.
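One hedged sketch, assuming the raw event is valid JSON as rendered: spath can pull the whole details value into one field, but because the Data{...} payload inside it is not JSON, the individual key='value' pairs then need rex. For example (field names taken from the sample):

```spl
... | spath output=details path=msg.message.details
| rex field=details "contactId='(?<contactId>[^']*)'"
| rex field=details "status='(?<status>[^']*)'"
| rex field=details "eventAction='(?<eventAction>[^']*)'"
```

Note that duplicated keys (e.g. status appears both at the top level and inside PersonalAddress) will match the first occurrence, so you may need to anchor the regex more tightly for nested values.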
Good afternoon! I have Palo Alto generating logs and redirecting them to Splunk, and I want to use the Palo Alto Networks app, but I can't get it to work correctly with the configurations I followed; the only thing I get is the logs in the Real-time Event Feed. I would like to understand how Splunk and this Palo Alto add-on work, how to configure it, and how to manage it, since I cannot find documentation that explains it very well. One of the things I would like is for the Palo Alto information to also appear in GlobalProtect, etc. I would like to understand how it works and how to redirect the information to the GlobalProtect view, or at least understand the concepts. Thank you very much in advance!
I have some event data with fields like EventId, EventTime, EventRunId, AccountId, etc. For my use case I need to export the search results to AWS S3, and for that I am using the "Event Push by Deductiv" app. I have written the query below, which converts all search results to a .csv file and sends it to an S3 bucket.         source="events.csv" host="pool150.info.com" sourcetype="csv" | s3ep credential=default_password outputfile="eventcsv.csv" outputformat=csv compression=false fields="accountId,eventId,eventrunId,EventTime"         My goal is a search query that exports results incrementally; we can create an alert running every hour. For example, say I searched just now (15:00 GMT+5:30) and got a record with latest EventTime 14:53 GMT+5:30. If I search again after one hour (16:00 GMT+5:30), it should bring only records with EventTime > 14:53 GMT+5:30 and EventTime <= 16:00 GMT+5:30. We might need to store the latest event time somewhere and compare the upcoming records against it. Also, if the Splunk server is restarted or suffers a breakdown, then after it is back up it should pick up the persisted latest EventTime and carry on the incremental export. My approach may be wrong and you may have a better solution; please help me.
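One common checkpoint pattern, sketched here with made-up names (a CSV lookup export_checkpoint.csv holding a single last_event_time epoch value, seeded once before the first run, and an assumed EventTime format string):

```spl
source="events.csv" host="pool150.info.com" sourcetype="csv"
| eval event_epoch=strptime(EventTime, "%Y-%m-%d %H:%M:%S")
| where event_epoch > [| inputlookup export_checkpoint.csv | return $last_event_time]
| s3ep credential=default_password outputfile="eventcsv.csv" outputformat=csv compression=false fields="accountId,eventId,eventrunId,EventTime"
```

A second scheduled search can then advance the checkpoint with | stats max(event_epoch) AS last_event_time | outputlookup export_checkpoint.csv. Because the lookup persists on disk, the checkpoint survives a restart. This is a sketch only; whether s3ep passes events through for further processing, and your actual EventTime format, would need to be verified.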
Hi all, I have to monitor a vCenter system (VMware). I tried to download the app from Splunkbase, but it is at end of life. I read that it will be replaced by IT Essentials Work, and I downloaded it. It looks like ITSI! Can anyone explain the differences between this app and ITSI? (I already know that this one is free and ITSI is a premium app.) Has anyone tried this app (which has 0 downloads)? Ciao, Giuseppe
Hi, I have a DNS log whose fields are not extracted properly, so I used rex. I encountered a problem. When I search index=dns* source="516" host=dns -sender, all fields are extracted correctly. But when I search | from datamodel:Network_Resolution | search dns -sender, my fields get the value "unknown". Can anyone help me?
Hello! Can anyone please help me find out whether a scheduled alert ran or not? We set the alert below for every Monday at 6:00 am. Alert example: | makeresults | eval ip_ports = "10.120.121.100:9443" | eval ip_ports = split(ip_ports,",") | mvexpand ip_ports | rex field=ip_ports "(?<dest>[^:]+):(?<dest_port>\d+)" | table dest dest_port | lookup sslcert_lookup dest dest_port | eval days_left = round(ssl_validity_window/86400) | eval ssl_end_time=strftime(ssl_end_time,"%Y-%m-%d") | eval ssl_start_time=strftime(ssl_start_time,"%Y-%m-%d") | where days_left < 60
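One way to check is via the scheduler's own logs in index=_internal (a sketch, assuming your alert's saved-search name and that your role can read _internal):

```spl
index=_internal sourcetype=scheduler savedsearch_name="Your Alert Name"
| table _time status run_time result_count
```

Each scheduled run should appear as a row, with status indicating outcomes such as success or skipped; no row at the expected time means the scheduler never ran it.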
I am getting the error below: "File will not be read, seekptr checksum did not match (file=<file name>0). Last time we saw this initcrc, filename was different. You may wish to use larger initCrcLen for this sourcetype, or a CRC salt on this source. Consult the documentation or file a support case online at http://www.splunk.com/page/submit_issue for more info." I have crcSalt = <SOURCE> in inputs.conf; what more can I do?
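Since crcSalt = <SOURCE> is already set, the next lever the message itself suggests is initCrcLen, which makes Splunk hash more than the default 256 bytes of the file's head before deciding it has already seen it; this helps when many files begin with identical content. A hedged inputs.conf sketch (the monitor path is a placeholder for your actual stanza):

```
[monitor:///path/to/your/logs]
crcSalt = <SOURCE>
initCrcLen = 1024
```

A restart of the forwarder is needed for the change to take effect; if the same path is being overwritten with new content, the fishbucket may also need attention.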
Can I specify an app name in a Splunk query and run that query from any app?