All Topics


Hi, I have a sample log like the one below, for which I have created an alert that triggers if the Expiration Date is greater than the current date.

LOGS:
Date : 17/08/2021 12:15:44
Build Number : 3274
Database Date : 2021-07-15
Expiration Date : 2021-08-17
License Expiration Date : 2021-08-17

MY QUERY:
index=myIdx source="/my/logs/catalina.out" linecount=4
| regex _raw = ".*\sExpiration Date\s.*"
| rex max_match=0 "^(?<lines>.+)\n+"
| eval buildNumber=mvindex(lines,0)
| eval expirationDate=mvindex(lines,2)
| fields - lines
| eval expirationDateVal = mvindex(split(expirationDate,":"),1)
| eval buildNumberVal = mvindex(split(buildNumber,":"),1)
| eval expiredConvert = strptime(expirationDateVal,"%m-%d-%Y")
| eval expiredConvertDiffFormat = strptime(expirationDateVal,"%Y-%m-%d")
| eval remDays = round((expiredConvert-now())/86400)
| eval remDaysDiffFormat = round((expiredConvertDiffFormat-now())/86400)
| where ( remDays <= 15 and remDays != "" ) or ( remDaysDiffFormat != "" and remDaysDiffFormat <= 15 )
| rename remDays as numDays remDaysDiffFormat as numDaysDiffFormat host as host
| eval remainingDays = case(numDays!="",numDays, numDaysDiffFormat!="",numDaysDiffFormat)
| where remainingDays > 0
| table remainingDays, host, buildNumberVal, expirationDate

Somehow, this query is not pulling up the logs. Is there something I am missing in my query? The alert should have triggered yesterday, but it hasn't. Kindly help.

Thanks in advance.
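A minimal sketch of an alternative, in case the position-based mvindex(lines,2) is not landing on the Expiration Date line, or in case the leading space left by split(...,":") makes strptime return null: pull the date straight out of the raw event with rex (the log shows it as YYYY-MM-DD) and compare it directly. Field names below are illustrative.

index=myIdx source="/my/logs/catalina.out" linecount=4
| rex "Expiration Date\s*:\s*(?<expirationDateVal>\d{4}-\d{2}-\d{2})"
| rex "Build Number\s*:\s*(?<buildNumberVal>\d+)"
| eval remainingDays = round((strptime(expirationDateVal, "%Y-%m-%d") - now()) / 86400)
| where remainingDays > 0 AND remainingDays <= 15
| table remainingDays, host, buildNumberVal, expirationDateVal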
Here is the log format:

Type=0 name=aaa1 door=aaa2 street=aaa3 city=aaa4 country=aaa5 dr="" CN=""
Type=0 name=bbb1 door=bbb2 street=bbb3 city=bbb4 country=bbb5 dr="" CN=""
Type=1 name=ccc1 door="" street=ccc3 city=ccc4 country="" dr=ccc2 CN=ccc5
Type=1 name=ddd1 door="" street=ddd3 city=ddd4 country="" dr=ddd2 CN=ddd5

I want to create a table like the one below:

NAME   DOOR-NUMBER   STREET   CITY   COUNTRY-NAME
aaa1   aaa2          aaa3     aaa4   aaa5
bbb1   bbb2          bbb3     bbb4   bbb5
ccc1   ccc2          ccc3     ccc4   ccc5
ddd1   ddd2          ddd3     ddd4   ddd5
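A minimal sketch of one way to build that table, assuming the key=value pairs are extracted automatically (or via | extract): the Type=1 events carry the door number and country in dr and CN, so pick whichever field is non-empty. The index and sourcetype are placeholders.

index=your_index sourcetype=your_sourcetype
| eval door_number = if(door != "", door, dr)
| eval country_name = if(country != "", country, CN)
| table name, door_number, street, city, country_name
| rename name AS NAME, door_number AS "DOOR-NUMBER", street AS STREET, city AS CITY, country_name AS "COUNTRY-NAME"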
Hi,

For security monitoring purposes, we are trying to collect authentication logs from Fortigate and Palo Alto devices, but we only receive success events. We also need the failed events. We don't think the problem is due to the device configuration, because another SIEM can read failed authentication events with the same log-forwarding filters applied. The same applies to both index=firewall_fortigate and index=firewall_paloalto.

Thanks in advance for your feedback.
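A quick check that may help narrow this down, assuming the Fortinet and Palo Alto add-ons are installed and extract an action (or similar result) field: count what is actually arriving per sourcetype, so you can tell whether the failures are missing from the data or only from your searches.

index=firewall_fortigate OR index=firewall_paloalto
| fillnull value="unknown" action
| stats count BY index, sourcetype, action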
Hi,

I want to run something similar to the search below on metrics data stored in a metrics index. Can you please assist?

eval ip34 = if(ip=="37.25.139.34",1,0), ip35 = if(ip=="37.25.139.35",1,0)
| mstats sum(ip34), sum(ip35), avg(bytes), stdev(bytes), median(bytes), avg(response_time_s), stdev(response_time_s), median(response_time_s) where index=sfp_metrics earliest=-5m@m latest=@m span=1m by uri

It gives me:

Any idea how to achieve this would be appreciated.

Best Regards,
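A minimal sketch of one workaround, assuming each metric datapoint carries an ip dimension: mstats has to be the first (generating) command, so eval cannot run before it. Instead, split the mstats by ip as well, build the conditional counts with eval, and roll back up with stats. Field names are illustrative; note that averaging the per-ip averages is only an approximation (and stdev/median cannot be recombined this way, so compute those in a separate mstats without the ip split if you need them exact).

| mstats count(bytes) AS hits avg(bytes) AS avg_bytes avg(response_time_s) AS avg_rt where index=sfp_metrics earliest=-5m@m latest=@m span=1m by uri, ip
| eval ip34 = if(ip=="37.25.139.34", hits, 0), ip35 = if(ip=="37.25.139.35", hits, 0)
| stats sum(ip34) AS ip34 sum(ip35) AS ip35 avg(avg_bytes) AS avg_bytes avg(avg_rt) AS avg_response_time BY _time, uri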
I'd like to search for IPs in the 192.x.x.x range over the last 7 days.

index="*" earliest=-7d
| rex "192\.(?<range>\d{1,3})\.(?<range>\d{1,3})\.(?<range>\d{1,3})"
| where range >= xx AND range <= xx

How should I correct it?
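A minimal sketch of one correction, assuming the filter is meant to apply to the second octet: a rex expression cannot reuse the same capture-group name, so give each octet its own name (or capture only the one you filter on) and compare it as a number. The bounds 10 and 20 below are placeholders for your xx values.

index="*" earliest=-7d
| rex "192\.(?<octet2>\d{1,3})\.(?<octet3>\d{1,3})\.(?<octet4>\d{1,3})"
| where tonumber(octet2) >= 10 AND tonumber(octet2) <= 20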
I would like to know about the permission files under the metadata directory of each app ($SPLUNK_HOME/etc/apps/<app name>/metadata). Is the permission file under the metadata directory updated whenever some operation is performed?

Background of the inquiry: While checking the Splunk configuration files, I noticed that the permission file mentioned above on the search head had been updated. After checking the contents, permissions were set individually for lookup tables and similar objects, and the number of stanzas related to knowledge objects had increased. I was confused because I did not know when the contents of the permission file get updated, so I am asking here.

This is for Splunk version 7.3.2.
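For reference, a sketch of the kind of stanza that typically appears in metadata/local.meta after an object's sharing or permissions are changed through the UI or REST (the object and user names here are made up):

[lookups/example_lookup.csv]
access = read : [ * ], write : [ admin ]
export = none
owner = some_user
modtime = 1629257529.000000000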
We have a client currently running their Splunk Enterprise environment (HF, indexers, and search head) on Windows Server 2016, and they would like to add a new HF on Windows Server 2019 for HA purposes. Is this supported, or are there any compatibility issues? Kindly share your thoughts. Thanks!
Hi Team,

I need urgent help on how to whitelist specific lines from a log file and ignore the rest. As an example, this is a feed in my log file:

[2021-08-18 03:32:09.797] 2021-08-18 03:31:59.000, ip: 10.7.128.219, folder: 0, size: <nil>, event: ObjectRemoved:DeleteMarkerCreated, session: 15849,10.7.128.219, type: 2, region: eu-west-2, bucket: proftpd-prod-replicated, topic: arn:aws:sns:eu-west-2:563028249984:proftpd_prod_replicated_event_topic, key: export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest/CREDIT_INDICES_LIVE_PRICING-20210811-0315.csv, sequencer: 00611C7F3529A4C883
deleteObject: Warning: Couldn't remove object '/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest/CREDIT_INDICES_LIVE_PRICING-20210811-0315.csv' from cache, cache might be stale
Detected cache out of sync, now relisting whole directory [/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest]
=== Now testing diff of folder and cache... [folder: export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest/] ===============================
DIFF CALCULATION TOOK: 0.015115 [diffs: [/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest]: additions: 0, removals: 0, updates: 0, timestamp: 1629257529.797306]
Updating timestamp from: 1629253911.017497 to: 1629257529.797306
RESULT [/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest]: Size: 499, folders: 5, footprint: 30856, cache_: 0x7f6781fd2878
/:D:1 1629257529.812982
/..:D: [VIRTUAL]
- export 0
/export:D:1 1629257529.812975
/export/..:D: [VIRTUAL]
- sftp 0
/export/sftp:D:1 1629257529.812971
/export/sftp/..:D: [VIRTUAL]
- ABE0A4FD16B68ADBC0B28AD415F 0
/export/sftp/ABE0A4FD16B68ADBC0B28AD415F:D:1 1629257529.812988
/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/..:D: [VIRTUAL]
- Credit_Index_Live_Latest 0
/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest:D:494 1629257529.797306
/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest/..:D: [VIRTUAL]
- CREDIT_INDICES_LIVE_PRICING-20210818-0330.csv 115660
- CREDIT_INDICES_LIVE_PRICING-20210818-0315.csv 115638
- CREDIT_INDICES_LIVE_PRICING-20210818-0300.csv 115636
- CREDIT_INDICES_LIVE_PRICING-20210818-0245.csv 115636

Out of the above lines, I want to ingest only the line that was highlighted in red and ignore the rest. Please suggest how this can be achieved? Thanks in advance.

Regards,
Prateek Sawhney
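A minimal sketch of the usual approach, assuming the line you want to keep is the one that starts with a bracketed timestamp and carries the event details, and that everything else should be discarded at index time: send all events for the sourcetype to the null queue first, then route the matching lines back to the index queue. The sourcetype name and the keep regex are assumptions to adjust; this goes in props.conf/transforms.conf on the indexer or heavy forwarder.

props.conf:
[your_sourcetype]
TRANSFORMS-routing = drop_everything, keep_event_lines

transforms.conf:
[drop_everything]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[keep_event_lines]
REGEX = ^\[\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+\]\s+\d{4}-\d{2}-\d{2}.*event:
DEST_KEY = queue
FORMAT = indexQueue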
We can have either an IP address or a hostname in the host field; both together are not supported yet. The highlighted config below in inputs.conf forces host to be set to the IP address:

host = localhost
connection_host = ip

We tried to explore the _meta field coming from the UF. If we could make it dynamic enough to support a variable, then we could carry both the IP and the hostname. This would be similar to the $decideOnStartup functionality for the host field, but more flexible. Has anyone been able to send both IP and hostname dynamically for indexing through the UF? For example, $hostname$ would yield the hostname; as used in inputs.conf, it would be:

[default]
_meta = splunk_forwarder::$hostname$
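For comparison, a sketch of what _meta supports today as documented: static key::value pairs per input stanza (the key names below are made up). As far as I know there is no built-in variable substitution such as $hostname$ for _meta, so the values would have to be templated into inputs.conf by a deployment or configuration-management tool.

[default]
_meta = forwarder_name::myforwarder01 forwarder_ip::10.0.0.5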
Hi Team,

I need help in parsing AWS Managed Active Directory logs. The AD team is writing logs to CloudWatch, and we have the Splunk Add-on for AWS consuming these logs through a Kinesis stream. I have a props config to convert the logs to the xmlwineventlog sourcetype, after which the data is parsed, but not all of the fields. I want the add-on to parse the data using the source [xmlwineventlog:security], but that is not happening.

Here is my props config:

[source://*securitylogs]
Transforms-Index = override_st_props, override_source_props

And transforms as below:

[override_st_props]
REGEX = .
FORMAT = sourcetype::xmlwineventlog
DEST_KEY = MetaData:Sourcetype

[override_source_props]
REGEX = .
FORMAT = source::xmlwineventlog:security
DEST_KEY = MetaData:Source

The sourcetype and source are getting changed, but parsing happens based on the sourcetype as defined in the Windows add-on, not on the source. I hope I made it clear; please help.
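A hedged sketch of how this override is usually written: the transforms class name is conventionally spelled TRANSFORMS- (uppercase), and the Splunk Add-on for Windows keys its search-time extractions off the sourcetype, which it typically spells XmlWinEventLog (with source XmlWinEventLog:Security), so matching that capitalization may matter. All names below are assumptions to verify against your installed add-on.

props.conf:
[source://*securitylogs]
TRANSFORMS-override = override_st_props, override_source_props

transforms.conf:
[override_st_props]
REGEX = .
FORMAT = sourcetype::XmlWinEventLog
DEST_KEY = MetaData:Sourcetype

[override_source_props]
REGEX = .
FORMAT = source::XmlWinEventLog:Security
DEST_KEY = MetaData:Source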
Hey,

I have a use case where I need to take a difference between events and report the ones which are missing.

Example events (each line is an event). txId is the identifier field, and it is sent with type "R", "D", or "F". In this case, I need to identify the txIds for which I have "R" but am missing "D" or "F".

{ "txId" : "0001", "type": "R"}
{ "txId" : "0001", "type": "F"}
{ "txId" : "0002", "type": "R"}
{ "txId" : "0003", "type": "R"}
{ "txId" : "0003", "type": "D"}

In the above sample events, txId = 0002 is the one missing a D or F type. I need to count such events and show them in a panel, as well as drill down to those specific R events. An additional challenge is that there will be more than 3 million "R" events plus the corresponding "D" or "F" events. Can you please help with this diff?
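A minimal sketch of one way to find the incomplete transactions, assuming txId and type are extracted fields: collect the set of types per txId with stats, then keep the txIds that have R but neither D nor F. For the panel, append | stats count; for the drill-down, feed the resulting txId list back into a search for the R events. The index and sourcetype are placeholders.

index=your_index sourcetype=your_sourcetype type IN ("R","D","F")
| stats values(type) AS types BY txId
| search types="R" NOT types IN ("D","F")
| table txId

At the multi-million-event scale this is still a single distributable stats, which usually holds up; if it becomes too slow, an accelerated data model with tstats, or a scheduled search writing to a summary index, is the usual next step.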
I have some events that exceed the default 10000-byte TRUNCATE limit. This triggers "truncating line because limit of 10000 bytes has been exceeded." In the Splunk documentation, this is characterized as a "line-breaking issue".

Because the events are received in JSON, I thought reducing line length would solve the problem without tweaking TRUNCATE, but it doesn't. After pretty-printing, large JSON documents still get truncated.

Does this mean that the "line breaking issue" is really an "event breaking issue", i.e. that the indexer requires every event to be under the TRUNCATE limit? The JSON documents have correct syntax, including the opening and closing brackets.
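In case it helps, a sketch of the usual workaround: raise TRUNCATE for the affected sourcetype in props.conf on the parsing tier (indexer or heavy forwarder). The sourcetype name and the 50000-byte value are placeholders; TRUNCATE = 0 disables truncation entirely but is generally discouraged, since a runaway event can then consume a lot of memory.

[your_json_sourcetype]
TRUNCATE = 50000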
Which modes does the Splunk forwarder support? If both push and pull modes are supported, we want to know how to configure each mode, as well as the disadvantages of each and the differences between them. Thanks
Hello,

I want to store hot/warm and cold buckets separately when I create a Splunk index: hot/warm under /tmp/hotwarm and cold under /tmp/cold. However, I create indexes in the Splunk UI. I know how to separate the paths in indexes.conf; what I want is for the UI to use the paths I chose as the defaults when an index is created. What should I do? Thanks
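A sketch of one way this is often handled, assuming direct file-system paths rather than volumes: set the path attributes in the [default] stanza of indexes.conf using the $_index_name variable, so any index created without explicit paths (including from the UI, if the path fields are left blank) inherits them. The /tmp paths below mirror the question; in practice /tmp is a risky place to keep index data.

[default]
homePath = /tmp/hotwarm/$_index_name/db
coldPath = /tmp/cold/$_index_name/colddb
thawedPath = $SPLUNK_DB/$_index_name/thaweddb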
Hello,

I have a complex data source (sample events given below). Is there any way I can write TIME_PREFIX and TIME_FORMAT for this data source? Thank you so much, greatly appreciated.

FOAT     A BRMCPRD  FMM0             0080       0       0 CFOL.OLG.GENERIC.REQUEST.FIT1                       2021-06-14 00:03:48.165
FOAT     A  RCTID     QMGR NAME      INDS I/P CNT O/P CNT     MQ Series Q name                                2021-06-14 00:03:48.162
FOAT     A -------- ---------------- RCTID     ---- ------- ------- -------------------------------                     2021-06-14 00:03:48.163
FOAT     A                        IOB   FRAME  COMMON     SWB     XWB     ECB     FRM1MB                      2021-06-14 00:08:09.521
FOAT     A BRMCPRD  FMM0             0080       0       0 CFOL.OLG.GENERIC.REQUEST.FIT1                       2021-06-14 00:28:09.361
FOAT     A      1       0        4        4       20        0.86   499.26     1.68                            2021-06-14 00:28:09.445
FOAT     A      2       0        3        2        3        1.19   498.92     2.19                            2021-06-14 00:28:09.446
FOAT     A      3       0        2        2        2        1.17   498.95     2.20 _                          2021-06-14 00:28:09.447
FOAT     A      4       0        4        2       10        1.24   498.87     2.27                            2021-06-14 00:28:09.448
FAAT     A END OF DISPLAY+                                                                                    2021-06-14 00:28:09.449
DFAT     A Utilization                     OK   .7 - .7 / .3 - .3 _                                           2021-06-14 23:58:11.233
DFAT     A CFCAOL Message Rate               OK   0.0 / 0.0                                                     2021-06-14 23:58:11.234
FISA    A DASRS Message Rate               OK   0.0 / 0.0                                                     2021-06-14 23:58:11.235
FISA  A Command Code timeouts past Min  OK   c-0 / i-0 / b-0                                               2021-06-14 23:58:11.236
FIAT     A BTIF Response Time              OK   n-0 / r-0 / t-0                                               2021-06-14 23:58:11.237
FIST     A Serv Ctr or C-Codes Disabled    OK   2                                                             2021-06-14 23:58:11.238
BNAT     A 02303F80       *ENBL* AN AT AU BR CI FR KC ME OG PH                                                2021-06-14 23:30:04.120
PODA     A CFOL         0.0        0.0                                                                        2021-06-14 18:56:09.072
PODA     A IDRS         0.0        0.0                                                                        2021-06-14 18:56:09.073
PODA     A EFTP         0.0        0.0                                                                        2021-06-14 18:56:09.074
TBCA     A AAES0009I 00.00.00 FROM TA 0A : AAER0412I ACT: Variation RASIGN activated from dir F:\TESTAVENVAR     2021-06-15 00:00:00.195
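A sketch of one possible props.conf for this feed, assuming every record ends with a single trailing timestamp of the form YYYY-MM-DD HH:MM:SS.mmm and each line is its own event; the lookahead in TIME_PREFIX anchors on that trailing timestamp so earlier numbers on the line are not mistaken for dates. The sourcetype name is a placeholder.

[your_mainframe_sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TIME_PREFIX = \s(?=\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}\s*$)
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 30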
Hi,

We have an existing Splunk deployment that uses SSL certs for security. A new STIG has a requirement to use FIPS. From reading through the previous FIPS-related questions, it seems that I need to reinstall the same version, or upgrade the version, of Splunk Enterprise / Splunk Universal Forwarder.

I hit a snag with reinstalling the same version, and maybe someone can give a suggestion. When I try to reinstall over the current installation, the installer is too smart: it notices the same version is installed and exits instead of asking whether the user wishes to continue. Does anyone know of a way to get the installer to reinstall over itself? This is a Windows installation.

Any help will be useful.

thx,
Ken
I have the Splunk Add-on for Windows installed on my deployment server to help collect data from my Windows machines (forwarders). However, when the data comes in, it is all condensed into a block and more or less unreadable. The entries have tags like <Event>, <System>, etc., but they aren't spaced out at all and are bunched together. I'm curious whether anyone knows how to make the data from this add-on look like how other data usually comes into Splunk: spaced out, indented, and more readable to the human eye, essentially. I'm not sure if this is a Splunk configuration or a configuration that has to be done specifically in my Windows add-on settings on the deployment server. Thanks!
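One thing that may be worth checking, as a sketch: the Windows event log inputs in the add-on have a renderXml setting, and when it is true the events arrive as a single compact XML string. Setting it to false (in the local inputs.conf of the deployed add-on, per event log stanza) switches to the classic multi-line text format, which is usually easier to read; note this also changes the sourcetype (XmlWinEventLog vs. WinEventLog), so existing searches and extractions may need adjusting. The stanza below is just an example.

[WinEventLog://Security]
disabled = 0
renderXml = false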
Hello everyone,

When I installed Splunk Enterprise on my personal Ubuntu machine, it changed the default python binary. That is, after the install, when I type:

which python

it returns a binary located in /opt/splunk/bin/python, which is not the default python I want for my system. I'm having trouble finding information about what is done during the install and how to change this behavior...

Thanks a lot for your help!
Is the Splunk Add-on for Sophos compatible with getting data from my Macs? I have a deployment server (on Windows, the only OS compatible with hosting the Sophos add-on) with Macs and Windows machines in my environment with the Universal Forwarder installed. I want to use this add-on to collect Sophos data from both my Macs and Windows machines, but I can't find anywhere whether it will work.
Hello! I was asked to find which IP-addressable devices are listening on port 80 on our network. Can I find this information through a query? I'm new to Splunk analysis, so I apologize if this seems basic. Any and all help is greatly appreciated. Thanks!
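A minimal sketch of a starting point, with the caveat that Splunk can only report what is already being collected (it does not scan the network): if you have firewall, NetFlow, or Splunk Stream data with CIM-style dest_ip/dest_port fields, the devices accepting traffic on port 80 show up as destinations. The index name is a placeholder.

index=your_network_index dest_port=80
| stats count BY dest_ip
| sort - count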