All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


We're logging info/error messages to Splunk and to a database from .NET using NLog. In the database the entries sort in the right order thanks to an identity column. In Splunk they come out of order when many log entries share the same timestamp. Is there a way to tell Splunk to create an "identity column" for everything that is piped into it? We're piping the logs into Splunk using the HTTP Event Collector. Thank you! Gunnar
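
One workaround at search time is to break timestamp ties with a sequence number assigned by streamstats; a minimal sketch, where the index, sourcetype and table fields are placeholders for your own:

index=app_logs sourcetype=nlog
| sort 0 _time
| streamstats count as row_id
| table _time row_id message

A more robust option is to add your own monotonically increasing sequence number to each event on the .NET side before it reaches the HTTP Event Collector, and sort on that field in addition to _time.
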
I need to get a complete list of all users in Splunk Enterprise or Enterprise Security and the date each user account was added. Thank you in advance.
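
The user list itself is available from the REST API via the rest search command; a sketch:

| rest /services/authentication/users splunk_server=local
| table title realname roles email

That endpoint does not expose a creation date, so the "date added" part usually has to be approximated, for example by searching index=_audit for the audit events recorded when each user account was created and taking the earliest timestamp per user (assuming the audit data reaches back far enough).
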
DB performance metrics stopped being received at the AppDynamics Controller for 2 databases at a particular date and time, while we are still able to receive and view the DB server infra-related metrics. The DB collectors were working fine, as we are able to receive metrics related to other databases as well as infra metrics. We have checked with the DBA and Network Security teams whether any change was implemented at their end, but there was no change implemented. Please advise on troubleshooting steps. Thanks, Siva
Okay, so after the 60-day Enterprise trial my license has expired. Now, how can I download the perpetual Free license? Once I get into the store, the only things I find require a payment, and the "you have pricing questions, we have answers" website doesn't solve anything.
Hello, I am new to Splunk and need to split an event at the Response line. Below is an example of an event.

Request : August 17, 2021, 4:50 pm
Data: {"requestNode":"Item","updatedBy":"WebServices_User","elements":{"typeOfItem":"Stock","country":"1",""baseUnitOfMeasure":"EA","IsItASerializedProduct":false,"currencyCode":"1","freezeCodeCorpLevel":98,"fractionalInventory":false,"isItADirectShippedProduct":false,"globalHold":false,"replacementCost":9.6,"productForm":"Non-Hazardous\/Transferrable","PrimaryVendor":"V9723","landedProduct":true,"standardCost":11.425,"status":"Inactive","priceGroup":"1N","invoiceCost":0,"listCost":11.99,"ueType":"Nursery","ueLine":"CNCO","ueDepartment":"EUONYMUS","taxCategory":"07"}}
Response: {"success":false,"message":"No valid Item exists","code":"205"}

The purpose is that I need to create fields for each parameter in the Response line. With this line being part of the Data portion of the event, which has a varying number of fields, we can't get the regex working. Support said we needed to break out the Response line, but wouldn't offer any recommendation on which line breaker I should be using. I've tried adding a BREAK_ONLY_BEFORE to the sourcetype in props.conf, but after a Splunk restart we stop seeing events for this sourcetype. Below is what the sourcetype looks like in props.conf.

[webservices_log-too_small]
BREAK_ONLY_BEFORE = ^[a-zA-Z](?:[_-]?\w)*:\s+\{"[a-zA-Z](?:[_-]?\w)*":
PREFIX_SOURCETYPE = True
is_valid = True
maxDist = 9999

Any help on this would be awesome, I really appreciate it. Thanks, Tom
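
If the end goal is just fields from the Response portion, one option that avoids changing event breaking at all is to pull the Response JSON out at search time and hand it to spath; a sketch, with the index and sourcetype names as placeholders:

index=your_index sourcetype=webservices_log
| rex "Response:\s*(?<response_json>\{.*\})"
| spath input=response_json
| table _time success message code

If you do want Request/Data/Response to become separate events, LINE_BREAKER with a capturing group is generally more predictable than BREAK_ONLY_BEFORE, but that change only affects newly indexed data and is worth testing against a sample file first.
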
Hi, I am trying to compare between two events (JSON format). Say I pipe with "head 2" to output only two events, then compare them and highlight what's changed, something like this:

<search syntax> | head 2

event 1
{
    value: 20
    status: high
    category: A
}
event 2
{
    value: 25
    status: low
    category: A
}

Output after the compare looks like this, or anything that can highlight the changes:

changed    origin    new
value      25        20
status     low       high

category is unchanged, so it won't have to be highlighted. Any help is appreciated.
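
One search-time approach is to flip the two events into columns with transpose and keep only the fields whose values differ; a sketch, assuming the fields of interest are value, status and category (with head 2 the first event returned is the newer one):

<search syntax> | head 2
| streamstats count as event_no
| eval event_no=if(event_no=1, "new", "origin")
| table event_no value status category
| transpose 2 header_field=event_no column_name=changed
| where origin != new
| table changed origin new
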
What is the need for the metadata files under /etc/apps/appname/metadata, and why are they modified continuously? @all
Hi Team, We have a requirement in a Splunk dashboard where, if we mouse over a particular table cell in one panel, the respective success/failure log should pop up. Can someone please help with how this can be achieved?
I have a SHC consisting of 3x SHs with HTTPS enabled. Are there any steps I need to do from Splunk's end to enable specific domain name forwarding from F5 to Splunk? For example, a user goes to https://shc/ and it should direct them to VIP IP 10.0.0.0, which in turn will pick one of the 3 SHs and direct traffic to it. Sadly this is not working.
Hi, I have a sample log like below, for which I have created an alert which triggers if the Expiration Date is greater than the current date.

LOGS:
Date : 17/08/2021 12:15:44
Build Number : 3274
Database Date : 2021-07-15
Expiration Date : 2021-08-17
License Expiration Date : 2021-08-17

MY QUERY:
index=myIdx source="/my/logs/catalina.out" linecount=4
| regex _raw = ".*\sExpiration Date\s.*"
| rex max_match=0 "^(?<lines>.+)\n+"
| eval buildNumber=mvindex(lines,0)
| eval expirationDate=mvindex(lines,2)
| fields - lines
| eval expirationDateVal = mvindex(split(expirationDate,":"),1)
| eval buildNumberVal = mvindex(split(buildNumber,":"),1)
| eval expiredConvert = strptime(expirationDateVal,"%m-%d-%Y")
| eval expiredConvertDiffFormat = strptime(expirationDateVal,"%Y-%m-%d")
| eval remDays =round((expiredConvert-now())/86400)
| eval remDaysDiffFormat =round((expiredConvertDiffFormat-now())/86400)
| where ( remDays <= 15 and remDays != "" ) or ( remDaysDiffFormat !="" and remDaysDiffFormat <= 15 )
| rename remDays as numDays remDaysDiffFormat as numDaysDiffFormat host as host
| eval remainingDays =case(numDays!="",numDays,numDaysDiffFormat!="",numDaysDiffFormat)
| where remainingDays > 0
| table remainingDays,host,buildNumberVal,expirationDate

Somehow, this query is not pulling up the logs. Is there something which I am missing in my query? The alert should have triggered yesterday. But it hasn't. Kindly help. Thanks in advance.
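
A couple of likely culprits in the query above: mvindex(lines,0) and mvindex(lines,2) point at the "Date" and "Database Date" lines of the sample rather than "Build Number" and "Expiration Date" (and without a (?m) flag the ^-anchored rex may only ever capture the first line), and the value left after split(...,":") keeps a leading space that can trip up strptime. A simpler sketch that extracts the two values directly and keeps the 15-day window:

index=myIdx source="/my/logs/catalina.out" "Expiration Date"
| rex "Build Number\s*:\s*(?<buildNumberVal>\d+)"
| rex "(?<!License )Expiration Date\s*:\s*(?<expirationDateVal>\d{4}-\d{2}-\d{2})"
| eval remainingDays=round((strptime(expirationDateVal,"%Y-%m-%d")-now())/86400)
| where remainingDays >= 0 AND remainingDays <= 15
| table host buildNumberVal expirationDateVal remainingDays
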
here is the log format:

Type=0 name=aaa1 door=aaa2 street=aaa3 city=aaa4 country=aaa5 dr="" CN=""
Type=0 name=bbb1 door=bbb2 street=bbb3 city=bbb4 country=bbb5 dr="" CN=""
Type=1 name=ccc1 door="" street=ccc3 city=ccc4 country="" dr=ccc2 CN=ccc5
Type=1 name=ddd1 door="" street=ddd3 city=ddd4 country="" dr=ddd2 CN=ddd5

wanted to create a table like below:

NAME  DOOR-NUMBER  STREET  CITY  COUNTRY-NAME
aaa1  aaa2         aaa3    aaa4  aaa5
bbb1  bbb2         bbb3    bbb4  bbb5
ccc1  ccc2         ccc3    ccc4  ccc5
ddd1  ddd2         ddd3    ddd4  ddd5
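
Since door/dr and country/CN are mutually exclusive depending on Type, one approach is to take whichever value of each pair is non-empty; a sketch, assuming the key=value pairs are already extracted as fields:

<your search>
| eval DOOR_NUMBER=if(isnull(door) OR door="", dr, door)
| eval COUNTRY_NAME=if(isnull(country) OR country="", CN, country)
| rename name as NAME street as STREET city as CITY
| table NAME DOOR_NUMBER STREET CITY COUNTRY_NAME
| rename DOOR_NUMBER as "DOOR-NUMBER" COUNTRY_NAME as "COUNTRY-NAME"

Underscores are used in the eval step because hyphenated field names would need to be quoted everywhere; the final rename restores the requested headers.
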
Hi, For security monitoring purposes, we are trying to collect authentication logs from Fortigate and Palo Alto devices, but we only receive success events. We also need to get the failed events. We don't think the problem is due to the devices' configuration, because another SIEM can read failed authentication events with the same log forward filters applied. The same applies to index=firewall_fortigate and index=firewall_paloalto. Thanks in advance for your feedback
Hi, I want to run something similar to the below on metrics data stored in a metrics index, can you please assist:

eval ip34 = if(ip=="37.25.139.34",1,0) ,ip35 = if(ip=="37.25.139.35",1,0)
| mstats sum(ip34) , sum(ip35) , avg(bytes) , stdev(bytes) , median(bytes) avg(response_time_s) , stdev(response_time_s) , median(response_time_s) where index=sfp_metrics earliest=-5m@m latest=@m span=1m by uri

It gives me an error. Any idea how to achieve this would be appreciated.

Best Regards,
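
eval cannot run before mstats (mstats has to be the first command in the search), so one workaround is to add ip as a split-by dimension and derive the per-IP counts afterwards; a sketch, assuming ip exists as a dimension on the metric data points:

| mstats count(bytes) as event_count where index=sfp_metrics earliest=-5m@m latest=@m span=1m by uri, ip
| eval ip34=if(ip="37.25.139.34", event_count, 0), ip35=if(ip="37.25.139.35", event_count, 0)
| stats sum(ip34) as ip34 sum(ip35) as ip35 by _time, uri

The avg/stdev/median of bytes and response_time_s can stay in a separate mstats split only by uri, as in the original, since splitting those statistics by ip would change what they measure.
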
I'd like to search for the 192.x.x.x band of IPs for the last 7 days.

index="*" earliest=-7d
| rex "192\.(?<range>\d{1,3})\.(?<range>\d{1,3})\.(?<range>\d{1,3})"
| where range >=xx AND range<=xx

How should I correct it?
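
A rex pattern cannot reuse the same named capture group (range) three times, so give each octet its own name and compare numerically; a sketch, with the xx bounds left as placeholders to fill in:

index="*" earliest=-7d
| rex "192\.(?<oct2>\d{1,3})\.(?<oct3>\d{1,3})\.(?<oct4>\d{1,3})"
| where tonumber(oct2) >= xx AND tonumber(oct2) <= xx

Swap oct2 for whichever octet you actually want to bound; if the goal is really a subnet test, extracting the whole address into a field and using cidrmatch("192.168.0.0/16", ip) in a where clause is simpler.
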
I would like to know about the permission files under the metadata directory of each app ($SPLUNK_HOME/etc/apps/<App name>/metadata). Is the permission file under the metadata directory updated whenever any operation is performed? Background of the inquiry: while checking the Splunk configuration files, I noticed that the search head permission file mentioned above had been updated. After checking the contents, permissions are set individually for lookup tables and so on, and the number of stanzas related to knowledge objects seemed to be increasing. I was confused because I did not know when the contents of the permission file get updated, so please advise. This is for Splunk version 7.3.2.
We have a client currently running their Splunk Enterprise environment (HF, Indexers and Search Head) in Windows 2016 and they would like to add a new HF in Windows 2019 for HA purposes. Is this supported or is there any compatibility issue? Kindly share your thoughts. Thanks! 
Hi Team, I need urgent help on how to whitelist specific lines from a logfile and ignore the rest. As an example, this is a feed in my logfile:

[2021-08-18 03:32:09.797] 2021-08-18 03:31:59.000, ip: 10.7.128.219, folder: 0, size: <nil>, event: ObjectRemoved:DeleteMarkerCreated, session: 15849,10.7.128.219, type: 2, region: eu-west-2, bucket: proftpd-prod-replicated, topic: arn:aws:sns:eu-west-2:563028249984:proftpd_prod_replicated_event_topic, key: export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest/CREDIT_INDICES_LIVE_PRICING-20210811-0315.csv, sequencer: 00611C7F3529A4C883
deleteObject: Warning: Couldn't remove object '/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest/CREDIT_INDICES_LIVE_PRICING-20210811-0315.csv' from cache, cache might be stale
Detected cache out of sync, now relisting whole directory [/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest]
=== Now testing diff of folder and cache... [folder: export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest/] ===============================
DIFF CALCULATION TOOK: 0.015115 [diffs: [/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest]: additions: 0, removals: 0, updates: 0, timestamp: 1629257529.797306]
Updating timestamp from: 1629253911.017497 to: 1629257529.797306
RESULT [/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest]: Size: 499, folders: 5, footprint: 30856, cache_: 0x7f6781fd2878
/:D:1 1629257529.812982
/..:D: [VIRTUAL]
- export 0
/export:D:1 1629257529.812975
/export/..:D: [VIRTUAL]
- sftp 0
/export/sftp:D:1 1629257529.812971
/export/sftp/..:D: [VIRTUAL]
- ABE0A4FD16B68ADBC0B28AD415F 0
/export/sftp/ABE0A4FD16B68ADBC0B28AD415F:D:1 1629257529.812988
/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/..:D: [VIRTUAL]
- Credit_Index_Live_Latest 0
/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest:D:494 1629257529.797306
/export/sftp/ABE0A4FD16B68ADBC0B28AD415F/Credit_Index_Live_Latest/..:D: [VIRTUAL]
- CREDIT_INDICES_LIVE_PRICING-20210818-0330.csv 115660
- CREDIT_INDICES_LIVE_PRICING-20210818-0315.csv 115638
- CREDIT_INDICES_LIVE_PRICING-20210818-0300.csv 115636
- CREDIT_INDICES_LIVE_PRICING-20210818-0245.csv 115636

Out of the above lines, I want to enable the feed only for the line which is highlighted in red and ignore the rest of the lines. Please suggest how this can be achieved. Thanks in advance.

Regards, Prateek Sawhney
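
If these lines arrive as separate events, the standard way to keep only the wanted ones is the nullQueue pattern on a heavy forwarder or indexer (it is not applied by a universal forwarder): drop everything by default, then route the matching events back to the index queue. A sketch, assuming the line you want is the structured one that starts with a bracketed timestamp (the red highlighting did not survive the copy) and with the sourcetype name as a placeholder:

# props.conf
[your_sourcetype]
TRANSFORMS-filter = drop_all, keep_event_lines

# transforms.conf
[drop_all]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[keep_event_lines]
# keep events that begin with a [YYYY-MM-DD hh:mm:ss.mmm] timestamp
REGEX = ^\[\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+\]
DEST_KEY = queue
FORMAT = indexQueue

The order in the TRANSFORMS- list matters: the catch-all nullQueue rule runs first and the whitelist rule overrides it for matching events.
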
We can have either the IP address or the hostname for the host field; both are not supported yet. The below config in inputs.conf will force host to be set to the IP address:

host = localhost
connection_host = ip

We tried to explore the _meta field coming from the UF. If we can make that dynamic to support a variable, then we can have both IP and hostname. This would be similar to the $decideOnStartup functionality for the host field, but more flexible. Has anyone been able to send both IP and hostname dynamically for indexing through the UF? For example, $hostname$ would yield the hostname; as used with inputs.conf, it would be:

[default]
_meta = splunk_forwarder::$hostname$
Hi Team, I need help in parsing AWS Managed Active Directory. The AD team is writing logs to CloudWatch, and we have the Splunk Add-on for AWS which consumes these logs through a Kinesis stream. I have a props config to convert the logs to the xmlwineventlog sourcetype, after which the data is parsed, but not all the fields. I want the add-on to parse using the source [xmlwineventlog:security], but that is not happening.

Here is my props config:

[source://*securitylogs]
Transforms-Index=override_st_props,override_source_props

And transforms as below:

[override_st_props]
REGEX=.
FORMAT = sourcetype::xmlwineventlog
DEST_KEY = MetaData:Sourcetype

[override_source_props]
REGEX = .
FORMAT = source::xmlwineventlog:security
DEST_KEY = MetaData:Source

The sourcetype and source are getting changed, but parsing is happening based on the sourcetype as per the Windows add-on and not on the source. Hope I made it clear, please help.
Hey, I have a use case where I need to take a difference between events and report the ones which are missing. Example events (each line is an event): txId is the identifier field, which will be sent with type "R", "D" or "F". In this case, I need to identify the txIds for which I have "R" but am missing "D" or "F".

{ "txId" : "0001", "type": "R"}
{ "txId" : "0001", "type": "F"}
{ "txId" : "0002", "type": "R"}
{ "txId" : "0003", "type": "R"}
{ "txId" : "0003", "type": "D"}

In the above sample events, txId = 0002 is the one missing a D or F type. I need to count such events and show them in a panel, as well as drill down to those specific R events. An additional challenge is that there will be more than 3 million "R" events and corresponding "D" or "F" events. Can you please help with this diff?
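
One way to get this is to collect the set of types per txId with stats and keep only the IDs that have an R but neither a D nor an F; a sketch, with the base search left as a placeholder:

<your base search>
| stats values(type) as types by txId
| search types="R" NOT types="D" NOT types="F"
| stats count as missing_count

Dropping the final stats gives the list of offending txIds, which a panel drilldown can feed back into the base search to reach the underlying R events. At the 3-million-event scale this still works, though an accelerated data model queried with tstats should be considerably faster.
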