As the title suggests, I am keen to know when the ES license starts counting: from the date of renewal or the date of data ingestion?
I have a UF on an rsyslog server. The UF is forwarding logs to the indexer successfully, but one of my two input flows is going to the wrong index, and I can't figure it out.

inputs.conf:

[monitor:///path/number/one/*]
index = first_index
sourcetype = first_index
host_segment = 4
disabled = false

[monitor:///path/number/two/*]
index = second_index
sourcetype = second_index
host_segment = 4
disabled = false

Data of sourcetype second_index makes it to the corresponding index, but data of sourcetype first_index ends up in the main index. The only props and transforms I have configured are from the VMware add-on and its accessories, but I've scoured its conf files and have not found anything that would send this non-VMware data to main instead of where it belongs when it's specified in $SPLUNK_HOME/etc/system/local/inputs.conf. Any ideas? Thx!
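A hedged diagnostic for this kind of misrouting: search the main index for the stray events and break them out by source, to confirm which monitor stanza they actually came from (`first_index` here is the sourcetype from the post):

```
index=main sourcetype=first_index
| stats count by source, host
```

Running `splunk btool inputs list --debug` on the forwarder can also show which file each effective `index =` setting comes from, in case a higher-precedence stanza is overriding the one in system/local.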
Hello again Spelunkers!

So I have data that looks like this:

assessment=normal [1.0]
assessment=normal [1.1]
assessment=suspect [0.75]
assessment=suspect [0.88]
assessment=bad [0.467]

I want a table column named rating that takes the "normal," "suspect," "bad" without the [###] after it. So I wrote the below, thinking I could name the column rating, capture any alpha characters, and terminate at the whitespace between the word value and the [###] value. What would be the correct way of writing this? Thank you in advance!

| rex field=raw_ "assessment=(?<rating>/\w/\s)"
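For reference, a corrected version of that extraction might look like the sketch below. Two things changed: the field is `_raw` (not `raw_`), and SPL regexes don't use `/.../` delimiters — `\w+` greedily captures the word characters and stops on its own at the space before the bracket:

```
| rex field=_raw "assessment=(?<rating>\w+)"
| table rating
```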
Greetings All, I am very new to Splunk and am creating a dashboard to show top non-compliances. For the below data, I want to display the top non-compliant controls (example output also mentioned below). Could anyone please let me know how I can write a search query for the same? Thanks in advance.

Event_ID: abc1
Compliance_result: Non-Compliant
Eval_results: {
  required_tags: { compliance: Compliant }
  encryption_enabled: { compliance: Non-Compliant }
  public_access: { compliance: Compliant }
  policy_enabled: { compliance: Compliant }
}

Event_ID: abc2
Compliance_result: Non-Compliant
Eval_results: {
  required_tags: { compliance: Compliant }
  encryption_enabled: { compliance: Non-Compliant }
  public_access: { compliance: Non-Compliant }
  policy_enabled: { compliance: Compliant }
}

Generate a table in the below format:

Top Non Compliance controls:
public_access - 2
encryption_enabled - 1
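One possible approach, as a sketch only: assuming the events are JSON-extracted (e.g. via `spath`) so that fields like `Eval_results.public_access.compliance` exist, a `foreach` over the wildcarded compliance fields can collect the names of the non-compliant controls, which are then counted. The index name is a placeholder:

```
index=<your_index> Compliance_result="Non-Compliant"
| spath
| foreach Eval_results.*.compliance
    [ eval controls=mvappend(controls, if('<<FIELD>>'="Non-Compliant", "<<MATCHSTR>>", null())) ]
| mvexpand controls
| stats count by controls
| sort - count
```

Here `<<MATCHSTR>>` is the part of the field name matched by the `*`, i.e. the control name; verify the `foreach` template behavior against the Search Reference for your version.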
Hello,

Can I please know how to pass a value from the output of the 1st query into the 2nd query? Any help would be appreciated.

1st query:
index=<index_name> sourcetype=<sourcetype_name>
| table k8s_label
| where k8s_label="id=<id_number>"

1st query output:
name=peter project_id=123 user_id=2700835661 zone=us-west-2a

2nd query:
index=<index_name> "server failed" Project_id=<need to get project_id from the result of the 1st query output>

Thanks
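A common pattern for this is a subsearch with `return` — a sketch only, assuming the `project_id` value can be pulled out of `k8s_label` with a `rex`, and that the outer field should be named `Project_id` as in the post:

```
index=<index_name> "server failed"
    [ search index=<index_name> sourcetype=<sourcetype_name> k8s_label="id=<id_number>"
      | rex field=k8s_label "project_id=(?<Project_id>\d+)"
      | return Project_id ]
```

`| return Project_id` hands the value back to the outer search as a `Project_id="123"` search term.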
Events are loaded with different currencies from different countries, and we are trying to build a view that converts them all into one currency. We have uploaded a CSV with the average exchange rate per month and would like to display a table that uses the event date to pick the rate from the CSV, since the conversion should always use the rates in effect for that month and should pick up new rates as we load each new month.
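A sketch of one way to key the lookup on the event's month — the field names `amount` and `currency`, and a lookup named `exchange_rates.csv` with columns `month`, `currency`, `rate`, are assumptions, not from the post:

```
index=<your_index>
| eval month=strftime(_time, "%Y-%m")
| lookup exchange_rates.csv month currency OUTPUT rate
| eval amount_converted=amount*rate
| table _time amount currency rate amount_converted
```

Because the rate is looked up per event at search time, loading a new month's rows into the CSV changes the output automatically.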
I'm having a difficult time understanding the difference between the Content Pack for Unix Dashboards and Reporting and the Content Pack for Monitoring Unix and Linux. I see that the Content Pack for Unix Dashboards and Reporting is available for use with IT Essentials Work, while the Content Pack for Monitoring Unix and Linux is only available with ITSI. What are the differences in features between the two?
I have an items visit log index with the fields category and item; each event is a visit. In addition, I have an index with all items in the system, in the form category, items_count. I want to create a timechart of categories: <category> -> <visited items>/<all items> over time.

What I did:

index="visited"
| eval cat_item = category."/".item
| timechart dc(cat_item) by category
| foreach * [ search index="cat" category="<<FIELD?>>" | eval <<FIELD>>='<<FIELD>>'/items_count ]

But this does not work. timechart creates a table with categories as columns, and each row contains the count of visited items. Now the problem is how I get the column name and the column value inside the subquery; in the examples, <<FIELD>> is used for the column name and the column value alike. Please help.
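A hedged alternative sketch that avoids running a search inside `foreach` (which SPL does not support): reshape the timechart with `untable`, attach the per-category totals, then pivot back with `xyseries`. It assumes the all-items index has been exported to a lookup named `category_totals.csv` with columns `category`, `items_count` (an assumption — e.g. via a scheduled `outputlookup`):

```
index="visited"
| eval cat_item = category."/".item
| timechart dc(cat_item) by category
| untable _time category visited
| lookup category_totals.csv category OUTPUT items_count
| eval ratio=visited/items_count
| xyseries _time category ratio
```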
Hi Team, I want to extract the AWS region from the host name. host="my-service-name-.ip-101-99-126-252-us-west-2c". I want to extract us-west-2 from the host. How can I achieve this?
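A regex along these lines should work — a sketch that assumes the host always ends with the region (two letters, a word, a digit) followed by a single availability-zone letter, as in the example above:

```
| rex field=host "(?<aws_region>[a-z]{2}-[a-z]+-\d)[a-z]$"
```

For the sample host this captures `us-west-2`, leaving the trailing `c` (the AZ suffix) out of the capture group.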
I have a dropdown with dynamic data, changing by the day, that I want filled in the dropdown for selection and use in the dashboard. I've followed several entries from the community, but the dropdown is blank, only showing the ALL from the 'choice' entry. Here is the SPL:

<fieldset submitButton="true">
  <input type="dropdown" token="tok_site" searchWhenChanged="false">
    <label>Site</label>
    <search>
      <query>earliest=-2h index=asset sourcetype=Armis:Asset
| stats count by site.name</query>
    </search>
    <choice value="*">ALL</choice>
    <default>*</default>
    <fieldForLabel>Site</fieldForLabel>
    <fieldForValue>Site</fieldForValue>
  </input>
</fieldset>

I will be adding a couple more dropdowns later, but they are dynamic as well. If I can't get one to work, well... Any suggestion on where I've made a mistake?
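For comparison, here is a version where <fieldForLabel> and <fieldForValue> name the field the search actually returns (site.name, produced by `stats count by site.name`) rather than Site — a mismatch there is a common cause of an empty dynamic dropdown:

```
<input type="dropdown" token="tok_site" searchWhenChanged="false">
  <label>Site</label>
  <search>
    <query>index=asset sourcetype=Armis:Asset earliest=-2h
| stats count by site.name</query>
  </search>
  <choice value="*">ALL</choice>
  <default>*</default>
  <fieldForLabel>site.name</fieldForLabel>
  <fieldForValue>site.name</fieldForValue>
</input>
```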
Hi Guys,

I have a scenario where I need to extract the file name from the event logs. The first line of the event log looks like this:

[INFO] 2021-09-30T00:04:17.052Z 8d5eb00a-d033-49a9-9d0f-c61011e4ae51 {"Records": [{"eventVersion": }]

Now I need to write a rex query to extract the file name "8d5eb00a-d033-49a9-9d0f-c61011e4ae51" from the above event log. This file name changes with every search query, along with the timestamp.

Can someone suggest how to resolve this?

Thanks.
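Since the value in the sample line has the shape of a UUID, one option is to match that shape rather than its position in the line — a sketch, assuming every file name follows the same 8-4-4-4-12 hex pattern:

```
| rex "(?<file_name>[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})"
```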
Hi, I need Splunk to be deployed using a Helm chart. Can anyone help me? Thanks.
Hi, I want to use the sendemail SPL command, but it gives me the error below:

command="sendemail", (535, '5.7.3 Authentication unsuccessful') while sending mail to: myemail@mydomain.com

My SPL command:

index=_internal
| head 5
| sendemail to="myemail@mydomain.com" server=mail.server.com subject="Here is an email from Splunk" message="This is an example message" sendresults=true inline=true format=raw sendpdf=true

FYI: the email settings in the web server configuration are already set correctly, and when I test with an alert it sends email correctly, but when I use the sendemail SPL command it does not work.

I also checked the config file and it is set correctly: /opt/splunk/etc/system/local/alert_actions.conf

Any ideas? Thanks.
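For reference, the SMTP authentication settings that sendemail reads live in alert_actions.conf under [email] — a sketch only; the host, port, and account values below are placeholders, and a 535 from the mail server usually means these credentials (rather than the Splunk-side search) are what is being rejected:

```
[email]
mailserver = mail.server.com:587
use_tls = 1
auth_username = splunk-alerts@mydomain.com
auth_password = <set through Splunk Web so it is stored encrypted>
```

Also worth checking: passing `server=` inline in the search overrides the configured mailserver, so the command may be talking to a different host or port than the working alert does.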
Does anyone know how long a universal forwarder takes to recheck the DNS entries of the servers listed in the outputs.conf file? If the servers are listed by server name and not IP in the outputs file, then Splunk goes out and resolves the IPs of those servers. I know it does this at forwarder startup, but does it also recheck periodically?

I am looking at a situation where the DNS entries for the backend servers get changed to new IPs (in a disaster recovery scenario) and want to know how long it would take the forwarders to start picking up the new IPs (without having to go out and cycle all of the forwarders that talk to the indexers whose IPs got changed). Thanks.
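If memory serves, outputs.conf has a dnsResolutionInterval setting that governs this re-resolution — please verify against the outputs.conf spec for your version; the stanza name and server values below are placeholders:

```
[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997
# Interval, in seconds, at which the forwarder re-resolves indexer DNS names.
# 300 is believed to be the default; confirm in the outputs.conf spec.
dnsResolutionInterval = 300
```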
Hello everyone, I have added an IP to local_intel_ip.csv and it now appears on the Threat Artifacts panel. The correlation search "Threat Activity Detected" is enabled with the Adaptive Response Actions Notable and Risk Analysis. A notable event was triggered with this IP as the destination IP, but the aforementioned notable (Threat Activity Detected) was never triggered. Any idea on what I might have done wrong? Thank you in advance. Chris
Is it possible to use data models from the Common Information Model for use cases in Splunk? If so, how can we do that?
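As a hedged illustration: once data is mapped to a CIM data model (for example via the Splunk Common Information Model add-on), use cases are typically built with tstats against the model. The sketch below assumes the Authentication data model is populated in your environment:

```
| tstats count from datamodel=Authentication where Authentication.action="failure" by Authentication.src
| sort - count
```

Because tstats runs against the (optionally accelerated) data model rather than raw events, the same use-case search works across any sourcetype that has been CIM-mapped.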
Hi, I am having difficulty breaking a JSON document into multiple events. Here is my log (it appears as one event instead of 2):

{
  "InstanceInformationList": [
    {
      "Version": false,
      "PlatformName": "Amazon Linux",
      "ComputerName": "ip-10-170-216-17.eu-east-1.compute.internal"
    },
    {
      "PlatformType": "Linux",
      "IPAddress": "10.170.216.18",
      "AssociationOverview": {
        "DetailedStatus": "Failed",
        "InstanceAssociationStatusAggregatedCount": {
          "Failed": 1,
          "Success": 1
        }
      },
      "AssociationStatus": "Failed",
      "PlatformVersion": "2",
      "ComputerName": "ip-10-170-216-18.eu-east-1.compute.internal",
      "InstanceId": "i-00000000001",
      "PlatformName": "Amazon Linux"
    }
  ]
}

And you can find my props.conf below:

[my_test]
SHOULD_LINEMERGE = false
INDEXED_EXTRACTIONS = json
DATETIME_CONFIG = CURRENT
TRUNCATE = 999999
JSON_TRIM_BRACES_IN_ARRAY_NAMES = true
BREAK_ONLY_BEFORE = (\[\s+\{)
MUST_BREAK_AFTER = (\},|\}\s+\])
SEDCMD-remove_header = s/(\{\s+.+?\[)//g
SEDCMD-remove_footer = s/\]\s+\}//g

Can you help me find the right parsing, please? Thank you.
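A hedged starting point to experiment with: INDEXED_EXTRACTIONS = json runs on the forwarder and tends to fight per-event breaking of a single JSON document, and BREAK_ONLY_BEFORE is only consulted when SHOULD_LINEMERGE = true, so it is being ignored here. Using LINE_BREAKER (whose capture group is discarded at the break) together with the SEDCMDs, which run after line breaking, may behave more predictably — the exact patterns below are untested guesses against the sample above:

```
[my_test]
SHOULD_LINEMERGE = false
DATETIME_CONFIG = CURRENT
TRUNCATE = 999999
# Break between array elements; the captured comma is discarded.
LINE_BREAKER = \}(,)\s*\{
SEDCMD-remove_header = s/^\{\s*"InstanceInformationList":\s*\[\s*//g
SEDCMD-remove_footer = s/\s*\]\s*\}$//g
```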
I created a new Splunk Enterprise instance that I want to connect to my already pre-existing main Enterprise instance, which holds the bulk of our data. The intention of having two is so I can track the heartbeat messages between the servers and alert when one or the other goes down. I already have the new instance connected to the old one through outputs.conf, and this gives me the ability to search for its heartbeat logs in index=_internal. However, connecting the main original instance to the new one is a different story. I have it forwarding to the new instance the same way, using outputs.conf, but I believe this is too much for the new instance to handle, as it is a ton of data (which I don't even want to go there). Is there a way to establish the connection so I can monitor for heartbeats, but not send any other data? What settings can I tweak to disable the sending of everything else but keep the connection between the two, without turning off indexing on the new instance, so I can monitor and alert when the old instance stops sending heartbeats because it has gone offline?
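One hedged idea for the main instance: outputs.conf supports forwardedindex filters, which can restrict forwarding to just the internal heartbeat data. Rule numbering matters (later rules override earlier ones), so verify this against the outputs.conf spec for your version; the group name and server value are placeholders:

```
[tcpout]
defaultGroup = new_instance

[tcpout:new_instance]
server = new-instance.example.com:9997
forwardedindex.filter.disable = false
# Drop everything, then let only _internal back through.
forwardedindex.0.blacklist = .*
forwardedindex.1.whitelist = _internal
```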
I'm trying to exclude specific src_ip addresses from the results of a firewall query (example below). The query completes; however, the src_ip addresses are not excluded and the following error is returned:

[subsearch]: The lookup table 'dns_serves.csv' requires a .csv or KV store lookup definition.

Example:

index=firewall
| search NOT [|inputlookup dns_serves.csv | fields src_ip]
| table src_ip dest_ip signature

When running |inputlookup dns_servers.csv by itself, the contents of the lookup are returned, so I know the lookup is good. I've checked the lookup permissions and CSV encoding, and searched forum threads for a solution.
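Worth noting: the error message names 'dns_serves.csv' while the working standalone test uses 'dns_servers.csv', so the subsearch may simply be referencing a lookup file that does not exist under that spelling. A sketch with the name matched to the working lookup:

```
index=firewall NOT [| inputlookup dns_servers.csv | fields src_ip]
| table src_ip dest_ip signature
```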
Hi there, experts. In our current environment we have a Splunk integration with the CA UIM monitoring tool, sending Splunk alerts to CA UIM for monitoring. While upgrading Splunk, we learned that the client has a customized app for this integration which was written in Python 2, and as we are upgrading from 7.3 to 8.1 there is an issue with Python compatibility, since new Splunk versions support only Python 3. Does anyone have any idea of a workaround app or add-on from Splunkbase that we can use for integrating Splunk with CA UIM? Please help.