I have 2 multivalue fields (old and new) containing group lists for one or more users. The new value is the list of groups that replaces the old groups. For example:

user1 has an old value of group1, group2, group3
user1 has a new value of group1, group2, group3, group4, and group5
user2 has an old value of group3, group4, group5
user2 has a new value of group4, group5, group6, group7, and group8

I'm trying to return group4 and group5 for user1 and group7 and group8 for user2.
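A sketch of one way to compute the per-user difference, assuming old and new are true multivalue fields with those names (mvmap and mvfind are eval functions available in Splunk 8.0+; the base search and user field name are placeholders):

```
basesearch
| eval added=mvmap(new, if(isnull(mvfind(old, "^".new."$")), new, null()))
| table user old new added
```

Inside mvmap, `new` refers to each element of the multivalue field in turn, and mvfind returns null when no value of `old` matches it, so `added` collects only the values present in new but not in old. Note that mvfind's second argument is a regex, so group names containing regex metacharacters would need escaping.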
Hi, I have a requirement to show a line chart comparing today's count against the previous day's. I have the SPL below, but while we see the data from yesterday and today, each graph line is separate. I want to see the lines together, one superimposed on the other. Please could you suggest how to compare them?

Current SPL:

basesearch earliest=-1d@d latest=now
| eval Day=if(_time<relative_time(now(),"@d"),"Yesterday","Today")
| timechart span=15m count by Day

(screenshots of the current and expected visualizations omitted)
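One common approach (a sketch, not tested against your data) is the timewrap command, which shifts each day's series onto the same x-axis so the lines overlay:

```
basesearch earliest=-1d@d latest=now
| timechart span=15m count
| timewrap 1d
```

timewrap names the resulting series by offset (labels like latest_day and 1day_before; the exact names depend on version), and the eval/by Day split is no longer needed because timewrap does the per-day separation itself.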
Hi all, for this sort of JSON string, how can I extract KeyA, KeyB, KeyC?

{
  "KeyA": [
    { "path": "/attibuteA", "op": "replace", "value": "hello" },
    { "path": "/attibuteB", "op": "replace", "value": "hi" }
  ],
  "KeyB": [
    { "path": "/attibuteA", "op": "replace", "value": "" },
    { "path": "/attibuteC", "op": "replace", "value": "hey" },
    { "path": "/attibuteD", "op": "replace", "value": "hello" }
  ],
  "KeyC": [
    { "path": "/attibuteE", "op": "replace", "value": "" }
  ]
}

My ideal output would look like:

Key    path       op       value
KeyA   attibuteA  replace  hello
KeyA   attibuteB  replace  hi
KeyB   attibuteA  replace  (empty)
KeyB   attibuteC  replace  hey
KeyB   attibuteD  replace  hello
KeyC   attibuteE  replace  (empty)

Many thanks!
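A hedged sketch using the eval JSON functions (available in Splunk 8.1+), assuming the JSON string is in _raw; this iterates over the top-level keys without hard-coding them:

```
basesearch
| eval key=json_array_to_mv(json_keys(_raw))
| mvexpand key
| eval entry=json_array_to_mv(json_extract(_raw, key), false())
| mvexpand entry
| eval path=json_extract(entry, "path"),
       op=json_extract(entry, "op"),
       value=json_extract(entry, "value")
| table key path op value
```

json_keys lists the top-level key names, the first mvexpand gives one row per key, json_extract pulls that key's array, and the second mvexpand gives one row per array element before the inner fields are extracted.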
| table Status, timeval, CompanyCode, CN
| appendpipe [stats count | eval error="thats not cool" | where count==0 | table error | fields - Status, timeval, CompanyCode, CN]

These are the last two lines of a search. If the fields (Status, timeval, CompanyCode, CN) have no values, i.e. all the fields are empty, then I have to display a message, which in this case is "thats not cool". It is working, but in the result all the empty fields are also displayed. I want only the error field shown when the other fields are empty. Can anyone help?
Hello, I am trying to create some dashboards in ES and some other apps. For convenience I would like to be able to access them from the app drop-down menu, but I can't find a way to do so. Can someone tell me if this is even possible? If yes, how? P.S. We are using a Splunk Cloud deployment.
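For context, each app's menu comes from its navigation XML (data/ui/nav/default.xml), which in Splunk Cloud can be edited under Settings > User interface > Navigation menus. A minimal sketch of adding dashboards to a drop-down (the view names here are hypothetical and must match your dashboards' IDs):

```xml
<nav search_view="search">
  <view name="search" default="true"/>
  <collection label="My Dashboards">
    <view name="my_es_overview"/>
    <view name="my_email_report"/>
  </collection>
</nav>
```

The collection element is what renders as a drop-down menu in the app bar.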
Hello experts, I currently have a CSV file that contains fields such as ID, IP, OS, _time, status, etc. I need to create a metric index. Do I need to change the field names in the CSV file to align with Splunk's expectations, or can I import the data as is? I'd appreciate any guidance or examples of how to achieve this. Thanks in advance.
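For reference, Splunk's CSV-to-metrics ingestion (the built-in metrics_csv sourcetype, described in the "Get metrics in from CSV files" docs) expects a metric_timestamp column plus metric name/value columns, and treats the remaining columns as dimensions. A hedged sketch of what a reshaped file might look like (the metric name and dimension columns here are assumptions based on the fields you listed):

```
metric_timestamp,metric_name,_value,ID,IP,OS
1701457200,system.status,1,host-01,10.0.0.1,linux
1701457260,system.status,0,host-02,10.0.0.2,windows
```

Note that metrics require a numeric measurement per row, so a text field like status would either become the numeric _value (encoded, as sketched here) or remain a dimension.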
Hi, have a requests to restore 40weeks logs from dynamic data archive storage data for one of the index on splunk cloud.may i know process and best practices if any
The ssl is enabled and can not change when using Splunk Clound free trial, where I can find/download the certificate.
How can Splunk query which IPs have been requested continuously for more than 3 days? Also, there are multiple values in the firewallSource field; how can we know which IPs have both WAF and ATE in their requests during a certain time period?
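A sketch covering both parts, with the caveat that the IP field name (src_ip) and time window are assumptions, and that dc(day) counts days with any activity rather than strictly consecutive days (true consecutiveness would need streamstats):

```
basesearch earliest=-7d
| bin span=1d _time as day
| stats dc(day) as active_days, values(firewallSource) as sources by src_ip
| where active_days > 3
    AND isnotnull(mvfind(sources, "^WAF$"))
    AND isnotnull(mvfind(sources, "^ATE$"))
```

stats values() collects every firewallSource value seen for an IP into one multivalue field, and the two mvfind checks keep only IPs that have both WAF and ATE present.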
I'm currently working on crafting a Splunk query to identify systems that have been inactive for a specified duration (which can vary based on user requirements). My intention is to use Windows event logs as the data source, focusing on EventCode=4624 (successful logon). Primarily, I'll be manipulating the default field _time, as there isn't another relevant field available. I'd appreciate any guidance or suggestions you might have in this regard.
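A sketch of the usual shape of such a query (the index name, 30-day search window, and 7-day inactivity threshold are all assumptions to adjust):

```
index=wineventlog EventCode=4624 earliest=-30d
| stats latest(_time) as last_logon by host
| eval days_inactive=round((now() - last_logon) / 86400, 1)
| where days_inactive >= 7
| convert ctime(last_logon)
| sort - days_inactive
```

One caveat: a host with no 4624 events at all inside the search window won't appear in the results, so for complete coverage you would typically compare against a known-hosts lookup.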
Hello, I am trying to determine why my table on Dashboard Studio is showing "No Data" when it shows on Dashboard Classic. I referenced the report in the code, and I am using a token for data input. Whenever I open it in search, it pulls up all the data I need, but it just does not show in the dashboard.

{
    "type": "ds.savedSearch",
    "options": {
        "ref": "E.1_Malicious_Emails_Inbound"
    }
}

I also checked the app permissions: the report and dashboard are in the same app, and the report is readable. Just no data. Has anyone run into an issue like this?
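One thing to check: in Dashboard Studio, a ds.savedSearch datasource runs the report exactly as saved, so dashboard tokens are generally not substituted into it. If the report's SPL relies on a token, one workaround is to inline the query as a ds.search datasource instead; a sketch (the SPL string and token names here are placeholders, not your actual report):

```json
{
    "type": "ds.search",
    "options": {
        "query": "index=email sourcetype=inbound $filter_tok$ | stats count by sender",
        "queryParameters": {
            "earliest": "$global_time.earliest$",
            "latest": "$global_time.latest$"
        }
    }
}
```

The queryParameters block ties the datasource to the dashboard's global time picker, which is the other common cause of "No Data" when a saved search has its own fixed time range.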
Hello, I'm trying to install Splunk ITSI 4.17.1 in a Search Head Cluster with Splunk Enterprise 9.1.2. I already extracted the .spl into the directory $SPLUNK_HOME/etc/shcluster/apps, but when I execute the command splunk apply shcluster-bundle, it reports that it has deployed everything correctly, yet when I go to the search heads, none of the ITSI apps are deployed. I just made a test deploying another simple app for testing purposes, and it worked. Do you have any ideas?
Here is a snippet of the URL I am sending and the time format in which it needs to be:

startTime=2023-12-01T16%3A27%3A45.000Z&endTime=2023-12-01T16%3A32%3A45.000Z

However, when I try to send "latesttime" or "earliesttime", Splunk sends them in epoch. How do I get the proper time format for the URL within the workflow action? Thanks!
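Since workflow actions substitute field values verbatim, one approach (a sketch; the field names and the 5-minute offset are hypothetical) is to build pre-formatted string fields in the search and reference those in the URI instead of the epoch fields:

```
basesearch
| eval startTime=strftime(_time - 300, "%Y-%m-%dT%H:%M:%S.%3NZ"),
       endTime=strftime(_time, "%Y-%m-%dT%H:%M:%S.%3NZ")
```

Then use $startTime$ and $endTime$ in the workflow action URI; strftime's %3N emits milliseconds and the trailing Z is a literal. Depending on how the workflow action encodes values, you may still need to percent-encode the colons yourself, e.g. with replace(startTime, ":", "%3A").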
Hello, I'm trying to find information on how to use Splunk with Visual Studio Code. I have an authentication token on my development instance. I've installed the Visual Studio Code Extension for Splunk from GitHub. I'm lost from here on. What do I enter in the url and webRoot fields in the launch.json file?

"configurations": [
    {
        "type": "chrome",
        "request": "launch",
        "name": "Launch Chrome against localhost",
        "url": "https://<host name>:8080",
        "webRoot": "${workspaceFolder}"
    }
]

This opens Splunk in my Chrome browser, but it is an empty search field. I created a .splnb file in VS Code, but when I run it, I receive ERROR: Unauthorized. Thanks in advance for any direction provided. God bless, Genesius
Hello, everyone! Currently, I have the Splunk Add-on for Unix and Linux version 8.1.0 installed on my heavy forwarder. I need to upgrade it to the latest version, and I am seeking recommendations on how to carry out this process. Additionally, I would appreciate guidance on using the deployment server to distribute the update to the universal forwarders. God bless. Regards
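For the deployment-server part, distribution is configured in serverclass.conf (or through Forwarder Management in the UI). A minimal sketch, assuming the add-on directory Splunk_TA_nix is placed in $SPLUNK_HOME/etc/deployment-apps and the server class name and whitelist are placeholders for your environment:

```
[serverClass:nix_hosts]
whitelist.0 = *

[serverClass:nix_hosts:app:Splunk_TA_nix]
restartSplunkd = true
stateOnClient = enabled
```

After editing, `splunk reload deploy-server` pushes the updated app to matching clients; restartSplunkd makes the forwarders restart so the new inputs take effect.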
Hello! Our Splunk server receives DC logs on a daily basis from another network team. Under Files & Directories in Data Inputs, I have the file path for those logs configured to be continuously monitored, since we receive them from another organization. I set a custom index for those logs, but it's not showing any data in that index. I've verified that it's not a permissions issue. I decided to manually upload one of those files into Splunk and noticed that they are .tsidx files. After uploading, I wasn't able to read any of the data in the .tsidx file. Is that normal? Am I doing anything incorrectly? We need to be able to audit those DC logs. Thanks in advance!
Hello, I'm working on testing something, but I'm not sure exactly what would be the best solution. What I am trying to do is: using the timepicker, have a panel that loads IDs. Then I'd like another panel to search over the same timespan, in a different dataset, but only for the IDs from the first panel. Is there a way to pass the results of a search that runs on page load to another search, maybe with a token (or tokens)? The catch is that there may be a single ID or there may be many. It would have to be a boolean of some sort, I believe, unless there's a better way to search one-to-many instances of something.

My thinking is something like:

search 1:
<base search> | stats count by MSGID | fields - count

that populates a <tok> on page load (or time selection), but the results would have to be formatted like:

654165464 OR MSGID=584548549494 OR MSGID=54654645645

search 2:
<base search2> MSGID=<tok> | stats count by MSGID | fields - count

Is this something that can be done? What might I have to do to accomplish this? Thanks for the assistance!
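One common pattern (a sketch; the field and token names here are hypothetical) is to have the first search emit a single row whose field already contains the OR-joined expression:

```
<base search>
| stats values(MSGID) as MSGID
| eval msgid_filter="MSGID=" . mvjoin(MSGID, " OR MSGID=")
```

In Simple XML, a `<done><set token="msgid_tok">$result.msgid_filter$</set></done>` block on that search captures the value, and the second panel can then search `<base search2> ($msgid_tok$) | stats count by MSGID`. Because mvjoin only inserts the separator between values, this works whether there is one MSGID or many.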
Splunk is pleased to announce the latest enhancements to Edge Processor that will help optimize your data management needs to filter, mask, and route more intelligently. These include support for Lookups, RawHEC, and Cryptographic Functions.

Enhanced HEC capability, now with RawHEC: To complement the earlier release of data ingest and export using HTTP Event Collector (HEC), Splunk Edge Processor can now receive events via a raw HEC endpoint. Third-party cloud services can send data to Splunk without having to conform to a Splunk-specific format or account for the inability to change code formats with an authentication token. That means you can push more data into Splunk, at high bandwidth, and with control over the data schema. To learn more, check out Splunk Docs.

Enhanced masking with Cryptographic Functions: Support for cryptographic functions builds on the masking capabilities in Edge Processor and helps ensure data integrity and confidentiality before data leaves your network boundaries. Where previously you could only redact data using Edge Processor, now you can hash data of your choosing and perform analytics on top of it. For example, you might hash sensitive data like a credit card number before that data streams to the Splunk platform, while still being able to know how many transactions happened using that card over the past month. Now you can generate new insights and value from your data.

New Lookups: Edge Processor now supports lookups, enabling users to perform better data enrichment and, in turn, make more informed decisions about data before it is indexed. This means you can now identify, for example, which device failed even when that detail is not in the event itself but saved in another file. You can enrich data further up the pipeline before indexing to accelerate detection, and you can append information to an event without having to do additional research.
For more about Splunk Edge Processor, including plans to support additional sources, destinations, and new functionality, see release notes and documentation.  And be sure to join the Slack user group #edge-processor to get real-time support from the Community. 
Hi! I received an event with the following time string:

2023-12-12T13:39:25.400399Z CEF:0.....

This time is already in the correct timezone, but because of the Z, Splunk adds 5 hours. I understand that Z is a timezone indicator, but how can I ignore it? The flow of this event is: Source --> HF --> Indexers. On the HF and indexers I don't have any props or transforms settings. On the search heads I extract a few fields from this event and it works, but I can't extract this time correctly without the Z. I put the following regex inside props.conf on my SHs, and I also tried putting it in the indexers' props.conf:

TIME_PREFIX = ^\d{2,4}-\d{1,2}-\d{1,2}T\d{1,2}:\d{1,2}:\d{1,2}\.\d{1,6}

I tried to add TZ or TZ_ALIAS inside props.conf, but with no effect. Where could I be wrong? Thanks
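For what it's worth, timestamp recognition is index-time work: it happens at the first full Splunk instance in the path (the HF here), so these settings have no effect on the search heads. A hedged props.conf sketch that parses up to the microseconds and stops before the Z, letting TZ override the zone (the sourcetype stanza name and timezone are placeholders):

```
[your:cef:sourcetype]
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%6N
MAX_TIMESTAMP_LOOKAHEAD = 30
TZ = America/New_York
```

Note that TIME_PREFIX is the pattern that comes before the timestamp (here just start-of-line), not the timestamp itself, and already-indexed events keep their old _time; only newly indexed data is affected.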
Hi, I am facing an issue where no recent logs are found for sourcetype=abc:xyz (example) and index=pqr (example) after 25th November. We are able to see the logs up to 25th Nov. Please guide me on how to check this; it would be helpful. Thanks, Pooja
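A quick first check (a sketch; substitute your real index and sourcetype) is to confirm from the index metadata exactly when and from which hosts data stopped arriving:

```
| tstats latest(_time) as last_event where index=pqr sourcetype=abc:xyz by host, sourcetype
| convert ctime(last_event)
```

If a host's last_event sits right at 25th November, the next place to look is that forwarder's _internal logs (e.g. index=_internal host=<that host> log_level=ERROR OR log_level=WARN) around that date for connection or file-monitoring errors.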