All Topics

Hello Splunkers! I'm trying to upgrade my Splunk Enterprise from 9.0.x to 9.1.x. After checking the release notes, I saw that I need to add the following (reference link: https://docs.splunk.com/Documentation/Splunk/9.1.2/Installation/AboutupgradingREADTHISFIRST). I did that and proceeded with the upgrade, but I still received an error regarding UTF-8, even though I added the required line. Any suggestions for what I might do to overcome this issue?
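If the error concerns the operating-system locale (the usual UTF-8 complaint on 9.1 upgrades), it can help to verify the locale of the account that starts Splunk before retrying. A minimal shell sketch; the locale name en_US.UTF-8 is an assumption, so pick whatever UTF-8 locale your distribution provides:

locale                         # inspect current LANG / LC_ALL values
export LANG=en_US.UTF-8        # set for the Splunk user's environment
export LC_ALL=en_US.UTF-8
$SPLUNK_HOME/bin/splunk restart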
Hello, how can I enable the mouse-hover feature on a column chart so it shows its data in Dashboard Studio? I have been searching for an answer but haven't found anything that works. Many thanks.

index=web AND uri_path!="*.nsf*" AND uri_path!="*:443" | timechart span=1d dc(src_ip) by src_ip limit=0
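As a side note on the search itself: dc(src_ip) split by src_ip always yields 1 per series, which leaves a hover tooltip with little to show. A sketch of the search without the self-referential split (field names taken from the original):

index=web AND uri_path!="*.nsf*" AND uri_path!="*:443"
| timechart span=1d dc(src_ip) AS distinct_sources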
I want to make a box-plot graph using my data. I tried to find a solution, but it requires installing an app from a file in Splunk, so I couldn't apply it to my apps (because "my Apps" and "install app" are different). Is there any way to draw a box plot in my own app without using "Install app from file"?
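If installing a visualization app isn't an option, the five-number summary behind a box plot can be computed in plain SPL and rendered with a standard chart. A minimal sketch, assuming a numeric field response_time split by host (both names are hypothetical):

... | stats min(response_time) AS lower, p25(response_time) AS q1, median(response_time) AS q2, p75(response_time) AS q3, max(response_time) AS upper BY host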
We are trying to ingest a large volume (petabytes) of data into Splunk. The events are in JSON files named like 'audit_events_ip-10-23-186-200_1.1512077259453.json'. The pipeline is: JSON files > folder > UF > HF cluster > indexer cluster.

UF inputs.conf:

[batch:///folder]
_TCP_ROUTING = p2s_au_hf
crcSalt = <SOURCE>
disabled = false
move_policy = sinkhole
recursive = false
whitelist = \.json$

We are seeing that events from specific files (NOT all) are getting duplicated: some files are indexed exactly twice. Since this is a [batch://] input, which is supposed to delete each file after reading it, and crcSalt = <SOURCE> is set, we are NOT able to figure out why and what creates the duplicates. We would appreciate any help, references, or pointers. Thanks in advance!
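One pattern worth ruling out, offered as a sketch rather than a diagnosis: with crcSalt = <SOURCE>, identical content delivered under two different file names is indexed twice, because the file path is folded into the CRC. If the producer ever writes the same events into more than one file name, dropping the salt and lengthening the content CRC instead would deduplicate them (initCrcLength is a real inputs.conf setting; 1024 is an arbitrary choice):

[batch:///folder]
_TCP_ROUTING = p2s_au_hf
move_policy = sinkhole
recursive = false
whitelist = \.json$
initCrcLength = 1024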
Hello, I am not getting the expected field-value pairs with the following props and transforms configuration. Sample events and my configuration files are given below. Any recommendation will be highly appreciated.

My configuration files:

props.conf
[mypropsfile]
REPORT-mytranforms = myTransfile

transforms.conf
[myTransfile]
REGEX = ([^"]+?):\s+([^"]+?)
FORMAT = $1::$2

Sample events:

2023-11-15T18:56:30.098Z, User ID: 90A, User Type: TempEMP,  Product Code:  pc, UAT:  UTA-True, Event Type:  TEST,  EventID:  Lookup, Remote Host: 25.191.157.244
2023-11-15T18:56:29.098Z, User ID: 90A, Host:  vx2tbax.dev, User Type: TempEMP,  Product Code:  pc, UAT:  UTA-True, Event Type:  TEST,  EventID:  Lookup, Remote Host: 25.191.157.244
2023-11-15T18:56:28.098Z, User ID: 91B, User Type:  TempEMP,  Product Code:  pc, UAT:  UTA-True, Event Type:  TEST,  EventID:  Lookup, Remote Host: 25.191.157.244
2023-11-15T18:56:27.098Z, User ID: 91B, User Type:  TempEMP,  Product Code:  pc, UAT:  UTA-True, Event Type:  TEST,  EventID:  Lookup, Remote Host: 25.191.157.244
2023-11-15T18:56:27.001Z, User ID: 91B, User Type:  TempEMP,  Host:  vx2tbax.dev, Product Code:  pc, UAT:  UTA-True, Event Type:  TEST,  EventID:  Lookup, Remote Host: 25.191.157.244
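The posted regex has two problems: both groups are lazy with nothing forcing the second one to consume the value, and [^"] doesn't stop at commas, so captures can bleed across pairs. A sketch of a corrected transform, assuming keys never contain commas or colons and values end at the next comma or end of line:

[myTransfile]
REGEX = ([^,:]+?):\s+([^,]+)
FORMAT = $1::$2

With CLEAN_KEYS left at its default, a key like "User ID" becomes the field name User_ID.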
I am attempting to ingest an XML file but am getting stuck; can someone please help? The data will ingest if I remove "BREAK_ONLY_BEFORE = \<item\>", but not with a new event per item. This is the XML and the configuration I have tried:

<?xml version="1.0" standalone="yes"?>
<DocumentElement>
<item>
<hierarchy>ASA</hierarchy>
<hostname>AComputer</hostname>
<lastscandate>2023-12-17T11:08:21+11:00</lastscandate>
<manufacturer>VMware, Inc.</manufacturer>
<model>VMware7,1</model>
<operatingsystem>Microsoft Windows 10 Enterprise</operatingsystem>
<ipaddress>168.132.11.200</ipaddress>
<vendor />
<lastloggedonuser>JohnSmith</lastloggedonuser>
<totalcost>0.00</totalcost>
</item>
<item>
<hierarchy>ASA</hierarchy>
<hostname>AComputer</hostname>
<lastscandate>2023-12-17T12:20:21+11:00</lastscandate>
<manufacturer>Hewlett-Packard</manufacturer>
<model>HP Compaq Elite 8300 SFF</model>
<operatingsystem>Microsoft Windows 8.1 Enterprise</operatingsystem>
<ipaddress>168.132.136.160</ipaddress>
<vendor />
<lastloggedonuser>JohnSmith</lastloggedonuser>
<totalcost>0.00</totalcost>
</item>
<item>
<hierarchy>ASA</hierarchy>
<hostname>AComputer</hostname>
<lastscandate>2023-12-17T11:54:28+11:00</lastscandate>
<manufacturer>HP</manufacturer>
<model>HP EliteBook 850 G5</model>
<operatingsystem>Microsoft Windows 10 Enterprise</operatingsystem>
<ipaddress>168.132.219.32, 192.168.1.221</ipaddress>
<vendor />
<lastloggedonuser>JohnSmith</lastloggedonuser>
<totalcost>0.00</totalcost>
</item>
<item>
<hierarchy>ASA</hierarchy>
<hostname>AComputer</hostname>
<lastscandate>2023-12-17T11:50:20+11:00</lastscandate>
<manufacturer>VMware, Inc.</manufacturer>
<model>VMware7,1</model>
<operatingsystem>Microsoft Windows 10 Enterprise</operatingsystem>
<ipaddress>168.132.11.251</ipaddress>
<vendor />
<lastloggedonuser>JohnSmith</lastloggedonuser>
<totalcost>0.00</totalcost>
</item>

inputs.conf
[monitor://D:\SplunkImportData\SNOW\*.xml]
sourcetype = snow:all:devices
index = asgmonitoring
disabled = 0

props.conf
[snow:all:devices]
KV_MODE = xml
BREAK_ONLY_BEFORE = \<item\>
SHOULD_LINEMERGE = false
DATETIME_CONFIG = NONE
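One thing that may explain the behaviour: BREAK_ONLY_BEFORE only takes effect when SHOULD_LINEMERGE = true; with line merging off it is ignored. An alternative is to break events with LINE_BREAKER directly. A sketch under that assumption (the lookahead keeps <item> at the start of each event, while the captured whitespace is discarded as the boundary):

[snow:all:devices]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+\s*)(?=<item>)
KV_MODE = xml
DATETIME_CONFIG = NONE
TRUNCATE = 10000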
Every time I create a table visualization, I notice that the value 0 is always aligned on the left side while the rest are aligned on the right side (322, 3483, 0, 0 are in the same column). Is there any reason behind this, and any way to fix it? Thanks!
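Splunk tables left-align strings and right-align numbers, so a mixed column usually means the zeros arrive as strings (for example from a fillnull or an eval that emits "0"). A sketch of the usual fix, where count stands in for your column name:

... | eval count = tonumber(count)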
Hi, we are ingesting Azure NSG flow logs and visualizing them using the Microsoft Azure App for Splunk (https://splunkbase.splunk.com/app/4882). The data is in JSON format with multiple levels/records in a single event; each record can have multiple flows, flow tuples, etc. (Screenshots were attached for context; they show Splunk extracting values only from the first highlighted flow-tuple entry.) Default extractions for the main JSON fields look fine, but when it comes to values within the flow-tuple field, i.e. records{}.properties.flows{}.flows{}.flowTuples{}, Splunk only keeps values from the very first entry. How can I make the src_ip and dest_ip fields also get multiple values (across all records/flow tuples)?

Here is the extraction logic from the app:

[extract_tuple]
SOURCE_KEY = records{}.properties.flows{}.flows{}.flowTuples{}
DELIMS = ","
FIELDS = time,src_ip,dst_ip,src_port,dst_port,protocol,traffic_flow,traffic_result

Thanks,
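As the post observes, the DELIMS-based transform only consumes the first value of the multivalued SOURCE_KEY. One search-time workaround is to expand the tuples and re-split them in SPL; a sketch, assuming the default JSON extraction yields the multivalued flowTuples path (the positional indexes follow the FIELDS order above):

... | spath path=records{}.properties.flows{}.flows{}.flowTuples{} output=tuple
| mvexpand tuple
| eval parts = split(tuple, ",")
| eval src_ip = mvindex(parts, 1), dst_ip = mvindex(parts, 2), dst_port = mvindex(parts, 4)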
We use the free version of syslog-ng, and recently we got a requirement to run TLS on top of TCP, which we don't have the knowledge to implement. We therefore wonder whether anybody has migrated from syslog-ng to SC4S (https://splunkbase.splunk.com/app/4740) to meet more advanced requirements such as TCP/TLS. If so, what were your lessons learned and motivations for doing so?
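For what it's worth, SC4S drives TLS listening from its env_file; a sketch, with the caveat that the exact variable names should be verified against the SC4S documentation for your release:

# /opt/sc4s/env_file (variable names assumed; verify for your SC4S version)
SC4S_SOURCE_TLS_ENABLE=yes
SC4S_LISTEN_DEFAULT_TLS_PORT=6514
# the server certificate and key are typically mounted under /opt/sc4s/tls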
Where is the data from the Splunk Enterprise Security (ES) Investigation panel stored? In the previous version it seemed to be stored in a KV Store lookup, but I can't find it in the current 7.x version. I understand that the notable index holds information related to incidents from the Incident Review dashboard. How can we map Splunk notables and their investigations together to generate a comprehensive report in the current 7.x ES version?
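One way to hunt for where it lives now is to enumerate the KV Store collections visible to ES via the REST endpoint; a sketch, where the title filter is just a guess at the naming:

| rest /servicesNS/-/-/storage/collections/config
| search title=*investigat* OR title=*canvas*
| table title eai:acl.app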
Hi Splunkers, I have a strange situation with some universal forwarders. On some Windows hosts, a colleague installed the UF using the graphical wizard. Those forwarders must be managed with a deployment server. He did NOT use the "customize" options, so he did not set which logs must be sent to the HF (Application, Security, and so on) or a destination HF/indexers. He only entered:

- the admin username and password
- the deployment server IP address and port

As written above, he did not enter an HF and/or indexers; the idea is that once the UF has spoken with the deployment server, two apps containing inputs.conf and outputs.conf are downloaded, and after that, logs are sent. On the deployment server (we checked), the apps that should be downloaded by the UF have been created and contain the above two files. So why did I write "the apps that should be downloaded"? Because logs are not collected and sent to the HF. We performed some troubleshooting and found that the apps have not been downloaded: on the hosts where the UF is installed, if we go to $SplunkUFHOME$\etc\apps, the two apps are not present. That means no custom inputs.conf or outputs.conf is present on the UF; only the defaults provided with the installation are there. The first thing we thought was a network issue, but it seems not: from the host with the UF we are perfectly able to ping and telnet the deployment server on its port. At the same time, on the firewall that manages this traffic we see no evidence in its logs of blocked/truncated connections. The UF can reach the DS and vice versa without issues. We even tried manually copying the app folders onto the UF (I know, a very bad thing, don't blame me please...), but the situation stays the same. So the question is: if no network issues are present, what can be the root cause of the apps not being downloaded?
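A couple of UF-side checks that usually narrow this down (the install path is the Windows default; adjust if needed):

"C:\Program Files\SplunkUniversalForwarder\bin\splunk" show deploy-poll
"C:\Program Files\SplunkUniversalForwarder\bin\splunk" btool deploymentclient list --debug

The first confirms which deployment server the UF is actually polling; the second shows where that setting comes from. Then check splunkd.log on the UF for DeploymentClient messages: a client that connects successfully but matches no server class in serverclass.conf on the DS downloads nothing, which would look exactly like this.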
Hi all, I have set up an indexer cluster to achieve high availability at the ingestion phase. I'm aware of "Update peer configuration" and I have reviewed the instructions under the Details tab of the SentinelOne App. I cannot see an explicit mention of an indexer-cluster setup. What are the steps to set up the input configuration for an indexer cluster while avoiding data duplication? Thanks for your help.
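The usual pattern (a sketch, not taken from the app's docs) is to run API-polling inputs like this app's on a single heavy forwarder rather than on the cluster peers, so each event is fetched from the API exactly once, and let the forwarder auto-load-balance across the peers:

# outputs.conf on the heavy forwarder that runs the SentinelOne input
# (host names and port are placeholders)
[tcpout]
defaultGroup = s1_indexers

[tcpout:s1_indexers]
server = idx1.example.com:9997, idx2.example.com:9997, idx3.example.com:9997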
Hi there, logs sent to SC4S include the date, time, and host in the event; however, when they are sent to the indexer, the date, time, and host are missing. How can I get them back so the logs look exactly the same? I would like the date, time, and host included in the event. I appreciate any hints. Thanks and regards, pawelF
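What you are likely seeing is SC4S parsing the syslog header into metadata (the timestamp becomes _time, the host becomes the host field) instead of leaving it in _raw. If you want the original message verbatim, SC4S has an option to keep the raw message; a sketch, with the variable name worth verifying against the SC4S docs for your release:

# /opt/sc4s/env_file
SC4S_SOURCE_STORE_RAWMSG=yes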
How can I convert a Splunk event to STIX 2.1 JSON? I am thinking about connecting to a SOC center, and I currently use Splunk Enterprise. How can I do this? Is there any app that can convert?
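If no app fits, one route is to export events (for example via a saved search or the REST API) and map the fields yourself with the OASIS stix2 Python library; a sketch, where the event field names are assumptions:

from stix2 import Indicator

# hypothetical Splunk event with a suspicious source IP
event = {"src_ip": "203.0.113.7", "_time": "2023-12-17T11:08:21Z"}

indicator = Indicator(
    name="Suspicious source IP from Splunk",
    pattern=f"[ipv4-addr:value = '{event['src_ip']}']",
    pattern_type="stix",  # required property in STIX 2.1
    valid_from=event["_time"],
)
print(indicator.serialize(pretty=True))  # emits STIX 2.1 JSON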
Hi. We are seeing weird behaviour on one of our universal forwarders. We have been sending logs from this forwarder for quite a while, and this has been working properly the entire time. New logfiles are created every second hour, and log lines are appended to the newest file. Last night the universal forwarder stopped working normally: every time a new file is created, the forwarder sends the first line to Splunk, but lines appended later are not forwarded. There are no errors logged in splunkd.log on the forwarder, nor any error messages on the receiving index servers. As far as I can see, there have been no changes on the forwarder or on the Splunk servers that might cause this defect. Is there any way to debug the parsing of the logfile on the forwarder to identify the issue? Any other ideas about what the issue could be? Thanks.
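Two forwarder-side checks that can shed light on the tailing behaviour (paths are the Linux defaults; both CLI commands exist on a stock UF):

# ask the UF what it knows about the file: read offsets, open/closed state
$SPLUNK_HOME/bin/splunk list inputstatus

# temporarily raise tailing verbosity, then watch splunkd.log
$SPLUNK_HOME/bin/splunk set log-level TailingProcessor -level DEBUG
tail -f $SPLUNK_HOME/var/log/splunk/splunkd.log | grep -i tailing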
The host value in the file below gets changed automatically every now and then. Can you help me write a bash script that checks the host value every 5 minutes and, if the value is different from the actual hostname (as in "uname -n"), automatically corrects the host value, saves the file, and then restarts the Splunk service?

cat /opt/splunk/etc/system/local/inputs.conf
[default]
host=iorper-spf52
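A minimal sketch of such a script, assuming the file always holds a single host= line with no spaces around the equals sign (as in the snippet above) and that cron provides the 5-minute schedule:

#!/usr/bin/env bash
# fix_splunk_host.sh - keep host= in inputs.conf in sync with uname -n
set -euo pipefail

CONF=/opt/splunk/etc/system/local/inputs.conf
ACTUAL=$(uname -n)
CURRENT=$(grep -m1 '^host=' "$CONF" | cut -d= -f2)

if [ "$CURRENT" != "$ACTUAL" ]; then
    sed -i "s/^host=.*/host=${ACTUAL}/" "$CONF"   # correct the value in place
    /opt/splunk/bin/splunk restart                 # pick up the change
fi

Scheduled with a crontab entry such as: */5 * * * * /usr/local/bin/fix_splunk_host.sh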
Hi at all, I have to parse Juniper switch logs that are very similar to Cisco IOS. In the Juniper Add-on there isn't anything for parsing these logs, so I have to create a new add-on. Has anyone already done this and can give me some hints, so I can avoid reinventing the wheel? Ciao. Giuseppe
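In case a starting point helps while waiting for pointers, a skeleton props.conf stanza for a new syslog-style sourcetype; every value here is an assumption to adapt to the actual Juniper format:

[juniper:switch]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TIME_PREFIX = ^
MAX_TIMESTAMP_LOOKAHEAD = 32
TRUNCATE = 10000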
Hi, I am running Splunk 9.0.7 and the add-on Splunk_TA_MS_Security version 2.1.1. I followed the add-on instructions at https://docs.splunk.com/Documentation/AddOns/released/MSSecurity/Configure and reviewed the Microsoft article https://learn.microsoft.com/en-us/microsoft-365/security/defender/api-hello-world?view=o365-worldwide. Basically, I created an app registration in our Azure tenant, added the required permissions, and created a secret. With all this in place, I followed the Microsoft article and ran the PowerShell scripts to test the connection, and the token I obtain only gets a single permission. Could someone tell me what I am doing wrong? I expected to get all the permissions assigned to the application, and I think that is why I get the 403 errors in splunkd.log:

12-17-2023 13:14:32.037 +0100 ERROR ExecProcessor [19404 ExecProcessor] - message from ""C:\Program Files\Splunk\bin\Python3.exe" "C:\Program Files\Splunk\etc\apps\Splunk_TA_MS_Security\bin\microsoft_defender_endpoint_atp_alerts.py"" 403 Client Error: Forbidden for url: https://api-eu.securitycenter.microsoft.com/api/alerts?$expand=evidence&$filter=lastUpdateTime+gt+2023-11-17T12:14:31Z
12-17-2023 13:17:38.251 +0100 ERROR ExecProcessor [19404 ExecProcessor] - message from ""C:\Program Files\Splunk\bin\Python3.exe" "C:\Program Files\Splunk\etc\apps\Splunk_TA_MS_Security\bin\microsoft_365_defender_endpoint_incidents.py"" 403 Client Error: Forbidden for url: https://api.security.microsoft.com/api/incidents?$filter=lastUpdateTime+gt+2023-11-17T12:17:31Z

thanks
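Two things worth checking, offered as a sketch: the Defender APIs require application (not delegated) permissions, and each one needs tenant-admin consent; the token's roles claim lists exactly what was consented. A PowerShell snippet to decode that claim, assuming $token holds the raw JWT returned by the Microsoft test script:

# decode the JWT payload (base64url) and show the roles claim
$payload = $token.Split('.')[1].Replace('-','+').Replace('_','/')
switch ($payload.Length % 4) { 2 { $payload += '==' } 3 { $payload += '=' } }
[System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($payload)) |
    ConvertFrom-Json | Select-Object -ExpandProperty roles

If roles shows only one entry, the missing permissions were likely added but never granted admin consent in the app registration.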
Hi all, I have this query:

| timechart span=1s count AS TPS
| eventstats max(TPS) as MaxPeakTPS
| stats avg(TPS) as avgTPS first(peakTPS) as peakTPS first(peakTime) as peakTime
| fieldformat peakTime=strftime(peakTime,"%x %X")

This currently outputs the max TPS, when the max TPS took place, and the average TPS. I was wondering whether it is also possible to display the min TPS and when that took place? TIA
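A sketch extending the same approach to the minimum; it also fills in the peakTPS/peakTime fields that the posted query references but never creates (naming kept close to the original):

| timechart span=1s count AS TPS
| eventstats max(TPS) as peakTPS, min(TPS) as minTPS, avg(TPS) as avgTPS
| eval peakTime=if(TPS=peakTPS, _time, null()), minTime=if(TPS=minTPS, _time, null())
| stats first(avgTPS) as avgTPS, first(peakTPS) as peakTPS, min(peakTime) as peakTime, first(minTPS) as minTPS, min(minTime) as minTime
| fieldformat peakTime=strftime(peakTime,"%x %X")
| fieldformat minTime=strftime(minTime,"%x %X")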