I'm trying to have a timechart showing the count of events by a category, grouped by week. The search time is controlled by a radio button on the dashboard with options from 1w - 12 weeks, with the end date set to @w. I then have a drilldown that shows a table with more info about each event for that category in that time range.

mysearch .... | dedup case_id | timechart span=1w count by case_category

The chart looks fine, but when I click on certain sections to load the drilldown, much more data appears than was suggested by the count in the timechart. For instance, looking at Nov 19-25, in the timechart it shows 26 events, but when I go to the drilldown it shows 61. When I open the drilldown search in Search, the issue seems to involve expanding the time range beyond one week. If I change the range from Nov 19-25 to Nov 19-27, the data from Nov 22-24 is either erased or reduced.

Nov 19-25 stats count results:
Nov 19: null
Nov 20: 8
Nov 21: 14
Nov 22: 19 **
Nov 23: 20 **
Nov 24: 1 **
Nov 25: null

Nov 19-28 stats count results:
Nov 19: null
Nov 20: 8
Nov 21: 14
Nov 22: 5 **
Nov 23: null **
Nov 24: null **
Nov 25: null
Nov 26: null
Nov 27: 35
Nov 28: 1

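A minimal sketch of one way to make the weekly counts independent of the overall search range, assuming the goal is to count each case_id at most once per week. Because dedup applied across the whole range keeps or drops different events as the range changes, deduplicating per week keeps each weekly bucket stable and should make the timechart agree with a one-week drilldown:

mysearch ....
| bin _time span=1w
| dedup case_id _time
| timechart span=1w count by case_category
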
Greetings Community! I have a question regarding the Splunk Cloud License (classic), particularly about what happens when the license expires.

- Is there a message informing us that the license is about to expire?
- After the expiration date, is there any grace period provided?
- In case I decide not to renew the license, are we somehow able to download the company data before its total removal, or do I lose all indexed data once the license has expired?

Thanks in advance for any information on this matter.

Kind Regards,
Marcelo

I want to run an Enrichment playbook inside a custom function. I'm looking to pass in a list of devices and call the playbook once per device, passing in a single deviceId each time. What is the best way to do this?

I am getting the error message "WARNING: web interface does not seem to be available!" I just installed Splunk on my Mac.

I am trying to make a query which will give me the result of unique file names with month in column and a time span of 1 hour in row. Below is my query:

index="app_cleo_db" origname="GEAC_Payroll*"
| rex "\sorigname=\"GEAC_Payroll\((?<digits>\d+)\)\d{8}_\d{6}\.xml\""
| search origname="*.xml"
| eval Date = strftime(_time, "%Y-%m-%d %H:00:00")
| eval DateOnly = strftime(_time, "%Y-%m-%d")
| transaction DateOnly, origname
| timechart count by DateOnly

But it is giving me an output with the date as well as the timestamp in the row, like below:

_time                 2023-12-02   2023-12-03
2023-12-02 00:00:00   8            0
2023-12-02 00:30:00   0            0
2023-12-02 01:00:00   0            7
2023-12-02 01:30:00   0            0
2023-12-02 02:00:00   6            0
2023-12-02 02:30:00   0            0
2023-12-02 00:00:00   2            0
2023-12-03 00:30:00   0            5
2023-12-03 01:00:00   0            0
2023-12-03 01:30:00   0            20
2023-12-03 02:00:00   0            0
2023-12-03 02:30:00   34           0

I want the result to look like below:

_time      2023-12-02   2023-12-03
00:00:00   0            0
01:00:00   0            0
02:00:00   0            0
03:00:00   0            0

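A minimal sketch of one way to get hour-of-day rows with one column per date, assuming the goal is a distinct count of file names per hour; chart ... over ... by ... is used instead of timechart so the row key can be the hour of day rather than _time:

index="app_cleo_db" origname="GEAC_Payroll*" origname="*.xml"
| eval Hour = strftime(_time, "%H:00:00")
| eval Day  = strftime(_time, "%Y-%m-%d")
| chart dc(origname) as unique_files over Hour by Day
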
Hi All,

I had a Splunk search query executing in the background (I used the Send to background option). While it was running, my VPN got disconnected; after some time I reconnected to the VPN and the query was still running in the background. My question is: does it give me complete results or incomplete results?

Thanks

Hi,

I am trying to implement a glass table for one of our use cases. My use case has a complex architecture, but it seems like I don't have much choice: the glass table only offers a simple arrow. For my use case I need a flexible option so I can bend the arrow or have multiple staggered arrows. I tried to implement this by joining multiple arrows, but it is very difficult and time consuming, as a small change requires adjusting multiple arrows. I'm just looking for options. Is there a content pack or a better option for connecting services in a glass table? This is just a simple example; my use case is way more complex.

Got a search like this (I've obfuscated it a bit):

| tstats count where index IN (index1, index2, index3) by _time, host
| where match(host, "^.*.device.mycompany.com$")

Got a great-looking stats table - and I'm really pleased with the performance of tstats - awesome. I want to graph the results... easy, right? Well, no - I cannot for the life of me seem to break a, say, 60 minute span down by host, despite the fact that I've got this awesome, oven-ready, totally graphable stats table. So I am trying:

| tstats count where index IN (index1, index2, index3) by _time, host
| where match(host, "^.*.device.mycompany.com$")
| timechart count by host

but the count is counting the hosts, whereas I want to "count the count". Any ideas? This will be a super simple one, I expect - I've got a total mental block on this.

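A minimal sketch of the usual pattern here: once tstats has already produced a count column, the timechart needs to sum that column rather than count rows (the span value is illustrative; the rest is the search from the question):

| tstats count where index IN (index1, index2, index3) by _time, host
| where match(host, "^.*.device.mycompany.com$")
| timechart span=60m sum(count) as count by host
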
I have installed the Splunk Enterprise free trial into a VM as the root user. I know the best practice is to avoid running Splunk as root, in case the underlying OS gets compromised and the attacker then has root-level access to your OS. I am following the doc online, and it says that once you install Splunk as root, you shouldn't start Splunk yet; instead, add a new user and then change ownership of the Splunk folder to that new non-root user.

But before I do that, when Splunk is installed I check its ownership and it's already set to splunk. Does this mean Splunk has already configured a non-root user automatically upon installation?

If so, how would I make sure it has read access to the local files I want to monitor?

Hi,

How do we copy .tgz files from a Windows server to a Linux box? Can anyone help me in doing this?

Hi, I'm trying to create two searches and having some problems. I hope somebody can help me with this.

1. 7 or more IDS alerts from a single IP address in one minute. I created something like below, but it doesn't seem to be working correctly:

index=ids
| streamstats count time_window=1m by src_ip
| where count>=7
| stats values(dest_ip) as "Destination IP" values(attack) as "Attack" values(severity) as "Severity" values(host) as "FW" count by "Source IP"

2. 5 or more hosts in 1h attacked with the same IDS signature. This seems to be even more complex, as it has 3 conditions: 5 hosts, 1 hour, and the same IPS signature. So I'm not sure how to even start after failing the first one.

Could somebody help me with this please?

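A minimal sketch of one way to express both detections with bin and stats instead of streamstats; the field names (src_ip, dest_ip, signature) are taken from the question or assumed, and the thresholds match the stated conditions:

Search 1 - 7 or more IDS alerts from a single source IP within one minute:

index=ids
| bin _time span=1m
| stats count values(dest_ip) as "Destination IP" values(attack) as "Attack" values(severity) as "Severity" values(host) as "FW" by _time, src_ip
| where count>=7

Search 2 - 5 or more distinct hosts hit by the same signature within one hour:

index=ids
| bin _time span=1h
| stats dc(dest_ip) as attacked_hosts values(dest_ip) as "Hosts" by _time, signature
| where attacked_hosts>=5
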
I am trying to use a table column for a drilldown but not display it. In XML dashboards I could do it by specifying:

```
<fields>["field1","field2"...]</fields>
```

I would then still be able to use field3 in setting drilldown tokens. How can I do this in Dashboard Studio? I can't find a way to hide a column without removing the ability to then also refer to it in tokens.

Let's say that I have a dashboard A containing a table.

App Name   App Host   LinkToB   LinkToC   LinkToD
abc        host 1     LinkToB   LinkToC   LinkToD
def        host 2     LinkToB   LinkToC   LinkToD
xyz        host 1     LinkToB   LinkToC   LinkToD

I have 3 other dashboards (B, C, D). I want to click "LinkToX" to link to dashboard X. However, in the Splunk Dashboard Studio UI, I can only link the table to one dashboard. Is there any way to configure the JSON to make the table able to link to multiple dashboards? Or is there any way to make cells in the table clickable URL links instead? Thank you!

Hi AppDynamics team,

I'm trying to configure a Windows service application with an unhandled exception error to be monitored using the .NET agent, referring to the link below:

Configure the .NET Agent for Windows Services and Standalone Applications (appdynamics.com)

Here is the part of the config.xml file which I have added:

<standalone-applications>
  <standalone-application executable="D:\sample project\MQ_ConsoleApp1\MQ_ConsoleApp1\bin\x64\Release\MQ_ConsoleApp1.exe">
    <tier name="DotNet Tier" />
  </standalone-application>
</standalone-applications>

I have also tried configuring the entry points for the Windows service, but I am unable to get the transactions. Please let me know if I missed any configuration steps, and please help me resolve the issue. Thanks in advance.

Hello Splunkers,

I'm currently implementing a connection from multiple GCP buckets to Splunk Enterprise. The add-on automatically indexes the data from those buckets with the timestamp at which it receives them (so if I have a list of transactions from March to November 2023 that are forwarded today, they will all be indexed at today's time). However, I would like some of that data to be indexed using a time field present in the data, depending on the app that uses it (for example, app 1 has a time field named "Start_date" and app 2 has another one named "end_date"). Unfortunately, I can't think of a way to do it - maybe in the props.conf file, but I'm not sure. Any advice?

Thanks

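A minimal props.conf sketch of how timestamp extraction could be pointed at a field inside the payload, assuming the events arrive as JSON, each app maps to its own sourcetype, and the input does not override the timestamp itself; the sourcetype names, field names, and TIME_FORMAT below are illustrative and would need to match the actual data:

# props.conf on the first Splunk instance that parses the data (heavy forwarder or indexer)
[gcp:app1]
TIME_PREFIX = "Start_date"\s*:\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N%Z
MAX_TIMESTAMP_LOOKAHEAD = 40

[gcp:app2]
TIME_PREFIX = "end_date"\s*:\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N%Z
MAX_TIMESTAMP_LOOKAHEAD = 40
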
Hi Team,

While running the search below, we are not getting the license calculation for 2-3 indexes (they show 0), but for the other indexes I am able to see the results.

index=_internal source="*license_usage.log" sourcetype=splunkd
| stats sum(b) as Bytes by idx
| eval GB=round(Bytes/1024/1024/1024,3)
| rename h as Host, s as Source, st as Sourcetype, idx as Index, GB as "License Used in GB"
| table Index, "License Used in GB"

I am trying to understand why this is happening for only 2-3 indexes. We have the index data present on both the indexers.

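A minimal sketch of a variant worth comparing against, assuming per-index license usage is the goal: license_usage.log also contains rollover summary and other event types, so restricting to type=Usage (and running the search on, or against, the license manager, where these events are logged) can change which indexes appear:

index=_internal source="*license_usage.log" sourcetype=splunkd type=Usage
| stats sum(b) as Bytes by idx
| eval GB=round(Bytes/1024/1024/1024,3)
| rename idx as Index, GB as "License Used in GB"
| table Index, "License Used in GB"
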
I was trying to configure the forwarder for a while and couldn't succeed, so I was watching a video where the person said to make sure your status is enabled. I thought the reason I am not receiving data could be that I had something disabled, so I proceeded to enable everything in the Manage Apps section. Then I got a message that I needed to restart; however, the website couldn't restart automatically by itself and told me to do it through the command line. I searched for it but couldn't find it. I then decided to restart the PC. Afterwards, when I opened the website, I got the message "This site can't be reached - 127.0.0.1 refused to connect." I then tried to stop and start splunkd from cmd with admin access, but that didn't quite fix it either.

Example of the Manage Apps section (NOT MINE)

How do I extract fields from the event below? I want nname, ID, app and Time. Here nname is mule_330299_prod_App01_Clt1, ID=91826354-d521-4a01-999f-35953d99b829, app=870a76ea-8033-443c-a312-834363u3d, and Time=2023-12-23T14:22:43.025Z.

CSV Content:
nname,Id,app,Time
mule_330299_prod_App01_Clt1,91826354-d521-4a01-999f-35953d99b829,870a76ea-8033-443c-a312-834363u3d,2023-12-23T14:22:43.025Z
mule_29999_dev_WebApp01_clt1,152g382226vi-44e6-9721-aa7c1ea1ec1b,26228e-28sgsbx-943b-58b20a5c74c6,2024-01-06T13:29:15.762867Z

Like this, we have multiple lines in one event.

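A minimal sketch of one way to pull every CSV row out of such an event at search time, assuming each data row has exactly the four comma-separated values shown; max_match=0 captures one value per row into multivalue fields, and mvexpand then turns each row into its own result:

(your base search)
| rex field=_raw max_match=0 "(?m)^(?<nname>mule_[^,\r\n]+),(?<ID>[^,\r\n]+),(?<app>[^,\r\n]+),(?<Time>[^,\r\n]+)"
| eval row=mvzip(mvzip(mvzip(nname, ID, "|"), app, "|"), Time, "|")
| mvexpand row
| eval nname=mvindex(split(row,"|"),0), ID=mvindex(split(row,"|"),1), app=mvindex(split(row,"|"),2), Time=mvindex(split(row,"|"),3)
| table nname ID app Time
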
Hello. I am trying to route some events to a different index based on a field in the events. The events are JSON formatted. This is an example:

{
  "topic": "audits",
  "events": [
    {
      "admin_name": "john doe john.doe@juniper.net",
      "device_id": "00000000-0000-0000-1000-5c5b35xxxxxx",
      "id": "8e00dd48-b918-4d9b-xxxx-xxxxxxxxxxxx",
      "message": "Update Device \"Reception\"",
      "org_id": "2818e386-8dec-2562-xxxx-xxxxxxxxxxx",
      "site_id": "4ac1dcf4-9d8b-7211-xxxx-xxxxxxxxxxxx",
      "src_ip": "xx.xx.xx.xx",
      "timestamp": 1549047906.201053
    }
  ]
}

We are receiving the events on a heavy forwarder and forwarding them to an indexer. We want to send the events with the topic "audits" to a different index than the default one (imp_low). I have tried these settings on the heavy forwarder:

props.conf:

[_json-Mist_Juniper]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
category = Structured
pulldown_type = 1
TRANSFORMS-force_index = setindexHIGH

transforms.conf:

[setindexHIGH]
SOURCE_KEY = topic
REGEX = (audits)
DEST_KEY = _MetaData:Index
FORMAT = imp_high

But it is not working; all the events are going to the "imp_low" index.

Thanks

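A minimal transforms.conf sketch of an alternative worth trying, assuming the raw event text still contains the "topic" key when the transform runs: index-time transforms normally read from _raw (or an internal key such as MetaData:Source) rather than from a named field, so matching the JSON text directly avoids relying on SOURCE_KEY = topic:

# transforms.conf on the heavy forwarder (the instance that first parses the data)
[setindexHIGH]
SOURCE_KEY = _raw
REGEX = "topic"\s*:\s*"audits"
DEST_KEY = _MetaData:Index
FORMAT = imp_high
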
Hi,

I have a dashboard that displays a CSV. I want to add lists for it to display that are not in the CSV, but the list I'm adding includes the records that are in the CSV. I want to create a list that will not include the records in the CSV.

This code gets me the whole list:

index="------" interface="--"
| stats values(interface) as importers

This code brings me the list from the CSV:

index="------------" code=*
| search [| inputlookup importers.csv
           | lookup importers.csv interfaceName OUTPUTNEW system environment timerange
           | stats values(interfaceName) as importers_csv]

I want a search that brings me the list without the records that are in the CSV.

Thanks

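A minimal sketch of one way to keep only the interfaces that are not in the lookup, assuming the CSV column is interfaceName and the event field is interface (names taken from the question); the NOT subsearch excludes everything the lookup already contains:

index="------" interface="--"
    NOT [| inputlookup importers.csv
         | rename interfaceName as interface
         | fields interface ]
| stats values(interface) as importers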