All Posts

I am not sure I understand where the tokens are being set and being used. Can you not just remove Output=$form.output$ from the search for the panel where it isn't available?
Hi Splunkers, I am facing a weird issue with the addcoltotals command. It works perfectly fine if I run the query in a new search tab, but once I add the same query to a dashboard it breaks. I am running the command against Splunk DB Connect data. Below is the query for reference:

index=db_connect_dev_data
| rename PROCESS_DT as Date
| table OFFICE,Date,MOP,Total_Volume,Total_Value
| search OFFICE=GB1
| eval _time=strptime(Date,"%Y-%m-%d")
| addinfo
| eval info_min_time=info_min_time-3600, info_max_time=info_max_time-3600
| where _time>=info_min_time AND _time<=info_max_time
| table Date,MOP,OFFICE,Total_Volume,Total_Value
| addcoltotals "Total_Volume" "Total_Value" label=Total_GB1 labelfield=MOP
| filldown
| eval Total_Value_USD=Total_Value/1000000
| eval Total_Value_USD=round(Total_Value_USD,5)
| stats sum(Total_Volume) as "Total_Volume",sum("Total_Value_USD") as Total_Value(mn) by MOP
| search MOP=*
| table MOP,Total_Volume,Total_Value(mn)

Let me know if anyone knows why this is happening.
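One possible cause, offered here as an assumption rather than a confirmed diagnosis: field names containing parentheses, such as Total_Value(mn), generally need to be quoted everywhere they are referenced, and a dashboard can parse them differently from the ad-hoc search bar. A sketch of the final stages with the name quoted throughout:

```
| stats sum(Total_Volume) as "Total_Volume", sum(Total_Value_USD) as "Total_Value(mn)" by MOP
| search MOP=*
| table MOP, Total_Volume, "Total_Value(mn)"
```

Also note that if the dashboard panel has its own time picker, addinfo will report the panel's time boundaries rather than the ones you used in the search tab, which can cause the where clause to filter out every row.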
You could try something like this:

| appendpipe [| stats avg(*) as average_*]
| addcoltotals
| foreach average_* [| eval <<MATCHSEG1>>=if(isnull(<<MATCHSEG1>>),<<FIELD>>,<<MATCHSEG1>>)]
| fields - average_*
Hello guys, I'm currently trying to set up Splunk Enterprise in a cluster architecture (3 search heads and 3 indexers) on Kubernetes using the official Splunk Operator and the Splunk Enterprise Helm chart. In my case, what is the recommended way to set the initial admin credentials? Do I have to access every instance, define a "user-seed.conf" file under $SPLUNK_HOME/etc/system/local, and then restart the instance, or is there an automated way to set the password across all instances by leveraging the Helm chart?
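A sketch of one approach, under the assumption that the Splunk Operator reads the admin password from a namespace-scoped secret named "splunk-&lt;namespace&gt;-secret" (check the operator docs for your version): pre-create that secret before deploying, so every pod in the namespace shares one credential and no per-instance user-seed.conf is needed.

```
# Hypothetical example for the "default" namespace; the secret name and
# key ("password") follow the Splunk Operator's global-secret convention.
kubectl create secret generic splunk-default-secret \
  --namespace default \
  --from-literal=password='YourStrongAdminPassword'
```

The operator should then inject this password into the instances it creates, instead of generating a random one.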
Hi Team, please help with the above question. Thanks.
Hello Team, I have a parent dashboard with 5 panels. These are linked to one child dashboard; the data changes based on the tokens passed as filters. However, I notice that for one panel there is no field called Output, due to which I get "no results found". Is there a way to remove this passed token from the code for that panel?

| search $form.app_tkn$ Category="A event" Type=$form.eventType$ Output=$form.output$
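One common workaround, sketched here under the assumption that this is a Simple XML dashboard with a dropdown input: build the whole Output=... clause as its own token, so a panel can receive an empty string when "All" (or no value) is selected, instead of a clause referencing a field it doesn't have.

```
<input type="dropdown" token="output">
  <choice value="*">All</choice>
  <change>
    <!-- When "All" is chosen, the clause token is empty -->
    <condition value="*">
      <set token="output_clause"></set>
    </condition>
    <!-- Otherwise build the search clause from the selected value -->
    <condition>
      <set token="output_clause">Output="$value$"</set>
    </condition>
  </change>
</input>
```

The panel search then uses $output_clause$ in place of Output=$form.output$, and the panel without the Output field simply omits the token.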
index=mainframe sourcetype=BMC:DEFENDER:RACF:bryslog host=s0900d OR host=s0700d
| timechart limit=50 count(event) BY host
| addcoltotals

I am looking to add the AVG from each 1-week total for each day.
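A sketch of one way to get both a column total and a per-column average (assuming a daily span over the week, and that timechart produces one count column per host):

```
index=mainframe sourcetype=BMC:DEFENDER:RACF:bryslog host=s0900d OR host=s0700d
| timechart span=1d limit=50 count(event) BY host
| appendpipe [| stats avg(*) as average_*]
| addcoltotals
```

The appendpipe subsearch appends one row of averages (as average_&lt;host&gt; columns) before addcoltotals appends the totals row.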
Apart from all that @richgalloway already mentioned, this document shows the results of testing on some particular reference hardware. It's by no means a guarantee that an input will achieve this performance. Also remember that Windows event log inputs get the logs by calling the system via WinAPI, whereas a file input just reads the file straight from disk (most probably using memory-mapped files, since that's the most efficient method). And last but definitely not least, as I already pointed out, the UF typically doesn't break data into events!
Were you able to resolve the issue? I am getting the same error. 
That document is for a specific source where the event size is well-defined.  The information there cannot be generalized because the size of an "event" is unknown.  I've seen event sizes range from <100 to >100,000 bytes so it is very difficult to produce an EPS number without knowing more about the data you wish to ingest. It's possible the documentation for other TAs provides the information you seek.  Have you looked at the TAs for your data?
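To illustrate why event size dominates any EPS estimate (all numbers here are hypothetical):

```
events per second ≈ ingest rate / average event size

5 MB/s ÷   100 bytes/event ≈ 50,000 EPS
5 MB/s ÷ 1,000 bytes/event ≈  5,000 EPS
```

The same hardware throughput thus yields EPS figures an order of magnitude apart, which is why a single EPS number cannot be generalized across sources.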
I know there is Splunk Add-on for AWS, but I heard there is a simpler and easier way to read the buckets directly without using that Add-on. Is that true?  
Thank you @PickleRick for your concern.
1. I have tried to embed a report on my website by following this document: Embed scheduled reports. But I was not able to embed the Splunk report on my website; it showed "Report not available" and the console showed a 401 Unauthorized status code. Please check this image and reply.
2. I will look into the app Embedded Dashboards For Splunk (EDFS).
3. Sure, I will try to use a backend (server-side) service to get Splunk data securely using the REST API. Let me explore the REST API for the backend service and find out where I have to request access.
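For point 3, a minimal sketch of running a search server-side via the REST API (the host, port, and credentials are placeholders; the endpoint is Splunk's standard search/jobs/export endpoint on the management port, typically 8089):

```
# Run an export search from the backend and stream JSON results.
# Never expose these credentials to the browser; -k skips TLS
# verification and should be replaced by a real certificate in production.
curl -k -u admin:changeme \
  https://splunk.example.com:8089/services/search/jobs/export \
  -d search="search index=_internal | head 5" \
  -d output_mode=json
```

Your backend can then reshape this JSON and serve only the results to the web page, so the browser never talks to the search head directly.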
Unfortunately, all the required fields are present in the raw event, and the tags were also produced correctly. In this (quite old) thread https://community.splunk.com/t5/Splunk-Search/Define-user-field-in-Security-Essentials/m-p/312738 I've found an issue somewhat related to mine, as my problem is also connected to the user and src_user field extraction, and my AD server also runs in a non-English language. Has anyone found the underlying issue?
Hello, I am trying to create a custom view (via XPath) in Event Viewer and later ingest it into Splunk via a WinEventLog input, leveraging the Windows Add-on. Can it be done using WinEventLog, or in some other way in inputs.conf, as it is for Application/Security/System?

[WinEventLog://MyCustomLog]

As suggested here, I tried this configuration, but no logs were onboarded and it returned no error in the _internal logs either. Has anyone found a custom solution for ingesting these newly created custom views from Event Viewer into Splunk? Thanks
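One likely explanation, stated here as an assumption to verify: a custom view in Event Viewer is only a saved XPath filter in the MMC console, not a real event log channel, so a WinEventLog stanza cannot subscribe to it (which would also explain the silent failure). The usual approach is to target the underlying channel and reproduce the filter in the input itself, for example:

```
# Hypothetical sketch: the custom view filtered the Security channel on
# a few event IDs, so the input targets Security with a whitelist instead.
[WinEventLog://Security]
whitelist = 4624,4625,4672
disabled = 0
```

Check the Windows Add-on's inputs.conf documentation for the exact whitelist/blacklist syntax supported by your version.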
I am looking for something similar to this. https://docs.splunk.com/Documentation/WindowsAddOn/8.1.2/User/PerformancereferencefortheSplunkAdd-onforWindows
Hi @gcusello, did you get an answer from @woodcock regarding applying etc/system/local/authorize.conf on all search head nodes (preferably from the GUI, if possible)? Thanks.
It's interesting what you write here; syslog-ng was the first one to support TCP-based logging, so it definitely supports that. Dropping of UDP packets is a known issue, but there are a lot of ways to mitigate it. Here are a couple of blog posts: https://axoflow.com/syslog-over-udp-kernel-syslog-ng-tuning-avoid-losing-messages/ https://axoflow.com/detect-tcp-udp-message-drops-syslog-telemetry-pipelines/ Kafka is just a message bus; you will need a component to actually receive the messages and then hand them over to Kafka. Of course syslog-ng can do that: https://axoflow.com/docs/axosyslog-core/chapter-destinations/configuring-destinations-kafka-c/
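A minimal sketch of such a pipeline (the broker address, port, and topic are placeholders; the kafka-c() destination and its option names follow the linked AxoSyslog documentation and may differ between syslog-ng versions):

```
# Receive syslog over TCP and hand the messages over to Kafka.
source s_net {
  network(transport("tcp") port(514));
};

destination d_kafka {
  kafka-c(
    bootstrap-servers("kafka1:9092")
    topic("syslog")
  );
};

log { source(s_net); destination(d_kafka); };
```

This makes syslog-ng the receiving component in front of Kafka, which is exactly the role the message bus itself cannot fill.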
I finally got it working this way:

transforms.conf:

[my-log]
FORMAT = $1::$2
REGEX = FICHERO_LOG(\d+)\s+==s+([^=\n]+)\n
MV_ADD = true

props.conf:

REPORT-log = my-log

Thank you all for your help
The whole idea is tricky.

If you want to use an embedded report, that's relatively easy and safe: you can embed a _scheduled_ report, so it is generated using a predefined configuration, and you can embed the resulting report in your external webpage.

With more "dynamic" stuff it's way more tricky, because it involves a whole lot of making sure you don't accidentally create a security vulnerability for your environment and don't let the user do much more with your Splunk installation than you initially intended. There is an app https://splunkbase.splunk.com/app/4377 which is supposed to do that, or at least help with it, but I haven't used it and I don't know if/what limitations it has.

Of course, another option would be to use the REST API in your app to call the Splunk servers and generate the proper output, which you could then present to your user. But due to the security issues I mentioned before, it's something you'd rather do in your backend, only presenting the user with the results embedded in your web app by said backend, rather than letting the browser call the Splunk search heads directly.
@pranay03 Were you able to resolve this issue? I am facing the same issue. The console log shows that it started watching; however, I could not see the logs in the web portal. Please let me know your thoughts and suggestions.