I successfully installed Splunk using ansible-role-for-splunk on a single machine, and it worked as expected. I am now trying to deploy a distributed Splunk system (7 VMs in total). I prepared the inventory based on https://github.com/splunk/ansible-role-for-splunk/blob/master/environments/production/inventory.yml. When I ran the playbook, the behaviour was 7 individual installations of Splunk instead of a distributed installation with an indexer cluster, search head, etc. My understanding was that, based on the group names in the inventory, the Ansible role would install only the required components. Is that not true?
I am posting my playbook and inventory file (as the first 2 replies). Thanks.
Hi,
I’m creating an application via the app wizard and I wish to render a custom view.
I have a view.py file with the following function:

    def display_action1(provides, all_app_runs, context):
        import phantom.rules as phantom
        context['results'] = ['a', 'b', 'c']
        phantom.debug('running display_action1')
        print('running display_action1')
        return 'display_action1.html'

which renders the correct view in the widget.
However, I want to be able to view the debug or print statements. Where can I view these?
Kind regards,
Ansir
Hi Splunkers, I'm working on a condition where I have to create a new field based on some column values. Example:

    Column A    Column B    Column C
    yes         no          abc
    yes         yes         ef
    yes         no          gh
    no          no          kl
    no          no          mn

Based on the columns, I need to create a new field called "result" using two conditions:
1. If Column C is abc, gh, or mn, then result is "yes".
2. If Column A or Column B is yes, then result should be "yes".
I tried doing this with eval, but one condition overwrites the other. I want to apply the first condition first, and only for the remaining rows check the second condition.
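Since eval's case() returns the value of the first matching clause and stops, the two conditions can simply be ordered. A minimal sketch, assuming the extracted field names are ColumnA, ColumnB, and ColumnC (placeholders; adjust to the actual names):

```spl
| eval result=case(
    in(ColumnC, "abc", "gh", "mn"), "yes",
    ColumnA="yes" OR ColumnB="yes", "yes",
    true(), "no")
```

Because case() evaluates its condition/value pairs in order, rows matched by the first condition never reach the second one.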
Hi All,
We are planning an upgrade of our really old universal forwarders from 6.1.2 to 8.1.x, and it seems the documentation for these really old forwarders is no longer available. Can someone guide us on how the upgrade process should go and which intermediate versions we need to upgrade through?
I was thinking of directly installing a new version of the Splunk UF (8.1.x) and moving my configs and the fishbucket directory there. Would that be a sane approach without re-indexing anything, or are there other things that need to be considered?
I'm trying to optimize my Splunk Windows Event Log dashboard, and wanted to add CSV exclusion file that would filter out some events that aren't necessary to monitor.
CSV file contents:

    TaskName,EventCode
    Microsoft Edge,4101
    Firefox,4101
To filter events, I tried this search query:
AND NOT [ | inputlookup wineventlog_exclusions_v2.csv | rename TaskName as query | fields query, EventCode ]
However, it doesn't give me what I want; it expands the subsearch to:
(NOT EventCode="4104" OR NOT "Microsoft Edge") (NOT EventCode="4104" OR NOT "Firefox")
But I want this:
AND NOT ((EventCode="4104" AND "Microsoft Edge") OR (EventCode="4104" AND "Firefox"))
Is there an easy way of getting "AND" within each row and "OR" between rows from a CSV inputlookup?
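One common approach is to let the format command build the boolean string explicitly, so the outer NOT applies to the whole expression instead of being distributed across each term. A sketch, assuming the lookup has TaskName and EventCode columns (the rename to query makes the task name match as a raw-text term):

```spl
AND NOT [ | inputlookup wineventlog_exclusions_v2.csv
          | rename TaskName AS query
          | fields EventCode query
          | format ]
```

format returns one parenthesized string with AND inside each row and OR between rows; the exact rendering can vary by version, so it is worth checking the expanded search in the Job Inspector.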
How would I find license usage by field?
For example; I want to know which field values within a specific sourcetype account for the greatest volume of events and license usage.
Thanks in advance!
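License usage is metered by raw data volume (per index, sourcetype, source, and host), not by field, so per-field "usage" has to be approximated from event size. A sketch, where your_index, your_sourcetype, and your_field are placeholders:

```spl
index=your_index sourcetype=your_sourcetype
| eval bytes=len(_raw)
| stats count sum(bytes) AS total_bytes BY your_field
| eval total_mb=round(total_bytes/1024/1024, 2)
| sort - total_bytes
```

Summing the raw event length per field value tracks indexed volume closely enough to rank which values drive the most license consumption.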
Hi,
I want to subtract yesterday's total event count from today's total event count and divide by yesterday's total, to see the increase in intrusion events.
Please help me with the query part.
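One way is to bucket events into "today" and "yesterday" in a single search over the last two days and then compute the percentage change. A sketch, assuming the events live in an index called intrusion (a placeholder):

```spl
index=intrusion earliest=-1d@d latest=now
| stats sum(eval(if(_time >= relative_time(now(), "@d"), 1, 0))) AS today
        sum(eval(if(_time <  relative_time(now(), "@d"), 1, 0))) AS yesterday
| eval pct_increase=round((today - yesterday) / yesterday * 100, 2)
```

relative_time(now(), "@d") snaps to midnight today, so everything before it counts as yesterday and everything at or after it as today.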
Trying to get apps into Splunk: I am entering my correct Splunk.com username and password and nothing happens. No error message pops up, nothing.
Hello, I'm fairly new to Splunk and I'm trying to determine which command would be best to extract data from the ap_name column and insert it into the space_id column I made using the following eval command:

    | inputlookup <lookup value> | search ap_name=* | eval space_id = building_num + "-" + room

The number after the first hyphen within the AP name indicates the floor the AP is on. The hurdles I'm experiencing are the following:
- Extract and insert ONLY the first number(s) after the first hyphen within the AP name
- If the floor number is between 0 - 9, insert a leading "0" in the space ID result
I'm also unsure if it would be easier to make another eval column using the extraction for the floor number, then add the new value into the space_id. Any assistance and/or guidance on this is greatly appreciated!

    ap_group  ap_latitude  ap_longitude  ap_name       building_num  install_status  location                   model_id  room  space_id
    test1     123123       234234        sample-14-4   0272          In use          Sample Tower (0272         315       1434  0272-1434
    test2     345345       456456        sample2-1-19  1110          In use          Sample Two House (1110)    315       160   1110-160
    test3     567567       678678        sample3-10-9  0189          In use          Sample Three Tower (0189)  315       1007  0189-1007
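A rex extraction plus zero-padding can handle both hurdles in one pass. A sketch, assuming the floor is the digits between the first and second hyphens of ap_name and that the padded floor should sit between building_num and room (assumptions; adjust the final concatenation to the layout you actually want):

```spl
| inputlookup <lookup value>
| search ap_name=*
| rex field=ap_name "^[^-]+-(?<floor>\d+)-"
| eval floor=printf("%02d", tonumber(floor))
| eval space_id=building_num . "-" . floor . "-" . room
```

For sample-14-4 this extracts floor 14 and leaves it as "14"; for sample2-1-19 it extracts 1 and pads it to "01".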
Hello, I want to append the results from one field to another; however, I only want to fill the null and blank values of the field. For this I tried the following:

    | eval FIELD2= if(isnull(FIELD1) OR FIELD1="", mvappend(FIELD2,FIELD1), "")

What happens is that it substitutes the filled slots of FIELD2 with "", so it doesn't do what I need.
Is there any solution for this?
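The posted eval tests FIELD1 where it presumably should test FIELD2, and its else-branch discards the existing value. A sketch of the likely intent (keep FIELD2 when it already has a value, otherwise take FIELD1):

```spl
| eval FIELD2=if(isnull(FIELD2) OR FIELD2="", FIELD1, FIELD2)
```

An equivalent using nullif and coalesce, which treats an empty string as null and then picks the first non-null value:

```spl
| eval FIELD2=coalesce(nullif(FIELD2, ""), FIELD1)
```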
Hi,
How would I go about getting Cisco FTD logs into Splunk Cloud? Would I need to install a forwarder on the same network as the FTD and then send the logs to Splunk Cloud from the forwarder? If so, can it be a Windows forwarder?
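If the FTD is sending syslog, a common pattern is a heavy forwarder (or a universal forwarder with a syslog receiver in front of it) on the same network, listening for the feed and forwarding to Splunk Cloud. A sketch of the receiving input in inputs.conf; the port, sourcetype, and index names here are assumptions, not FTD defaults:

```ini
# inputs.conf on the intermediate forwarder (sketch)
[udp://9514]
sourcetype = cisco:ftd
index = network
no_appending_timestamp = true
```

The forwarder's outputs.conf then points at Splunk Cloud via the credentials app downloaded from the Cloud stack. A Windows forwarder can work, though syslog collection is more commonly done on Linux.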
Hi colleagues,
Is there an option to add a text filter with different types of matching, like in Excel (contains, doesn't contain, begins with, ends with, etc.), to a dashboard?
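In Simple XML this can be approximated with two inputs: a dropdown that picks the match type and a text box for the term, combined into one search fragment. A sketch; the token names and the field name myfield are placeholders, and depending on your Splunk version you may need change-event handlers instead of nested tokens in the choice values:

```xml
<input type="dropdown" token="match_op">
  <label>Match type</label>
  <choice value="myfield=&quot;*$filter_text$*&quot;">contains</choice>
  <choice value="NOT myfield=&quot;*$filter_text$*&quot;">doesn't contain</choice>
  <choice value="myfield=&quot;$filter_text$*&quot;">begins with</choice>
  <choice value="myfield=&quot;*$filter_text$&quot;">ends with</choice>
</input>
<input type="text" token="filter_text">
  <label>Filter text</label>
</input>
```

The panel search then includes $match_op$ directly, e.g. index=... $match_op$ | stats count.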
Hello Team, my trial license has expired. Now I am planning to switch to a Free license. For that, I followed the steps below:
Settings -> Licensing -> Change license group -> Free license -> Restart.
But the Free license is not reflected after the restart.
Kindly help me with this.
Thanks!!
Hi everyone,
To move to OS-compatible Windows Server 2019-2022, we have planned to migrate the Search Head, Indexer, and Deployment Server instances of Splunk from the old servers to new ones. Since this is my first such task, I need specific "backup and installation" instructions in order to complete it safely.
Current Splunk install: Windows 2016 - Splunk Ent 9.0.1
New Splunk install: Windows 2019/2022 running Splunk Ent 9.0.2
The architecture includes:
- 1 Search Head (OS - Windows 2016)
- 2 Indexers (OS - Windows 2016)
- 1 Deployment Server (OS - Windows 2016)
- 62 Universal Forwarders (OS - Linux)
Thanks in advance for any help.
Splunk varies the style in which dates and times are available. In an alert email, $job.trigger_date$ comes out as "March 04, 2023", and then I have to display $job.trigger_timeHMS$ after it to get "21:45:06 -0500". On the other hand, $job.earliestTime$ comes out as "2023-03-04T20:00:00.000-05:00", and $job.latestTime$ comes out similarly. Is there a way to get them all to come out consistently formatted, like "March 4, 2023, 21:45:06"? (And then I can suffix it with "Eastern Time".)
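The built-in $job.*$ tokens each have a fixed format, so one workaround is to format the times in the search itself and reference them as $result.*$ tokens in the alert email. A sketch; the field names triggered and window_start are placeholders:

```spl
... existing search ...
| eval triggered=strftime(now(), "%B %e, %Y, %H:%M:%S")
| eval window_start=strftime(relative_time(now(), "-1h"), "%B %e, %Y, %H:%M:%S")
```

The email can then use $result.triggered$ and $result.window_start$, which share the same strftime format. (%e gives a non-zero-padded day; on some platforms %-d works instead.)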
Hi, I have a simple setup of a Splunk universal forwarder on a Windows server forwarding data to a single Linux server acting as Splunk indexer/search head. Sometimes the connection from the Windows box to this server drops, and when it is restored, a large number of messages queued while the connection was down get forwarded. How can I empty the universal forwarder's message queue via the command line just before the connection is reinstated, so that any unsent messages are dropped?
I have a dashboard with a couple of base data sources. Let's call them "base_search1" and "base_search2". I have chained data sources which are used to output:

    base_search1 | stats avg(field) as a          (shown as a Radial)
    base_search1 | timechart span=1d avg(field)

and the same thing for base_search2. This presents nicely as a radial and line chart for both base_search1 and base_search2.

Request: Now I want to also display the radial and time chart for the combination of base_search1's results and base_search2's results. Assume both data sources output the same field name that I want to average over. How do I do this? Will the solution change if I want to average over more than 2 data sources? Let's say I have 4 base searches on the dashboard.

If this was not on a dashboard, I would do something like:

    base_search1 | append [base_search2] | timechart span=1d avg(field)

but I don't know how that translates to working with data sources in dashboards.
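Since a chained data source extends only one parent, one approach is to define an additional base data source that unions the underlying searches, and chain the combined radial and timechart off that. A sketch, with the original search terms as placeholders; it generalizes to 4 base searches by adding more subsearches:

```spl
| union
    [ search <base_search1 terms> ]
    [ search <base_search2 terms> ]
| timechart span=1d avg(field)
```

The averaging stays correct because union concatenates the raw result sets before timechart runs, rather than averaging two pre-computed averages.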
Hi,
I need to have the Buttercup Store service tree on my local ITSI for test purposes.
Can anyone help me find a "buttercup store" service tree sample (like the attached image) to create on ITSI? Or is there a search query to build it?
Thanks,
Majbo
Hi, I want to calculate a duration. For example, I have 2 different events in a source.
First event:
04/03/2023 17:10:15.000 PLUGIN_CLIENT_CONNECT (this is not a field in the event)
Second event:
04/03/2023 17:51:15.000 PLUGIN_CLIENT_DISCONNECT (this is not a field in the event)
I want to calculate the duration between them.
Expected result as a table:

    Start Time               End Time                 Duration
    04/03/2023 17:10:15.000  04/03/2023 17:51:15.000  41 minutes (and hours, if any)

Can anyone help me? Thank you for your time.
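One way is to pull both events in a single search and take the time range between them. A sketch, assuming the connect/disconnect strings appear in _raw, the index name is a placeholder, and each session has exactly one pair (for multiple sessions you would add a BY clause on a session or client id field):

```spl
index=your_index ("PLUGIN_CLIENT_CONNECT" OR "PLUGIN_CLIENT_DISCONNECT")
| stats min(_time) AS start_time max(_time) AS end_time
| eval duration=tostring(end_time - start_time, "duration")
| eval start_time=strftime(start_time, "%d/%m/%Y %H:%M:%S.%3N"),
       end_time=strftime(end_time, "%d/%m/%Y %H:%M:%S.%3N")
| table start_time end_time duration
```

tostring(x, "duration") renders the difference as HH:MM:SS, so 41 minutes shows as 00:41:00.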