I wonder if anyone else has experienced this and can advise. We upgraded from 9.0.3 to 9.1.1, and also upgraded ES to 7.2.0 and CIM to 5.2.0. However, when we now open the CIM Setup page from the Enterprise Security menu, the Tags Allow List is empty. In the underlying datamodels.conf, tags_whitelist is still populated under the relevant data model stanzas, but it is not displayed in the GUI.
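For context, the stanzas I am referring to look roughly like this (the data model name and tag list here are illustrative, not our exact values), and btool still shows the setting on the search head:

[Network_Traffic]
acceleration = 1
tags_whitelist = network, communicate

$SPLUNK_HOME/bin/splunk btool datamodels list --debug | grep tags_whitelist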
Hi all, I have to extract raw data from an Elasticsearch infrastructure and ingest it into Splunk Enterprise 9.1.1. I saw that there are two apps on Splunkbase: "ElasticSPL Add-on for Splunk" and "Elasticsearch Data Integrator - Modular Input". The first seems not to be certified for Splunk Enterprise, only for Splunk Cloud. Has anyone used them? Are they different, and which one is preferable? Thank you for your advice. Ciao. Giuseppe
We have recently upgraded our Splunk Enterprise to version 9.0.4 and observed that some behaviour in the system has changed. For example, when we run a search with the timechart/stats command without specifying the index field, the statistics results are the same, but the Events tab shows empty events for the respective timestamps. Below is the sample query:

host=abc sourcetype=xyz | timechart count

This did not occur earlier: even though we did not specify the index field, the Events tab used to populate with the respective event logs. I am not sure whether this is expected behaviour or a bug. Is this something we can fix from the end-user side? Could anyone please help me with this? I would also like to know what limitations or restrictions were introduced with this Splunk version.
Hello, I have the code below for a dropdown menu. The problem is that the moment I select any value from the dropdown, the dependent panels load without waiting for the Submit button. How can this be fixed?

Submit button code:

<fieldset submitButton="true" autoRun="false">
  <input token="field1" type="time" searchWhenChanged="false">
    <label>Time Picker</label>
    <default>
      <earliest>-15m</earliest>
      <latest>now</latest>
    </default>
  </input>

Dropdown and token:

  <input type="dropdown" token="subsummary" depends="$loadsummary$" searchWhenChanged="false">
    <label>Summary Selection</label>
    <choice value="FUNC">Function Summary</choice>
    <choice value="MQ">MQ Summary</choice>
    <change>
      <condition value="FUNC">
        <set token="funcsummary">true</set>
        <unset token="funcsummaryMQ"></unset>
      </condition>
      <condition value="MQ">
        <set token="funcsummaryMQ">true</set>
        <unset token="funcsummary"></unset>
      </condition>
    </change>
  </input>

Sample panel:

<row depends="$funcsummaryMQ$">
  <panel depends="$funcsummaryMQ$">
    <title>ABC</title>
    <table>
      <search>
        <query>index="SAMPLE"</query>
      </search>
      <option name="count">100</option>
      <option name="dataOverlayMode">none</option>
      <option name="drilldown">none</option>
      <option name="percentagesRow">false</option>
      <option name="refresh.display">progressbar</option>
      <option name="rowNumbers">false</option>
      <option name="wrap">true</option>
    </table>
  </panel>
</row>
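One approach I am considering (an untested sketch, not a confirmed fix): since the <change> handler fires as soon as the dropdown value changes, the visibility tokens could instead be set from a hidden search that uses the submitted $subsummary$ token, which should only update when Submit is pressed. The token names match my dashboard above; the condition syntax is my assumption from the docs:

<search>
  <query>| makeresults | eval selection="$subsummary$"</query>
  <done>
    <condition match="'result.selection' == &quot;FUNC&quot;">
      <set token="funcsummary">true</set>
      <unset token="funcsummaryMQ"></unset>
    </condition>
    <condition>
      <set token="funcsummaryMQ">true</set>
      <unset token="funcsummary"></unset>
    </condition>
  </done>
</search>

with the <change> block removed from the dropdown input.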
Hi, I have two saved searches that fetch data from a data model (pivot) and store their results in a summary index. Two days ago I found that one of my saved search queries was partially deleted in the UI, although it is still present in the backend. Can anyone help me understand how this could happen and what the root cause might be?
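For what it's worth, this is how I have been comparing the UI version with the backend (the saved search name below is a placeholder):

| rest /servicesNS/-/-/saved/searches splunk_server=local
| search title="my_summary_search"
| table title eai:acl.app eai:acl.owner search

and on the search head's command line:

$SPLUNK_HOME/bin/splunk btool savedsearches list "my_summary_search" --debug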
How do I add the logs below to Splunk? When logging into the SH I can find the app (env_d), and inside it there are "bin, default, metadata" directories: splunk/en-US/app/env_d/search

App: Env_d
App servers: ap8sd010 thru ap8sd019
Log folders:
/app/docker/en1/logs
/app/docker/en2/logs
/app/docker/en3/logs
/app/docker/en4/logs
/app/docker/en5/logs
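A minimal sketch of what I assume is needed, an inputs.conf monitor stanza deployed to the forwarders on ap8sd010 thru ap8sd019 (the index and sourcetype names are placeholders I made up):

[monitor:///app/docker/en*/logs]
disabled = 0
index = docker_app
sourcetype = docker:app:log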
Hi, is it possible to add a deployer to an existing search head cluster? The existing search heads are fresh and we have no problem with losing any configuration. Incidentally, the search head cluster in question is connected to a multisite indexer cluster, and each search head's site attribute is set to "site0".
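For reference, my understanding (a sketch, not verified in our environment; the hostname and label are placeholders) is that pointing the members at a deployer is just a server.conf setting plus a restart:

# server.conf on the new deployer
[shclustering]
pass4SymmKey = <same key as the cluster members>
shcluster_label = shcluster1

# server.conf on each existing search head member
[shclustering]
conf_deploy_fetch_url = https://deployer.example.com:8089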
I need to run a Splunk search with the "transaction" command, and I have four pattern variations for the start of the transaction and two pattern variations for the end of that transaction. I read the documentation and experimented, but I am still not sure how exactly I should do this. I am operating on complex, extensive data, so it is not immediately clear whether I am doing this correctly, and I need to get it right. I tried the following:
1. Wildcards in startswith and endswith: "endswith=...*..."
2. The syntax "endswith=... OR endswith=..." (and the same for startswith)
3. The syntax "endswith=... OR ..."
4. Regular expressions instead of wildcards: .* instead of *
Could you suggest the right way of doing this? Thank you!
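For illustration, the shape I am aiming for is something like the sketch below (the index, field name, and patterns are placeholders, not my real data). My reading of the docs is that startswith/endswith accept an eval() expression, so match() with an alternation might cover the multiple variations:

index=my_index sourcetype=my_sourcetype
| transaction session_id
    startswith=eval(match(_raw, "START_A|START_B|START_C|START_D"))
    endswith=eval(match(_raw, "END_A|END_B"))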
How to calculate percent rank in Splunk? I appreciate your help. Below is the expected result; "Percentrank exc" and "Percentrank inc" correspond to the Excel functions PERCENTRANK.EXC and PERCENTRANK.INC.

Student      Score   Percentrank exc   Percentrank inc
Student 1    10      91%               100%
Student 2    9       82%               89%
Student 3    8       73%               78%
Student 4    7       64%               67%
Student 5    6       55%               56%
Student 6    5       45%               44%
Student 7    4       36%               33%
Student 8    3       27%               22%
Student 9    2       18%               11%
Student 10   1       9%                0%
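A sketch of how I would expect this to be computed in SPL, assuming the events already carry Student and Score fields and that scores are unique (ties would need extra handling). PERCENTRANK.INC is (rank-1)/(n-1) and PERCENTRANK.EXC is rank/(n+1), where rank is the ascending rank:

<your base search returning Student and Score>
| sort 0 Score
| streamstats count as asc_rank
| eventstats count as n
| eval percentrank_exc = round(asc_rank / (n + 1) * 100, 0) . "%"
| eval percentrank_inc = round((asc_rank - 1) / (n - 1) * 100, 0) . "%"
| sort 0 - Score
| table Student Score percentrank_exc percentrank_inc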
I have a query to fetch the kernel version from all the Linux servers. We update the kernel patch every quarter, so I have to hardcode the kernel versions (e.g. 3.10.0-1160.92.1.el7.x86_64) in the search query every quarter. There are three versions that I need to hardcode in the search query. Is there a way to have the query pick up the versions automatically?
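One approach I am considering (a sketch; the lookup name, field names, and base search are placeholders for mine): keep the approved versions in a small lookup that gets updated each quarter, and pull them into the search with a subsearch instead of hardcoding them.

index=os sourcetype=linux_kernel
    [ | inputlookup approved_kernel_versions.csv
      | fields kernel_version ]
| stats count by host, kernel_version

This assumes the events have a field named kernel_version matching the lookup column, so the subsearch expands to (kernel_version="...") OR (kernel_version="...") OR (kernel_version="...").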
I have a simple drilldown on my dashboard where users can click on a cell that links to an external website. How can I get Splunk to log the URL that the user clicked? Is this possible to achieve? Thanks a lot.
Can someone please help me with this? I have the following query:

source=abc type=Change msg=" consumed" event_type="*"

Now, for each of the results above, I need to run the following:

source=abc AND type=Change AND msg=" finished" event_type=<the event_type from the first search>

Basically, for each event_type returned by the first search, run another search for the same event_type. What would the full query look like? Thanks
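A sketch of the shape I had in mind (assuming event_type is an extracted field in both sets of events, and keeping subsearch result limits in mind):

source=abc type=Change msg=" finished"
    [ search source=abc type=Change msg=" consumed" event_type="*"
      | dedup event_type
      | fields event_type ]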
Hi all, I have the query below, where I have a lookup file with error messages. I am trying to match the error messages in the lookup against the raw data and show them in a table. However, in my final result the query field comes out empty while all the other fields populate. I need help with the query; I was trying to add | lookup ErrorMessage.csv query OUTPUT query before the table command, but it is not working.

index=abc host="LINUX123" source="/new/dir/apps/servers/service*.log" "Error data*"
    [ | inputlookup ErrorMessage.csv
      | fields + ErrorMessage
      | rename ErrorMessage as query ]
| table _time, host, query, _raw

Lookup file content (ErrorMessage.csv):
File Not Found
Error data in client transaction

Thanks in advance
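For what it's worth, the workaround I have been sketching (assuming the lookup stays small; the case() branches hardcode the two messages, which is exactly what I would like to avoid long term, so a wildcard lookup defined in transforms.conf would be the cleaner route):

index=abc host="LINUX123" source="/new/dir/apps/servers/service*.log"
    [ | inputlookup ErrorMessage.csv
      | rename ErrorMessage as search
      | fields search ]
| eval query=case(
    match(_raw, "File Not Found"), "File Not Found",
    match(_raw, "Error data in client transaction"), "Error data in client transaction")
| table _time, host, query, _raw

Renaming the lookup column to "search" makes the subsearch return the messages as raw search terms rather than field=value pairs.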
How do I add a LINE_BREAKER in props.conf so that the events below are split into separate events? Currently they are coming in combined together:

Path =567 xcss sdsf
Path = 5673 dvgsdbdv v
Path = 43343 dvddv

I tried:
LINE_BREAKER = ([\r\n]+)\Path
SHOULD_LINEMERGE = FALSE
But it didn't work.
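A sketch of what I would try next (the sourcetype stanza name is a placeholder; the backslash before Path in my attempt looks like the culprit, and a lookahead keeps the word Path at the start of each event):

[my_sourcetype]
SHOULD_LINEMERGE = false
# break at newlines that are immediately followed by "Path"
LINE_BREAKER = ([\r\n]+)(?=Path\s*=)

# if the records actually arrive on a single line separated only by spaces,
# something like this might be needed instead:
# LINE_BREAKER = (\s+)(?=Path\s*=)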
index=gbts-vconnection * onEvent DISCONNECTED (host=Host1)
| rex field=_raw "(?ms)^(?:[^:\\n]*:){5}(?P<IONS>[^;]+)(?:[^:\\n]*:){8}(?P<Device>[^;]+)(?:[^;\\n]*;){4}\\w+:(?P<VDI>\\w+)" offset_field=_extracted_fields_bounds

This lists all devices that have disconnected. I'm trying to create a chart that lists only Macs or Windows machines, based on a keyword like "mac" or "laptop" in the Device name. I tried using the eval command but can't seem to get it working.
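A sketch of the eval I was attempting (the keyword-to-platform mapping is my assumption: "mac" means Mac and "laptop" means Windows):

<base search and rex as above>
| eval DeviceType=case(
    match(lower(Device), "mac"), "Mac",
    match(lower(Device), "laptop"), "Windows",
    true(), "Other")
| where DeviceType != "Other"
| chart count by DeviceType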
I have two lookups. One consists of the allowed URLs, the other of the URLs from a firewall. For example, the first contains:

google.com
dummy.com

The second contains:

site1.google.com
site2.google.com

The first lookup is ingested from a file sent by the FW team. I create the second lookup with this search:

index=my_firewall sourcetype=my_sourcetype (rule=rule_1 OR rule=rule_2 OR rule=rule_3) [ | inputlookup external_url.csv ]
| fields url
| dedup url
| table url
| outputlookup external_results.csv

This gives me the sites that have been reached over the time period. Next I use this search:

| inputlookup external_url.csv
| lookup external_results.csv url OUTPUTNEW url as isFound

I think this is giving me what I want, but I can't view the output the way I want. I would like to see:

allowed_url   fw_url              isFound
google.com    site_1.google.com   true
google.com    site_2.google.com   true
dummy.com                         false

TIA, Joe
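A sketch of what I have been experimenting with to get that layout (this assumes the allowed URLs are bare two-label domains, so the replace() that derives allowed_url from the firewall URL is naive and would break on things like co.uk):

| inputlookup external_url.csv
| rename url as allowed_url
| join type=left max=0 allowed_url
    [ | inputlookup external_results.csv
      | rename url as fw_url
      | eval allowed_url=replace(fw_url, "^.*?([^\.]+\.[^\.]+)$", "\1") ]
| eval isFound=if(isnotnull(fw_url), "true", "false")
| table allowed_url fw_url isFound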
I need to identify the count of events that have a duration less than the p95 value. Sample search:

index=xyz status=complete | stats p95(dur) as p95Dur

What can I add to the end of the search to get the number of events whose dur is less than the p95Dur value?
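A sketch of what I have in mind (swapping stats for eventstats so each event keeps its dur alongside the computed p95; eventstats holds results in memory, so very large result sets might need a different approach):

index=xyz status=complete
| eventstats p95(dur) as p95Dur
| stats count(eval(dur < p95Dur)) as below_p95, count as total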
I have data in two different applications. I need to get fields from one query to use as filters for another, like this:

app=app1
| rex field=environment_url "https:\/\/(?<app_name>.*)\.foo\.com"
| where app_name in [ search app=app2 | table app_name ]

app2 has a field named app_name, which I'm turning into a table. app1 doesn't have this field, but I'm creating and extracting it with a regex. I only want the app names from app1 if they exist in the table I'm creating from app2. This query isn't working for me; what can I do? Thank you for any help.
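A sketch of the alternative I am considering (assuming app_name values are identical strings in both apps and the app2 result set stays under the subsearch limits): let the subsearch feed the outer pipeline as a filter instead of using where ... in.

app=app1
| rex field=environment_url "https:\/\/(?<app_name>.*)\.foo\.com"
| search
    [ search app=app2
      | dedup app_name
      | fields app_name ]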