All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi, if I have process events like:

PID | ProcessName | CommandLine | SpawnedByPID
100 | process_1 | process_1_commandLine | 99
101 | process_2 | process_2_commandLine | 100
200 | process_3 | process_3_commandLine | 199
201 | process_4 | process_4_commandLine | 200

Is there any viz that will map processes in some folder/EDR-like tree (where I can also click on a node and get more info)? For example, the final results are based on PID, but the viz looks something like:

| -> process_name_99
|----> process_1 (on hover or click will get token process_1_commandLine)
|--------> process_2
| -> process_name_99
|----> process_3
|-------->process_4

Something like psTree, just more advanced and connected by PID, not by names.
Hi! Is it possible to report errors without throwing an exception / crashing the app? I'd like to report some custom user data for certain events, as described here, without throwing an exception: https://docs.appdynamics.com/appd/23.x/23.6/en/end-user-monitoring/mobile-real-user-monitoring/instrument-android-applications/customize-the-android-instrumentation#id-.CustomizetheAndroidInstrumentationv23.2-user-dataCustomUserData  I tried the following, but it wasn't reported, nor could I see it in the crashes view:

Instrumentation.setUserData("Custom_event_key", "Some event happened");
Instrumentation.reportError(e, ErrorSeverityLevel.CRITICAL);

If this is possible, where can I monitor that data in AppDynamics? Or is this just extra data which will only be added to crash reports?
Hello, we have a requirement for this as well. Is there any update to this discussion? We need to integrate data sourced from ThreatResponse into our Splunk solution.
That message appears when a query uses a token that has no value.  Check all tokens in the dashboard to make sure they are defined before the query executes.  Perhaps there is a spelling error somewhere.
Hello @yuanliu, addcoltotals will show up at the end of the rows, so if I have multiple pages, it will not show on the first page. In the real data I have more than 10 rows, so the addcoltotals row will not show up on the front page. Why did Splunk get 1129.3600000000001, not 1129.36? Thanks
Hello @bowesmana, addcoltotals will show up at the end of the rows, so if I have multiple pages, it will not show on the first page. Where did Splunk get 1129.3600000000001 from? The correct total should be 1129.36. Thanks
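A note for readers hitting the same 1129.3600000000001 surprise: Splunk stores numeric values as double-precision binary floats, so decimals such as 0.36 have no exact binary representation and long sums can accumulate a tiny error. A sketch of the usual workaround (Amount is a placeholder field name here — substitute whatever column you are totalling) is to round after addcoltotals:

```
... | addcoltotals labelfield=User label=Total
| eval Amount=round(Amount, 2)
```

Rounding to the displayed precision after the sum removes the artifact without changing any value that was already exact.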
Hello Team, everyone has probably seen this error: Error in 'TsidxStats': _time aggregations are not yet supported except for count/min/max/range/earliest/latest. I'm trying to understand which fields the stats command uses. I don't want to try every field. Can I see this list of fields in the GUI or CLI?
Hi, from the logs I have extracted the below data (Table 1). I would like to add another column, as in Table 2, with a custom keyword: if the filename begins with xyz, then "Core". Could you please suggest what Splunk query or logic we could apply?

Splunk Query: base search | rex field User | rex field Folder | rex field File | table User Folder File

Table 1:
User | Folder | File
ABC | first | xyz07122023

Table 2 (Required Output):
User | Folder | File | Consumer
ABC | first | xyz07122023 | Core
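One possible approach — a sketch, assuming the extracted field is literally named File and that rows not starting with xyz should be left blank (adjust both to taste) — is an eval with a prefix test appended to the existing search:

```
base search
| rex field User | rex field Folder | rex field File
| eval Consumer=if(like(File, "xyz%"), "Core", null())
| table User Folder File Consumer
```

like() with a trailing % wildcard matches the prefix; a case() expression would extend this to several keyword-to-consumer mappings.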
Hi all! I display a map in a dashboard. The map only contains one point, and I need to center the map dynamically on this point. But I can't do it, because the "center" option won't accept any value other than an integer (I tried a token, for example). How can I do this? Thank you a lot.

My search:

"ds_search_1_new": { "type": "ds.search", "options": { "query": "xxx | table lon lat", "queryParameters": { "earliest": "$time.earliest$", "latest": "$time.latest$" }, "enableSmartSources": true }, "name": "mapSearch" }

My visualization:

"viz_map_1": { "type": "splunk.map", "options": { "zoom": 0, "layers": [ { "type": "marker", "latitude": "> primary | seriesByName('lat')", "longitude": "> primary | seriesByName('lon')", "bubbleSize": "> primary | seriesByName('lat')" } ], "center": [ 0, 0 ] }, "dataSources": { "primary": "ds_search_1_new" }, "title": "" }
Thanks @gcusello - I got your point. The reason this needs to be almost real-time is that it includes live calls data, so as soon as a call lands, it should be reflected on this dashboard. For now, as you suggested, I have created a saved search (report) to run every minute, and the dashboard panels use | loadjob and refresh every 15 seconds. Alternatively, if I create a Splunk user with only (and limited) access to this dashboard, it can be used by anyone who wants to access this dashboard - in this case I would expect multiple search job requests won't be submitted, and hence this should not cause the performance issues I am currently seeing - is this understanding correct? Thank you.
Hello team, I am facing an issue setting up a cloud-like architecture using docker-splunk. I am following this page: https://github.com/splunk/docker-splunk/blob/develop/docs/advanced/DISTRIBUTED_TOPOLOGY.md and I am getting errors starting the SH and CM containers. I get the below error on sh1:

fatal: [localhost]: FAILED! => { "attempts": 60, "changed": false, "cmd": [ "/opt/splunk/bin/splunk", "init", "shcluster-config", "-auth", "admin:Abc@1234", "-mgmt_uri", "https://sh1:8089", "-replication_port", "9887", "-replication_factor", "2", "-conf_deploy_fetch_url", "https://dep1:8089", "-secret", "", "-shcluster_label", "shc_label" ], "delta": "0:00:00.593771", "end": "2023-12-06 07:05:46.787788", "rc": 22, "start": "2023-12-06 07:05:46.194017" } STDERR: WARNING: Server Certificate Hostname Validation is disabled. Please see server.conf/[sslConfig]/cliVerifyServerName for details. Required parameter secret does not have a value.

And this error on starting the cm1 container:

fatal: [localhost]: FAILED! => { 2023-12-07 11:02:09 "attempts": 5, 2023-12-07 11:02:09 "changed": false, 2023-12-07 11:02:09 "cmd": [ 2023-12-07 10:59:48 core/2.11/user_guide/become.html#risks-of-becoming-an-unprivileged-user 2023-12-07 10:59:49 [WARNING]: Using world-readable permissions for temporary files Ansible needs 2023-12-07 10:59:49 to create when becoming an unprivileged user. This may be insecure. For 2023-12-07 10:59:49 information on securing this, see https://docs.ansible.com/ansible- 2023-12-07 10:59:49 core/2.11/user_guide/become.html#risks-of-becoming-an-unprivileged-user 2023-12-07 10:59:49 [WARNING]: Using world-readable permissions for temporary files Ansible needs 2023-12-07 10:59:49 to create when becoming an unprivileged user. This may be insecure. 
For 2023-12-07 10:59:49 information on securing this, see https://docs.ansible.com/ansible- 2023-12-07 10:59:49 core/2.11/user_guide/become.html#risks-of-becoming-an-unprivileged-user 2023-12-07 10:59:49 [WARNING]: Using world-readable permissions for temporary files Ansible needs 2023-12-07 10:59:49 to create when becoming an unprivileged user. This may be insecure. For 2023-12-07 10:59:49 information on securing this, see https://docs.ansible.com/ansible- 2023-12-07 10:59:49 core/2.11/user_guide/become.html#risks-of-becoming-an-unprivileged-user 2023-12-07 11:02:09 "/opt/splunk/bin/splunk", 2023-12-07 11:02:09 "start", 2023-12-07 11:02:09 "--accept-license", 2023-12-07 11:02:09 "--answer-yes", 2023-12-07 11:02:09 "--no-prompt" 2023-12-07 11:02:09 ], 2023-12-07 11:02:09 "delta": "0:00:15.870844", 2023-12-07 11:02:09 "end": "2023-12-07 05:32:09.015177", 2023-12-07 11:02:09 "rc": 1, 2023-12-07 11:02:09 "start": "2023-12-07 05:31:53.144333" 2023-12-07 11:02:09 } 2023-12-07 11:02:09 2023-12-07 11:02:09 STDOUT: 2023-12-07 11:02:09 2023-12-07 11:02:09 2023-12-07 11:02:09 Splunk> Take the sh out of IT. 2023-12-07 11:02:09 2023-12-07 11:02:09 Checking prerequisites... 2023-12-07 11:02:09 Checking http port [8000]: open 2023-12-07 11:02:09 Checking mgmt port [8089]: open 2023-12-07 11:02:09 Checking appserver port [127.0.0.1:8065]: open 2023-12-07 11:02:09 Checking kvstore port [8191]: open 2023-12-07 11:02:09 Checking configuration... Done. 2023-12-07 11:02:09 Checking critical directories... Done 2023-12-07 11:02:09 Checking indexes... 2023-12-07 11:02:09 Validated: _audit _configtracker _internal _introspection _metrics _metrics_rollup _telemetry _thefishbucket history main summary 2023-12-07 11:02:09 Done 2023-12-07 11:02:09 Checking filesystem compatibility... Done 2023-12-07 11:02:09 Checking conf files for problems... 2023-12-07 11:02:09 Done 2023-12-07 11:02:09 Checking default conf files for edits... 
2023-12-07 11:02:09 Validating installed files against hashes from '/opt/splunk/splunk-9.1.2-b6b9c8185839-linux-2.6-x86_64-manifest' 2023-12-07 11:02:09 All installed files intact. 2023-12-07 11:02:09 Done 2023-12-07 11:02:09 All preliminary checks passed. 2023-12-07 11:02:09 2023-12-07 11:02:09 Starting splunk server daemon (splunkd)... 2023-12-07 11:02:09 Done 2023-12-07 11:02:09 2023-12-07 11:02:09 2023-12-07 11:02:09 Waiting for web server at http://127.0.0.1:8000 to be available............ 2023-12-07 11:02:09 2023-12-07 11:02:09 WARNING: web interface does not seem to be available! 2023-12-07 11:02:09 2023-12-07 11:02:09 2023-12-07 11:02:09 STDERR: 2023-12-07 11:02:09 2023-12-07 11:02:09 PYTHONHTTPSVERIFY is set to 0 in splunk-launch.conf disabling certificate validation for the httplib and urllib libraries shipped with the embedded Python interpreter; must be set to "1" for increased security 2023-12-07 11:02:09 2023-12-07 11:02:09 2023-12-07 11:02:09 MSG: 2023-12-07 11:02:09 2023-12-07 11:02:09 non-zero return code 2023-12-07 11:02:09 2023-12-07 11:02:09 PLAY RECAP ********************************************************************* 2023-12-07 11:02:09 localhost : ok=60 changed=2 unreachable=0 failed=1 skipped=48 rescued=0 ignored=0 2023-12-07 11:02:09   I am using this yaml file   version: "3.6" networks: splunknet: driver: bridge attachable: true services: sh1: networks: splunknet: aliases: - sh1 image: ${SPLUNK_IMAGE:-splunk/splunk:latest} hostname: sh1 container_name: sh1 environment: - SPLUNK_START_ARGS=--accept-license - SPLUNK_INDEXER_URL=idx1,idx2,idx3,idx4 - SPLUNK_SEARCH_HEAD_URL=sh2,sh3 - SPLUNK_SEARCH_HEAD_CAPTAIN_URL=sh1 - SPLUNK_CLUSTER_MASTER_URL=cm1 - SPLUNK_ROLE=splunk_search_head_captain - SPLUNK_DEPLOYER_URL=dep1 - SPLUNK_PASSWORD=Abc@1234 - SPLUNK_LICENSE_URI=/tmp/defaults/splunk_license_expire_on_January_02_2024.License - SPLUNK_APPS_URL - DEBUG=true ports: - 8000 - 8089 volumes: - ./defaults:/tmp/defaults sh2: networks: splunknet: 
aliases: - sh2 image: ${SPLUNK_IMAGE:-splunk/splunk:latest} hostname: sh2 container_name: sh2 environment: - SPLUNK_START_ARGS=--accept-license - SPLUNK_INDEXER_URL=idx1,idx2,idx3,idx4 - SPLUNK_SEARCH_HEAD_URL=sh2,sh3 - SPLUNK_SEARCH_HEAD_CAPTAIN_URL=sh1 - SPLUNK_CLUSTER_MASTER_URL=cm1 - SPLUNK_ROLE=splunk_search_head - SPLUNK_DEPLOYER_URL=dep1 - SPLUNK_PASSWORD=Abc@1234 - SPLUNK_LICENSE_URI=/tmp/defaults/splunk_license_expire_on_January_02_2024.License - SPLUNK_APPS_URL - DEBUG=true ports: - 8000 - 8089 volumes: - ./defaults:/tmp/defaults sh3: networks: splunknet: aliases: - sh3 image: ${SPLUNK_IMAGE:-splunk/splunk:latest} hostname: sh3 container_name: sh3 environment: - SPLUNK_START_ARGS=--accept-license - SPLUNK_INDEXER_URL=idx1,idx2,idx3,idx4 - SPLUNK_SEARCH_HEAD_URL=sh2,sh3 - SPLUNK_SEARCH_HEAD_CAPTAIN_URL=sh1 - SPLUNK_CLUSTER_MASTER_URL=cm1 - SPLUNK_ROLE=splunk_search_head - SPLUNK_DEPLOYER_URL=dep1 - SPLUNK_PASSWORD=Abc@1234 - SPLUNK_LICENSE_URI=/tmp/defaults/splunk_license_expire_on_January_02_2024.License - SPLUNK_APPS_URL - DEBUG=true ports: - 8000 - 8089 volumes: - ./defaults:/tmp/defaults dep1: networks: splunknet: aliases: - dep1 image: ${SPLUNK_IMAGE:-splunk/splunk:latest} hostname: dep1 container_name: dep1 environment: - SPLUNK_START_ARGS=--accept-license - SPLUNK_INDEXER_URL=idx1,idx2,idx3,idx4 - SPLUNK_SEARCH_HEAD_URL=sh2,sh3 - SPLUNK_SEARCH_HEAD_CAPTAIN_URL=sh1 - SPLUNK_CLUSTER_MASTER_URL=cm1 - SPLUNK_ROLE=splunk_deployer - SPLUNK_DEPLOYER_URL=dep1 - SPLUNK_PASSWORD=Abc@1234 - SPLUNK_LICENSE_URI - SPLUNK_APPS_URL - DEBUG=true ports: - 8000 - 8089 volumes: - ./defaults:/tmp/defaults cm1: networks: splunknet: aliases: - cm1 image: ${SPLUNK_IMAGE:-splunk/splunk:latest} hostname: cm1 container_name: cm1 environment: - SPLUNK_START_ARGS=--accept-license - SPLUNK_INDEXER_URL=idx1,idx2,idx3,idx4 - SPLUNK_SEARCH_HEAD_URL=sh2,sh3 - SPLUNK_SEARCH_HEAD_CAPTAIN_URL=sh1 - SPLUNK_CLUSTER_MASTER_URL=cm1 - SPLUNK_ROLE=splunk_cluster_master - 
SPLUNK_DEPLOYER_URL=dep1 - SPLUNK_PASSWORD=Abc@1234 - SPLUNK_LICENSE_URI - SPLUNK_APPS_URL - DEBUG=true ports: - 8000 - 8089 volumes: - ./defaults:/tmp/defaults idx1: networks: splunknet: aliases: - idx1 image: ${SPLUNK_IMAGE:-splunk/splunk:latest} hostname: idx1 container_name: idx1 environment: - SPLUNK_START_ARGS=--accept-license - SPLUNK_INDEXER_URL=idx1,idx2,idx3,idx4 - SPLUNK_SEARCH_HEAD_URL=sh2,sh3 - SPLUNK_SEARCH_HEAD_CAPTAIN_URL=sh1 - SPLUNK_CLUSTER_MASTER_URL=cm1 - SPLUNK_ROLE=splunk_indexer - SPLUNK_DEPLOYER_URL=dep1 - SPLUNK_PASSWORD=Abc@1234 - SPLUNK_LICENSE_URI - SPLUNK_APPS_URL - DEBUG=true ports: - 8000 - 8089 volumes: - ./defaults:/tmp/defaults idx2: networks: splunknet: aliases: - idx2 image: ${SPLUNK_IMAGE:-splunk/splunk:latest} hostname: idx2 container_name: idx2 environment: - SPLUNK_START_ARGS=--accept-license - SPLUNK_INDEXER_URL=idx1,idx2,idx3,idx4 - SPLUNK_SEARCH_HEAD_URL=sh2,sh3 - SPLUNK_SEARCH_HEAD_CAPTAIN_URL=sh1 - SPLUNK_CLUSTER_MASTER_URL=cm1 - SPLUNK_ROLE=splunk_indexer - SPLUNK_DEPLOYER_URL=dep1 - SPLUNK_PASSWORD=Abc@1234 - SPLUNK_LICENSE_URI - SPLUNK_APPS_URL - DEBUG=true ports: - 8000 - 8089 volumes: - ./defaults:/tmp/defaults idx3: networks: splunknet: aliases: - idx3 image: ${SPLUNK_IMAGE:-splunk/splunk:latest} hostname: idx3 container_name: idx3 environment: - SPLUNK_START_ARGS=--accept-license - SPLUNK_INDEXER_URL=idx1,idx2,idx3,idx4 - SPLUNK_SEARCH_HEAD_URL=sh2,sh3 - SPLUNK_SEARCH_HEAD_CAPTAIN_URL=sh1 - SPLUNK_CLUSTER_MASTER_URL=cm1 - SPLUNK_ROLE=splunk_indexer - SPLUNK_DEPLOYER_URL=dep1 - SPLUNK_PASSWORD=Abc@1234 - SPLUNK_LICENSE_URI - SPLUNK_APPS_URL - DEBUG=true ports: - 8000 - 8089 volumes: - ./defaults:/tmp/defaults idx4: networks: splunknet: aliases: - idx4 image: ${SPLUNK_IMAGE:-splunk/splunk:latest} hostname: idx4 container_name: idx4 environment: - SPLUNK_START_ARGS=--accept-license - SPLUNK_INDEXER_URL=idx1,idx2,idx3,idx4 - SPLUNK_SEARCH_HEAD_URL=sh2,sh3 - SPLUNK_SEARCH_HEAD_CAPTAIN_URL=sh1 - 
SPLUNK_CLUSTER_MASTER_URL=cm1 - SPLUNK_ROLE=splunk_indexer - SPLUNK_DEPLOYER_URL=dep1 - SPLUNK_PASSWORD=Abc@1234 - SPLUNK_LICENSE_URI - SPLUNK_APPS_URL - DEBUG=true ports: - 8000 - 8089 volumes: - ./defaults:/tmp/defaults   Can someone help me resolve this?
Hi, any kind of real-time attacks - unauthorized attacks, malicious access attempts, command-and-control traffic, inbound/outbound malicious traffic, port scanning, Palo Alto threat-detected traffic, etc.
Hi Team, I need to configure a Splunk alert to notify us when no logs have been updated on a given server (or several servers) for more than an hour. Below are the requirements: 1. A total of 40 servers require monitoring. 2. Each server has 3 log paths on average. NOTE: I've seen an existing solution where the config is meant for a single server host; I need a workable solution to cover all 40 servers. Please let me know if you need anything further.
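A common pattern for this kind of "silent source" alert — a sketch, not a tested config: monitored_hosts.csv is a hypothetical lookup listing your 40 hosts, and the index filter must match wherever your logs actually land — compares each expected host against its most recent event time via tstats:

```
| inputlookup monitored_hosts.csv
| join type=left host
    [| tstats latest(_time) AS last_seen where index=* by host]
| where isnull(last_seen) OR last_seen < relative_time(now(), "-1h")
```

Scheduled hourly, this returns one row per host that has been silent for over an hour; alert when the result count is greater than zero. Adding source to the tstats by clause (and to the lookup) would extend the same idea to the per-log-path requirement.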
Thanks @yuanliu, I was able to use this to derive from my actual logs
<label>Test</label>   <init>     <unset token="msg"></unset>     <unset token="form.msg"></unset>     <set token="showResult">true</set>   </init>   <fieldset submitButton="false"></fieldset>   <row>     <panel id="logo">       <html>         <p>           <img src="/static/app/CIS-CM)-CERT-Portal/cis-new.jpg" alt="Cis"/>         </p>       </html>     </panel>     <panel id="details">       <html>         <h>           <b> Check Report</b>         </h>         <div id="desc">           <p>             This report displays the  Check details.           </p>         </div>       </html>     </panel>   </row>   <row>     <panel id="hideshow">       <html depends="$hideglo$">         <a id="show">Show Filters</a>       </html>       <html rejects="$hideglo$">           <a id="hide">Hide Filters</a>       </html>     </panel>   </row>   <row>      <panel id="global" rejects="$hideglo$">       <input type="dropdown" id="orgselect" token="org" searchWhenChanged="false">         <label>Organization</label>         <showClearButton>false</showClearButton>         <search>           <query>|  `orgList`</query>           <earliest>0</earliest>           <latest>now</latest>         </search>         <fieldForLabel>cust_name</fieldForLabel>         <fieldForValue>cust_name</fieldForValue>         <prefix>em7_cust_name="</prefix>         <suffix>" em7_cust_name!=Cisco </suffix>       </input>       <input type="dropdown" id="region" token="region" searchWhenChanged="false">         <label>Region*</label>     <showClearButton>false</showClearButton>         <selectFirstChoice>true</selectFirstChoice>         <search>           <query>             |inputlookup cert_groups_lookup             | lookup cert_servers_lookup group_id OUTPUTNEW em7_org_id             | mvexpand em7_org_id             | dedup em7_org_id,group_id             | search em7_org_id="$cust_id$"             | sort 0 group_name           </query>           <earliest>0</earliest>           <latest>now</latest>   
      </search>         <fieldForLabel>group_name</fieldForLabel>         <fieldForValue>group_id</fieldForValue>         <prefix>group_id="</prefix>         <suffix>"</suffix>       </input>
Hi There!    I'm facing the error "Search is waiting for the input" <form stylesheet="dashboard.css,infobutton.css" script="multiselect_functions.js,infobutton.js" version="1.1" theme="dark"> <label>Agent Operational Dashboard</label> <description>v4.3</description> <init> <set token="agent_index">1T</set> <set token="console_stand_scope">OR `console_stand(*)`</set> <set token="form.cacp">*</set> <set token="form.sap">*</set> <set token="form.origin">*</set> </init> <search id="init"> <done> <condition match="isnull($scope$) OR $scope$ == &quot;agent_console_&quot;"> <set token="cmdb_scope">*</set> </condition> <condition match="$scope$ == &quot;agent_cmdb_&quot;"> <set token="cmdb_scope">IN</set> </condition> </done> <query> | makeresults </query> <earliest>$search_start$</earliest> <latest>$search_end$</latest> </search> <search> <query> | makeresults | eval LimitVersion_ens=`get_obsolete_version(Agent_Endpoint_Security)` | eval LimitVersion_agent=`get_obsolete_version(Agent_Agent)` </query> <done> <set token="ens_obsolete_version">$result.LimitVersion_ens$</set> <set token="agent_obsolete_version">$result.LimitVersion_agent$</set> </done> </search> <search id="compliance_agent"> <query> `compliance_agent_op("agent_index_source IN($agent_index$) $console_stand_scope$", now(), $timerange$, agent,$machine$, $scope$, $origin$, $country$, $cacp$, $sap$)` </query> <earliest>$search_start$</earliest> <latest>$search_end$</latest> </search> <search id="compliance_all_agent"> <query> `compliance_agent_op("`agent_scope_filter($cmdb_scope$)`", now(), $timerange$, agent,$machine$, $scope$, $origin$, $country$, $cacp$, $sap$)` </query> <earliest>$search_start$</earliest> <latest>$search_end$</latest> </search> <search> <done> <set token="search_start">$result.search_start$</set> <set token="search_end">$result.search_end$</set> </done> <query>| makeresults | fields - _time | eval now=now() | eval prev_day=if(strftime(now, "%a")="Mon" AND "$weekends$"="exclude", -3, -1) | 
eval search_start=relative_time(now, prev_day."d@d") | eval search_end=search_start + 86400</query> </search> <fieldset submitButton="false" autoRun="true"> <input type="multiselect" token="agent_index" searchWhenChanged="true"> <label>Choose Agent console</label> <choice value="1T,2A*,2S">All</choice> <choice value="1T">Agent Stand</choice> <choice value="2A*">Agent Scad</choice> <choice value="2S">Agent SCAPA</choice> <default>1T</default> <initialValue>1T</initialValue> <delimiter>, </delimiter> <change> <set token="agent_index_label">$label$</set> </change> <change> <condition match="like($agent_index$,&quot;%1T23%&quot;)"> <set token="console_stand_scope">OR `console_stand($cmdb_scope$)`</set> </condition> <condition match="!like($agent_index$,&quot;%1T23%&quot;)"> <set token="console_stand_scope"></set> </condition> </change> </input> <input type="dropdown" token="timerange" searchWhenChanged="true"> <label>Last Communication</label> <choice value="-1d@d">Previous day</choice> <choice value="-7d@d">Last 7 days</choice> <choice value="-15d@d">Last 15 days</choice> <choice value="-21d@d">Last 21 days</choice> <choice value="-30d@d">Last 30 days</choice> <choice value="-3mon">Last 3 months</choice> <choice value="-6mon">Last 6 months</choice> <choice value="-12mon">Last 1 year</choice> <change> <eval token="time_timechart">case($value$ == "-1d@d","1",$value$ == "-7d@d","2",$value$ == "-15d@d","3",$value$ == "-21d@d","4",$value$ == "-30d@d","5",$value$ == "-3mon","6",$value$ == "-6mon","7",$value$ == "-12mon","8")</eval> </change> <default>-15d@d</default> <initialValue>-15d@d</initialValue> </input> <input type="radio" token="origin" searchWhenChanged="true"> <label>Location</label> <choice value="*">All Locations</choice> <choice value="NAT">NAT</choice> <choice value="ROO">ROO</choice> <default>*</default> <initialValue>*</initialValue> <change> <unset token="form.country"></unset> </change> </input> <input type="multiselect" token="country" 
searchWhenChanged="true"> <label>Country</label> <search> <query>| inputlookup b1a_asset_country.csv where nat_roo="$origin$" | dedup country | fields country </query> <earliest>-24h@h</earliest> <latest>now</latest> </search> <delimiter> </delimiter> <fieldForLabel>country</fieldForLabel> <fieldForValue>country</fieldForValue> <choice value="*">All</choice> <default>*</default> <initialValue>*</initialValue> </input> <input type="multiselect" token="machine" searchWhenChanged="true"> <label>Machine type</label> <choice value="*">All</choice> <choice value="VDI">VDI</choice> <choice value="Industrial">Industrial</choice> <choice value="Stand">Stand</choice> <choice value="MacOS">MacOS</choice> <default>*</default> <initialValue>*</initialValue> </input> <input type="radio" token="business_assets" searchWhenChanged="true"> <label>Business assets</label> <choice value="*">All assets</choice> <choice value="cacp">CACP</choice> <choice value="sap">SAP</choice> <default>*</default> <initialValue>*</initialValue> <change> <condition match="$business_assets$ == &quot;cacp&quot;"> <set token="cacp">true</set> <set token="sap">*</set> </condition> <condition match="$business_assets$ == &quot;sap&quot;"> <set token="sap">true</set> <set token="cacp">*</set> </condition> <condition match="$business_assets$ == &quot;*&quot;"> <set token="sap">*</set> <set token="cacp">*</set> </condition> </change> </input> <input type="dropdown" token="scope" searchWhenChanged="true"> <label>Scope</label> <choice value="agent_console_">Agent Console</choice> <choice value="agent_cmdb_">CMDB</choice> <default>agent_console_</default> <initialValue>agent_console_</initialValue> <change> <condition match="$scope$ == &quot;agent_console_&quot;"> <unset token="cmdb_scope"></unset> <set token="cmdb_scope">*</set> </condition> <condition match="$scope$ == &quot;agent_cmdb_&quot;"> <unset token="cmdb_scope"></unset> <set token="cmdb_scope">IN</set> </condition> </change> </input> <input 
type="multiselect" token="office_filter" searchWhenChanged="true"> <label>Front/Back office (only Stand Global compliance)</label> <choice value="Front Office">Front Office</choice> <choice value="Back Office">Back Office</choice> <initialValue>Front Office,Back Office</initialValue> <default>Front Office,Back Office</default> <valuePrefix>"</valuePrefix> <valueSuffix>"</valueSuffix> <delimiter>, </delimiter> <change> <eval token="office_filter_drilldown">replace($form.office_filter$ + "","([^,]+),?","&amp;form.office_filter=$1")</eval> </change> </input> <input type="radio" token="weekends" searchWhenChanged="true"> <label>Weekends</label> <choice value="exclude">Exclude Weekends</choice> <choice value="include">Include Weekends</choice> <default>exclude</default> <initialValue>exclude</initialValue> </input> </fieldset> <row> <panel> <title>Full Perimeter Compliance (all EPO)</title> <chart> <title>All Consoles</title> <search base="compliance_all_agent"> <query>| chart count by $scope$global_compliance | sort $scope$global_compliance</query> </search> <option name="charting.chart">pie</option> <option name="charting.drilldown">all</option> <option name="charting.fieldColors">{"Compliant":0x55AA55,"Non Compliant":0xCC0000","Not Applicable":"0xFFC300 "}</option> <option name="charting.seriesColors">[0x55AA55, 0xCC0000]</option> <option name="refresh.display">progressbar</option> <drilldown> <link target="_blank">/app/agent_operational_antivirus_details?form.compliance_filter=$click.value$&amp;form.agent_index=*&amp;form.timerange=$timerange$&amp;form.antivirus_filter=*&amp;form.machine=$machine$&amp;form.origin=$origin$&amp;form.country=$country$&amp;form.business_assets=$business_assets$&amp;form.scope=$scope$</link> </drilldown> </chart> </panel> </row> Thanks in Advance!!!!
I created a KV store on the search head cluster. When I clean up the environment and want to use the API or CLI to delete and recreate the KV store, I find that the KV store's data recovers on its own after a period of time. Why is this? BR
Hi @Pravinsugi, let me understand: for each customerOrderNumber, you have three Received message classes, and you want to check that this is true for each one, is that correct? First of all, next time please share your samples in text format, to avoid having to rewrite all of them. Then, I suppose that you have already extracted the two fields customerOrderNumber and Received_message_class; otherwise you have to extract them, but to help you with this I need your sample logs in text format. Anyway, you could run something like this: <your_search> | stats dc(Received_message_class) AS Received_message_class_count values(Received_message_class) AS Received_message_class BY customerOrderNumber | eval status=if(Received_message_class_count=3,"OK","there are only the following Messages: ".Received_message_class) | table customerOrderNumber status Ciao. Giuseppe
I have three events like "Received message class". If you see the pic, you will see 3 events for each customer. Each event has a customerOrderNumber. I want to check, for each and every customer, that I have all three event messages in the Splunk log. How do I write a Splunk query for that?
Is this what you mean?  Please let me know if I have misunderstood and thank you again   { "visualizations": { "viz_glNXouAy": { "type": "splunk.singlevalue", "options": {}, "dataSources": { "primary": "ds_gREZNTgj" }, "context": {}, "showProgressBar": false, "showLastUpdated": false }, "viz_1ibEKiXT": { "type": "splunk.singlevalue", "options": {}, "dataSources": { "primary": "ds_PozPBYIA_ds_gREZNTgj" }, "context": {}, "showProgressBar": false, "showLastUpdated": false }, "viz_rYhOWilO": { "type": "splunk.events", "options": {}, "dataSources": { "primary": "ds_Aoy6m25x_ds_PozPBYIA_ds_gREZNTgj" }, "context": {}, "showProgressBar": false, "showLastUpdated": false }, "viz_HS70GboS": { "type": "splunk.singlevalue", "options": {}, "dataSources": { "primary": "ds_n4Q7l7oK" }, "context": {}, "showProgressBar": false, "showLastUpdated": false }, "viz_cFoVJm3n": { "type": "splunk.singlevalue", "options": {}, "dataSources": { "primary": "ds_BgOJ54ak" }, "context": {}, "showProgressBar": false, "showLastUpdated": false }, "viz_Oh0reZaV": { "type": "splunk.markdown", "options": { "markdown": "base0\n\nsearch base0\n```\nindex=_internal\n```" } }, "viz_5jz1AX2v": { "type": "splunk.markdown", "options": { "markdown": "base0chain1a\n\nsearch base0\n```\nindex=_internal\n```\n\nsearch chain1a\n```\n| search useTypeahead=true\n```" } }, "viz_A5Kcf02B": { "type": "splunk.markdown", "options": { "markdown": "base0chain1achain2a\n\nsearch base0\n```\nindex=_internal\n```\n\nsearch chain1a\n```\n| search useTypeahead=true\n```\n\nsearch chain2a\n```\n| stats count\n```" } }, "viz_ymCZMl2z": { "type": "splunk.markdown", "options": { "markdown": "combined non-chained search\n```\nindex=_internal \n| search useTypeahead=true \n| stats count\n```\n" } }, "viz_rjA0XgMd": { "type": "splunk.markdown", "options": { "markdown": "base0\n\nexpected" } }, "viz_c4r59ekz": { "type": "splunk.markdown", "options": { "markdown": "base0chain1a\n\n**unexpected**" } }, "viz_hFoP7IsM": { "type": 
"splunk.markdown", "options": { "markdown": "base0chain1achain2a\n\n**unexpected**" } }, "viz_7lRWKreL": { "type": "splunk.markdown", "options": { "markdown": "combined non-chained search\n\nexpected" } } }, "dataSources": { "ds_n4Q7l7oK": { "type": "ds.search", "options": { "query": "index=_internal", "queryParameters": { "earliest": "$global_time.earliest$", "latest": "$global_time.latest$" } }, "name": "base0" }, "ds_gQjuR7jY": { "type": "ds.search", "options": { "query": "index=_internal\n| search useTypeahead=true", "queryParameters": { "earliest": "$global_time.earliest$", "latest": "$global_time.latest$" } }, "name": "base1" }, "ds_gREZNTgj": { "type": "ds.chain", "options": { "extend": "ds_gQjuR7jY", "query": "| stats count" }, "name": "base1chain2" }, "ds_PozPBYIA_ds_gREZNTgj": { "type": "ds.chain", "options": { "extend": "ds_Aoy6m25x_ds_PozPBYIA_ds_gREZNTgj", "query": "| stats count" }, "name": "base0chain1achain2a" }, "ds_Aoy6m25x_ds_PozPBYIA_ds_gREZNTgj": { "type": "ds.chain", "options": { "extend": "ds_n4Q7l7oK", "query": "| search useTypeahead=true" }, "name": "base0chain1a" }, "ds_BgOJ54ak": { "type": "ds.search", "options": { "query": "index=_internal \n| search useTypeahead=true \n| stats count" }, "name": "base0chain1achain2aFull" } }, "defaults": { "dataSources": { "ds.search": { "options": { "queryParameters": { "latest": "$global_time.latest$", "earliest": "$global_time.earliest$" } } } } }, "inputs": { "input_global_trp": { "type": "input.timerange", "options": { "token": "global_time", "defaultValue": "-24h@h,now" }, "title": "Global Time Range" } }, "layout": { "type": "absolute", "options": { "width": 1440, "height": 1200, "display": "auto" }, "structure": [ { "item": "viz_glNXouAy", "type": "block", "position": { "x": 0, "y": 940, "w": 270, "h": 300 } }, { "item": "viz_1ibEKiXT", "type": "block", "position": { "x": 580, "y": 90, "w": 270, "h": 300 } }, { "item": "viz_rYhOWilO", "type": "block", "position": { "x": 290, "y": 90, "w": 270, 
"h": 300 } }, { "item": "viz_HS70GboS", "type": "block", "position": { "x": 0, "y": 90, "w": 270, "h": 300 } }, { "item": "viz_cFoVJm3n", "type": "block", "position": { "x": 1170, "y": 90, "w": 270, "h": 300 } }, { "item": "viz_Oh0reZaV", "type": "block", "position": { "x": 0, "y": 390, "w": 290, "h": 300 } }, { "item": "viz_5jz1AX2v", "type": "block", "position": { "x": 280, "y": 390, "w": 290, "h": 300 } }, { "item": "viz_A5Kcf02B", "type": "block", "position": { "x": 580, "y": 390, "w": 290, "h": 300 } }, { "item": "viz_ymCZMl2z", "type": "block", "position": { "x": 1170, "y": 390, "w": 290, "h": 300 } }, { "item": "viz_rjA0XgMd", "type": "block", "position": { "x": 0, "y": 0, "w": 290, "h": 80 } }, { "item": "viz_c4r59ekz", "type": "block", "position": { "x": 290, "y": 0, "w": 290, "h": 80 } }, { "item": "viz_hFoP7IsM", "type": "block", "position": { "x": 580, "y": 0, "w": 290, "h": 80 } }, { "item": "viz_7lRWKreL", "type": "block", "position": { "x": 1150, "y": 10, "w": 290, "h": 80 } } ], "globalInputs": [ "input_global_trp" ] }, "description": "", "title": "Chain Test" }