All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello, is there any command we can use to check which version of the Universal Forwarder is actually running?
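A minimal sketch of two common approaches, assuming a default install path on the forwarder and that the forwarder's internal logs reach your indexers. On the forwarder host itself:

$SPLUNK_HOME/bin/splunk version

Or from the search head, reading the forwarder connection metadata that splunkd records (hostname and version come from the tcpin_connections metrics events):

index=_internal source=*metrics.log* group=tcpin_connections
| stats latest(version) AS forwarder_version by hostname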
Hi Team, I want to pull the license usage stats for four or five particular hosts for the last 30 days, with a time span of 1 day, in GB, and bring it into a dashboard, so kindly help out with the query. Host information: host 1 = xyz, host 2 = abc, host 3 = def, host 4 = ghi, host = vbg
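A sketch that may help, assuming the search runs on (or can reach the _internal index of) the license master; b (bytes) and h (host) are the standard field names in license_usage.log, but verify them against your data, and note that h can be squashed to empty on busy license masters:

index=_internal source=*license_usage.log type="Usage" h IN (xyz, abc, def, ghi, vbg) earliest=-30d@d
| eval GB=round(b/1024/1024/1024,3)
| timechart span=1d sum(GB) AS GB by h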
I would like to run a search query every few minutes; how can I do that? E.g. index="a" sourcetype="b". Any help is appreciated.
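The usual answer is a scheduled report: Save As > Report in Splunk Web, then Edit > Edit Schedule with a cron expression. The equivalent savedsearches.conf stanza might look like this sketch (the stanza name and five-minute cadence are illustrative):

[my_every_5_min_search]
search = index="a" sourcetype="b"
cron_schedule = */5 * * * *
enableSched = 1
dispatch.earliest_time = -5m@m
dispatch.latest_time = now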
I read https://docs.splunk.com/Documentation/MLApp/5.3.1/API/SavingModels and it highlights how to create a custom model and save it with codecs in a way that Splunk can understand. Is there any way that I can decode my model (not a custom one, rather one created via the fit command) so I can see its structure and properties? I want to be able to export the model to use it in a personal system. Appreciate the help.
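For what it's worth, models created with fit are stored as lookup files named __mltk_<model_name>.csv in the owning app's lookups directory, with the serialized model in an encoded payload. For algorithms that support it, the MLTK summary command exposes the learned structure and parameters without decoding that file by hand; a sketch with a hypothetical model name:

| summary my_model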
I started with the following query, required to join a knowledge library with discovered hosts. The results are stored in a summary index for quick(er) retrieval. The first query is usually less than 100 events, the second is 70,000+ every time, but the whole thing runs in less than 60s. The problem is the 50,000-result JOIN subsearch limitation.

index=qualys sourcetype="qualys:hostDetection" QID=* SEVERITY IN (3 4 5) LAST_SCAN_DATETIME=* earliest=-15min latest=now()
| join type=inner QID [search index=qualys sourcetype="qualys:knowledgebase" QID=* earliest=1 latest=now() | dedup QID | table QID THREAT_INTEL_VALUES CVSS_V3_BASE PUBLISHED_DATETIME THREAT_INTEL_IDS VENDOR_REFERENCE TITLE]
| table _time IP DNS NETBIOS TRACKING_METHOD OS TAGS QID TITLE TYPE SEVERITY STATUS LAST_SCAN_DATETIME LAST_FOUND_DATETIME LAST_FIXED_DATETIME PUBLISHED_DATETIME THREAT_INTEL_VALUES THREAT_INTEL_IDS CVSS_V3_BASE VENDOR_REFERENCE RESULTS

To overcome the JOIN/subsearch limit, and maybe gain some efficiencies, I tried using eventstats instead. The resulting query is below; it runs for over an hour with questionable results (it never really finishes). I'm pretty sure it is not giving me the same output as the JOIN. What am I doing wrong? New query:

(index=syn_sec_qualys sourcetype="qualys:hostDetection" QID=* SEVERITY IN (3 4 5) LAST_SCAN_DATETIME=* earliest=-15m@m latest=now) OR (index=syn_sec_qualys sourcetype="qualys:knowledgebase" QID=* earliest=1 latest=now)
| eventstats values(_time) AS _time values(DNS) AS DNS values(TRACKING_METHOD) AS TRACKING_METHOD values(NETBIOS) AS NETBIOS values(OS) AS OS values(TAGS) AS TAGS values(TITLE) AS TITLE values(TYPE) AS TYPE values(SEVERITY) AS SEVERITY values(LAST_SCAN_DATETIME) AS LAST_SCAN_DATETIME values(LAST_FOUND_DATETIME) AS LAST_FOUND_DATETIME values(LAST_FIXED_DATETIME) AS LAST_FIXED_DATETIME values(PUBLISHED_DATETIME) AS PUBLISHED_DATETIME values(THREAT_INTEL_VALUES) AS THREAT_INTEL_VALUES values(THREAT_INTEL_IDS) AS THREAT_INTEL_IDS values(CVSS_V3_BASE) AS CVSS_V3_BASE values(VENDOR_REFERENCE) AS VENDOR_REFERENCE values(RESULTS) AS RESULTS BY IP, QID
| table _time IP DNS NETBIOS TRACKING_METHOD OS TAGS QID TITLE TYPE SEVERITY STATUS LAST_SCAN_DATETIME LAST_FOUND_DATETIME LAST_FIXED_DATETIME PUBLISHED_DATETIME THREAT_INTEL_VALUES THREAT_INTEL_IDS CVSS_V3_BASE VENDOR_REFERENCE RESULTS
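One likely culprit: the eventstats groups BY IP, QID, but the knowledgebase events carry no IP, so their field values never merge onto the detection rows, while the 70,000+ knowledgebase events balloon the working set. A sketch of the usual merge pattern, grouping by QID only (the join key the knowledgebase actually has) and then keeping just the detection events; the field list is trimmed for brevity and untested against your data:

(index=syn_sec_qualys sourcetype="qualys:hostDetection" SEVERITY IN (3 4 5) earliest=-15m@m) OR (index=syn_sec_qualys sourcetype="qualys:knowledgebase" earliest=1)
| eventstats values(TITLE) AS TITLE values(CVSS_V3_BASE) AS CVSS_V3_BASE values(PUBLISHED_DATETIME) AS PUBLISHED_DATETIME values(VENDOR_REFERENCE) AS VENDOR_REFERENCE by QID
| where sourcetype="qualys:hostDetection"
| table _time IP DNS QID TITLE SEVERITY CVSS_V3_BASE PUBLISHED_DATETIME VENDOR_REFERENCE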
Updated the post since the error changed to "Error in 'where' command. The expression is malformed. Expected )". My aim is to use two separate strings of tokens in my search to pass the following: 1) "Start of the Month" + "Year" ($month1$ $year$), and 2) "End of the Month" + "Year" ($month2$ $year$). I was trying to combine the strings using <eval> within the <change> step but had no luck finding a guide to implement it properly. Would it be better just to add the <eval> within the search, or to do it during the change to avoid any complications?

<form theme="dark">
<label>CSC/ERSC/PSI PAGING Report</label>
<fieldset submitButton="true" autoRun="true">
<input type="dropdown" token="lpar">
<label>Select to View</label>
<choice value="----">----</choice>
<choice value="D7X0">D7X0</choice>
<choice value="H7X0">H7X0</choice>
<choice value="D1D0">D1D0</choice>
<choice value="DAD0">DAD0</choice>
<choice value="E1D0">E1D0</choice>
<choice value="H1D0">H1D0</choice>
<choice value="WSYS">WSYS</choice>
<choice value="YSYS">YSYS</choice>
<default>----</default>
</input>
<input type="dropdown" token="year">
<label>Select Year</label>
<choice value="----">----</choice>
<choice value="2022">2022</choice>
<default>----</default>
</input>
<input type="dropdown" token="month1">
<label>Select Month</label>
<choice value="****">****</choice>
<choice value="01/01/">January</choice>
<choice value="02/01/">February</choice>
<choice value="03/01/">March</choice>
<choice value="04/01/">April</choice>
<choice value="05/01/">May</choice>
<choice value="06/01/">June</choice>
<choice value="07/01/">July</choice>
<choice value="08/01/">August</choice>
<choice value="09/01/">September</choice>
<choice value="10/01/">October</choice>
<choice value="11/01/">November</choice>
<choice value="12/01/">December</choice>
<default>****</default>
<change>
<condition label="****"><set token="month2">----</set></condition>
<condition label="January"><set token="month2">01/31/</set></condition>
<condition label="February"><set token="month2">02/29/</set></condition>
<condition label="March"><set token="month2">03/31/</set></condition>
<condition label="April"><set token="month2">04/30/</set></condition>
<condition label="May"><set token="month2">05/31/</set></condition>
<condition label="June"><set token="month2">06/30/</set></condition>
<condition label="July"><set token="month2">07/31/</set></condition>
<condition label="August"><set token="month2">08/31/</set></condition>
<condition label="September"><set token="month2">09/30/</set></condition>
<condition label="October"><set token="month2">10/31/</set></condition>
<condition label="November"><set token="month2">11/30/</set></condition>
<condition label="December"><set token="month2">12/31/</set></condition>
</change>
</input>
</fieldset>
<row>
<panel>
<chart>
<search>
<query>index=mainframe-platform sourcetype="mainframe:mpage" MVS_SYSTEM_ID=$lpar$ | eval DATE=strftime(strptime(DATE,"%d%b%Y"),"%Y-%m-%d") | eval _time=strptime(DATE." ","%Y-%m-%d") | where _time &gt;= strptime("$month1$""$year$", "%m/%d/%Y") AND _time &lt;= strptime("$month2$""$year$", "%m/%d/%Y") | chart sum(VIO_PAGING_SEC) as "$lpar$ Sum of VIO_PAGING_SEC" sum(SYSTEM_PAGEFAULTS_SEC) as "$lpar$ SYSTEM_PAGEFAULTS_SEC" sum(SWAP_PAGIN_SEC) as "$lpar$ SWAP_PAGIN_SEC" sum(LOCAL_PAGEFAULTS_SEC) as "$lpar$ LOCAL_PAGEFAULTS_SEC" over _time</query>
<earliest>0</earliest>
<latest></latest>
</search>
<option name="charting.chart">column</option>
<option name="charting.drilldown">none</option>
</chart>
</panel>
</row>
</form>

Would appreciate the help.
Hi, I have a filter for selecting country values, provided as a dropdown. We have options like Singapore, Malaysia, China, Vietnam, and also an option for ALL. Based on that selection, I have a panel that shows success/failure count graphs. The issue I am facing: when I run the dashboard, I get values like (NULL, VALUE, OTHER, 18, 38) in the countryCode column, but I don't see any event where the countryCode parameter has these values. So, can you help with fixing this issue? Thanks, Sahana
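Hard to be sure without the panel search, but NULL and OTHER are typically introduced by the chart/timechart command rather than by your events: OTHER aggregates split-by values beyond the default series limit of 10, and NULL collects events where the split-by field is missing. A sketch of the options that suppress both (the base search here is assumed; adapt it to yours):

index=your_index ... | chart count by countryCode limit=0 useother=f usenull=f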
I am looking for a Splunk query to find out the Windows Remote Desktop Service status, and also to find whether port 3389 is listening on the server.
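A sketch of one way, assuming Windows System event logs are collected (the Service Control Manager writes EventCode 7036 when a service changes state) and, for the port check, that the Splunk Add-on for Microsoft Windows listening-ports scripted input is enabled; the sourcetype names are assumptions to verify against your deployment:

source="WinEventLog:System" EventCode=7036 Message="*Remote Desktop Services*"
| stats latest(Message) AS last_state latest(_time) AS last_change by host

sourcetype=Script:ListeningPorts 3389
| stats latest(_time) AS last_seen by host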
I need to push data from a Splunk report to Graphite. I know there's an archived app on Splunkbase, but I'm sure its Python is incompatible with Splunk 8.2.4. Does anyone have a method for pushing scheduled report data to Graphite?
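In the absence of a maintained app, one workable pattern is a custom alert action on the scheduled report (or an external script polling the report via the REST API) that writes Graphite's plaintext protocol, "metric value timestamp", to TCP port 2003. A minimal Python sketch; the Graphite host and metric path are assumptions:

import socket, time

def send_to_graphite(rows, host="graphite.example.com", port=2003):
    # rows: iterable of (metric_path, value) pairs taken from the report results
    now = int(time.time())
    payload = "".join("%s %s %d\n" % (metric, value, now) for metric, value in rows)
    with socket.create_connection((host, port)) as sock:
        sock.sendall(payload.encode("ascii"))

send_to_graphite([("splunk.my_report.error_count", 42)])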
We are receiving data via a data diode; however, the event logs are from multiple hosts. How can we parse the data from the different hosts and direct it to the indexers?
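If everything arrives under the diode's host value, the usual fix is an index-time host override on the indexers (or on a heavy forwarder in front of them): a props/transforms pair that rewrites the host metadata from a pattern in the raw event. A sketch; the sourcetype name and regex are placeholders to adapt to your actual event format:

# props.conf
[your_diode_sourcetype]
TRANSFORMS-set_host = set_host_from_event

# transforms.conf
[set_host_from_event]
REGEX = hostname=(\S+)
DEST_KEY = MetaData:Host
FORMAT = host::$1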
Hi guys, I'm a Splunk noob here and I'm going nuts. I know this is an extremely simple search and I can't get it right. I'm trying to create a search for remote access applications based on our firewall index. IP CIDR ranges are pulled from a lookup file (network_assets.csv) and matched against the source IP from my events. There are fields in the lookup file that do not exist in the events; I'm particularly interested in adding the field called usertags (which is included in the lookup). I am using this link as a reference and I can't get it to work: https://community.splunk.com/t5/Splunk-Search/How-do-I-append-columns-to-a-search-via-inputlookup-where-the/m-p/402136

index=fw | search appcat=Remote.Access | search app!="RDP" AND app!="WMI.DCERPC" | lookup network_assets.csv cidr | eval cidr=src | search usertags="*server*" | table src dest app url appcat usertags

My search currently does not give me any results. Any help would be much appreciated.
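Two things stand out: the lookup call never maps cidr to src (the eval after it runs too late to help), and CIDR matching only happens when the lookup definition requests it. A sketch, assuming you create a lookup definition named network_assets with match_type = CIDR(cidr) (Settings > Lookups > Lookup definitions > Advanced options, or the transforms.conf below):

[network_assets]
filename = network_assets.csv
match_type = CIDR(cidr)

index=fw appcat=Remote.Access app!="RDP" app!="WMI.DCERPC"
| lookup network_assets cidr AS src OUTPUT usertags
| search usertags="*server*"
| table src dest app url appcat usertags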
Hi Friends, I am trying to list all the available Splunk lookups and want to display the count of records present in each lookup. I found a REST command to list all the lookups, but how do I get the record count for each one? REST command: | rest /servicesNS/-/-/data/lookup-table-files | table title. I want to display the count of records in each lookup in another column. Is it possible with SPL? Requesting your valuable feedback and help. Thank you in advance. Himanshu
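SPL can do it with map, which runs one inputlookup per file, so cap maxsearches sensibly and expect it to skip lookups you lack permission to read; a sketch:

| rest /servicesNS/-/-/data/lookup-table-files
| table title
| map maxsearches=200 search="| inputlookup \"$title$\" | stats count AS records | eval lookup=\"$title$\""
| table lookup records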
Hi All, I am new to the UF on Windows, and here is the deployment in my lab: 1 Splunk Enterprise instance running on CentOS 8, and 1 UF running on Windows pointing to the instance above. For now, I am able to retrieve the events in the search bar with something like host="DESKTOP-JQJVH8A" source="WinEventLog:Security". What I am confused about is the configuration files: outputs.conf is in D:\SplunkUniversalForwarder\etc\system\local, while inputs.conf is in D:\SplunkUniversalForwarder\etc\apps\SplunkUniversalForwarder\local. Why is inputs.conf not in the same directory as outputs.conf; is this owing to the installation? Say I would like to add some more stanzas to inputs.conf: do I need to create a new inputs.conf in etc\system\local, or modify the existing one in etc\apps\SplunkUniversalForwarder\local? Thanks.
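The inputs.conf under etc\apps\SplunkUniversalForwarder\local is written by the Windows installer when you select event logs during setup, while the installer drops the forwarding target into etc\system\local\outputs.conf, hence the split. Both locations are read, and etc\system\local takes the highest precedence on conflicts, so adding new stanzas to a new etc\system\local\inputs.conf works fine; a sketch (the stanza is illustrative):

[WinEventLog://Application]
disabled = 0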
Hi, I have a Splunk Cloud trial instance. I am using a Spring Boot application to make a simple HTTP POST call to the HEC in batch mode. The format of the events is JSON and I am not adding line breaks between two events. Splunk is receiving the requests and adding them as events. However, each of my events is getting truncated and does not show up as well-formed JSON. I can see that the entire event is not being added, and when I measured the size of each event, it came to about 10 KB. I then found this: https://docs.splunk.com/Documentation/Splunk/latest/Data/Configureeventlinebreaking. Specifically, I think I'm being impacted by this: "The Splunk platform uses the LINE_BREAKER and TRUNCATE settings to evaluate and break events over 10kB into multiple lines of 10kB each." Questions: 1. Is there no way to send events larger than 10 KB to Splunk Cloud? 2. If it is indeed supported, what configuration do we need that can be performed via Splunk Web, since we don't have access to config files etc. in Splunk Cloud? Is it something related to Source Types (advanced config)?
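Events larger than 10 KB are supported; the 10,000-byte cap is just the default TRUNCATE value for most source types. In Splunk Cloud you can raise it without file access: Settings > Source types > (your HEC source type) > Edit > Advanced, and set TRUNCATE, equivalent to this props.conf sketch with an assumed sourcetype name:

[my_hec_json]
TRUNCATE = 100000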
My company was acquired and we just migrated email domains, but we need to update all users' email addresses so they can use Google auth to sign in. I can't modify email addresses as an admin. Any simple solution?
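If these are local Splunk accounts rather than splunk.com identities, the user management REST endpoint can rewrite the email attribute; a sketch against a self-managed management port, where the credentials, host, and username are placeholders:

curl -k -u admin:changeme https://localhost:8089/services/authentication/users/jsmith -d email="jsmith@newdomain.com"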
I am trying to install the Cluster Agent with the Kubernetes CLI and I am getting the error below:

#kubectl create -f cluster-agent.yaml
error: error validating "cluster-agent.yaml": error validating data: [ValidationError(Clusteragent.spec): unknown field "account" in com.appdynamics.v1alpha1.Clusteragent.spec, ValidationError(Clusteragent.spec): unknown field "appName" in com.appdynamics.v1alpha1.Clusteragent.spec, ValidationError(Clusteragent.spec): unknown field "controllerUrl" in com.appdynamics.v1alpha1.Clusteragent.spec, ValidationError(Clusteragent.spec): unknown field "serviceAccountName" in com.appdynamics.v1alpha1.Clusteragent.spec]; if you choose to ignore these errors, turn validation off with --validate=false

When I change the apiVersion to appdynamics.com/v1:

kubectl create -f cluster-agent.yaml
error: resource mapping not found for name: "k8s-cluster-agent" namespace: "appdynamics" from "cluster-agent.yaml": no matches for kind "Clusteragent" in version "appdynamics.com/v1" ensure CRDs are installed first

Please help me resolve this.

kubectl version
Client Version: v1.24.0
Kustomize Version: v4.5.4
Server Version: v1.22.6
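Both errors usually mean the Clusteragent CRD/operator present on the cluster is older than the spec being applied, and on Kubernetes 1.22+ the legacy v1beta1 CRD manifests no longer install at all, so a current cluster-agent operator has to go in before the Clusteragent resource. A sketch, with the operator file name assumed from AppDynamics' distribution bundle:

kubectl create namespace appdynamics
kubectl create -f cluster-agent-operator.yaml
kubectl create -f cluster-agent.yaml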
Hi All, I am trying to build the parsing stanza for one of my data feeds. While testing, I get a pop-up message stating that Splunk "could not use strptime to parse timestamp from "2022-26-05T11:29:57"". As soon as I apply the TIME_FORMAT setting, Splunk throws this message. I am not sure what I am missing here, so could you please help me resolve this issue?

Event details:

<Event CompactMode="1" sEventType="OpResult" dwBasicEventType="9" dwAppSpecificEventID="5000" sEventID="EVENT_ID_SCHEDULER_STARTED" sOriginatingApplicationName="RED Identity Management Console" sOriginatingApplicationComponent="Scheduler" sOriginatingApplicationVersion="5.5.3.0" sOriginatingSystem="XXXXXXXXXXXXX" sOriginatingAccount="XXXX\XXXXX" dtPostTime="2022-26-05T11:29:57" sMessage="RED Identity Management Console (running as user XXXX\XXXXX) on system XXXXXXXXXXXXX; - background processor started"/>

Props stanza:

SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)\<Event
NO_BINARY_CHECK=true
TIME_PREFIX=dtPostTime\=\"
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD=20

Event details:

[5/26/2022 4:09:55 PM UTC] Note: Unknown provider type; cannot verify object name 'tbl_BaseJobInfo' valid for data store.

Props.conf:

SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)\[\d+\/\d{2}\/\d{4}\s\d+\:\d{2}\:\d{2}\s[^\]]+\]
NO_BINARY_CHECK=true
disabled=false
TIME_PREFIX=^\[
TIME_FORMAT=%m-%d-%Y %I:%M:%S %p %s
MAX_TIMESTAMP_LOOKAHEAD=25
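Both TIME_FORMAT values look mismatched against the samples. dtPostTime="2022-26-05T11:29:57" is year-day-month, so %Y-%m-%d cannot parse 26 as a month; and the bracketed log uses slashes plus a trailing timezone, not dashes, while %s (epoch seconds) does not belong there. A sketch of corrected settings, assuming the day-before-month order is consistent in the feed:

# XML events
TIME_PREFIX = dtPostTime="
TIME_FORMAT = %Y-%d-%mT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 20

# bracketed lines like [5/26/2022 4:09:55 PM UTC]
TIME_PREFIX = ^\[
TIME_FORMAT = %m/%d/%Y %I:%M:%S %p %Z
MAX_TIMESTAMP_LOOKAHEAD = 25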
Hello, I'm having a problem with Dashboard Studio in Splunk Enterprise (version 8.2.5). I would like to create a visualization with a drilldown that lets the user click on a given data point (for example a bar in a bar chart) or a given record of a table and open a new dashboard that contains more detailed visualizations. As far as I know this is possible by adding a Link to custom URL drilldown and providing the URL of the dashboard in the configuration. Now I would like to drill down to a "details" dashboard but setting the value of a token, so that the visualizations in this "details" dashboard are already filtered by a value in a given field, e.g. /app/search/my_destination_dashboard?form.my_field=$passed_token|u$ However, the value that should be assigned to the token is dynamic, i.e. it should depend on the particular data point that the user clicked. For example, if I click on a particular record of a table, the value of the token (passed_token) should be set to the clicked cell value. It seems this is possible with Dashboard Studio in Splunk Cloud Platform (see here), but I was not able to reproduce it in Splunk Enterprise. Here is an example.

The following is the JSON definition of the "source" dashboard (made with Studio), where the table visualization has a Link to custom URL drilldown:

{ "visualizations": { "viz_aWKTkUpc": { "type": "splunk.table", "dataSources": { "primary": "ds_e3l7tAe8" }, "title": "Number of events per item", "eventHandlers": [ { "type": "drilldown.customUrl", "options": { "url": "/app/search/test__target_1?form.item_name=$item_name|u$", "newTab": true } } ], "description": "Selected item: $item_name$" } }, "dataSources": { "ds_vW29Fvqp": { "type": "ds.search", "options": { "query": "| makeresults count=100 \n| eval _items=\"banana,apple,grapefruit,lemon,orange\" \n| makemv delim=\",\" _items \n| eval _a=10 \n| eval _rand_i = random() % _a \n| eval _n=mvcount(_items) \n| eval _j = _rand_i % _n \n| eval item = mvindex(_items, _j) " }, "name": "base" }, "ds_e3l7tAe8": { "type": "ds.chain", "options": { "extend": "ds_vW29Fvqp", "query": "| stats count by item" }, "name": "table" }, "ds_1FI28nVT": { "type": "ds.chain", "options": { "query": "| stats count by item \n| table item", "extend": "ds_vW29Fvqp" }, "name": "item_list" } }, "defaults": { "dataSources": { "ds.search": { "options": { "queryParameters": { "latest": "$global_time.latest$", "earliest": "$global_time.earliest$" } } } } }, "inputs": { "input_global_trp": { "type": "input.timerange", "options": { "token": "global_time", "defaultValue": "0," }, "title": "Global Time Range" }, "input_Sc6kQbF9": { "options": { "items": [ { "label": "All", "value": "*" } ], "defaultValue": "*", "token": "item_name" }, "title": "Select item", "type": "input.dropdown", "dataSources": { "primary": "ds_1FI28nVT" }, "encoding": { "label": "primary[0]", "value": "primary[0]" } } }, "layout": { "type": "grid", "options": {}, "structure": [ { "item": "viz_aWKTkUpc", "type": "block", "position": { "x": 0, "y": 0, "w": 1200, "h": 282 } } ], "globalInputs": [ "input_global_trp", "input_Sc6kQbF9" ] }, "description": "", "title": "Test - source" }

The following is the XML definition of the "target" dashboard (where I want to land):

<dashboard> <label>Test - target 1</label> <search id="base"> <query> | makeresults count=100 | eval _items="banana,apple,grapefruit,lemon,orange" | makemv delim="," _items | eval _a=10 | eval _rand_i = random() % _a | eval _n=mvcount(_items) | eval _j = _rand_i % _n | eval item = mvindex(_items,
_j) </query> <earliest>$time.earliest$</earliest> <latest>$time.latest$</latest> </search> <search id="sel_search" base="base"> <query> | search item=$form.item_name|s$ </query> </search> <fieldset submitButton="false" autoRun="true"> <input type="time" token="time" searchWhenChanged="true"> <label>Time</label> <default> <earliest>0</earliest> <latest></latest> </default> </input> <input type="dropdown" token="item_name" searchWhenChanged="true"> <label>Select item</label> <search base="base"> <query> | stats count by item | table item </query> </search> <fieldForLabel>item</fieldForLabel> <fieldForValue>item</fieldForValue> <initialValue>*</initialValue> <default>*</default> <choice value="*">All</choice> </input> </fieldset> <row> <panel id="selected_item"> <html> <style> #selected_item { text-align: left; } </style> <p>Selected item name: <b>$form.item_name$</b> </p> </html> </panel> </row> <row> <panel> <title>Events</title> <table> <search id="table_events" base="sel_search"> <query> | table _time, item </query> </search> <option name="drilldown">none</option> </table> </panel> </row> </dashboard>

The drilldown works, but you must first set the token value using the dropdown input before you can click on the table. I tried to modify the JSON definition of the "source" dashboard as shown in the example in the docs for Splunk Cloud Dashboard Studio:

{ "visualizations": { "viz_aWKTkUpc": { "type": "splunk.table", "dataSources": { "primary": "ds_e3l7tAe8" }, "title": "Number of events per item", "eventHandlers": [ { "type": "drilldown.setToken", "options": { "tokens": [ { "token": "item_name", "key": "row.item.value" } ] } } ], "description": "Selected item: $item_name$" } }, "dataSources": { "ds_vW29Fvqp": { "type": "ds.search", "options": { "query": "| makeresults count=100 \n| eval _items=\"banana,apple,grapefruit,lemon,orange\" \n| makemv delim=\",\" _items \n| eval _a=10 \n| eval _rand_i = random() % _a \n| eval _n=mvcount(_items) \n| eval _j = _rand_i % _n \n| eval item = mvindex(_items, _j) " }, "name": "base" }, "ds_e3l7tAe8": { "type": "ds.chain", "options": { "extend": "ds_vW29Fvqp", "query": "| stats count by item" }, "name": "table" } }, "defaults": { "dataSources": { "ds.search": { "options": { "queryParameters": { "latest": "$global_time.latest$", "earliest": "$global_time.earliest$" } } } }, "tokens": { "default": { "item_name": { "value": "*" } } } }, "inputs": { "input_global_trp": { "type": "input.timerange", "options": { "token": "global_time", "defaultValue": "0," }, "title": "Global Time Range" } }, "layout": { "type": "grid", "options": {}, "structure": [ { "item": "viz_aWKTkUpc", "type": "block", "position": { "x": 0, "y": 0, "w": 1200, "h": 300 } } ], "globalInputs": [ "input_global_trp" ] }, "description": "", "title": "Test - source - mod" }

i.e. removing the dropdown input, changing the eventHandlers type property to drilldown.setToken, and adding a default value for the token item_name in the defaults section. Whenever I click on a row of the table, the token item_name should be assigned the value of the cell under the "item" column, and the table description "Selected item name: ..." should be updated. But it seems not to work in Splunk Enterprise Dashboard Studio. However, even if it worked, the Set Token drilldown would only set the token value for the current dashboard and not redirect to an external URL.
I need the drilldown to do both: set the token value and then open a URL where I pass the token value as a query parameter, to land on a "filtered" dashboard. Does anyone know whether this type of drilldown is possible with Dashboard Studio in Splunk Enterprise, and how to do it?
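One approach worth testing: keep the drilldown.customUrl handler and interpolate the clicked value straight into the URL, since Dashboard Studio exposes click context through tokens such as $row.<field>.value$ for tables. Whether an 8.2.5 build honors this is uncertain (newer releases document it), so treat it as a sketch:

"eventHandlers": [
  {
    "type": "drilldown.customUrl",
    "options": {
      "url": "/app/search/test__target_1?form.item_name=$row.item.value|u$",
      "newTab": true
    }
  }
]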
Dear Sir or Madam, could you please advise me about transferring logs to the Splunk server when there is no open port for listening? The only open port is 80, which is reverse-proxied to 8000 through the Apache configuration for Splunk Web as shown below:

<VirtualHost *:80>
    ProxyPass         /  http://localhost:8000/
    ProxyPassReverse  /  http://localhost:8000/
</VirtualHost>

I will be so grateful if you advise me on the best solution for transferring logs without opening an additional port. I really appreciate your help and support. Kind Regards, Farid
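If opening 9997 or 8088 is truly off the table, one pattern is to send events via the HTTP Event Collector and let Apache reverse-proxy the collector path on port 80 alongside the existing web UI rules. A sketch, assuming HEC is enabled on its default backend port 8088 (the more specific rule must come first, and plain port 80 means HEC tokens travel unencrypted):

<VirtualHost *:80>
    ProxyPass         /services/collector  http://localhost:8088/services/collector
    ProxyPassReverse  /services/collector  http://localhost:8088/services/collector
    ProxyPass         /  http://localhost:8000/
    ProxyPassReverse  /  http://localhost:8000/
</VirtualHost>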
I found that for the query below, the search happens based on the default time field, which is _time. Whenever I choose a date and time based on the default time, e.g. '5/26/22 7:40:00.000 AM', the events populate; but if I select a date and time aligned with my custom time field, 'originaltime', I don't get any events. Am I doing anything wrong here?

index="summary_carrier_service" originalsource="*gps-request-processor-dev*" originalsourcetype="*eu-central-1*" event="*Request"
| fields event category labelType documentType regenerate businessKey businessValue sourceNodeType sourceNodeCode geoCode jobId status sourcetype source originaltime
| addinfo
| eval ts=strptime(originaltime,"%Y-%m-%d %H:%M:%S")
| where (ts>info_min_time and ts<=info_max_time)
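That is expected: the time picker always filters on _time first, so events whose _time falls outside the picked window never reach the where clause, even when originaltime would match. One common workaround is to run the picker over All Time and express the wanted window directly against the parsed custom field; a sketch (the 24-hour window is illustrative, and the strptime format assumes originaltime looks like 2022-05-26 07:40:00):

index="summary_carrier_service" originalsource="*gps-request-processor-dev*" originalsourcetype="*eu-central-1*" event="*Request"
| eval ts=strptime(originaltime, "%Y-%m-%d %H:%M:%S")
| where ts >= relative_time(now(), "-24h") AND ts <= now()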