All Posts

Hi Andre.Penedo, Thanks for posting on the community.

Analysis: It seems you are attempting to create a health rule using a wildcard to enable dynamic database monitoring.

Answer: Unfortunately, it's not possible to create a health rule that dynamically evaluates data from different tablespaces in AppDynamics. I tried to recreate your condition and build health rules using wildcards. However, it appears that specifying metrics using wildcards is only supported for the JVM, machine, and CLR branches, not for database custom metrics. See "Use Wildcards in Metric Definitions": specifying metrics using wildcards is only supported in JVM, machine, and CLR branches.

Suggestion: Currently, the best approach is to create individual health rules for each known tablespace, as you may already be aware. This ensures monitoring but requires manual updates whenever new tablespaces are added. This could be a valuable enhancement for a future release, so how about submitting it as a request to our Idea Exchange?

Hope this helps. Regards, Martina
Your illustrated fragment suggests that your raw events either are XML or contain XML documents. I strongly discourage treating structured data such as XML as plain text. Please post a complete sample event. (Anonymize as needed.)
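Once complete events are available, a minimal sketch of the structured approach (nothing here assumes your actual element names):

| spath ``` no arguments: auto-extracts elements and attributes from well-formed XML or JSON in _raw ```

Attribute values come out as fields like element{@attribute}, which is far more robust than running regular expressions over raw text.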
Hi @inventsekar. OEL8 is Oracle Enterprise Linux 8.9. No, we haven't opened the case yet; it's in progress.
I have a raw Nessus file that I've processed by separating host names into individual hosts. However, I am encountering a problem with extracting data between <ReportItem> tags, especially when the content spans multiple lines (I have multiple ReportItems in one event under a hostname).

Here are the regular expressions I am using:

| rex field=_raw max_match=0 "\<ReportItem\s(?<pluginout>.*?)\<\/ReportItem\>"
OR
| rex field=_raw max_match=0 "\<ReportItem\s(?<pluginout>.*(\s+)?)\<\/ReportItem\>"

Unfortunately, neither seems to capture anything that spans multiple lines, as shown in the example below:

"<ReportItem>
    ...
    (multiline content)
    ...
</ReportItem>"

Could you please help me adjust my regular expression to correctly capture multiline content within <ReportItem>?

Note: ReportItems without multiple lines are extracted fine.

Any help would be appreciated.
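For what it's worth, a minimal sketch of one common fix, assuming the goal is simply to let the match cross line boundaries: in PCRE, . does not match newlines by default, so enable dotall mode with the inline (?s) modifier:

| rex field=_raw max_match=0 "(?s)<ReportItem\s(?<pluginout>.*?)</ReportItem>"

The lazy .*? keeps each match from swallowing the next ReportItem block.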
Hello Splunkers! You can now easily install Splunk Enterprise and the Universal Forwarder using this handy script. It supports all available versions and can be installed on any Linux distribution. For detailed installation steps, please visit: https://github.com/PraxisForge/Install_Splunk

#Upgrade universal forwarder version (nix) #Splunk Enterprise(nix) #Universal Forwarder(nix) #Upgrade Splunk Enterprise version(nix)
Alternatively, if you already have all possible paths and there are not too many, coalesce can be more succinct. For the two paths illustrated: | eval id = coalesce('response.resources.id', 'response.actors.id')
| spath input=data | foreach *.id [| eval id=if(isnotnull('<<FIELD>>'),'<<FIELD>>',id)]
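A sketch of how this normalization would then feed the grouping, assuming a simple count per id is the goal:

| spath input=data
| foreach *.id [| eval id=if(isnotnull('<<FIELD>>'),'<<FIELD>>',id)]
| stats count by id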
I have Splunk events that have a field named "data" containing a JSON string. I want to group these events by "id". This id can appear at two different paths.

Event Type 1
data= {
  "response": {
    "resources": [
      {
        "type": "loginUser",
        "id": "1234"
      }
    ]
  }
}

Event Type 2
data= {
  "response": {
    "actors": {
      "type": "loginUser",
      "id": "1234"
    }
  }
}
First, thank you for illustrating your data and explaining your requirement clearly. Second, the illustration appears to be a fragment of a valid JSON object. Is this correct? Is the "message" key a top-level node in the raw event? Splunk should have given you a field "message" with the following value (no special instruction required):

message
gimlet::hardware_controller: State { target: Idle, state: Idle, cavity: 42400, fuel: 0, shutdown: None, errors: ErrorCode(E21)}

Is this correct? Your problem is a simple one, but illustrating data correctly will save you lots of trouble in the future. Provided that the top-level "message" field exists, all you need to do is

| rex field=message "ErrorCode\((?<error_code>[^\)]+)"

This is an emulation of a raw event that would give you that message field without any instruction:

| makeresults
| eval _raw = "{\"message\":\"gimlet::hardware_controller: State { target: Idle, state: Idle, cavity: 42400, fuel: 0, shutdown: None, errors: ErrorCode(E21)}\"}"
| spath ``` data emulation above ```

Play with it and compare with real data. Output using this emulation is:

_time: 2024-07-15 15:05:20
_raw: {"message":"gimlet::hardware_controller: State { target: Idle, state: Idle, cavity: 42400, fuel: 0, shutdown: None, errors: ErrorCode(E21)}"}
error_code: E21
message: gimlet::hardware_controller: State { target: Idle, state: Idle, cavity: 42400, fuel: 0, shutdown: None, errors: ErrorCode(E21)}

Hope this helps
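As an addendum, to also cover the "first time it occurred" part of the question, a sketch building on the same rex (assuming _time is the event time you care about):

| rex field=message "ErrorCode\((?<error_code>[^\)]+)"
| stats earliest(_time) as first_seen by error_code
| eval first_seen = strftime(first_seen, "%Y-%m-%d %H:%M:%S")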
I am trying to ingest data from a CSV file. One of the columns in the CSV file contains SQL queries. The header has field names that are comma-separated, but the field containing the SQL queries is not being extracted correctly. It seems that if any of the SQL queries is longer than 1,000 characters, Splunk only extracts the first 1,000 characters of the query into the field. How can I make Splunk extract the entire SQL query from the CSV file?

For example, here is the _raw for one of the rows in the CSV file:

2024-07-15 16:30:12.207504,job_name,24 9 * * *,Warehouse_ABCDEF,indexName,sourcetype,"SELECT Lowest_Tier_Rating, Application_Name, Base_Application_Name, Application_Status, Application_Short_Description, Application_Comments, Application_CDL_Associated_Legal_Entities, Application_Environment, Application_Function, Application_Platform, RegSCI, RegSCI_Indirect, RegSCI_Critical, Application_Title, Application_Support_Group, Application_Info_Classification, Application_Info_Risk, Sr_Systems_Director_Responsible_Name, Sr_Systems_Director_Responsible_ID, Team_Process_Lead_Responsible_Name, Team_Process_Lead_Responsible_ID, Executive_Director_Responsible_Name, Executive_Director_Responsible_ID, Systems_Director_Responsible_Name, Systems_Director_Responsible_ID, Managing_Director_Responsible_Name, Managing_Director_Responsible_ID, Team_Content_Lead_Responsible_Name, Team_Content_Lead_Responsible_ID, Area_Process_Lead_Responsible_Name, Area_Process_Lead_Responsible_ID, Area_Content_Lead_Responsible_Name, Area_Content_Lead_Responsible_ID, manager_Responsible_Name, Manager_Responsible_ID, Infrastructure_Team_Lead_Responsible_Name, Infrastructure_Team_Lead_Responsible_ID, Domain_Lead_Responsible_Name, Domain_Lead_Responsible_ID, ResolvedDeviceName, DeviceName, IP_Address, Common_Name, Host, Valid_From, Valid_To, Source, Validity_Period, Validity_Period_Months, Key_Size, Signature_Algorithm, Organizational_Unit, Issuer, Serial_Number, Contact, Installations, Nickname, Problems, NC, Port, Device, DN, Validity_Status, IsManaged\ FROM ""ab_cd_ef"".""Dashboards"".""ABC_Venafi_Certificate_MappedTo_ABCDEF_Applications"";",0,1

And here is the extracted SQL query, which cuts off at 1,000 characters:

SELECT Lowest_Tier_Rating, Application_Name, Base_Application_Name, Application_Status, Application_Short_Description, Application_Comments, Application_CDL_Associated_Legal_Entities, Application_Environment, Application_Function, Application_Platform, RegSCI, RegSCI_Indirect, RegSCI_Critical, Application_Title, Application_Support_Group, Application_Info_Classification, Application_Info_Risk, Sr_Systems_Director_Responsible_Name, Sr_Systems_Director_Responsible_ID, Team_Process_Lead_Responsible_Name, Team_Process_Lead_Responsible_ID, Executive_Director_Responsible_Name, Executive_Director_Responsible_ID, Systems_Director_Responsible_Name, Systems_Director_Responsible_ID, Managing_Director_Responsible_Name, Managing_Director_Responsible_ID, Team_Content_Lead_Responsible_Name, Team_Content_Lead_Responsible_ID, Area_Process_Lead_Responsible_Name, Area_Process_Lead_Responsible_ID, Area_Content_Lead_Responsible_Name, Area_Content_Lead_Responsible_ID, manager_Responsible_Name, Manager_Respo
I have a search yielding the following result:

"message":"gimlet::hardware_controller: State { target: Idle, state: Idle, cavity: 42400, fuel: 0, shutdown: None, errors: ErrorCode(E21)}"

The value in parentheses will be blank if no error is detected and can vary depending on the type of error detected. Possible values include: E1, E2, E3, ..., E21. I would like to extract the value within the parentheses, note the first time it occurred, and place these results into a table. How can I create a query that identifies the error code and places it, along with the time it occurred, into a table?
Thank you for sharing text raw events.  I plugged your new text into emulation.  Still get the correct results.  Can you run the emulation and compare with real data?   | makeresults | eval data=mvappend("[{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00031AdjPro\",\"TOTAL\":0,\"PROCESSED\":1,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00031AdjPro\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":2118,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00035\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":1,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00035Med\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":1,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00052Med\",\"TOTAL\":0,\"PROCESSED\":2898,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00052Med\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":3,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00075\",\"TOTAL\":0,\"PROCESSED\":94,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00075\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":2,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00075\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":59,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00119\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":3,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00201H\",\"TOTAL\":0,\"PROCESSED\":1,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00243\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":1,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00243\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":1,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00283H\",\"TOTAL\":0,\"PROCESSED\":7,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00283H\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":104,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00302\",\"TOTAL\":0,\"PROCESSED\":1395,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00302\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":5,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00304\",\"TOTAL\":0,\
"PROCESSED\":299,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00304\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":16,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00310\",\"TOTAL\":0,\"PROCESSED\":2,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00365\",\"TOTAL\":0,\"PROCESSED\":588,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00365\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":619,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00479H\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":4,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00479H\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":1,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00607Bundle\",\"TOTAL\":0,\"PROCESSED\":1,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00646\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":1,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00646\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":2,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00681\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":1,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00721H\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":1,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_02071\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":2,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_02229\",\"TOTAL\":0,\"PROCESSED\":1,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_02278\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":1,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_HOSPICE_CLM\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":1,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":15}]", 
"[{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00011H\",\"TOTAL\":0,\"PROCESSED\":1,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":16},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00011H\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":35,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":16},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00061\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":1,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":16},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_HOSPICE\",\"TOTAL\":0,\"PROCESSED\":27,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":16},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_HOSPICE\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":60,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":16},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_HOSPICE_CLM\",\"TOTAL\":0,\"PROCESSED\":1240,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":16},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_HOSPICE_CLM\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":14,\"PROCESSING\":0,\"DATE\":\"7/14/2024\",\"DAYHOUR\":16}]") | mvexpand data | rename data as _raw | spath ``` the above emulates index = ***** host=**** source=*** ```   The search used is the same as I posted previously (except substituting with your index search with emulation).   index = cba_hcck8s_UHGWM110-013948 host=prod_poc source=poc | fields - {}.* ``` optional ``` | spath path={} | mvexpand {} | fields - _* ``` optional ``` | spath input={} | eval totalCount = SKIPPED + PROCESSED | chart sum(SKIPPED) as SKIPPED,sum(PROCESSED) as Processed sum(totalCount) as TotalClaims by DAYHOUR DATE   And result is DAYHOUR Processed: 7/14/2024 SKIPPED: 7/14/2024 TotalClaims: 7/14/2024 15 5287 2930 8217 16 1268 110 1378
Hello, We started to investigate but have not finished. Fast mode does not improve search performance. We deactivated some operations in the props.conf file, and performance became normal when we deactivated the extraction operations. As next steps, we plan to investigate further by checking the internal logs, manually running the extraction operations, and checking whether specific lines hurt the add-on's performance. Do not hesitate to answer if you have any other suggestions!
Well, this is a rather old thread and Greg hasn't been online much lately. You might have a better chance of getting a reply if you post your question about the app in a new thread (possibly linking to this one for reference).
I have problems with the integration of SentinelOne and Splunk Cloud. I'm using the app https://splunkbase.splunk.com/app/5433 and have tried 2 different SentinelOne consoles (setting the URL and Token for each console). I don't know if I am missing something or what the problem is. Can someone help me understand this integration?

- Under API Configuration we have the URL usea1-***.sentinelone.net and the Token (maybe the token is not the correct one? Where can I find it?)
- Under Inputs I set the destination index for the logs.
- Under Base Configuration I can't set the index I created (the same as on the Inputs tab).

Btw, I don't have administration of the SentinelOne console.
https://docs.splunk.com/Documentation/SplunkCloud/9.2.2403/Data/IngestLookups

If the data is being ingested into Splunk Enterprise, then in the transforms.conf file, you can configure an ingest-time eval that uses the lookup() eval function. This configuration method is only supported in Splunk Enterprise, not Splunk Cloud Platform. For more information, see the rest of the current documentation page.

If you have access to the Edge Processor solution, you can use an Edge Processor to apply lookups to your data before routing that data to Splunk Enterprise or Splunk Cloud Platform. For more information, see About the Edge Processor solution and Enrich data with lookups using an Edge Processor in the Use Edge Processors manual.
Well, this is hard to believe (but not impossible).
1. Compare the search time for the same search in fast mode and verbose mode.
2. Check the Job Inspector screen for the same search with the add-on enabled and disabled, and compare where it spends most of the time. Compare the search job logs.
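If it helps, a rough way to compare run times across many searches (assuming you can read the _audit index) is:

index=_audit action=search info=completed
| stats avg(total_run_time) as avg_runtime max(total_run_time) as max_runtime by savedsearch_name

Note that ad hoc searches show an empty savedsearch_name.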
Hi, did you succeed with this integration? I'm in the same situation...
And how would you tell one event from another? Specify what makes a line the start of a new event.
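For reference, once you know that pattern, event breaking is configured in props.conf. A minimal sketch, assuming (hypothetically) that each event starts with an ISO date:

[my_sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=\d{4}-\d{2}-\d{2})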