All Posts

I would like to know, has this feature been added now?
Hey Splunkers, I wanted to get a list of all the lookup files on my SH and their file sizes, along with other data. I can't get the size from the REST API. Appreciate any and all answers.

Below are the searches I've been trying to use:

| rest /servicesNS/-/-/data/lookup-table-files
| rename "eai:acl.app" as app, "eai:acl.owner" as owner, "eai:acl.sharing" as sharing, "eai:data" as path
| table title owner sharing app
| foreach title
    [| inputlookup <<FIELD>>
     | foreach * [| eval b_<<FIELD>>=len(<<FIELD>>) + 1]
     | addtotals b_* fieldname=b
     | stats sum(b) as b
     | eval mb=b/1000/1000, gb=mb/1000
     | fields mb]

The foreach does not allow non-streaming commands, so this does not work. Using a direct eval like below:

| rest /servicesNS/-/-/data/lookup-table-files
| rename "eai:acl.app" as app, "eai:acl.owner" as owner, "eai:acl.sharing" as sharing, "eai:data" as path
| table title owner sharing app
| eval size= [| inputlookup 'title'
     | foreach * [| eval b_<<FIELD>>=len(<<FIELD>>) + 1]
     | addtotals b_* fieldname=b
     | stats sum(b) as b
     | eval mb=b/1000/1000, gb=mb/1000
     | fields mb]

This also does not work, since the inner search cannot see the outer values. I have been trying to work with subsearches, foreach, and the map command, but couldn't get anywhere. Thanks in advance, folks.
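Since the lookup-table-files REST endpoint doesn't return a size field, one workaround is to measure the files on disk instead of summing field lengths in SPL. A minimal sketch, assuming the default per-app `lookups` layout under `$SPLUNK_HOME/etc/apps` (the directory names are illustrative):

```python
import os

def lookup_sizes(apps_dir):
    """Return {"app/filename": size_in_bytes} for every file found
    under a per-app 'lookups' directory below apps_dir."""
    sizes = {}
    for app in os.listdir(apps_dir):
        lookups = os.path.join(apps_dir, app, "lookups")
        if not os.path.isdir(lookups):
            continue  # app has no lookups directory
        for name in os.listdir(lookups):
            path = os.path.join(lookups, name)
            if os.path.isfile(path):
                sizes[f"{app}/{name}"] = os.path.getsize(path)
    return sizes
```

Run it on the search head against `$SPLUNK_HOME/etc/apps` (and join the result back to the REST output by title if you need owner/sharing as well). Note this measures raw bytes on disk, which will differ slightly from the `len()`-based estimate in the searches above.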
That was one of my theories, but unfortunately, after checking, we do have some missing events. We only receive random events in XML and all events in wineventlog format.
Excellent! Is there a way of doing this directly with SPL?
Hello!! Thanks for your answer! You are indeed correct! The event has some level that is treated as JSON, but nested in the "log" variable, the "stdout" variable has another dictionary within it that is being treated as a string, making it difficult to work with in SPL. I did my research and it seems this might be an issue with the way the data is being parsed before arriving at Splunk; until I can check that, I guess I'm stuck with searching for string literals. Thank you for your time and help!!
Given that this doesn't appear to be wholly correct JSON, you could start with something like this:

| rex "DETAILS: (?<details>\[.*\])"
| spath input=details
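The same two-step idea (regex out the bracketed array, then parse the capture as JSON) can be sketched outside SPL; the sample event below is shortened from the question:

```python
import json
import re

event = ('INFO [processor: abc/5; event: 1-57d28402] : DETAILS: '
         '[ { "ERROR_MESSAGE": "\\nError: oops.\\n", "NUMBER": "123r57", '
         '"DB_TIMESTAMP": "2023-11-30" } ]')

# Step 1: the rex equivalent -- capture the bracketed JSON array.
m = re.search(r"DETAILS: (\[.*\])", event)

# Step 2: the spath equivalent -- parse the captured string as JSON.
details = json.loads(m.group(1))
print(details[0]["NUMBER"])   # -> 123r57
```

If the array is valid JSON, each element's ERROR_MESSAGE, NUMBER, and DB_TIMESTAMP come out as ordinary fields; if the real events are truncated mid-array, the parse step will fail and the regex capture needs tightening first.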
Look at my explanation above - your stdout field is not a JSON structure - it's a string containing a JSON structure, so it cannot be automatically parsed as JSON. You have to take the stdout field and manually run spath on it to parse out the fields.
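A minimal sketch of this "string containing JSON" situation, with made-up field values: the outer event parses fine, but the nested stdout value arrives as a plain string and needs a second decode:

```python
import json

# Outer event is valid JSON, but log.stdout is a *string* holding JSON.
event = '{"log": {"stdout": "{\\"status\\": \\"ok\\", \\"code\\": 200}"}}'

outer = json.loads(event)
stdout_raw = outer["log"]["stdout"]   # still just a string at this point
inner = json.loads(stdout_raw)        # second decode recovers the fields
print(inner["status"], inner["code"])   # -> ok 200
```

In SPL, the second decode is roughly what `| spath input=<your stdout field>` does once that field has been extracted as a string.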
Again - don't use the "by DateTime" clause. Do a normal timechart and then - if you want to wrap it by day, use the timewrap command.
I see... Well, it seems like spath (and SPL functionality in general) is working fine with the events, except for the contents in stdout... I spoke with an acquaintance and it looks like it's most likely due to the way the data is parsed before arriving at Splunk. I can't thank you enough for your time and effort helping me!! It looks like this has to be checked outside of Splunk though; I'll close the ticket and come back with updates if I'm able to find a solution.
Hi, the BeyondTrust log fields are not getting extracted. I tried both index-time and search-time field extractions to extract the fields. Below are the sample logs.

Dec 5 13:12:06 192.168.251.35 Agent Desc: "",Agent ID: "AppAudit",Agent Ver: "",Category: "PMM API SignAppIn",Source Host: "",Event Desc: "",Event Name: "Login",OS: "",Event Severity: "0",Source IP: "192.168.251.35",Event Subject: "106",Event Type: "0",User: "VPN-OTSA-EDMS-HANU",Workgroup Desc: "",Workgroup ID: "",Workgroup Location: "",AuditID: "2200680",ActionType: "Login",SystemName: "PMM API SignAppIn",AppUserID: "106",CreateDate: "12/5/2023 1:11:34 PM",UserName: "VPN-OTSA-EDMS-HANU",IPAddress: "192.168.251.35",Mapped Credential: "Primary",Mapped Credential Id: "2",Mapped Credential Description: "OFID-PS-Usersync",Mapped Credential Platform: "ActiveDirectory",Mapped Credential Domain/Server: "opecfund.org",Authenticate Credential Id: "2",Authenticate Credential UserName: "opecfund.org\OFID-PS-Usersync@opecfund.org",Authenticate Credential Description: "OFID-PS-Usersync",Authenticate Credential Platform: "ActiveDirectory",Domain Name: "opecfund.org",SAM Account Name: "VPN-OTSA-EDMS-HANU",Group: "opecfund.org\OFID-BTPRAPS-Vendor",Authentication Type: "Active Directory via API"

Dec 5 13:12:06 192.168.251.35 Agent Desc: "",Agent ID: "AppAudit",Agent Ver: "",Category: "PMM API SignAppIn",Source Host: "",Event Desc: "",Event Name: "Login",OS: "",Event Severity: "0",Source IP: "192.168.251.35",Event Subject: "38",Event Type: "0",User: "SA-BTPSGlobalRequestor4SRAAPI",Workgroup Desc: "",Workgroup ID: "",Workgroup Location: "",AuditID: "2200678",ActionType: "Login",SystemName: "PMM API SignAppIn",AppUserID: "38",CreateDate: "12/5/2023 1:11:23 PM",UserName: "SA-BTPSGlobalRequestor4SRAAPI",IPAddress: "192.168.251.35",Authentication Type: "API"

Dec 5 13:12:06 192.168.251.35 Agent Desc: "",Agent ID: "AppAudit",Agent Ver: "",Category: "PMM API Authentication Rule Failure",Source Host: "",Event Desc: "",Event Name: "Login",OS: "",Event Severity: "0",Source IP: "192.168.251.35",Event Subject: "-1",Event Type: "0",User: "opecfund.org\SA-BTPSGlobalRequestor4SRAAPI",Workgroup Desc: "",Workgroup ID: "",Workgroup Location: "",AuditID: "2200677",ActionType: "Login",SystemName: "PMM API Authentication Rule Failure",AppUserID: "-1",CreateDate: "12/5/2023 1:11:23 PM",UserName: "opecfund.org\SA-BTPSGlobalRequestor4SRAAPI",IPAddress: "192.168.251.35",Host Name: "SVR-BTPS01",User Name: "opecfund.org\SA-BTPSGlobalRequestor4SRAAPI",API Key: "****************************************************************************************************************************4416",IP Address: "192.168.251.35",Authentication Rule: "API Key",Message: "Invalid RunAs - UserNameOrPasswordAreIncorrect"

Dec 5 13:11:06 192.168.251.35 Agent Desc: "",Agent ID: "AppAudit",Agent Ver: "",Category: "PMM API SignAppIn",Source Host: "",Event Desc: "",Event Name: "Login",OS: "",Event Severity: "0",Source IP: "192.168.251.35",Event Subject: "38",Event Type: "0",User: "SA-BTPSGlobalRequestor4SRAAPI",Workgroup Desc: "",Workgroup ID: "",Workgroup Location: "",AuditID: "2200675",ActionType: "Login",SystemName: "PMM API SignAppIn",AppUserID: "38",CreateDate: "12/5/2023 1:10:28 PM",UserName: "SA-BTPSGlobalRequestor4SRAAPI",IPAddress: "192.168.251.35",Authentication Type: "API"
Can someone help here? Below are the props and transforms which I tried.

Index-time field extraction:

[beyondtrust]
KV_MODE = none
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(\w{3} \d{1,2} \d{2}:\d{2}:\d{2} \d+\.\d+\.\d+\.\d+)
NO_BINARY_CHECK = true
REPORT-keyvaluepairs = keyvalue

[keyvalue]
REGEX = (\w+\s?\w+): "[^"]*"
FORMAT = $1::$2
MV_ADD = true

Search-time field extraction:

[beyondtrust]
EXTRACT-AgentDesc = Agent Desc: "(?P<Agent_Desc>[^"]+)"
EXTRACT-AgentID = Agent ID: "(?P<Agent_ID>[^"]+)"
EXTRACT-AgentVer = Agent Ver: "(?P<Agent_Ver>[^"]+)"
EXTRACT-Category = Category: "(?P<Category>[^"]+)"
EXTRACT-SourceHost = Source Host: "(?P<Source_Host>[^"]+)"
EXTRACT-EventDesc = Event Desc: "(?P<Event_Desc>[^"]+)"
EXTRACT-EventName = Event Name: "(?P<Event_Name>[^"]+)"
EXTRACT-OS = OS: "(?P<OS>[^"]+)"
EXTRACT-EventSeverity = Event Severity: "(?P<Event_Severity>\d+)"
EXTRACT-SourceIP = Source IP: "(?P<Source_IP>[^"]+)"
EXTRACT-EventSubject = Event Subject: "(?P<Event_Subject>[^"]+)"
EXTRACT-EventType = Event Type: "(?P<Event_Type>\d+)"
EXTRACT-User = User: "(?P<User>[^"]+)"
EXTRACT-WorkgroupDesc = Workgroup Desc: "(?P<Workgroup_Desc>[^"]+)"
EXTRACT-WorkgroupID = Workgroup ID: "(?P<Workgroup_ID>[^"]+)"
EXTRACT-WorkgroupLocation = Workgroup Location: "(?P<Workgroup_Location>[^"]+)"
EXTRACT-AuditID = AuditID: "(?P<Audit_ID>\d+)"
EXTRACT-ActionType = ActionType: "(?P<Action_Type>[^"]+)"
EXTRACT-SystemName = SystemName: "(?P<System_Name>[^"]+)"
EXTRACT-AppUserID = AppUserID: "(?P<App_User_ID>[^"]+)"
EXTRACT-CreateDate = CreateDate: "(?P<Create_Date>[^"]+)"
EXTRACT-UserName = UserName: "(?P<UserName>[^"]+)"
EXTRACT-IPAddress = IPAddress: "(?P<IPAddress>[^"]+)"
EXTRACT-AuthenticationType = Authentication Type: "(?P<Authentication_Type>[^"]+)"
EXTRACT-HostName = Host Name: "(?P<Host_Name>[^"]+)"
EXTRACT-APIKey = API Key: "(?P<API_Key>[^"]+)"
EXTRACT-IPAddress2 = IP Address: "(?P<IP_Address2>[^"]+)"
EXTRACT-AuthenticationRule = Authentication Rule: "(?P<Authentication_Rule>[^"]+)"
EXTRACT-Message = Message: "(?P<Message>[^"]+)"
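One observation about the [keyvalue] transform in the question: FORMAT = $1::$2 refers to two capture groups (group 1 becomes the field name, group 2 the value), but REGEX = (\w+\s?\w+): "[^"]*" defines only one group, so no value is ever captured. The two-group version of the pattern can be checked outside Splunk; the sample line below is shortened from the events in the question:

```python
import re

sample = ('Agent ID: "AppAudit",Category: "PMM API SignAppIn",'
          'Event Name: "Login",User: "VPN-OTSA-EDMS-HANU"')

# Two capture groups, matching the FORMAT = $1::$2 convention:
# group 1 = key (letters, digits, spaces, slashes), group 2 = quoted value.
pairs = dict(re.findall(r'([\w /]+): "([^"]*)"', sample))
print(pairs["User"])   # -> VPN-OTSA-EDMS-HANU
```

The character class includes space and slash so multi-word keys like "Mapped Credential Domain/Server" also match; translate the same pattern back into the transform's REGEX and keep FORMAT = $1::$2 as written.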
Try something like this:

| table Etat, "Control-M", "Dynatrace", "ServicePilot", "Centreon"
| fillnull value=0
Hi, I tried your solution but it didn't work. How do I modify the query to get the desired output? Below is the query:

index="app_cleo_db" origname="GEAC_Payroll*"
| rex "\sorigname=\"GEAC_Payroll\((?<digits>\d+)\)\d{8}_\d{6}\.xml\""
| search origname="*.xml"
| eval Date = strftime(_time, "%Y-%m-%d %H:00:00")
| eval DateOnly = strftime(_time, "%Y-%m-%d")
| transaction DateOnly, origname
| timechart span=1h count by DateOnly

I am getting the output below:

time               2023-12-02  2023-12-03
2023-12-02 00:00   0           0
2023-12-02 01:00   0           0
2023-12-02 02:00   0           0
2023-12-02 03:00   0           0
2023-12-02 04:00   0           0
2023-12-02 05:00   0           0
2023-12-02 06:00   0           0
2023-12-02 07:00   1           0
2023-12-02 08:00   0           0
2023-12-02 09:00   0           0
2023-12-02 10:00   2           0
2023-12-02 11:00   1           0
2023-12-02 12:00   1           0
2023-12-02 13:00   1           0
2023-12-02 14:00   3           0
2023-12-02 15:00   4           0
2023-12-02 16:00   0           0
2023-12-02 17:00   0           0
2023-12-02 18:00   0           0
2023-12-02 19:00   0           0
2023-12-02 20:00   0           0
2023-12-02 21:00   0           0
2023-12-02 22:00   0           0
2023-12-02 23:00   0           0
2023-12-03 00:00   0           0
2023-12-03 01:00   0           0
2023-12-03 02:00   0           0
2023-12-03 03:00   0           0
2023-12-03 04:00   0           1
2023-12-03 05:00   0           3
2023-12-03 06:00   0           202
2023-12-03 07:00   0           52
2023-12-03 08:00   0           141
2023-12-03 09:00   0           188
2023-12-03 10:00   0           256
2023-12-03 11:00   0           185
2023-12-03 12:00   0           121
2023-12-03 13:00   0           52
2023-12-03 14:00   0           32
2023-12-03 15:00   0           9
2023-12-03 16:00   0           0
2023-12-03 17:00   0           0
2023-12-03 18:00   0           0
2023-12-03 19:00   0           0
2023-12-03 20:00   0           0
2023-12-03 21:00   0           0
2023-12-03 22:00   0           0
2023-12-03 23:00   0           0

but I want output like below, where the 00:00 to 23:00 time column is stable:

time   2023-12-02  2023-12-03
00:00  0           0
01:00  0           0
02:00  0           0
03:00  0           0
04:00  0           1
05:00  0           3
06:00  0           202
07:00  1           52
08:00  0           141
09:00  0           188
10:00  2           256
11:00  1           185
12:00  1           121
13:00  1           52
14:00  3           32
15:00  4           9
16:00  0           0
17:00  0           0
18:00  0           0
19:00  0           0
20:00  0           0
21:00  0           0
22:00  0           0
23:00  0           0
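The desired pivot (hour-of-day rows, one column per date) can be sketched outside SPL; the timestamps below are made up. In SPL the equivalent idea is roughly `| eval hour=strftime(_time, "%H:00") | chart count over hour by DateOnly`, using chart rather than timechart, since timechart always keys its rows by _time:

```python
from collections import defaultdict
from datetime import datetime

# Made-up sample events, one timestamp each.
events = ["2023-12-02 07:15:00", "2023-12-02 10:05:00",
          "2023-12-02 10:40:00", "2023-12-03 06:20:00"]

# table[hour][date] = count; rows are the stable 00:00..23:00 axis.
table = defaultdict(lambda: defaultdict(int))
for ts in events:
    t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
    table[t.strftime("%H:00")][t.strftime("%Y-%m-%d")] += 1

print(table["10:00"]["2023-12-02"])   # -> 2
```

The key step is bucketing by hour-of-day (`%H:00`) instead of the full timestamp, so every date shares the same 24 row labels.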
Hello, I'm using Splunk 9.1.1. As the title says, I am looking for a way to check the CherryPy version that Splunk uses. How can I check it? I look forward to hearing from you.
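One way to check, assuming the CherryPy that Splunk Web uses is visible to Splunk's bundled interpreter (e.g. run the script via `splunk cmd python3`): query the package metadata. This is a sketch, not an official procedure, and it reports whatever interpreter actually runs it:

```python
from importlib.metadata import PackageNotFoundError, version

def cherrypy_version():
    """Return the installed CherryPy version string, or None if the
    package metadata is not visible to this interpreter."""
    try:
        return version("CherryPy")
    except PackageNotFoundError:
        return None

print(cherrypy_version() or "CherryPy not found in this interpreter")
```

If the metadata lookup comes up empty, another place to look (an assumption about the bundled layout) is the `cherrypy` package directory under `$SPLUNK_HOME/lib/python3.*/site-packages`, whose `__init__.py` typically carries a `__version__` string.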
Hello community, I'm having a problem that's probably easy to solve, but I can't figure it out. I have a query against an index that contains alerts from Splunk OnCall, and I count each alert source (via the associated routingKey from OnCall) and its status (acknowledged or not).

`victorops_incidents`
| sort lastAlertTime desc
| dedup incidentNumber
| fields *
| search org="*" routingKey=** pagedPolicies{}.policy.name!=0_Reroute_alertes currentPhase!=RESOLVED
| eval currentPhase=case(like(currentPhase, "%UNACKED%"), "Non acquitté", like(currentPhase, "%ACKED%"), "En cours")
| eval routingKey=case(like(routingKey, "%routingcontrol-m%"), "Control-M", like(routingKey, "%dyn%"), "Dynatrace", like(routingKey, "%centreon%"), "Centreon", like(routingKey, "%servicepilot%"), "ServicePilot", like(routingKey, "%p_1%"), "P1")
| rename currentPhase as Etat, routingKey as Source
| chart count by Etat, Source
| sort - Etat

I have an almost perfect table which summarizes everything, but I am missing some information: sometimes a source has not generated any alert, so it is absent from the table (in the screenshot below I have the sources "Control-M", "Dynatrace" and "ServicePilot", but "Centreon" is missing because it did not have any incidents during the time period).

My question is the following: how do I make all the sources appear, but display 0 when they have not had any alerts?

Best regards, Rajaion
There is one tricky thing about this. A HF will work on its own perfectly well with the forwarder license. But if you have more than one HF and you want to add them to be monitored in the MC (some people do want to see their HFs in the MC as a "half-indexer" in order to see the queues and such), you might get alerts about duplicate use of the same license. In that case you need to add them to the LM.
Also - you can "stack up" time specifiers. Look at this example:  
We've been through this exercise several times now.  Show us what you've learned.  What have you tried as a means of extracting fields from those events?  What were the results?  Have you at least identified what fields must be extracted?
That's interesting (and I'm not being sarcastic here) because the docs as far back as 7.0 say the same - This refers to a binary or script in the bin folder of the app the alert action is defined in, or to a path pointer file, also located in the bin folder.  
Hi all, I want to extract fields from an event which is in JSON format:

INFO [processor: anchsdgeiskgcbc/5; event: 1-57d28402-9058-11ee-83b7-021a6f9d1f1c] : DETAILS: [
  { "ERROR_MESSAGE": "\nError: abchdvshsuaajs.\n", "NUMBER": "123r57", "DB_TIMESTAMP": "2023-11-30" },
  { "ERROR_MESSAGE": "\nError: ehwegagsuabajehss.\n", "NUMBER": "63638w82u", "DB_TIMESTAMP": "2023-11-30" },

and similarly we have these error data in one event.

Fields to be extracted: ERROR_MESSAGE, NUMBER, DB_TIMESTAMP
Hi, I built an app, but when I run the app's action in a playbook, I don't have an option to get the data results. I used action_result.add_data(), but it didn't seem to make a difference. How can I solve this?