All Posts


Hello, I tried to configure the SplunkForwarder but I got this error message: "Unable to initialize modular input "upload_pcap" defined in the app "splunk_app_stream": Introspecting scheme=upload_pcap: script running failed (exited with code 1)". Please help.
There are many ways to do this, but using the if function would be my last choice.  Try this:

| rex field=index "_(?<app_id>\w+?)_(?<environment>(non_)*prod)"

Here is an emulation for you to play with and compare with real data.

| makeresults format=csv data="index
sony_app_XXXXXX_non_prod
sony_app_XXXXXX_prod
sony_app_123456_non_prod
sony_app_xyzabc_prod"
``` the above emulates index = sony_* ```

Output from this emulation is

app_id | environment | index
app_XXXXXX | non_prod | sony_app_XXXXXX_non_prod
app_XXXXXX | prod | sony_app_XXXXXX_prod
app_123456 | non_prod | sony_app_123456_non_prod
app_xyzabc | prod | sony_app_xyzabc_prod

Hope this helps.
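For contrast, here is a minimal sketch of the if-based approach I would avoid; it assumes the same sony_ prefix and prod/non_prod suffixes as the sample indexes above, and it takes one eval per field, which is part of why the single rex is cleaner:

| eval environment = if(match(index, "non_prod$"), "non_prod", "prod") ``` assumes every index ends in prod or non_prod ```
| eval app_id = replace(index, "^sony_(.+?)_(non_)?prod$", "\1")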
Hi, I want to move a file from a client to the Deployment Server via the Search Head. I was thinking of something like

| makeresults
| eval content="the content of the text file that needs to be sent over to the DS."
| search [ | rest splunk_server=ds /services/search/jobs search="| outputlookup test.csv" ]

but it seems that the rest command does not support anything except a plain search (it does not work with pipes after search either); this is not the same as using REST from the CLI or REST queries from outside Splunk.  Since it would be a challenge to store credentials protected in an app, doing it with a script or the CLI would be my last option. Doing it through the web interface would be better for further development. Thanks
First, when posting type 2, which is JSON, please use raw text.  Splunk's "syntax highlights" view is non-compliant and very difficult to process. (See the crazy rex in my emulation below; you also introduced additional syntax errors when attempting to simplify or anonymize.)  Also, in type 2 you should preserve the uuid's value, as that's the only key that distinguishes the two.  For everyone's benefit, I'm posting reconstructed raw events from type 2:

{
  "@message": {
    "attributeContract": {
      "extendedAttributes": [],
      "maskOgnlValues": false,
      "uniqueUserKeyAttribute": "uuid"
    },
    "attributeMapping": {
      "attributeContractFulfillment": {
        "uuid": {
          "source": { "type": "ADAPTER" },
          "value": "9c5b94b1-35ad-49bb-b118-8e8fc24abf80"
        }
      },
      "attributeSources": [],
      "issuanceCriteria": { "conditionalCriteria": [] }
    },
    "configuration": {
      "fields": [
        { "name": "Application ObjectClass", "value": "cartmanUser" },
        { "name": "Application Entitlement Attribute", "value": "cartmanRole" },
        { "name": "IAL to Enforce", "value": 2 }
      ],
      "id": "Cartman",
      "name": "Cartman"
    }
  },
  "@timestamp": "2025-01-01T00:00:01.833685"
}

{
  "@message": {
    "attributeContract": {
      "extendedAttributes": [],
      "maskOgnlValues": false,
      "uniqueUserKeyAttribute": "uuid"
    },
    "attributeMapping": {
      "attributeContractFulfillment": {
        "uuid": {
          "source": { "type": "ADAPTER" },
          "value": "550e8400-e29b-41d4-a716-446655440000"
        }
      },
      "attributeSources": [],
      "issuanceCriteria": { "conditionalCriteria": [] }
    },
    "configuration": {
      "fields": [
        { "name": "Application ObjectClass", "value": "cartmanUser" },
        { "name": "Application Entitlement Attribute", "value": "cartmanRole" },
        { "name": "IAL to Enforce", "value": 1 }
      ],
      "id": "Cartman",
      "name": "Cartman"
    }
  },
  "@timestamp": "2025-01-02T00:00:01.833685"
}

Like @bowesmana, I fail to see the relevance of type 1.  Type 2 is all you need to produce the results you want.  I also don't see why you want to print two tables rather than one table with two rows (differentiated by UUID).  So, this is what I'm going to show. The actual code is pretty simple; most of my time was sunk into reconstructing valid JSON data from your pasted text.

| fields @message.attributeMapping.attributeContractFulfillment.uuid.value ``` ^^^ this line is just to declutter output ```
| spath path=@message.configuration.fields{}
| eval restructured_fields = json_object()
| foreach @message.configuration.fields{} mode=multivalue
    [eval restructured_fields = json_set(restructured_fields, json_extract(<<ITEM>>, "name"), json_extract(<<ITEM>>, "value"))]
| spath input=restructured_fields

(The foreach syntax above requires Splunk 9.0.)  Output from the two reconstructed events is as follows:

@message.attributeMapping.attributeContractFulfillment.uuid.value | Application Entitlement Attribute | Application ObjectClass | IAL to Enforce
9c5b94b1-35ad-49bb-b118-8e8fc24abf80 | cartmanRole | cartmanUser | 2
550e8400-e29b-41d4-a716-446655440000 | cartmanRole | cartmanUser | 1

Does this satisfy your requirements?
It is useful to print out the two intermediate JSON objects used in this search so you can clearly see the dataflow:

@message.configuration.fields{} | restructured_fields
{ "name": "Application ObjectClass", "value": "cartmanUser" } { "name": "Application Entitlement Attribute", "value": "cartmanRole" } { "name": "IAL to Enforce", "value": 2 } | {"Application ObjectClass":"cartmanUser","Application Entitlement Attribute":"cartmanRole","IAL to Enforce":2}
{ "name": "Application ObjectClass", "value": "cartmanUser" } { "name": "Application Entitlement Attribute", "value": "cartmanRole" } { "name": "IAL to Enforce", "value": 1 } | {"Application ObjectClass":"cartmanUser","Application Entitlement Attribute":"cartmanRole","IAL to Enforce":1}

@message.configuration.fields{}, of course, is extracted directly from raw data. Here is an emulation for you to play with and compare with real data type 2:

| makeresults
| fields - _time
| eval sourcetype = "type2", data = mvappend(
    "{ [-] @message: { [-] attributeContract: { [-] extendedAttributes: [ [-] ] maskOgnlValues: false uniqueUserKeyAttribute: uuid } attributeMapping: { [-] attributeContractFulfillment: { [-] uuid: { [-] source: { [-] type: ADAPTER } value: 9c5b94b1-35ad-49bb-b118-8e8fc24abf80 } } attributeSources: [ [-] ] issuanceCriteria: { [-] conditionalCriteria: [ [-] ] } } configuration: { [-] fields: [ [-] { [-] name: Application ObjectClass value: cartmanUser } { [-] name: Application Entitlement Attribute value: cartmanRole } { [-] name: IAL to Enforce value: 2 } ] id: Cartman name: Cartman } } @timestamp: 2025-01-01T00:00:01.833685 }",
    "{ [-] @message: { [-] attributeContract: { [-] extendedAttributes: [ [-] ] maskOgnlValues: false uniqueUserKeyAttribute: uuid } attributeMapping: { [-] attributeContractFulfillment: { [-] uuid: { [-] source: { [-] type: ADAPTER } value: 550e8400-e29b-41d4-a716-446655440000 } } attributeSources: [ [-] ] issuanceCriteria: { [-] conditionalCriteria: [ [-] ] } } configuration: { [-] fields: [ [-] { [-] name: Application ObjectClass value: cartmanUser } { [-] name: Application Entitlement Attribute value: cartmanRole } { [-] name: IAL to Enforce value: 1 } ] id: Cartman name: Cartman } } @timestamp: 2025-01-02T00:00:01.833685 }")
| rex field=data mode=sed "s/\[-]//g s/\n+([\w@])/\n\"\1/g s/([^\"]): (true|false|\d+\n)/\1\": \2/g s/([^\"]):(\W+\n)/\1\":\2/g s/([^\"]): (.+)/\1\": \"\2\"/g s/([\w\"}\]])\n([\"{\[])/\1,\n\2/g"
| mvexpand data
| rename data AS _raw
| spath
``` data type 2 emulation above ```

(Can you see how crazy that rex command is?)

For completeness, this is how you extract data from type 1 in case it is of use to you:

| eval message = replace(message, "'", "")
| spath input=message

The message field should already be present at search time.  The result from your sample data is

UserAccessSubmission.csp | UserAccessSubmission.mail | UserAccessSubmission.objectClass | UserAccessSubmission.trackingId | UserAccessSubmission.uuid | sourcetype | trackingid
Butters | sean@southpark.net | cartmanUser | tid:13256464 | abc123 | type1 | tid:13256464
Butters | sean@southpark.net | cartmanUser | tid:13256464 | abc123 | type1 | tid:13256464
Butters | sean@southpark.net | cartmanUser | tid:13256464 | abc123 | type1 | tid:13256464
Butters | sean@southpark.net | StanUser | tid:13256464 | abc123 | type1 | tid:13256464
Butters | sean@southpark.net | StanUser | tid:13256464 | abc123 | type1 | tid:13256464

This is the emulation of data type 1 used to extract the above.
| makeresults
| fields - _time
| eval sourcetype = "type1", data = split("2025-01-01 00:00:00,125 trackingid=\"tid:13256464\"message='{\"UserAccessSubmission\":{\"uuid\":\"abc123\",\"mail\":\"sean@southpark.net\",\"trackingId\":\"tid:13256464\",\"objectClass\":\"cartmanUser\",\"csp\":\"Butters\"}}' 2025-01-01 00:01:00,125 trackingid=\"tid:13256464\"message='{\"UserAccessSubmission\":{\"uuid\":\"abc123\",\"mail\":\"sean@southpark.net\",\"trackingId\":\"tid:13256464\",\"objectClass\":\"cartmanUser\",\"csp\":\"Butters\"}}' 2025-01-02 00:01:00,125 trackingid=\"tid:13256464\"message='{\"UserAccessSubmission\":{\"uuid\":\"abc123\",\"mail\":\"sean@southpark.net\",\"trackingId\":\"tid:13256464\",\"objectClass\":\"cartmanUser\",\"csp\":\"Butters\"}}' 2025-01-02 00:01:00,125 trackingid=\"tid:13256464\"message='{\"UserAccessSubmission\":{\"uuid\":\"abc123\",\"mail\":\"sean@southpark.net\",\"trackingId\":\"tid:13256464\",\"objectClass\":\"StanUser\",\"csp\":\"Butters\"}}' 2025-01-02 00:01:00,125 trackingid=\"tid:13256464\"message='{\"UserAccessSubmission\":{\"uuid\":\"abc123\",\"mail\":\"sean@southpark.net\",\"trackingId\":\"tid:13256464\",\"objectClass\":\"StanUser\",\"csp\":\"Butters\"}}'", " ")
| mvexpand data
| rename data AS _raw
| extract
``` data type 1 emulation above ```
Hi @dmoberg ,
good for you, see you next time!
Let me know if I can help you more, or, please, accept one answer for the benefit of the other people in the Community.
Ciao and happy splunking
Giuseppe
P.S.: Karma Points are appreciated
I was able to solve it.  By populating the dropdowns in the dashboard from an inputlookup (with data from a scheduled search), the approach I detailed below (setting the FirstLoad token) started working. I'm not sure exactly why this made it work, but at least it works.
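For anyone hitting the same issue, a dropdown populated from a lookup typically uses a search along the lines of this minimal sketch (the lookup name dropdown_values.csv and the field name host are hypothetical placeholders):

| inputlookup dropdown_values.csv
| stats count by host
| fields host
``` returns one row per distinct dropdown value ```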
You also need to give the splunk user permission to write to the volumes you are writing to on the indexer, i.e. if you have a /hot and a /cold volume on the indexer, the splunk user needs ownership of, and write permissions on, those volumes.
Hi @Nrosemommy ,
could you better describe your infrastructure?
Do you have a Forwarder?
Did you configure it to send logs to Splunk?
Did you configure Splunk to receive logs?
Did you use an add-on?
Which logs do you want to send?
How did you check the situation you're reporting?
Ciao.
Giuseppe
@usukhbayar_g Check Point Firewalls do generate logs, but they primarily focus on security events and traffic data rather than detailed system information like CPU, memory usage, or hardware sensors. You should check whether your firewall is logging these system details. Please check this document on how to enable those logs for monitoring: https://community.checkpoint.com/t5/Security-Gateways/Monitoring-RAM-and-CPU-usage/td-p/144877
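Once logging is enabled, a broad search like this minimal sketch can help verify whether any of those system details are actually reaching Splunk (the wildcard index and the cp_log sourcetype prefix are assumptions; adjust them to whatever your Check Point inputs actually use):

index=* sourcetype=cp_log* ("CPU" OR "Memory" OR "sensor")
| stats count by index, sourcetype, source
``` any hits indicate system-level Check Point data is being indexed ```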
As @ITWhisperer points out, neither substring nor regex is the correct tool to extract information from structured data such as JSON.  I assume that the so-called "string" is not the entire event, because otherwise Splunk would have automatically extracted role at search time.  Suppose you have an event like this (I'm quite convinced that you missed a comma between '00"' and '"name'):

stuff before ... {"code":"1234","bday":"15-02-06T07:02:01.731+00:00", "name":"Alex", "role":"student","age":"16"} stuff after ...

What you do is use regex to extract the part that is compliant JSON (not a portion of it), then use spath or fromjson to extract all key-value pairs. The following should usually work:

| rex "^[^{]*(?<json_message>{.+})"
| spath input=json_message

That sample data will return

age | bday | code | json_message | name | role
16 | 15-02-06T07:02:01.731+00:00 | 1234 | {"code":"1234","bday":"15-02-06T07:02:01.731+00:00", "name":"Alex", "role":"student","age":"16"} | Alex | student

Here is an emulation for you to play with and compare with real data

| makeresults
| eval _raw = "stuff before ... {\"code\":\"1234\",\"bday\":\"15-02-06T07:02:01.731+00:00\", \"name\":\"Alex\", \"role\":\"student\",\"age\":\"16\"} stuff after ..."
``` data emulation above ```
Hi @SN1 , the telnet command needs to run on the forwarder, as mentioned by @gcusello, and I also hope you followed the steps mentioned by @livehybrid.
@Nrosemommy  Could you clarify what you mean by "What are you looking for exactly?" Are you asking what specific information needs to be checked to determine where the data is being sent, or are you referring to something else?
ah yeah, I don't often use top; guess that looks as close as we're gonna get.
I've spent hours and hours trying to figure out how to do what I thought would be very easy. There are examples of how to use the Pie chart on the Examples tab of this page: https://splunkui.splunk.com/Packages/visualizations/?path=%2FPie They all work great and have an option to show the code.  None of them do anything with the point.click event. On the Events tab, it shows the data changing when you click a wedge, but doesn't have an option to show the code.  How is it doing what it's doing? How do you wire up the event to fire your own code?  I started with the first example on the Examples tab, then tried to add a handler.  I've tried onClick, pointClick, onPointClick, onPressed, tried passing it in through options and as a last resort, tried every idea ChatGPT could come up with.  It finally gave up and suggested I ask here. If I wrap the Pie in a div and add an onClick handler to the div, I can capture the event and see which wedge was clicked.  But that seems messy and not how it's intended to be used. Anybody got any ideas on how to get this wired up? Just seeing what the code looks like that's running on the Events tab would do the trick.
Yes, I enabled inputs in the TA. I'm already receiving data for 2 Linux entities.  Is data from the TA being indexed? No.
Hello, after running this command on the deployment server, it shows this error: telnet: Unable to connect to remote host: Connection refused
Hello, I am using multiple tokens on a dashboard created in Dashboard Studio and have added default values for them in the JSON source. When I save and reload the dashboard, these default token values are removed and the table viz (which uses these tokens) shows as blank. Can you please suggest how I can fix this? Thank you.

"defaults": {
    "dataSources": {
        "ds.search": {
            "options": {
                "queryParameters": {}
            }
        }
    },
    "tokens": {
        "default": {
            "tokStatus": {
                "value": "Status=\"*\""
            },
            "tokPlannedStartDt": {
                "value": "LIKE(isPlannedStartDtPassed,\"%\")"
            },
            "tokPlannedEndDt": {
                "value": "LIKE(isPlannedEndDtPassed,\"%\")"
            }
        }
    }
}
I'm not clear on what the data in type 1 is used for. You have object class in the type 2 source as well as in type 1 - I assume CartmanUser is meant to be cartmanUser. What are the "IAL to Enforce" values 1 and 2 supposed to correlate to, to give you the 1 and 2 in your tables?
Hello, I was tasked with creating a dashboard in Splunk that shows Check Point Firewall system information. Is there any way that Splunk can display Check Point Firewall system information, such as CPU, memory, traffic rate, and (if possible) hardware sensors? We use the Check Point App for Splunk, but it looks like it has no information related to system info. I don't even know whether the Check Point Firewall creates logs with this kind of information. The screenshots below are from SmartConsole.

Splunk Enterprise 9.3.1
Check Point Firewall R81.20

Any ideas would be appreciated.
I noticed my iPad is getting data and I didn't open a Splunk account. How do I figure out where it is going?