All Posts



Hi @Nrosemommy, could you describe your infrastructure in more detail? Do you have a Forwarder? Did you configure it to send logs to Splunk? Did you configure Splunk to receive logs? Did you use an add-on? Which logs do you want to send? How did you check the situation you're reporting? Ciao. Giuseppe
@usukhbayar_g Check Point Firewalls do generate logs, but they primarily focus on security events and traffic data rather than detailed system information such as CPU, memory usage, or hardware sensors. You should check whether your firewall is logging these system details. Please see this document on how to enable those logs for monitoring: https://community.checkpoint.com/t5/Security-Gateways/Monitoring-RAM-and-CPU-usage/td-p/144877
As @ITWhisperer points out, neither substring nor regex is the correct tool to extract information from structured data such as JSON. I assume that this so-called "string" is not the entire event, because otherwise Splunk would have automatically extracted role at search time. Suppose you have an event like this (I'm quite convinced that you missed a comma between '00"' and '"name'):

stuff before ... {"code":"1234","bday":"15-02-06T07:02:01.731+00:00", "name":"Alex", "role":"student","age":"16"} stuff after ...

What you do is use regex to extract the part that is compliant JSON (not a portion of it), then use spath or fromjson to extract all the key-value pairs. The following should usually work:

| rex "^[^{]*(?<json_message>{.+})"
| spath input=json_message

That sample data will return one result with these fields:

age = 16
bday = 15-02-06T07:02:01.731+00:00
code = 1234
json_message = {"code":"1234","bday":"15-02-06T07:02:01.731+00:00", "name":"Alex", "role":"student","age":"16"}
name = Alex
role = student

Here is an emulation for you to play with and compare with real data:

| makeresults
| eval _raw = "stuff before ... {\"code\":\"1234\",\"bday\":\"15-02-06T07:02:01.731+00:00\", \"name\":\"Alex\", \"role\":\"student\",\"age\":\"16\"} stuff after ..."
``` data emulation above ```
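For comparison, the same idea outside Splunk: cut away the non-JSON prefix and suffix with a regex, then hand the remainder to a real JSON parser. This is a minimal Python sketch (not SPL) using the sample event above:

```python
import json
import re

event = ('stuff before ... {"code":"1234","bday":"15-02-06T07:02:01.731+00:00", '
         '"name":"Alex", "role":"student","age":"16"} stuff after ...')

# Greedy match from the first "{" to the last "}" -- the compliant JSON part.
match = re.search(r"{.+}", event)
record = json.loads(match.group(0))

print(record["role"])  # student
print(record["age"])   # 16
```

The point is the same as in the SPL: isolate the whole JSON object first, then let a JSON-aware parser do the field extraction.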
Hi @SN1, the telnet command needs to run on the forwarder, as mentioned by @gcusello, and I also hope you followed the steps mentioned by @livehybrid.
@Nrosemommy  Could you clarify what you mean by "What are you looking for exactly?" Are you asking what specific information needs to be checked to determine where the data is being sent, or are you referring to something else?
Ah yeah, I don't often use top; I guess that looks as close as we're going to get.
I've spent hours and hours trying to figure out how to do what I thought would be very easy.

There are examples of how to use the Pie chart on the Examples tab of this page: https://splunkui.splunk.com/Packages/visualizations/?path=%2FPie

They all work great and have an option to show the code. None of them do anything with the point-click event. On the Events tab, it shows the data changing when you click a wedge, but there's no option to show the code. How is it doing what it's doing? How do you wire up the event to fire your own code?

I started with the first example on the Examples tab, then tried to add a handler. I've tried onClick, pointClick, onPointClick, onPressed, tried passing it in through options, and as a last resort tried every idea ChatGPT could come up with. It finally gave up and suggested I ask here.

If I wrap the Pie in a div and add an onClick handler to the div, I can capture the event and see which wedge was clicked. But that seems messy and not how it's intended to be used.

Anybody got any ideas on how to get this wired up? Just seeing what the code looks like that's running on the Events tab would do the trick.
Yes, I enabled inputs in the TA. I'm already receiving data for two Linux entities. Is data from the TA being indexed? No.
Hello, after running this command on the deployment server, it shows this error: telnet: Unable to connect to remote host: Connection refused
Hello, I am using multiple tokens on a dashboard created in Dashboard Studio and have added default values for them in the JSON source. When I save and reload the dashboard, these default token values are removed, and the table viz (which uses these tokens) shows as blank. Can you please suggest how I can fix this? Thank you.

"defaults": {
    "dataSources": {
        "ds.search": {
            "options": {
                "queryParameters": {}
            }
        }
    },
    "tokens": {
        "default": {
            "tokStatus": {
                "value": "Status=\"*\""
            },
            "tokPlannedStartDt": {
                "value": "LIKE(isPlannedStartDtPassed,\"%\")"
            },
            "tokPlannedEndDt": {
                "value": "LIKE(isPlannedEndDtPassed,\"%\")"
            }
        }
    }
}
I'm not clear on what the data in Type 1 is used for. You have objectClass in the Type 2 source as well as Type 1; I assume CartmanUser is meant to be cartmanUser. What are the "IAL to Enforce" values 1 and 2 supposed to correlate to, to give you the 1 and 2 in your tables?
Hello, I am tasked with creating a dashboard in Splunk that shows Check Point Firewall system information. Is there any way Splunk can display Check Point Firewall system information, such as CPU, memory, and traffic rate, and also hardware sensors (if possible)? We use the Check Point app for Splunk, but it looks like it has no information related to system info. I don't even know whether the Check Point Firewall creates logs of this kind of information. The screenshots below are from Smart Console.

Splunk Enterprise 9.3.1
Check Point Firewall R81.20

Any ideas would be appreciated.
I noticed my iPad is getting data and I didn't open a Splunk account. How do I figure out where it is going?
Hi, I am trying to install the Splunk OTel Collector version 0.116.0 in an EKS cluster but am getting this error in the operator pod:

webhook "minstrumentation.kb.io": failed to call webhook: Post "https://splunk-otel-collector-operator-webhook.namespace.svc:443/mutate-opentelemetry-io-v1aplha1-instrumentation?timeout=10s": no endpoints available of service

I am unable to understand what is going on here. I have been following the latest docs in the 0.116.0-onwards install guide. I had no problem on 0.113.0. The doc does mention a similar error, but there was no race condition while enabling the operator. Pods are running successfully, but the operator pod logs show the above error.

Thanks,
Divya
I am not sure where to even start on this one.

I have 2 log file types I need to extract data from to get final counts. I need to combine by objectClass so that, for a given day, the "IAL to Enforce" value in the Type 2 log sets the count for the number of Type 1 events. I need to run this over a year. Thank you in advance!!!!

----- Type 1
2025-01-01 00:00:00,125 trackingid="tid:13256464"message='{"UserAccessSubmission":{"uuid":"abc123","mail":"sean@southpark.net","trackingId":"tid:13256464","objectClass":"cartmanUser","csp":"Butters"}}'
2025-01-01 00:01:00,125 trackingid="tid:13256464"message='{"UserAccessSubmission":{"uuid":"abc123","mail":"sean@southpark.net","trackingId":"tid:13256464","objectClass":"cartmanUser","csp":"Butters"}}'
2025-01-02 00:01:00,125 trackingid="tid:13256464"message='{"UserAccessSubmission":{"uuid":"abc123","mail":"sean@southpark.net","trackingId":"tid:13256464","objectClass":"cartmanUser","csp":"Butters"}}'
2025-01-02 00:01:00,125 trackingid="tid:13256464"message='{"UserAccessSubmission":{"uuid":"abc123","mail":"sean@southpark.net","trackingId":"tid:13256464","objectClass":"StanUser","csp":"Butters"}}'
2025-01-02 00:01:00,125 trackingid="tid:13256464"message='{"UserAccessSubmission":{"uuid":"abc123","mail":"sean@southpark.net","trackingId":"tid:13256464","objectClass":"StanUser","csp":"Butters"}}'

----- Type 2
{
  "@message": {
    "attributeContract": {
      "extendedAttributes": [],
      "maskOgnlValues": false,
      "uniqueUserKeyAttribute": "uuid"
    },
    "attributeMapping": {
      "attributeContractFulfillment": {
        "uuid": {
          "source": { "type": "ADAPTER" },
          "value": "uuid"
        }
      },
      "attributeSources": [],
      "issuanceCriteria": { "conditionalCriteria": [] }
    },
    "configuration": {
      "fields": [
        { "name": "Application ObjectClass", "value": "cartmanUser" },
        { "name": "Application Entitlement Attribute", "value": "cartmanRole" },
        { "name": "IAL to Enforce", "value": 2 }
      ]
    },
    "id": "Cartman",
    "name": "Cartman"
  },
  "@timestamp": "2025-01-01T00:00:01.833685"
}

The second Type 2 event is identical except that "IAL to Enforce" is 1 and "@timestamp" is "2025-01-02T00:00:01.833685".

The goal would be to get something like this:

Table 1 (Ial to enforce is 2):
CartmanUser  2

Table 2 (Ial to enforce is 1):
CartmanUser  1
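The first half of the job, counting the Type 1 events per day and objectClass, can be sketched outside Splunk. This is a minimal Python sketch using a few of the sample lines above; the correlation with Type 2's "IAL to Enforce" is left out because the intended rule is still unclear, and in practice you would do this in SPL with spath and stats:

```python
import json
import re
from collections import Counter

# A few representative Type 1 lines from the sample above.
type1_lines = [
    '2025-01-01 00:00:00,125 trackingid="tid:13256464"message=\'{"UserAccessSubmission":{"uuid":"abc123","mail":"sean@southpark.net","trackingId":"tid:13256464","objectClass":"cartmanUser","csp":"Butters"}}\'',
    '2025-01-02 00:01:00,125 trackingid="tid:13256464"message=\'{"UserAccessSubmission":{"uuid":"abc123","mail":"sean@southpark.net","trackingId":"tid:13256464","objectClass":"cartmanUser","csp":"Butters"}}\'',
    '2025-01-02 00:01:00,125 trackingid="tid:13256464"message=\'{"UserAccessSubmission":{"uuid":"abc123","mail":"sean@southpark.net","trackingId":"tid:13256464","objectClass":"StanUser","csp":"Butters"}}\'',
]

counts = Counter()
for line in type1_lines:
    day = line[:10]                                  # leading "YYYY-MM-DD"
    payload = re.search(r"message='({.*})'", line)   # JSON inside message='...'
    obj = json.loads(payload.group(1))["UserAccessSubmission"]["objectClass"]
    counts[(day, obj)] += 1

print(dict(counts))
```

With these three lines the counter ends up with one cartmanUser event on 2025-01-01 and one cartmanUser plus one StanUser event on 2025-01-02; the per-day counts are what would then be joined against the Type 2 "IAL to Enforce" value for that day.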
So, you would use it like this:

index=main source=xyz (TERM(A1) OR TERM(A2)) ("- ENDED" OR "- STARTED")
| rex field=TEXT "((A1-) |(A2-) )(?<Func>[^\-]+)"
| eval Function=trim(Func), DAT=strftime(relative_time(_time, "+0h"), "%d/%m/%Y")
| rename DAT as Date_of_reception
| eval {Function}_TIME=_time
| stats values(Date_of_reception) as Date_of_reception values(*_TIME) as *_TIME by JOBNAME
``` This appends all the entries in the lookup to the end of the current results ```
| inputlookup append=t File.csv
``` This then joins all the lookup fields to your result data based on JOBNAME ```
| stats values(*) as * by JOBNAME
``` Now order the fields as needed and sort ```
| table JOBNAME Description Date_of_reception STARTED_TIME ENDED_TIME
| sort -STARTED_TIME

i.e. append all the lookup data to the end and collapse it on JOBNAME.

Note that your _time handling is a little strange; I'm not sure what you're trying to do, but what's wrong with just

| eval Date_of_reception=strftime(_time, "%d/%m/%Y")

Note also that if you have more than one STARTED_TIME or ENDED_TIME, the sort will not work correctly on the multivalue field.
Glad it worked. These types of makeresults searches are insignificant; they only ever run on the search head, as they never search data from the indexers. I often use background searches and tokens to create data that can then be used in <html> panels. They don't consume much.
We also have this issue after upgrading to 9.4.0.  Deployment server still works at deploying apps so we have been ignoring it.  A solution would be nice though.
Hi @PickleRick, totally agree with you; some business models provide interesting challenges, let's say... Can you use OS environment variables in inputs.conf? If so, would they only be read on UF startup? Cheers, Andre
Using the data in the second picture, please show how you want it displayed in the layout of the first picture. It is not clear what the relationship between the two sets of data is.