Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

I am trying to get our add-on, which was developed for standalone Splunk, to work in a SHC environment. The add-on takes input from the user in a setup view and saves the configuration values via the REST API using the Splunk JS SDK. I am able to replicate our sa_our_app.conf by adding this stanza in server.conf:

[shclustering]
conf_replication_include.sa_our_app = true

With that, the setup view replicates across the search head members. The add-on also uses a custom REST endpoint during setup to write the modular alert HTML (stored in /data/ui/alert). Is there a way to replicate this HTML across all members of the SHC?
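Since files under data/ui/alert are not .conf files, conf_replication_include will not pick them up. One common pattern (a sketch, assuming you have access to the deployer; the app name, target URI, and credentials below are placeholders) is to let the deployer own the static copy and push it with the bundle:

```
# On the deployer: copy the generated HTML into the staged app, e.g.
#   $SPLUNK_HOME/etc/shcluster/apps/<your_app>/data/ui/alert/
# then push the configuration bundle to all SHC members:
splunk apply shcluster-bundle -target https://sh1.example.com:8089 -auth admin:changeme
```

This distributes the file like any other app asset, at the cost of the setup view having to write to the deployer rather than to the local member.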
I feel I'm so close, but can't quite make it work. I've tried map and am now trying a subsearch (I think it's a subsearch). I'm trying to get the time difference between two events, but using a timestamp field of my own instead of the "_time" field. My events look something like this:

{
    action: "start",
    correlationId: "_GUID_",
    timestamp: "2021-07-13T03:44:46.100Z"
}
{
    action: "end",
    correlationId: "_GUID_",
    timestamp: "2021-07-13T03:44:46.260Z"
}

And my query so far is:

index=* action=start
| eval start_time=timestamp
| join correlationId [ search index=* action=end | eval end_time=timestamp ]
| eval timeTaken=end_time-start_time

But timeTaken is never populated. It seems my timestamp field has a "none" in it as well as a timestamp, but I'm not sure why, as the raw text does not have any spaces or anything. I also tried a selfjoin, which overwrote the first timestamp with the second one, and a map, which came back with no results.
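One likely culprit: timestamp is a string, and subtracting two strings yields null. A join-free sketch (assuming the ISO-8601 format shown above, where %3N matches the three subsecond digits): convert to epoch with strptime, then let stats take the spread per correlationId.

```
index=* (action=start OR action=end)
| eval t=strptime(timestamp, "%Y-%m-%dT%H:%M:%S.%3NZ")
| stats range(t) as timeTaken by correlationId
```

range(t) is max minus min, which for a start/end pair is exactly end minus start, and it avoids the subsearch row limits that join brings along.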
Hello there. I noticed lately (in a kinda painful way) that if the time field is present in JSON sent to a HEC collector endpoint, the timestamp is not parsed from the message. But the documentation differs between 8.0 and 7.x in this regard (https://docs.splunk.com/Documentation/Splunk/7.3.9/Data/HECRESTendpoints doesn't say a word about timestamp parsing, whereas 8.0.0 gives a whole paragraph about an optional parameter affecting the parsing). Does anyone know whether the 7.x versions behaved the same way? I mean, was timestamp parsing in 7.x also skipped entirely when the time field was present? Did the behaviour change, or were the docs simply supplemented?
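For reference, this is the shape of payload in question (field values are placeholders). Per the 8.0 docs cited above, when the top-level time key is present, HEC uses it as _time rather than parsing a timestamp out of the event body:

```json
{
  "time": 1626146686,
  "event": {
    "message": "the 2021-07-13T03:44:46.100Z inside the body is not used for _time"
  }
}
```

Whether 7.x silently did the same is exactly the open question here.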
Good morning, all! I am trying to fill in a table based on whether an IP address is in a lookup. I have a lookup table called "IPAddresses.csv" with the addresses in a column called "value", and a field in the event called addr. I want to fill a cell in a table with "In IP List" or "Not in IP List", something like this:

IPAddresses.csv
value        Hostname
192.168.1.1  Host A
192.168.1.3  Host B
192.168.1.5  Host C
192.168.1.7  Host D

Splunk Table
In IP Addresses  addr
In List          192.168.1.1
Not In List      192.168.1.2
In List          192.168.1.3
Not In List      192.168.1.4

I have a very immature Splunk knowledge base, so I am not even sure where to start. I would assume that it would require an eval if match statement in conjunction with a lookup, but I am not sure how to join the two. Any help would be greatly appreciated! Thank you!
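A sketch of the eval-plus-lookup combination described above (index name is a placeholder): pull a field back from the lookup for matching rows, then test whether it came back null.

```
index=your_index
| lookup IPAddresses.csv value AS addr OUTPUT Hostname AS matched_host
| eval "In IP Addresses"=if(isnotnull(matched_host), "In List", "Not In List")
| table "In IP Addresses", addr
```

The lookup command leaves matched_host empty for addresses not in the CSV, which is all the eval needs to distinguish the two cases.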
Hi, I am new to Splunk and am trying to build a timechart. We have the following timechart search query, which is not giving the correct values in statistics, but when we browse the events from the statistics the required data seems to be there. I am not able to figure out how timechart works here. Query below; I would appreciate help with, or an explanation of, the behaviour. Filtered for a particular bizname, I select the date range from, say, 00:45 to 1:30 for a particular day. I get the wrong "Percentage" value [say 60%] for the first block [00:45 to 1:00], but when I go to the events and check, it comes out to be 93%. What am I doing wrong here?

index=index1 sourcetype=*XYZ*
| dedup col1, col2, col3
| search bizname="ABC"
| where completed in("Y","N")
| eval status=if(completed="Y",100,0)
| timechart span=15m mean(status) as Percentage by bizname useother=false limit=100
| fillnull value=100

Thanks.
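One thing worth checking: dedup runs over the whole selected range, so the events surviving into a 15-minute bin can differ from what the per-bin drill-down shows, and the trailing fillnull value=100 turns empty bins into 100%. A sketch that computes the percentage explicitly per bin, so numerator and denominator are visible:

```
index=index1 sourcetype=*XYZ* bizname="ABC" completed IN ("Y","N")
| dedup col1, col2, col3
| eval done=if(completed="Y",1,0)
| timechart span=15m sum(done) as done, count as total
| eval Percentage=round(100*done/total,1)
```

Comparing done and total against the drill-down counts should show where the 60% vs 93% discrepancy comes from.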
Hello. My client company uses Splunk and Cybereason. At first I used the Cybereason For Splunk app 1.1.0 and modified the cybereason_rest_client.py file as below:

self.session = requests.session()
self.session.verify = False

Cybereason For Splunk 1.3.0 was released recently, so I upgraded the app. ERRORs now appear under the $SPLUNK_HOME/var/log/splunk/cybereason path, in modularinput.log and restclient.log.

-- modularinput.log ERROR --

2021-07-13 15:02:21,354 log_level=ERROR pid=11744 tid=MainThread file="cybereason.py" function="run" line_number="182" version="CybereasonForSplunk.v.1.3.0" Traceback:
Traceback (most recent call last):
  File "/splunk/splunk_test/splunk/etc/apps/CybereasonForSplunk/bin/cybereason.py", line 138, in run
    events = cyb.get_time_bound_malops(earliest=chk["last_time"], latest=now)
  File "/splunk/splunk_test/splunk/etc/apps/CybereasonForSplunk/bin/cybereason_rest_client.py", line 420, in get_time_bound_malops
    raise e
  File "/splunk/splunk_test/splunk/etc/apps/CybereasonForSplunk/bin/cybereason_rest_client.py", line 358, in get_time_bound_malops
    severity_dict = self._get_mapped_serverities(earliest, latest)
  File "/splunk/splunk_test/splunk/etc/apps/CybereasonForSplunk/bin/cybereason_rest_client.py", line 680, in _get_mapped_serverities
    raise Exception(ret.content)
Exception: b'<!DOCTYPE html><html><head><title>Error report</title></head><body><h1>HTTP Status 404 - Not Found</h1></body></html>'

2021-07-13 15:02:21,354 log_level=ERROR pid=11744 tid=MainThread file="cybereason.py" line_number="181" version="CybereasonForSplunk.v.1.3.0" message=b'<!DOCTYPE html><html><head><title>Error report</title></head><body><h1>HTTP Status 404 - Not Found</h1></body></html>'" filename="cybereason.py" exception_line="138" input="cybereason://cybereason" section="malops"

-- restclient.log ERROR --

2021-07-13 15:02:21,354 log_level=ERROR pid=11744 tid=MainThread file="cybereason_rest_client.py" function="get_time_bound_malops" line_number="419" version="CybereasonForSplunk.v.1.3.0" message="b'<!DOCTYPE html><html><head><title>Error report</title></head><body><h1>HTTP Status 404 - Not Found</h1></body></html>'" exception_type="Exception" exception_arguments="b'<!DOCTYPE html><html><head><title>Error report</title></head><body><h1>HTTP Status 404 - Not Found</h1></body></html>'" filename="cybereason_rest_client.py" line="358" section="get_time_bound_malops"

Where is the problem? Thanks.
Hello folks, I encountered a problem when trying to filter events from WinEventLog with EventCode 4662. When I use the following regex in a tester, or in SPL against an unfiltered data set, it works fine. But using it in a blacklist only allows through a fraction of the messages when "Default Property Set" is in the first row after Properties.

blacklist9 = EventCode="4662" Message="(Tipo\sde\sobjeto:(?!\s*groupPolicyContainer))[\s\S]*(Propiedades:(?![\s\S]*Default Property Set))"

I tried some changes to the regex, but I have not found a solution for this. Thanks for your time.
I want to map a multivalue field to a single-value field. Ex:

COL1 | COL2
VAL1 | Val11
       Val12
VAL2 | Val21
       Val22
       Val23

And the output I want is:

COL1 | COL2
VAL1 | Val11,Val12
VAL2 | Val21,Val22,Val23
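A one-line sketch (assuming COL2 is a true multivalue field, e.g. the output of stats values()): mvjoin flattens the values into a single comma-separated string.

```
| eval COL2=mvjoin(COL2, ",")
```

The nomv command is an alternative when you want Splunk to do the flattening for you, though it joins on newlines rather than a delimiter of your choosing.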
Hi All, I have a bar chart generated using a timechart command. I want to increase the width of the bar columns; they seem to be very thin. I have tried using the setting below (with the value changed to 5 as well), but it is still not working, and it was also giving me a validation warning:

<option name="charting.chart.columnStyle.width">1</option>
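The validation warning suggests columnStyle.width is not a recognized Simple XML option. Two knobs that do appear in the chart configuration reference are the spacing options (a sketch; smaller spacing leaves more room for each column, and values are in pixels):

```xml
<option name="charting.chart.columnSpacing">1</option>
<option name="charting.chart.seriesSpacing">0</option>
```

Also worth noting: with timechart, column width is largely dictated by how many bins fit in the panel, so a larger span (fewer bins) produces wider columns on its own.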
I want to extract the data between two curly brackets {} from the ErrorText string below.
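A sketch with rex (assuming ErrorText contains one {...} group and the braces are not nested; the capture-group name is arbitrary):

```
| rex field=ErrorText "\{(?<extracted>[^}]*)\}"
```

[^}]* stops at the first closing brace, so for nested or repeated groups the pattern would need adjusting (e.g. max_match=0 for multiple groups).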
I deleted a scheduled report/alert, but empty reports from the deleted scheduled reports keep being emailed out. The reports are no longer visible in the system. Thanks.
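One way to hunt for a leftover schedule (a sketch; the rest command needs permission to list saved searches across all apps and owners, including private ones):

```
| rest /servicesNS/-/-/saved/searches splunk_server=local
| search is_scheduled=1 action.email=1
| table title, eai:acl.app, eai:acl.owner, cron_schedule, action.email.to
```

A schedule owned by another user or hiding in a different app context will show up here even when it is invisible in your own UI.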
I have a query like this:

sourcetype=tseltdw tags{}="request"
| fillnull data.service, data.service1, api_revamp, data.status, tags{}, keyword, keyword_api, data.timeTaken
| eval keyword_api=if(keyword LIKE "user/628%" OR keyword LIKE "user/08%", "user/msisdn", keyword)
| eval data.service1=if(len('data.service')>200, "null", 'data.service')
| eval datex=strftime(_time,"%Y-%m-%d")
| eval datetime=strftime(_time,"%Y-%m-%d %H:00:00")
| eval hourx=strftime(_time,"%H")
| eval data.uri3=if(len('data.uri2')>100, "null", 'data.uri2')
| stats count as trx by datex, hourx, datetime, data.service1, data.status, tags{}, data._id, keyword_api, api_revamp, data.timeTaken
| sort data.timeTaken asc

Can anyone help me return only one value: the 90th percentile of data.timeTaken? Much appreciated for any help, thank you.
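A sketch: swap the final stats/sort for a percentile aggregation (shown here with no by-clause so a single value comes back; add grouping fields as needed):

```
sourcetype=tseltdw tags{}="request"
| stats perc90('data.timeTaken') as p90_timeTaken
```

perc90() is one of the percentile family of stats functions (perc<N>, with exactperc<N> available when precision matters more than speed).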
Hi, I have multiple hosts and would like to find out the approximate daily log size for each host. Please help me resolve this.
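A sketch against the license usage log (assumes access to the _internal index on the license master; in type=Usage events, b is bytes indexed and h is the host):

```
index=_internal source=*license_usage.log type=Usage earliest=-1d
| stats sum(b) as bytes by h
| eval MB=round(bytes/1024/1024, 2)
| sort - MB
```

Extending earliest and adding a by-day split (e.g. bin _time span=1d) gives a per-host daily trend rather than a single snapshot.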
Hello Splunkers. I'm working on some of the use cases in ES, and one of the requests that I've got from upper management is to consolidate all the use cases and their notables and send them a single email every day. Is there any way I can do it from the Splunk UI? Please help. Thanks.
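One approach (a sketch, assuming the stock ES `notable` macro is available in your search context): save a search like the one below as a daily scheduled report with an email action, so a single message carries the day's notables per use case.

```
`notable` earliest=-24h
| stats count by search_name, urgency
| sort - count
```

The schedule and the email action are both configured from the UI (Settings > Searches, reports, and alerts), which keeps the whole thing out of .conf files.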
I have a TimeField whose data format is like "4 Days 14 Hours 40 Minutes", and sometimes "7 Hours 40 Minutes":

TimeField
4 Days 14 Hours 40 Minutes
7 Hours 40 Minutes
40 Minutes

I want to convert this field's values into seconds so that I can sort my data based on time. Thanks!
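A sketch: capture each optional unit with rex, then combine; coalesce guards the rows where a unit is absent (the field names d, h, m are arbitrary).

```
| rex field=TimeField "((?<d>\d+) Days?)?\s*((?<h>\d+) Hours?)?\s*((?<m>\d+) Minutes?)?"
| eval seconds=coalesce(tonumber(d),0)*86400 + coalesce(tonumber(h),0)*3600 + coalesce(tonumber(m),0)*60
| sort seconds
```

"4 Days 14 Hours 40 Minutes" yields 4*86400 + 14*3600 + 40*60 = 398400 seconds, and plain "40 Minutes" falls through cleanly to 2400.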
I have to display data for a specific date. There are two ways to pick the date: one is from the DB as the system date, the other is the date taken from the event. Please help me do this in a dashboard.
Hi, I have a single value visualization with trellis. The dashboard was created to get the backlog count plus how long the backlog has been in the system. In a list view, the result is:

sp_type                  values(sp_tot)
Post-o.splexcbj-HAPQ     1484 (0min)
Post-o.splexcbj-HUEVQ    32 (0min)

I'm using:

| eval sp_type=sp_qtype."-".sp_qname
| stats sum(sp_msgnum) as "total" by sp_type sp_msgbcklog
| eval sp_tot=total." (".sp_msgbcklog."min)"
| sort total
| streamstats count as "AA"
| eval sp_type=printf("%*s", len(sp_type) + AA, sp_type)
| where total!="0"
| stats values(sp_tot) by sp_type

Then I use the single value visualization, and the display looks like the screenshot. But the problem is that I need the result to be in RED, and to make sure that the results do not overlap each other like the ones in the screenshot. (I added streamstats to sort and make sure that the one with the highest value returns first.) Any idea how to achieve that?
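For the colour, single value panels support range colouring in Simple XML (a sketch; with one threshold at 0, anything above it renders in the second colour, red here):

```xml
<option name="rangeValues">[0]</option>
<option name="rangeColors">["0x53a051","0xdc4e41"]</option>
<option name="useColors">1</option>
```

Note that range colouring needs a numeric result field, so it may be simpler to show the numeric total as the value and move the "(0min)" text into the underLabel, which would also remove the overlap caused by the long concatenated strings.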
Below are my two log lines:

1. Successfully received message RECEIVED, payload={\"reference_id\":\"ABCD\"...}
2. Successfully published COMPLETED, payload=(referenceId=ABCD,...

For the given referenceId ABCD, I want to check whether a "COMPLETED" message was published or not. I am trying a nested search but am not getting the right result:

index=xyz "Successfully *" "COMPLETED"
| rex "referenceId=(?<referenceId>[^,]*).*"
| join reference_id in [search index=xyz "Successfully * message" AND ("RECEIVED")
    | rex "reference_id\\\\\":\\\\\"(?<reference_id>[^\\\\]*).*"
    | dedup reference_id
    | fields reference_id]
| stats count by referenceId
| where count < 1

I am expecting output like:

ABCD 0
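A join-free sketch (the rex patterns are illustrative and assume the two payload shapes shown above): search both message types, normalise the two id spellings into one field, and count COMPLETED per id; ids with zero completions are the open ones.

```
index=xyz ("RECEIVED" OR "COMPLETED")
| rex "referenceId=(?<id>[^,)]+)"
| rex "reference_id\\\":\\\"(?<id2>[^\\\"]+)"
| eval id=coalesce(id, id2)
| stats sum(eval(if(match(_raw,"COMPLETED"),1,0))) as completed by id
| where completed=0
```

This avoids the main pitfall of the join version: an id with no COMPLETED event produces no joined row at all, so `where count < 1` never sees it.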
Hello all, I'm having trouble getting the correct difference in time when subtracting from the now() function. Any help would be appreciated. Here is my sample query, where my start timestamp looks like 2005-07-05T04:28:34.453494Z:

index=main
| where status_1="open"
| eval start=strptime(create_time, "%Y-%m-%dT%H:%M:%S.%6QZ")
| eval current_time=now()
| eval diff=current_time-start
| fieldformat diff=tostring(diff, "duration")
| table _time, id_box, diff, start, end
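Two things worth checking (a sketch; %6N is the strptime code for six subsecond digits, which matches the microseconds in the sample timestamp, whereas %Q is milliseconds):

```
| eval start=strptime(create_time, "%Y-%m-%dT%H:%M:%S.%6NZ")
| eval diff=now()-start
| fieldformat diff=tostring(diff, "duration")
```

If start comes back null, the format string is not matching and diff stays empty. If diff is populated but consistently off by a round number of hours, note that the trailing Z marks the string as UTC while strptime interprets it in the search head's local time zone, so the offset has to be accounted for.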
Our event log has a request and a response. The request and response body can each be either a JSON object or a JSON array. I need to extract request.body and response.body to construct a field "httpdetails", which is a string. How can I achieve this using a single spath function? Example log events:

{
  "message": {
    "request": { "body": {} },
    "response": {
      "body": [
        {
          "id": "85118db6-2d5c-6bb0-ff67-5bc9ef5d4a1f",
          "createdon": "2021-07-08T00:37:02.512Z"
        }
      ]
    }
  }
}

{
  "message": {
    "request": {
      "body": { "$limitafter": "2021-07-08T20:08:29.983Z" }
    },
    "response": {
      "statuscode": 200,
      "body": { "count": "22" }
    }
  }
}

Splunk query:

| spath output=response_data message.response.body
| spath output=request_data message.request.body
| eval request_data=if(isnull(request_data), NULL, request_data)
| eval response_data=if(isnull(response_data), NULL, response_data)
| eval httpdetails="\n"+request_data+"\n-----------------Response---------------\n"+response_data, httpdetails=split(httpdetails,"\n")
| eval details=if(isnotnull(httpdetails), httpdetails, details)

After running this query, "httpdetails" is shown below. Here response_data for the first log event comes out NULL instead of the object array. How can I fix this?
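A sketch (path names follow the sample events above): spath's {} notation captures array bodies that the plain path misses, and coalesce folds the two shapes back into one field per side.

```
| spath output=resp_obj path=message.response.body
| spath output=resp_arr path=message.response.body{}
| eval response_data=coalesce(resp_obj, mvjoin(resp_arr, ","))
| spath output=req_obj path=message.request.body
| spath output=req_arr path=message.request.body{}
| eval request_data=coalesce(req_obj, mvjoin(req_arr, ","))
```

This keeps the rest of the httpdetails concatenation unchanged; the only difference is that response_data is now populated whether the body arrived as an object or as an array.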