All Posts

I see, thank you for letting me know!
Hi @sabari80, what's your issue? Anyway, I created a macro (called e.g. "non_working_hours") and I call it; this way, if I need to modify an hour I have to do it in only one search. In addition, I created a lookup containing all the days of the next three years with an indication of the holidays, so that in my macro I can also check for holidays in addition to off-office hours and weekends. Ciao. Giuseppe
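A minimal sketch of what such a macro and lookup check might look like (the macro name, lookup file name, and field names here are assumptions for illustration, not the exact ones from the original setup):

[non_working_hours]
definition = (date_hour<8 OR date_hour>18 OR date_wday IN ("saturday","sunday"))

Then in a search, the macro and a holiday lookup could be combined, assuming a holidays.csv with a "date" column and an "is_holiday" flag:

index=my_index `non_working_hours`
| eval date_str=strftime(_time, "%Y-%m-%d")
| lookup holidays.csv date AS date_str OUTPUT is_holiday
| where isnotnull(is_holiday) OR `non_working_hours`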
Hi @nathanielchin, as @ITWhisperer said, in Dashboard Studio there isn't the Post-process Search feature, but a very similar feature called "chained searches" is available. In other words, you create your base search and then create the other searches starting from the base search, chaining each new search to it. For more info see https://docs.splunk.com/Documentation/SplunkCloud/latest/DashStudio/dsChain  Ciao. Giuseppe
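A rough sketch of how this looks in the Dashboard Studio dashboard definition, based on the documented ds.chain data source type (the data source IDs and queries here are illustrative assumptions):

"dataSources": {
    "ds_base": {
        "type": "ds.search",
        "options": {
            "query": "index=web sourcetype=access_combined | fields status, uri"
        }
    },
    "ds_chained": {
        "type": "ds.chain",
        "options": {
            "extend": "ds_base",
            "query": "| stats count BY status"
        }
    }
}

Each chained search extends the base search's results, so the base search runs only once, much like a classic post-process search.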
Hi @UnsuperviseLeon, as @PickleRick said, fields are listed in Interesting Fields only if they are present in at least 20% of the events; you can check these fields by putting one of the new fields in the main search (e.g. my_field=*). Then, it isn't certain that these fields are correctly parsed by the standard Windows parser; you have to check this and, if necessary, add the missing parsing. Ciao. Giuseppe
Hi @st1, don't use the transaction command because it's very slow; please try something like this, adapting my solution to your use case (e.g. the thresholds in the last row):

index=honeypot sourcetype=honeypotLogs ("SSH2_MSG_USERAUTH_FAILURE" OR "SSH2_MSG_USERAUTH_SUCCESS")
| eval kind=if(searchmatch("SSH2_MSG_USERAUTH_FAILURE"), "failure", "success")
| stats dc(kind) AS kind_count count(eval(kind="success")) AS success_count count(eval(kind="failure")) AS failure_count BY sessionID
| where kind_count=2 AND success_count>0 AND failure_count>10

Ciao. Giuseppe
Hi @cherrypick, good for you, see you next time! For the other people of the Community, please describe how you solved the issue. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated by all the contributors
Hi @sgro777 , sorry, my error, please try: eventtype=builder (user_id IN ($id$) OR user_mail IN ($email$)) | eval ... Ciao. Giuseppe
Hi @irkey, let us know if we can help you more, or please accept one answer for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated by all the contributors
Never mind, it doesn't work.
Hi,

Sorry for the confusion, I just pasted a single input stanza; however, I have 8 different monitor stanzas in my inputs.conf and they are all working and ingesting the data.

crcSalt = <DATETIME>
What it does: this setting includes the file's last modification time in the checksum calculation.
Use case: it's useful when you want Splunk to reindex the file if the file's last modified timestamp changes, even if the content stays the same.

For my use case I need to ingest the complete CSV file data daily, so I used crcSalt = <DATETIME>. (Am I doing this right or wrong? Please correct me.) "Small set of data" means only getting a few rows of data from the CSV file, not the complete CSV data. Can you please help?

Thank you
How do I check index and volume parameters and index size?
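One way to sketch this check from a search head is the indexes REST endpoint; the fields below come from that endpoint, but adjust the table to the parameters you care about:

| rest /services/data/indexes
| table title currentDBSizeMB maxTotalDataSizeMB frozenTimePeriodInSecs homePath coldPath

currentDBSizeMB shows the index's current size on disk and maxTotalDataSizeMB its configured cap; the same settings can also be inspected in indexes.conf (including volume definitions and their maxVolumeDataSizeMB).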
1. You obviously can't read data from 8 files if you have an input set up for just one of them.
2. Leave the crcSalt setting alone. It is very, very rarely needed. Usually you should rather set initCrcLength if the files have a common header/preamble.
3. What do you mean by "a small set of data is being ingested"?
4. Did you check splunk list monitor and splunk list inputstatus?
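Following that advice, a single wildcard monitor stanza could cover all 8 CSVs; this is only a sketch, and the path, sourcetype, index, and initCrcLength value are assumptions to adapt to your environment:

[monitor://C:\data\*.csv]
disabled = false
sourcetype = xyz
index = abcd
# If all CSVs share the same header row, raise the initial CRC length past
# the shared header so Splunk can tell the files apart; avoid crcSalt here.
initCrcLength = 1024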
Are you looking for something like this?

index=itsi_summary
| eval kpiid = mvappend(kpiid, itsi_kpi_id)
| stats latest(alert_value) AS alert_value latest(alert_severity) AS health_score BY kpiid kpi
| join type=left kpiid
    [| inputlookup service_kpi_lookup
     | stats latest(title) AS title BY kpis._key
     | rename kpis._key AS kpiid ]
| search title IN ("<Service Names>") kpi!="ServiceHealthScore"
Hi, I'm currently working on ingesting 8 CSV files from a path using inputs.conf on a UF, and the data is getting ingested. The issue is that these 8 CSV files are overwritten daily with new data by an automation script, so the data inside the CSV files changes daily.

I want to ingest the complete CSV data daily into Splunk, but what I can see is that only a small set of data is getting ingested, not the complete CSV file data.

My inputs.conf is:

[monitor://C:\file.csv]
disabled = false
sourcetype = xyz
index = abcd
crcSalt = <DATETIME>

Can someone please help me check whether I'm using the correct input or not?

The ultimate requirement is to ingest the complete CSV data from the 8 CSV files daily into Splunk.

Thank you.
How do I check splunk list monitor, where to run it, etc.?
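These commands are run from the Splunk CLI on the forwarder host itself; the install path below is the default UF location on Windows and is an assumption to adjust for your environment:

cd "C:\Program Files\SplunkUniversalForwarder\bin"
splunk list monitor        (shows which files/directories are being monitored)
splunk list inputstatus    (shows per-file read position and status)

You may be prompted for the local Splunk admin credentials; inputstatus in particular helps confirm whether each CSV was read to the end or skipped.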
This is the error message from the Splunk server:

ERROR UserManagerPro [727840 TcpChannelThread] - Requesting user info through AQR returned an error Error in Attribute query request, AttributeQueryTransaction err=No error, AttributeQueryTransaction descr=Method Not Allowed, AttributeQueryTransaction statusCode=405 for user: .........

This is from the access log (HTTP 401):

"GET /services/authentication/current-context HTTP/1.1" 401 148 "-" "python-requests/2.31.0" - - - 19ms

And the audit log says:

user=n/a, action=validate_token, info=JsonWebToken validation failed
Thanks 
Well... while there is a possibility of defining an output using a short-TTL DNS name (dyn-DNS), it's not something I'd recommend. Static addresses definitely make your life easier.
Interesting Fields is just a GUI feature that shows fields present in at least 20 percent of events. Just because a field is not listed there doesn't mean it's not being parsed out of the event. Actually, with renderXml=true you get XML-formatted events from which all fields should be automatically parsed.
Hi @Mojal @marnall, I am facing the same issue with my Splunk cluster. Were you able to find any workarounds/solutions? P.S.: I have deployed the Splunk cluster via splunk-operator in my Kubernetes environment.