All Posts



Hi @jagan_vannala , maybe it's a typo, but in the solution with NOT you don't need to add !, in other words: host="*" NOT sessionId=X Anyway, your two searches have different results: with sessionId!=X you take all the logs where the field sessionId is present and doesn't have the value "X", while with NOT sessionId=X you get all the events except the ones with sessionId=X, even those where the sessionId field isn't present. Ciao. Giuseppe
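The difference between the two operators can be simulated outside Splunk. A minimal Python sketch (hypothetical event dicts, not real SPL evaluation) of what each filter keeps:

```python
# Hypothetical events: the second one has no sessionId field at all.
events = [
    {"host": "web1", "sessionId": "X"},
    {"host": "web2"},
    {"host": "web3", "sessionId": "Y"},
]

# sessionId!=X : the field must be present AND differ from "X"
neq = [e for e in events if "sessionId" in e and e["sessionId"] != "X"]

# NOT sessionId=X : everything except events where sessionId equals "X",
# including events where the field is missing entirely
not_eq = [e for e in events if e.get("sessionId") != "X"]

print([e["host"] for e in neq])     # ['web3']
print([e["host"] for e in not_eq])  # ['web2', 'web3']
```

The event without a sessionId field survives NOT sessionId=X but is dropped by sessionId!=X, which is exactly the behavioral difference described above.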
Hi Team, when I try to exclude one field by inserting the condition sessionId!=X it's not working. Even though I used the "NOT" condition, the field I am trying to exclude is still showing in the results. Could you please help me exclude a particular field?
host="*" sessionId!=X
host="*" NOT sessionId!=X
Hi @user487596 , Splunk is a search engine, so you can use it for this: you must know the rules (e.g. searching for the word "password") and then apply them to the indexes. At first I'd start by identifying the login and create-user actions for each environment in your infrastructure (e.g. in Windows these actions are identified with EventCode 4624 and 4720), then you can run searches with those specific filters to see if there are clear-text passwords. Ciao. Giuseppe
Hi @gcusello , I need to find secrets (passwords, API tokens, etc.) in all data (events) in all indexes that are in Splunk. The question is about the approach: how to do this so as not to overload Splunk.
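As a rough illustration of the matching step involved (plain Python, not a Splunk feature; the regexes below are hypothetical starting points that you would tune to your own log formats), a clear-text secret scan boils down to pattern matching per event:

```python
import re

# Hypothetical patterns for common clear-text secrets; real rules would be
# tuned to your log formats and the token formats your systems issue.
SECRET_PATTERNS = [
    re.compile(r"password\s*[=:]\s*\S+", re.IGNORECASE),
    re.compile(r"api[_-]?(?:key|token)\s*[=:]\s*[A-Za-z0-9_\-]{16,}", re.IGNORECASE),
]

def find_secrets(event_text):
    """Return the list of suspicious substrings found in one event."""
    hits = []
    for pattern in SECRET_PATTERNS:
        hits.extend(pattern.findall(event_text))
    return hits
```

In Splunk the same patterns would typically be applied with the rex or regex commands, restricted to a limited set of indexes and time ranges to control search load.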
Hi @user487596 , could you better describe your requirement? In Splunk, access to data is managed at index level; in other words, you can define for each role which indexes the users with that role can access. In addition, it's also possible to add some further restrictions, but always at role level, not user level. Ciao. Giuseppe
Hello, I've created a dashboard that shows 4 teams in a dropdown menu. When I choose one of the teams, I want to see only the panels for that specific team. I've created the drop-down input and given it a label called Team, with static options Team 1, Team 2, Team 3, Team 4. So my question is: how do I assign each panel chart to one of the teams in the drop-down? From some of the online searching I've done, it points to the tokenization concept. Could you please help me achieve this result?
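One common way to do this in Simple XML is to set a per-team token from the dropdown's change handler and make each panel depend on it. A hedged sketch (hypothetical token names team_tok / show_team1 / show_team2, trimmed to two teams for brevity):

```xml
<form>
  <fieldset>
    <input type="dropdown" token="team_tok">
      <label>Team</label>
      <choice value="team1">Team 1</choice>
      <choice value="team2">Team 2</choice>
      <change>
        <condition value="team1">
          <set token="show_team1">true</set>
          <unset token="show_team2"></unset>
        </condition>
        <condition value="team2">
          <set token="show_team2">true</set>
          <unset token="show_team1"></unset>
        </condition>
      </change>
    </input>
  </fieldset>
  <row>
    <!-- Rendered only while show_team1 is set -->
    <panel depends="$show_team1$">
      <chart>
        <search><query>index=main team=team1 | timechart count</query></search>
      </chart>
    </panel>
    <panel depends="$show_team2$">
      <chart>
        <search><query>index=main team=team2 | timechart count</query></search>
      </chart>
    </panel>
  </row>
</form>
```

The index and field names in the queries are placeholders; the key mechanism is the set/unset conditions driving the depends attributes.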
can you share the props or SEDCMD you are using right now?
Hi, I'm trying to get the GuardDuty log using the Splunk Add-on for AWS app. The input method is Generic S3, and logs from CloudTrail or WAF come in well, but the GuardDuty log is not coming in. Of course, the data is in the S3 bucket. I'm attaching the guard duty.log. Thank you.
Hello everyone! How can we solve the problem of searching for secrets in all or some Splunk indexes so that Splunk is not heavily loaded? What approach could implement this? It is obvious that the list of indexes needs to be limited. What else?
@loganramirez  To schedule a PDF email to a mail server that does not require SMTP authentication, you must have the list_settings capability and use the sendemail command. If you want users who do not have the admin, splunk-system-role, or can_delete roles to be able to send email notifications from their searches, you must grant them the list_settings capability. By default, only the admin, splunk-system-role, and can_delete roles have access to list_settings.    
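For reference, a minimal sendemail sketch (hypothetical recipient and search; SMTP settings come from the server configuration, and the user running it needs the capability described above):

```spl
index=_internal earliest=-1h
| stats count by sourcetype
| sendemail to="user@example.com" subject="Hourly sourcetype counts" sendpdf=true
```

The sendpdf option attaches the results as a PDF, matching the scheduled-PDF use case in the question.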
I found the response handler below; will this work, or does it require any modification as per the sample in my original request?

class ArrayHandler:
    def __init__(self, **args):
        pass

    def __call__(self, response_object, raw_response_output, response_type, req_args, endpoint, oauth2=None):
        if response_type == "json":
            raw_json = json.loads(raw_response_output)
            column_list = []
            for column in raw_json['columns']:
                column_list.append(column['name'])
            for row in raw_json['rows']:
                i = 0
                new_event = {}
                for row_item in row:
                    new_event[column_list[i]] = row_item
                    i = i + 1
                print(print_xml_stream(json.dumps(new_event)))
        else:
            print_xml_stream(raw_response_output)
Nope, I think I ended up with using sed in props to remove the offending " ".
Hello @ccampbell, The reason you are not able to upgrade the app to the latest version is that the app is not listed as compatible with the SplunkCloud platform on Splunkbase. If you wish to have the updated package available for upgrade on SplunkCloud, you'll need to have the developer of the app update the platform compatibility details. Please refer to the following screenshot displaying platform compatibility for the app. I assume that app inspect vetting failed for the app package and hence platform compatibility for SplunkCloud is missing. As a workaround, you can download the app package from Splunkbase, modify the app_id (in app.conf, the folder name, and anywhere else within the package if used), repackage the app and upload it as a private app. This approach is not recommended since it doesn't let you track and receive future updates. Additionally, while uploading the private app, if app inspect fails, you'll need to fix the errors, repackage the app and vet it again until it passes. The best approach would be to engage the developer and make the app compatible with the SplunkCloud platform.   Thanks, Tejas   --- If the above solution helps, an upvote is appreciated.!!
Hello @shai, In this scenario, you'll need to combine your certs with the SplunkCloud certificates. Just append the CA file to include the self-signed certificate and the SplunkCloud rootCA, and use the same file for communication. This chain will let you communicate with both the Splunk Enterprise on-prem and SplunkCloud environments.   Thanks, Tejas.   --- If the above solution helps, an upvote is appreciated.!!
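The concatenation step might look like the following (hypothetical file names; the printf lines are placeholders standing in for real PEM certificate contents, which you would already have on disk):

```shell
# Placeholders standing in for real PEM files (skip these if your files exist)
printf '%s\n' '-----SELF-SIGNED-CA-----' > my_ca.pem
printf '%s\n' '-----SPLUNKCLOUD-ROOT-CA-----' > cloud_root_ca.pem

# Append the SplunkCloud root CA to the self-signed CA to form one chain file
cat my_ca.pem cloud_root_ca.pem > combined_ca.pem
```

The resulting combined_ca.pem is then referenced wherever the CA file path is configured, so certificates from either chain validate.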
Hi @TheEggi98 , good for you, see next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated
Alright, thank you. I will use sourcetype and index overriding and then make the newly added data available to our qs cluster to build dashboards.
We have the data below in JSON format; I need help with a custom JSON response handler so Splunk can break out every event separately. Each event starts with the record_id:

{
  "eventData": [
    {
      "record_id": "19643",
      "eventID": "1179923",
      "loginID": "PLI",
      "userDN": "cn=564SD21FS8DF32A1D87FAD1F,cn=Users,dc=us,dc=oracle,dc=com",
      "type": "CredentialValidation",
      "ipAddress": "w.w.w.w",
      "status": "success",
      "accessTime": "2024-08-29T06:23:03.487Z",
      "oooppd": "5648sd1csd-952f-d630a41c87ed-000a3e2d",
      "attributekey": "User-Agent",
      "attributevalue": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36"
    },
    {
      "record_id": "19644",
      "eventID": "1179924",
      "loginID": "OKP",
      "userDN": "cn=54S6DF45S212XCV6S8DF7,cn=Users,dc=us,dc=CVGH,dc=com",
      "type": "Logout",
      "ipAddress": "X.X.X.X",
      "status": "success",
      "accessTime": "2024-08-29T06:24:05.040Z",
      "oooppd": "54678S3D2FS962SDFV3246S8DF",
      "attributekey": "User-Agent",
      "attributevalue": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36"
    }
  ]
}
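The core of such a handler is just the splitting logic. A minimal sketch in plain Python (independent of the add-on's print_xml_stream helper, which a real handler would call on each string it returns): parse the payload and emit each eventData record as its own JSON document.

```python
import json

def split_event_data(raw_response_output):
    """Return one JSON string per record in the top-level eventData array."""
    parsed = json.loads(raw_response_output)
    return [json.dumps(record) for record in parsed.get("eventData", [])]
```

With the sample payload above, this yields two strings, one per record_id, so Splunk would index each record as a separate event once the handler streams them out.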
Hi @TheEggi98 , if the file to read is the same in both inputs, Splunk doesn't read the same file twice, and the solution is the second one I described (overriding). If instead you have different files in the same path to read in the two inputs, you can specify the different file name to read in each input stanza, even using the same path. Ciao. Giuseppe
Hi @gcusello , thanks for the fast response. If I'm not wrong, I could theoretically bypass the precedence by doing this (at least btool doesn't complain), but I will not do that:

[monitor://<path to logfile>.log]
...

[monitor://<path to same logfile>.lo*]
...

When overriding sourcetype and index on the indexer, am I able to route the data of the second sourcetype to our qs cluster to build dashboards?
Hi @TheEggi98 , you cannot read the same files in two input stanzas; only one (by precedence rules) will be used. If, in the same path, you have to read different files for each input, you can specify in the stanzas the correct file to read. If instead the data are in the same file, the only solution is to read it with one input stanza and then override the index and possibly the sourcetype values on the Indexers or (if present) on Heavy Forwarders, following the instructions for sourcetype at https://docs.splunk.com/Documentation/SplunkCloud/8.2.2203/Data/Advancedsourcetypeoverrides and for index at https://community.splunk.com/t5/Getting-Data-In/Route-data-to-index-based-on-host/td-p/10887 Ciao. Giuseppe
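For reference, the override approach Giuseppe links to is configured in props.conf and transforms.conf on the indexer or heavy forwarder. A hedged sketch (hypothetical sourcetype names, index name, and REGEX; the pattern must identify which events belong to the second feed):

```ini
# props.conf
[my_source_sourcetype]
TRANSFORMS-route = override_sourcetype, override_index

# transforms.conf
[override_sourcetype]
REGEX = pattern_identifying_the_second_feed
FORMAT = sourcetype::my_second_sourcetype
DEST_KEY = MetaData:Sourcetype

[override_index]
REGEX = pattern_identifying_the_second_feed
FORMAT = my_second_index
DEST_KEY = _MetaData:Index
```

Events matching the REGEX are rewritten to the new sourcetype and routed to the new index at parse time; everything else keeps the original metadata.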