All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I need to create a summary index continuously, in near real time, and have two questions: 1) I run a Splunk forwarder on the client and the logs are sent to the Splunk server. Each line contains a lot of data, so I need to summarize each line as soon as the log is received and store that summary in the summary index continuously, in near real time. 2) Is it possible to automatically create a new index for each day, e.g. myindex-20240115, myindex-20240116, as data comes in from the forwarder? Thanks
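A hedged sketch of one common approach to the first question: rather than per-event processing, near-real-time summaries are usually built with a frequently scheduled saved search piped to the `collect` command. The stanza, index, sourcetype, and field names below are assumptions, not a verified configuration:

```
# savedsearches.conf -- illustrative sketch only; names are made up
[summarize_forwarded_logs]
search = index=main sourcetype=my_logs \
         | stats count, avg(duration) AS avg_duration BY host \
         | collect index=my_summary
cron_schedule = */5 * * * *
enableSched = 1
```

On the second question: as far as I know, Splunk does not create a new index per day out of the box. The usual pattern is a single index with time-based retention, since any search can be restricted to a day with the time range picker.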
Hi Team, I have given a user role read-only access, but users with that role can still edit and delete reports. How can I restrict their access to the reports to read-only, so that they cannot delete reports or edit the Splunk queries? The configuration and role capabilities I have set are given below. Please help me restrict the user access.

[savedsearches/krack_delete]
access = read : [ * ], write : [ power ]
export = system
owner = vijreddy@xxxxxxxxxxxxx.com
version = 9.0.5.1
modtime = 1704823240.999623300
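A hedged sketch: if the role in question is (or inherits from) `power`, the `write : [ power ]` ACL above is exactly what grants edit/delete on this object. One possible fix, with the role names being assumptions, is to tighten the write list in the object's .meta file:

```
# local.meta -- sketch; adjust the stanza path and role names to your setup
[savedsearches/krack_delete]
access = read : [ * ], write : [ admin ]
```

Only roles in the write list can edit or delete the saved search; every other role keeps read-only access.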
Hi, I have a dataset with very poor quality and multiple encoding errors. Some fields contain data like "&#1040;&#1083;&#1077;&#1082;&#1089;&#1077;&#1081;" which should be "Алексей". My first idea was to search every faulty dataset and convert it externally with a script, but I'm curious if there's a better way using Splunk; I just have no idea how to get there. I somehow need to get every &#(\d{4}); match, and I could use printf("%c", \1) to get the correct Unicode character, but I have no idea how to apply that to every occurrence in a single field. Currently I have data like this:

id name
1 &#1040;&#1083;&#1077;&#1082;&#1089;&#1077;&#1081;

Where I want to get to is this:

id name correct_name
1 &#1040;&#1083;&#1077;&#1082;&#1089;&#1077;&#1081; Алексей

Any ideas if that is possible without using Python scripts in Splunk? Regards, Thorsten
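For what it's worth, outside of Splunk the conversion itself is simple: each `&#NNNN;` is a decimal Unicode code point, so replacing it with `chr(NNNN)` (or just calling `html.unescape`) repairs the field. A small Python sketch of the idea, in case an external script or a scripted lookup turns out to be acceptable after all:

```python
import html
import re

def decode_numeric_refs(s: str) -> str:
    """Replace decimal character references like &#1040; with the
    corresponding Unicode character (chr(1040) == 'А')."""
    return re.sub(r"&#(\d+);", lambda m: chr(int(m.group(1))), s)

broken = "&#1040;&#1083;&#1077;&#1082;&#1089;&#1077;&#1081;"
print(decode_numeric_refs(broken))  # Алексей
print(html.unescape(broken))        # same result; also handles named entities
```

`html.unescape` is the more robust choice since it also covers named entities like `&amp;` and references longer or shorter than four digits.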
Hello, I am looking for any guidance or info about the possibility of using Microsoft AMA agents to forward logs to Splunk instead of using Splunk universal forwarders. I know you will ask "but why?!"; let's say I have requirements and constraints that oblige me to use AMA agents. I need to know the feasibility of this integration and whether there are any known issues or limitations. Thank you for your help. (Excuse me if my question is vague, I am kind of lost here.)
Hi, can you please tell me how I can extract the events for which the difference between current_time and timestampOfReception is greater than 4 hours, for the Splunk query below:

`eoc_stp_events_indexes` host=p* OR host=azure_srt_prd_0001 (messageType= seev.047* OR messageType= SEEV.047*) status = SUCCESS targetPlatform = SRS_ESES NOT [ search (index=events_prod_srt_shareholders_esa OR index=eoc_srt) seev.047 Name="Received Disclosure Response Command" | spath input=Properties.appHdr | rename bizMsgIdr as messageBusinessIdentifier | fields messageBusinessIdentifier ]
| eval Current_time =strftime(now(),"%Y-%m-%d %H:%M:%S ")
| eval diff= Current_time-timestampOfReception
| fillnull timestampOfReception , messageOriginIdentifier, messageBusinessIdentifier, direction, messageType, currentPlatform, sAAUserReference value="-"
| sort -timestampOfReception
| table diff , Current_time, timestampOfReception, messageOriginIdentifier, messageType, status, messageBusinessIdentifier, originPlatform, direction, sourcePlatform, currentPlatform, targetPlatform, senderIdentifier, receiverIdentifier, currentPlatform,
| rename timestampOfReception AS "Timestamp of reception", originPlatform AS "Origin platform", sourcePlatform AS "Source platform", targetPlatform AS "Target platform", senderIdentifier AS "Sender identifier", receiverIdentifier AS "Receiver identifier", messageOriginIdentifier AS "Origin identifier", messageBusinessIdentifier AS "Business identifier", direction AS Direction, currentPlatform AS "Current platform", sAAUserReference AS "SAA user reference", messageType AS "Message type"
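One note on the query above: `Current_time` is a formatted string, so `Current_time - timestampOfReception` cannot produce a numeric difference. A hedged sketch of the usual pattern, assuming `timestampOfReception` is formatted as `%Y-%m-%d %H:%M:%S` (adjust the strptime format string to the actual data):

```
... | eval rx_epoch = strptime(timestampOfReception, "%Y-%m-%d %H:%M:%S")
    | eval diff_secs = now() - rx_epoch
    | where diff_secs > 4*3600
    | eval diff_hours = round(diff_secs / 3600, 1)
```

Working in epoch seconds keeps the comparison numeric; strftime is only needed at the end, for display.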
We are using a SaaS-based controller. If we needed to restore aspects of our configuration from yesterday, or from perhaps a week or month ago, what is the process for us to do that? Do you perform regular (and granular) backups on our behalf, or are we expected to download configurations ourselves? If so, what options are there that allow us to automate this, e.g. APIs, scheduled jobs, etc.?
I need to mask email addresses in my data. I'm trying to use transforms.conf:

[emailaddr-anonymizer]
REGEX = ([A-z0-9._%+-]+@[A-z0-9.-]+\.[A-z]{2,63})
FORMAT = ********@*********
DEST_KEY = _raw

If I do this, the entire log is masked; however, I want only the email to be masked. Please, can someone help me?
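A hedged sketch of two possible fixes. With `DEST_KEY = _raw`, FORMAT replaces the whole event unless the regex also captures the text around the email; alternatively, a SEDCMD in props.conf is often simpler for in-place substitution. Note that `[A-z]` matches more than letters (it includes the punctuation characters between the two ranges); `[A-Za-z]` is safer. The sourcetype name below is an assumption:

```
# props.conf -- option 1: sed-style masking (sketch)
[my_sourcetype]
SEDCMD-mask_email = s/[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,63}/********@********/g

# transforms.conf -- option 2: capture the surrounding text (sketch)
[emailaddr-anonymizer]
REGEX = (.*?)[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,63}(.*)
FORMAT = $1********@********$2
DEST_KEY = _raw
```

Either way, only the matched email portion is rewritten and the rest of the event is preserved.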
Hi, I didn't find an email address for the developer Christopher Caldwell, so I am trying it this way. The BlueCat Address Manager RESTful API changes from version 1 to version 2, and version 1 will be removed in 2025. Are there any plans to update the add-on to support the new API? I would be very pleased! Greetings, Mirko
Hello Splunkers, I have a Region filter on the dashboard. This Region filter has the values AMER and EMEA. I have a requirement to reorder the fields based on the selection of the Region filter as follows: I want the "<Region> Mandatory" field to appear before the "<Region> All" field. Thanks in advance. @tscroggins @yuanliu @bowesmana
Hello Community, we have a challenge with our Sysmon instance. While testing compatibility, we noticed that after Sysmon gets upgraded it no longer talks to the SIEM, for some weird reason. Has anyone experienced anything like this before? Regards, Dan
While configuring the replication factor (RF) and search head (SH), can we configure one server to be used for saving all copies of the data, so that it does not participate in indexing and only participates in searching when needed?
Recently I converted my lookup files to .csv lookup files, and after converting them the dashboard shows nothing but this [screenshot]. If it helps, we have custom scripts in the backend.
Hi All, I currently display results like this:

|chart count over API by StatusCode

API  200  300  400  total
--   ---  ---  ---  -----

but I need to display more fields alongside API, like host and method as well:

API  host  method  200  300  400  total
--   ----  ------  ---  ---  ---  -----

Please help me get these results.
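A hedged sketch of one common workaround: `chart` supports only one field after `over`, so pack the extra fields into a single key, pivot with `xyseries`, then split the key back out. Field names are taken from the question; the `|` delimiter is an arbitrary choice (pick one that cannot occur in the data):

```
... | stats count BY API, host, method, StatusCode
    | eval key = API . "|" . host . "|" . method
    | xyseries key StatusCode count
    | eval API = mvindex(split(key, "|"), 0)
    | eval host = mvindex(split(key, "|"), 1)
    | eval method = mvindex(split(key, "|"), 2)
    | fields - key
    | addtotals fieldname=total
    | table API host method * total
```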
Hi, can someone help explain how we can express a "NOT EXISTS" in Splunk? An example for which I need to use this function is below.

1) Search1 generates a set of results.
2) Search2 also generates a set of results.

There is a common field between the two searches. I want to run a search in Splunk like this:

Results of Search1 (NOT EXISTS (results of Search2)), common field = Field1

Search1:
`eoc_stp_events_indexes` host=p* OR host=azure_srt_prd_0001 | table timestampOfReception, messageOriginIdentifier, messageType, status, messageBusinessIdentifier, originPlatform, direction, sourcePlatform, currentPlatform, targetPlatform, senderIdentifier, receiverIdentifier, currentPlatform

Search2:
(index=events_prod_srt_shareholders_esa OR index=eoc_srt) seev.047 Name="Created Disclosure Response Status Advice Accepted" | table messageBusinessIdentifier

Field1: messageBusinessIdentifier
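A hedged sketch of the usual "NOT EXISTS" pattern in SPL: a subsearch whose results are negated in the outer search. The subsearch returns its `messageBusinessIdentifier` values, and `NOT [...]` excludes any outer event whose field matches one of them (note the default subsearch limits, roughly 10,000 results / 60 seconds, may matter at scale):

```
`eoc_stp_events_indexes` host=p* OR host=azure_srt_prd_0001
    NOT [ search (index=events_prod_srt_shareholders_esa OR index=eoc_srt) seev.047
          Name="Created Disclosure Response Status Advice Accepted"
          | fields messageBusinessIdentifier ]
| table timestampOfReception, messageType, status, messageBusinessIdentifier
```

Ending the subsearch with `| fields messageBusinessIdentifier` makes it expand to `(messageBusinessIdentifier=v1) OR (messageBusinessIdentifier=v2) OR ...`, which the `NOT` then inverts.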
Hi, will disabling the app (ES Content Updates) affect the functionality of Enterprise Security? Thanks, regards.
Hello, I want to add a user to Splunk. I have a free trial license, and there is no "Users" or "Add user" option in my Splunk Enterprise interface. How else can I do that?
I am getting the error "Could not load lookup=LOOKUP-minemeldfeeds_dest_lookup" in one of the dashboard panels. Any solutions?
Hi All, I have tried looking over the documentation for this, but I am super confused and really struggling to wrap my head around it. I have an environment where Splunk is ingesting syslog from 2 firewalls. The logs are only audit/management related, and they need to be sent to a separate server for compliance (hence Splunk). I want to configure a retention policy where this data is deleted after 1 year, as that is the specific requirement. From what I can tell, I just need to add the frozenTimePeriodInSecs setting to indexes.conf for the "main" index (as this is where the events are going). Current ingestion is ~150,000 events per day, and daily volume is ~30-35 MB; however, this is subject to change in the future as more firewalls come online. There is plenty of storage available, but the requirement is just 1 year of searchable data. I keep seeing things about hot/warm/cold/frozen buckets, and I just don't get it. All that's needed is 1 year of searchable data; anything older than (now - 365 days) can be deleted. Can someone please assist me with what I need to do to make this work?
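A hedged sketch of the minimal configuration, assuming deletion (not archiving) is acceptable: when no coldToFrozenDir or coldToFrozenScript is set, "frozen" simply means deleted, so the hot/warm/cold stages can largely be ignored for this requirement:

```
# indexes.conf -- sketch; 31536000 seconds = 365 days
[main]
frozenTimePeriodInSecs = 31536000
```

One caveat: retention is applied per bucket, so a bucket is only removed once its newest event exceeds the age limit, meaning some data may linger slightly past 365 days. Many deployments also prefer a dedicated index over main so the policy cannot touch unrelated data.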
Hello, I have a CSV file with many, MANY columns: in my case there are 7334 columns with an average length of 145-146 chars each. This is a telemetry file exported from some networking equipment, and it is just part of the exported data. The file has over 1000 data rows, but I'm just trying to add 5 rows at the moment. Trying to create an input for the file fails when adding more than 4175 columns, with the following error:

"Accumulated a line of 512256 bytes while reading a structured header, giving up parsing header"

I have already tried increasing all TRUNCATE settings to well above this value (by several orders of magnitude), as well as the [kv] limits in limits.conf. Nothing helps. I searched the forum here but couldn't find anything relevant. A Google search yielded two results: one where people just decided that headers that are too long are the user's problem and offered no resolution (not even to say it's impossible), and another that went unanswered. I couldn't find anything relevant in the Splunk online documentation or REST API specifications either. I will also mention that processing the full data file with Python, using either the standard csv parser or pandas, works just fine and very quickly. The total file size is ~92 MB, which is not big at all IMHO. My Splunk info: Version: 9.1.2, Build: b6b9c8185839, Server: 834f30dfffad, Products: hadoop. Needless to say, the web frontend crashes entirely when I try to create the input, so I'm doing everything via the Python SDK now. Any ideas how this can be fixed so I can add all of my data?
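Since Python already parses the file happily, one workaround (a sketch, not an official fix) is to pre-process the wide CSV into long key=value events before sending them to Splunk, e.g. via HEC or the SDK. That sidesteps the structured-header limit entirely, because no event ever carries the 500 KB header. Function and column names below are illustrative:

```python
import csv
import io

def melt_wide_csv(csv_text: str, id_column: str):
    """Yield one key=value event per (row, column) pair from a wide CSV.

    Each emitted line is tiny, so Splunk never parses the huge structured
    header; field extraction becomes plain key=value at search time."""
    reader = csv.DictReader(io.StringIO(csv_text))
    for row in reader:
        rid = row[id_column]
        for col, val in row.items():
            if col != id_column:
                yield f'id="{rid}" metric="{col}" value="{val}"'

sample = "device,rx_bytes,tx_bytes\nsw1,100,200\n"
for event in melt_wide_csv(sample, "device"):
    print(event)
# id="sw1" metric="rx_bytes" value="100"
# id="sw1" metric="tx_bytes" value="200"
```

The long format also tends to query more naturally in SPL (`stats` by `metric`) than 7000+ distinct field names would.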