All Posts


Hello @gcusello, I tested with Chrome, Firefox, and Edge; it's the same problem in all of them. I don't know how I can debug this. Thank you.
Hello Pickle Rick, Thanks for your help!  It worked! 
Hi, there are two options to get those into your lookup. 1. Get them from your LDAP query. This is obviously the best option, as then they are guaranteed to be correct. Unfortunately I don't have a suitable AD at hand to check which fields those are and how you could get them, but I'm quite confident that they are there. Just ask your AD admins and they can probably help you. 2. If you have a standard for how those are generated based on other attributes, then just regenerate them before you add the entry to the lookup. r. Ismo
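A minimal sketch of option 1, assuming the SA-ldapsearch app is installed and configured with a domain named "default"; the attribute and lookup names here are common AD defaults and illustrative, not verified against your schema:

```spl
| ldapsearch domain=default search="(objectClass=user)" attrs="sAMAccountName,mail,displayName"
| table sAMAccountName, mail, displayName
| outputlookup your_identity_lookup.csv
```

Run this on a schedule so the lookup stays in sync with the directory.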
Hi, I've got the email, but this is not what I wanted. I want to set up my controller's account, username, and password. If I click my controller name(?), it prompts me to enter my account information. You can see my screenshots for better understanding. Could you help me reset them? Thank you in advance.
My bad. I forgot to add a time variable to the eventstats. By disregarding time, the query checks whether a user has more than 10 failed logins over the entire search span. If your data already has a date stamp, you can use that. Alternatively, you could adjust the query like this:

index=gbts-vconnection sourcetype=VMWareVDM_debug "onEvent: DISCONNECTED" (host=Host1 OR host=host2)
| rex field=_raw "(?ms)^(?:[^:\\n]*:){5}(?P<IONS>[^;]+)(?:[^:\\n]*:){8}(?P<Device>[^;]+)(?:[^;\\n]*;){4}\\w+:(?P<VDI>\\w+)" offset_field=_extracted_fields_bounds
| eval temp_date = strftime(_time, "%Y-%m-%d")
| eventstats count as failed_count by IONS,temp_date
| where failed_count>=10
| timechart dc(IONS) as IONS span=1d
For whoever else needs this: your search's field IN (...) clause will accept this multivalue token passed to it: $your_multivalue_select_tokenhere_that_also_accepts_*_as_default|s$
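For anyone wiring this up, a sketch of the search side, assuming a multiselect whose token is named host_tok and whose valuePrefix/valueSuffix settings wrap each selected value in double quotes with a ", " delimiter (the index, sourcetype, and token name are illustrative):

```spl
index=your_index sourcetype=your_sourcetype
| search host IN ($host_tok|s$)
```

With those input settings, selecting two hosts expands the clause to host IN ("host1", "host2").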
Hi, I have an issue here with the fishbucket of the Universal Forwarder. I have tried to find documentation, but there seems to be very little, and there are also few topics on it. The problem I am facing is that the fishbucket is taking up a large amount of space, about 2GB on the hard drive, while the limit configured in limits.conf is: file_tracking_db_threshold_mb = 500. In some other topics, I read that the fishbucket can grow to 2 or 3 times the configured limit, and that this happens because of its backup mechanism with the file save and snapshot.tmp. However, is there a limit to the size of the fishbucket? Will it continue to expand over time without limit, or only up to a certain point? PS: I have the nmon TA installed on my server. Please point me to the Splunk documentation on this part. Thank you.
Thanks. I am on Splunk Enterprise on a local server. Do you know where the file you referenced is located, and what it is called?
Hi Team, I have two events (attaching screenshots for reference). 1. How do I retrieve the uniqObjectIds and display them in table form? 2. How do I retrieve objectIds and version and display their values in different table columns?

First event:
msg: unique objectIds
name: platform-logger
pid: 8
uniqObjectIds: [ 275649, 108976 ]
uniqObjectIdsCount: 1

Second event:
event: { body: { "objectType": "material", "objectIds": [ "275649" ], "version": "latest" } }
msg: request body

The closest query I have come up with is below, but it still doesn't give me what I want. Actual result: see the attached screenshot. Expected: in a table, each object in a different row, e.g.

uniqueIds
275649
108976

index="" source IN ("")
| eval PST=_time-28800
| eval PST_TIME=strftime(PST, "%Y-%d-%m %H:%M:%S")
| eval split_field= split(_raw, "Z\"}")
| mvexpand split_field
| rex field=split_field "objectIdsCount=(?<objectIdsCount>[^,]+)"
| rex field=split_field "uniqObjectIdsCount=(?<uniqObjectIdsCount>[^,]+)"
| rex field=split_field "recordsCount=(?<recordsCount>[^,]+)"
| rex field=split_field "sqsSentCount=(?<sqsSentCount>[^,]+)"
| where objectType="material"
| table _time, PST_TIME, objectType, objectIdsCount, uniqObjectIdsCount, recordsCount, sqsSentCount
| sort _time desc
Hi @herguzav, you should look at your question from a different angle: what are your requirements? What is the wanted result? Starting from that point of view, you can analyze your logs, identify the conditions to verify, and check whether you already have the eventtypes and fields in the Data Model. At the very least, you can see whether you really need to add a field or a constraint to the Data Model. Just as an example (because this one already exists): if you need to check failed logins on Linux, you can analyze the Linux message ("Failed password") and create (if it doesn't exist) the related eventtype; then you can check whether the Data Model has the required fields (e.g. user, source_ip, etc.), and if not, you can add them. Ciao. Giuseppe
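As a rough sketch of what the steps above look like in configuration, the eventtype and the tags that route it into the Authentication data model could be defined like this in a local app (the stanza name and sourcetype are assumptions, not taken from your environment):

```spl
# eventtypes.conf -- hypothetical eventtype for Linux failed logins
[linux_failed_login]
search = sourcetype=linux_secure "Failed password"

# tags.conf -- tag the eventtype so the Authentication data model picks it up
[eventtype=linux_failed_login]
authentication = enabled
failure = enabled
```

After that, any missing fields (user, source_ip, ...) can be supplied with field extractions or aliases on the same sourcetype.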
Hi @olawalePS, the issue is probably related to the time format: you have different formats in your data (1, 2, or 3 digits of milliseconds), so your eval command probably extracts data correctly only when it matches the right format. You should try to normalize your data with something like this:

| eval timestamp1=strptime(lastContactTime,"%Y-%m-%dT%H:%M:%S.%NZ"), timestamp2=strptime(lastContactTime,"%Y-%m-%dT%H:%M:%S.%2NZ"), timestamp3=strptime(lastContactTime,"%Y-%m-%dT%H:%M:%S.%3NZ")
| eval timestamp=coalesce(timestamp1,timestamp2,timestamp3)

Ciao. Giuseppe
Hi, what are the steps for setting up an email alert for when the SQL Server and SQL Agent services are down?
Hi @gcusello, no worries about that. Otherwise, thanks for your time. Karma points are appreciated too.
Hi @smanojkumar , You could use the "IN" operator for this scenario. Let's assume the field name is "field1", so you could construct the multi-select input like following, This would have the ... See more...
Hi @smanojkumar, you could use the "IN" operator for this scenario. Let's assume the field name is "field1"; you could then construct the multi-select input like the following. This would produce output from the selected values like: field1 IN("value1", "value2"), which is the same as field1="value1" OR field1="value2". If you find the solution helpful, kindly upvote. Thanks
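A Simple XML sketch of such a multiselect input (the token name, label, and choices are illustrative); the prefix/suffix and valuePrefix/valueSuffix settings build the field1 IN("value1", "value2") string automatically:

```xml
<input type="multiselect" token="field1_tok">
  <label>Select values</label>
  <prefix>field1 IN(</prefix>
  <suffix>)</suffix>
  <valuePrefix>"</valuePrefix>
  <valueSuffix>"</valueSuffix>
  <delimiter>, </delimiter>
  <choice value="value1">value1</choice>
  <choice value="value2">value2</choice>
</input>
```

The base search then just references $field1_tok$ where the IN clause should go.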
Hi @CSReviews, there isn't a limit on the volume of data you can index in a day, even when exceeding the quota. The only limit is that you can exceed the 500 MB limit only two times in 30 solar days; otherwise you'll be in violation and searches will be blocked. Remember that there's a Splunk license for students; for more info see https://www.splunk.com/en_us/about-us/splunk-pledge/academic-license-application.html?locale=en_us Ciao. Giuseppe
Hi there! I would like to pass two values based on the selection of inputs in a multiselect drilldown. Assume I have multiselect options v1, v2, v3, v4. Based on the selection, e.g. if v1 and v2 were selected, I would like to pass "value1" and "value2" in an "OR" condition to a token of a base search. Thanks in advance!
Hello, I've set up an identity lookup using ldapsearch - it creates an identity of "username" that contains various details about the user, including the email address. It works well in identifying ... See more...
Hello, I've set up an identity lookup using ldapsearch. It creates an identity of "username" that contains various details about the user, including the email address. It works well in identifying the user as `username` and `useremail@domain`. However, I'd also like it to identify users based on `domain\username` and `username@domain` (which is actually different from `useremail` in our case), since a lot of our logs contain the user field in those formats. What's the best way to do that?
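One possible approach, assuming the Enterprise Security identity lookup convention where the identity column holds pipe-delimited aliases; the field names (username, domain, useremail) and lookup name are assumptions based on the post, so adjust them to your ldapsearch output:

```spl
| ldapsearch domain=default search="(objectClass=user)" attrs="sAMAccountName,mail"
| eval username=sAMAccountName, domain="YOURDOMAIN", useremail=mail
| eval identity=username . "|" . domain . "\\" . username . "|" . username . "@" . domain . "|" . useremail
| outputlookup my_ldap_identities.csv
```

Each alias in the identity field then matches the corresponding user format seen in your logs.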
Dear all, I have a lookup file with transaction names and transaction details, like below. It would be great if someone could suggest how to handle the scenario below.

Tran_lookup:
Tran    Transaction_Details
ABC     Shopping
CDE     Rent

From my Splunk index I am running a stats command like below (Tran from the index = Tran in Tran_lookup):
... | stats count(Tran) as count, avg(responsetime) as avgrt by Tran

I need to add the matching Transaction_Details from the lookup to the final stats results.
Current results: Tran, count, avgrt
Required results (matching Transaction_Details to be pulled based on Tran): Tran, Transaction_Details, count, avgrt
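A sketch of the enrichment, assuming the lookup is defined as Tran_lookup with a key column named Tran, and that the response-time field is responsetime (index and field names are taken from the post and may need adjusting):

```spl
index=your_index
| stats count(Tran) as count, avg(responsetime) as avgrt by Tran
| lookup Tran_lookup Tran OUTPUT Transaction_Details
| table Tran, Transaction_Details, count, avgrt
```

The lookup command after stats adds Transaction_Details to each row wherever Tran matches an entry in the lookup.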
Hello, I am looking to use the Splunk free edition to teach students about searching through logs. I plan on setting up Splunk within a virtual environment, generating logs, and then exporting the data, then having students install Splunk on their own machines and import the generated data. For the free edition, it states: "Are you planning to ingest a large (over 500 MB per day) data set only once, and then analyze it? The Splunk Free license lets you bulk load a much larger data sets up to 2 times within a 30 day period". My question: what is the maximum amount of data that can be imported at a single time? Although the virtual environment will be small, only a few workstations and servers, I am worried that the sample data sets I generate might be too large. Thank you
thank you