All Posts

Hi @tdavison76, I could be more precise if you shared a sample of your logs; anyway, if you want to take all the content of the "alert.message" field after "On-Prem - ", you could try: | rex field="alert.message" "(?<=On-Prem - )(?<Name>.*)" Ciao. Giuseppe
We have a lookup in Splunk and we want to send a few of its columns to another product via a POST API call. My question is: are there any Splunk add-ons I can leverage to do this? I see there is an HTTP alert action that can make a POST; however, with this being a lookup (CSV), I am not sure it will work correctly.
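One hedged way to approach this without an add-on, as a minimal Python sketch: export the lookup through a search job on the management port and POST each row downstream. The host names, credentials, lookup name, columns, and target URL below are all placeholders.

import csv
import io

import requests

SPLUNK = "https://splunk.example.com:8089"        # management port (assumption)
TARGET = "https://other-product.example.com/api"  # hypothetical downstream endpoint

# Export the lookup as CSV through the search/jobs/export endpoint
resp = requests.post(
    f"{SPLUNK}/services/search/jobs/export",
    auth=("admin", "changeme"),  # use a real service account
    data={"search": "| inputlookup my_lookup.csv | fields host, owner",
          "output_mode": "csv"},
    verify=False,  # lab only; verify certificates in production
)
resp.raise_for_status()

# POST each row as JSON to the other product
for row in csv.DictReader(io.StringIO(resp.text)):
    requests.post(TARGET, json=row, timeout=10).raise_for_status()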
I recently migrated Splunk from v8 to v9 and I am having issues with ldapsearch not returning data that it previously returned. I am trying to pull lastLogon for accurate tracking, but this attribute returns nothing. lastLogonTimestamp works but is too far out of sync for my reporting requirements. The LDAP configuration in the Active Directory add-on is set to port 3269, and everything else works fine except this one attribute. I set up delegation to read lastLogonTimestamp and then everything else, so it's not a permissions issue from what I can see. Any help would be appreciated.
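For reference, a minimal SA-ldapsearch query of the kind described might look like the sketch below (the domain stanza name is an assumption). One thing worth checking: lastLogon is a non-replicated, per-DC attribute and is not part of the Global Catalog's partial attribute set, so a Global Catalog query on port 3269 may simply never return it; querying a domain controller directly on 389/636 is the usual workaround.

| ldapsearch domain=default search="(&(objectClass=user)(sAMAccountName=*))" attrs="sAMAccountName,lastLogon"
| table sAMAccountName lastLogon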
Hello, I need help passing a field value from a dashboard table into a "Link to search" drilldown but can't figure it out. I have a table that contains a "host" field. I need to be able to click on any of the returned hosts and drill into all of the events for that host. I tried this drilldown query, hoping that $host$ would be replaced with the actual host name: source="udp:514" host="$host$.domain.com" but, of course, it failed; it just gets replaced with "*". I'm sure I'm probably way off on how to do this, but any help would be awesome. Thanks in advance. Tom
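In Simple XML table drilldowns the clicked row's field value is exposed as $row.<field>$ rather than a bare $host$ token, so a minimal sketch of the drilldown (assuming a Simple XML dashboard, with domain.com standing in for the real suffix) might be:

<drilldown>
  <link target="_blank">
    <![CDATA[search?q=search source="udp:514" host="$row.host$.domain.com"]]>
  </link>
</drilldown>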
How can I identify whether the stream_events function of a modular input is called at the scheduled interval or when the data input is created/edited?
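One way to answer this empirically, as a minimal sketch assuming the splunklib modular input framework (the class and input names are placeholders): log a line every time stream_events runs, then compare the timestamps in splunkd.log against the configured interval and against the moments you create or edit the input.

import sys
import time

from splunklib.modularinput import Event, Scheme, Script

class MyInput(Script):
    def get_scheme(self):
        # Scheme shown to Splunk when the input is registered
        return Scheme("my_input")

    def stream_events(self, inputs, ew):
        # Each invocation leaves a timestamped trace in splunkd.log
        ew.log("INFO", "stream_events invoked at %s" % time.ctime())
        for stanza_name in inputs.inputs:
            ew.write_event(Event(stanza=stanza_name, data="heartbeat"))

if __name__ == "__main__":
    sys.exit(MyInput().run(sys.argv))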
Hello everyone, I am terrible at regex. I am trying to regex a field called "alert.message" to create another field with only the contents of alert.message after "On-Prem - ". I can achieve this in regex101 with: (?<=On-Prem - ).* But I know that in Splunk we have to give it a field name, and I can't figure out the correct syntax to add the field name so it would work. For example, one I've tried without success: rex field="alert.message" "\?(?<Name><=On Prem - ).*" If possible, could someone help me out with this one? Thanks for any help, Tom
So I want to build a dashboard with the _introspection index. Some of the metrics I am looking for are THP (enabled/disabled), ulimits, CPU, memory, disk usage, swap usage, clock sync (realtime & hardware), etc. I couldn't find any solid documentation for the _introspection index describing under which source and component these variables are stored, or what data is available in the index at all. Can someone please point me to a documented list of all the data points in the index, if such docs exist? Also, under which specific component/source can I find the KPIs I mentioned above?
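As a starting sketch (field names from memory, so verify them against a raw search of the index): host-level CPU, memory, and swap live under sourcetype=splunk_resource_usage with component=Hostwide. THP and ulimit values are, as far as I know, not in _introspection at all; splunkd logs them to _internal at startup.

index=_introspection sourcetype=splunk_resource_usage component=Hostwide
| stats latest(data.cpu_system_pct) AS cpu_system_pct
        latest(data.cpu_user_pct) AS cpu_user_pct
        latest(data.mem_used) AS mem_used
        latest(data.swap_used) AS swap_used
        latest(data.normalized_load_avg_1min) AS load_1min
  BY host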
Hi @Sailesh6891, it's a best practice to create a custom add-on containing all the parsing rules for your data, also because I suppose there are other parsing rules you will need to add. But anyway, you can also put these two lines in another props.conf. Ciao. Giuseppe
No, I have not used the LINE_BREAKER option. Do I need to create a props.conf under $SPLUNK_HOME/etc/apps/<your_app>/local/ and add these two lines, i.e. [sourcetype] and LINE_BREAKER = :::::::::::::::::::?
Hi @Sailesh6891, did you try the LINE_BREAKER option in props.conf? [your-sourcetype] LINE_BREAKER = :{70,}([\r\n]+) Ciao. Giuseppe
Hi, I have a log file on the server which I ingested into Splunk through an input app, where I defined the index, sourcetype, and monitor statement in inputs.conf. The log file on the server looks like below:

xyz asdfoasdf asfanfafd
:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
sdfsdfja agf[oija[gfojerg fgoaierr apodsifa[soigaiga[oiga[dogj
:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
sadfnasd;fiasfdoiasndf'i dfdf fd garehaehseht shse thse tjst
:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
asdf;nafdsknasdf asdfknasdfln asdf;nasdkfnasf asogja'fja foj'apogj aogj agf

When I search the log file in Splunk, the logs are visible; however, events are not breaking as I expect them to. I want events to be separated as below:

Event 1:
xyz asdfoasdf asfanfafd
:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

Event 2:
sdfsdfja agf[oija[gfojerg fgoaierr apodsifa[soigaiga[oiga[dogj
:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

Event 3:
sadfnasd;fiasfdoiasndf'i dfdf fd garehaehseht shse thse tjst
:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

Event 4:
asdf;nafdsknasdf asdfknasdfln asdf;nasdkfnasf asogja'fja foj'apogj aogj agf
:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
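A minimal props.conf sketch for this, under the assumption that the separator is a run of at least 70 colons: LINE_BREAKER's first capture group marks the discarded event boundary, so capturing only the trailing newline keeps the colon rule at the end of each event. This needs to sit on the first full parsing tier (indexers or a heavy forwarder), followed by a restart.

[your_sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = :{70,}([\r\n]+)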
Hi @anooshac, you have to coalesce the key fields and then correlate them using stats. If the fields to correlate are field1 and field2, and the fields to display are field3 and field4 from type1 and field5 from type2: index=your_index sourcetype=your_sourcetype type IN (type1, type2) | eval key=coalesce(field1,field2) | stats values(field3) AS field3 values(field4) AS field4 values(field5) AS field5 BY key Ciao. Giuseppe
Hi all, I have two events present in a sourcetype, with different data. There is one field which has the same data in both events, but the field names are different. Can anyone suggest a method other than join to combine the two events? I tried combining the fields with the coalesce command, but once I combined them I was not able to see the combined field. I want to combine the events and do some calculations.
Output1 and output2 are dynamic values which we get from the field "Field1". I tried your two queries but no luck. If I remove the condition (where), I can get results. It seems like there is an issue with the condition (output1 and output2).
That's a good direction! Unfortunately it's still not working 100%. I used your code in my props.conf:

[APIGW]
SEDCMD-trim-file = s/(\\"file\\":\s*\\")([^\\"]{5000,}?)/\1long_file/g

and here are the results: it's like it only replaces the first 5000 characters instead of the entire field, but this is a big step in the right direction. Thank you for your help! I will try taking it from here, but it would be much appreciated if you have the solution in mind and can share it.

EDIT: From a few tests I've made, the field change stops exactly after 5000 characters instead of running until the first comma / end of field.

EDIT 2: The regex that was needed was:

SEDCMD-trim-file = s/(\\"file\\":\s*\\")([^\\"]{5000,})(\\")/\1long_file/g

But thank you for all the help!

EDIT 3: Apparently this solution alone is not enough; I also had to increase the TRUNCATE value. When the SEDCMD runs, it does the replacement at the end: Splunk first receives the default 10,000 characters and only then replaces, which is not good enough because the final result is still truncated events. I needed to increase the TRUNCATE value so Splunk receives the entire event and does the replacement afterwards.
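Pulling the edits together, a consolidated props.conf sketch might look like the following. The TRUNCATE value is an assumption (anything comfortably above the largest raw event), and \3 re-emits the closing quote that the EDIT 2 version captures but drops from the replacement:

[APIGW]
TRUNCATE = 100000
SEDCMD-trim-file = s/(\\"file\\":\s*\\")([^\\"]{5000,})(\\")/\1long_file\3/g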
Hi there, before solving this I thought to double-check it through this idea:
1) Write a small Python script which fetches all links from the Splexicon page.
2) Verify the HTTP return code (404 or 401 or ...), so we can make sure that no more broken links are there.

As I was working on this, I noticed the Splunk version hardcoded in the docs links:
1) For example, take this one: https://docs.splunk.com/Splexicon:Eventdata. This page has two more links under "For more information" in Getting Data In: "Overview of event processing" and "Assign the correct source types to your data". The clean links are:
https://docs.splunk.com/Documentation/Splunk/9.3.2/Data/Overviewofeventprocessing
https://docs.splunk.com/Documentation/Splunk/9.3.2/Data/Setsourcetype
When I was working on this last time, I remember the links would have "latest"; now the version is hard-coded as "9.3.2". So, if Splunk releases 9.3.3, may I know whether some Splunk Docs admin will manually edit/update these links? Please advise, thanks.
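A minimal sketch of the checker described in steps 1) and 2), assuming the requests library and a naive href regex (a real run would want throttling and a proper HTML parser):

import re
from urllib.parse import urljoin

import requests

START = "https://docs.splunk.com/Splexicon:Eventdata"

html = requests.get(START, timeout=30).text
links = {urljoin(START, href) for href in re.findall(r'href="([^"#]+)"', html)}

# Report anything that does not come back 200
for url in sorted(links):
    if not url.startswith("http"):
        continue
    status = requests.head(url, allow_redirects=True, timeout=30).status_code
    if status != 200:
        print(status, url)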
Hi @alec_stan , good for you, see next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated
Hi @gcusello, great, thanks.
Hi @inessa40408 , good for you, see next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated by all the contributors
Hello. Thank you for your reply. You are right, I had given little information.

We have Windows devices with a limited network map. They do not save log files for all connections to the WiFi network. The only way to get this information is to open CMD and run the command: netsh wlan show wlanreport. The report is then saved to the folder: C:\ProgramData\Microsoft\Windows\WlanReport\wlan-report-latest.html.

But this command saves the report only after manual entry on the device. We need this report to be saved constantly, for example with a frequency of once an hour, so that the file can later be loaded into Splunk to analyze the operation of, and connections to, the WiFi network. Yes, of course, we would like more information from these devices, such as: signal strength, connection drops, failed pings, and MAC addresses of the access points the devices connect to. But for now that is not a priority; I would first like to automate saving the log file to a specific folder.

I would be grateful for any ideas or advice on this. If you have any clarifying questions, do not hesitate to ask me. Thanks in advance for your answer.
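One hedged way to automate this, assuming admin rights on the devices (the task name is a placeholder): register a scheduled task that regenerates the report hourly.

schtasks /Create /TN "WlanReportHourly" /SC HOURLY /RU SYSTEM /TR "netsh wlan show wlanreport"

Then a universal forwarder could monitor the output file; the sourcetype and index names below are assumptions:

[monitor://C:\ProgramData\Microsoft\Windows\WlanReport\wlan-report-latest.html]
sourcetype = wlan_report
index = your_index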