All Posts


That is so funny, I never even thought of that. @Shan, could you confirm whether @bowesmana's reply answers your question, and accept it as the answer if so?
I tried, but the search is returning no results.
Hey man, thanks for the quick reply. I've installed the UF on the DC. So I need to change some configs on the DC, under the Splunk folder, to point back to the Splunk server? What changes need to be made? I'm guessing it's that notepad file under the splunk /etc/system/local folder. I see the outputs file is set to use my Splunk server and DeploymentClient is pointing to the Splunk server IP also. The service is running on the DC and firewall rules have been checked as well. Do I need to configure something else on the Splunk server other than the receiver landing page? In the "Choose logs from this host" field (under remote sources), when I chuck the IP of the DC in there it just keeps saying it is unable to get WMI classes from the host. Do I even need to fill out this page? Under forwarder management it says "no clients or apps are currently available on this deployment server". Does that mean I need to install the forwarder on the server too? ...and yes, just some frustration there. Cheers
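For reference, a minimal sketch of the two forwarder-side files this usually comes down to, under the UF's etc\system\local folder on the DC; <splunk_server_ip> is a placeholder, and 9997/8089 are just the default receiving and management ports, not values confirmed in this thread:

# outputs.conf - where the forwarder sends its data
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = <splunk_server_ip>:9997

# deploymentclient.conf - which deployment server the forwarder phones home to
[deployment-client]

[target-broker:deploymentServer]
targetUri = <splunk_server_ip>:8089

On the Splunk server itself, the matching piece is enabling receiving on port 9997 (Settings > Forwarding and receiving), which sounds like the receiver page already set up.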
I need to ask: if I want to move the Splunk servers to another data store (vSphere), would this affect anything regarding Splunk itself?
Can you give a bit more detail about your query, because having to use appendpipe to get dates filled in seems a little unusual. This example

| makeresults
| eval count=split("2,0,2,0,0,0,0",",")
| mvexpand count
| streamstats c
| eval _time=now() - ((7 - c) * 86400)
| fields - c

will produce this single viz whether or not you add

| timechart span=1d max(count) as count
This will partly depend on what proportion of the total data you are looking to exclude. If the excluded proctitles are a significant proportion of the data, then using a post-process where or regex clause may not perform so well, but you will have to play with that. Setting tags will still involve a search-time extraction to evaluate the tag, so under the hood the search is still being done.

You might want to look at the TERM directive - see https://conf.splunk.com/files/2020/slides/PLA1089C.pdf You will need to understand what constitutes a TERM in your data and whether that will work for your use case, but it can significantly improve performance.

When you are looking at this type of performance issue, look at the job properties in the job inspector - in particular the scan count values - the more you scan, the more data you are having to check.

You could go down the indexed extraction route, where you set a field at index time, but that is somewhat static: if you need to exclude a new proctitle, it won't help. It will improve search performance, at the cost of indexing performance and disk space.
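As a hedged illustration of the TERM idea (the index, sourcetype, field name, and proctitle values below are placeholders, not taken from this thread), the directive goes in the base search so the exclusion can use the index lexicon before any field extraction:

index=your_index sourcetype=your_sourcetype NOT (TERM(cron) OR TERM(agetty))
| stats count by proctitle

TERM only matches a whole token as the indexer broke it out (it cannot contain major breakers such as spaces), so check how your proctitle values are actually tokenised before relying on it.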
Or, based on your other question, you can set that criterion directly in the initial search, i.e. index=test event.Properties.errMessage!=OK
You will either have to show the pie chart as a trellis chart so it shows one chart for each country, or create a composite field containing both country and error message, as the pie chart can only show one dimension, i.e.

index=test
| iplocation Properties.ip
| dedup Properties.ip
| eval composite=Country.":".'event.Properties.errMessage'
| stats count by composite

(Note that iplocation creates the country field as Country, with a capital C.) The composite will then be Australia:OK and so on.
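A minimal sketch of the trellis alternative, assuming iplocation's default Country output field and the same event.Properties.errMessage field:

index=test
| iplocation Properties.ip
| dedup Properties.ip
| stats count by Country event.Properties.errMessage

Then choose the pie chart visualisation and switch on trellis layout, splitting by Country, so each country gets its own pie of error messages.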
| where 'event.Properties.errMessage' != "OK"
Create a composite field with the two labels concatenated and count by that
It appears to be related to the size of the visualisation. I changed f to be 100003138 and I still see all 6 slices, but if I simply change the size of the visualisation area of the pie chart I will see this
Hi Team, Can anyone help me with a Splunk search query to split the successful logins from the invalid ones? For example, I want to exclude OK from the search and only see locked out, invalid, and invalid parameter. Thanks
Hi, Can anyone help me with the below query? I have created a pie chart based on the error message, however I am not sure how to add country alongside it.

index=test
| iplocation Properties.ip
| dedup Properties.ip
| stats count by event.Properties.errMessage
Double wild-carded strings are not very efficient. Could you perhaps extract the "proc" values into a field and then use a where command to exclude events with the undesired values?
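A minimal sketch of that approach, assuming the raw events contain something like proctitle=<value>; the field name proc, the rex pattern, and the excluded values are all placeholders:

index=your_index sourcetype=your_sourcetype
| rex "proctitle=(?<proc>\S+)"
| where proc!="cron" AND proc!="agetty"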
I don't know the decrypt command, so this might be completely irrelevant, but is the output (emitted) field a multi-value field, and if so do you need to use mvexpand to separate out the strings that you want to filter on? Another possibility is perhaps the regex command

| regex process_decoded!="SELECT"
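A hedged sketch of the mvexpand idea, assuming the emitted field is called process_decoded and is multivalue - both assumptions, since the decrypt pipeline isn't shown here:

<your base search and decrypt pipeline>
| mvexpand process_decoded
| regex process_decoded!="SELECT"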
Try something like this

index=compare sourcetype="accountA" OR sourcetype="accountB"
| rename nameB as nameA, addressB as addressA, cellB as cellA
| eventstats count by accid nameA addressA cellA
| where count==1
Probably - the token you could try to use in the drilldown is $trellis.value$
Hello, How do I compare 2 source types within the same index and find the gap? For example: index=compare sourcetype=accountA and sourcetype=accountB; we have some account info in accountA but not in accountB, and the objective is to find that gap.

sourcetypeA
accid  nameA  addressA  cellA
002    test1  tadd1     1234
003    test2  tadd2     1256
003    test2  tadd2     5674
004    test3  tadd3     2345
005    test4  tadd4     4567
006    test5  tadd5     7800
006    test5  tadd5     9900

sourcetypeB
accid  nameB  addressB  cellB
002    test1  tadd1     1234
003    test2  tadd2     5674
004    test3  tadd3     2345
005    test4  tadd3     4567
006    test5  tadd5     9900

Output will be:
003    test2  tadd2     1256
006    test5  tadd5     7800

Any recommendation will be highly appreciated.
Hi, I have set up the Object and Event input configuration in the Salesforce TA. I am able to see the object logs but unable to see the event logs in Splunk Cloud. Any directions for triaging the issue? Appropriate permissions have been provided for the Salesforce user.
You might be wanting to configure Splunk to start at boot time.

/opt/splunk/bin/splunk enable boot-start

ref: https://docs.splunk.com/Documentation/Splunk/latest/Admin/ConfigureSplunktostartatboottime
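If Splunk runs as a non-root account, the same command usually takes a -user flag; the account name splunk below is an assumption, not something from this post:

/opt/splunk/bin/splunk enable boot-start -user splunk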