All Posts

@Wiessiet - Thanks for posting your findings on the Splunk community. One suggestion, though: since you already have a solution to the problem, post it as a question, then post a reply to your own question with the solution and accept that answer. That way other people see the question as resolved, with the solution available straight away. I hope this makes sense!
@Anit_Mathew - If it is a default Splunk ES dashboard with no changes made to it, and it is still giving you an error, you can raise a Splunk support ticket for it.
@mpc7zh - Downgrading never works the same way as upgrading. Will the commands work? Yes, maybe, but it can break functionality: SOAR may not work properly, and it may even cause issues when you upgrade again in the future. Splunk support might be able to help you with this; you can raise a support ticket for it. But my question to you is: why do you need to downgrade in the first place? Quite often, the thing you need to do is possible even without downgrading SOAR. I hope this helps! Kindly upvote if it does!
@JagsP - I don't see this app on Splunkbase. Where did you get it? (I'm asking because the error is coming from the Python code within the app.) I hope this helps!
@cybersunny - Can you please post the part of the dashboard XML that includes this mailto? (You can hide any sensitive information.)
@ryanaa - I think this question is better suited to the Machine Learning community.
@joe2 - I would like to clarify a few points, and I think you will get the idea of how you can do something like this. Your query-1 is not working because you seem to be using the old query, and the macro from that old query apparently no longer exists. The new query is based on firewall data: https://research.splunk.com/network/1ff9eb9a-7d72-4993-a55e-59a839e607f1/ Because it depends on firewall traffic data, it only works when you have a firewall between those two machines. It could be a traditional firewall, an AWS firewall, or anything similar. For your query-2, again you are looking for source=firewall* data, and Windows data does not contain that sourcetype, which is why you are seeing no results.   Summary: If you have a traffic-monitoring device between those two machines, use its traffic logs for the detection: https://research.splunk.com/network/1ff9eb9a-7d72-4993-a55e-59a839e607f1/ https://research.splunk.com/network/3141a041-4f57-4277-9faa-9305ca1f8e5b/ If you don't, the only option is to have something on the Windows victim machine that logs all traffic to it, and build the query from those traffic logs.   I hope this helps! Kindly upvote if it does!
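As a rough illustration of the traffic-log approach (the sourcetype, field names, and placeholder IPs below are assumptions for the sketch; the actual detections at the links above are more specific):

  sourcetype=firewall* src_ip=<attacker_ip> dest_ip=<victim_ip>
  | stats count AS connection_count values(dest_port) AS dest_ports BY src_ip dest_ip

Swap in whatever sourcetype and field names your firewall or traffic-monitoring device actually produces.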
@darling - It is a publicly released app now, so I don't think there should be any restriction on which Cloud stack you can install it on. For better clarity, though, you can contact Splunk Cloud support and they should resolve your confusion. I hope this helps!
Your solution works perfectly. I still need to do some wider testing to make sure there are no gaps, but it looks like exactly what I need. The only issue is, I'm not sure *exactly* how it works. I know what fillnull and eval do, but the way you've used mvfilter confuses me. If you have the time, could you explain in simple terms how your solution works, please?
Thank you for your help.   For example, if I have an MV field with the values "red", "blue", "N/A", "N/A", I would want to filter out the "N/A" values. However, if instead I have an MV field with the single value "red", then I would want it left alone. And third, if I have an MV field with the values "N/A", "N/A", and "N/A", then I would also want it left alone. Only when an MV field contains both "N/A" and a non-N/A value do I want the "N/A" values removed.
Hi, could you check these requirements? For example, does your CPU support the AVX/AVX2 instructions, and if so, are they enabled? https://docs.splunk.com/Documentation/Splunk/9.4.0/Admin/MigrateKVstore https://www.mongodb.com/docs/manual/administration/production-notes/ I hope this helps.
OK. If your initial stats doesn't include the _time field, there's nothing to bin. That's why you're getting no results.
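To illustrate (field names here are assumptions): stats only keeps the fields it aggregates or groups by, so if _time is not in the BY clause it is gone and a later bin has nothing to work on. Keeping _time in the BY clause avoids that:

  <your_search>
  | bin span=1d _time
  | stats count BY _time, host

Binning before the stats, as above, also means each day collapses to one row per host.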
What do you mean by "check"? Do you filter your initial results so that you keep only those where field F contains at least two values, of which one is 'N/A' and one isn't? Or do you want to do a conditional evaluation, where all other values which do not contain 'N/A' are left as they were?
| eval filtered=mvfilter(mvfield!="N/A") | fillnull value="N/A" filtered
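In case it helps to see why this works: mvfilter removes every "N/A" value; if the field contained only "N/A" values, the result becomes null, and fillnull then restores an "N/A" in its place. A small sketch using makeresults with made-up sample data (the field name mvfield is an assumption):

  | makeresults
  | eval mvfield=split("red,blue,N/A,N/A", ",")
  | eval filtered=mvfilter(mvfield!="N/A")
  | fillnull value="N/A" filtered

With this sample, filtered ends up holding red and blue; if mvfield had held only "N/A" values, filtered would end up as a single "N/A".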
@AJH2000  Try fetching logs manually using Postman or curl. If you get a 200 OK response with logs, the API is working. If there are authentication errors, check your token and API permissions.
Hi @pedropiin, your stats and bin statements are wrong; please try this:

<your_search>
| bin span=1d _time
| eval var1=...
| eval var2=...
| sort var2
| eval var3=...
| stats count(var1) AS var1 count(var2) AS var2 count(var3) AS var3 BY _time

About the sensitive information: you can mask it; what interests me is only the event structure and the field extractions. Ciao. Giuseppe
@dtaylor  Check this: it tests whether a multivalue field contains both "N/A" and at least one non-"N/A" value. If both conditions are met, it removes the "N/A" values and returns the remaining ones; otherwise, it keeps the field unchanged.
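A sketch of that conditional logic in SPL (the field name mvfield is an assumption; mvcount of an empty filter result is null, which fails the >0 test, so fields holding only "N/A" values are left alone):

  | eval filtered=if(mvcount(mvfilter(mvfield=="N/A"))>0 AND mvcount(mvfilter(mvfield!="N/A"))>0, mvfilter(mvfield!="N/A"), mvfield)

This keeps single-value fields and all-"N/A" fields untouched, and strips "N/A" only when both kinds of value are present.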
I've been smashing my head against this issue for the past few hours. I need to check a multivalue field to see if it contains "N/A" *and* any value that isn't "N/A". If this is true, I need to filter out whatever "N/A" values exist within the field and return the remaining non-N/A values as a multivalue field.
Thank you for your reply. I have found the cause of the issue: there was a mistake in executing the query in the app, which does not have permission to access the database on Splunk. Thanks again!
I'm also assuming that you've already set maxKBps = 0 in limits.conf: # $SPLUNK_HOME/etc/system/local/limits.conf [thruput] maxKBps = 0