All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Thanks for the advice. Well, after working with Splunk for 10+ years, I frankly don't agree with the "simple string-based manipulation that Splunk can do in the ingestion pipe" characterization; I'd say I've seen amazing (to the extent of crazy) things done with props and transforms. That said, Splunk might not be able to do exactly what I'm after here, but I'm willing to spend time trying anyway, as this will have a major impact on performance at search time. Yes, there is some metadata that needs to stay with each event to be able to find them again. I have some ideas in my head on how to twist this, but right now I'm on vacation and can't test them for the next week or so, so I'm just "warming up" and looking for / listening in on others' crazy ideas about what they have achieved in Splunk.
Yes, I used that image but it still didn't work. Thanks for sharing the documentation.
Hello, I am confused about the "Expires" setting when creating an alert. I have my alert scheduled every day with Expires = 24 hours; does that mean that after 24 hours the alert will no longer run? Thank you.
Is there a native way to run scripts in a pwsh.exe-managed environment? It's not mentioned in the docs, so I believe not: https://docs.splunk.com/Documentation/Splunk/9.3.1/Admin/Inputsconf   We all know there is [powershell://<name>] in inputs.conf to run "classic" PowerShell scripts. In fact, it runs the script in the "classic" PowerShell environment. Depending on which Windows version/build the Universal Forwarder is installed on, that will be a PowerShell version up to 5.1 (which, by the way, is managed by the powershell.exe binary). But now we have a brand-new PowerShell Core (managed by a different binary: pwsh.exe). PowerShell Core has new features not available in "classic" PowerShell, and the two are not 100% compatible. Additionally, PowerShell Core is platform agnostic, so we can install it on Linux and run PowerShell Core based scripts there (don't ask me why anyone would do that, but it's possible). Currently I'm running PowerShell Core scripts by starting a batch script in the cmd environment; cmd then starts pwsh.exe with the parameters needed to run my PowerShell Core based script. Not elegant at all.
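For reference, the workaround described above (a scripted input wrapping pwsh.exe via a batch file) could be sketched roughly like this. Everything here is illustrative and untested: the stanza name, script names, interval, sourcetype, and the pwsh.exe install path are all assumptions.

```
# inputs.conf on the Universal Forwarder (hypothetical stanza)
[script://.\bin\run_core_script.cmd]
interval = 300
sourcetype = pwsh_core_output
disabled = 0
```

```
@echo off
REM run_core_script.cmd - hypothetical wrapper; adjust the pwsh.exe path for your install
"C:\Program Files\PowerShell\7\pwsh.exe" -NoProfile -NoLogo -File "%~dp0my_core_script.ps1"
```

The wrapper is only needed because inputs.conf's [powershell://] stanza is hardwired to the classic engine; a plain [script://] stanza runs whatever executable the batch file hands off to.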
Hi @sainag_splunk, I have reconfigured HEC and I am able to send data to the HEC indexer via Postman. I had configured the OTEL collector to send to HEC, but I am not able to see data from the OTEL collector. Can you please suggest where I went wrong? Thank you in advance. Regards, Eshwar
Eventstats should work as well (streamstats obviously relies on the order of results, which is why I'm sorting on index so that the "payload" events come before the "joining" events; if your indexes are named differently, you need to adjust this sort). Both commands have their own limitations, and which approach is more effective will probably depend on the particular use case.
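As a sketch, the eventstats variant of the same idea would drop the sort and replace the two streamstats lines with something like this (untested; field names are taken from the run-anywhere example elsewhere in the thread):

```spl
| eventstats values(eventOrigin) AS curEventOrigin by curEventId
| eventstats values(eventOrigin) AS prevEventOrigin by prevEventId
| where isnotnull(index1Id)
```

Since eventstats computes its aggregate over the whole result set rather than as a running value, the ordering of events no longer matters, which is why the sort on index can be removed.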
Nice work @PickleRick! An imaginative approach! I tried out your solution and it appears to work if you replace streamstats with eventstats. It feels like that should work to me, and eventstats feels more efficient than streamstats. Any thoughts?
I would also like to check my .crt certificates from my own Microsoft CA, is that possible?
Hello, friends! I tried to change the height of the gap between these components (see screenshot), but in Edit Dashboard I didn't find anything to change this. Thank you, guys!
I recognize PPS logs. But seriously - mvindex does not assign anything within a multivalue field. It picks one (or more) of the values from an mvfield. As a general remark, multivalue fields are really tricky to work with, and if you need to correlate between separate multivalue fields (and I suspect you're aiming at something like that)... this is not going to end well. What is the business case and the actual data? Maybe it can be dealt with differently? EDIT: But yes, mvindex can be indexed with dynamically assigned values. A run-anywhere example: | makeresults | eval mv=mvappend("a1","a2","a3") | eval index=mvfind(mv,"a2") | eval value=mvindex(mv,index)
@yuanliu Thank you for your response. I tried the query below, but it doesn't seem to be working. When I further cut down the query for testing, it looks like "| where index!=B" is not working. Everything before this clause works, but when I add this condition, I get 0 results. Also, the query seems to be very resource-intensive. My index A has close to 70k events and index B has around 10k events. Splunk crashed a few times when I tried to run the query. Any suggestions on how to address this?
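One possible explanation for the 0 results, offered as an assumption since the full query isn't shown: in a `where` clause, an unquoted token such as B is treated as a field name rather than a string literal, so index!=B compares the index field against a (likely nonexistent) field named B, and a comparison against a missing field never evaluates to true. A quoted literal, or the search command (which does treat bare tokens as strings), sidesteps this:

```spl
| where index!="B"
```

```spl
| search index!=B
```

If the real index name differs, substitute it accordingly; the point is only the quoting behavior of where versus search.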
So, I didn't find how to use a base search, and I just decided to proceed with a simple query in the Search page as well. P.S. The stuff between backticks (`) are macros; you can check them here: https://itsi-*.splunkcloud.com/en-GB/manager/itsi/data/macros . Interesting things, but not helpful for me right now. Thank you, friend! Maximiliano Lopes
I'm not sure how to do that. Let me figure out how to search in a dashboard panel. I'm new to Splunk and still learning.
Hi gcusello, thank you for the reply and support. You were right: there were 2 versions of the app deployed in different directories. How should I proceed to safely remove the unwanted application from Splunk deployed on cloud? Thank you! BR
Good morning ITWhisperer, Thank you very much for the prompt response. The example I provided was for description purposes. The question is generally about using eval mvindex where the number of fields (with the same name) changes depending on circumstances. For simplicity, let's presume we have some logs with an "Action" field. The Action field may appear several times in a log, having different values. We do not know exactly how many Action fields we have in a particular event. As I said, it could be one, two, three or even 10. That's the challenge. I need to be able to operate on those fields, but each of them will represent a different step:

Event 1: Action: scan Action: forward-sandbox Action: Release Action: Relay

Event 2: Action: scan Action: Release Action: Relay

Event 3: Action: scan Action: Reject

In the example above, we have events containing Action fields. However, depending on the actions taken, the number of those fields will vary. Therefore, it is difficult for me to use mvindex. I know how to use mvindex where the number of fields with the same name, or of multivalue fields, is known. In our case, we do not know how many occurrences of Action we have in a given event. I hope this makes sense?

Kind regards,

Mike.
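A hedged sketch of how a varying number of Action values might be handled with mvcount and relative indexing (the field name Action comes from the post above; everything else is illustrative):

```spl
| eval action_count=mvcount(Action)          ``` how many actions this event has ```
| eval first_action=mvindex(Action, 0)       ``` always the first step ```
| eval final_action=mvindex(Action, -1)      ``` negative index counts from the end ```
| eval release_pos=mvfind(Action, "Release") ``` null if no Release step occurred ```
```

Because mvindex accepts negative offsets and mvfind returns the position of a matching value, neither requires knowing the count of Action values in advance.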
OK. Got it. A run-anywhere search including mockup data | makeresults format=csv data="index,index1Id,curEventId,prevEventId,eventId,eventOrigin index1,23,11,13,, index1,34,12,14,, index1,35,12,16,, index1,65,17,11,, index1,88,15,12,, index2,,,,11,1 index2,,,,12,2 index2,,,,13,3 index2,,,,14,4 index2,,,,15,5 index2,,,,16,6 index2,,,,17,7" ```This is just mockup data preparation; now the fun begins``` ```We make two EventId fields from our original one (we can't use rename because we don't want to overwrite the values in the "joining" events with null values)``` | eval curEventId=if(index="index1",curEventId,eventId) | eval prevEventId=if(index="index1",prevEventId,eventId) ```And now we "copy over" the values from "single side" results into the compound "both sides" result``` ```Be cautious about streamstats limitations``` | sort - index | fields - index | streamstats values(eventOrigin) AS curEventOrigin by curEventId | streamstats values(eventOrigin) AS prevEventOrigin by prevEventId ```We only need the combined results, not the partial ones``` | where isnotnull(index1Id) ```clear empty fields``` | fields - eventId eventOrigin
The add-on worked fine until the upgrade to 9.3.1; exports to Azure are now halted with an error message: CRITICAL Could not connect to Azure Blob: NameError("name 'BlobServiceClient' is not defined"). We've deployed the latest 2.4.0 version available from Splunkbase.
You could filter for the errors, extract the customerid, and count by customerid. Then determine the percentage of all the errors each customerid has, and alert if this percentage is greater than a nominal value.
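The steps above could be sketched in SPL along these lines (the index name, the rex extraction, and the 10% threshold are all assumptions to be adapted to the actual data):

```spl
index=app_logs "ERROR"
| rex "customerid=(?<customerid>\d+)"           ``` extract customerid if not auto-extracted ```
| stats count AS errors by customerid           ``` error count per customer ```
| eventstats sum(errors) AS total_errors        ``` overall error count across all customers ```
| eval error_pct=round(100*errors/total_errors, 2)
| where error_pct > 10                          ``` nominal threshold; tune as needed ```
```

Saved as an alert, the search would then only return rows (and thus trigger) when some customer exceeds the chosen share of all errors.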
I have the same situation when trying to migrate my Splunk from an old RHEL 6.9 server to a new RHEL 8.8 server. I did a complete rsync and created the splunk user on the new server, ran rpm -i of the same version on the new server, and it completed with zero errors. However, when I went to /opt/splunk/bin and ran ./splunk start, it showed the same error as yours. May I know if you have any update? Did you fix it?