All Posts



Hi, I am looking to parse nested JSON events; basically I need to break them into multiple events. I am trying something like this, but it just duplicates the same record across multiple lines.

| spath path=list.entry{}.fields output=items
| mvexpand items

I am looking to get all key/value pairs as a single event under "fields".

Sample record:

{
  "total": 64,
  "list": {
    "entry": [
      {
        "recordId": 7,
        "created": 1682416024092,
        "id": "e70dbd86-53cf-4782-aa84-cf28cde16c86",
        "fields": {
          "NumDevRes001": 11111,
          "NumBARes001": 3,
          "lastUpdated": 1695960000000,
          "engStartDate": 1538452800000,
          "RelSupport001": 0,
          "UnitTest001": 0,
          "Engaged": 1,
          "ProdGroup001": 1,
          "QEResSGP001": 0.5,
          "QEResTOR001": 1,
          "QEResLoc001": 3,
          "SITBugs001": 31,
          "QEResIND001": 5,
          "QEResLoc003": 3,
          "QEResLoc002": 3,
          "Project": "Registration Employee Directory Services",
          "AutoTestCount001": 1657,
          "AppKey001": "ABC",
        },
        "ownedBy": "TEST1"
      },
      {
        "recordId": 8,
        "createdBy": "TEST2",
        "created": 1682416747947,
        "id": "91e88ae6-0b64-48fc-b8ed-4fcfa399aa3e",
        "fields": {
          "NumDevRes001": 22222,
          "NumBARes001": 3,
          "lastUpdated": 1695960000000,
          "engStartDate": 1538452800000,
          "RelSupport001": 0,
          "UnitTest001": 0,
          "Engaged": 1,
          "ProdGroup001": 1,
          "QEResSGP001": 0.5,
          "QEResTOR001": 1,
          "QEResLoc001": 3,
          "SITBugs001": 31,
          "QEResIND001": 5,
          "QEResLoc003": 3,
          "QEResLoc002": 3,
          "Project": "Registration Employee Directory Services",
          "AutoTestCount001": 1657,
          "AppKey001": "ABC",
        },
        "ownedBy": "TEST2"
      }
    ]
  }
}
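A common pattern for splitting a JSON array into separate result rows (a sketch, using the field names from the sample above; not tested against your data) is to expand the raw array entries first and only then re-parse each one, so every row carries its own entry:

```
| spath path=list.entry{} output=entry
| mvexpand entry
| spath input=entry
| table recordId ownedBy fields.*
```

The key difference from `spath path=list.entry{}.fields` is that `output=entry` keeps each array element as a complete JSON object, and the second `spath input=entry` then extracts the key/value pairs per row instead of duplicating one merged record.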
Hello, I'm trying to calculate the ratio of two fields but I'm getting wrong results. If I calculate each of them separately I get the right results, but together something is wrong.

index=clientlogs sourcetype=clientlogs Categories="*networkLog*" "Request.url"="*v3/auth*" Request.url!=*twofactor* "Request.actionUrl"!="*dev*" AND "Request.actionUrl"!="*staging*"
| eval UserAgent = case(match(UserAgent, ".*ios.*"), "iOS FE", match(UserAgent, ".*android.*"), "Android FE", 1=1, "Web FE")
| dedup UserAgent, _time
| stats count as AttemptsFE by UserAgent _time
| appendcols [search index=clientlogs sourcetype=clientlogs Categories="*networkLog*" "Request.url"="*v3/auth*" Request.url!=*twofactor* "Request.actionUrl"!="*dev*" AND "Request.actionUrl"!="*staging*" "Request.status" IN (201, 207) NOT "Request.data.twoFactor.otp.expiresInMs"="*"
    | eval UserAgent = case(match(UserAgent, ".*ios.*"), "iOS FE", match(UserAgent, ".*android.*"), "Android FE", 1=1, "Web FE")
    | dedup UserAgent, _time
    | streamstats count as SuccessFE by UserAgent _time]
| eval SuccessRatioFE = round((SuccessFE/AttemptsFE)*100, 2)
| eval SuccessRatioFE = (SuccessFE/AttemptsFE)*100
| timechart bins=100 avg(SuccessRatioFE) as SuccessRatioFE BY UserAgent
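One common way to avoid the row-alignment problems of `appendcols` (a sketch, not your exact search: the dedup is omitted and the `span` value is illustrative) is to compute attempts and successes in a single pass, flagging successes with an eval instead of a second search:

```
index=clientlogs sourcetype=clientlogs Categories="*networkLog*" "Request.url"="*v3/auth*" Request.url!=*twofactor* "Request.actionUrl"!="*dev*" "Request.actionUrl"!="*staging*"
| eval UserAgent = case(match(UserAgent, ".*ios.*"), "iOS FE", match(UserAgent, ".*android.*"), "Android FE", 1=1, "Web FE")
| eval isSuccess = if(in('Request.status', "201", "207") AND isnull('Request.data.twoFactor.otp.expiresInMs'), 1, 0)
| bin _time span=10m
| stats count as AttemptsFE sum(isSuccess) as SuccessFE by UserAgent _time
| eval SuccessRatioFE = round(SuccessFE/AttemptsFE*100, 2)
| timechart avg(SuccessRatioFE) as SuccessRatioFE by UserAgent
```

Because both counts come from the same event set, SuccessFE and AttemptsFE are guaranteed to line up per UserAgent and time bucket, which `appendcols` does not guarantee.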
I'm planning to start an integration between Splunk and the ESET endpoint security cloud platform, but I am facing the following issue: the syslog-ng server started receiving unclear/encrypted logs from ESET endpoint security, so the logs appear on the HF server like this:

^A^B  ^L 7 ^] ^W  ^^  ^Y  ^X # ^W (^D^C^E^C^F^C^H^G^H^H^H ^H 2

I think I need to decrypt the logs when they are received by syslog-ng, because Splunk can't handle any decryption process. I need help with how to decrypt the logs in syslog-ng.
It is not clear what your criteria are for determining what an anomaly is. Also, from your example, you don't need to combine the fields; you could just do something like this:

| stats sum(error) as count by Svc Cust Evnt
| sort -count
Hi @MattHatter
Did you find a solution to this? We had exactly the same problem, and we managed to get it resolved by editing the lookup_edit file (under Settings - User Interface - Views) as follows:

<view template="lookup_editor:/templates/generic.html" type="html" isDashboard="true" isVisible="true">
<label> Lookup edit </label>
</view>
If this is not working, please share your exact dashboard source as something might have been lost in converting the answer to your solution.
Trying to find anomalies for events. I have multiple services and multiple customers. I have an error "bucket" that is capturing events for failures, exceeded, notified, etc. I'm looking for a way to identify when there are anomalies or outliers for each of the services/customers. I have combined (eval) service, customer, and the error, and I am just counting the number of error events generated by each service/customer. So, for example, services svcA, svcB, svcC and customers custA, custB, custC would give:

svcA-custA-failures 10
svcA-custA-exceeded 5
svcA-custA-notified 25
svcB-custA-failures 11
svcB-custA-exceeded 9
svcB-custA-notified 33
svcB-custB-failures 3
svcA-custB-exceeded 7
svcA-custB-notified 22
svcA-custC-exceeded 8
svcA-custC-failures 3
svcA-custC-notified 267
svcC-custC-exceeded 1
svcC-custC-failures 4
svcC-custB-notified 145
svcC-custA-notified 17

Something along the lines of this:

| eval Svc-Cust-Evnt=Svc."-".Cust."-".Evnt
| stats sum(error) by Svc-Cust-Evnt
| rename sum(error) as count
| sort -count
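One simple statistical approach to "outlier" here (a sketch; the threshold of 2 standard deviations and the field names are illustrative, not a recommendation for your data) is to compare each service/customer/event count against the average for that combination and flag counts with a high z-score:

```
| stats count by Svc Cust Evnt
| eventstats avg(count) as avg_count stdev(count) as stdev_count by Svc Evnt
| eval zscore = round((count - avg_count) / stdev_count, 2)
| where abs(zscore) > 2
| sort - zscore
```

The `by Svc Evnt` clause on `eventstats` makes the baseline per service and event type, so a customer is flagged only when it deviates from its peers; adjust the grouping to match whatever "normal" means in your environment.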
You can do that "indexer stuff" with the forwarder licence; the only thing missing is indexing. You only need to open management access, normally port 8089/tcp. Then, if/when you want to monitor those instances with the MC, the MC (and the LM) also needs access to that same port; add them to the MC as indexers and create your own groups for them, etc.
Thank you for your reply. I was looking into the SaaS docs, and my future set-up will be on SaaS. Best regards
@richgalloway
I'm trying to exclude fields like user_watchlist and ip_options under the vpn index, etc. Can you please share the props.conf and transforms.conf to exclude the above fields by creating a custom app? Thanks
Hi @jwalrath1,
let us know if we can help you more, or, please, accept one answer for the other people of the Community.
Ciao and happy splunking
Giuseppe
P.S.: Karma Points are appreciated by all the contributors
That is really interesting, and you are right. I tried these variants:

C:\Windows\system32\cmd.exe /d /c C:\ProgramData\Symantec\Symantec Endpoint Protection\14.3.8289.5000.105\Data\Definitions\WebExtDefs\20230830.063\webextbridge.exe*
C:\Windows\system32\cmd.exe /d /c C:\ProgramData\Symantec\Symantec Endpoint Protection\14.3.8*
C:\Windows\system32\cmd.exe /d /c C:\ProgramData\Symantec\Symantec Endpoint Protection\*\webextbridge.exe*

The top two do not work; the last does. If I make the second one end in 14.3.* then it DOES work. Not sure what's going on there.
You would either have to include that subsearch part as an OR in the outer search and munge the data so you could join the data sets with stats somehow, or create a lookup through a saved search on a regular basis (if it changes) and use the lookup to filter rather than the subsearch; then you'd have everything you need.
I have a search and a subsearch that work as required, but there is a field in the subsearch that I want to display in the final table output, and it is not a field to be searched on.

index=aruba sourcetype="aruba:stm" "*Denylist add*" OR "*Denylist del*"
| eval stuff=split(message," ")
| eval mac=mvindex(stuff,4)
| eval mac=substr(mac,1,17)
| eval denyListAction=mvindex(stuff,3)
| eval denyListAction=replace(denyListAction,":","")
| eval reason=mvindex(stuff,5,6)
| search mac="*:*"
    [ search index=main host=thestor Username="*adgunn*"
    | dedup Client_Mac
    | eval Client_Mac = "*" . replace(Client_Mac,"-",":") . "*"
    | rename Client_Mac AS mac
    | fields mac ]
| dedup mac,denyListAction,reason
| table _time,mac,denyListAction,reason

What I want is for the value held in the field Username to be included in the table command of the outer search. How do I pass it from the subsearch so it is used in the table command and not used as part of the search? Thanks.
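One common workaround (a sketch based on the search above, not a tested solution; note that join has subsearch result limits, and MAC case/format may need normalizing with lower() on both sides) is to keep the filtering subsearch as-is and then re-attach Username with a left join on mac at the end of the pipeline:

```
| join type=left mac
    [ search index=main host=thestor Username="*adgunn*"
    | dedup Client_Mac
    | eval mac = replace(Client_Mac, "-", ":")
    | fields mac Username ]
| table _time, mac, denyListAction, reason, Username
```

This runs the inner search a second time, so a lookup populated by a scheduled search (as suggested elsewhere in this thread) scales better if the Username mapping is large or changes slowly.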
(Update) Use tojson with transpose.

| eval sha256 = sha256(_raw)
| transpose 0 header_field=sha256
| search column=_raw
| fields - column
| tojson default_type=json
| fields _raw

Your sample data thus gives _raw:

{"13a485b005f3ef9af9d1e9326223f5f86d60ff1d9677d0f5e4749f91ad650227":{"key1":"val1","key2":"val2"},"b92a2ad0ea51aa55a9b298a752a6de0997c96324b3c4e74ec8d4876af490d67a":{"key1":"val1a","key2":"val2a"}}

I think this is closer to what you ask.

Another method (initial attempt): use json_set in foreach, assuming the "event" you described is _raw. (It works the same if they are in a different field such as "event"; just replace _raw with "event".)

| stats values(_raw) as event
| eval consolidated = json_object()
| foreach event mode=multivalue [eval consolidated = json_set(consolidated, sha256(<<ITEM>>), <<ITEM>>)]

Your sample events will give, as event:

{ "key1": "val1", "key2":"val2"}
{ "key1": "val1a", "key2":"val2a"}

and as consolidated:

{"13a485b005f3ef9af9d1e9326223f5f86d60ff1d9677d0f5e4749f91ad650227":"{ \"key1\": \"val1\", \"key2\":\"val2\"}","b92a2ad0ea51aa55a9b298a752a6de0997c96324b3c4e74ec8d4876af490d67a":"{ \"key1\": \"val1a\", \"key2\":\"val2a\"}"}

Drawback: this produces an embedded JSON string (as opposed to a JSON object) as the value of each sha256 key.

Here is an emulation you can play with and compare with real data:

| makeresults
| eval data = mvappend("{ \"key1\": \"val1\", \"key2\":\"val2\"}", "{ \"key1\": \"val1a\", \"key2\":\"val2a\"}")
| mvexpand data
| rename data AS _raw
``` data emulation above ```
Nice ref. Thanks, @bowesmana! (Looks like it has been around since version 8.)
Hi @Shalini ... Can you update us on the current license's duration (expiry date)?

Upgrade order:
Step 1) From your current 7.2, you should upgrade to 8.1.x.
Step 2) Then, from 8.1.x, you should upgrade to 9.1.x.

https://docs.splunk.com/Documentation/Splunk/9.1.1/Installation/AboutupgradingREADTHISFIRST
Hi @bowesmana,
Thank you for clarifying that Splunk lookups do not support regex patterns. I have just attempted to include the following event in the Splunk lookup, with a wildcard at the end, in order to match other events occurring after "webextbridge.exe". But it looks like it is not working.

Original event:
C:\Windows\system32\cmd.exe /d /c C:\ProgramData\Symantec\Symantec Endpoint Protection\14.3.XXXX.5000.105\Data\Definitions\WebExtDefs\20230830.063\webextbridge.exe chrome-extension://XXXXXXXXXXXXXXXXXXXXXXXXXXXXX/ --parent-window=0 < \\.\pipe\chrome.nativeMessaging.in.XXXXXXXXXXXa3 > \\.\pipe\chrome.nativeMessaging.out.10f754de9b9001a3

Splunk lookup table field value:
"C:\Windows\system32\cmd.exe /d /c C:\ProgramData\Symantec\Symantec Endpoint Protection\14.3.8289.5000.105\Data\Definitions\WebExtDefs\20230830.063\webextbridge.exe*"

Regards
VK
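For wildcard matching inside a lookup, the lookup definition itself has to opt in: by default lookups do exact matching, and a literal `*` in the table is not treated as a wildcard. This is set with `match_type` in transforms.conf (or under Advanced options in the lookup definition UI). A sketch, where the stanza name, CSV filename, and field name are illustrative:

```
# transforms.conf
[process_path_lookup]
filename = process_path_lookup.csv
match_type = WILDCARD(process_path)
case_sensitive_match = false
```

With `WILDCARD(process_path)` enabled, a table value ending in `webextbridge.exe*` should match the longer command lines; without it, that row only matches events equal to the literal string, asterisk included.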
Should be possible, I think. Could you please copy and paste the dashboard's HTML here (removing/anonymizing hostnames and other important details)?
>>> Actually you don't need that communication at all; you could change the HF license mode to use the forwarder licence, with which it can use all HF features to forward events to the next full Splunk instances (HF, UF or indexer). It can just forward, not index anything.

Yes @isoutamo, we considered that idea. But the HF does some "preprocessing" (field extractions, etc.) of the logs, right? So if we use the HF just like a UF (only for forwarding the logs), then the indexer's job is the same as if we didn't have the HF at all, right? (I mean, the indexer would need to do the full job of processing the logs.)

EDIT >>> the HF - LM communication is always one way, from HF to LM, never the other way.

You mean the HF sends a request to the LM asking for the license info and then takes care of its own job, and there is no need for the LM to request/send anything to/from the HF? OK, simple question: between the HF and the LM, please confirm the ports configuration. Thanks @isoutamo, karma points given appreciating your response. Thanks again.
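For reference, a forwarder is pointed at the license manager over the management port (8089/tcp by default) in server.conf; the HF initiates the connection, which matches the "one way, HF to LM" description above. A sketch, where the hostname is illustrative (and note that in newer Splunk releases the setting is named manager_uri rather than master_uri):

```
# server.conf on the heavy forwarder
[license]
master_uri = https://license-manager.example.com:8089
```

So the only firewall rule needed for licensing is HF -> LM on 8089/tcp; the LM never opens a connection back to the HF.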