All Topics


Hi all, I am new to Splunk and would appreciate some community wisdom. We are trying to get data from an external AWS S3 bucket (hosted and managed by a third-party supplier) into our internal Splunk Enterprise instance. We do not have any AWS accounts. We have considered IP whitelisting, but it is not secure enough, and the supplier does not use AWS Firehose. Any ideas?
@LukeMurphey I'm trying to run the File/Directory Information Input app (v1.4.5) on a universal forwarder. It's a Windows server and I've installed the latest version of Python 3 (and set the app to use Python 3). I keep getting the same three errors in splunkd (copied from another post, as my system is isolated):

09-18-2019 10:47:10.099 +0200 ERROR ModularInputs - Introspecting scheme=file_meta_data: Unable to run "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\file_meta_data\bin\file_meta_data.py" --scheme": child failed to start: The system cannot find the file specified.
09-18-2019 10:47:10.356 +0200 WARN UserManagerPro - Can't find [distributedSearch] stanza in distsearch.conf, using default authtoken HTTP timeouts
09-18-2019 10:47:10.356 +0200 ERROR ModularInputs - Unable to initialize modular input "file_meta_data" defined in the app "file_meta_data": Introspecting scheme=file_meta_data: Unable to run "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\file_meta_data\bin\file_meta_data.py" --scheme": child failed to start: The system cannot find the file specified.

(Except mine says python3.exe instead of python.) Other posts with these errors came from systems without Python installed, and one mentioned an incorrect PATH environment variable but didn't elaborate. My PATH is set with the two default values from the installer, if that matters.
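A minimal diagnostic sketch, assuming the forwarder service runs as Local System: confirm the interpreter resolves on that account's PATH and that the script starts outside Splunk (psexec is just one way to get a SYSTEM shell; substitute whatever you use):

  :: from a shell running as the same account as splunkd (e.g. psexec -s -i cmd.exe)
  where python3.exe
  python3.exe "C:\Program Files\SplunkUniversalForwarder\etc\apps\file_meta_data\bin\file_meta_data.py" --scheme

If "where" fails there, the service account does not see the same PATH as your interactive session, which would explain "The system cannot find the file specified".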
Hi, I am trying to use the hyperlink Markdown you shared with someone else, but when I add [Markdown Guide](https://www.markdownguide.org) inside the email body of a send email action (SMTP), it renders exactly as typed: plain text, no hyperlink. Could anybody help me figure out how to get a hyperlink to show in the body of the send email action? I am on version 5.3.2.88192. I also tried the <a> tag with an href, and that doesn't work either.
Hi Splunk gurus. As you can see below, the len() function is not working as expected on non-English words. I checked old posts and the documentation, but no luck. Any suggestions please? Thanks.

| makeresults
| eval _raw="இடும்பைக்கு"
| eval length=len(_raw)
| table _raw length

This produces:

_raw          length
இடும்பைக்கு   11

(The word இடும்பைக்கு is actually 6 characters, not 11.)
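For context, len() counts Unicode code points, and இடும்பைக்கு carries combining marks (vowel signs and the puḷḷi) on top of 6 base letters, hence 11. A rough sketch that approximates the perceived length by stripping combining marks with the PCRE \p{M} class before counting; this is an approximation, not true grapheme segmentation:

  | makeresults
  | eval _raw="இடும்பைக்கு"
  | eval codepoints=len(_raw)
  | eval approx_length=len(replace(_raw, "\p{M}", ""))
  | table _raw codepoints approx_length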
Hi, I want to remove unwanted events from my data, to keep ingestion as clean as possible and for better line breaking. These events are not like regular logs: they are generated by a script that runs every day at 12 AM, and the log file rotates and gets renamed to logfile.log.<date>. Here are the patterns I want to remove using SEDCMD, applied in props.conf:

[09/13/2023 00:00:00]       <Event was scheduled based on job definition.>
[10/12/2023 23:58:01]       <Executing at CA_AGENT>
[11/12/2023 23:58:01]        ----------------------------------------
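Note that SEDCMD rewrites event text rather than discarding events; to drop whole matching events at parse time, a nullQueue transform is usually the better fit. A minimal sketch, with the sourcetype and stanza names as placeholders (adjust the regex to your exact lines):

  # props.conf
  [your_sourcetype]
  TRANSFORMS-drop_noise = drop_scheduler_noise

  # transforms.conf
  [drop_scheduler_noise]
  REGEX = ^\[\d{2}/\d{2}/\d{4} \d{2}:\d{2}:\d{2}\]\s+(<Event was scheduled based on job definition\.>|<Executing at CA_AGENT>|-{10,})
  DEST_KEY = queue
  FORMAT = nullQueue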
How do I display a one-row table as a pie chart? Thank you for your help.

index=test ---- Score calculation -----
| table Score1, Score2, Score3, Score4

Score1  Score2  Score3  Score4
70      50      60      90

My expected result is a pie chart with one slice per score.
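A minimal sketch: the pie chart wants one row per slice, so transpose the single row first. transpose emits the field names in "column" and the values in "row 1":

  index=test ---- Score calculation -----
  | table Score1, Score2, Score3, Score4
  | transpose
  | rename column AS Score, "row 1" AS Value

Then choose the pie chart visualization, with Score as the label and Value as the slice size.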
Hi folks, I have a KV store containing the following values: hostname and IP address. The KV store is updated every hour by a saved search, to ensure that the hostname and IP address always match; the environment changes very often (DHCP), so the IP-to-hostname relationship drifts. I was thinking about an automatic lookup so that, when logs containing an IP address are ingested, they are enriched with the hostname valid at ingestion time, not the one valid at the time of the next search. Unfortunately, ingest-time lookups are not available on Splunk Cloud Platform, and that functionality only supports CSV file lookups anyway. A solution with an intermediate forwarder is also not suitable for me. Any advice?
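One search-time workaround that may fit: a time-bounded (temporal) CSV lookup, where every row carries the time it became valid and lookup matches against the event's _time instead of always taking the newest row. A minimal sketch with placeholder names, assuming the hourly saved search appends snapshots:

  # hourly saved search, appending a timestamped snapshot
  ... your DHCP source ...
  | eval time=now()
  | table time, ip, hostname
  | outputlookup append=true host_ip_history.csv

  # transforms.conf lookup definition
  [host_ip_history]
  filename = host_ip_history.csv
  time_field = time
  time_format = %s
  max_offset_secs = 3600

  # at search time
  | lookup host_ip_history ip AS src_ip OUTPUT hostname

max_offset_secs bounds how far an event's _time may sit after a snapshot and still match; an hour matches your refresh cadence.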
I have missing data for a certain date and time range. How would I re-ingest the data into Splunk from a UF? Below is the inputs.conf:

[monitor:///app/java/servers/app/log/app.log.2023-11-12]
index = app_logs
ignoreOlderThan = 10d
disabled = false
sourcetype = javalogs

Data is missing from Nov 11 00:05 until Nov 12 13:00. It is just one log file per day, although we have some events, so how would I re-ingest only the missing data for that time period? Please let me know the config.
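A minimal sketch of one way to replay just the gap: carve the missing window out of the rotated file(s) on the UF host and one-shot the slice back in. The awk bounds assume each line starts with a sortable "YYYY-MM-DD HH:MM" timestamp, and paths are taken from your snippet; repeat per file if the window spans more than one rotation:

  # carve out the missing window
  awk '$0 >= "2023-11-11 00:05" && $0 <= "2023-11-12 13:00"' /app/java/servers/app/log/app.log.2023-11-12 > /tmp/app_gap.log

  # one-shot the slice into the same index/sourcetype
  /opt/splunkforwarder/bin/splunk add oneshot /tmp/app_gap.log -index app_logs -sourcetype javalogs

One caution: this duplicates any events in that window that were already indexed, so verify the gap boundaries first.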
Hello, I have the below Splunk search and I want to put the results into a line graph so I can compare all of the disk instances (e.g. C, D, F) over a period of time. The search I am using is:

index=windows_perfmon eventtype="perfmon_windows" Host="XXXX" object="LogicalDisk" counter="% Disk Write Time" instance="*" AND NOT instance=_Total AND NOT instance=Hard*
| stats latest(Value) as Value by _time, instance
| eval Value=round(Value, 2)

Any advice? I would like to create a line graph visualisation with the instances on different lines so you can do trend analysis on the Disk Write Time. The results I am getting are:

_time                instance  Value
2023-11-15 15:28:02  C:        2.83
2023-11-15 15:28:02  D:        0.01
2023-11-15 15:33:02  C:        4.10
2023-11-15 15:33:02            0.01
2023-11-15 15:38:02  C:        2.59
2023-11-15 15:38:02            0.01
2023-11-15 15:43:02  C:        1.98
2023-11-15 15:43:02            0.01
2023-11-15 15:48:02  C:        2.81
2023-11-15 15:48:02            0.01
2023-11-15 15:53:02  C:        2.51
2023-11-15 15:53:02            0.01
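A minimal sketch using timechart, which pivots each instance into its own series so the line chart draws one line per disk (span is an assumption; match it to your five-minute collection interval):

  index=windows_perfmon eventtype="perfmon_windows" Host="XXXX" object="LogicalDisk" counter="% Disk Write Time" instance="*" AND NOT instance=_Total AND NOT instance=Hard*
  | eval Value=round(Value, 2)
  | timechart span=5m latest(Value) BY instance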
Hi there, I have multiple panels in a dashboard and I would like to reduce the font size of the entire dashboard's contents. The dashboard was created with Classic (Simple XML), not Dashboard Studio. Is there a way to achieve this? Thank you!
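For Simple XML, one common trick is a hidden HTML panel carrying CSS; depends points at a token that is never set, so the row itself stays invisible. A sketch (the selectors are assumptions and can change between Splunk versions, so inspect your DOM and adjust):

  <row depends="$alwaysHideCSS$">
    <panel>
      <html>
        <style>
          /* shrink text across panels; tune selectors and size to taste */
          .dashboard-body, .dashboard-panel, .panel-title, .dashboard-element-body {
            font-size: 11px !important;
          }
        </style>
      </html>
    </panel>
  </row>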
I have the code below. I know that values exist in the subsearch which are not returned when I run the query as written. However, when I uncomment the where clause in the subsearch, the values appear. I don't know what I have done incorrectly for my results not to show. I've also commented out the | search at the end and it still doesn't show that these values exist in the subsearch. Any help would be appreciated.

index=customer name IN (gate-green, gate-blue) msg="*First time: *"
| rex field=msg "First time: (?<UserId>\d+)"
| eval FirstRequest = 1
| join type=left UserId
    [search index=customer name IN (cust-blue, cust-green) msg="*COMPLETED *"
    | rex field=msg "Message\|[^\t\{]*(?<json>{[^\t]+})"
    | spath input=json path=infoId output=UserId
    | eval Completed = 1
    ```| where UserId IN (125,999,418,208)```]
| table UserId, Completed
| search UserId IN (125,999,418,208)
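One thing worth checking: the subsearch feeding join is capped (50,000 rows and a 60-second runtime by default), and anything past the cap is silently dropped, which matches the "appears only when I filter the subsearch" symptom. A minimal join-free sketch, under the assumption that both event sets live in index=customer:

  (index=customer name IN (gate-green, gate-blue) msg="*First time: *") OR (index=customer name IN (cust-blue, cust-green) msg="*COMPLETED *")
  | rex field=msg "First time: (?<UserId>\d+)"
  | rex field=msg "Message\|[^\t\{]*(?<json>{[^\t]+})"
  | spath input=json path=infoId output=jsonUserId
  | eval UserId=coalesce(UserId, jsonUserId)
  | stats max(eval(if(match(msg, "First time"), 1, 0))) AS FirstRequest, max(eval(if(match(msg, "COMPLETED"), 1, 0))) AS Completed BY UserId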
Hello, I'm building a query which matches entries in an inputlookup table against a set of log data. The original working query (thanks to @ITWhisperer) is:

dataFeedTypeId=AS [ | inputlookup approvedsenders | fields Value | rename Value as sender]
| stats count as cnt_sender by sender
| append [inputlookup approvedsenders | fields Value | rename Value as sender]
| fillnull cnt_sender
| stats sum(cnt_sender) as count BY sender

This correctly provides a list of all of the email address entries in the lookup file with the number of times each occurs in the email address field (sender) of the dataset. However, the Value field in the lookup file may contain an email address, an IP address, or a domain name:

Value,description,spoof,spam,heuristic,newsletter
training.cloud@bob,Email Tests,Y,Y,Y,Y
mk.mimecast.com,mimecast emails,Y,Y,Y,Y
blah@yahoo.com,more belgrat,Y,Y,Y,Y
bbc.co.uk,BBC sends me lots of useful information,N,Y,Y,Y
81.96.24.195,test IP,Y,Y,Y,Y
yahoo.com,Yahoo domain,Y,Y,Y,Y

What I need to do next is widen the search: at the moment it does an exact match on the sender field, and I need the lookup entries matched against a number of fields in the log data. Examples: this query seems to provide the correct results, but it does not show the null values as above (n.b. the return 1000 isn't optimal code; I'd prefer it to simply return all of the results):

dataFeedTypeId=AS [| inputlookup approvedsenders | fields Value | return 1000 $Value]
| stats count as cnt_sender by sender

This query also provides the correct results but again does not show the null values (there is no domain field in the log data; I am using an app to extract it from the sender field):

dataFeedTypeId=AS
| rex field=sender "\@(?<domain_detected>.*)"
| eval list="mozilla"
| `ut_parse_extended(domain_detected, list)`
| lookup approvedsenders Value AS domain_detected OUTPUTNEW description Value
| lookup approvedsenders Value AS sender OUTPUTNEW description Value
| lookup approvedsenders Value AS senderIp OUTPUTNEW description Value
| search description="*"
| table subject sender domain_detected senderIP description Value
| lookup approvedsenders description OUTPUTNEW Value as Matched
| stats count by Matched

My apologies for the long question, and thank you for taking the time to read this far.
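A minimal sketch of one way to fold the three lookups into a single Matched key and then append the whole lookup so zero-match entries survive (field names follow your examples; the ut_parse_extended step is omitted for brevity):

  dataFeedTypeId=AS
  | rex field=sender "\@(?<domain_detected>.*)"
  | lookup approvedsenders Value AS sender OUTPUTNEW Value AS m_sender
  | lookup approvedsenders Value AS senderIp OUTPUTNEW Value AS m_ip
  | lookup approvedsenders Value AS domain_detected OUTPUTNEW Value AS m_domain
  | eval Matched=coalesce(m_sender, m_ip, m_domain)
  | where isnotnull(Matched)
  | stats count AS cnt BY Matched
  | append [| inputlookup approvedsenders | fields Value | rename Value AS Matched]
  | fillnull cnt
  | stats sum(cnt) AS count BY Matched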
In a custom app, is there a means of concealing the Actions dropdown button in a Dashboard Studio dashboard if I do not want users to be able to Download PDF, Download PNG, or Clone Dashboard? Thanks
Is there a suggested maximum size of lookup that should be used with an automatic lookup? For example, if your lookup exceeds more than x rows, would it be best not to use it with an automatic lookup?
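There is no hard row limit, but one concrete threshold worth knowing: CSV lookup files larger than max_memtable_bytes are no longer held in memory and become index-backed, which makes automatic lookups noticeably slower. A sketch of where that knob lives (check limits.conf.spec for your version's default; commonly cited around 10 MB):

  # limits.conf on the search head
  [lookup]
  max_memtable_bytes = 10000000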
I need Microsoft Exchange O365 message trace logs in Splunk. Which add-on can provide these logs? Is it possible with Azure Monitor logs? I tried the Graph API add-on, but it can only send activity logs, not message trace logs. Kindly help. Thanks.
Hello everyone, I need your help! We have a data source which is sending huge logs, so we want to drop useless field values before indexing. We have installed an add-on which extracts a field from the raw logs (example: ProductName) and assigns the value of this field as the source of the raw logs, and it uses the same props and transforms to extract the rest of the fields/values. Up to this point everything works smoothly. The issue is that among those extracted fields there is one field which I want to drop before indexing. I understand there is a way to send a particular field/value to the nullQueue using props and transforms; below is a sample configuration I tried, but it didn't work. Your support is highly appreciated.

Props:

[test:syslog]
SHOULD_LINEMERGE = false
EVENT_BREAKER_ENABLE = true
TRANSFORMS-test_source = test_source, test_format_source
TRANSFORMS-nullQ = nullFilter
REPORT-regex_field_extraction = test_regex_field_extraction, test_file_name_file_path
REPORT-dvc = test_dvc

Transforms:

[test_source]
REGEX = ProductName="([^"]+)"
DEST_KEY = MetaData:Source
FORMAT = source::$1

[test_format_source]
INGEST_EVAL = source=replace(lower(source), "\s", "_")

[test_dvc]
REGEX = ^<\d+>\d\s[^\s]+\s([^\s]+)
FORMAT = dvc::"$1"

[nullFilter]
REGEX = (?mi)XYZData\>(.*)?=\<*?\/XYZData\>
FORMAT = remove_field::$1
DEST_KEY = queue
FORMAT = nullQueue

[test_regex_field_extraction]
REGEX = <([\w-]+)>([^<]+?)<\/\1>
FORMAT = $1::$2
CLEAN_KEYS = false

[test_file_name_file_path]
REGEX = ^(.+)[\\/]([^\\/]+)$
FORMAT = source_process_name::$2 source_process_path::$1
SOURCE_KEY = SourceProcessName

[test_severity_lookup]
filename = test_severity.csv

[test_action_lookup]
filename = test_action_v110.csv
case_sensitive_match = false

[drop_useless_fields]
INGEST_EVAL = random:=null()

Below is a sample raw log; I want to drop the field <XYZData>.

Sample raw data:

<29>1 2023-11-09T18:34:02.0Y something testEvents - EventFwd [agentInfo@ioud tenantId="1" bpsId="1" tenantGUID="{00000000-0000-0000-0000-000000000000}" tenantNodePath="x\y"] <?xml version="x.0"?> <test_hjgd><MachineInfo><AgentGUID>{iolkiu-5d9b-89iu-3e19-jjkiuhygtf}</AgentGUID><MachineName>something</MachineName><RawMACAddress>xyxyxyxyxz</RawMACAddress><IPAddress>xx.xx.xx.xx</IPAddress><AgentVersion>x.x.x.xx</AgentVersion><OSName>GGG</OSName><TimeZoneBias>-333</TimeZoneBias><UserName>xxx</UserName></MachineInfo><EventList ProductName="something" ProductVersion="xx.x" ProductFamily="Secure"><Event><Time>2023-11-09T18:34:38</Time><Severity/><EventID>jkiuj</EventID><XYZData>T1BHoQUAAAAAAAADQg0AAO0dXVNbu1E... (long base64 payload truncated) ...</XYZData><UserInfo>T1BHoQUAAAAAAAADJwEAAIWSTW+CYBCE... (truncated) ...</UserInfo><DestinationUserInfo>T1BHoQUAAAAAAAADJwEAAIWSTW+CYBCE... (truncated) ...</DestinationUserInfo><ThreatName/><PolicyName/><TimeSZone>+00</TimeSZone></Event></EventList></XYYPREV_1100>
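For what it's worth, the nullFilter stanza cannot work as written: DEST_KEY = queue with FORMAT = nullQueue discards the whole event rather than one field, and a stanza honors only one FORMAT anyway. To blank just the XYZData payload before indexing, a SEDCMD in props.conf is usually the simplest fit. A minimal sketch, assuming the payload itself never contains a "<":

  # props.conf
  [test:syslog]
  SEDCMD-drop_xyzdata = s/<XYZData>[^<]*<\/XYZData>/<XYZData\/>/g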
Hello, I have a lookup file and I would like to use it to search a dataset and return a table showing each entry in the lookup file with a count of its matches in the main dataset, including the entries from the lookup file which have no matches. I've tried a number of techniques; the closest I have come is:

dataFeedTypeId=AS [| inputlookup approvedsenders | fields Value | return 1000 $Value]
| stats count as cnt_sender by sender

But this only shows the lookup file entries with non-zero values (n.b. I am manually adding 1000 to the return to make it work; that's a different problem). I have also tried:

dataFeedTypeId=AS [ | inputlookup approvedsenders | fields Value]
| stats count as cnt_sender by Value
| append [ inputlookup approvedsenders | fields Value]
| fillnull cnt_sender
| stats sum(cnt_sender) as count BY Value

This shows all the values in the lookup file but shows a zero count against each one. Thank you in advance.
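In the second attempt, the first stats runs over events, which carry a sender field but no Value field, so every event falls out of the by clause and only the appended lookup rows remain, carrying the filled-in zero. Renaming Value to sender on both sides keeps the group keys aligned. A minimal sketch:

  dataFeedTypeId=AS [| inputlookup approvedsenders | fields Value | rename Value AS sender]
  | stats count AS cnt_sender BY sender
  | append [| inputlookup approvedsenders | fields Value | rename Value AS sender]
  | fillnull cnt_sender
  | stats sum(cnt_sender) AS count BY sender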
Hi all, I am using the search below to enrich a field, StatusDescription, using a subsearch. When I run the subsearch alone it gives me results for hostname and StatusDescription, but with the join below, StatusDescription comes back empty. Please correct me.

index=_internal sourcetype=splunkd source="/opt/splunk/var/log/splunk/metrics.log" group=tcpin_connections os=Linux
| dedup hostname
| rex field=hostname "(?<hostname>[^.]+)\."
| eval age=(now()-_time)
| eval LastActiveTime=strftime(_time,"%y/%m/%d %H:%M:%S")
| eval Status=if(age<3600,"Running","DOWN")
| rename age AS Age
| eval Age=tostring(Age,"duration")
| table _time, hostname, sourceIp, Status, LastActiveTime, Age
| join type=left hostname
    [ search index=index1 sourcetype="new_source1"
    | rename NodeName AS hostname
    | table hostname, StatusDescription ]
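A likely culprit: the outer search's rex keeps only the short hostname (the part before the first dot), so if NodeName in index1 is an FQDN or differs in case, the join keys never line up and the left join leaves StatusDescription empty. A minimal sketch that normalizes both sides the same way:

  ...
  | eval hostname=lower(hostname)
  | join type=left hostname
      [ search index=index1 sourcetype="new_source1"
      | eval hostname=lower(mvindex(split(NodeName, "."), 0))
      | table hostname, StatusDescription ]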
We are facing a very strange issue: the knowledge objects of specific apps reverted to old settings on our SHC, and even the lookup files were affected. This has happened more than once since upgrading Splunk to v9.0.4. Can you please help with this case?
Hi, the code is like:

index=main host=server10 (EventCode=4624 OR EventCode=4634) Logon_Type=3 NOT user="*$" NOT user="ANONYMOUS LOGON"
| dedup user
| where NOT MsgID=="AUT22673"
| eval LoginTime=_time
| table user LoginTime

The output lists the active RDP users. I have no idea how to do the rest of it, either:
1: If the number of users == 0, print "No Remote desktop user", or
2: Put the number of users (not the usernames) into a Single Value or Radial Gauge.

It sounds so easy, but I cannot figure out how to do it; too little Splunk experience.
Rgds, Geir
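A minimal sketch covering both options: stats with no BY clause always returns one row, so dc(user) yields 0 even when nothing matches, which feeds a Single Value or Radial Gauge directly; the eval handles the "no users" message:

  index=main host=server10 (EventCode=4624 OR EventCode=4634) Logon_Type=3 NOT user="*$" NOT user="ANONYMOUS LOGON" NOT MsgID=AUT22673
  | stats dc(user) AS active_rdp_users
  | eval display=if(active_rdp_users=0, "No Remote desktop user", tostring(active_rdp_users))

Point the gauge at active_rdp_users, or a Single Value at display for the text variant.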