All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi @Deepthi1, first you need to install the Splunk_TA_nix add-on on the forwarder on your Linux servers and enable (in inputs.conf) the df.sh input stanza, which gathers information about the file systems. Then you can use the output of this script in your dashboards. Ciao. Giuseppe
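For reference, a sketch of what enabling that stanza can look like. The interval and index values are examples to adapt, not requirements, and the `local/` path assumes the usual convention for overriding the add-on's defaults:

```ini
# $SPLUNK_HOME/etc/apps/Splunk_TA_nix/local/inputs.conf
[script://./bin/df.sh]
# enable the script (it ships disabled by default)
disabled = 0
# run every 5 minutes -- adjust to your needs
interval = 300
sourcetype = df
source = df
# example destination index; use whatever index you send *nix data to
index = os
```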
message: Send async response via rest [url=https://prd.ase1.dbktp-feedloader.prd.gcp.db.com/callbackservice/book, asyncResp={"transactionItems":[{"itemId":"KTPACC1_20240717000001633206_01","status":"FAILED","accountIdentification":{"gtbCashAccount":{"branchCode":"788","accountNumber":"0191395050","currencyCode":"USD"}}},{"itemId":"KTPACC1_20240717000001633206_02","status":"FAILED","accountIdentification":{"gtbCashAccount":{"branchCode":"788","accountNumber":"0000195054","currencyCode":"USD"}}}],"orderStatusResponse":{"orderStatus":"ORDER_FAILURE","orderId":"KTPACC1_20240717000001633206"},"error":{"errorCode":"SEP013","errorDescription":"Cannot find IDMS-0788 account by accNumber: 0000195054"}}]

I am using the query below, but it is not giving me any output. Any idea why?

index = app_events_sdda_core_de_prod source="/home/sdda/apps/logs/sep-app/app-json.log" level=TRACE
| fields message
| rex field=message "\"error\":\{\"errorCode\":\"(?<errorCode>[^\"]+)\""
| dedup errorCode
| table errorCode

However, the syntax shows as correct on https://regex101.com/r/XkBntG/1
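A common cause of "no output" here is that, in SPL, the regular expression after `rex field=message` must be wrapped in double quotes. The pattern itself does match the sample message, as a quick check outside Splunk shows. This is a sketch in Python, which spells named groups `(?P<name>...)` rather than the `(?<name>...)` used by rex and regex101:

```python
import re

# Truncated sample of the "message" field from the event above
message = (
    '"orderStatusResponse":{"orderStatus":"ORDER_FAILURE",'
    '"orderId":"KTPACC1_20240717000001633206"},'
    '"error":{"errorCode":"SEP013",'
    '"errorDescription":"Cannot find IDMS-0788 account by accNumber: 0000195054"}'
)

# Same pattern as in the rex, with Python's named-group syntax
pattern = r'"error":\{"errorCode":"(?P<errorCode>[^"]+)"'

match = re.search(pattern, message)
error_code = match.group("errorCode") if match else None
print(error_code)  # SEP013
```

If this extracts the code but the SPL still returns nothing, check that `level=TRACE` and the `message` field extraction actually apply to the events in your time range.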
Hi @NanSplk01, Splunk is a search engine, not a file explorer! You can get the list of files in a folder by creating a script on the remote server that lists the files (e.g. every 15 minutes), so you can list them in Splunk; but displaying the files in a dashboard in their original format, with their own application, isn't a job for Splunk. Ciao. Giuseppe
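Such a listing script can be sketched like this. This is a minimal Python sketch under stated assumptions: the emitted fields (name, size) and the line format are illustrative choices, and Splunk would run it via a `[script://...]` stanza with e.g. `interval = 900` for every 15 minutes:

```python
#!/usr/bin/env python3
# Sketch of a scripted input that lists the files in a folder.
# Splunk runs the script on an interval (set in inputs.conf) and
# indexes each printed line as an event.
import os
import time

def list_files(folder):
    """Return one line per regular file: timestamp, name and size."""
    now = time.strftime("%Y-%m-%dT%H:%M:%S")
    lines = []
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if os.path.isfile(path):
            size = os.stat(path).st_size
            lines.append(f"{now} file={name} size={size}")
    return lines
```

On the server the script would simply `print()` each returned line for the folder you want to monitor; in Splunk you then get one event per file per run.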
Hi @sswigart, as you can read at https://docs.splunk.com/Documentation/Splunk/9.2.2/Admin/Inputsconf#Event_Log_filtering, if the logs you indicated aren't relevant for your searches, you can follow the instructions in the above link, adding a blacklist to your inputs.conf in the WinEventLog:Security stanza. One point of attention: in the blacklist you must use a regex, not a string to search for. Ciao. Giuseppe
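A sketch of such a blacklist; the event codes below are placeholders to replace with the ones you want to drop, and the `blacklist1` attribute name follows the numbered key=regex format described in the linked documentation:

```ini
# inputs.conf on the forwarder -- the blacklist value is a regex
[WinEventLog://Security]
disabled = 0
# Drop events whose EventCode is 4648 or 4672 (example codes)
blacklist1 = EventCode="4648|4672"
```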
Hi @jacksonchandler, good for you, see you next time! Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated
Hi @bhaskar5428, you need a correlation key, which cannot be a simple string; you need to extract it as a field. In your case, the correlation key should be orderId, so you could run something like this:

index = app_events_sdda_core_de_prod source="/home/sdda/apps/logs/sep-app/app-json.log" level=TRACE
| rex field=message "\"orderId\":\"(?<orderId>[^\"]+)\"},\"error\":\{\"errorCode\":\"(?<errorCode>[^\"]+)\""
| fields orderId errorCode
| dedup orderId
| table orderId, errorCode
| append [
    search index = app_events_sdda_core_de_prod "Process transaction locally" b95d0d10-9709-4299-9d3e-8c65dd5a539d source="/home/sdda/apps/logs/sep-app/app-json.log"
    | rex field=message "deliveringApplication=(?<AppID>\w+)"
    | rex "(?<orderId>\w{8}-\w{4}-\w{4}-\w{4}-\w{12})"
    | dedup AppID
    | table AppID orderId ]
| stats values(errorCode) AS errorCode values(AppID) AS AppID BY orderId

This solution has only one limit: you must be sure that the second search returns fewer than 50,000 results; otherwise you need a different solution. Ciao. Giuseppe
Here is a simple mockup demonstrating the feature:

{
  "visualizations": {
    "viz_WuaoFRne": {
      "type": "splunk.table",
      "dataSources": { "primary": "ds_67f8SogQ" },
      "hideWhenNoData": true
    }
  },
  "dataSources": {
    "ds_67f8SogQ": {
      "type": "ds.search",
      "options": {
        "query": "| makeresults count=10\n| eval group=mvindex(split(\"Inactive,Active\",\",\"),random()%2)\n| eval groupId=\"Group \".random()%9\n| search group=\"$groups$\""
      },
      "name": "groups"
    }
  },
  "defaults": {
    "dataSources": {
      "ds.search": {
        "options": {
          "queryParameters": {
            "latest": "$global_time.latest$",
            "earliest": "$global_time.earliest$"
          }
        }
      }
    }
  },
  "inputs": {
    "input_global_trp": {
      "type": "input.timerange",
      "options": { "token": "global_time", "defaultValue": "-24h@h,now" },
      "title": "Global Time Range"
    },
    "input_hM19F0Rl": {
      "options": {
        "items": [
          { "label": "All", "value": "*" },
          { "label": "Inactive groups", "value": "Inactive" },
          { "label": "Active groups", "value": "Active" }
        ],
        "token": "groups"
      },
      "title": "Display",
      "type": "input.dropdown"
    }
  },
  "layout": {
    "type": "absolute",
    "options": { "width": 1440, "height": 960, "display": "auto" },
    "structure": [
      { "item": "viz_WuaoFRne", "type": "block", "position": { "x": 0, "y": 0, "w": 830, "h": 300 } }
    ],
    "globalInputs": [ "input_global_trp", "input_hM19F0Rl" ]
  },
  "description": "",
  "title": "Hidden"
}

If you are still having problems, please share your dashboard source code in a code block </> (as demonstrated above).
Thanks Giuseppe. I solved it myself with your advice: the data was there but wasn't showing as a table because my dashboard wasn't configured for that; it was set to show a list. Changed that and it worked!
Hi @jacksonchandler, let me understand: if you run the search without the last row (the table command), does it run and do you get results? If yes, remove the last row anyway, because you don't need it. If not, run the search without the last two rows and check whether the field ut_domain_without_tld is present. Could you share the content of the macro? If you press <CTRL><SHIFT>E on the search containing the macro, the full search (with the macro expanded) is displayed in a window. Ciao. Giuseppe
Like @P_vandereerden says, SPL is totally different from procedural languages. You need to think differently. One point is: explicit iteration should be used sparingly. There are also lots of other elements in the illustrated code that make it "unSPL", and some are unnecessary. For a problem like this, it is better to follow my four golden rules ("four commandments") of asking answerable data analytics questions:

1. Illustrate the data input (in raw text, anonymized as needed), whether it is raw events or output from a search that volunteers here do not have to look at.
2. Illustrate the desired output from the illustrated data.
3. Explain the logic between the illustrated data and the desired output without SPL.
4. If you also illustrate attempted SPL, illustrate its actual output, compare it with the desired output, and explain why they look different to you if that is not painfully obvious.

In your case, you also want to illustrate how the desired output changes when the token takes different values. One more tip: use Splunk's auto-format feature to format SPL if there are more than a couple of pipes.
Like this:

index="xxx" source="yyyyzzz" AND $DropdownValue$ AND Input
| eventstats max(_time) as maxTimestamp by desc
| head 1
| dedup _time
| eval lastTriggered = strftime(_time, "%d/%m/%Y %H:%M:%S %Z")
| stats values(lastTriggered) as lastTriggeredTime
| appendcols
    [search index="xxx" source="yyyyzzz" sourcetype="mule:rtf:per:logs" AND $DropdownValue$ AND Output
    | eventstats max(_time) as maxTimestamp by desc
    | head 1
    | dedup _time
    | eval lastProcessed = strftime(_time, "%d/%m/%Y %H:%M:%S %Z")
    | stats values(lastProcessed) as lastProcessedTime]
| appendcols
    [search index="xxx" source="yyyyzzz" sourcetype="mule:rtf:per:logs" AND $DropdownValue$ AND Error
    | eventstats max(_time) as maxTimestamp by desc
    | head 1
    | dedup _time
    | eval lastErrored = strftime(_time, "%d/%m/%Y %H:%M:%S %Z")]
| eval "COMPONENT ID"="$DropdownValue$"
| eval "Last Triggered Time"=lastTriggeredTime
| eval "Last Processed Time"=lastProcessedTime
| eval "Last Errored Time"=lastErrored
| table "COMPONENT ID", "Last Triggered Time", "Last Processed Time", "Last Errored Time"
| fillnull value="NOT IN LAST 12 HOURS" "COMPONENT ID", "Last Triggered Time", "Last Processed Time", "Last Errored Time"

(Note: the third subsearch had "dedup_time", which is not a command; it should be "dedup _time".) After this formatting, you can easily see why some commands are wasteful.
I have applied it, but the events are still getting merged into one; please check the attachments. Sorry, I have modified the JSON file; here it is. What should the sourcetype settings be?

[
{ "sourcetype": "testoracle_sourcetype", "check_name": "cdb_tbs_check", "check_error": "", "check_status": "OK", "current_use_mb": "1355", "percent_used": "2", "tablespace_name": "SYSTEM", "total_physical_all_mb": "65536", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "cdb_tbs_check", "check_error": "", "check_status": "OK", "current_use_mb": "23596", "percent_used": "36", "tablespace_name": "SYSAUX", "total_physical_all_mb": "65536", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "cdb_tbs_check", "check_error": "", "check_status": "OK", "current_use_mb": "29", "percent_used": "0", "tablespace_name": "UNDOTBS1", "total_physical_all_mb": "65536", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "cdb_tbs_check", "check_error": "", "check_status": "OK", "current_use_mb": "4", "percent_used": "0", "tablespace_name": "USERS", "total_physical_all_mb": "65536", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "fra_check", "check_error": "", "check_status": "OK", "flash_in_gb": "40", "flash_reclaimable_gb": "0", "flash_used_in_gb": "1.5", "percent_of_space_used": "3.74", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "General_parameters", "check_error": "", "check_status": "OK", "database_major_version": "19", "database_minor_version": "0", "database_name": "C2N48617", "database_version": "19.0.0.0.0", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617", "script_version": "1.0" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "pdb_tbs_check", "check_error": "", "check_status": "OK", "current_use_mb": "76", "pdb_name": "O1S48633", "percent_used": "0", "tablespace_name": "UNDOTBS1", "total_physical_all_mb": "65536", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "pdb_tbs_check", "check_error": "", "check_status": "OK", "current_use_mb": "5", "pdb_name": "O1S48633", "percent_used": "0", "tablespace_name": "TOOLS", "total_physical_all_mb": "65536", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "pdb_tbs_check", "check_error": "", "check_status": "OK", "current_use_mb": "21", "pdb_name": "O1NN2467", "percent_used": "0", "tablespace_name": "UNDOTBS1", "total_physical_all_mb": "65536", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "pdb_tbs_check", "check_error": "", "check_status": "OK", "current_use_mb": "627", "pdb_name": "O1NN2467", "percent_used": "1", "tablespace_name": "SYSAUX", "total_physical_all_mb": "65536", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "pdb_tbs_check", "check_error": "", "check_status": "OK", "current_use_mb": "784", "pdb_name": "O1S48633", "percent_used": "1", "tablespace_name": "SYSTEM", "total_physical_all_mb": "65536", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "pdb_tbs_check", "check_error": "", "check_status": "OK", "current_use_mb": "1547", "pdb_name": "O1NN8944", "percent_used": "2", "tablespace_name": "SYSAUX", "total_physical_all_mb": "65536", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "pdb_tbs_check", "check_error": "", "check_status": "OK", "current_use_mb": "1149", "pdb_name": "O1S48633", "percent_used": "2", "tablespace_name": "USERS", "total_physical_all_mb": "65536", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "pdb_tbs_check", "check_error": "", "check_status": "OK", "current_use_mb": "58", "pdb_name": "O1NN8944", "percent_used": "0", "tablespace_name": "UNDOTBS1", "total_physical_all_mb": "65536", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" },
{ "sourcetype": "testoracle_sourcetype", "check_name": "pdb_tbs_check", "check_error": "", "check_status": "OK", "current_use_mb": "7804", "pdb_name": "O1S48633", "percent_used": "12", "tablespace_name": "SYSAUX", "total_physical_all_mb": "65536", "database_name": "C2N48617", "host_name": "flosclnrhv03.pharma.aventis.com", "instance_name": "C2N48617" }
]
The fragment you illustrated is NOT a complete XML document. Please post the full event. My suspicion is that your raw event contains an XML document, but also contains something that is not XML. You will need to first extract the XML into a field, then apply spath.
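The two-step idea (extract the XML first, then parse it) can be sketched outside Splunk. This Python sketch stands in for the rex-then-spath pipeline; the sample event and tag names are invented for illustration, and it assumes the event embeds exactly one well-formed XML fragment:

```python
import re
import xml.etree.ElementTree as ET

def extract_xml(raw_event):
    """Pull the first XML document out of a mixed-text event and parse it.

    Step 1 (like rex): cut the <tag>...</tag> span out of the raw text.
    Step 2 (like spath): parse the extracted span as XML.
    """
    match = re.search(r"<(\w+)[\s\S]*</\1>", raw_event)
    if match is None:
        return None
    return ET.fromstring(match.group(0))

# Invented example: timestamp prefix plus an embedded XML payload
event = "2024-07-17 12:00:00 INFO payload=<order><id>42</id></order> status=OK"
root = extract_xml(event)
```

After extraction, `root.tag` is "order" and `root.find("id").text` is "42"; the surrounding non-XML text no longer gets in the way, which is exactly why the rex-before-spath step matters in Splunk.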
If you want, I can share the raw events for both queries.
index = app_events_sdda_core_de_prod source="/home/sdda/apps/logs/sep-app/app-json.log" level=TRACE
| fields message
| rex field=message "\"orderId\":\"(?<orderId>[^\"]+)\"},\"error\":\{\"errorCode\":\"(?<errorCode>[^\"]+)\""
| dedup orderId
| table orderId, errorCode

index = app_events_sdda_core_de_prod "Process transaction locally" b95d0d10-9709-4299-9d3e-8c65dd5a539d source="/home/sdda/apps/logs/sep-app/app-json.log"
| rex field=message "deliveringApplication=(?<AppID>\w+)"
| dedup AppID
| table AppID

The order ID above I added just for demonstration purposes. What I actually want is the SPL written in such a way that each orderId from my first search is automatically checked in the second, and I get three columns: an inner-search kind of thing. Please help.
Hi, I'm trying to collate the URL domain names of the websites users visit over the course of 24 hours. It pulls the right data, but it won't display as a table and I'm not sure how to fix it. I'm using URL Toolbox to parse the domain out.

index="##" eventtype=pan $user$ hoursago=24
| eval list="mozilla"
| `ut_parse_extended(url,list)`
| stats count by ut_domain_without_tld
| table ut_domain_without_tld count

I'm fairly new to Splunk, so any help is appreciated.
Hi All, I have a dropdown as below in Dashboard Studio, with the options All, Active groups and Inactive groups, and I also have a table. If the user clicks on "Inactive groups", the table should display the details of the inactive groups; if the user clicks on "Active groups", a table with the active groups should be displayed; if the user clicks on "All", then all groups should be displayed. Until one of the above is chosen, the table should be hidden. I have selected the "When data is unavailable, hide element" option under Visibility in the configuration, but I am not sure how to achieve the above use cases. Please, can anyone help me with this? Thanks, PNV
Last month we were working on a Splunk ES demo and I found out that we cannot delete a notable. Either I have not understood ES yet, or the ES developers are really funny, lol!
I misunderstood search affinity and the purpose of multi-site configuration. Thank you for your kind notice.
When you talk about notable event suppression, I assume you are talking about the Notable Event Suppression action in Incident Review. If you want to whitelist/blacklist certain assets, then you should add the lookup logic to the correlation search that caused the notable event in the first place. You cannot add lookup logic to the event type search that ES creates for the suppression logic.
The normal way to get data from Windows machines is to install the universal forwarder on the machine; pretty much everything else happens as if by magic: https://www.splunk.com/en_us/blog/learn/splunk-universal-forwarder.html You should also install the TA (Technology Add-on) for Windows, https://splunkbase.splunk.com/app/742, and then you will have the data in Splunk in a way that can be easily digested.