All Posts

I think using eventstats can get you the desired output you are looking for, if I am interpreting your question correctly.

<base_search>
| eventstats sum(eval(case('ProductCategory'=="productcat1", 'Sales Total'))) as productcat1,
             sum(eval(case('ProductCategory'=="productcat2", 'Sales Total'))) as productcat2

Or, for a more dynamic approach, something like this may work:

<base_search>
| eventstats sum("Sales Total") as overall_sales by ProductCategory
| eval overall_sales_json=json_object("fieldname", 'ProductCategory', "value", 'overall_sales')
| eventstats values(overall_sales_json) as overall_sales_json
| foreach mode=multivalue overall_sales_json
    [ | eval fieldname=spath('<<ITEM>>', "fieldname"),
             field_value=spath('<<ITEM>>', "value"),
             combined_json=if( isnull(combined_json),
                               json_object(fieldname, field_value),
                               json_set(combined_json, fieldname, field_value) ) ]
| fromjson combined_json prefix=dynamic_
| fields - combined_json, overall_sales_json, fieldname, field_value, overall_sales
``` The lines below are only needed if you want the new fields on the first row ```
| streamstats count as line_number
| foreach dynamic_* [ | eval <<FIELD>>=if( 'line_number'==1, '<<FIELD>>', null() ) ]
| fields - line_number
| rename dynamic_* as *
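If it helps to reason about what the first eventstats is doing, the conditional sum reduces to a per-category running total. A minimal Python sketch of that logic (field names taken from the question; the sample rows are made up):

```python
def category_totals(rows):
    """Sum 'Sales Total' per ProductCategory, mirroring
    eventstats sum(eval(case('ProductCategory'=="...", 'Sales Total')))."""
    totals = {}
    for row in rows:
        cat = row["ProductCategory"]
        totals[cat] = totals.get(cat, 0) + row["Sales Total"]
    return totals
```

Each event contributes its Sales Total only to the bucket matching its own ProductCategory, which is exactly what the case() inside the eval does per row.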
I'm very new to metrics data in Splunk. I have a question: what is plugin_instance, and how can I get its values? I'm trying to run the query below but end up with no results.

| mstats avg("processes.actions.ps_cputime.syst") prestats=true WHERE `github_collectd` host="*" span=10s BY plugin_instance
TLS is needed. The most common reason for it not starting is an expired certificate. You just need to check this and replace it with a valid one, but you should see that issue in mongod.log. Have you also checked that the mongodb engine has been updated to the new one, and that the version number has updated? Have you restarted Splunk after each version update so Splunk can do the needed migrations? With a SHC you must do that migration manually.
Hi Community team,

I have a complex query to gather the data below, but a new request came up: I was asked to add the product category totals by category to the report email subject. With $result.productcat1$ and $result.productcat2$ I could approach that, but the way I'm calculating the totals I'm not getting the expected numbers, because I'm appending the columns from a subquery and transposing the values with xyseries. Could you please suggest how I can sum(Sales Total) by productcat1 and productcat2 in a new field while keeping the same output as I have now? E.g. something like:

if ProducCategory="productcat1" then productcat1=productcat1+SalesTotal, else productcat2=productcat2+SalesTotal ``` but print the original output ```

Consider productcat1 and productcat2 are fixed values.

ENV   ProducCategory  ProductName  SalesCondition  SalesTotal  productcat1  productcat2
prod  productcat1     productR     blabla          9           152          160
prod  productcat1     productj     blabla          8
prod  productcat1     productc     blabla          33
prod  productcat2     productx     blabla          77
prod  productcat2     productpp    blabla          89
prod  productcat2     productRr    blabla          11
prod  productcat1     productRs    blabla          6
prod  productcat1     productRd    blabla          43
prod  productcat1     productRq    blabla          55

Thanks in advance.
Is there a TA for HPE 3PAR data? I have the logs ingested and would like to use an existing TA to normalize the data, but I haven't found one in Splunkbase or elsewhere online.
When using the Splunk Logging Driver for Docker, you can use SPLUNK_LOGGING_DRIVER_BUFFER_MAX to set the maximum number of messages held in the buffer for retries. The default is 10 * 1000, but can anyone confirm the maximum value that can be set?
I don't think this is exactly it, but it may lead you down the right path:

| rest /services/datamodel/model
| search eai:appName=search
| table updated

The updated field shows when the model was last updated.
Hello All,

I have searched high and low to discover why the kvstore process will not start. This system was upgraded from Splunk 8.0 to 8.2, and finally to 9.2.1. I have looked in mongod.log and splunkd.log, but do not really see anything that helps resolve the issue. Is SSL required for this? If so, is there a way to set a correct SSL config, or disable it in server.conf? Would the failure of the KV store process affect IOWAIT? I am running on Oracle Linux 7.9. I am open to any suggestions. Thanks, ewholz
Yep, you'll have to make separate calls for that. Filters on the SOAR REST API can be appended, but they work as an "AND" condition; "OR" is not supported in that sense, as a limitation of Django querysets. So the easiest way would be to combine the results of https://<your_soar_instance>/rest/container?_filter_name__icontains="computer" with the ones from https://<your_soar_instance>/rest/container?_filter_name__icontains="process" and then process them accordingly.
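Since the two filtered calls can return overlapping containers (a name could contain both words), a small merge step that de-duplicates on the container id may help. A sketch in Python, assuming each result dict carries an "id" key as in the SOAR REST container response:

```python
def merge_containers(*result_sets):
    """Combine container lists from separate REST calls,
    keeping the first occurrence of each container id."""
    seen = set()
    merged = []
    for results in result_sets:
        for container in results:
            if container["id"] not in seen:
                seen.add(container["id"])
                merged.append(container)
    return merged
```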
Hi @danspav, thank you for your response. I made the changes, but when I click on the hyperlink it does not redirect to the correct dynamically generated external URL 'https://abc12345.apps.dynatrace.com/ui/apps/dynatrace.classic.distributed.traces/ui/services/SERVICE-ABC12345678AB1A1/purepaths?servicefilter=0%1E9%11SERVICE_METHOD-12345ABC1234A123%14abc%100%111340861000%144611686018427387&gtf=c_1716990969058_1716991269058&gf=all'. Here are the screenshots and code below. Please assist.

"visualizations": {
    "viz_aBCd123": {
        "type": "splunk.table",
        "options": {
            "count": 5000,
            "dataOverlayMode": "none",
            "drilldown": "none",
            "backgroundColor": "#FAF9F6",
            "tableFormat": {
                "rowBackgroundColors": "> table | seriesByIndex(0) | pick(tableAltRowBackgroundColorsByBackgroundColor)",
                "headerBackgroundColor": "> backgroundColor | setColorChannel(tableHeaderBackgroundColorConfig)",
                "headerColor": "> headerBackgroundColor | maxContrast(tableRowColorMaxContrast)"
            },
            "eventHandlers": [
                {
                    "type": "drilldown.customUrl",
                    "options": {
                        "url": "$row.URL.value|n$",
                        "newTab": true
                    }
                }
            ],
A legend. Thank you for making that clear!
Hey all, wondering if anyone has solved this problem before. I'm looking at the potential for taking a Splunk Cloud alert and using it to connect to Ansible Automation Platform (AAP) to launch a template. I have looked into webhooks; however, AAP seems to be configured to allow only GitHub and GitLab webhooks on templates, and when attempting to POST to the API endpoint to launch the template, it would sit there and eventually time out.

Wondering if anyone has explored this space before and whether there are any suggestions on how to get this connection working.
There are no logs coming in from a VMgmt tool; I'm simply being handed a critical CVE and being told to search for any assets that match. Right now, I'm just performing a very taxing search like "index=* sourcetype=* [insert something that might relate to the asset]".
This is the real answer. Still valid as of 2024 for the VMware TA add-on. The GUI did not work for installation; I had to copy the tgz into the directory and extract it. Restarted Splunk and it works.
Once you configure your Azure Event Hub inputs, the data should come in with sourcetype mscs:azure:eventhub. Once the data comes in, the Splunk TA will map it to other sourcetypes (see below); these will then create the various CIM fields that can be mapped to the Alerts data model. (That's why you're not seeing it listed as CIM compliant in the documentation: it's a parent sourcetype.)

Note: Splunk TAs often perform a lot of props/transforms/regex work and CIM compliance work behind the scenes.

The mscs:azure:eventhub sourcetype will point to the sourcetypes below, which are mapped to the Alerts data model. Whether they contain the actual data you want is another matter: is the field you're interested in mapped to an Alerts data model CIM field?

mscs:azure:security:alert (CIM mapped to the Alerts data model)
mscs:azure:security:recommendation (CIM mapped to the Alerts data model)

The sourcetype below carries many other data types, so various elements will map to different data models:

azure:monitor:aad (maps to Alerts/Authentication/Change)

So, in your case the Alerts data model is most likely the main use case. It's best to get the data into a test index first and tune the Alerts data model to point to the test index with the tag alert; this will kick in the searches for the Alerts data model, and you can then do some analysis on the CIM fields and see what you're getting. If you are not seeing the fields you want, your options are to use the raw data for your searches, or to create your own data model and accelerate it so it's faster. The latter is not CIM compliance in the true ES sense, it's just making things faster with data models, which is fine, but maybe overkill.
The general idea is to map as much as you can for CIM compliance, or at least the fields recommended on the CIM compliance page; you never get to 100%.

Here are some links for you to look at:

Alerts data model - look at your data and see whether you can map it to the recommended fields; the TA should do most of this for you, as it's CIM compliant: https://docs.splunk.com/Documentation/CIM/5.0.2/User/Alerts
CIM validation - you can use this app for analysis work: https://splunkbase.splunk.com/app/2968
MS Cloud TA - info: https://splunk.github.io/splunk-add-on-for-microsoft-cloud-services/
What's your end goal with this? That would affect how I'd approach the problem. Do you want a playbook to take action based on this info, or is it some kind of audit?
Splunk has finally added the issue to their known issues page https://docs.splunk.com/Documentation/Splunk/9.2.0/ReleaseNotes/KnownIssues  https://docs.splunk.com/Documentation/Splunk/9.2.1/ReleaseNotes/KnownIssues
#machinelearning Hello, I am using dist=auto in my DensityFunction and I am getting negative Beta results. I feel like this is wrong, but keep me honest: I would like to understand how the Beta distribution is fitted, and why the mean is negative when I am using a 0 to 100% success rate. With the other distributions I am happy (e.g. Gaussian KDE and Normal).

| fit DensityFunction MyModelSuccessRate by "HourOfDay,Object" into MyModel2 dist="auto"

Thanks,
Joseph
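One thing worth checking independently of MLTK: the Beta distribution is only defined on (0, 1), so a 0-100% rate needs rescaling before fitting, and for data inside that interval both shape parameters should come out positive. A quick sanity check using a method-of-moments fit in Python (the sample rates below are made up):

```python
from statistics import mean, pvariance

def beta_method_of_moments(samples):
    """Method-of-moments estimates for Beta(alpha, beta) shape parameters.
    Samples must lie in (0, 1), e.g. percentage success rates divided by 100."""
    m = mean(samples)
    v = pvariance(samples)
    common = m * (1 - m) / v - 1  # positive whenever variance < m * (1 - m)
    return m * common, (1 - m) * common
```

If a fit on properly rescaled data yields negative shape parameters, that points at a scaling or reporting issue rather than the data itself.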
@verothor  Have you tried something like this?

require([
    'underscore',
    'jquery',
    'splunkjs/mvc',
    'splunkjs/mvc/searchmanager',
    'splunkjs/mvc/simplexml/ready!'
], function (_, $, mvc, SearchManager) {
    let mySearch = new SearchManager({
        id: "mysearch",
        autostart: "false",
        search: '| makeresults | eval test = "This is test" ',
        preview: false
    }, { tokens: true, tokenNamespace: "submitted" });

    let mySearchResults = mySearch.data("results");
    mySearchResults.on("data", function () {
        let resultArray = mySearchResults.data().rows;
        console.log("My Data", resultArray);
    });

    $(document).ready(function () {
        setInterval(function () {
            mySearch.startSearch();
        }, 3000);
    });
});

Note: This is sample JS; just modify it as per your requirement.

KV
Hi @Ram2,

host and sourcetype are index-time fields that you associate with your data source; site should be an extracted field. Do you have this field when running only the search, without the stats? If not (as is probable), you have to extract it.

Ciao.

Giuseppe