All Topics

Could anyone please tell me how to connect to an external Postgres DB so I can see data in this dashboard? I cannot find instructions anywhere. Thanks, pawelF
Dear All, I am running Splunk Cloud 9.0.2208.3 as an sc_admin-role user, and I have created a load of calculated fields under my own user so that the administrators can see data in the associated Data Models. I have made the Calculated Fields non-private, i.e. shared at the app/global level as appropriate. All is good and others can see the data in the DMs. However, I am leaving the company and they will have to delete my user as part of the offboarding process (plus I don't want to be blamed if someone hacks the system through my user, somehow). As a result, I want to reassign my KBOs to the nobody user, so that my user is no longer in the circuit and can be safely deleted. However, when I go into "Reassign Knowledge Objects", I cannot see the Calculated Fields at all. I have made sure that I can see KBOs in all apps and for all users, and still no dice. Why can't I reassign these KBOs? How can I do this? Or perhaps, do I even NEED to do this?
I developed an add-on with the Add-on Builder that uses Python code to send events to my Splunk instance. I first tested the add-on locally (it worked perfectly) and then tried it on a different Splunk Enterprise instance to see if it works there. After clicking the "Install app from file" option and installing it from the SPL file, the app's icon showed up among the apps. When I clicked the app, what would normally take me to the inputs page (my app requires an input) instead prompted an error that said "Failed to load Inputs Page. This is normal on Splunk search heads as they do not require an Input page. Check your installation or return to the configuration page", and when I clicked on details it said "Error: Request failed with status code 500". To solve this, I tried a few things (none of them worked): I tried installing the Add-on Builder on the other Splunk Enterprise instance, which gave me a blank screen every time I clicked on it; I tried opening a Splunk Cloud instance of my own to test it there, but didn't see the "Install app from file" option; and I looked for errors in the console and only saw "Uncaught (in promise) Error: Request failed with status code 500" a few times. Does anyone know why this might happen, or ways to fix it?
I have this search which builds a table:

my_search | timechart span=1d sum(eval(b/1024/1024/1024)) AS volume_b

It builds a table like this:

24 October    18
25 October    10
26 October    25
27 October    30

Now, from this search I want to do a simple count: how many days have volume > 15? For the table above it would just show count: 3.
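One possible sketch, reusing the search and field name above (untested against your data): filter the daily rows, then count what survives:

```
my_search
| timechart span=1d sum(eval(b/1024/1024/1024)) AS volume_b
| where volume_b > 15
| stats count
```

For the sample table, three rows (25, 30, and 18) exceed 15, so count would be 3.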
Hi all, I am a fresher in Splunk. Recently I met a problem and would like to ask whether anyone has ideas. My SPL generates this kind of table:

city     produce name   count
city1    product 1      purchase count 1-1
city1    product 2      purchase count 1-2
city2    product 1      purchase count 2-1
city2    product 2      purchase count 2-2
city3    product 1      purchase count 3-1
city3    product 2      purchase count 3-2

But I would like to transform it into this kind of table:

produce name   city1                city2                city3
product 1      purchase count 1-1   purchase count 2-1   purchase count 3-1
product 2      purchase count 1-2   purchase count 2-2   purchase count 3-2

Do any experts have ideas how to implement SPL to achieve that? I tried transpose column_name=city, but in vain; the output doesn't look as I expect. Thank you so much!
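A hedged sketch, assuming the three field names shown in the first table: xyseries pivots a three-column result into rows keyed by one field and columns keyed by another:

```
... | xyseries "produce name" city count
```

An equivalent form is `... | chart values(count) over "produce name" by city`; either way, city values become column headers, which is what transpose alone does not do here.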
Hello, Simon Richardson developed the add-on TA_Zimbra. I'm new to Zimbra and I would like to improve the add-on with new extractions and capabilities. I would like to work in the right way, producing shareable results. From what I can see, this add-on is still under development. Is there a way to contact Simon Richardson? Thank you very much. Kind Regards, Marco
This is my first question here, and I just started my journey with Splunk. I have two files, test1.csv and test2.csv, with the same column names in both files: hashValue, updatedTime, alertName. How do I compare both files with respect to their column values and output only the differences? Thanks.
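A minimal sketch, assuming both CSVs have been uploaded as lookup files with those exact names: the set command with diff returns rows that appear in one subsearch result but not both:

```
| set diff
    [| inputlookup test1.csv]
    [| inputlookup test2.csv]
```

Note that set compares whole result rows, so a row differing in any of hashValue, updatedTime, or alertName counts as a difference.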
Exceptions   Day1   Day2   Day3
Abc          5      4      3
Start        3      4      4
xyz          3      2      5
Hi team, I have a multiselect input filter, and I need to set its value from the drilldown value of a pie chart. How do I change the input filter value to the drilldown token value? Please guide me on this.
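A hedged Simple XML sketch, where ms_token is a placeholder for your multiselect's actual token name: in the pie chart's drilldown, set the form-prefixed token so the input itself updates:

```
<drilldown>
  <!-- form.<token> updates the input's displayed selection, not just the token value -->
  <set token="form.ms_token">$click.value$</set>
</drilldown>
```

Setting plain `ms_token` would change the token without updating what the multiselect displays; the `form.` prefix is what syncs the input.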
Hello, I am trying to fix an error in an inherited add-on that I am maintaining.

category: app_cert_validation
description: Detect usage of JavaScript libraries with known vulnerabilities.
message_id: 7002
rule_name: Validate app certification
severity: Fatal
status: Fail
sub_category: Checks related to JavaScript usage
solution:
- 3rd party CORS request may execute
- parseHTML() executes scripts in event handlers
- jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution
- Regex in its jQuery.htmlPrefilter sometimes may introduce XSS (listed twice)
- ReDoS - regular expression denial of service
- Regular Expression Denial of Service (ReDoS) (listed twice)
- This vulnerability impacts npm (server) users of moment.js, especially if a user-provided locale string, e.g. fr, is directly used to switch moment locale.

I checked some community questions, and one of the answers mentions the below fix:
1. Export the app from any Add-on Builder
2. Import the app into Add-on Builder v4.1.0 or newer
3. Download the app packaged from Add-on Builder v4.1.0 or newer

I don't have the original app export and cannot import it into the new AOB. I tried importing a tgz file but that gives an error. Is there any other way I can fix this, or something else I can try?
Hi, I am somewhat confused by the documentation for role-based field filtering: https://docs.splunk.com/Documentation/Splunk/9.0.1/Security/planfieldfiltering

According to the documentation, restricted commands (such as tstats) can return sensitive data that a role with field filters might not be allowed to access, and it is very risky if someone with malicious intentions tries to use them to circumvent role-based field filtering. The documentation offers a workaround: assign one of two capabilities to roles that have field filters. One of those capabilities is run_commands_ignoring_field_filter.

Here is my question: if user_A has a role that includes the run_commands_ignoring_field_filter capability and has field filtering configured, and user_A runs tstats to search data that includes a field that requires masking, what happens in the result? I wonder if it shows the sensitive data or the masked data. Thank you in advance.

For reference, the documentation says: "These commands can return sensitive data that a role with field filters might not be allowed to access. They might pose a potential security risk for your organization if someone with malicious intentions tries to use them to circumvent role-based field filtering. As a result, the Splunk platform restricts these commands when used by people with roles that are configured with field filtering."
Hi, I have a scenario where files need to be transferred both inbound and outbound at 2 am daily. I need to create an alert when files are present in inbound by 2 am but missing in outbound by 2 am. Here is my query below; please help:

index=cas source="/bin/var/logs/log" File 1 OR File 2 OR File 3 OR File 4 Inbound

For the outbound condition, "Inbound" changes to "Outbound", and File 1 represents the file that is getting transferred.
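A hedged sketch of one way to alert on "in inbound but not outbound", assuming the events literally contain the words Inbound/Outbound and that a field named file (a placeholder here) identifies which file each event refers to:

```
index=cas source="/bin/var/logs/log" (Inbound OR Outbound)
| eval direction=if(searchmatch("Inbound"), "inbound", "outbound")
| stats values(direction) AS seen dc(direction) AS directions by file
| where directions=1 AND seen="inbound"
```

Scheduled shortly after 2 am over the relevant time window, any row returned is a file seen inbound with no matching outbound event, so the alert can trigger on "number of results > 0".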
Hello, I wanted to ask if there is a way to delete reports created by Enterprise Security. There are reports created by Enterprise Security that we will never use, and I would just like to clean up the reports menu.
Figuring out the best add-on(s) to ingest security data related to O365/Azure is an exercise in insanity... Can we get some clarification and/or consolidation for this, since all 5 of these add-ons are developed by Splunk or Microsoft?

Microsoft Graph Security API Add-On for Splunk: https://splunkbase.splunk.com/app/4564
https://learn.microsoft.com/en-us/graph/api/resources/security-api-overview?view=graph-rest-1.0#alerts
Alerts from the following providers are available: Azure Active Directory Identity Protection, Microsoft 365 Default Cloud App Security Custom Alert, Microsoft Defender for Cloud Apps, Microsoft Defender for Endpoint, Microsoft Defender for Identity, Microsoft Sentinel (formerly Azure Sentinel)

Splunk Add-on for Microsoft Security: https://splunkbase.splunk.com/app/6207
Microsoft 365 Defender incidents and alerts OR Microsoft Defender for Endpoint alerts.

Splunk Add-on for Microsoft Office 365: https://splunkbase.splunk.com/app/4055
All service policies, alerts and entities visible through the Microsoft cloud application security portal. All audit events and reports visible through the Microsoft Graph API endpoints.

Splunk Add-on for Microsoft Cloud Services: https://splunkbase.splunk.com/app/3110
mscs:azure:security:alert

Splunk Add-on for Microsoft Azure: https://splunkbase.splunk.com/app/3757
Azure Security Center Alerts & Tasks

EDIT: There's also the Microsoft Defender Advanced Hunting Add-on for Splunk (https://splunkbase.splunk.com/app/5518), but the Splunk Add-on for Microsoft Security also seems to cover Advanced Hunting: https://docs.splunk.com/Documentation/AddOns/released/MSSecurity/Releasenotes#New_features
Hi all, I have a timestamp in a format I haven't dealt with before, and I am struggling to get it converted to my timezone using the offset. In raw event form it looks like this: "TimeGenerated": "2022-10-25T04:21:50.2975103Z". I have also attached a screenshot of how Splunk is indexing it. My second question is: how would I configure the sourcetype to have Splunk use the TimeGenerated field as _time automatically? I've attached a second screenshot with the sourcetype as well. Any help or links would be greatly appreciated!
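A hedged props.conf sketch for that timestamp format (the sourcetype name is a placeholder; the sample has 7 subsecond digits, hence %7N, and the trailing Z means the time is UTC, so TZ = UTC lets Splunk convert it to each user's display timezone):

```
[my_sourcetype]
TIME_PREFIX = "TimeGenerated":\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%7NZ
MAX_TIMESTAMP_LOOKAHEAD = 40
TZ = UTC
```

This would make Splunk locate the TimeGenerated value at index time and assign it to _time automatically.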
I'm not having any luck finding the functional differences between a lookup created in Splunk core (Settings > Lookups > Add new) that lives in the ES app context, and a managed lookup created from the Content Management page (ES > Configure > Content Management > Create New Content). I have created and experimented with both and I can't find any functional difference. The documentation describes how to create managed lookups, but I'm not finding anything on what the point is.
Hi Splunkers, I'm trying to extract some fields using the "Extract Fields" option under the log, with the regex method. In the "Select Fields" step, when I select a field that I would like to extract, it freezes for a couple of minutes and returns the following message: "The extraction failed. If you are extracting multiple fields, try removing one or more fields. Start with extractions that are embedded within longer text strings." But I'm not extracting multiple fields; it's just one field, and yet the error still appears. Here is the log sample I used:

2022-10-26T20:10:11+03:00 192.168.xxx.xxx TRP|No Caller ID received: Line: 8 Slot: 2 Port: 12

I was just trying to extract the "TRP". I have tried different ways to solve this: I tried the "I prefer to write the regular expression myself" option in the "Select Method" step, entered the regex, and hit "Preview", but it just got stuck. I tried other log samples with no luck. I tried a totally different log from a totally different index but ended up with the same error message. I even restarted Splunk, but no luck either! What am I missing here?
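As a fallback, a hedged manual extraction for that sample (the field name system is my own placeholder): skip the timestamp and IP, then capture everything up to the first pipe character:

```
... | rex field=_raw "^\S+\s+\S+\s+(?<system>[^|]+)\|"
```

Against the sample line this would capture TRP into the system field; it could also be saved as an EXTRACT in props.conf once it previews correctly.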
Hi, I have the below query:

index=Config source="Java/path/log.csv" inbound

CSV files are supposed to be delivered on an hourly basis, before 13 minutes past the hour (e.g. a file delivered at 12:12 is on time). I need to create an alert if any of the files are delivered after 13 minutes past the hour (e.g. 12:14 - create alert).
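A hedged sketch, assuming _time reflects when each file-delivery event arrived: compute the minute-of-hour and keep only late deliveries:

```
index=Config source="Java/path/log.csv" inbound
| eval minute=tonumber(strftime(_time, "%M"))
| where minute > 13
```

Scheduled hourly over the previous hour, any result is a late delivery, so the alert can trigger on "number of results > 0".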
Hi, we upgraded Splunk from 8.2.6 to 9.0.1 recently and have one big internal app (dashboards, views, field extractions) which is failing during the app readiness check. It shows that the app is using Python 2 and is not compatible with Python 3, and says to either uninstall it or update it. How do I update this internal app (this is not a Splunkbase app, builder output, or plugin)? Is there any documentation or steps anyone can provide?
Hello, we're standing up a Splunk HF on AWS via EC2. With a 50 GB/day ingest, what's the lowest vCPU/RAM we can configure, and would scaling it up later affect functionality? Thanks!