All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hello dears, can I list search results with a stats count as an hourly trend? Example:
Hour: 00:00 EventCount: 10
Hour: 01:00 EventCount: 15
Hour: 02:00 EventCount: 23
...
Hour: 23:00 EventCount: 127
Regards.
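One way to produce that hourly trend (a sketch; `your_index` and `your_sourcetype` are placeholders for your actual data source):

```
index=your_index sourcetype=your_sourcetype
| bin _time span=1h
| stats count AS EventCount BY _time
| eval Hour=strftime(_time, "%H:%M")
| table Hour EventCount
```

`timechart span=1h count` gives a similar result with less typing; the `bin`/`stats` form is shown because it keeps `Hour` as a plain field you can format freely.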
Dears, I tried to install a private app in my Splunk Cloud developer account. I have a 14-day trial and my account is an admin. I referred to this link to install a private app: https://docs.splunk.com/Documentation/SplunkCloud/8.2.2201/Admin/PrivateApps#Install_private_apps_on_Splunk_Cloud_Platform. But I found there is no "Install app from file" button. Would you please tell me how I can install an app from file in my Splunk Cloud account? Thanks very much!
Hey folks, I have one set of applications that went through a version upgrade. Due to that upgrade, the path of the logs changed. As they were facing some issues with the sourcetype, they changed the sourcetype name as well. Is there a way to find which alerts are configured with this sourcetype/source, and change them to the new source/sourcetype, instead of opening all the alerts manually and checking where that sourcetype/source is used in the query? Thanks in advance,
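A sketch of how to find the affected alerts via the REST API (replace `old_sourcetype` with the actual old sourcetype name):

```
| rest /servicesNS/-/-/saved/searches splunk_server=local
| search search="*old_sourcetype*"
| table title eai:acl.app eai:acl.owner search
```

This lists every saved search (alerts included) whose SPL contains the string, along with its app and owner, so only those need editing. The edit itself still has to be done per search, either in the UI or by updating the saved search via REST.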
Hi Splunkers, I have defined a field as follows using an eval expression:

```
| eval body = "Sample Example :-" . " ---- " . " HOST INFORMATION: " . " ---- Source Network Address: " . src . " ---- Source Network Hostname: " . srcdns_hostname . " ---- " . " END "
```

which produces the result on a single line. Now, I would like to change the result into the format below. How can I achieve that?

Sample Example :-
HOST INFORMATION:
Source Network Address: 1.1.3.5
Source Network Hostname: ABCD.net
END
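One common trick for this (a sketch, assuming the ` ---- ` separator is used consistently): split the field on the delimiter into a multivalue field, which Splunk renders one value per line in table cells. Appended after the eval above:

```
| makemv delim=" ---- " body
```

Alternatively, `| rex mode=sed field=body "s/ ?-{4} ?/\n/g"` replaces each separator with a literal newline if you need `body` to stay a single value.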
I am trying to onboard logs from the Sage accounting software to Splunk; how do I go about it? I could not find any documentation, TA, or app on this. How do I get Sage logs into Splunk? Is there any TA or app for Sage?
Hi Team, I am getting a date and time value in the raw log as "Created_time=1649576166225", which we have to convert. Please help me convert it into a readable format for all logs. Do I need to make any changes at the input level, or anything else? Please help.
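That value looks like epoch time in milliseconds. A search-time conversion sketch (assuming `Created_time` is already extracted as a field):

```
| eval Created_time_readable=strftime(Created_time/1000, "%Y-%m-%d %H:%M:%S")
```

Dividing by 1000 converts milliseconds to seconds, which is what `strftime` expects. If instead you want Splunk to use this value as the event timestamp at index time, that is done in props.conf on the sourcetype with `TIME_PREFIX = Created_time\s*=` and `TIME_FORMAT = %s%3N` (the `%s%3N` format reads epoch seconds plus three digits of milliseconds); test against your actual raw events before deploying.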
I have created a lookup for a threat feed CSV file we are using. After deleting all the lookup CSV files and removing all the props.conf and transforms.conf entries for this lookup from the deployer, CM, SHs, and indexers, I still see an error.
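When an error like this persists, a copy of the configuration often survives in another app or in a local/ directory. A sketch for locating leftovers with btool (`your_lookup` is a placeholder for the actual lookup/transform name):

```
$SPLUNK_HOME/bin/splunk btool props list --debug | grep -i your_lookup
$SPLUNK_HOME/bin/splunk btool transforms list --debug | grep -i your_lookup
```

The `--debug` flag prints the file path each surviving stanza comes from, so run this on each tier (SHs, CM, indexers) and remove what it finds; a restart or debug/refresh may be needed afterwards.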
I need a regular expression to extract JSON from the message field; can someone help? After extracting, I want to parse the extracted JSON using the spath command.

{ [-] @timestamp: 2022-04-09T05:50:04.336Z @version: 1 file: test.log message: 2021-04-09 05:50:04.273+0000 INFO |RestAPI.adminsrvr | Request #5172: { "context": { "httpContextKey": 1111111111, "verbId": 2, "verb": "GET", "originalVerb": "GET", "protocol": "https", "parameters": { "uri": { "version": "v2" }}}} name: test no: 111111111111 }
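Based on the sample above, where the JSON payload starts after `Request #<n>:`, a sketch (`json_payload` is just an illustrative field name):

```
| rex field=message "Request #\d+:\s+(?<json_payload>\{.*\})"
| spath input=json_payload
```

The greedy `\{.*\}` grabs from the first `{` to the last `}` on the line, which works when there is exactly one JSON object per message; `spath input=` then expands it into fields such as `context.verb`.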
My company does not have a Windows Server with Splunk Enterprise, so I cannot use the Splunk Add-on for SCOM to ingest the data. I would like to use the database instead, but I don't know what data from which tables to send, as the add-on does. Can someone help?
I need a query to view disk encryption (DAR) of all my hosts, be it Bit Locker, LUKS, etc. index=* host=* | ??? Thank you in advance.
The SplunkWorks-built TA called Splunk Add-on for Cisco FireSIGHT says in its description that it is able to parse NGIPS logs. But upon inspection of the `props.conf`, it doesn't have a sourcetype for NGIPS. Which should I use? I tried `cisco:sourcefire`, but it's not working.
Hi All, I am using a JavaScript file to export Splunk data from a dashboard to a CSV file. The issue I am facing is: for a few records where strings are long, the data is breaking onto the next line. I want to wrap those long strings in `" "` to stop the data breaking onto the next line. Below is my code; could someone please help me get the expected result?

```javascript
$('#exportBtn').on('click', function (e) {
    var searchObj = mvc.components.getInstance("rrc_main");
    var myResults = searchObj.data('results', {
        output_mode: 'json_rows',
        count: 0
    });
    myResults.on("data", function () {
        if (myResults.hasData()) {
            var data = myResults.data().fields.toString().replace("Edit,", "");
            var rows = myResults.data().rows;
            $.each(rows, function (row) {
                data = data + "\n";
                for (var i = 0; i < 53; i++) {
                    if (rows[row][i] === "edit") {
                        continue;
                    }
                    if (rows[row][i] == null) {
                        data = data + "\"\",";
                    } else {
                        data = data + "\"" + rows[row][i].toString() + "\",";
                    }
                }
            });
        }
    });
});
```

Kindly help me wrap long strings in `" "` so the CSV reads in the proper format without breaking long strings onto the next line. Appreciate your help!

Thanks, ND
We have a requirement to upgrade MongoDB to version 4.2 or later. Can you please let me know the version of MongoDB used in Splunk 8.2.5? If it's not 4.2 or later, can you please let me know whether MongoDB can be upgraded separately? Will Splunk have any issues if a MongoDB upgrade is done?
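You can check the mongod version your own installation bundles directly (requires shell access to the Splunk server; `$SPLUNK_HOME` is your install path):

```
$SPLUNK_HOME/bin/splunk cmd mongod --version
```

Note that the KV store's mongod is bundled with and managed by Splunk; swapping in an externally upgraded MongoDB is not a supported configuration, so in practice the KV store version only moves when Splunk itself is upgraded.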
Here is a handy way to skim all the job results from "- Rule" and "- Gen" searches with ES to look for issues:

```
| rest splunk_server=local count=0 /servicesNS/-/-/search/jobs/
| where match(label, "Rule$|Gen$")
| table label, eai:acl.owner, eai:acl.app, isFailed, messages.warn, messages.fatal, messages.error
```
Currently I have a field holding a Julian date. I am trying to convert it using strftime, but I'm having issues. Date = 2022.091. Current query:

```
index = * | eval ConvertedDate = strftime(DATE,"%Y.%j") | table ConvertedDate
```

Ideally I would like to get output like 04/03/2022. Thank you, Marco
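The likely issue: `strftime` formats an epoch time into a string, while a string like `2022.091` first needs to be *parsed* with `strptime` (where `%j` means day-of-year). A sketch combining the two:

```
| eval ConvertedDate = strftime(strptime(DATE, "%Y.%j"), "%m/%d/%Y")
```

`strptime` turns `2022.091` into an epoch value for day 91 of 2022, and the outer `strftime` reformats that epoch into the month/day/year layout; adjust the output format string to taste.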
Hello, at the end of this search I would like to be able to retrieve the sum of Pb + Pb2 + Pb3, grouped by name and town.

```
index=abc sourcetype=toto | search rtt > 200 | stats avg(rtt) as rtt by name town | eval Pb=if(rtt>200,1,0) | search Pb > 0
| append [ search index=cde sourcetype=tutu | stats avg(logon) as logon by name town | eval Pb2=if(logon>300,1,0) | search Pb2 > 0 ]
| append [ search index=efg sourcetype=titi | stats dc(id) as id by name town | eval Pb3=if(id>2,1,0) | search Pb3 > 5 ]
```

Something like this:

```
| stats sum(Pb1 + Pb2 + Pb3) by name town
```

Could you help please?
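`stats sum()` takes a field, not an expression, so the usual pattern is to consolidate the appended rows first and then add the fields with `eval`. A sketch (assuming each appended result set carries only one of Pb/Pb2/Pb3, which `fillnull` then zeroes out on the other rows):

```
| fillnull value=0 Pb Pb2 Pb3
| stats sum(Pb) AS Pb sum(Pb2) AS Pb2 sum(Pb3) AS Pb3 BY name town
| eval PbTotal = Pb + Pb2 + Pb3
```

Appended to the end of the search above, this collapses the rows from the three searches into one row per name/town, with `PbTotal` as the combined count.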
Hi, could you please help me with using REX/REGEX inside eval? Here is what I'm trying to do:

```
| makeresults
| eval User="user1=test@domain.com | use1=test1"
| makemv delim="|" User
| mvexpand User
| fields - _time
| eval signature="87347,123,1,0,84"
| makemv signature delim=","
| mvexpand signature
| eval account=if(like(signature,"87347") AND like(User,"%@%"), "REGEX USER TO KEEP EVERYTHING BEFORE @", "DON'T MAKE ANY CHANGES, KEEP THE USER WITH @")
```

Thanks
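`rex` is a command rather than an eval function, but eval has `replace()` (regex-based substitution) and `match()` (regex test). A sketch of the branch above using `replace()` to strip everything from the `@` onward:

```
| eval account=if(like(signature,"87347") AND like(User,"%@%"), replace(User, "@.*$", ""), User)
```

`replace(User, "@.*$", "")` keeps everything before the `@`, and the else branch returns `User` unchanged.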
Hi, hope someone can help here. TA-ms-loganalytics has suddenly stopped working; I can see the below type of errors being logged about the modular inputs:

ERROR ModularInputs - Unable to initialize modular input "log_analytics" defined inside the app "TA-ms-loganalytics": Introspecting scheme=log_analytics: script running failed (killed by signal 9: Killed). raise ConnectionError(e, request=request)\nConnectionError: HTTPSConnectionPool(host='127.0.0.1', port=8089): Max retries exceeded with url: /servicesNS/nobody/TA-ms-loganalytics/data/inputs/log_analytics?count=0&output_mode=json (Caused by NewConnectionError('<solnlib.packages.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f118318e610>: Failed to establish a new connection: [Errno 111] Connection refused',))\n"

Any help much appreciated.
Hi All, I'm getting a very frequent alert for one of my search peers from the DMC even though the search head is up and working fine. I observed the below error in the logs:

ERROR ExecProcessor - message from "/xx/xx" Socket error communicating with splunkd (error=The read operation timed out)

Can you please help me with this issue? Thanks in advance.
04-07-2022 14:25:12.529 -0400 ERROR TailReader - Ran out of data while looking for end of header
04-07-2022 14:25:12.529 -0400 WARN AggregatorMiningProcessor - Breaking event because limit of 256 has been exceeded - data_source="/logs/gui/adcguidev5_ms11/adcguidev5_ms11.log", data_host="xxxxxxxxxxx", data_sourcetype="ADC-MSLOG"

We are getting a number of these errors and would really like to clear them up. I am getting "Ran out of data while looking for end of header". The log is a WebLogic managed server log.
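The "limit of 256" warning usually means line merging hit the default `MAX_EVENTS` cap because the sourcetype has no explicit line-breaking rules, and the "end of header" error often indicates the sourcetype is configured with header-based/structured parsing (e.g. `INDEXED_EXTRACTIONS`) that doesn't match a plain log file. A props.conf sketch for the indexers/heavy forwarder, **assuming** each event in the WebLogic log begins with a `YYYY-MM-DD` style timestamp (verify against your actual log format before deploying):

```
[ADC-MSLOG]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2}
MAX_EVENTS = 512
TRUNCATE = 100000
```

Explicit `LINE_BREAKER` with `SHOULD_LINEMERGE = false` lets multi-line events (stack traces) attach to the preceding timestamped line without the aggregator's merge limit kicking in; if the stanza currently sets `INDEXED_EXTRACTIONS` or `CHECK_FOR_HEADER`, removing those should clear the header error.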