All Posts

Hi, I want to know if it is possible to show the number of impacted records in the last 15 minutes for the search below.

Query: index=events_prod_tio_omnibus_esa ("SESE023" OR "SESE020" OR "SESE030")

Result:

Requirement: for the above search, if the search is executed at:
11:30 ==> it will show 0 records
11:40 ==> it will show 2 records (the last event, raised at 11:37:14, has 2 records, and current time - event time < 15 mins)
11:50 ==> it will show 2 records (same event, still within 15 mins)
11:55 ==> it will show 0 records (the last event has 2 records, but current time - event time > 15 mins)
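One possible approach (a sketch, not a confirmed solution, assuming each impacted record is a separate event): restricting the search window to the last 15 minutes makes the count fall back to 0 once the latest event ages out, which matches the 11:30/11:40/11:55 behaviour described above.

```
index=events_prod_tio_omnibus_esa ("SESE023" OR "SESE020" OR "SESE030") earliest=-15m
| stats count AS impacted_records
```

If "records" is a field inside the last event rather than an event count, the stats line would need to sum that field instead of counting events.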
Hi Community, I'm working on a scripted input. I have created a script to convert binary logs into human-readable format, and it is working fine. The issue is that the file I'm monitoring is in the "/var/log/test" directory, while the script is at "/opt/splunk/etc/apps/testedScript/bin/testedscript.sh". I'm getting the script path as the source in Splunk (screenshot attached as reference). Below is the inputs.conf stanza I'm using (/opt/splunk/etc/apps/testScript/local/inputs.conf):

[script:///opt/splunk/etc/apps/testScript/bin/testedScript.sh]
disabled=false
index=testing
interval=30
sourcetype=free2

Is there any way I can get the exact source path, which in my case is "/var/log/test/file1"?
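If the goal is only to make the source field reflect the monitored file rather than the script path, inputs.conf supports a source setting that overrides the default. A sketch (the path /var/log/test/file1 is taken from the question):

```
[script:///opt/splunk/etc/apps/testScript/bin/testedScript.sh]
disabled = false
index = testing
interval = 30
sourcetype = free2
# override the default source (the script path) with the monitored file's path
source = /var/log/test/file1
```

Note this sets one static source for the whole input; if the script reads several files, the source cannot vary per event this way.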
Hi @Jit06, have you tried show.splunk.com? Ciao. Giuseppe
Hi @Dayalss, the Qualys Add-On for Splunk is very useful for ingesting and parsing Qualys data, but it doesn't contain dashboards to display the data. For that requirement, look for another app on Splunkbase (apps.splunk.com); I don't know which is the most suitable for your needs. You can use those dashboards as they are, or as a starting point for your own custom dashboards. Ciao. Giuseppe
I just received a mail stating that after June 14 we won't even be able to view past support tickets. I see this as a blocker for learning, because whenever I face an issue, I refer to past tickets and learn from them before actually creating a ticket. Past tickets could be made available at least as HTML to view. Kindly let me know if there are any such plans.
Hi, I have ingested the Qualys data using the Qualys TA add-on and enabled the inputs to run once every 24 hours. I'm ingesting the host detection and knowledge logs into Splunk. The requirement is to create a dashboard with multiple multiselect filters and do the enrichment from our database. But I found that the data in Qualys is different from the Splunk logs, and the input is ingesting only a certain amount of data.

My ask is that I want to ingest the complete data every time the input runs, so that I get accurate data to use in the dashboards. Please help me.

Regards, Dayal
Hi, we are looking for migration guidance from Exabeam to Splunk. Is there a way to migrate data from the Exabeam data lake to Splunk? Also, is there any documentation or guidance available for Exabeam customers migrating to Splunk? Please let me know. Thanks. Guru
Hi, I can't think of any app that monitors user folder sizes, but it wouldn't be that hard to set up. Possible high-level steps:

- Determine your OS: is it Windows or Linux?
- Based on the OS, use standard Linux commands plus a bash script to measure user folder sizes on a regular basis and output that data, with a timestamp, to a text log file; on Windows, do the same with a PowerShell script.
- Have a Splunk UF monitor the log file at the desired interval via inputs.conf and props.conf.
- Once the data is in an index, set up thresholds and alerts.

Yes, a bit of homework and scripting, but that's the flexibility of Splunk, and it's not that hard to do. Plus, you would have created your own private TA.
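The Linux half of the steps above can be sketched as follows. The paths, field names, and log filename are my own assumptions, not anything the original poster specified; the demo runs against a throwaway directory instead of a real /home.

```shell
#!/bin/sh
# Sketch: measure each user folder under a base directory and append a
# timestamped key=value line that a Splunk UF could monitor via inputs.conf.
log_folder_sizes() {
    base="$1"
    log="$2"
    for dir in "$base"/*/; do
        [ -d "$dir" ] || continue
        # size in KB of this folder and everything under it
        size_kb=$(du -sk "$dir" | cut -f1)
        # key=value pairs are auto-extracted by Splunk at search time
        printf '%s user_dir="%s" size_kb=%s\n' \
            "$(date '+%Y-%m-%d %H:%M:%S')" "$dir" "$size_kb" >> "$log"
    done
}

# Demo against a temporary directory rather than a real fileshare
demo_base=$(mktemp -d)
mkdir "$demo_base/alice" "$demo_base/bob"
log_folder_sizes "$demo_base" "sizes.log"
cat sizes.log
```

On the Splunk side, a [monitor://...] stanza on the UF pointing at the log file, plus an alert on size_kb crossing a threshold, would complete the picture.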
Thanks. I've provided the query I am trying to use.
index=mulesoft applicationName=test
| stats values(content.payload.requestID) as Request1 values(content.payload.impConReqId) as ImpConReqId values(content.payload.batchId) as batch1 values(content.payload{}.batchId) as batch2 values(content.payload{}.impConReqId) as impConReqId1 values(content.payload.OutputParameters.X_REQUEST_ID) as Request2 BY applicationName, correlationId
| eval ImpConReqID=coalesce(ImpConReqId,impConReqId1)
| eval RequestId=coalesce(Request1,Request2)
| eval batchId=coalesce(batch1,batch2)
| eval ImpCon=mvmap(ImpConReqID,if(match(ImpConReqID,".+"),"ImpConReqID: ".ImpConReqID,null()))
| eval batch=mvmap(batchId,if(match(batchId,".+"),"batchId: ".batchId,null()))
| eval ReqId=mvmap(RequestId,if(match(RequestId,".+"),"RequestId: ".RequestId,null()))
| eval oracle=mvappend(ImpCon,batch,ReqId)
| eval orcaleid=mvfilter(isnotnull(oracle))
| eval OracleResponse=mvjoin(orcaleid," ")
| rename applicationName as ApplicationName correlationId as CorrelationId
| table ApplicationName OracleResponse CorrelationId

This is the query I am using to get batchID, requestID, and ImpconID: if a field has a value, I need to show it in the table, keyed by correlationID. Right now I am getting the values properly, but in some scenarios a particular correlationID has two or three ImpconIDs with values and some with null values, and I want to filter the null-valued ImpconIDs out of the table.
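One possible cause (an assumption, since the underlying data isn't shown): the "null" values may actually be empty strings, which isnotnull() does not remove. A minimal sketch with made-up values (A123, B456 are illustrative only) showing a stricter filter:

```
| makeresults
| eval oracle=mvappend("ImpConReqID: A123", "", "batchId: B456")
| eval orcaleid=mvfilter(isnotnull(oracle) AND oracle!="")
| eval OracleResponse=mvjoin(orcaleid," ")
```

The same mvfilter condition could be dropped into the query above in place of the isnotnull()-only check.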
I am having the issue on Windows clients. Since the group isn't on the Domain Controllers, shouldn't Splunk install on the clients anyway? If I don't use my AD user to run the service, I am able to install Splunk from the GPO: the installer creates a user and puts it in NT Service. The NT Service\splunk-user is not added to any of the required groups, so I do that manually.
Good morning. Does anyone currently use Splunk, or an app in Splunk, to monitor folder size? We have been asked to set up new fileshare folders for various teams, and as our storage resources are nearly exhausted, we'd like to monitor each user's folder size. The ideal scenario would be a size threshold on each folder: when a folder is near capacity, an alert would trigger and the IT team would take action.

Kind regards, Paula
Do you mean that when you actively zoom a single panel, that same zoom should apply to the other panels? I don't believe there is any way for the dashboard to get feedback when you zoom a single map, so you can't set tokens that could be used by the other panels.
I am able to zoom on each panel, but when I refresh the dashboard it shows . Do you know the reason?
What do you mean, it's showing null values? Your mvmap statement looks like it's doing what you want it to do, i.e. making sure that it only keeps values with at least 1 character. Can you demonstrate the issue? The mvmap statement works; this example shows that it removes the empty middle element:

| makeresults
| fields - _time
| eval ImpConReqID=mvappend("a","","b")
| eval ImpCon=mvmap(ImpConReqID,if(match(ImpConReqID,".+"),"ImpConReqID: ".ImpConReqID, null()))
| eval base_elements=mvcount(ImpConReqID)
| eval reduced_elements=mvcount(ImpCon)

What is the relevance of the second two lines of your example to your question?
We have the same problem here. The “Performance Monitor Users” group does not exist on a domain controller. Accordingly, the domain account for the forwarder cannot be added.
Hi @gcusello, I used your solution and it worked. I now only have to fix the bytes, as they don't show up, but I will try to solve it myself :D. Thanks!
Referring to a previous question (Solved: How to insert hyperlink to the values of a column ... - Splunk Community): how can I add 2 different URLs for 2 different columns in the table, such that the respective hyperlink opens only when the value in the respective column is clicked?

"eventHandlers": [
    {
        "type": "drilldown.customUrl",
        "options": {
            "url": "$row.firstLink.value$",
            "newTab": true
        }
    },
    {
        "type": "drilldown.customUrl",
        "options": {
            "url": "$row.secondLink.value$",
            "newTab": true
        }
    }
]
Hi @vstan, check whether the User field is present in all events (field names are case-sensitive!); if not, add to the coalesce command all the fields containing User values, to use as the correlation key. Then check the exact field names of TOTAL_ATTACHMENT_SIZE_SEGMENT and EMAIL_ADDRESS. Ciao. Giuseppe

P.S.: Karma Points are appreciated