All Posts



Hello @Stives, what does your input stanza look like? If no crcSalt is specified in the stanza, Splunk looks at the first bytes of a file (256 by default) and determines from their CRC whether it already knows the file. If the first bytes of the CSV files are always the same, you could change your input stanza and add      crcSalt = <SOURCE>     See the monitor stanza docs for a deeper look into crcSalt:  https://docs.splunk.com/Documentation/Splunk/9.1.2/Admin/Inputsconf#MONITOR: But be cautious: this tells Splunk to include the full path when determining whether a file has already been indexed, so there is a possibility that you index the same file twice, especially for directories with rolling log files. Another possibility could be that the dates are outside the retention time scope (the files were indexed once, but were removed again by retention once their bucket aged out).
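For illustration, a minimal monitor stanza with crcSalt might look like this (the path and sourcetype are placeholders, not from the original post):

```
[monitor:///var/data/reports/*.csv]
sourcetype = csv
crcSalt = <SOURCE>
```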
Hi @gcusello, thanks for the reply. I have one concern: in the multiselect dropdown, the selected values will be comma-separated, e.g. a,b,c or b,c,a. Will this logic still work in such conditions?
What is pai?
Hi, there are a lot of clients in my architecture, and each Splunk instance is deployed in either /opt/bank/splunk, /opt/insurance/splunk, or /opt/splunk. Hence I want to run a command to extract the list of all clients along with the path where splunkd is running. How can I achieve this? Please suggest.
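One possible approach, sketched here for a Linux host where you can run ps. The ps output below is simulated with a variable so the pipeline is easy to verify; the paths shown are hypothetical examples:

```shell
# Extract the install path of every running splunkd from ps output.
# In practice: ps_output=$(ps -eo args= | grep '[s]plunkd')
ps_output="/opt/bank/splunk/bin/splunkd -p 8089 start
/opt/insurance/splunk/bin/splunkd -p 8089 start
/opt/splunk/bin/splunkd -p 8089 start"

# Strip everything from /bin/splunkd onward, leaving SPLUNK_HOME; dedupe.
echo "$ps_output" | sed -n 's|^\(.*\)/bin/splunkd .*|\1|p' | sort -u
```

To run this across many clients, the same pipeline could be wrapped in a scripted input or pushed out via your deployment server, but that part depends on your environment.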
Hello, I'm trying to resolve a monitoring issue with the .csv files available in a specific directory. There are several files marked with different dates, e.g. 2023-11-16_filename.csv or 2023-11-20_filename.csv; for this reason, no two of them start with the same date. I'm able to sync most of the files with the server, but there are some I'm not. For example, my indexing started on 02.10.23, and all files with that date or later are available as a source, but the files from before that date, e.g. 2023-09-15_filename.csv, are not. What could cause this behaviour, and is there a way to make files available as a source even when they are marked with a date before 02.10.2023? Thanks
I have an inputlookup table, and in this lookup table there is a JSON array called "Evidence". There are two fields I would like to extract: "Rule" and "Criticality". An example Evidence array looks like this:

{"Evidence":[{"Rule":"Observed in the Wild Telemetry","Criticality":1},{"Rule":"Recent DDoS","Criticality":3}]}

So if I eval both "Rule" and "Criticality" as shown below:

| eval "Rule"=spath(Evidence, "Evidence{}.Rule")
| eval "Criticality"=spath(Evidence, "Evidence{}.Criticality")
| table Rule Criticality

the output looks like this, but the Rule and Criticality columns don't separate into different rows (it is all in one row):

Rule                                          Criticality
Observed in the Wild Telemetry Recent DDoS    1 3

Now the tricky part: I would like to display the top count of Rule (top Rule limit=10), but at the same time display the Criticality associated with each Rule. How do I do that, since the above does not separate into different rows? The final output I am looking for would look like this:

Rule                             Criticality   Count
Observed in the Wild Telemetry   1             50
Recent DDoS                      3             2

An alternative I was thinking of was using foreach and then concatenating into a combined field, but I think that is kind of complex.
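One way to sketch this in SPL is to zip the two multivalue fields together before expanding, so each Rule stays paired with its Criticality (untested; the lookup name evidence_lookup is a placeholder):

```
| inputlookup evidence_lookup
| eval pair=mvzip(spath(Evidence, "Evidence{}.Rule"), spath(Evidence, "Evidence{}.Criticality"), "|")
| mvexpand pair
| eval Rule=mvindex(split(pair, "|"), 0), Criticality=mvindex(split(pair, "|"), 1)
| stats count as Count by Rule Criticality
| sort - Count
| head 10
```

The mvzip/mvexpand pattern avoids the problem of expanding two multivalue fields independently, which would otherwise produce a cross product.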
Does the AppDynamics Machine Agent support Windows 10? I can see the "Machine agent started" message, and under Servers I can see the processes running on the system where my Machine Agent is hosted, along with their PIDs. However, I am not able to get %CPU, disk, or memory related metrics. When I try to access the same from the Metric Browser, it says there is no data to display. Please suggest.
I am curious to know about a couple of things related to fetching S3 logs. Is there any limitation on the number of inputs we can create in the AWS add-on? Is there any limitation on the indexes to which we log the S3 data?
Yes, there was an error with the endpoint. I checked the error via the query below: index=_internal sourcetype=aws:s3:log ERROR
Hello, I have this search:

index="report"
| stats count(Category__Names_of_Patches) as totalNumberOfPatches by Computer_Name
| eval exposure_level = case(
    totalNumberOfPatches >= 1 AND totalNumberOfPatches <= 5, "Low Exposure",
    totalNumberOfPatches >= 6 AND totalNumberOfPatches <= 9, "Medium Exposure",
    totalNumberOfPatches >= 10, "High Exposure",
    totalNumberOfPatches == 0, "Compliant",
    1=1, "<not reported>")

I want to create a pie chart of the exposure_level values and color each slice of the pie differently. How can I do it? Thanks
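In a Simple XML dashboard, slice colors can be mapped per value with the charting.fieldColors option. A sketch (the hex values are arbitrary choices, and the query is abbreviated):

```
<chart>
  <search>
    <query>index="report" ... | stats count by exposure_level</query>
  </search>
  <option name="charting.chart">pie</option>
  <option name="charting.fieldColors">{"Low Exposure":0x65A637,"Medium Exposure":0xF8BE34,"High Exposure":0xD93F3C,"Compliant":0x6DB7C6,"&lt;not reported&gt;":0x999999}</option>
</chart>
```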
@splunk Is there any solution for this?
That's another question, completely unrelated to the original issue. See my response in this thread https://community.splunk.com/t5/Deployment-Architecture/What-are-the-best-practices-for-creating-and-distributing-apps/m-p/668990 for managing apps.
You can do tstats from a datamodel; that's not so minimal. You simply change your initial search to

| tstats [...] by Authentication.user _time span=1d

That's what gives you data you can work with further.

| `drop_dm_object_name("Authentication")`

This doesn't hurt. I admit the next step is a bit advanced. I won't give you a complete solution yet, but I will point you in the right direction. You need streamstats to carry over when the last occurrence of a given user was in those statistics:

| streamstats current=f last(_time) as last by user

This gives you the last time the given user was included, along with your "current" occurrence. Now you can check whether the difference is just one day or more, which tells you whether the streak was continuous or not. That's for a start.
Hello, after adding that property I now get this error:

The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "Certificates do not conform to algorithm constraints". ClientConnectionId:f6d04b57-f4f7-4bc2-93ca-d3ac59ad7b4b

Splunk: 9.0.4.1
Db Connect: 3.14.1
Java: openjdk 17.0.8.1

Funny thing: I don't have that problem on another server.

Splunk: 9.0.4.1
DB Connect: 3.12.2
Java: openjdk 17.0.7
I think you are looking for a stepped line graph? You could do something like this (I changed the second instance of 6/11 to 7/11 to show the changes separately; I also added another 12/11 entry to show the impact of having two values for the same date where one is negative and the other positive):

| makeresults format=csv data="date,change
25/10/2023,6000
31/10/2023,0
6/11/2023,2500
7/11/2023,500
12/11/2023,-7800
12/11/2023,800
16/11/2023,500"
| eval _time=strptime(date,"%d/%m/%Y")
| streamstats sum(change) as total
| autoregress total
| eval row=mvrange(0,2)
| mvexpand row
| eval total=if(row=0,total_p1,total)
| table _time total

Note that this chart only works well with _time on the x-axis; other scales, not so well.
Hello, I am a Masterschool student trying to install Splunk on my VM, and it doesn't work. Can anyone help? Thank you.
Hello, I have the same problem. Can anyone help?
Please check the AWS S3 logs on the Splunk end; it may be due to a permission issue on the AWS end. Once you go through the logs you will have clear visibility. The logs will be under /opt/splunk/var/log/splunk; search for aws.
I am using Splunk fully on-prem, with no cloud option. As per the documentation, the TA is to be installed on the UF; you can refer to the link below: https://community.splunk.com/t5/Security/Universal-Forwarder-Technology-Add-On/m-p/669359#M17403 As I understand it, the TA is to be installed on the indexers (already done) and on the UF. Thanks
There is no good way to do this. All you can do is work around it with an array, like

{
  "visualizations": {
    "viz_OQMhku6K": {
      "type": "splunk.ellipse",
      "_comment": [
        "==================================",
        "This is created by Person1 on 1/1/2023 @companyb",
        "On 2/1/2023 - added base search",
        "On 2/5/203 - added dropdown box"
      ]
    }
  },