All Posts

No one here in the Community knows the answer to that, and Splunk's policy is not to attach dates to future features. We'll know about it when it's released. Check https://ideas.splunk.com to see if others are asking for the same thing, and vote for it.
The prerequisites indicate that the Splunk DB Connect extension will not work with FIPS-compliant systems. Will this change in a future release, and if so, is there a timeframe for that release?
You are right, the problem is in the add-on linked to the previous sourcetype. Thanks for your suggestions; I now have all the data I need to perform the analysis, and I'm going to get started on it.
I'm curious about why a sourcetype can no longer be used. Sourcetypes never expire. Perhaps it's an add-on that can't be used? The inputs.conf file to check is the one that references the file or directory we're talking about. Use btool to find it: splunk btool --debug inputs list | grep "<<CSV file or directory name>>" Have you checked the logs? Have you tried the search I suggested? Have you tried looking in other indexes?
Hi @richgalloway, thanks for your answer. I can share some other bits with you. Previously we used another sourcetype provided by a Splunk-supported add-on, which can no longer be used after a check with Support. Even with some problems, data was sent to the cloud while using it, so the HF has the right permissions to read the pulled CSV files. I tested the custom add-on in a local test environment, and there all data is extracted correctly, even the timestamp. I thought about the inputs.conf file, but I'm not sure which one I have to analyze: the one in SPLUNK_HOME/etc/system/local? The one in SPLUNK_HOME/etc/system/default? Others?
Searches are in the audit log. Saved searches will have a non-empty value in the savedsearch_name field. The user name is in the user field. index=_audit action=search | table user savedsearch_name search
You have the right steps, but perhaps something in the details is amiss. Verify the inputs.conf stanza points to the correct file/directory. Verify the file permissions allow reading by the HF. Check the splunkd.log files on the HF to see if any messages explain why the file is not uploaded. Confirm the CSV file has a timestamp for each event and that the timestamps are correctly extracted. Timestamps that are in the future or too far in the past will not be found by Splunk. Try searching a wide time range to see if the data has bad timestamps: index=web earliest=0 latest=+10y
Check out the timewrap command.
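A minimal, untested sketch of timewrap for comparing the same time window across two weeks (the index name is a placeholder for illustration):

```
index=your_index earliest=-14d@d latest=now
| timechart span=30m count
| timewrap 1w
```

timewrap overlays each week's series as a separate column (fields named along the lines of latest_week / 1week_before), so the Sunday 15:00-16:30 window can be compared directly against the same slot a week earlier.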
Untested, but try the chart command. | eval weeknum=strftime(strptime(yourdatefield,"%d-%m-%Y"),"%V") | chart dc(Task_num) as Tasks over weeknum by STATUS  
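As a quick sanity check outside Splunk, the same "%V" week-number conversion can be reproduced in Python (a sketch only; `week_of` is a helper name invented here):

```python
from datetime import datetime

def week_of(datestr: str) -> str:
    """ISO week number (01-53) for a 'dd-mm-YYYY' string, mirroring
    the SPL eval strftime(strptime(yourdatefield, "%d-%m-%Y"), "%V")."""
    return f"{datetime.strptime(datestr, '%d-%m-%Y').isocalendar()[1]:02d}"

print(week_of("11-09-2023"))  # 11 Sep 2023 falls in ISO week 37
```

Note that ISO weeks start on Monday and the first days of January can belong to the last week of the previous year, which is exactly the behavior "%V" gives in SPL.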
This depends on your use case and your environment. If you have Splunk Cloud in use, you can try Splunk Edge Processor; that is probably the easiest way to do it. Without Splunk Cloud you can try INGEST_EVAL or the "old way" with props.conf and transforms.conf. More about this: Field extraction configuration https://docs.splunk.com/Documentation/Splunk/9.1.1/Data/Configureindex-timefieldextraction Are you absolutely sure that you want to extract those fields at index time rather than search time?
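For the props.conf/transforms.conf route, here is a minimal sketch of an index-time extraction (the sourcetype name, field name, and regex are all invented for illustration):

```
# props.conf
[my:custom:sourcetype]
TRANSFORMS-extractuser = extract_user_indexed

# transforms.conf
[extract_user_indexed]
REGEX = user=(\w+)
FORMAT = user::$1
WRITE_META = true
```

WRITE_META = true is what makes the extraction happen at index time; you would also typically declare the field with INDEXED = true in fields.conf so that searches like user=foo use the indexed field correctly.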
Can we show them in one row? For example: 223,229 (45%). It would look much better.
Hi, thanks. Not so pretty, but good enough as a workaround. Do you know how I can add "%" to each value? Current query: | stats sum(CountEvents) by CT | rename "sum(CountEvents)" as "CountEvents" | eventstats sum(CountEvents) as Total | eval percentages%=round(CountEvents*100/Total,2) | fields - Total
Thank you for your answer. How can I specify a regular expression at ingestion time, in the "add data" wizard?
Hi Everyone, I want to plot a chart by calendar week. I plotted a timechart like this: |timechart span=7d distinct_count(Task_num) as Tasks by STATUS But this doesn't give the exact calendar weeks. I am also keeping this chart's data to the last 3 months. Does anyone have an idea how to plot a bar chart based on calendar weeks? Instead of dates I want to see the data for the calendar weeks of the last 3 months. I learned from the Splunk Community how to get the calendar week, but I am not able to plot a graph out of it. | eval weeknum=strftime(strptime(yourdatefield,"%d-%m-%Y"),"%V")
Hello, I'm trying to create a timechart that compares two date/time ranges. I want to see the values for last Sunday (10.9) between 15:00-16:30 and compare them with the values for the same time on the Sunday a week earlier (3.9). How can I do it? Thanks
Hi Team, I am looking for help creating a search query for my daily report, which runs 3 times a day. We put files in a directory that we monitor in Splunk. Is there any way we can grab events from only the latest source file? For example: Index=abc sourcetype=xyz source=/opt/app/file1_09092023.csv source=/opt/app/file2_09102023.csv source=/opt/app/file3_09112023.csv..... New files can be placed from time to time. I want the report to show only events from the latest file; is that possible? Thank you
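One possible (untested) sketch, reusing the index and sourcetype names from the question above: let a subsearch find the most recently indexed source and restrict the outer search to it.

```
index=abc sourcetype=xyz
    [ search index=abc sourcetype=xyz | stats latest(source) as source ]
```

Note that stats latest(source) returns the source of the most recently indexed event; if "latest" should instead mean the newest date embedded in the filename, the subsearch would need to parse and sort on that date.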
Had a hard time debugging that one. It only works if your SPL code with the subquery return is in a dashboard "base search".
<dashboard>
  <label>My dashboard title</label>
  <search id="parent_search_1">
    <query>``` put your query here with your subquery return $ ```</query>
  </search>
  <row>
    <panel>
      <table>
        <title>My child visualization</title>
        <search base="parent_search_1">
          <query>``` have the rest of your query there ```</query>
        </search>
      </table>
    </panel>
  </row>
</dashboard>
Hi, I'm not sure if I understand your need, but maybe this helps? | stats sum(CountEvents) as countE by CT | eventstats sum(countE) as Total | eval perc=round(countE*100/Total,2) | chart sum(perc) as "EventsPercentages[%]" values(countE) as countE over CT Then in the visualisation tab select Format -> Chart Overlay -> Overlay (e.g. your % field) and View as Axis = On r. Ismo
Hi Splunkers, I have to forward data inside CSV files from an on-prem HF to Splunk Cloud and I'm facing some issues, because data seems not to be forwarded. Let me share some additional bits with you.

Info about data
- Source data is on a cloud instance (Forcepoint) provided by the vendor
- A script has been provided by the vendor to pull data from the cloud
- The script is installed and configured on our Splunk HF
- Data is saved locally on the HF
- Data is in .csv files

Info about HF configuration
- We created a new data input under Settings -> Data inputs -> Local inputs -> Files & Directories
- We set as data input the path where the .csv files are saved after script execution
- We set the proper sourcetype and index
- Of course, we configured the HF to send data to Splunk Cloud. We downloaded the file from the cloud, from the "Universal Forwarder" app, and installed it as an app on the HF: outputs.conf is properly configured, and other data is sent to Splunk Cloud without problems (for example, Network inputs go to the Cloud without issues; same for Windows ones)

Info about sourcetype and index and their deployment
- We created a custom add-on that simply provides the sourcetype "forcepoint"
- The sourcetype is configured to extract data from CSV; that means we set the parameter INDEXED_EXTRACTIONS = csv
- We installed the add-on on both the HF and Splunk Cloud
- The index, simply called "web", has been created on both the HF and Splunk Cloud

Anyway, it seems that data is not sent from the HF to the Cloud. So, did I forget some steps, or did I get some of the above wrong?
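For reference, a minimal sketch of what the custom add-on's props.conf could contain (the timestamp settings are placeholders to be adjusted to the actual CSV header; since INDEXED_EXTRACTIONS is applied where the file is read, this stanza needs to be effective on the HF):

```
# props.conf in the custom add-on, deployed to the HF
[forcepoint]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = <timestamp column name>
TIME_FORMAT = <format of that column>
```

If the timestamp column or format is wrong, events can be indexed with bad timestamps and appear "missing" in searches over normal time ranges.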
Hi, too many hours to solve such a simple question... it is supposed to be a basic thing. I want to present both percentages and regular values in a bar chart (it can be in the tooltip, like in a pie chart). If that's not possible, present only percentages but add the "%" symbol (when I tried to add %, it converted the fields to strings and nothing was shown in the chart). * I can't add a JS script; I have no access to the server. This is my query: | stats sum(CountEvents) by CT | rename "sum(CountEvents)" as "countE" | eventstats sum(countE) as Total | eval perc=round(countE*100/Total,2) | chart sum(perc) as "EventsPercentages[%]" over CT thanks a lot