All Posts

I came across an identical thread: Re: How does Splunk calculate linecount? - Splunk Community
@livehybrid Yes, I am trying it outside the dashboard, in the search bar. I am not getting any error or result there either.
Hi Kiran, Yes, adding the outputcsv command fixed the issue. Thanks! David
Hi @David_M Did you use outputcsv, or some other method for exporting the CSV, such as the "Output results to lookup" alert action? As previously mentioned, the output path for outputcsv is $SPLUNK_HOME/var/run/splunk/csv; however, these files are not replicated across the cluster if you are running a SHC. If you're using outputcsv, can you confirm you aren't using dispatch=true? If you are, your job will be in $SPLUNK_HOME/var/run/splunk/dispatch/<job id>/csv.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
Hello, I had an additional question: what if I have to filter on two fields? For example, I do NOT want to send to the distant Splunk any data where dest_zone and src_zone contain DMZ-NETWORK and GUEST-NETWORK. My config is something like this:

transforms.conf:
[clone_only_dmz-network]
REGEX = ^((?!dmz-network).)*$
CLONE_SOURCETYPE = cloned:firewall_clone
DEST_KEY = _SYSLOG_ROUTING
# it's syslog logs
FORMAT = distant_splunk

props.conf:
[firewall_sourcetype]
TRANSFORMS-clone = clone_only_dmz-network

outputs.conf:
[syslog:distant_splunk]
server = ip of the distant HF

This is actually working, I tested it. But if I want to exclude on multiple fields, how can I achieve that?
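A minimal sketch of one way to cover both zones, assuming the zone names appear literally in the raw syslog text (the stanza name clone_only_internal is hypothetical): extend the negative lookahead with an alternation so the clone only fires when neither string is present.

[clone_only_internal]
REGEX = ^((?!dmz-network|guest-network).)*$
CLONE_SOURCETYPE = cloned:firewall_clone
DEST_KEY = _SYSLOG_ROUTING
FORMAT = distant_splunk

Add (?i) at the start of the pattern if the zone names vary in case. If the match must be anchored to the dest_zone/src_zone fields specifically, rather than anywhere in the event, the lookahead needs the field context, e.g. (?:dest_zone|src_zone)=[^,]*(?:dmz-network|guest-network), adjusted to the actual log layout.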
Thanks very much, this seems like something to point our administrators to directly. Can you comment on the problems reported by isoutamo below?
@David_M Verify that the reports are configured to generate CSV files. In Splunk Web, go to Settings > Searches, Reports, and Alerts, find your reports, and check their settings. You have two choices:
1) Schedule an alert that attaches the CSV, so you receive it via email.
2) Schedule a report that adds the outputcsv command at the end. This way, your report is saved as a CSV in a pre-defined folder (not changeable!).
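A minimal sketch of the second option (the search and the filename my_report are placeholders):

index=_internal earliest=-24h
| stats count BY sourcetype
| outputcsv my_report

Run as a scheduled report, this should write my_report.csv to $SPLUNK_HOME/var/run/splunk/csv on the search head that executed it; with dispatch=true it would land under the job's dispatch directory instead, as noted earlier in the thread.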
Hi Kiran, Well, I checked the directory mentioned in the posts and the files aren't there for some reason. David
@David_M By default, when a Splunk report generates a CSV file (e.g., using the outputcsv command or a scheduled report export), the files are saved in the $SPLUNK_HOME/var/run/splunk/csv directory on the search head where the report was executed. $SPLUNK_HOME is typically /opt/splunk on Linux systems, so the full path would be /opt/splunk/var/run/splunk/csv/. Navigate to this directory using a terminal:

cd /opt/splunk/var/run/splunk/csv
ls -l

Look for files with a .csv extension. The file names might correspond to the report name, search job ID, or a custom name specified in the report configuration. Please refer to https://docs.splunk.com/Documentation/SplunkCloud/latest/SearchReference/Outputcsv for more details, and, as highlighted by @gcusello: https://community.splunk.com/t5/Getting-Data-In/Is-there-anyway-to-generate-and-store-CSV-files-in-a-specific/td-p/465996
Hello, I set up 2 reports to run early this AM. Looks like both reports ran according to Splunk. The problem I have now is finding the actual .csv files on the Splunk server so I can scp them. Thanks!
Update: I suspect this might be a Splunk-related issue, possibly due to the version I'm currently using (9.3.1). I spun up a new server for quick testing and reused the same configuration parameters from my previous setup, mainly the props.conf, transforms.conf, and inputs.conf. Interestingly, everything seems to be working fine on the new server, even though the configuration is identical to the old one.

The only difference I can observe is in the data ingestion flow: on the new server, I initially ingested a set of JSON array entries in one format, and later ingested another set with a different structure containing more fields. So far, it all appears to be working as expected. However, replicating this exact process on the older server doesn't yield the same results, which is puzzling since both servers are using the same configuration files and setup.
Hi @hazardoom, 7k alerts aren't upgradable; in this case, the only way is to move them to the default folder. Ciao. Giuseppe
Hi @ganesanvc Can you please provide the full search you are trying? Did you try it outside the dashboard, in the search bar? If there is a problem with the search, you should be able to see it clearly there. Thanks
How do I manually copy 7k alerts? Isn't there a faster way, with a bash script or REST or something?
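A hedged sketch of the bulk move @gcusello describes, assuming the alerts all live in one app's local savedsearches.conf (the app name your_app is a placeholder; back up both files first):

# move every saved search/alert stanza from local to default in one step
cd $SPLUNK_HOME/etc/apps/your_app
cp local/savedsearches.conf default/savedsearches.conf

If default/savedsearches.conf already exists, merge the stanzas by hand instead of overwriting, then remove them from local and restart or reload Splunk.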
@livehybrid I am getting no results, or 0 records, for this.
Hello @Xiangning_Mao, is it possible to add a Status column as well?
The metadata command is not reliable compared to tstats; with tstats I get accurate lastTime field info.
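A minimal sketch of the tstats approach described here (the index filter is a placeholder):

| tstats latest(_time) AS lastTime WHERE index=* BY index sourcetype
| eval lastTime=strftime(lastTime, "%Y-%m-%d %H:%M:%S")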
Hi @Zhangyy This should give you approx 25 km in each direction, as you've explained:

| where lat>=35.5 AND lat<=36.0 AND lon>=139.5 AND lon<=140.0

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
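A rough worked check of that estimate, assuming about 111 km per degree of latitude and a box centred near 35.75N, 139.75E:

| eval lat_km=(36.0-35.5)/2*111
| eval lon_km=(140.0-139.5)/2*111*cos(35.75*pi()/180)

This comes out to roughly 27.8 km north-south and 22.5 km east-west per side, close to the 25 km target.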
Hi @livehybrid, I'm still unable to get the fields listed, even after updating the props.conf:

[preprocess_case]
TRANSFORMS-setsourcetype = sourcetype_router, sourcetype_router2
SHOULD_LINEMERGE=false
LINE_BREAKER=(\[)|(([\r\n]+)\s*{(?=\s*"attribute":\s*{))|(\])
TRUNCATE=100000
TIME_PREFIX="ClosedDate":\s*"

[too_small]
PREFIX_SOURCETYPE = false
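One hedged thing to check: TRANSFORMS-setsourcetype rewrites the sourcetype at index time, while automatic JSON field extraction happens at search time and is governed by the props of the sourcetype the events end up with. A minimal sketch, assuming the rewritten sourcetype is named my:router (hypothetical):

# props.conf, search-time settings for the rewritten sourcetype
[my:router]
KV_MODE = json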
Hi @LearningGuy If you want to collect the data into a "summary" index, you do not have to use the method that appends the "summaryindex" command if it doesn't do what you need. Instead, just create your search as you did with the collect command (with output mode set to HEC) and then schedule the report to run at the relevant interval. Check out Manually configure a report to populate a summary index in the summary indexing docs.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
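A minimal sketch of that pattern (the index names and the search itself are placeholders):

index=your_index earliest=-1h
| stats count BY host
| collect index=your_summary output_format=hec

Saved as a scheduled report on an hourly cadence, this should append each run's results to the summary index.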