All Posts


@David_M  Verify that the reports are configured to generate CSV files. In Splunk Web, go to Settings > Searches, Reports, and Alerts, find your reports, and check their settings. You have two choices: 1) schedule an alert with the CSV added as an attachment, so you receive the CSV via email; 2) schedule a report with the outputcsv command added at the end. In this way, you save your report as CSV in a predefined folder (not changeable!).
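For option 2, a minimal sketch of what the scheduled report's search could look like, with a hypothetical index and file name (outputcsv writes to $SPLUNK_HOME/var/run/splunk/csv on the search head):

index=main sourcetype=access_combined
| stats count by status
``` writes my_report_results.csv under $SPLUNK_HOME/var/run/splunk/csv ```
| outputcsv my_report_results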
Hi Kiran, Well, I checked the directory mentioned in the posts, and the files aren't there for some reason. David
@David_M  By default, when a Splunk report generates a CSV file (e.g., using the outputcsv command or a scheduled report export), the files are saved in the $SPLUNK_HOME/var/run/splunk/csv directory on the search head where the report was executed. $SPLUNK_HOME is typically /opt/splunk on Linux systems, so the full path would be /opt/splunk/var/run/splunk/csv/. Navigate to this directory in a terminal:

cd /opt/splunk/var/run/splunk/csv
ls -l

Look for files with a .csv extension. The file names might correspond to the report name, the search job ID, or a custom name specified in the report configuration. See https://docs.splunk.com/Documentation/SplunkCloud/latest/SearchReference/Outputcsv for more details, and this thread highlighted by @gcusello: https://community.splunk.com/t5/Getting-Data-In/Is-there-anyway-to-generate-and-store-CSV-files-in-a-specific/td-p/465996
Hello, I set up 2 reports to run early this AM. Looks like both reports ran according to Splunk. The problem I have now is finding the actual .csv files on the Splunk server so I can scp them. Thank...
Update: I suspect this might be a Splunk-related issue, possibly due to the version I'm currently using (9.3.1). I spun up a new server for quick testing and reused the same configuration parameters from my previous setup, mainly props.conf, transforms.conf, and inputs.conf. Interestingly, everything seems to be working fine on the new server, even though the configuration is identical to the old one. The only difference I can observe is in the data ingestion flow: on the new server, I first ingested a set of JSON array entries in one format, and later ingested another set with a different structure containing more fields, and so far it all appears to be working as expected. However, replicating this exact process on the older server doesn't yield the same results, which is puzzling since both servers are using the same configuration files and setup.
Hi @hazardoom , 7k alerts aren't upgradable; in this case, the only way is to move them to the default folder. Ciao. Giuseppe
Hi @ganesanvc  Please can you provide the full search you are trying? Did you try it outside the dashboard, in the search bar? If there is a problem with the search, you should be able to see it clearly there. Thanks
How do I manually copy 7k alerts? Isn't there a faster way, with a bash script or REST or something?
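For the REST route, a hedged sketch of a bulk move, assuming the saved-search move endpoint and placeholder host, credentials, and app names (alert names containing spaces must be URL-encoded first):

#!/bin/bash
# Hypothetical sketch: move every saved search listed in alerts.txt
# into the default "search" app. Host, credentials, and app names are placeholders.
SPLUNK="https://localhost:8089"
SRC_APP="my_old_app"
while read -r ALERT; do
  curl -sk -u admin:changeme \
    "$SPLUNK/servicesNS/nobody/$SRC_APP/saved/searches/$ALERT/move" \
    -d user=nobody -d app=search
done < alerts.txt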
@livehybrid  I am getting no results (0 records) for this.
Hello @Xiangning_Mao , is it possible to add a Status column as well?
metadata is not reliable compared to tstats; with tstats I get accurate lastTime field info.
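For reference, a minimal tstats sketch of that lastTime lookup, with the index left as a wildcard placeholder:

| tstats count max(_time) as lastTime where index=* by host
``` render lastTime as a human-readable timestamp ```
| convert ctime(lastTime)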
Hi @Zhangyy  This should give you approx 25 km in each direction, as you've explained:

| where lat>=35.5 AND lat<=36.0 AND lon>=139.5 AND lon<=140.0
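If you need a true radius rather than a bounding box, here is a hedged haversine sketch, assuming a hypothetical centre point of lat 35.75, lon 139.75:

| eval lat0=35.75, lon0=139.75
| eval dlat=(lat-lat0)*pi()/180, dlon=(lon-lon0)*pi()/180
| eval a=pow(sin(dlat/2),2) + cos(lat*pi()/180)*cos(lat0*pi()/180)*pow(sin(dlon/2),2)
``` 6371 km is the mean Earth radius ```
| eval dist_km=2*6371*asin(sqrt(a))
| where dist_km<=25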
Hi @livehybrid, I'm still unable to get the fields listed, even after updating the props.conf:

[preprocess_case]
TRANSFORMS-setsourcetype = sourcetype_router, sourcetype_router2
SHOULD_LINEMERGE=false
LINE_BREAKER=(\[)|(([\r\n]+)\s*{(?=\s*"attribute":\s*{))|(\])
TRUNCATE=100000
TIME_PREFIX="ClosedDate":\s*"

[too_small]
PREFIX_SOURCETYPE = false
Hi @LearningGuy  If you want to collect the data into a "summary" index, you do not have to use the method which appends the "summaryindex" command if it doesn't do what you need it to do. Instead, just create your search as you did with the collect command (with the output mode set to HEC) and then schedule the report to run at the relevant interval. Check out "Manually configure a report to populate a summary index" in the summary indexing docs.
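A minimal sketch of that approach, assuming a hypothetical summary index named my_summary (collect's output_format=hec is the HEC-style output mode mentioned above):

index=web sourcetype=access_combined
| stats count as hits by host
| collect index=my_summary output_format=hec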
Hi @Zhangyy , try without dots and use quotes. Ciao. Giuseppe
Hi @rahulhari88 , my hint is from the Splunk Cluster Administration course; it's probably fine your way too: try it. Ciao. Giuseppe
Hi @ws  The reason you aren't getting the fields listed is that the event isn't being parsed as valid JSON. To remove the trailing "]", try the following LINE_BREAKER:

LINE_BREAKER=(\[)|(([\r\n]+)\s*{(?=\s*"attribute":\s*{))|(\])
Hi team, I have a question related to Splunk SOAR. I'm working on a new community app that will include an on-poll action. This action will ingest a large number of events into SOAR. I came across a document that mentions a few limits, including that 61k events were tested. I just wanted to check if anyone knows what configuration was used for that test? (For example, what environment or specs were in place when they tested the 61k ingestion?)
Hi @livehybrid, I tried the following method to write into the local file while keeping the file at /tmp, but it still didn't work. For my situation, I think the best scenario would be to keep a record of something like "seen before record.txt", do a comparison, write only new records into the file, and remove previously indexed entries. At least the current approach is workable, but we'll need to monitor the file size of "seen before record.txt" as it continues to grow. For now, the file size isn't a concern since it only stores a limited number of tracking records.
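For what it's worth, a hedged shell sketch of that seen-before comparison, with hypothetical file names (new_records.txt standing in for the latest pull):

#!/bin/bash
# Hypothetical sketch: write only records not seen before, then fold them
# into the seen list so the next run skips them. File names are placeholders.
SEEN=/tmp/seen_before_record.txt
NEW=/tmp/new_records.txt
touch "$SEEN"
# -F fixed strings, -x whole-line match, -v keep non-matches, -f patterns from SEEN
grep -Fxv -f "$SEEN" "$NEW" >> /tmp/to_index.txt
# deduplicate the combined seen list in place
sort -u "$SEEN" "$NEW" -o "$SEEN"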
Hi, I'm unsure what the root cause is, as I was trying to make some minor adjustments to ignore the [ ] in transforms.conf. Previously I was able to view fields like Id and Name and their values, but currently nothing shows. I tried to redo props.conf, transforms.conf and inputs.conf, adding parameter by parameter, and it still didn't work.