
All Posts

Wait a second. You did both on the same host? rpm and deb?
Hi @SplunkySplunk  All three are big concepts, and it looks like you have already done some studying of the Splunk docs (if not, the links are below). Maybe state your requirements more clearly, so there will be better answers. Thanks.    https://docs.splunk.com/Documentation/Splunk/9.1.2/Knowledge/Usesummaryindexing https://docs.splunk.com/Documentation/Splunk/9.1.2/Knowledge/Aboutdatamodels https://docs.splunk.com/Documentation/SplunkCloud/9.1.2308/Knowledge/Manageacceleratedsearchsummaries
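For a rough feel for the summary-index option: you schedule a search that pre-aggregates the heavy data into a separate index, then point your dashboards at that index. A minimal sketch (the index names web and my_summary are placeholders, not from your environment):

index=web sourcetype=access_combined earliest=-1h@h latest=@h
| stats count AS hits BY status
| collect index=my_summary

Dashboards then search index=my_summary instead of re-running the expensive aggregation every time.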
Hello. I'm using Splunk Cloud and thinking about adding a summary index or a data model. I'm trying to understand the difference between the three options: summary index, report acceleration, and data model. Can someone please explain the main purpose of each? Is using a summary index the best way to avoid performance issues with heavy searches? How does a summary index work? Should I create a new index and run my dashboards on that index? Thanks
I have updated the universal forwarder with both the RPM and deb packages, using the following commands: rpm -Uvh and dpkg -i
Sure. But how do you define "assets"? How do you differentiate between them? Because while you can use the general approach of combining two separate searches (either by means of append or multisearch) with an additional field to classify your results into one of two sets, there might be more effective ways in specific cases.
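To illustrate that general approach, a minimal multisearch sketch (index names borrowed from the examples elsewhere in this thread; adjust to your data):

| multisearch
    [ search index=dhcp_source_index | eval source="dhcp" ]
    [ search index=sysmon_index | eval source="sysmon" ]
| stats values(source) AS sources BY host
| where mvcount(sources)=1 AND sources="dhcp"

Both subsearches are streaming (a plain search plus eval), which is what multisearch requires.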
How did you install and upgrade your forwarder? RPM? deb? tgz?
Hello, I noticed that in versions 9.1 and above, the user and group were changed to "splunkfwd". I have updated the universal forwarder to the newer version (9.1), but the user and group did not change to "splunkfwd." Subsequently, we encountered several permission-related problems, such as the Universal Forwarder lacking permission to read auditd logs, which makes it necessary to modify the "log_group" parameter in the auditd.conf file. Should I change it manually, or is there an alternative solution that resolves all the permission problems?
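For reference, the auditd.conf change mentioned above would look roughly like this (a sketch, assuming the forwarder now runs under the splunkfwd group; auditd needs a restart afterwards):

# /etc/audit/auditd.conf
# let members of the splunkfwd group read the audit log files
log_group = splunkfwd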
Hi @Drewprice , good for you, see you next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated by all the contributors
Hi @kyokei, this is another question, even if it is on the same data. Anyway, if you want to discard a part of your events at index time, you have to use a SEDCMD in your props.conf, for example to drop the second comma-separated field:

[your_sourcetype]
SEDCMD-drop_second_field = s/^([^,]*,)([^,]*,)(.*)/\1\3/

Ciao. Giuseppe
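(If you want to test the expression at search time before touching props.conf, the same sed syntax works with rex in sed mode; a quick sketch:

| makeresults
| eval _raw="23-11-26 01:20:51.500 AM,+0.000000000E+00,+2.90500E+00,0"
| rex mode=sed "s/^([^,]*,)([^,]*,)(.*)/\1\3/"

This only rewrites the search results, so it is safe for experimenting.)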
makeresults can be used to generate search results. An example from the documentation.

| makeresults
| eval test="buttercup rarity tenderhoof dash mcintosh fleetfoot mistmane"
| makemv delim=" " test
| mvexpand test

_time                 test
2024-01-01 00:00:00   buttercup
2024-01-01 00:00:00   rarity
2024-01-01 00:00:00   tenderhoof
2024-01-01 00:00:00   dash
2024-01-01 00:00:00   mcintosh
2024-01-01 00:00:00   fleetfoot
2024-01-01 00:00:00   mistmane

Then you can use `search` or any other commands as usual.

| makeresults
| eval test="buttercup rarity tenderhoof dash mcintosh fleetfoot mistmane"
| makemv delim=" " test
| mvexpand test
| search test="m*"

_time                 test
2024-01-01 00:00:00   mcintosh
2024-01-01 00:00:00   mistmane
@splunkcol  the plugin update worked; your reply helped, my friend. I wish you the best.
Hi, forgive my English, I'm using a translator. About the problem: it is better that you send a ticket to Splunk directly, as it seems to be a bug. In the end I gave up, because whenever they ask about the version of Splunk they always make the excuse that they do not support old versions. In this case, although it is true that I was using an old version of Splunk, the problem was clearly the add-on.
Hi @splunkcol, what was your fix? We have the exact same issue. It's a mandatory field as soon as we update the secret key, and it doesn't allow saving.
Hi @gcusello , With your help I am able to extract the timestamp, _time = 23-11-26 01:20:51.500 AM, but _time is the same for each event.

23-11-26 01:20:51.500 AM,+0.000000000E+00,+2.90500E+00,0
23-11-26 01:20:51.500 AM,+1.000000000E-01,+1.45180E+01,0
23-11-26 01:20:51.500 AM,+2.000000000E-01,+7.93600E+00,0
23-11-26 01:20:51.500 AM,+3.000000000E-01,+3.60100E+00,0
23-11-26 01:20:51.500 AM,+4.000000000E-01,+3.19100E+00,0
23-11-26 01:20:51.500 AM,+5.000000000E-01,+3.17300E+00,0

How can I achieve the format below during data ingest?

23-11-26 01:20:51.500 AM, +2.90500E+00,0
23-11-26 01:20:51.600 AM, +1.45180E+01,0
23-11-26 01:20:51.700 AM, +7.93600E+00,0
23-11-26 01:20:51.800 AM, +3.60100E+00,0
23-11-26 01:20:51.900 AM, +3.19100E+00,0
23-11-26 01:20:52.000 AM, +3.17300E+00,0

Basically, add the duration in the "Time" column to the trigger time to create _time.

"Time","U1-2[]","Event"
+0.000000000E+00,+2.90500E+00,0
+1.000000000E-01,+1.45180E+01,0
+2.000000000E-01,+7.93600E+00,0
+3.000000000E-01,+3.60100E+00,0
+4.000000000E-01,+3.19100E+00,0

Thanks a lot for your help.
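The arithmetic itself can be sanity-checked at search time; a sketch only (trigger_time and duration are hypothetical field names for the header timestamp and the "Time" offset column, assuming both are already extracted):

| eval base=strptime(trigger_time, "%y-%m-%d %I:%M:%S.%3N %p")
| eval _time=base + tonumber(duration)

Doing the same at index time would need a different mechanism, since _time is normally assigned during parsing.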
@PickleRick Thanks for the reply. Please ignore both searches. What I want is to pull out the total unique assets in the DHCP source, then compare to the total unique assets in the SysMon source and output the assets that do not have SysMon present. Thanks in advance.
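For that specific goal, a single search over both indexes may be enough, without append or multisearch; a sketch using the index names from your examples:

index=dhcp_source_index OR index=sysmon_index
| stats values(index) AS idx BY host
| where mvcount(idx)=1 AND idx="dhcp_source_index"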
Thank you so much! I was going down that track but could not put it together.
If you look into the outputs.conf spec, you'll see that it supports both SQS output and RFS output, which should be able to write into S3 buckets. I've never used them myself, though, so I have no idea how they work and whether they require a HF or will work with a UF as well (I suspect the former).
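For orientation only, an RFS destination stanza in outputs.conf looks roughly like this (a sketch based on the spec; my_s3_dest and the bucket path are placeholders, and the exact settings vary by version, so check outputs.conf.spec before relying on it):

[rfs:my_s3_dest]
path = s3://my-bucket/splunk-output
# authentication and endpoint settings go here, per the outputs.conf.spec for your version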
It's not clear how you distinguish your "sources". Your first search simply pulls data from two separate indexes, while the second one does something completely strange. Please describe what constitutes the sets of sources you want to calculate the difference between.
One other question: is it normal to only have the default data models visible (and not all my data models) from the CM (Settings/DataModels)? My DMs are OK... sorry for that
Hello Community, I have a challenge finding and isolating the unique hosts out of two sources (DHCP and SysMon in my case). I did try the following, but it did not work as expected:

EXAMPLE 1:

index=dhcp_source_index
| stats count by host
| eval source="dhcp"
| append [ search index=sysmon_index | stats count by host | eval source="sysmon" ]
| stats values(source) as sources by host
| where mvcount(sources)=1 AND sources="dhcp"

EXAMPLE 2:

index=my_index
| dedup host, source
| stats list(source) as sources by host
| append [ search index=my_index | stats latest(_time) as last_seen by host ]
| eventstats max(last_seen) as last_seen by host
| where mvcount(sources)=1
| table host, last_seen

The numbers from the manual findings and the above SPLs differ. Thanks in advance