All Posts



Hi @cmlombardo, good for you, see next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated
Hi @_pravin, no, Data Models are calculated separately and, as @richgalloway said, they could be in a different location and have a different retention. If your Data Models use the same space as the index, you probably put the whole _raw into the Data Model as well, and that isn't a best practice: a Data Model should contain only the fields you need for your searches, not the entire _raw. Usually the space occupied by one year of an accelerated Data Model is around the daily license consumption for that index multiplied by 3.4. Ciao. Giuseppe
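As a back-of-the-envelope illustration of that rule of thumb (the 3.4 factor is from the post above; the 10 GB/day figure is a made-up example, not a measured value):

```python
def estimate_dma_size_gb(daily_license_gb: float, factor: float = 3.4) -> float:
    """Rough one-year accelerated Data Model footprint, in GB.

    Rule of thumb: daily license consumption for the index times 3.4.
    Both inputs here are illustrative assumptions.
    """
    return round(daily_license_gb * factor, 2)

# A hypothetical index ingesting 10 GB/day:
print(estimate_dma_size_gb(10.0))  # → 34.0
```

Actual usage varies with which fields the Data Model keeps, so treat this as a starting point for capacity planning, not a guarantee.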
DMA data is stored in the same location (by default) as the index the accelerated data came from, but it is not included in the index size, so it is not covered by index size limits.  When sizing an index, one should leave room on the storage device for DMA, or use the tstatsHomePath setting in indexes.conf to put the DMA output elsewhere.
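As a sketch, a hypothetical indexes.conf that redirects DMA output to a dedicated volume via tstatsHomePath (the index name, volume name, and paths are made up; note that tstatsHomePath must be specified in terms of a volume definition):

```
# indexes.conf (hypothetical example)
[volume:dma_volume]
path = /mnt/dma_storage

[my_index]
homePath       = $SPLUNK_DB/my_index/db
coldPath       = $SPLUNK_DB/my_index/colddb
thawedPath     = $SPLUNK_DB/my_index/thaweddb
# DMA summaries go to the dedicated volume instead of next to homePath
tstatsHomePath = volume:dma_volume/my_index/datamodel_summary
```

Check the indexes.conf spec for your Splunk version before applying this, since volume-based paths interact with per-volume size limits.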
Hi @cmlombardo, I'm not sure (someone else could contradict me!) but they should be the same thing. Anyway, I prefer the second solution, to be surer about the sequence in which the transformations are applied, because the execution order could be important. Ciao. Giuseppe
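For reference, a sketch of that second form from the question, which spells the order out in a single class (the stanza and transform names are taken from the question):

```
# props.conf -- a comma-separated list runs tr1, then tr2, then tr3, in that order
[test]
TRANSFORMS-set = tr1,tr2,tr3
```

With three separate classes (TRANSFORMS-1, TRANSFORMS-2, TRANSFORMS-3), I believe the ordering falls back to the ASCII order of the class names, which happens to give the same sequence here but is easier to get wrong when the names change.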
Hi @cmlombardo, good for you, see next time! Let me know if I can help you more, or, please, accept one answer (also your own) for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated
I would like to understand better how transformations work, in terms of priority and data flow. Let's say I have 3 transformations in place that I want to apply to a sourcetype (e.g. "test"): [tr1], [tr2], [tr3]. Can somebody explain to me what the difference is (if any) between these 2 props.conf versions?

----1----
[test]
TRANSFORMS-1=tr1
TRANSFORMS-2=tr2
TRANSFORMS-3=tr3

----2----
[test]
TRANSFORMS-1=tr1,tr2,tr3

Is the resulting data transformation the same in both cases? Thank you!
Ciao Giuseppe, props.conf and transforms.conf are on our HFs. Since my post, I was finally able to find out what happened: the transform names we chose were already used in other transformations elsewhere. What are the odds? Hilarious. After checking the syntax and the regexes so many times, I decided to take a look at all the transformations in place and found the culprit. Now that the new transformations have a different name, everything works as expected. Claudio
The same search should do that.  It's a matter of how extensive the lookup file is.
Hi Community, we have this weird situation where one of our newest Splunk installs (3 months old) ran out of space; the capacity of the server was 500 GB. When I checked the size of each index in the GUI, the sizes were all under their limits. The sum of all of them was under 250 GB, which made sense, as the size of each index is set to 500 GB (the default). But when I calculated the size of the data models associated with the indexes, I could see that the data models had used almost 250 GB. My understanding was that the data models should also be included under the index capacity, but they seemed to exceed the limits. Can anyone please throw some light on this topic? Regards, Pravin
It's resolved.
I tried this option, but when I try to put "*" as a value it doesn't work. Do you know why?
Dear team, I am using the Windows machine agent to capture default metrics, but I am getting the following error on the client side with debug mode enabled:

com.google.inject.ProvisionException: Unable to provision, see the following errors:

1) [Guice/ErrorInCustomProvider]: IllegalArgumentException: Cipher cannot be initialized
  while locating LocalSimCollectorScriptPathProvider
  at PathRawCollectorModule.configure(PathRawCollectorModule.java:25)
      \_ installed by: ServersExtensionModule -> PathRawCollectorModule
  at SimCollectorProcessBuilderFromPathProvider.<init>(SimCollectorProcessBuilderFromPathProvider.java:42)
      \_ for 1st parameter
  while locating SimCollectorProcessBuilderFromPathProvider
  while locating Optional<RawCollectorUtil$ICollectorProcessBuilder>

Learn more: https://github.com/google/guice/wiki/ERROR_IN_CUSTOM_PROVIDER

1 error

====================== Full classname legend: ======================
LocalSimCollectorScriptPathProvider:        "com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider"
Optional:                                   "com.google.common.base.Optional"
PathRawCollectorModule:                     "com.appdynamics.sim.agent.extensions.servers.collector.PathRawCollectorModule"
RawCollectorUtil$ICollectorProcessBuilder:  "com.appdynamics.sim.agent.extensions.servers.collector.RawCollectorUtil$ICollectorProcessBuilder"
ServersExtensionModule:                     "com.appdynamics.sim.agent.extensions.servers.ServersExtensionModule"
SimCollectorProcessBuilderFromPathProvider: "com.appdynamics.sim.agent.extensions.servers.collector.SimCollectorProcessBuilderFromPathProvider"
======================== End of classname legend: ========================

at com.google.inject.internal.InternalProvisionException.toProvisionException(InternalProvisionException.java:251) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1104) ~[guice-5.1.0.jar:?]
at com.appdynamics.sim.agent.extensions.servers.model.windows.WindowsRawCollector.collectRawData(WindowsRawCollector.java:84) ~[servers-23.7.0.3689.jar:?]
at com.appdynamics.sim.agent.extensions.servers.model.windows.WindowsRawCollector.collectRawData(WindowsRawCollector.java:49) ~[servers-23.7.0.3689.jar:?]
at com.appdynamics.sim.agent.extensions.servers.model.Server.collectAndReport(Server.java:65) ~[servers-23.7.0.3689.jar:?]
at com.appdynamics.sim.agent.extensions.servers.ServersDataCollector.run(ServersDataCollector.java:94) [servers-23.7.0.3689.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [?:1.8.0_101]
at java.util.concurrent.FutureTask.runAndReset(Unknown Source) [?:1.8.0_101]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(Unknown Source) [?:1.8.0_101]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source) [?:1.8.0_101]
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:1.8.0_101]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [?:1.8.0_101]
at java.lang.Thread.run(Unknown Source) [?:1.8.0_101]
Caused by: java.lang.IllegalArgumentException: Cipher cannot be initialized
at com.appdynamics.agent.sim.encryption.MessageDigestDecryptionService.decryptInputStream(MessageDigestDecryptionService.java:92) ~[machineagent.jar:Machine Agent v23.7.0.3689 GA compatible with 4.4.1.0 Build Date 2023-07-20 08:59:11]
at com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider.get(LocalSimCollectorScriptPathProvider.java:77) ~[?:?]
at com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider.get(LocalSimCollectorScriptPathProvider.java:34) ~[?:?]
at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:86) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.BoundProviderFactory.provision(BoundProviderFactory.java:72) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:60) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:59) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.SingletonScope$1.get(SingletonScope.java:169) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:45) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.SingleParameterInjector.inject(SingleParameterInjector.java:40) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.SingleParameterInjector.getAll(SingleParameterInjector.java:60) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ConstructorInjector.provision(ConstructorInjector.java:113) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:91) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:300) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:58) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1101) ~[guice-5.1.0.jar:?]
... 11 more
Caused by: java.security.InvalidKeyException: Illegal key size
at javax.crypto.Cipher.checkCryptoPerm(Cipher.java:1039) ~[?:1.8.0_71]
at javax.crypto.Cipher.init(Cipher.java:1393) ~[?:1.8.0_71]
at javax.crypto.Cipher.init(Cipher.java:1327) ~[?:1.8.0_71]
at com.appdynamics.agent.sim.encryption.MessageDigestDecryptionService.decryptInputStream(MessageDigestDecryptionService.java:90) ~[machineagent.jar:Machine Agent v23.7.0.3689 GA compatible with 4.4.1.0 Build Date 2023-07-20 08:59:11]
at com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider.get(LocalSimCollectorScriptPathProvider.java:77) ~[?:?]
at com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider.get(LocalSimCollectorScriptPathProvider.java:34) ~[?:?]
at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:86) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.BoundProviderFactory.provision(BoundProviderFactory.java:72) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:60) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:59) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.SingletonScope$1.get(SingletonScope.java:169) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:45) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.SingleParameterInjector.inject(SingleParameterInjector.java:40) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.SingleParameterInjector.getAll(SingleParameterInjector.java:60) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ConstructorInjector.provision(ConstructorInjector.java:113) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:91) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:300) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:58) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1101) ~[guice-5.1.0.jar:?]
... 11 more

Any insights would be a great help.

Regards,
Sathish
I would like to know which values are missing in the events compared to the lookup, and to output those field values.
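One way to sketch such a search, with made-up names (a lookup file mylookup.csv, a shared field user, events in index=main): start from the lookup and exclude every value that does appear in the events.

```
| inputlookup mylookup.csv
| fields user
| search NOT
    [ search index=main user=*
      | dedup user
      | fields user ]
| table user
```

The subsearch returns the distinct user values found in the events, and the NOT keeps only the lookup rows whose value never occurred. If the lookup is large, subsearch result limits in limits.conf may apply.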
Hi @bosseres, my hint is to create a scheduled search that aggregates the logs, saving the results in a summary index. Then you can use this aggregated summary index for your searches. Ciao. Giuseppe
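As a sketch of that approach, with made-up names (sourcetype linux_audit, a summary index auditd_summary): a scheduled search could group the audit records by the numeric event ID inside the msg=audit(...) field and write the grouped result to the summary index.

```
index=main sourcetype=linux_audit
| rex "msg=audit\(\d+\.\d+:(?<record_id>\d+)\)"
| stats values(type) AS types, values(_raw) AS raw_events BY record_id
| collect index=auditd_summary
```

Schedule this over a short recent window (e.g. every 5 minutes over the last 5 minutes) so each audit record is summarized once; the later searches then run against auditd_summary instead of re-aggregating raw events each time.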
Hi @BoldKnowsNothin , good for you, see next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated by all the contributors
You are right, I can aggregate them, but the problem is that I have many searches where I need to make the same aggregation, and it requires a lot of resources. Are there any options to aggregate them before searching? Maybe any ready-made solutions for auditd logs?
Hi @bosseres, usually logs are indexed as they are. Then you can display them aggregated as you like, or in detail for each event. Could you better describe your requirement? Ciao. Giuseppe
Hello everyone! Does anybody know whether it is possible to aggregate (bind) auditd events (I mean logs from audit/audit.log) into one by Record ID (Event ID)? I want to do it at the parsing stage, so that the events arrive in my index already aggregated into one.
Hello, we are investigating whether we can install the Splunk OpenTelemetry Collector for Kubernetes with Helm to collect and ingest our logs into Splunk Cloud. We would like to split the system logs from the other logs into two different indexes. Reading the documentation, I saw that it is possible to indicate the index as an annotation on the namespaces or pods, but in the Helm chart's values.yaml the index field is required and seems usable for only one index. In summary, we want to use two different indexes, setting one as the default and the other via namespace annotations. Could you kindly show me a configuration for our problem?
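A sketch of one way this is commonly set up with the splunk-otel-collector chart (the endpoint, token, and index names below are placeholders, not real values; please check the chart documentation for your version): set the default index in values.yaml and override it per namespace with an annotation.

```yaml
# values.yaml (fragment, hypothetical values)
splunkPlatform:
  endpoint: "https://http-inputs-<your-stack>.splunkcloud.com:443/services/collector"
  token: "<hec-token>"
  index: "app_logs"    # default index for all collected logs
logsEngine: otel
```

Then, for the namespaces whose logs should land in the other index, annotate them, e.g. `kubectl annotate namespace kube-system splunk.com/index=system_logs`. The annotation should take precedence over the chart-level default for pods in that namespace.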
I'm also getting the same error. Do you have any solutions yet?