DMA data is stored in the same location (by default) as the index the accelerated data came from, but it is not counted in the index size, so it is not covered by index-size limits. When sizing an index, you should leave room on the storage device for DMA, or use the tstatsHomePath setting in indexes.conf to put DMA output elsewhere.
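As a minimal indexes.conf sketch of that second option (the index name, volume name, and path here are hypothetical, and tstatsHomePath must be expressed in terms of a volume definition):

```
[volume:dma_volume]
path = /opt/splunk_dma

[my_index]
homePath   = $SPLUNK_DB/my_index/db
coldPath   = $SPLUNK_DB/my_index/colddb
thawedPath = $SPLUNK_DB/my_index/thaweddb
# Send data model acceleration (DMA) summaries to a separate device,
# so they don't compete with the index's own storage budget.
tstatsHomePath = volume:dma_volume/my_index/datamodel_summary
```

This way the accelerated summaries grow on their own volume instead of filling the partition that holds the index buckets.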
Hi @cmlombardo, I'm not sure (someone else could contradict me!) but they should be the same thing. Anyway, I prefer the second solution to be more sure about the sequence of transformation application, because the execution sequence could be important. Ciao. Giuseppe
Hi @cmlombardo, good for you, see you next time! Let me know if I can help you more, or please accept an answer (even your own) for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated
I would like to better understand how transformations work, in terms of priority and data flow. Let's say I have 3 transformations in place that I want to apply to a sourcetype (e.g. "test"):

[tr1]
[tr2]
[tr3]

Can somebody explain to me what the difference is (if any) between these 2 props.conf:

----1----
[test]
TRANSFORMS-1=tr1
TRANSFORMS-2=tr2
TRANSFORMS-3=tr3

----2----
[test]
TRANSFORMS-1=tr1,tr2,tr3

Is the resulting data transformation the same in both cases? Thank you!
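For context, the three transforms referenced above would be defined in transforms.conf; the stanzas below are purely illustrative (the regexes and destination keys are invented, not from the original thread):

```
[tr1]
# Drop DEBUG events before indexing
REGEX = ^DEBUG
DEST_KEY = queue
FORMAT = nullQueue

[tr2]
# Mask password values in the raw event
REGEX = (?i)(password=)\S+
FORMAT = $1########
DEST_KEY = _raw

[tr3]
# Route everything remaining to a different index
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = secondary_index
```

Within a single comma-separated TRANSFORMS-<class> list the transforms run left to right, which is why order can matter when one transform (like tr2 here) rewrites _raw that a later one matches against.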
Ciao Giuseppe, props.conf and transforms.conf are on our HFs. Since my post, I was able to finally find out what happened. The transform names we chose were already used in other transformations elsewhere. What are the odds? Hilarious. After checking the syntax and the regexes so many times, I decided to take a look at all the transformations in place and found the culprit. Now that the new transformations have a different name, everything works as expected. Claudio
Hi Community, We have this weird situation where one of our newest Splunk installs (3 months old) went out of space; the capacity of the server was 500GB. When I checked the size of each index in the GUI, the sizes were all under the limit. The sum of all of them was under 250GB, which made sense as the maximum size of each index is set to 500GB (default). But when I calculated the size of the data models associated with the indexes, I could see that the data models had used almost 250GB. My understanding was that the data models should also be included under the index capacity, but they seemed to be exceeding the limits. Can anyone please throw some light on this topic? Regards, Pravin
Dear team, I am using the Windows machine agent to capture default metrics, but I am getting the following error on the client side when debug mode is enabled:

com.google.inject.ProvisionException: Unable to provision, see the following errors:

1) [Guice/ErrorInCustomProvider]: IllegalArgumentException: Cipher cannot be initialized
  while locating LocalSimCollectorScriptPathProvider
    at PathRawCollectorModule.configure(PathRawCollectorModule.java:25)
      \_ installed by: ServersExtensionModule -> PathRawCollectorModule
    at SimCollectorProcessBuilderFromPathProvider.<init>(SimCollectorProcessBuilderFromPathProvider.java:42)
      \_ for 1st parameter
  while locating SimCollectorProcessBuilderFromPathProvider
  while locating Optional<RawCollectorUtil$ICollectorProcessBuilder>

Learn more: https://github.com/google/guice/wiki/ERROR_IN_CUSTOM_PROVIDER

1 error

======================
Full classname legend:
======================
LocalSimCollectorScriptPathProvider: "com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider"
Optional: "com.google.common.base.Optional"
PathRawCollectorModule: "com.appdynamics.sim.agent.extensions.servers.collector.PathRawCollectorModule"
RawCollectorUtil$ICollectorProcessBuilder: "com.appdynamics.sim.agent.extensions.servers.collector.RawCollectorUtil$ICollectorProcessBuilder"
ServersExtensionModule: "com.appdynamics.sim.agent.extensions.servers.ServersExtensionModule"
SimCollectorProcessBuilderFromPathProvider: "com.appdynamics.sim.agent.extensions.servers.collector.SimCollectorProcessBuilderFromPathProvider"
========================
End of classname legend:
========================
    at com.google.inject.internal.InternalProvisionException.toProvisionException(InternalProvisionException.java:251) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1104) ~[guice-5.1.0.jar:?]
    at com.appdynamics.sim.agent.extensions.servers.model.windows.WindowsRawCollector.collectRawData(WindowsRawCollector.java:84) ~[servers-23.7.0.3689.jar:?]
    at com.appdynamics.sim.agent.extensions.servers.model.windows.WindowsRawCollector.collectRawData(WindowsRawCollector.java:49) ~[servers-23.7.0.3689.jar:?]
    at com.appdynamics.sim.agent.extensions.servers.model.Server.collectAndReport(Server.java:65) ~[servers-23.7.0.3689.jar:?]
    at com.appdynamics.sim.agent.extensions.servers.ServersDataCollector.run(ServersDataCollector.java:94) [servers-23.7.0.3689.jar:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [?:1.8.0_101]
    at java.util.concurrent.FutureTask.runAndReset(Unknown Source) [?:1.8.0_101]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(Unknown Source) [?:1.8.0_101]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source) [?:1.8.0_101]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:1.8.0_101]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [?:1.8.0_101]
    at java.lang.Thread.run(Unknown Source) [?:1.8.0_101]
Caused by: java.lang.IllegalArgumentException: Cipher cannot be initialized
    at com.appdynamics.agent.sim.encryption.MessageDigestDecryptionService.decryptInputStream(MessageDigestDecryptionService.java:92) ~[machineagent.jar:Machine Agent v23.7.0.3689 GA compatible with 4.4.1.0 Build Date 2023-07-20 08:59:11]
    at com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider.get(LocalSimCollectorScriptPathProvider.java:77) ~[?:?]
    at com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider.get(LocalSimCollectorScriptPathProvider.java:34) ~[?:?]
    at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:86) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.BoundProviderFactory.provision(BoundProviderFactory.java:72) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:60) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:59) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.SingletonScope$1.get(SingletonScope.java:169) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:45) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.SingleParameterInjector.inject(SingleParameterInjector.java:40) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.SingleParameterInjector.getAll(SingleParameterInjector.java:60) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.ConstructorInjector.provision(ConstructorInjector.java:113) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:91) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:300) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:58) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1101) ~[guice-5.1.0.jar:?]
    ... 11 more
Caused by: java.security.InvalidKeyException: Illegal key size
    at javax.crypto.Cipher.checkCryptoPerm(Cipher.java:1039) ~[?:1.8.0_71]
    at javax.crypto.Cipher.init(Cipher.java:1393) ~[?:1.8.0_71]
    at javax.crypto.Cipher.init(Cipher.java:1327) ~[?:1.8.0_71]
    at com.appdynamics.agent.sim.encryption.MessageDigestDecryptionService.decryptInputStream(MessageDigestDecryptionService.java:90) ~[machineagent.jar:Machine Agent v23.7.0.3689 GA compatible with 4.4.1.0 Build Date 2023-07-20 08:59:11]
    at com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider.get(LocalSimCollectorScriptPathProvider.java:77) ~[?:?]
    at com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider.get(LocalSimCollectorScriptPathProvider.java:34) ~[?:?]
    at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:86) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.BoundProviderFactory.provision(BoundProviderFactory.java:72) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:60) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:59) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.SingletonScope$1.get(SingletonScope.java:169) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:45) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.SingleParameterInjector.inject(SingleParameterInjector.java:40) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.SingleParameterInjector.getAll(SingleParameterInjector.java:60) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.ConstructorInjector.provision(ConstructorInjector.java:113) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:91) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:300) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:58) ~[guice-5.1.0.jar:?]
    at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1101) ~[guice-5.1.0.jar:?]
    ... 11 more

Any insights would be a great help.

Regards,
Sathish
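The root cause at the bottom of the trace is java.security.InvalidKeyException: Illegal key size, which on Java 8u101 (before 8u161, where unlimited-strength crypto became the default) usually indicates the limited-strength JCE policy files are in effect. One hedged way to check the running JRE (the class name here is made up for illustration):

```java
import javax.crypto.Cipher;

public class JceCheck {
    public static void main(String[] args) throws Exception {
        // With the limited JCE policy this returns 128 for AES;
        // with unlimited-strength policy it returns Integer.MAX_VALUE.
        int maxAes = Cipher.getMaxAllowedKeyLength("AES");
        System.out.println("Max allowed AES key length: " + maxAes);
        if (maxAes < 256) {
            System.out.println("Limited JCE policy detected - consider the "
                    + "unlimited-strength policy files or a newer JRE.");
        }
    }
}
```

If the printed value is 128, the JRE cannot initialize the agent's cipher, which would match the "Cipher cannot be initialized" failure above.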
Hi @bosseres, my hint is to create a scheduled search that aggregates logs, saving the results in a summary index. Then you can use this aggregated summary index for your searches. Ciao. Giuseppe
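As a sketch of that approach (the index names, sourcetype, and rex pattern below are assumptions, not tested against real auditd data), the scheduled search could look something like:

```
index=auditd sourcetype=auditd
| rex field=_raw "msg=audit\(\d+\.\d+:(?<record_id>\d+)\)"
| transaction record_id
| collect index=summary_auditd
```

Note that `stats ... by record_id` is usually cheaper than `transaction` when you only need aggregated field values rather than the stitched raw events.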
You are right, I can aggregate them, but the problem is that I have many searches where I need to perform the same aggregation, and it requires a lot of resources. Are there any options to aggregate them before searching? Maybe some ready-made solutions for auditd logs?
Hi @bosseres, usually logs are indexed as they are. Then you can display them aggregated as you like or detailed for each event. Could you better describe your requirement? Ciao. Giuseppe
Hello everyone! Does anybody know if it is possible to aggregate (bind) auditd events (I mean logs from audit/audit.log) into one by Record ID (Event ID)? I want to do it at the parsing stage, so that the events arrive in my index already aggregated into one.
Hello, We are investigating whether we can install the Splunk OpenTelemetry Collector for Kubernetes with Helm to collect and ingest our logs into Splunk Cloud. We would like to split the system logs from the other logs into two different indexes. Reading the documentation I saw that it is possible to indicate the index as an annotation on namespaces or pods, but in the Helm chart's values.yaml the index field is required and seems usable for only one index. In summary, we want to use two different indexes, setting one as the default and selecting the other via namespace annotations. Could you kindly show me a configuration for our problem?
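Not a verified configuration, but based on the chart's splunkPlatform.index value and the splunk.com/index annotation, a sketch could look like this (the endpoint, token, index names, and namespace are placeholders):

```yaml
# values.yaml fragment for the splunk-otel-collector Helm chart
splunkPlatform:
  endpoint: "https://<your-stack>.splunkcloud.com:8088/services/collector"
  token: "<hec-token>"
  index: "app_logs"   # default index for all logs without an annotation
```

```yaml
# Namespace whose pod logs should go to the system index instead
apiVersion: v1
kind: Namespace
metadata:
  name: kube-system
  annotations:
    splunk.com/index: "system_logs"
```

With this layout, everything lands in app_logs by default, and only the annotated namespace is routed to system_logs.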
Hi @BoldKnowsNothin, what do you mean by "reduce space"? Aliases are applied at search time, meaning there isn't any additional disk usage. About license usage: aliases and other data elaborations don't consume any additional license; the license is based only on the volume of daily indexed logs. Ciao. Giuseppe
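For reference, a search-time field alias of this kind lives in props.conf and costs nothing at index time (the sourcetype and field names below are invented for illustration):

```
[my_sourcetype]
# Applied at search time only; raw data on disk is unchanged.
FIELDALIAS-src = src_ip AS source_address
```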
Hi @AMAN0113, please check these pages:
https://docs.splunk.com/Documentation/Splunk/9.1.1/Security/limitfieldfiltering
https://community.splunk.com/t5/Splunk-Search/How-to-restrict-search-access-to-certain-hosts-or-fields-on-a/m-p/192290