All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Make sure you've configured SSL correctly on the appropriate Splunk server. Use this link as a reference: https://docs.splunk.com/Documentation/Splunk/9.1.1/Security/ConfigureSplunkforwardingtousesignedcertificates
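As a rough sketch of what the forwarder side of that configuration can look like (every path, host name, group name, and password below is a placeholder; the linked documentation is authoritative):

```ini
# outputs.conf on the forwarder -- hypothetical values throughout
[tcpout:primary_indexers]
server = idx1.example.com:9997
# client certificate presented to the indexer
clientCert = $SPLUNK_HOME/etc/auth/mycerts/myClientCert.pem
sslPassword = <private key password>
# verify the indexer's certificate against the configured CA
sslVerifyServerCert = true

# server.conf -- CA used to validate the indexer's certificate
[sslConfig]
sslRootCAPath = $SPLUNK_HOME/etc/auth/mycerts/myCACert.pem
```

After a restart, splunkd.log on both sides is the usual place to confirm the TLS handshake succeeds.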
Hi @gcusello,   Our data models don't use the same space as the index, so the accelerated data isn't capped by the index limit. I really liked your extended answer, but could you please explain the line below in quotes? I find it a bit confusing. "Usually the space occupied by one year of an accelerated Data Model is around the daily license consumption for that index multiplied by 3.4."   Regards, Pravin
As I understand it, transforms listed separately are processed in lexicographical order, whereas those listed in a single TRANSFORMS setting are processed in the order given. In other words, this:

[test]
TRANSFORMS-3=tr3
TRANSFORMS-1=tr1
TRANSFORMS-2=tr2

is the same as this:

[test]
TRANSFORMS-1=tr1
TRANSFORMS-2=tr2
TRANSFORMS-3=tr3

But this:

TRANSFORMS-1=tr3,tr1,tr2

will process the transforms in the listed order.
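To make the thread's example concrete, a minimal transforms.conf to pair with those props.conf stanzas might look like this (the regexes and stanza bodies are invented for illustration; only the stanza names tr1–tr3 come from the thread):

```ini
# transforms.conf -- three hypothetical index-time transforms for sourcetype "test"

# tr1: drop DEBUG lines by routing them to the null queue
[tr1]
REGEX = ^DEBUG
DEST_KEY = queue
FORMAT = nullQueue

# tr2: mask a session id in the raw event
[tr2]
REGEX = (?m)^(.*)SessionId=\w+
FORMAT = $1SessionId=########
DEST_KEY = _raw

# tr3: route everything that survives to a specific index
[tr3]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = filtered_index
```

Because tr1 can send an event to the null queue before tr2 and tr3 ever run, the execution order genuinely changes the outcome, which is why the single comma-separated TRANSFORMS list is the safer way to make the order explicit.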
Hello,

I hope everything is okay. I need your help.

I am using this SPL request:

index="bloc1rg" AND libelle IN (IN_PREC, OUT_PREC, IN_BT, OUT_BT, IN_RANG, OUT_RANG) earliest=-1mon@mon latest=-1d@d
| append [search index="bloc1rg" AND libelle IN (IN_PREC, OUT_PREC, IN_BT, OUT_BT, IN_RANG, OUT_RANG) earliest=-1mon@mon latest=-1d@d
  | chart count over id_flux by libelle
  | eval IN_BT_OUT_BT=IN_BT+OUT_BT
  | eval IN_PREC_OUT_PREC=IN_PREC+OUT_PREC
  | eval IN_RANG_OUT_RANG=IN_RANG+OUT_RANG
  | where IN_BT_OUT_BT>=2
  | where IN_PREC_OUT_PREC>=2
  | where IN_RANG_OUT_RANG>=2
  | transpose
  | search column=id_flux
  | transpose
  | fields - "column"
  | rename "row 1" as id_flux]
| stats last(_time) as last_time by id_flux libelle

I have these results, but I can't get what I want. Let me explain. For a given id_flux, I'd like to have the response time defined as follows:
- out_rang time - in_rang time
- out_prec time - in_prec time
- out_bt time - in_bt time

Here is the complete query I used:

search index="bloc1rg" AND libelle IN (IN_PREC, OUT_PREC, IN_BT, OUT_BT, IN_RANG, OUT_RANG) earliest=-1mon@mon latest=-1d@d
| chart count over id_flux by libelle
| eval IN_BT_OUT_BT=IN_BT+OUT_BT
| eval IN_PREC_OUT_PREC=IN_PREC+OUT_PREC
| eval IN_RANG_OUT_RANG=IN_RANG+OUT_RANG
| where IN_BT_OUT_BT>=2
| where IN_PREC_OUT_PREC>=2
| where IN_RANG_OUT_RANG>=2
| transpose
| search column=id_flux
| transpose
| fields - "column"
| rename "row 1" as id_flux]
| eval sortorder=case(libelle=="IN_PREC",1, libelle=="OUT_PREC" AND statut=="KO",2, libelle=="OUT_PREC" AND statut=="OK",3, libelle=="IN_BT",4, libelle=="OUT_BT",5, libelle=="IN_RANG",6, libelle=="OUT_RANG" AND statut=="KO",7, libelle=="OUT_RANG" AND statut=="OK",8)
| sort 0 sortorder
| eval libelle=if(sortorder=2,"ARE", if(sortorder=3,"AEE", if(sortorder=7,"BAN", if(sortorder=8,"CCO", libelle))))
| table libelle sortorder _time
| chart avg(_time) over sortorder by libelle
| filldown AEE, ARE, IN_BT, IN_PREC, OUT_BT, IN_RANG, OUT_RANG
| eval OK=abs(OUT_BT-IN_BT)/1000
| eval AEE=abs(AEE-IN_PREC)/1000
| eval ARE=abs(ARE-IN_PREC)/1000
| eval CCO=abs(CCO-IN_RANG)
| eval BAN=abs(BAN-IN_RANG)
| fields - sortorder
| stats values(*) as *
| table AEE ARE BAN CCO OK
| transpose
| rename "row 1" as "temps de traitement (s)"
| rename column as "statut"
Hi @cmlombardo, good for you, see next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated
Hi @_pravin, no, Data Models are calculated separately and, as @richgalloway said, they can be in a different location and have a different retention. If your Data Models use the same space as the index, you probably included _raw in the Data Model, which isn't a best practice: a Data Model should contain only the fields you need for your searches, not the whole _raw. Usually the space occupied by one year of an accelerated Data Model is around the daily license consumption for that index multiplied by 3.4. Ciao. Giuseppe
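For a concrete reading of that rule of thumb (the 10 GB figure is invented for illustration):

```
daily license consumption for the index : 10 GB/day   (assumed)
one year of accelerated Data Model      : 10 GB x 3.4 = ~34 GB
```

So the estimated acceleration footprint scales with the index's daily ingest, not with the total size of the index on disk.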
DMA data is stored in the same location (by default) as the index the accelerated data came from, but it is not included in the index size, so it is not covered by index size limits.  When sizing an index, one should leave room on the storage device for DMA or use the tstatsHomePath setting in indexes.conf to put the DMA output elsewhere.
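A minimal sketch of that relocation, assuming a hypothetical index name and mount point (per the indexes.conf spec, tstatsHomePath must be defined in terms of a volume):

```ini
# indexes.conf -- hypothetical index and volume names
[volume:dma_volume]
path = /mnt/dma_storage

[my_index]
homePath       = $SPLUNK_DB/my_index/db
coldPath       = $SPLUNK_DB/my_index/colddb
thawedPath     = $SPLUNK_DB/my_index/thaweddb
# put data model acceleration summaries on the separate volume
tstatsHomePath = volume:dma_volume/my_index/datamodel_summary
```

With this in place, the 500 GB budgeted for buckets is no longer shared with the acceleration summaries.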
Hi @cmlombardo, I'm not sure (someone else could contradict me!), but they should be the same thing. Anyway, I prefer the second solution, to be more certain about the sequence in which the transforms are applied, because the execution order can be important. Ciao. Giuseppe
Hi @cmlombardo, good for you, see next time! Let me know if I can help you more, or, please, accept one answer (also your own) for the other people of the Community. Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated
I would like to understand better how transforms work, in terms of priority and data flow. Let's say I have 3 transforms in place that I want to apply to a sourcetype (e.g. "test"):

[tr1]
[tr2]
[tr3]

Can somebody explain to me what the difference is (if any) between these 2 props.conf:

----1----
[test]
TRANSFORMS-1=tr1
TRANSFORMS-2=tr2
TRANSFORMS-3=tr3

----2----
[test]
TRANSFORMS-1=tr1,tr2,tr3

Is the resulting data transformation the same in both cases? Thank you!
Ciao Giuseppe, props.conf and transforms.conf are on our HFs. Since my post, I was finally able to find out what happened: the transform names we chose were already used by other transforms elsewhere. What are the odds? Hilarious. After checking the syntax and the regexes so many times, I decided to take a look at all the transforms in place and found the culprit. Now that the new transforms have a different name, everything works as expected. Claudio
The same search should do that.  It's a matter of how extensive the lookup file is.
Hi Community,   We have this weird situation where one of our newest Splunk installs (3 months old) ran out of space; the capacity of the server was 500 GB. When I checked the size of each index in the GUI, the sizes were all under the limit. The sum of all of them was under 250 GB, which made sense, as the size of each index is set to 500 GB (the default). But when I calculated the size of the data models associated with the indexes, I could see that the data models had used almost 250 GB.  My understanding was that the data models should also be counted under the index capacity, but they seem to exceed the limits.   Can anyone please throw some light on this topic?   Regards, Pravin
It's resolved.
I tried this option, but when I try to put "*" as a value it doesn't work. Do you know why?
Dear team, I am using the Windows machine agent to capture default metrics, but I am getting the following error on the client side when debug mode is enabled:

com.google.inject.ProvisionException: Unable to provision, see the following errors:

1) [Guice/ErrorInCustomProvider]: IllegalArgumentException: Cipher cannot be initialized
  while locating LocalSimCollectorScriptPathProvider
  at PathRawCollectorModule.configure(PathRawCollectorModule.java:25)
      \_ installed by: ServersExtensionModule -> PathRawCollectorModule
  at SimCollectorProcessBuilderFromPathProvider.<init>(SimCollectorProcessBuilderFromPathProvider.java:42)
      \_ for 1st parameter
  while locating SimCollectorProcessBuilderFromPathProvider
  while locating Optional<RawCollectorUtil$ICollectorProcessBuilder>

Learn more:
  https://github.com/google/guice/wiki/ERROR_IN_CUSTOM_PROVIDER

1 error

======================
Full classname legend:
======================
LocalSimCollectorScriptPathProvider:        "com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider"
Optional:                                   "com.google.common.base.Optional"
PathRawCollectorModule:                     "com.appdynamics.sim.agent.extensions.servers.collector.PathRawCollectorModule"
RawCollectorUtil$ICollectorProcessBuilder:  "com.appdynamics.sim.agent.extensions.servers.collector.RawCollectorUtil$ICollectorProcessBuilder"
ServersExtensionModule:                     "com.appdynamics.sim.agent.extensions.servers.ServersExtensionModule"
SimCollectorProcessBuilderFromPathProvider: "com.appdynamics.sim.agent.extensions.servers.collector.SimCollectorProcessBuilderFromPathProvider"
========================
End of classname legend:
========================

at com.google.inject.internal.InternalProvisionException.toProvisionException(InternalProvisionException.java:251) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1104) ~[guice-5.1.0.jar:?]
at com.appdynamics.sim.agent.extensions.servers.model.windows.WindowsRawCollector.collectRawData(WindowsRawCollector.java:84) ~[servers-23.7.0.3689.jar:?]
at com.appdynamics.sim.agent.extensions.servers.model.windows.WindowsRawCollector.collectRawData(WindowsRawCollector.java:49) ~[servers-23.7.0.3689.jar:?]
at com.appdynamics.sim.agent.extensions.servers.model.Server.collectAndReport(Server.java:65) ~[servers-23.7.0.3689.jar:?]
at com.appdynamics.sim.agent.extensions.servers.ServersDataCollector.run(ServersDataCollector.java:94) [servers-23.7.0.3689.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [?:1.8.0_101]
at java.util.concurrent.FutureTask.runAndReset(Unknown Source) [?:1.8.0_101]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(Unknown Source) [?:1.8.0_101]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source) [?:1.8.0_101]
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:1.8.0_101]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [?:1.8.0_101]
at java.lang.Thread.run(Unknown Source) [?:1.8.0_101]
Caused by: java.lang.IllegalArgumentException: Cipher cannot be initialized
at com.appdynamics.agent.sim.encryption.MessageDigestDecryptionService.decryptInputStream(MessageDigestDecryptionService.java:92) ~[machineagent.jar:Machine Agent v23.7.0.3689 GA compatible with 4.4.1.0 Build Date 2023-07-20 08:59:11]
at com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider.get(LocalSimCollectorScriptPathProvider.java:77) ~[?:?]
at com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider.get(LocalSimCollectorScriptPathProvider.java:34) ~[?:?]
at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:86) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.BoundProviderFactory.provision(BoundProviderFactory.java:72) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:60) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:59) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.SingletonScope$1.get(SingletonScope.java:169) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:45) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.SingleParameterInjector.inject(SingleParameterInjector.java:40) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.SingleParameterInjector.getAll(SingleParameterInjector.java:60) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ConstructorInjector.provision(ConstructorInjector.java:113) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:91) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:300) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:58) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1101) ~[guice-5.1.0.jar:?]
... 11 more
Caused by: java.security.InvalidKeyException: Illegal key size
at javax.crypto.Cipher.checkCryptoPerm(Cipher.java:1039) ~[?:1.8.0_71]
at javax.crypto.Cipher.init(Cipher.java:1393) ~[?:1.8.0_71]
at javax.crypto.Cipher.init(Cipher.java:1327) ~[?:1.8.0_71]
at com.appdynamics.agent.sim.encryption.MessageDigestDecryptionService.decryptInputStream(MessageDigestDecryptionService.java:90) ~[machineagent.jar:Machine Agent v23.7.0.3689 GA compatible with 4.4.1.0 Build Date 2023-07-20 08:59:11]
at com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider.get(LocalSimCollectorScriptPathProvider.java:77) ~[?:?]
at com.appdynamics.sim.agent.extensions.servers.collector.LocalSimCollectorScriptPathProvider.get(LocalSimCollectorScriptPathProvider.java:34) ~[?:?]
at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:86) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.BoundProviderFactory.provision(BoundProviderFactory.java:72) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:60) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:59) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.SingletonScope$1.get(SingletonScope.java:169) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:45) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.SingleParameterInjector.inject(SingleParameterInjector.java:40) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.SingleParameterInjector.getAll(SingleParameterInjector.java:60) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ConstructorInjector.provision(ConstructorInjector.java:113) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:91) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:300) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:58) ~[guice-5.1.0.jar:?]
at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1101) ~[guice-5.1.0.jar:?]
... 11 more

Any insights would be a great help.

Regards, Sathish
I would like to know which values are missing in the events compared to the lookup and output those field-values
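One common shape for that kind of comparison, with made-up lookup and field names: read the lookup, then drop every value that does occur in the events, leaving only the missing ones.

```
| inputlookup expected_values.csv
| search NOT
    [ search index=my_index
      | stats count by expected_field
      | fields expected_field ]
```

The subsearch returns the distinct values seen in the events, so the outer search keeps only the lookup rows whose value never appeared. This assumes the field is named the same in the lookup and in the events; if not, a `rename` in the subsearch aligns them.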
Hi @bosseres, my hint is to create a scheduled search that aggregates the logs, saving the results in a summary index. Then you can use this aggregated summary index for your searches. Ciao. Giuseppe
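A minimal sketch of that approach, with invented index and field names: the scheduled search pre-aggregates the raw events and writes the result to a summary index with `collect`.

```
index=auditd sourcetype=linux_audit
| stats count by host, type
| collect index=summary_auditd
```

The many downstream searches then read `index=summary_auditd` instead of re-aggregating the raw events each time. This assumes a summary index named summary_auditd has already been created; Splunk's summary indexing (`si-` commands) is the related built-in alternative.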
Hi @BoldKnowsNothin , good for you, see next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated by all the contributors
You are right, I can aggregate them, but the problem is that I have many searches where I need to make the same aggregation, and it requires a lot of resources. Are there any options to aggregate them before searching? Maybe some ready-made solutions for auditd logs?