All Posts


Hi @AL3Z, could you better describe what you would do? If you have already indexed a log, you cannot remove an event or a part of it. If you want to exclude some null values from a search, you can do it in the search. So what's your requirement? Ciao. Giuseppe
Hi @BoldKnowsNothin, did you try field aliases (https://docs.splunk.com/Documentation/Splunk/9.1.1/Knowledge/Addaliasestofields)? Ciao. Giuseppe
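As a sketch of the field-alias suggestion above (the sourcetype and field names are placeholders, not from this thread), the alias goes in props.conf:

```
# props.conf -- sketch; [my_sourcetype] and the field names are hypothetical
[my_sourcetype]
# make the value of src_ip also available under the name src
FIELDALIAS-shorten = src_ip AS src
```

Field aliases are applied at search time, so no re-indexing is needed; both the original and the aliased name are searchable.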
Hi @AL3Z, if you don't have results from the check search and you have all the other logs, your issue is solved. Ciao. Giuseppe
Hi @Navanitha, please publish the solution when you solve it, for the other people in the Community. Ciao. Giuseppe P.S.: Karma Points are appreciated
Hi @AL3Z, modify the regex in Search and see if the new regex matches all the events to filter. Ciao. Giuseppe
Hi @Navanitha, I see that the TIME_FORMAT is different. Are these logs coming from the same source? Maybe you have to apply different sourcetypes and different timestamp formats. Ciao. Giuseppe
Some of the event logs in Splunk are getting truncated at the beginning. I tried some props to break before the date, and line-breaking at newline, but nothing seems to be working.

Truncated events:
9/29/23 5:40:46.000 AM entFacing:1x.1xx.1xx.2xx/4565 to inside:1x.9x.x4x.x4x/43 duration 0:00:00 bytes 0
9/29/23 5:40:36.000 AM 53 (1x.x8.2xx.2xx/34)
9/29/23 5:37:21.000 AM bytes 1275

Well-parsed events:
2023-09-29T05:57:57-04:00 1x.xx.2.1xx %ASA-6-302014: Teardown TCP connection 758830654 for ARCC:1xx.x7.9x.1x/xx to inside:1x.2xx.6x.x1/xx17 duration 0:00:00 bytes 0 Failover primary closed
2023-09-29T05:57:57-04:00 1x.xx.2.1xx %ASA-6-302021: Teardown ICMP connection for faddr 1x0.x5.0.1x/0 gaddr 1x.2x6.1xx6.x6/0 laddr 1x.xx6.1xx.x6/0 type 3 code 1

My props:
TZ = UTC
SHOULD_LINEMERGE = false
NO_BINARY_CHECK = true
CHARSET = UTF-8
disabled = false
TIME_FORMAT = %Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 32
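One hedged possibility for the truncation above, assuming the well-parsed lines (ISO timestamp first) mark the true event boundary: anchor the line breaker on that timestamp so stray newlines inside an event cannot split it. This is a sketch, not a verified fix, and the stanza name is a placeholder:

```
# props.conf -- sketch; [cisco_asa_custom] is a hypothetical sourcetype
[cisco_asa_custom]
SHOULD_LINEMERGE = false
# break only where a new ISO 8601 timestamp begins
LINE_BREAKER = ([\r\n]+)(?=\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%dT%H:%M:%S%z
MAX_TIMESTAMP_LOOKAHEAD = 32
```

Note that TZ = UTC in the original props conflicts with the -04:00 offset visible in the events; using %z in TIME_FORMAT lets each event's own offset take precedence.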
Hello comrades, I'm just curious: is there any way to shorten frequent words? For example, in <Data Name='IpAddress'>::ffff:10.95.81.99</Data>, shorten IpAddress to ipaddr or something like IPa. Many thanks,
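If the goal is a shorter field name (rather than rewriting the raw event), a search-time alias is the lighter option; changing the raw XML itself would need an index-time SEDCMD. Both lines below are sketches with a placeholder sourcetype name:

```
# props.conf -- sketch; the sourcetype name is hypothetical
[xmlwineventlog_sample]
# search-time: lets you query ipaddr instead of IpAddress
FIELDALIAS-short = IpAddress AS ipaddr
# index-time alternative: rewrite the attribute in the raw event before indexing
SEDCMD-shorten = s/Name='IpAddress'/Name='ipaddr'/g
```

The SEDCMD route permanently alters the stored data, so the alias is usually the safer choice.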
Hi @gcusello, I'm trying to blacklist the paths below:

C:\Program Files\Rapid7\Insight Agent\components\insight_agent\3.2.5.31\ir_agent.exe
C:\Program Files\WindowsPowerShell\Modules\gytpol\Client\fw4_6_2\GytpolClientFW4_6_2.exe

Can we use .* in place of the version, so that if it gets a new version it is also blacklisted?

Rapid7\\Insight Agent\\components\\insight_agent\\.*\\ir_agent.exe)|WindowsPowerShell\\Modules\\gytpol\\Client\\fw.*\\GytpolClientFW.*.exe)

Thanks
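Yes, a wildcarded version segment works in a blacklist regex. The attempt quoted above has unbalanced parentheses, though; a cleaned-up sketch follows (the stanza name and the EventCode filter are assumptions, not taken from the post):

```
# inputs.conf -- sketch; stanza and EventCode are hypothetical
[WinEventLog://Security]
blacklist1 = EventCode="4688" Message="(?:Rapid7\\Insight Agent\\components\\insight_agent\\[\d.]+\\ir_agent\.exe|WindowsPowerShell\\Modules\\gytpol\\Client\\fw[\d_]+\\GytpolClientFW.*\.exe)"
```

Using [\d.]+ instead of .* keeps the wildcard from matching across unrelated path segments, and the literal dots in .exe are escaped.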
I tried installing the add-on on the HF but no luck. I am working with Splunk support on this, and they figured that the KV store for the Checkpoint add-on is not loading because the regex is not matching our events. They are working on giving me a regex; I will try it out once I have it.
Hello, I was trying to explore all the null values in my index, but it is not working as expected. Do we need any changes in the search?

index=vpn earliest=-7d
| fieldsummary
| where match(values, "^\[{\"value\":\"null\",\"count\":\d+\}\]$")

Thanks
Hi @gcusello, yes, I've modified the inputs.conf in the Add-On (located in $SPLUNK_HOME/etc/deployment-apps) that is deployed using the Deployment Server. When I try this on the search head it is not giving any results. Do we need to modify the SPL?

index=winsec host=xxx | regex "(?:New Process Name:).+(?:SplunkUniversalForwarder\\bin\\splunk.exe)"

Thanks
Hi there, I've run into an issue where I can sort of guess why I'm having problems, though I have no clear idea how to solve it. In our distributed environment we have a "lookup app" on our deployer: TA_lookups/lookups/lookupfile.csv. Recently a coworker added a few new lookup files and made additions to the file in question.

This is where the problem manifests. Logging onto the deployer and checking that the correct files are present in /opt/splunk/etc/shcluster/apps/TA_lookups/lookups/lookupfile.csv, everything looks great. Applying the bundle worked without any complaints/errors. All the new csv files show up in the cluster and are accessible from the GUI. However, this one file, "lookupfile.csv", is not updated.

So I can sort of guess that it may have something to do with the file being in use or something, though I am stumped as to how I should go about solving this. I've tried making some additional changes to the file and checked for any weird line breaking, and nothing. I can see from the CLI that this one file has not been modified since the initial deployment: the deployer applies the bundle, there are no complaints on either end that I can find, it just skips this one pre-existing csv file completely and, as far as I can see, silently.

What do I do here? Is there a way to "force" the push? Is the only way to solve this to manually remove the app from the SH cluster and push again? All suggestions are welcome. Best regards
Hi @AL3Z, I suppose that you modified the inputs.conf in the Add-On (located in $SPLUNK_HOME/etc/deployment-apps) that is deployed using the Deployment Server, is that correct? To be more sure, check whether the regex you used is correct with a search in the dashboard. Ciao. Giuseppe
Hi Team, we are currently using Splunk version 7.2. It was installed by a third party, and we currently don't have the login credentials that were used to download Splunk. If I download the latest version with the free trial and update the Splunk version, will it update the existing license, or do we have to download with the same login as before to get the license? Thanks and Regards, Shalini S
Hi, I had blacklisted C:\\Program Files\\SplunkUniversalForwarder\\bin\\splunk.exe in the inputs.conf on the Deployment Server:

blacklist3 = EvenCode="4688" Message="(?:New Process Name:).+(?:SplunkUniversalForwarder\\bin\\splunk.exe)

Still I can see the logs being ingested into Splunk. How can we stop this ingestion?
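Two things worth checking in the stanza above: the key is spelled EvenCode, and the Message value never closes its quote, so the filter may never match. A corrected sketch, assuming the standard EventCode key was intended:

```
# inputs.conf -- sketch; assumes the standard EventCode key was intended
blacklist3 = EventCode="4688" Message="(?:New Process Name:).+(?:SplunkUniversalForwarder\\bin\\splunk\.exe)"
```

WinEventLog blacklists are applied by the forwarder itself, so the app also has to actually reach the Universal Forwarders and the forwarders have to restart before the filter takes effect.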
Hi @cmlombardo, where did you put these props and transforms files? They must be located on the Indexers or, if present, on the first Heavy Forwarder they pass through, not on the Universal Forwarder. Ciao. Giuseppe
Hi @Choi_Hyun, the UniversalForwarder App is an internal Splunk App and usually isn't used to add configurations; how do you have an inputs.conf in this App? Anyway, I'm not sure that it's possible to manage this App using a Deployment Server, but if you have the inputs.conf file in local you could try to deploy the App with an inputs.conf that has this stanza disabled. Otherwise, the only solution is a remote shell script that removes this file (not the App!) and restarts Splunk. I'm very confident about this last solution. If none of the above works, open a case with Splunk Support. Ciao. Giuseppe
Hello, in K8S, on a pod running a Spring Boot 3.x application (with OpenJDK 17) auto-instrumented by the cluster-agent, the Java Agent fails on startup:

[AD Agent init] Wed Sep 27 22:27:38 PDT 2023[INFO]: JavaAgent - Java Agent Directory [/opt/appdynamics-java/ver22.9.0.34210]
[AD Agent init] Wed Sep 27 22:27:38 PDT 2023[INFO]: JavaAgent - Java Agent AppAgent directory [/opt/appdynamics-java/ver22.9.0.34210]
Agent logging directory set to [/opt/appdynamics-java/ver22.9.0.34210/logs]
[AD Agent init] Wed Sep 27 22:27:38 PDT 2023[INFO]: JavaAgent - Agent logging directory set to [/opt/appdynamics-java/ver22.9.0.34210/logs]
Could not start Java Agent, disabling the agent with exception java.lang.NoClassDefFoundError: Could not initialize class org.apache.logging.log4j.message.ReusableMessageFactory, Please check log files

In the pod, the jar file (log4j-api) containing ReusableMessageFactory is there (part of the AppDynamics java-agent):

sh-4.4$ pwd
/opt/appdynamics-java/ver22.9.0.34210/lib/tp
sh-4.4$ ls log4j*
log4j-api-2.17.1.1.9.cached.packages.txt  log4j-core-2.17.1.1.9.cached.packages.txt  log4j-jcl-2.17.1.cached.packages.txt
log4j-api-2.17.1.1.9.jar  log4j-core-2.17.1.1.9.jar  log4j-jcl-2.17.1.jar
log4j-api-2.17.1.1.9.jar.asc  log4j-core-2.17.1.1.9.jar.asc  log4j-jcl-2.17.1.jar.asc

From the pod manifest:

- name: JAVA_TOOL_OPTIONS
  value: ' -Dappdynamics.agent.accountAccessKey=$(APPDYNAMICS_AGENT_ACCOUNT_ACCESS_KEY) -Dappdynamics.agent.reuse.nodeName=true -Dappdynamics.socket.collection.bci.enable=true -Dappdynamics.agent.startup.log.level=debug -Dappdynamics.agent.reuse.nodeName.prefix=eric-tmo-des-ms-entitlements -javaagent:/opt/appdynamics-java/javaagent.jar'

I tried with the latest java-agent (23.9) but got the same result. I don't seem to have the problem with Spring Boot 2.7 (which does include log4j-api, as opposed to 3.x). It seems the classloader can't find the class in the java-agent distribution. Has anyone encountered this? Thank you.
Hi @hiersdd, my first hint is to use a syslog server like rsyslog or syslog-ng, so it receives syslogs even when Splunk is down. You could also use SC4S (https://splunkbase.splunk.com/app/4740), which is syslog-ng plus a Universal Forwarder; in this way you can easily manage inputs. Anyway, did you try an input like the following?

[udp://192.168.2.*:514]
connection_host = ip
index = checkpoint
sourcetype = syslog

Ciao. Giuseppe