All Posts


Hello, I want to send syslog entries to Splunk directly. The config is done with the command "syslogadmin --set -ip xxx.xxx.xxxx.xxx -port 65456". Here is the configuration in place, but it does not work. Are there other ports or other configurations to set up? Any ideas?

syslogadmin --show -ip
syslog.1 xxx.xxx.xxx.29 port 65456
syslog.2 xxx.xxx.xxx.30 port 65456
syslog.3 xxx.xxx.xxx.31 port 65456
syslog.4 xxx.xxx.xxx.80 port 65456

syslogadmin --show -facility
Syslog facility: LOG_LOCAL7

auditcfg --show -filter
Audit filter is enabled.
1-ZONE 2-SECURITY 3-CONFIGURATION 4-FIRMWARE 5-FABRIC 7-LS 8-CLI 9-MAPS
Severity level: INFO
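For the switch to have anywhere to send those entries, Splunk itself must be listening on that port - a plain syslog feed is received through a network input, not through HEC. A minimal sketch of such an input, where UDP and the index name are assumptions to adapt to your environment:

inputs.conf (on the receiving Splunk instance):

[udp://65456]
sourcetype = syslog
index = network
connection_host = ip

If the switch sends over TCP instead, the stanza would be [tcp://65456]. It is also worth confirming that no firewall between the switch and the indexer blocks port 65456.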
| metadata type=hosts index=*
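If this is aimed at the last-update-per-host question, a usage sketch of how the metadata output maps onto the wanted report (metadata returns firstTime, lastTime, recentTime and totalCount per host; lastTime is the latest event timestamp seen):

| metadata type=hosts index=*
| eval days_since_last_update=floor((now()-lastTime)/86400)
| where days_since_last_update>30
| eval last_update_time=strftime(lastTime, "%Y-%m-%d %H:%M:%S")
| table last_update_time host days_since_last_update

Because metadata reads index metadata rather than raw events, this returns in seconds even over all time.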
By default, HEC runs on HTTPS. If you really want to disable SSL, you can change it by doing the below:
- In Splunk UI go to Settings -> Data Inputs -> HTTP Event Collector
- Click the "Global Settings" button and uncheck the "Enable SSL" option
------
If you find this solution helpful, please consider accepting it and awarding karma points!
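For completeness, the same toggle exists in configuration - a sketch assuming direct access to the instance's inputs.conf (the [http] stanza holds HEC's global settings):

[http]
disabled = 0
enableSSL = 0

A splunkd restart is needed for the change to take effect, and clients must then use http:// rather than https:// against port 8088.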
You can do this with CSS - essentially you need to have a multivalue field with the colour you want as the second value. This can be the result of some calculation, e.g. based on the first three characters of the field. Then you can follow this link to see how it is done in similar circumstances: https://community.splunk.com/t5/Dashboards-Visualizations/How-to-update-table-cell-color-as-per-the-another-field/m-p/599965#M49240
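As a sketch of the multivalue part only (the status field name and the colouring rule are made-up placeholders, not anything from your data):

| eval colour=case(substr(status,1,3)="ERR", "red", substr(status,1,3)="WRN", "orange", true(), "green")
| eval status=mvappend(status, colour)
| fields - colour

The CSS from the linked answer then picks up the second value of the cell to set the colour while only the first value is displayed.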
How can I get an output containing all host details over all time along with their last update times? The search below is taking a huge amount of time - how can it be optimized to run faster?

index=*
| fields host, _time
| stats max(_time) as last_update_time by host
| eval t=now()
| eval days_since_last_update=tonumber(strftime((t-last_update_time),"%d"))-1
| where days_since_last_update>30
| eval last_update_time=strftime(last_update_time, "%Y-%m-%d %H:%M:%S")
| table last_update_time host days_since_last_update
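A sketch of a faster equivalent using tstats, which reads index-time metadata instead of scanning raw events (note also that strftime with "%d" on a duration returns a day-of-month rather than a day count, so the version below just divides by 86400):

| tstats max(_time) as last_update_time where index=* by host
| eval days_since_last_update=floor((now()-last_update_time)/86400)
| where days_since_last_update>30
| eval last_update_time=strftime(last_update_time, "%Y-%m-%d %H:%M:%S")
| table last_update_time host days_since_last_update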
There are lots of different skills required or desirable for setting up and utilising Splunk: architecture, infrastructure design, network management, data science, UX design, coding (to some degree), etc. There are plenty of courses available through Splunk Education and other providers. Professional services can also help you with this.
Hello @ITWhisperer, Thank you for your response. I wanted to let you know that API responses are not published in Splunk yet. While I have limited experience with SPL and haven't built any Splunk dashboards, I am very interested in working with Splunk. Looking forward to your guidance!  
Hello Splunkers!!

I have ingested data into Splunk from the source system using the URI "https://localhost:8088/services/collector" along with the HEC token. However, the data is not being displayed in Splunk with the appropriate sourcetype parsing, which is affecting the timestamp settings for the events. The sourcetype and timestamp are currently being displayed as below.

My actual props.conf settings are as below:

[agv_voot]
DATETIME_CONFIG =
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
category = Custom
KV_MODE = json
pulldown_type = 1
TIME_PREFIX = ^\@timestamp
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N
TIMESTAMP_FIELDS = @timestamp
TRANSFORMS-trim_timestamp = trim_long_timestamp

transforms.conf:

[trim_long_timestamp]
REGEX = (\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3})\d+(-\d{2}:\d{2})
FORMAT = $1

Please help to fix the proper parsing with correct sourcetype and timestamp.
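One thing worth checking (an assumption based on the URI shown, not a confirmed diagnosis): the /services/collector event endpoint takes its timestamp from the "time" field of the HEC JSON envelope and does not run the event body through index-time timestamp extraction, so TIME_PREFIX / TIME_FORMAT are never consulted. To have props.conf parse the payload, the raw endpoint can be used instead - a sketch with placeholder channel GUID and token:

curl -k "https://localhost:8088/services/collector/raw?channel=<guid>&sourcetype=agv_voot" \
  -H "Authorization: Splunk <token>" \
  -d '{"@timestamp":"2024-01-01T10:00:00.123456-05:00","msg":"test"}'

Alternatively, keep the event endpoint and have the sender put the epoch timestamp into the envelope's "time" field.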
Is your data already in Splunk? Have the fields already been extracted? Do you know how to write SPL? Do you know how to create dashboards?
| eval average=floor(average)
Hi, can we force the default expiration of all scheduled searches to 24 hours in Splunk Cloud? I came across a few posts/docs which state that this can be done, but it was unclear which configuration we need to change.
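For reference, the setting involved is dispatch.ttl in savedsearches.conf - a sketch assuming it is shipped in an app's local directory (in Splunk Cloud direct file access is limited, so this typically goes through an uploaded app or the per-search "Expiration" UI):

[default]
# artifact time-to-live in seconds; 86400 = 24 hours
dispatch.ttl = 86400

Per the spec file the value is an integer number of seconds (optionally suffixed with p to mean a multiple of the search's scheduled period), and a [default] stanza applies it to every scheduled search in that app.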
Splunk HEC was configured as defined in the documentation. I can see that I can send data using the HTTPS URL. When sending the same data using the HTTP URL, the request fails with the error "curl: (56) Recv failure: Connection reset by peer".

curl https://<host>:<port>/services/collector -H 'Authorization: Splunk <token>' -d '{"sourcetype": "demo", "event": "Test data!"}'
Output/Response: {"text":"Success","code":0}

curl http://<host>:<port>/services/collector -H 'Authorization: Splunk <token>' -d '{"sourcetype": "demo", "event": "Test data!"}'
curl: (56) Recv failure: Connection reset by peer

This was the command used to enable the token, which worked perfectly fine:

/opt/splunk/bin/splunk http-event-collector enable -name <hec_name> -uri https://localhost:8089

Then I had to enable the HTTP URL and executed the below command:

/opt/splunk/bin/splunk http-event-collector enable -name catania-app-stat -uri http://localhost:8089

Error/Output: Cannot connect Splunk server

What am I missing here? How do I get the source to send data over the HTTP protocol?
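A likely reading (an assumption from the errors, not a confirmed diagnosis): the -uri flag of the splunk CLI points at the management port (8089), which always speaks HTTPS regardless of how HEC is configured, so http://localhost:8089 gets its connection reset. Whether HEC itself accepts plain HTTP on 8088 is a separate switch, controlled by enableSSL in the [http] stanza of inputs.conf (or by unchecking "Enable SSL" under Settings -> Data Inputs -> HTTP Event Collector -> Global Settings):

[http]
enableSSL = 0

After a splunkd restart, the original curl command against http://<host>:8088/services/collector should stop being reset.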
Maybe something like

| mstats rate(Query) as QPS where index=metrics host=* by Site span=5m
| streamstats window=2 global=false current=false stdev(QPS) as devF by Site
| sort Site, - _time
| streamstats window=2 global=false current=false stdev(QPS) as devB by Site
| where 4*devF > QPS OR 4*devB > QPS
| timechart span=5m values(QPS) by Site
Hi @tsocyberoperati ,

you have to follow the instructions at https://docs.splunk.com/Documentation/Splunk/9.3.1/Forwarding/Forwarddatatothird-partysystemsd#Forward_a_subset_of_data

in props.conf:

[host::hostA]
TRANSFORMS-hostA = send_to_syslog

in transforms.conf:

[send_to_syslog]
REGEX = .
DEST_KEY = _SYSLOG_ROUTING
FORMAT = my_syslog_group

where my_syslog_group is the stanza in outputs.conf.

Ciao.
Giuseppe
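For completeness, a sketch of the outputs.conf stanza being referenced (host, port and protocol are placeholders for your syslog receiver):

[syslog:my_syslog_group]
server = 10.0.0.5:514
type = udp

The name after "syslog:" must match the FORMAT value in transforms.conf.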
We are looking into building our own AI chatbot integrating the Splunk AI Assistant. Can the Splunk AI Assistant be called via API calls from our application to get the responses? If possible, can you provide further details about this?
Hi @arjun_ananth ,

I don't like the lookup method, I'd prefer to use a summary index: schedule a search every night (if the change frequency that you want to monitor is one day), e.g.:

index=your_index
| dedup ip
| table _time host ip
| collect index=your_summary

and then run a search on the summary index:

index=your_summary
| stats dc(ip) AS ip_count BY host
| where ip_count>1

In this way you don't have the problem of managing the timestamp and lookup updates, and, at the same time, you have a quick search.

Ciao.
Giuseppe
Working on a query to generate an alert when a field value changes. The requirement is to detect the change in IP for an FQDN. Currently I'm trying to use a lookup file which has the current value of the IP for two FQDNs per host.

Columns - Host | FQDN | Current_IP

Looks something like:

Host1 fqdn1 IP1
Host2 fqdn1 IP1
Host1 fqdn2 IP2
Host2 fqdn2 IP2

I followed an approach suggested in another thread to use inputlookup. My current query looks like:

... | stats latest(IP) as Latest_IP
| inputlookup append=true myfile.csv
| stats first(Latest_IP) as Latest_IP, first(Current_IP) as Previous_IP
| where Latest_IP!=Previous_IP

This gives me a result with the latest and previous IP whenever the IP changes, but I'm looking to add more details to the result, which would also list the FQDN and the time when the IP changed.
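A sketch of one way to keep the FQDN and the change time in the result (field names are taken from the post where possible; the base search is a placeholder): group the live data by Host and FQDN and join the lookup per pair instead of appending it:

index=your_index
| stats latest(IP) as Latest_IP latest(_time) as changed_at by Host FQDN
| lookup myfile.csv Host, FQDN OUTPUT Current_IP as Previous_IP
| where Latest_IP!=Previous_IP
| eval changed_at=strftime(changed_at, "%Y-%m-%d %H:%M:%S")
| table Host FQDN Previous_IP Latest_IP changed_at

Using lookup rather than inputlookup append=true keeps each row tied to its Host/FQDN pair, so the comparison happens per FQDN and the event time survives into the output.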
We've configured -Dappagent.start.timeout=30000 for the Java agent on the webapps after we hit the issue of pods failing to start due to AppD taking a lot of time initially, which was delaying the liveness probe in EKS. With the timeout added, per the docs the AppD agent starts in parallel with the application startup, reducing the overall startup time.

A few days ago we started seeing the below issue in webapps running on a WildFly server, where it reports java.lang.NoClassDefFoundError: com/singularity/ee/agent/appagent/entrypoint/bciengine/FastMethodInterceptorDelegatorBoot - and surprisingly it gives this error when we remove the timeout configuration. Can anyone confirm if they have come across this issue?

9:17:46,228 INFO [stdout] (AD Agent init) Agent will mark node historical at normal shutdown of JVM
09:17:50,323 INFO [stdout] (AD Agent init) Registered app server agent with Node ID[455861] Component ID[6467] Application ID [553]
09:17:56,727 ERROR [stderr] (Reference Reaper #2) Exception in thread "Reference Reaper #2" java.lang.NoClassDefFoundError: com/singularity/ee/agent/appagent/entrypoint/bciengine/FastMethodInterceptorDelegatorBoot
09:17:56,729 ERROR [stderr] (Reference Reaper #2) at org.wildfly.common.ref.References$ReaperThread.run(References.java)
09:17:56,822 ERROR [stderr] (Reference Reaper #1) Exception in thread "Reference Reaper #3" Exception in thread "Reference Reaper #1" java.lang.NoClassDefFoundError: com/singularity/ee/agent/appagent/entrypoint/bciengine/FastMethodInterceptorDelegatorBoot
09:17:56,823 ERROR [stderr] (Reference Reaper #1) at org.wildfly.common.ref.References$ReaperThread.run(References.java)
09:17:56,823 ERROR [stderr] (Reference Reaper #1) Caused by: java.lang.ClassNotFoundException: com.singularity.ee.agent.appagent.entrypoint.bciengine.FastMethodInterceptorDelegatorBoot from [Module "org.wildfly.common" version 1.6.0.Final from local module loader @7a30d1e6 (finder: local module finder @5891e32e (roots: /opt/jboss/modules,/opt/jboss/modules/system/layers/base))]
09:17:56,824 ERROR [stderr] (Reference Reaper #1) at org.jboss.modules.ModuleClassLoader.findClass(ModuleClassLoader.java:200)
09:17:56,824 ERROR [stderr] (Reference Reaper #1) at org.jboss.modules.ConcurrentClassLoader.performLoadClassUnchecked(ConcurrentClassLoader.java:410)
09:17:56,824 ERROR [stderr] (Reference Reaper #1) at org.jboss.modules.ConcurrentClassLoader.performLoadClass(ConcurrentClassLoader.java:398)
09:17:56,825 ERROR [stderr] (Reference Reaper #1) at org.jboss.modules.ConcurrentClassLoader.loadClass(ConcurrentClassLoader.java:116)
09:17:56,825 ERROR [stderr] (Reference Reaper #1) ... 1 more
09:17:56,826 ERROR [stderr] (Reference Reaper #3) java.lang.NoClassDefFoundError: com/singularity/ee/agent/appagent/entrypoint/bciengine/FastMethodInterceptorDelegatorBoot
09:17:56,827 ERROR [stderr] (Reference Reaper #3) at org.wildfly.common.ref.References$ReaperThread.run(References.java)
09:18:05,627 INFO [stdout] (AD Agent init) Started AppDynamics Java Agent Successfully.
09:18:41,038 ERROR [org.xnio.nio] (default I/O-2) XNIO000011: Task org.xnio.nio.WorkerThread$SynchTask@191c0b4b failed with an exception: java.lang.NoClassDefFoundError: com/singularity/ee/agent/appagent/entrypoint/bciengine/FastMethodInterceptorDelegatorBoot
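Not a confirmed diagnosis, but possibly related: AppDynamics' JBoss/WildFly install notes require the agent's com.singularity packages to be made visible to WildFly's module class loaders via the jboss.modules.system.pkgs system property, and the stack trace above shows exactly such a class failing to load from an org.wildfly.common module. A sketch of the setting (the existing package list on your server may differ):

# standalone.conf, or wherever JAVA_OPTS is assembled for the container
JAVA_OPTS="$JAVA_OPTS -Djboss.modules.system.pkgs=org.jboss.byteman,com.singularity"

If that property is already set, it would still be worth checking that it survived whatever change accompanied removing -Dappagent.start.timeout.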
Hi Ryan, I was able to get this by using the sum of calls/min of a particular transaction. This gives the exact calls, so ADQL was not required. Thanks, Fadil
Hello Everyone,

I have 2 individual systems from which I am getting API (GET) responses. I have a requirement to compare these JSON responses coming from the 2 different systems and, if the payloads match, mark it as 'SUCCESS', else 'FAILURE'. I want to build a report based on these results.

Can anyone please check and let me know a possible solution in Splunk? And also let me know what Splunk skills we need to achieve this requirement. Thanks.
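A sketch of the comparison step, assuming both responses are already indexed as JSON events and share a correlation field (the index, sourcetypes and request_id field are placeholders, not anything confirmed by the post):

index=api_data sourcetype IN (system_a, system_b)
| eval payload_hash=sha256(_raw)
| stats dc(payload_hash) as distinct_payloads values(sourcetype) as systems by request_id
| eval result=if(distinct_payloads=1 AND mvcount(systems)=2, "SUCCESS", "FAILURE")
| table request_id systems result

Skills-wise this mainly needs basic SPL (eval, stats, simple dashboards), plus whatever ingestion method gets the API responses into Splunk in the first place, e.g. a scripted input or HEC.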