All Posts
Hi @QuantumRgw , could you describe your situation in more detail? I understand that you have a Heavy Forwarder, but I don't know your requirements or your architecture. Ciao. Giuseppe
Or, do you mean your events each have their own header, like nname,Id,app,Time? If that is the case, use this alternative: | multikv forceheader=1
Hi @gcusello  Could you help me out in this case? If there is no possibility to go back, then I am planning to reinstall Splunk. I am not sure if my dashboards will be gone. Waiting to hear from you.
Even though I am providing accurate inputs, the Speakatoo API is not working as expected for me. Seeking assistance to resolve this issue.
Hi @mayurkale471757 , good for you, see you next time! Let us know if we can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated by all the contributors.
Hi @phularah , the Forwarder License was created just for these purposes: to input, parse, and forward data without local indexing. You don't need a Splunk license on the HF; you need it only on the Indexers. Ciao. Giuseppe
Hi @Dharani , yes, it's possible. You should: set up an action so that when the first alert is triggered, it writes an event to a summary index (better) or to a lookup; then add to your alert the condition that the alert wasn't already triggered, reading from the summary index. If you have Enterprise Security, you don't need the summary index and you can use the Notable index. Ciao. Giuseppe
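The suppression condition described above could be sketched in SPL roughly like this. This is only an illustration: the index name alert_summary, the source name xxx_alert, and the base search are assumptions, not taken from the thread, and the exact search depends on how the first alert action writes its event.

```spl
index=app_logs "xxx error" earliest=-1h
| stats count AS failures
| appendcols
    [ search index=alert_summary source="xxx_alert" earliest=-24h
      | stats count AS already_alerted ]
| where failures > 0 AND already_alerted = 0
```

The subsearch counts how many times the alert already fired in the last 24 hours from the summary index; the outer where only lets the alert trigger when there are new failures and no prior alert in the window.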
Hi, I want to schedule a Splunk alert. Please let me know if the option below is possible:
1. When the first alert is received for the xxx error, the query should check whether this is the first occurrence of the error in the last 24 hours. If yes, the alert email can be triggered immediately; it will not wait for count>15 in 1 hour.
2. If the error is not the first occurrence, then, based on a threshold, we should send only one email for more than 15 failures in an hour or so. That is, set up a Splunk alert for the xxx error with the threshold: trigger when count>15 in the last 1 hour.
Please help with this.
Hi, the document was clear. I need to know whether AppDynamics is capturing the Windows service application [with Error Transaction] or not; I am unable to get the error transaction details. Thanks in advance.
@Amolsbhalerao  You may have a stale splunk.pid file that is not being cleared on startup; this can prevent the splunkd process from starting. If this is the case, make a backup copy of the splunk.pid file in the /splunkforwarder/var/run/splunk directory, delete the original, and restart Splunk. Run the ps -ef | grep splunk command to verify the process is running.
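The backup-then-remove step above can be simulated like this. This is a sketch only: it uses a temporary directory and a made-up pid value in place of the real /splunkforwarder/var/run/splunk path, and the restart itself is out of scope.

```python
import os
import shutil
import tempfile

# Stand-in for /splunkforwarder/var/run/splunk (illustrative temp dir)
run_dir = tempfile.mkdtemp()
pid_file = os.path.join(run_dir, "splunk.pid")

with open(pid_file, "w") as f:
    f.write("12345")                      # the stale pid left behind

shutil.copy(pid_file, pid_file + ".bak")  # keep a backup copy first
os.remove(pid_file)                       # remove the stale pid file

# After this you would restart Splunk (e.g. ./splunk start) and verify
# with: ps -ef | grep splunk
print(os.path.exists(pid_file))           # False
```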
Thanks, guys; changing the settings on the HF solved this issue.
Thanks for the valuable query. A few points here:
1. I am unable to locate my HF under the h field (searched by IP as well as by hostname).
2. How can I put a restriction on a per-day basis, e.g., to create a bar chart of license consumption during the week?
3. I have another way to look into it, as I mainly would like to calculate data ingestion where the index names share a common starting name, like index="test*", and I found a field, idx, to query the same. However, how do I add up all the data and show it in a graph?
4. Also, I think this is license in GB: | eval licenseGB=round(license/1024/1024/1024,3). Why did you rename it to TB?
@uagraw01  +00:00 is an offset from UTC. The %z value should parse this in -/+HHMM format.
@azteksites  I am still confused about the 00:00 (in the last two patterns).
Hi @pm2012  you can use the following query:
index=_internal source="*license_usage.log" type=Usage h="<forwarder name>"
| rename _time as Date
| eval Date=strftime(Date,"%b-%y")
| stats sum(b) as license by Date h
| eval licenseGB=round(license/1024/1024/1024,3)
| rename licenseGB as TB
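The eval in that query is a plain bytes-to-gigabytes conversion (b in license_usage.log is in bytes); a quick sketch of the same arithmetic, in Python purely for illustration. Note the final rename labels the column TB even though the value computed is in GB, which is also questioned elsewhere in this thread.

```python
def bytes_to_gb(num_bytes: float) -> float:
    """Convert bytes to gigabytes, rounded to 3 decimals; mirrors
    round(license/1024/1024/1024, 3) in the SPL eval above."""
    return round(num_bytes / 1024 / 1024 / 1024, 3)

print(bytes_to_gb(1073741824))   # 1 GiB in bytes -> 1.0
print(bytes_to_gb(536870912))    # 0.5 GiB in bytes -> 0.5
```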
You can try the following TIME_FORMAT value to parse the timestamp, TIME_FORMAT = %Y-%m-%dT%H:%M:%S,%3N%z  
Thanks for the answer. By the way, have you missed %M? It should be like this: %Y-%m-%dT%H:%M:%S,%3Q+00:00
Hi @uagraw01  it seems ,533 is milliseconds 2023-12-05T04:21:21,533+00:00 %Y-%m-%dT%H:%S,%3Q+00:00
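Splunk's strptime directives (%3N, %3Q, %z) differ slightly from Python's, but the timestamp under discussion can be sanity-checked with Python's datetime, where %f covers the fractional seconds and %z accepts the +00:00 offset; this is an illustration of the parse, not Splunk's own parser.

```python
from datetime import datetime

# Parse the timestamp from the thread. Python's %f accepts 1-6 fractional
# digits, so ",533" is read as 533 milliseconds (533000 microseconds),
# and %z handles the "+00:00" UTC offset (Python 3.7+).
ts = datetime.strptime("2023-12-05T04:21:21,533+00:00",
                       "%Y-%m-%dT%H:%M:%S,%f%z")
print(ts.minute, ts.microsecond)   # 21 533000
```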
Hi SMEs, hope you are doing great. I am curious to know how to check the daily data consumption (GB/day) from a specific Heavy Forwarder using a Splunk search, when there are multiple HFs in the deployment. Thanks in advance.
Please help me get the time format for the string below for props.conf. I am confused by the last part of the pattern (,533+00:00).   2023-12-05T04:21:21,533+00:00   Thanks in advance.