Activity Feed
- Posted Re: How obtain the sum of a multivalue field on Splunk Search. 07-08-2024 04:58 AM
- Posted Re: Can I add python modules to the Splunk environment? on Splunk Dev. 08-18-2023 06:12 AM
- Posted Why is Splunk giving different result when searching on _bkt field? on Splunk Search. 08-18-2023 06:02 AM
- Tagged Why is Splunk giving different result when searching on _bkt field? on Splunk Search. 08-18-2023 06:02 AM
- Got Karma for Re: Can I add python modules to the Splunk environment?. 04-20-2022 01:42 PM
- Karma Is there a REST endpoint that allows replacing a static resource? for vbumgarner. 06-29-2021 09:14 AM
- Karma Re: debug refresh or _bump using REST API for seegeekrun. 02-11-2021 02:58 AM
- Got Karma for Re: Can I add python modules to the Splunk environment?. 09-19-2020 10:46 PM
- Posted Re: Can I add python modules to the Splunk environment? on Splunk Dev. 09-19-2020 09:21 PM
- Got Karma for Re: Can I add python modules to the Splunk environment?. 09-19-2020 07:11 PM
- Posted Re: Can I add python modules to the Splunk environment? on Splunk Dev. 09-19-2020 05:01 PM
- Karma Re: How do you extract a string from field _raw? for vnravikumar. 06-05-2020 12:50 AM
- Got Karma for Re: Can you help me with a dashboard based on reports that are filtering by time?. 06-05-2020 12:50 AM
- Karma Re: How to get search results from SplunkJS on click of a button? for scottsavaresevi. 06-05-2020 12:48 AM
- Karma Why is my SAML (SSO) session not destroyed after logout? for ryangpeng. 06-05-2020 12:48 AM
- Posted Re: Table time field using transaction on Splunk Search. 10-18-2019 01:55 AM
- Posted Re: Integrate Splunk Support portal tickets with on-prem Splunk on #Random. 10-18-2019 01:50 AM
- Posted Re: Where should I install Fortinet Fortigate Add-On for Splunk? on Splunk Enterprise Security. 10-17-2019 03:24 AM
- Posted Re: How do you specify which version of the REST API to use? on Getting Data In. 10-11-2019 09:43 AM
Topics I've Started
07-08-2024
04:58 AM
Another easy way is to use the foreach command; below is an example:
| makeresults
| eval mv=mvappend("5", "15"), total=0, count=0
| foreach mode=multivalue mv
    [ eval total = total + <<ITEM>>, count = count + 1 ]
08-18-2023
06:12 AM
Just put all the module files in the bin folder of your app and add the code below to your Python script; then you can import any custom module. Let me know if you are facing any errors.
import os
import sys
import re

# Name of your Splunk app and the folder (inside bin) that holds the bundled packages
ta_name = '<splunk_app_name>'
ta_lib_name = '<parent_folder_of_python_package>'

# Remove other apps' bin directories from sys.path so their modules cannot shadow yours
pattern = re.compile(r"[\\/]etc[\\/]apps[\\/][^\\/]+[\\/]bin[\\/]?$")
new_paths = [path for path in sys.path if not pattern.search(path) or ta_name in path]

# Put the bundled package folder first so it takes priority on import
new_paths.insert(0, os.path.sep.join([ta_lib_name]))
sys.path = new_paths
Happy Splunking!!! #dataelicitsol.com #bhavik.bhalodia@dataelicitsol.com
08-18-2023
06:02 AM
When I run the search below, I do not get any events:
index=main_vulnerability_database sourcetype=vulnerability_overview _bkt="main_vulnerability_database~0~FB1A6C9D-87F2-4A38-B420-94F2171CE493" _cd=0:1015
But when I add the search command, I do get events:
index=main_vulnerability_database sourcetype=vulnerability_overview
| search _bkt="main_vulnerability_database~0~FB1A6C9D-87F2-4A38-B420-94F2171CE493" _cd=0:1015
Ideally, both should give the same results. I am looking for the reason why this happens.
- Tags:
- _bkt
- _cd
- splunk search
09-19-2020
09:21 PM
1 Karma
Thanks @inventsekar
09-19-2020
05:01 PM
2 Karma
Just add the Python code below to your main script and it will work:
import os
import sys
import re

# Name of your Splunk app and the folder (inside bin) that holds the bundled packages
ta_name = '<splunk_app_name>'
ta_lib_name = '<parent_folder_of_python_package>'

# Remove other apps' bin directories from sys.path so their modules cannot shadow yours
pattern = re.compile(r"[\\/]etc[\\/]apps[\\/][^\\/]+[\\/]bin[\\/]?$")
new_paths = [path for path in sys.path if not pattern.search(path) or ta_name in path]

# Put the bundled package folder first so it takes priority on import
new_paths.insert(0, os.path.sep.join([ta_lib_name]))
sys.path = new_paths
10-18-2019
01:50 AM
Hi @adalbor
You can create a Python script that queries the Splunk Support portal, fetches all the required information, and sends it to Splunk using HEC or on any open port, and have this script run at a defined interval.
Let me know for more information.
10-17-2019
03:24 AM
Hi @bsuresh1
As per your requirement, you have to install the Add-on on the Heavy Forwarder (HF). There is no need to uninstall the Add-on from the Search Head.
10-11-2019
09:43 AM
Can you please point us to the Splunk documentation where you found the above reference?
05-21-2019
12:19 AM
Can you please explain what the sentence below means?
This practice does not generate the results you would expect.
05-05-2019
02:50 AM
I am getting the same error. PDF generation from dashboards works fine, but scheduled PDF deliveries do not work properly.
02-07-2019
03:41 AM
Hi @lemmons2
Here are the answers to your questions:
So to use the Puppet TA does this require the following?
a) Install Enterprise Splunk on the PE Server. No
b) Install Splunk Puppet TA on the PE Server. No
c) Configure Splunk as HF (heavy forwarder) on the PE server. No
d) Configure the HF to send the logs its gathered to our institutional Splunk indexer cluster? Yes
Does this require that the HF on the PE server have its own Splunk license for the daily ingest of PE server logs?
Answer: No, there is no requirement to purchase a separate license for the HF. You can use the same license that is used for Splunk Enterprise.
Since my organization manages the institutional Splunk infrastructure but doesn't manage the PE server or have access to it, would this be considered risky or unconventional to have an outside organization running a HF that forwards data into our indexers?
Answer: You can create an HF in your own environment and install the Splunk Add-on for Puppet Enterprise on it, then configure the inputs in that Add-on. If you already have an HF in your environment, there is no need to create a new one; you can install the Add-on on that HF.
Are there better ways to do this that allows my organization to centrally manage all aspects of getting the PE server logs into Splunk? Perhaps simply setting up a universal forwarder on the PE server?
Answer: I think question 3 already answers this one. You cannot implement this use case with the help of universal forwarders.
If you think this helps you, please upvote this answer.
Thanks,
Bhavik
02-06-2019
08:38 AM
Hi @memow8
I don't think there is a built-in way to do this in Splunk, but you can implement your own logic outside of Splunk to get this functionality: the logic fetches the time from the CSV, and that time is then used to reschedule the report. You can reschedule the report through the REST API.
https://docs.splunk.com/Documentation/Splunk/7.2.3/RESTREF/RESTsearch#scheduled.2Fviews.2F.7Bname.7D.2Fdispatch
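As a starting point, you can list the saved searches and their current schedules from the search bar with the rest command; this is only a sketch, and the actual reschedule is then a POST of a new cron_schedule value to the same saved/searches REST endpoint:
| rest /services/saved/searches splunk_server=local
| search is_scheduled=1
| table title, cron_schedule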
Thanks,
Bhavik
01-15-2019
06:25 AM
Hi abhijittikekar,
Try running the query below.
sourcetype=linux:audit type=CWD
| table msg, cwd
| map
    [ search sourcetype=linux:audit NOT auid=4294967295 type=SYSCALL (comm=chmod OR comm=rm OR comm=chown) msg=$msg$
    | eval cwd=$cwd$
    | table _time, msg, auid, cwd ]
| map
    [ search sourcetype=linux:audit NOT auid=4294967295 msg=$msg$ type=EXECVE ((a0=chmod (a1=-R a2=777 OR a2=755) OR (a1=777)) OR (a0=rm a2=-r OR a2=-rf) OR (a0=chown))
    | eval auid=$auid$, cwd=$cwd$
    | table _time, msg, host, a0, a1, a2, a3, auid, cwd ]
| table _time, host, msg, auid, a0, a1, a2, a3, cwd
Thanks,
Bhavik
01-15-2019
05:40 AM
Hi @manalhadrach,
You can check the script's errors by running the query below.
index="_internal" "cpu.sh"
Thanks,
Bhavik
01-15-2019
04:43 AM
Hi @naagaraj,
First of all, it is not advisable to run Splunk on a Windows server, but if you still want to use a Windows server, then in your case I think there is an issue with the firewall. You have to allow your instances to access port 8000 on the Windows server on which Splunk is installed. You can check this by running the command below on the instance from which you are trying to access Splunk.
telnet ipaddress_of_splunk_server 8000
The above command tells you whether your instance can connect to the Splunk server or not. If your instance cannot connect to the Splunk server, then check the firewall rules of the instance from which you are accessing Splunk. Those rules must allow outbound traffic on port 8000 so the instance can reach the Splunk server.
Thanks,
Bhavik
01-11-2019
12:46 PM
@luke222010,
You can try the query below (note that append, rather than appendpipe, is needed here, because the subsearch has to run a new search instead of filtering the existing results):
sourcetype="sourcetype_a"
| table msgID
| append
    [ search sourcetype="sourcetype_b" result="success"
    | table result, msgID ]
| stats values(result) as result count by msgID
| where count=2
| table msgID, result
Thanks,
Bhavik
01-11-2019
12:40 PM
Hi Dannili,
Try checking this with a KV store lookup; you might get your answer there.
Thanks,
Bhavik
01-11-2019
12:15 PM
1 Karma
Hi Patrycja,
You can create a data model that takes data from an index and summarizes it. When you then fetch data from the data model, you get results more quickly, and you can apply a time range to the search when you query the data model.
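For example, once the data model is built and accelerated, you can query it with the tstats command. This is a minimal sketch, assuming a hypothetical data model named My_Model whose root dataset is also named My_Model and which contains a host field; the time range picker applies to this search as well:
| tstats count from datamodel=My_Model by My_Model.host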
Thanks,
Bhavik
12-30-2018
03:21 PM
Hi,
You can handle this requirement in the query itself. Build the alert's search so that it returns 0 events whenever it runs on a holiday; the addinfo and lookup commands are useful for this.
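A minimal sketch, assuming a lookup file named holidays.csv with a holiday_date column in YYYY-MM-DD format; the alert search then returns 0 events whenever the day it runs on appears in the lookup:
your_alert_search
| addinfo
| eval search_date=strftime(info_min_time, "%Y-%m-%d")
| lookup holidays.csv holiday_date AS search_date OUTPUT holiday_date AS matched_holiday
| where isnull(matched_holiday)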
If you are satisfied with the comment, please upvote it.
Thanks,
Bhavik
12-30-2018
02:43 PM
Add the two additional options below in inputs.conf and check again.
[monitor://(file path)]
disabled = 0
index = XXXXX
sourcetype = XXXXX
# Skip files whose modification time is older than this window
ignoreOlderThan = (non-negative integer)[s|m|h|d]
# Open the file on every poll, even if its modification time appears unchanged
alwaysOpenFile = 1
12-28-2018
02:28 AM
You can try the settings below in /opt/splunk/etc/system/local/limits.conf and check whether the results change.
[searchresults]
# Memory (in MB) that result processing may use
max_mem_usage_mb = 400
# Maximum number of rows a search can return
maxresultrows = 10000000
# Retry count and retry interval when writing results to csv files
tocsv_maxretry = 10
tocsv_retryperiod_ms = 1000
12-27-2018
10:38 AM
Hi, you can use the query below to get a list of the users who are outside of the country, excluding throttled users.
index="authenticatior" action=success
| search "location.country"!="" AND "location.country"!="US"
| table _time, device, username, user_first, user_last, user_managedBy, factor, integration, result, location.city, location.country
| lookup mylookup.csv username OUTPUT last_date
| where isnull(last_date)
| fields - last_date
| eval _time=strftime(_time, "%m/%d/%y %I:%M:%S:%p")
| rename _time as Timestamp, location.city as City, location.country as Country, user_managedBy as Manager, username as "User ID", user_first as First, user_last as Last, factor as Factor, integration as Integration, result as Result, device as Device
| sort Last
And use the query below to add a user to the lookup.
| inputlookup mylookup.csv
| append
    [| makeresults
    | eval username="Name of User", numberofdays=<number_of_days>, last_date=_time+86400*numberofdays
    | fields username, last_date]
| outputlookup mylookup.csv
You have to schedule the query below to remove throttled users from the lookup once their time expires; schedule it to run every day, for example at 12:00 AM.
| inputlookup mylookup.csv
| where last_date > now()
| outputlookup mylookup.csv
12-27-2018
05:01 AM
It seems that the callId field is a string and contains a space after the value of callId; because of that, your problem is only solved when you use a wildcard (*).
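If the trailing space is indeed the cause, you can also normalize the field instead of relying on wildcards. A minimal sketch (trim strips leading and trailing whitespace; replace expected_value with your own value):
your_search
| eval callId=trim(callId)
| search callId="expected_value"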