Activity Feed
- Posted Can UF be installed in E drive and monitor logs present in different drive (Network drive) on Splunk Enterprise. 03-26-2023 10:24 PM
- Tagged Can UF be installed in E drive and monitor logs present in different drive (Network drive) on Splunk Enterprise. 03-26-2023 10:24 PM
- Posted Re: Can we fetch the logs from a drive when UF is not installed on Splunk Enterprise. 03-26-2023 10:19 PM
- Karma Re: Can we fetch the logs from a drive when UF is not installed for glc_slash_it. 03-24-2023 05:12 AM
- Posted Re: Can we fetch the logs from a drive when UF is not installed on Splunk Enterprise. 03-24-2023 04:47 AM
- Posted Re: Rex command to extract data from csv on Splunk Enterprise. 03-24-2023 03:42 AM
- Posted Re: Can we fetch the logs from a drive when UF is not installed on Splunk Enterprise. 03-24-2023 03:36 AM
- Posted Can we fetch the logs from a drive when UF is not installed? on Splunk Enterprise. 03-24-2023 02:05 AM
- Tagged Can we fetch the logs from a drive when UF is not installed? on Splunk Enterprise. 03-24-2023 02:05 AM
- Posted What rex command extracts data from csv? on Splunk Enterprise. 03-23-2023 11:42 PM
- Tagged What rex command extracts data from csv? on Splunk Enterprise. 03-23-2023 11:42 PM
- Posted Re: Calculate average of specific fields on Splunk Search. 03-21-2023 12:37 AM
- Posted Re: Calculate average of specific fields on Splunk Search. 03-20-2023 11:21 PM
- Posted How to calculate average of specific fields? on Splunk Search. 03-20-2023 11:00 AM
- Posted Re: Case condition to check 2 events on the same field on Splunk Search. 03-10-2023 12:55 AM
- Posted Re: Case condition to check 2 events on the same field on Splunk Search. 03-09-2023 11:41 PM
- Posted Case condition to check 2 events on the same field on Splunk Search. 03-09-2023 10:47 PM
- Posted Re: Monitor Job in Dashboard on Splunk Enterprise. 03-02-2023 07:21 AM
- Karma Re: Monitor Job in Dashboard for ITWhisperer. 03-02-2023 07:20 AM
- Posted Re: Monitor Job in Dashboard on Splunk Enterprise. 03-02-2023 07:06 AM
2 weeks ago
Did you solve this problem? I had the same symptoms and was wondering how you solved it.
04-09-2024
12:39 AM
Please, don't dig out old threads. Let them rest in peace 🙂 But seriously, to gain more visibility you should just make a new thread, possibly linking to any information you already found for reference. But to the point - if all else fails, you can always create your own script using Selenium and emulate a user clicking through your SharePoint share and downloading the files, but it's a very, very ugly idea.
02-26-2024
12:49 AM
Hi, I have the same issue but it's not working for me. I first created the lookup and saved the search as a report, and now I need to edit my query to append ONLY new values. The current query does not push any values at all.
index="rapid7_threat_intelligence" type="Domain"
| table _time, source, type, value
| outputlookup DOMAIN_IOC_ACTIVE.csv append=true
| append [ | inputlookup append=true DOMAIN_IOC_ACTIVE.csv]
| dedup value
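A sketch of one common alternative pattern (using the same index, lookup, and field names as above; an untested suggestion, not a verified fix): combine the new results with the existing lookup contents first, dedup, and only then write the lookup, so the deduplication takes effect before the file is saved.
index="rapid7_threat_intelligence" type="Domain"
| table _time, source, type, value
| inputlookup append=true DOMAIN_IOC_ACTIVE.csv
| dedup value
| outputlookup DOMAIN_IOC_ACTIVE.csv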
- Tags:
- lookup
11-17-2023
07:52 AM
Have you already managed to get it working? I need to do the same for a client. Thank you in advance.
04-03-2023
02:31 AM
Great, that means the UF is working fine, communicating with Splunk and sending logs. In index=_internal, do you see any reference to the input stanza? If you can't find logs mentioning the full path (//gtyojn201gp.kttc.aoi.com\Share\Integrations\MyLogins\out\REM*), try searching for a portion of the path. That should point you to the problem.
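As a hedged illustration of the kind of search meant here (the host filter and path fragment are assumptions to adapt; TailingProcessor and TailReader are the usual file-monitoring components):
index=_internal sourcetype=splunkd host=<your_forwarder> (TailingProcessor OR TailReader) "MyLogins"
Log lines from those components normally show whether the monitored path was recognized and whether the forwarder could read it.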
03-29-2023
03:00 AM
Hi, have you managed to come up with a solution? I've been trying for a while now to apply a hyperlink just to a selected column value, or just to a column, instead of the whole row. Regards, Bartek
03-27-2023
10:47 AM
Did you restart the UF after editing inputs.conf (it's required)? Check splunkd.log to see if the UF reports any problems reading E:.
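For example, on Windows the forwarder is typically restarted from its bin directory (the install path below is only an assumption; substitute your own):
E:\SplunkUniversalForwarder\bin\splunk.exe restart
After the restart, splunkd.log (under var\log\splunk in the install directory) should show whether the monitor stanza for the E: path was loaded.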
03-24-2023
03:51 AM
Sorry, I left the '=' in unnecessarily - try this:
| inputlookup Meta.csv
| rex field=Application "^(?<name>.*) \["
03-21-2023
12:37 AM
@somesoni2 @PickleRick The code below was my fix:
| eval var=1
| addcoltotals COUNT* Q1* Q2* Q3* var Total
| foreach Q1_Score*
[ eval '<<FIELD>>' = round('<<FIELD>>'/var,2)]
| foreach Q2_Score*
[ eval '<<FIELD>>' = round('<<FIELD>>'/var,2)]
03-10-2023
01:40 AM
But what meaning would the additional column have if it was "attached" to only one parameter? Do you want to have the same value in both rows? Do xyseries and then untable (yes, untable might be tricky if you have multiple columns; you'd need to "pack" and "unpack" them), or do eventstats.
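As a rough, hypothetical illustration of the eventstats variant (the field names are placeholders, since the original fields aren't shown here):
| eventstats values(extra_column) as extra_column by key_field
eventstats computes the aggregate the same way stats would, but writes the result back onto every event, so both rows sharing the same key_field end up carrying the same extra_column value.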
03-02-2023
07:21 AM
@ITWhisperer I guess we can accept an answer only once per post! Anyway, thank you 🙂
02-22-2023
10:49 PM
@bowesmana Thank you. Works perfectly.
01-30-2023
05:12 AM
I have the same use case. Could anyone please point me to where I can find a solution for this?
01-17-2023
10:43 PM
Hi,
I am trying to use the Splunk Add-on for Microsoft Cloud Services on the Splunk Enterprise platform.
I have followed all the steps mentioned in the Splunk doc Configure a Storage Account in Microsoft Cloud Services - Splunk Documentation.
But data is not getting indexed in Splunk unless I select the highlighted option (shown in the picture below) in the Azure storage account.
Due to company policy I cannot set it to "Enabled from all networks". I have tried raising a Microsoft support request but didn't get a solution.
I am able to fetch the data from the storage account directly onto a virtual machine using the azcopy command, but using the add-on I am not able to index/fetch the data into Splunk.
Any help with troubleshooting this issue would be greatly appreciated.
01-06-2023
06:31 AM
1 Karma
Monitor stanzas support a single blacklist setting. That's why you get the message about only the last one being applied. Contrast this with the blacklists for wineventlog stanzas. Blacklists use regular expressions, and there is no concept in regex for "not this year" or "< 2023", etc. I suggest using a whitelist for the current year instead. Something like this:
whitelist = server-202[34]-\d\d-\d\d
will match files created this year and next (giving you time to update it at the end of 2023).
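For context, a sketch of where that setting would live in inputs.conf (the monitored path is a placeholder; the whitelist line is the one suggested above):
[monitor://D:\logs\]
whitelist = server-202[34]-\d\d-\d\d
The whitelist regex is matched against the full path of each candidate file, so only files whose names match are picked up.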
12-26-2022
07:26 AM
Hi,
In place of the count I want to show the server name, and change the color based on a condition when the count is >= 250.
I have referred to many links and docs but could not achieve what I wanted.
index = webss
sourcetype = webphesst earliest= -1d latest=now
| where HTTP=500
| stats count by host
| eval color=if(count>=250, "#dc4e41", "#65a637"), icon=if(count>=250, "times-circle", "check-circle")
12-12-2022
02:43 AM
I need to index only the lines which have .pl in them from the source file into Splunk (the highlighted data below). The regex expression is working as expected (tested in a rex tool). I am now using the props.conf and transforms.conf below to index only the required data captured by the regex, but either my data is not getting indexed at all, or the complete log file gets indexed. Please assist; where am I going wrong?
props.conf
[phone_access]
TRANSFORMS-set = phone_access_extraction
transforms.conf
[phone_access_extraction]
REGEX = ^(\d{1,2}\.\d\.\d\.\d - - \[\w+\/\w+\/\w+:\d+:\d+:\d+ -\d+\] .\w+ \/\w+.+\.pl.+)
DEST_KEY = queue
FORMAT = indexQueue
Log file:
11.7.1.0 - - [27/Nov/2022:00:00:00 -0600] "GET /cgi-bin/phonedata.pl?pq=a1%3oGHK9416&names=a1%7Ca2&&attrs=a1a2&delim=%09 HTTP/1.1" 302 -
11.7.1.0 - - [27/Nov/2022:00:00:04 -0600] "-" 408 -
11.7.1.0 - - [27/Nov/2022:00:00:21 -0600] "-" 408 -
11.7.1.0 - - [27/Nov/2022:00:00:22 -0600] "GET / HTTP/1.1" 20 14497
11.7.1.0 - - [27/Nov/2022:00:00:23 -0600] "GET /mobile.html HTTP/1.1" 200 1001
11.7.1.0 - - [27/Nov/2022:00:00:24 -0600] "GET /PhoneOrgiChart/ HTTP/1.1" 302 -
11.7.1.0 - - [27/Nov/2022:00:01:15 -0600] "GET /cgibiWn/xml.pl?vk236e HTTP/1.1" 20
11.7.1.0 - - [27/Nov/2022:00:01:15 -0600] "GET /cgi-bFin/xml.pl?hv163t HTTP/1.1" 20
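For comparison, the approach usually documented for keeping only matching lines is to route everything to the null queue first and then send the wanted lines back to the index queue. A sketch of that pattern (the stanza names are arbitrary and the second regex is simplified; this is not a verified fix for the exact config above):
props.conf
[phone_access]
TRANSFORMS-set = setnull, keep_pl
transforms.conf
[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue
[keep_pl]
REGEX = \.pl
DEST_KEY = queue
FORMAT = indexQueue
The order in TRANSFORMS-set matters: the null-queue rule runs first, and the second transform re-routes only the events matching \.pl.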
12-01-2022
04:04 PM
This might not be an app/add-on error, but one from Splunk itself. Which version of Splunk Enterprise are you running? Did you raise a support case? We are facing lots of these warnings on our SH cluster after upgrading to Splunk v9.0.1:
WARN HttpListener [8466 HttpDedicatedIoThread-7] - Socket error from <ip:port> while accessing <URI>: Broken pipe
11-23-2022
11:58 PM
Hello, I am trying to fetch Azure Virtual Machine metrics data using the add-on 'Splunk_TA_microsoft-cloudservices'. I have created/added an Azure storage account and the inputs as stated in the add-on's documentation, but I don't see any logs indexed in Splunk for it. When I check the internal index I see the error below. What does it mean, and how do I fix it?
11-11-2022
02:06 AM
Hi, we are using the Splunk add-on Splunk_TA_windows to capture CPU, memory, disk, and other infrastructure log details. Through this add-on we are getting the CPU, memory, disk, and all other sourcetypes in Splunk for our Windows servers. But on just two of our Windows servers, every sourcetype except CPU and memory is being captured in Splunk, even though the monitoring stanzas for CPU and memory are present in the add-on's inputs.conf. Why aren't we receiving the CPU and memory sourcetypes from those servers, and how do we get those details as well? Please suggest.
10-18-2022
12:46 AM
@ITWhisperer @jcoates_splunk Any suggestions, please?
- Tags:
- it
09-14-2022
02:37 AM
1 Karma
You have provided very little information about your environment, but if I understand your problem correctly, the following should allow you to determine whether the current time is later than the trigger time (2 AM):
| makeresults
| eval triggerTime = relative_time(now(), "@d") + (2*3600)
| eval check = if(now() > triggerTime, "Trigger", "Don't")
The relative_time function is used to round the epoch time down to the beginning of the day, and then the number of seconds in 2 hours (2*3600) is added.
08-04-2022
03:38 AM
Hello, were you able to solve this problem? I have the same one: on some hosts the CPU=all field is available and on some hosts it is not.
07-11-2022
04:03 AM
You're mixing two different things. One is the actual data retention - how long you keep the data in the index in case you want to search it later. Here, indeed, those two parameters are responsible for when Splunk chooses to "evict" old data and remove a bucket (remember that Splunk doesn't expire single events, only whole buckets). The other thing is the time range of your search. You set it either using the time picker or by specifying the "earliest" and "latest" parameters. You can also limit your users to some subset of the available data if you don't want some of them to peek too far into the past (but I don't remember if you can set that per index or only globally per user).
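For illustration, a hedged indexes.conf sketch of the retention side (the index name and values are placeholders, and it is an assumption that the two parameters referred to above are these):
[my_index]
frozenTimePeriodInSecs = 7776000
maxTotalDataSizeMB = 500000
frozenTimePeriodInSecs (90 days in this example) and maxTotalDataSizeMB decide when a whole bucket gets frozen or removed, while the search time range (time picker or earliest/latest) only limits what a single search looks at.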
06-27-2022
12:57 AM
1 Karma
The DS does not connect actively to deployment clients (forwarders) and on its own it does not enforce anything. So whether DS is up or down does not have any impact on the immediate operations of the forwarders. It's just that the deployment clients periodically connect to the DS and "ask" it whether there is an updated configuration bundle for them and if there is one, they pull one from DS and apply it. So if the DS is not available the forwarder is simply not able to "call home" and ask for new config bundle. It does not stop the forwarder or anything like that. You simply can't distribute new config to clients but nothing else should happen.
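As a reference for the "call home" mechanism described above, a minimal deploymentclient.conf sketch on the forwarder side (the hostname and interval are placeholders):
[target-broker:deploymentServer]
targetUri = ds.example.com:8089
[deployment-client]
phoneHomeIntervalInSecs = 60
phoneHomeIntervalInSecs controls how often the client polls the deployment server for an updated bundle; if the server is unreachable the poll simply fails and the forwarder keeps running with its current configuration.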