Activity Feed
- Posted Can UF be installed in E drive and monitor logs present in different drive(Network drive) on Splunk Enterprise. 03-26-2023 10:24 PM
- Tagged Can UF be installed in E drive and monitor logs present in different drive(Network drive) on Splunk Enterprise. 03-26-2023 10:24 PM
- Posted Re: Can we fetch the logs from a drive when UF is not installed on Splunk Enterprise. 03-26-2023 10:19 PM
- Karma Re: Can we fetch the logs from a drive when UF is not installed for glc_slash_it. 03-24-2023 05:12 AM
- Posted Re: Can we fetch the logs from a drive when UF is not installed on Splunk Enterprise. 03-24-2023 04:47 AM
- Posted Re: Rex command to extract data from csv on Splunk Enterprise. 03-24-2023 03:42 AM
- Posted Re: Can we fetch the logs from a drive when UF is not installed on Splunk Enterprise. 03-24-2023 03:36 AM
- Posted Can we fetch the logs from a drive when UF is not installed? on Splunk Enterprise. 03-24-2023 02:05 AM
- Tagged Can we fetch the logs from a drive when UF is not installed? on Splunk Enterprise. 03-24-2023 02:05 AM
- Posted What rex command extracts data from csv? on Splunk Enterprise. 03-23-2023 11:42 PM
- Tagged What rex command extracts data from csv? on Splunk Enterprise. 03-23-2023 11:42 PM
- Posted Re: Calculate average of specific fields on Splunk Search. 03-21-2023 12:37 AM
- Posted Re: Calculate average of specific fields on Splunk Search. 03-20-2023 11:21 PM
- Posted How to calculate average of specific fields? on Splunk Search. 03-20-2023 11:00 AM
- Posted Re: Case condition to check 2 events on the same field on Splunk Search. 03-10-2023 12:55 AM
- Posted Re: Case condition to check 2 events on the same field on Splunk Search. 03-09-2023 11:41 PM
- Posted Case condition to check 2 events on the same field on Splunk Search. 03-09-2023 10:47 PM
- Posted Re: Monitor Job in Dashboard on Splunk Enterprise. 03-02-2023 07:21 AM
- Karma Re: Monitor Job in Dashboard for ITWhisperer. 03-02-2023 07:20 AM
- Posted Re: Monitor Job in Dashboard on Splunk Enterprise. 03-02-2023 07:06 AM
03-26-2023
10:24 PM
Hi, I have installed the Splunk UF on the E drive of a Windows server and am able to monitor all the logs present on the E drive. I now have a request to monitor network drive logs present on the same Windows server, and the user has full access to the network drive. I have placed a monitoring stanza under splunk_home/etc/system/local/inputs.conf (E drive) pointing to the path of the logs on the network drive, but I do not see any logs from the network drive in Splunk. Is there a way to monitor network drive logs when the UF is installed on the E drive of a Windows server?
03-26-2023
10:19 PM
@glc_slash_it
1. Do you see any logs at all from that UF host in _internal? (If not, there is a problem with the installation or with the splunkforwarder service.) - Yes, I can see internal logs.
2. Can you successfully monitor any file on drive E? - Yes, I am monitoring server logs from drive E.
3. Does the user you use to run the splunkforwarder service have access to the network drive you're trying to monitor? - Yes, the user has full access.
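Since _internal data is arriving, one way to narrow this down is to check whether the forwarder's tailing processor mentions the network path at all. A rough sketch of such a search (the <uf_hostname> placeholder is hypothetical and must be replaced with the actual forwarder host name):

index=_internal host=<uf_hostname> source=*splunkd.log*
    (component=TailingProcessor OR component=TailReader OR component=WatchedFile)

Permission errors, or the complete absence of any message referring to the monitored path, usually point to either the service account's access rights or the path syntax in the stanza.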
03-24-2023
04:47 AM
@glc_slash_it Zero results. Logs are not read from that path. Can we index logs from a network drive when the Splunk UF is installed on the E drive of a Windows machine?
03-24-2023
03:36 AM
@glc_slash_it I do not see any errors.

[monitor://gtyojn201gp.kttc.aoi.com\Share\Integrations\MyLogins\out\REM*]
index = Batchtest
sourcetype = Batchtest_st
disabled = false
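One detail worth checking (an observation about the stanza as posted, not a confirmed fix): a network share in a monitor stanza is normally written as a UNC path with leading backslashes, and the splunkforwarder service must run as an account that can reach that share. A hedged variant would look like:

[monitor://\\gtyojn201gp.kttc.aoi.com\Share\Integrations\MyLogins\out\REM*]
# same settings as above; the leading \\ makes Windows treat this as a UNC share
index = Batchtest
sourcetype = Batchtest_st
disabled = false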
03-24-2023
02:05 AM
Hi,
I have installed the UF on one of the drives (E drive) of a Windows server. I want to fetch logs from another drive (network drive). This drive is present on the same server itself, and the server has access to it. I have placed a monitoring stanza in splunk_home/etc/system/local on the E drive mentioning the path of the logs present on the network drive, but I am not seeing any logs getting indexed in Splunk.
So can you suggest how I can fetch logs from another drive when the Splunk UF is installed on the E drive?
P.S. I cannot change the Splunk UF location from the E drive because of compliance.
Thanks
03-23-2023
11:42 PM
Hi,
I am trying to extract data from one of the columns in a lookup file. The regex expression works in a rex tester, and I want to use it with the rex command in Splunk. Regex expression: ^.*(?= \[)
Example: I want to extract the highlighted bold data below. This data is present in a CSV lookup file.
TAX PLATFORM [12998]
CPOI [0639]
| inputlookup Meta.csv
| rex field=Application "^.*(?<name>= \[)"
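A lookahead such as (?= \[) matches in a tester but does not by itself put anything into a field; with rex, the text to keep has to sit inside a named capture group. A hedged sketch, assuming the lookup column really is named Application and the values look like the examples above:

| inputlookup Meta.csv
| rex field=Application "^(?<name>.+?)\s+\["

This captures everything before the first space-and-bracket into name, e.g. TAX PLATFORM and CPOI.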
03-21-2023
12:37 AM
@somesoni2 @PickleRick The code below was my fix:

| eval var=1
| addcoltotals COUNT* Q1* Q2* Q3* var Total
| foreach Q1_Score*
[ eval '<<FIELD>>' = round('<<FIELD>>'/var,2)]
| foreach Q2_Score*
[ eval '<<FIELD>>' = round('<<FIELD>>'/var,2)]
03-20-2023
11:21 PM
@somesoni2 This isn't giving me any output; I don't see any additional field being added to the result. Also, I want to show the average for Q1_score PREPAID and so on in the "Count by segment" row, i.e. the bottom row.

| eventstats avg("Q1_score PREPAID") as "Avg Q1_score PREPAID", avg("Q2_score PREPAID") as "Avg Q2_score PREPAID", avg("Q1_score CONSUMER") as "Avg Q1_score CONSUMER"
03-20-2023
11:00 AM
Hi, I am formatting the data as required and getting it in the format below. Now I want to calculate the average of only the fields highlighted in green, i.e. Q1_score PREPAID, Q2_score PREPAID, Q1_score CONSUMER, and so on.
For example, the Q1_score CONSUMER value in the Count by segment row should be 4.50.
This is the last piece of my query:
| addcoltotals COUNT* Q1* Q2* Q3* Total
| eval Month=coalesce(Month, "Count by Segment")
Please suggest
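One possible way to get a true averages row for the score columns is to append a computed row instead of (or before) running addcoltotals on them; a sketch only, under the assumption that the columns are literally named Q1_score PREPAID, Q2_score PREPAID and Q1_score CONSUMER:

| appendpipe
    [ stats avg("Q1_score PREPAID") as "Q1_score PREPAID"
            avg("Q2_score PREPAID") as "Q2_score PREPAID"
            avg("Q1_score CONSUMER") as "Q1_score CONSUMER"
    | eval Month="Count by Segment" ]

appendpipe runs the subpipeline over the rows already in the result set and appends its output, so the averages land in a labelled bottom row; if it runs after addcoltotals, the totals row would be averaged in as well, which is why the ordering matters.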
03-10-2023
12:55 AM
@PickleRick Thanks, but I need the result in the same shape as before, i.e. Metric, Application, range, and the new column, and untable isn't working here.
03-09-2023
11:41 PM
@PickleRick I tried a case condition with the existing field, but only the default condition matches. I also tried to get the results in separate columns, yet I still get only the default value, i.e. 0. The field names differ in the code below.

| fillnull value=NA rangecost rangeproduct
| eval Valuecombine=case(
    (rangecost="low" AND Metric="Cost") AND (rangeproduct="low" AND Metric="Product"), "1",
    (rangecost="severe" AND Metric="Cost") AND (rangeproduct="severe" AND Metric="Product"), "2",
    (rangecost="low" AND Metric="Cost") AND (rangeproduct="severe" AND Metric="Product"), "3",
    (rangecost="severe" AND Metric="Cost") AND (rangeproduct="low" AND Metric="Product"), "4",
    1=1, 0)
| table Metric Application rangeproduct rangecost Valuecombine

| eval Valuecombine=case(
    rangeproduct="low" AND rangecost="low", "1",
    rangeproduct="severe" AND rangecost="severe", "2",
    rangeproduct="low" AND rangecost="severe", "3",
    rangeproduct="severe" AND rangecost="low", "4",
    1=1, 0)
03-09-2023
10:47 PM
Hi, I want to write a case condition that checks values from the Range column. For instance:
- If the range for both Cost and Product is low, then a new column should show the value low.
- If the range for both Cost and Product is severe, then the new column should show severe.
- If the range for Cost is severe and Product is low, or Cost is low and Product is severe, then the new column should show elevated.
Please suggest.
- Tags:
- case
- if
- nestedloop
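Since case() evaluates one event at a time, the two Metric rows have to be brought onto the same row per Application before they can be compared. A hedged sketch of that shape, assuming the fields are literally named Metric, Application and range:

| chart values(range) over Application by Metric
| eval severity=case(
    Cost="low" AND Product="low", "low",
    Cost="severe" AND Product="severe", "severe",
    Cost="low" AND Product="severe", "elevated",
    Cost="severe" AND Product="low", "elevated",
    true(), "unknown")

After the chart, each Application row has Cost and Product columns holding the range values, so a single case() can look at both.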
03-02-2023
07:21 AM
@ITWhisperer I guess we can accept an answer only once per post! Anyway, thank you 🙂
03-02-2023
06:39 AM
@ITWhisperer That's correct. That is the challenge I am facing: how to check only specific jobs at a specific interval. Can we write a query that checks the condition only between 9:30 PM and 10:30 PM?
03-02-2023
04:25 AM
@ITWhisperer I need to monitor jobs only during a specific interval in the dashboard. From the source I am extracting the job name and the timestamp of the generated file. This job runs anywhere between 9:30 PM and 10:30 PM IST. My query below does not check for any time interval, so before 9:30 PM it also runs and shows "Job has not run". I need the check to run only after 9:30 PM; before that it should not run.
03-02-2023
03:19 AM
Hi,
I need to monitor jobs only during a specific interval. From the application server we are getting only the job name and the date the job was generated into Splunk.
For example:
The job will only run between 9:30 PM and 10:30 PM, so Splunk will have data only after 9:30 PM. Up to 9:30 PM the dashboard therefore shows 'Job has not run', which is incorrect. I need to check only between 9:30 PM and 10:30 PM, and if there is no data in the index then show "Job has not run".
Please suggest.
Query:

index = test_job sourcetype = test_job
| rex field=source ".*/(?<name>.*?)_(?<date>.*)\."
| eval DATE=strftime(strptime(date,"%m%d%Y_%I.%M.%S.%p"),"%m-%d-%Y %I:%M:%S %p")
| rename name as JobName
| table JobName DATE
| append
    [| inputlookup job.csv
    | search NOT
        [ search index = test_job sourcetype = test_job
        | rex field=source ".*/(?<name>.*?)_(?<date>.*)\."
        | eval DATE=strftime(strptime(date,"%m%d%Y_%I.%M.%S.%p"),"%m-%d-%Y %I:%M:%S %p")
        | rename name as JobName
        | table JobName ]]
| fillnull value="N" DATE
| eval DATE=if(DATE="N","Job has not run", DATE)
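One way to gate the label on the wall clock, appended at the end of the query above (a sketch only; it assumes the search head's timezone is IST and that a neutral label is acceptable outside the window):

| eval now_hhmm=tonumber(strftime(now(), "%H%M"))
| eval DATE=case(DATE!="Job has not run", DATE,
                 now_hhmm>=2130, "Job has not run",
                 true(), "Job window not open yet")

Alternatively, the panel itself can be hidden or driven by a time token outside the window; the query-level check is just the simplest place to start.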
02-22-2023
10:49 PM
@bowesmana Thank you. Works perfectly.
02-22-2023
03:46 AM
@bowesmana The data keeps changing as below, depending on the user's selection in the dashboard, so I cannot hard-code a table of columns. Irrespective of the selection, the day buckets should be sorted in order. Can we do that in the query? Example:
02-21-2023
10:37 PM
Hi,
I have to rearrange the columns below into this order: 31-60 Days, 61-90 Days, 91-120 Days, 151-180 Days, Over 180 Days, Total.

Query:

| inputlookup ACRONYM_Updated.csv
| stats count by ACRONYM Aging
| xyseries ACRONYM Aging count
| addtotals
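If the full set of Aging buckets is known in advance, one hedged option is to fix the column order explicitly after the xyseries (assuming these are the exact column names the data produces; a bucket absent from the results may simply not appear):

| table ACRONYM "31-60 Days" "61-90 Days" "91-120 Days" "151-180 Days" "Over 180 Days" Total

table both selects and orders columns, so the order stays stable regardless of which buckets the user's selection returns.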
- Labels:
- stats
01-17-2023
10:43 PM
Hi,
I am trying to use the Splunk Add-on for Microsoft Cloud Services on the Splunk Enterprise platform.
I have followed all the steps mentioned in the Splunk doc Configure a Storage Account in Microsoft Cloud Services - Splunk Documentation.
But data is not getting indexed in Splunk unless I select the highlighted option shown in the picture below in the Azure storage account.
Due to company policy I cannot set it to "Enabled from all networks". I have tried raising a Microsoft support request but did not get a solution.
I am able to fetch the data from the storage account directly onto a virtual machine using the azcopy command, but using the add-on I am not able to index/fetch the data into Splunk.
Any help troubleshooting this issue would be greatly appreciated.
01-06-2023
04:23 AM
Hi,
I need to index Windows server logs and blacklist all of the previous years' logs. inputs.conf:

[monitor://E:\application\logs\server*]
disabled = 0
sourcetype = _error_text
index = _error_file

The log files on the servers look like the below.
I referred to the Splunk docs and came up with this stanza, but it says that only the last filter will be applied. Does that mean only the 2019 blacklist regex will be applied?

[monitor://E:\application\logs\server*]
disabled = 0
sourcetype = _error_text
index = _error_file
blacklist.1 = ^server-2021-\d{2}-\d{2}
blacklist.2 = ^server-2020-\d{2}-\d{2}
blacklist.3 = ^server-2019-\d{2}-\d{2}
Please suggest.
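For a monitor stanza, a single blacklist attribute with alternation is the usual way to express several exclusions at once; a hedged sketch, assuming the file names really start with server-YYYY-MM-DD and 2019-2021 are the only years to drop:

[monitor://E:\application\logs\server*]
disabled = 0
sourcetype = _error_text
index = _error_file
# one blacklist per stanza; alternation covers all three years
blacklist = server-(2019|2020|2021)-\d{2}-\d{2}

Note that the blacklist regex is matched against the full file path, so a leading ^ would anchor at E:\ rather than at the file name.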
12-26-2022
07:26 AM
Hi,
In place of the count I want to show the server name, and change the color based on a condition when the count is greater than 250.
I have referred to many links and docs but could not achieve what I wanted.
index = webss
sourcetype = webphesst earliest= -1d latest=now
| where HTTP=500
| stats count by host
| eval color=if(count>=250, "#dc4e41", "#65a637"), icon=if(count>=250, "times-circle", "check-circle")
12-12-2022
02:43 AM
I need to index into Splunk only the lines of the source file that contain .pl (highlighted in the data below). The regex expression works as expected (tested in a rex tool). I am now using the props.conf and transforms.conf below to index only the required data captured by the regex expression, but my data is either not getting indexed at all or the complete log file gets indexed. Please assist; where am I going wrong?

props.conf

[phone_access]
TRANSFORMS-set = phone_access_extraction

transforms.conf

[phone_access_extraction]
REGEX = ^(\d{1,2}\.\d\.\d\.\d - - \[\w+\/\w+\/\w+:\d+:\d+:\d+ -\d+\] .\w+ \/\w+.+\.pl.+)
DEST_KEY = queue
FORMAT = indexQueue

Log file:

11.7.1.0 - - [27/Nov/2022:00:00:00 -0600] "GET /cgi-bin/phonedata.pl?pq=a1%3oGHK9416&names=a1%7Ca2&&attrs=a1a2&delim=%09 HTTP/1.1" 302 -
11.7.1.0 - - [27/Nov/2022:00:00:04 -0600] "-" 408 -
11.7.1.0 - - [27/Nov/2022:00:00:21 -0600] "-" 408 -
11.7.1.0 - - [27/Nov/2022:00:00:22 -0600] "GET / HTTP/1.1" 20 14497
11.7.1.0 - - [27/Nov/2022:00:00:23 -0600] "GET /mobile.html HTTP/1.1" 200 1001
11.7.1.0 - - [27/Nov/2022:00:00:24 -0600] "GET /PhoneOrgiChart/ HTTP/1.1" 302 -
11.7.1.0 - - [27/Nov/2022:00:01:15 -0600] "GET /cgibiWn/xml.pl?vk236e HTTP/1.1" 20
11.7.1.0 - - [27/Nov/2022:00:01:15 -0600] "GET /cgi-bFin/xml.pl?hv163t HTTP/1.1" 20
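Because events go to the index queue by default, keeping only the matching lines normally takes two transforms: one that routes everything to the null queue, followed by one that routes the wanted lines back to the index queue. A hedged sketch (the stanza names setnull and keep_pl are illustrative, and the simplified regex just looks for .pl requests):

props.conf

[phone_access]
TRANSFORMS-set = setnull, keep_pl

transforms.conf

# send every event of this sourcetype to the null queue first
[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

# then pull the .pl request lines back into the index queue
[keep_pl]
REGEX = \.pl
DEST_KEY = queue
FORMAT = indexQueue

Order matters: transforms run left to right, so setnull must come first, and these settings take effect where parsing happens (indexer or heavy forwarder), not on a universal forwarder.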