Activity Feed
- Karma Re: Props.conf settings are not working for kamlesh_vaghela. 07-12-2021 02:25 AM
- Posted Re: Props.conf settings are not working on Getting Data In. 07-12-2021 12:05 AM
- Posted Re: Props.conf settings are not working on Getting Data In. 07-10-2021 07:54 AM
- Posted Re: Props.conf settings are not working on Getting Data In. 07-10-2021 04:26 AM
- Posted Re: Props.conf settings are not working on Getting Data In. 07-10-2021 03:18 AM
- Posted Re: Props.conf settings are not working on Getting Data In. 07-09-2021 08:51 AM
- Posted Re: Props.conf settings are not working on Getting Data In. 07-09-2021 08:25 AM
- Posted Re: Props.conf settings are not working on Getting Data In. 07-09-2021 02:15 AM
- Posted Props.conf settings are not working on Getting Data In. 07-09-2021 12:43 AM
- Posted Re: How to get the raw events sorted as per the log available in server? on Splunk Enterprise. 02-03-2021 11:51 PM
- Posted How to get the raw events sorted as per the log available in server? on Splunk Enterprise. 02-03-2021 10:09 PM
- Tagged How to get the raw events sorted as per the log available in server? on Splunk Enterprise. 02-03-2021 10:09 PM
- Tagged How to get the raw events sorted as per the log available in server? on Splunk Enterprise. 02-03-2021 10:09 PM
- Tagged How to get the raw events sorted as per the log available in server? on Splunk Enterprise. 02-03-2021 10:09 PM
- Posted Re: How to make another search in if condition using eval on Splunk Search. 07-24-2020 02:18 AM
- Tagged Re: How to make another search in if condition using eval on Splunk Search. 07-24-2020 02:18 AM
- Tagged Re: How to make another search in if condition using eval on Splunk Search. 07-24-2020 02:18 AM
- Posted Re: How to make another search in if condition using eval on Splunk Search. 07-23-2020 08:44 PM
- Tagged Re: How to make another search in if condition using eval on Splunk Search. 07-23-2020 08:44 PM
- Posted How to make another search in if condition using eval on Splunk Search. 07-22-2020 01:17 AM
07-12-2021
12:05 AM
Hello All, @kamlesh_vaghela, my props settings are now working from both the UI and the backend. The reason my props.conf file was not being read was that I had placed it on the forwarder (in /opt/splunkforwarder/etc/system/local, where we normally keep the inputs.conf and outputs.conf files), but it actually has to go on the indexer (i.e., /opt/splunk/etc/system/local). I was confused because everywhere in the Splunk documentation the location is given as $SPLUNK_HOME. Thanks for all your support!! I really appreciate it.
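For anyone landing here later, the working layout described above can be summarised as below. This is a sketch assuming default package install paths; substitute your own $SPLUNK_HOME values:

```text
/opt/splunkforwarder/etc/system/local/   <- Universal Forwarder: inputs.conf, outputs.conf
/opt/splunk/etc/system/local/            <- indexer: props.conf (index-time line-breaking settings)
```

Running $SPLUNK_HOME/bin/splunk btool props list --debug on the indexer should then show the stanza and the file it came from.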
07-10-2021
07:54 AM
@kamlesh_vaghela, I kept my inputs.conf, outputs.conf and props.conf on the Universal Forwarder. I will check the reference links you provided. Also, could you please tell me: if any props.conf (or other file) settings are not detected when we debug them using btool, what actions can we take next to resolve the issue? Thanks!!
07-10-2021
04:26 AM
Hello All, I don't see my props settings when I debug like below: ./splunk btool props list --debug. Can you please clarify one thing for me? Will these settings work when we are using the trial version of Splunk Enterprise and the UF? I am using the trial version of Splunk Enterprise and the UF and trying to send data from a Unix environment to Splunk using the UF. Will it break events according to the props settings? Thanks for your response.
07-10-2021
03:18 AM
@kamlesh_vaghela, I am getting the events broken as per my props settings from the Splunk front end (i.e., when I upload the file manually using the "Upload" option in the Splunk UI), as I said earlier. But when I use the props.conf file in the Linux environment and send the file using the Universal Forwarder, those settings are not applied and the data comes in as a single event. That is my problem. Also, when I create an index in the UI, the data goes to that index, but no data reaches the index when I create it from the conf files. Not sure why; something is seriously wrong!! Can someone please help??
07-09-2021
08:51 AM
Sure @kamlesh_vaghela, but meanwhile can you please help me understand why the settings that worked for you are not working for me, and why the props settings work from the UI but not from the backend? Sample data:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?><logs schemaVersion="0"><messages><timestamp>2021-05-20T11:55:13.766-07:00</timestamp><level>PROGRESS</level><thread>backup4 ee5fa1cb0c31a3e56f4fed2c99ff7745</thread><location>com.netapp.common.flow.tasks.Log</location><msgKeyClass>com.netapp.smvi.SMMsgKey</msgKeyClass><msgKeyValue>PROGRESS_TASK_BACKUP_STARTING</msgKeyValue><parameters xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:nil="true" /><message>Starting backup request</message></messages><messages><timestamp>2021-05-20T11:55:14.156-07:00</timestamp><level>INFO</level><thread>backup4 aaaaaaaaaajksbcjkbud7yh8y83eh38</thread><location>com.netapp.smvi.task.validation.BackupValidation</location><msgKeyClass>com.netapp.smvi.SMMsgKey</msgKeyClass><msgKeyValue>BACKUP_VALIDATION_INTERNAL_BACKUP_NAME_FOR_SCHEDULE_JOB</msgKeyValue><parameters><parameter>66fc1387-594c-48cb-b35d-94ca319a4a3c</parameter><parameter>backup_PM cDOT Datastore_20210520115514</parameter></parameters><message>Generating backupName for the scheduleJob 66fc1387-594c-48cb-b35d-94ca319a4a3c is backup_PM cDOT Datastore_20210520115514</message></messages></logs>
07-09-2021
08:25 AM
Hello @kamlesh_vaghela, thanks for the trial!! I have kept the same settings you gave, but it is still not working for me; not sure why. I can see that both my settings and yours work when I manually upload the file from the Splunk UI/front end. Only the props.conf file settings are not working. Can someone please help me check what exactly went wrong? I really appreciate your help!
07-09-2021
02:15 AM
Hello @kamlesh_vaghela, thanks for the response! The XML data is valid. As the data is huge, I have shared only about 20% of it here. I also tried keeping the data in .txt, .xml and even .log formats to test the props settings, but the same issue persists in all cases. Could you please help?
07-09-2021
12:43 AM
Hello All, hope you are all doing well!! I am trying to send some data to Splunk using the UF. Below are my settings, but the data arrives in Splunk without breaking into events as I specified in my stanza; I want to break events wherever there is a messages tag. Kindly help me. I am just starting my journey as an admin but am running into all these issues; if possible, please also share some pointers we can use to troubleshoot them.
My settings:
inputs.conf:
[monitor:///usr/narmada/props_test.log]
index=narmada
sourcetype=logs_format
outputs.conf:
[tcpout:abc]
server = 65.2.122.16:9997
props.conf:
[logs_format]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]*)<messages>
BREAK_ONLY_BEFORE=<messages>
Raw data:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?><logs schemaVersion="0"><messages><timestamp>2021-04-22T11:55:13.766-07:00</timestamp><level>PROGRESS</level><thread>backup4 ee5fa1cb0c31a3e56f4fed2c99ff7745</thread><location>com.netapp.common.flow.tasks.Log</location><msgKeyClass>com.netapp.smvi.SMMsgKey</msgKeyClass><msgKeyValue>PROGRESS_TASK_BACKUP_STARTING</msgKeyValue><parameters xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:nil="true"/><message>Starting backup request</message></messages><messages><timestamp>2021-04-22T11:55:14.156-07:00</timestamp><level>INFO</level><thread>backup4 aaaaaaaaaajksbcjkbud7yh8y83eh38</thread><location>com.netapp.smvi.task.validation.BackupValidation</location><msgKeyClass>com.netapp.smvi.SMMsgKey</msgKeyClass><msgKeyValue>BACKUP_VALIDATION_INTERNAL_BACKUP_NAME_FOR_SCHEDULE_JOB</msgKeyValue><parameters><parameter>66fc1387-594c-48cb-b35d-94ca319a4a3c</parameter><parameter>backup_PM cDOT Datastore_20210422115514</parameter></parameters><message>Generating backupName for the scheduleJob 66fc1387-594c-48cb-b35d-94ca319a4a3c is backup_PM cDOT Datastore_20210422115514</message></messages>
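As a sanity check outside Splunk, the LINE_BREAKER above can be simulated with plain Python. This is a rough sketch (not Splunk's actual parser) of what the `([\r\n]*)<messages>` pattern is intended to do: start a new event before every `<messages>` tag, discarding whatever the first capture group matched:

```python
import re

# Same pattern as the props.conf LINE_BREAKER above; Splunk breaks the
# stream at each match and discards the first capture group's text.
LINE_BREAKER = re.compile(r"([\r\n]*)<messages>")
TAG = "<messages>"

def break_events(raw: str):
    """Rough approximation of Splunk line breaking: a new event starts
    at every <messages> tag, and the tag stays with the event it opens."""
    events, start = [], 0
    for m in LINE_BREAKER.finditer(raw):
        tag_start = m.end() - len(TAG)
        if tag_start > start:
            events.append(raw[start:tag_start])
        start = tag_start
    events.append(raw[start:])
    return events

# Shortened stand-in for the raw data above.
sample = ('<?xml version="1.0"?><logs>'
          '<messages><timestamp>t1</timestamp></messages>'
          '<messages><timestamp>t2</timestamp></messages></logs>')

for event in break_events(sample):
    print(event)
```

If the UF is doing the parsing (or the props.conf sits on the wrong instance), none of this applies, which matches the single-event symptom described above.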
- Tags: props.conf
- Labels: props.conf
02-03-2021
11:51 PM
Hello @ITWhisperer, my log is like the below. From it I need to check whether the file committed successfully or an error was found; if it committed successfully, then how many rows were updated. In Splunk, each line is currently configured as a separate event.
+ 1/29/2021 11:20:05 AM : Start of file commit for BSB file A_A__20210128103539.TXT
+ 1/29/2021 11:20:07 AM : End of file commit for BSB file A_A__20210128103539.TXT
+ 1/29/2021 11:20:07 AM : Successful File Commit for BSB file A_A__20210128103539.TXT
+ 1/29/2021 11:20:07 AM : File Summary:
+ 1/29/2021 11:20:07 AM : Row(s) Updated 390
+ 1/29/2021 11:20:07 AM : Time Started 29/01/2021 11:20:05 AM
+ 1/29/2021 11:20:07 AM : Time Ended 29/01/2021 11:20:07 AM
+ 1/29/2021 11:20:07 AM : Reference Date 20210201
+ 1/29/2021 11:20:07 AM : OR Code A
+ 1/29/2021 11:20:07 AM : OPF Code A
+ 1/29/2021 11:19:40 AM : Start of file commit for BSB file B_B__20210128063543.TXT
+ 1/29/2021 11:19:45 AM : End of file commit for BSB file B_B__20210128063543.TXT
+ 1/29/2021 11:19:45 AM : Error encountered during file commit 70Permission denied
In Splunk, I can see events like the below:
1/29/2021 11:23:03 AM : TNS OPSG09
+ 1/29/2021 11:23:03 AM : FileName A_A__20210129020530.TXT
+ 1/29/2021 11:23:03 AM : Path SIF
+ 1/29/2021 11:23:03 AM : ProcType D:\SAM\TH\Inbox\BESHEB_TRANSCO\
+ 1/29/2021 11:23:03 AM : ValidationKey BESHEB
+ 1/29/2021 11:23:03 AM : Start of file commit for BESHEB file A_A__20210129020530.TXT
+ 1/29/2021 11:23:03 AM : Special Field 2
+ 1/29/2021 11:23:03 AM : Special Field 1 P2
+ 1/29/2021 11:23:03 AM : End of file validation for BESHEB file B_B__20210129020530.TXT
+ 1/29/2021 11:22:35 AM : ORCode TH
+ 1/29/2021 11:22:35 AM : ConnectionPass SAM_CAD
+ 1/29/2021 11:22:35 AM : ConnectionID SAM_CAD
+ 1/29/2021 11:22:35 AM : TNS OPSG09
+ 1/29/2021 11:22:35 AM : FileName B_B__20210129020530.TXT
+ 1/29/2021 11:22:35 AM : Path SIF
+ 1/29/2021 11:22:35 AM : ProcType D:\SAM\TH\Inbox\BESHEB_TRANSCO\
+ 1/29/2021 11:22:35 AM : ValidationKey BESHEB
+ 1/29/2021 11:22:35 AM : Start of file validation for BESHEB file B_B__20210129020530.TXT
+ 1/29/2021 11:22:35 AM : Row(s) Updated 856
+ 1/29/2021 11:22:35 AM : OPF Code B
+ 1/29/2021 11:22:35 AM : OR Code B
So I am not able to map which rows were updated for which file. Please help with this.
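In SPL this would usually be done by extracting the filename with rex and grouping with stats or transaction. As a language-neutral illustration of the intended mapping, here is a hedged Python sketch; the regexes are my approximations from the sample above, and it assumes the lines have first been sorted back into true log order:

```python
import re

# Hypothetical patterns based on the sample log lines above; adjust to taste.
FILE_RE = re.compile(r"file ([A-Z]_[A-Z]__\d+\.TXT)")
ROWS_RE = re.compile(r"Row\(s\) Updated\s+(\d+)")

def rows_per_file(lines):
    """Attribute each 'Row(s) Updated N' line to the most recently named file.
    Assumes `lines` are in true log order (e.g. sorted by timestamp first)."""
    counts, current = {}, None
    for line in lines:
        f = FILE_RE.search(line)
        if f:
            current = f.group(1)
        r = ROWS_RE.search(line)
        if r and current is not None:
            counts[current] = int(r.group(1))
    return counts

log = [
    "+ 1/29/2021 11:20:05 AM : Start of file commit for BSB file A_A__20210128103539.TXT",
    "+ 1/29/2021 11:20:07 AM : Successful File Commit for BSB file A_A__20210128103539.TXT",
    "+ 1/29/2021 11:20:07 AM : Row(s) Updated 390",
]
print(rows_per_file(log))  # {'A_A__20210128103539.TXT': 390}
```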
- Tags: Whispere
02-03-2021
10:09 PM
Hello All, when I send a log to Splunk, I am not getting the events in the same order as in the log. For example, the first line of my log may be the 7th or 10th event in Splunk, the 2nd line the 20th, and the 3rd might be the 1st. In my data I need to check whether the file processed successfully or not and whether any error occurred; if it processed successfully, how many rows were updated. These three things are on different lines/events, so if the data is not in the correct sorted order I cannot relate one to the other. Can you please help me get the data in log order? I really appreciate any suggestion!! Thanks in advance!!
- Labels: troubleshooting
07-24-2020
02:18 AM
Hello @to4kawa, thanks for the response! Let me put the question more clearly. For example, I have 10 countries for which I have to check whether the transmission was successful or not. If a transmission was not successful, I have to show what error occurred. In the log, a successful transmission is indicated by a line like the below:
Status--20/07/2020 12:18:15--CALC_RFS_TUE_PM--(KE)--0 : - Transmission Complete
If this line is not present for a country, the status should show as Failed and the error message should appear in the Reason column. Expected output:
SNo | Country | Date | Status | Reason
1 | KE | 2020-07-20 12:18:15 | Successful | NA
2 | AU | 2020-07-20 12:18:15 | Successful | NA
3 | SI | 2020-07-20 12:18:15 | Successful | NA
4 | MY | 2020-07-20 12:18:15 | Failed | Failed--20/07/2020 12:19:10--CALC_RFS_TUE_PM--(MY)---2207217873 :ORA-00001: unique constraint (WIMS.PK_TB_TRN_FCST_DAILY) violated ORA-06512: at "WIMS.SP_BUILD_FCST", line 573 - RFS calculation failed
5 | TD | 2020-07-20 12:18:15 | Failed | Failed--20/07/2020 12:19:10--CALC_RFS_TUE_PM--(TD)---2207217873 :ORA-00001: unique constraint (WIMS.PK_TB_TRN_FCST_DAILY) violated ORA-06512: at "WIMS.SP_BUILD_FCST", line 573 - RFS calculation failed
07-23-2020
08:44 PM
Please reply if someone knows how to figure this out.
- Tags: subsearch
07-22-2020
01:17 AM
Hello, I think this might be simple but I need some guidance; any help would be really appreciated. I have a log in which I have to check for a successful transmission for all countries. When a transmission fails, I have to show the reason or error for that country. Below is the sample data:
Status--20/07/2020 12:18:15--CALC_RFS_TUE_PM--(KE)--0 : - Initializing Communications...
Status--20/07/2020 12:18:15--CALC_RFS_TUE_PM--(KE)--0 : - Sending Sender Information...
Status--20/07/2020 12:18:15--CALC_RFS_TUE_PM--(KE)--0 : - Sending Recipient Information...
Status--20/07/2020 12:18:15--CALC_RFS_TUE_PM--(KE)--0 : - Sending Message...
Status--20/07/2020 12:18:15--CALC_RFS_TUE_PM--(KE)--0 : - Transmission Complete
Success--20/07/2020 12:19:10--CALC_RFS_TUE_PM--(MY)---2207217873 :ORA-00001: unique constraint (WIMS.PK_TB_TRN_FCST_DAILY) violated ORA-06512: at "WIMS.SP_BUILD_FCST", line 573 - ForeCast data committed successfully.
Failed--20/07/2020 12:19:10--CALC_RFS_TUE_PM--(MY)---2207217873 :ORA-00001: unique constraint (WIMS.PK_TB_TRN_FCST_DAILY) violated ORA-06512: at "WIMS.SP_BUILD_FCST", line 573 - RFS calculation failed
Trace--20/07/2020 12:19:10--CALC_RFS_TUE_PM--(MY)---2207217873 :ORA-00001: unique constraint (WIMS.PK_TB_TRN_FCST_DAILY) violated ORA-06512: at "WIMS.SP_BUILD_FCST", line 573 - Connecting to SMTP server for attempt:1
Status--20/07/2020 12:19:10--CALC_RFS_TUE_PM--(MY)---2207217873 :ORA-00001: unique constraint (WIMS.PK_TB_TRN_FCST_DAILY) violated ORA-06512: at "WIMS.SP_BUILD_FCST", line 573 - Connecting to SMTP Server (notesGWEUR.MICHELIN.com )...
Status--20/07/2020 12:19:10--CALC_RFS_TUE_PM--(MY)---2207217873 :ORA-00001: unique constraint (WIMS.PK_TB_TRN_FCST_DAILY) violated ORA-06512: at "WIMS.SP_BUILD_FCST", line 573 - Initializing Communications...
Here "KE" and "MY" are the countries. I have tried like below, but it gives errors.
| makeresults
| eval Possible_ORs="AU,MY,KE,JP,SI,VN,ID,TD,KO,J1"
| eval Possible_ORs=split(Possible_ORs, ",")
| mvexpand Possible_ORs
| eval count=0
| rename Possible_ORs as "ORs"
| fields - _time
| append [| search sourcetype=RFS_Log
    | rex "Status\W+(?P<Date>\d{1,2}\/\d{1,2}\/\d+\s+\d+\W+\d+\W+\d+).*(?P<OR_NAME>[A-Z]{2}).*Transmission\s+Complete"
    | eval Date=strftime(Date, "%Y-%m-%d %H:%M:%S")
    | eval ORs=OR_NAME
    | eval ORs = split(ORs,",")
    | mvexpand ORs
    | eval count=1
    | fields - _raw _time]
| dedup ORs sortby - count
| eval Job Name=case(count>=0, "RFS Calculation")
| eval Status=case(count>0, "Calculation Successful", count=0, "Calculation Failed")
| eval Status=if(isnull(Status), "Calculation Failed", Status)
| eval Reason=if(Status="Calculation Failed", [search sourcetype=RFS_Log | rex "Status\W+\d{1,2}\/\d{1,2}\/\d+\s+\d+\W+\d+\W+\d+.*(?P<OR_NAME>[A-Z]{2}).*violated"], "failed")
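As a hedged illustration of the logic the query above is reaching for (a prototype, not a replacement for the SPL), this Python sketch marks a country Successful when a "Transmission Complete" line exists for it, and otherwise attaches the first Failed line as the reason. The regex is my own guess from the sample lines:

```python
import re

# Country code appears in parentheses, e.g. "(KE)"; my approximation.
COUNTRY_RE = re.compile(r"\((?P<country>[A-Z]{2})\)")

def transmission_status(lines, countries):
    """Successful if a 'Transmission Complete' line exists for the country,
    otherwise Failed, with the first Failed-- line kept as the reason."""
    ok, reasons = set(), {}
    for line in lines:
        m = COUNTRY_RE.search(line)
        if not m:
            continue
        c = m.group("country")
        if "Transmission Complete" in line:
            ok.add(c)
        elif line.startswith("Failed--"):
            reasons.setdefault(c, line)
    return {c: ("Successful", "NA") if c in ok
            else ("Failed", reasons.get(c, "no log line found"))
            for c in countries}

sample = [
    "Status--20/07/2020 12:18:15--CALC_RFS_TUE_PM--(KE)--0 : - Transmission Complete",
    "Failed--20/07/2020 12:19:10--CALC_RFS_TUE_PM--(MY)---2207217873 :ORA-00001: "
    "unique constraint violated - RFS calculation failed",
]
print(transmission_status(sample, ["KE", "MY"]))
```

Note that a subsearch inside eval/if, as in the last line of the SPL above, is not valid SPL, which is one likely source of the errors.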
- Labels: subsearch
06-19-2020
03:02 AM
Hello, I tried the below and it is working for me. Thanks!! Still, I would welcome any better solution than this.. haha..
index=myindex sourcetype=mysourcetype
| stats latest(_raw) as latest_event
| eval status=case(match(latest_event, "TransactionRolledbackException"), "Down", match(latest_event, "WIMSystemException"), "latest_event", match(_raw, "ConnectionWaitTimeoutException"), "Down", match(latest_event, "\w+Exception"), "Warning", 1!=2, "OK")
| stats count by status
| eval status=case(status="OK", 0, status="Warning", 5, status="Down", 10)
| rangemap field=status low=0-4 elevated=4-6 default=severe
| eval status = replace (status,"0","OK")
| eval status = replace (status,"5","WARNING")
| eval status = replace (status,"10","DOWN")
06-18-2020
12:44 AM
Hello, below is a piece of the data; let me know if anything else is required.
[3/14/18 0:12:41:610 EDT] 00000039 SystemErr R at com.ibm.io.async.ResultHandler.runEventProcessingLoop(ResultHandler.java:775)
[3/14/18 0:12:41:610 EDT] 00000039 SystemErr R at com.ibm.io.async.ResultHandler$2.run(ResultHandler.java:905)
[3/14/18 0:12:41:610 EDT] 00000039 SystemErr R at com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:1662)
[3/14/18 1:00:02:465 EDT] 0000006b SystemErr R com.ibm.wcc.service.intf.ProcessingException
06-17-2020
12:17 AM
@richgalloway @dmaislin_splunk Hello All, can someone please help?
06-15-2020
08:07 AM
Hello Team,
Here is my requirement:
I have to check the running status of an application installed on a Linux server. For this I have a log generated by the application, which may not cover continuous time intervals; the log is only updated while users are using the app. In the log there are 3 high-priority exceptions: TransactionRolledbackException, WIMSystemException, ConnectionWaitTimeoutException. When any of these occurs in the log, the status should be "DOWN". If any other exception occurs, the status should be "WARNING", and if there is no exception it should show "OK". Also, once a high-priority exception occurs, we notify the users by email alert. After the email alert it is cleared and the next events are generated; if the latest event contains no high-priority exception, the dashboard should show "OK" (or "WARNING" for low-priority exceptions), and if the latest event contains a high-priority exception again, "DOWN".
Note: when the application is down, no log is generated in real time.
Here are my sample queries, but I am not satisfied with the results:
1.
index=myIndex sourcetype=mySourcetype
| stats count as Total earliest(_time) as start_time latest(_time) as latest_time earliest(_raw) as Earliest_Event latest(_raw) as Latest_Event by _time
| eval stop=strptime(stop, "%m/%d/%Y")
| eval Earliest_Count= Total - 1
| eval Latest_Count= Total + 1
| eval status=case(((Latest_count > Total) AND match(_raw, "TransactionRolledbackException")), "Down",((Latest_count > Total) AND match(_raw, "WIMSystemException")), "Down",((Latest_count > Total) AND match(_raw, "ConnectionWaitTimeoutException")), "Down",((Latest_count > Total) AND match(_raw, "\w+Exception")), "Warning", 1!=2, "OK")
| stats count by status
2.
index=myIndex sourcetype=mySourcetype
| eval status=case( match(_raw, "TransactionRolledbackException"), "Down", match(_raw, "WIMSystemException"), "Down", match(_raw, "ConnectionWaitTimeoutException"), "Down", match(_raw, "\w+Exception"), "WARNING" , 1!=2, "OK")
| timechart count by status
Any Help or suggestion would be really appreciated!! Thanks!
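The case() in query 2 is a precedence-ordered classifier. As an illustrative sketch (the exception names come from the post above; everything else is assumed), the same mapping in Python:

```python
import re

# High-priority exceptions from the requirement above.
HIGH_PRIORITY = (
    "TransactionRolledbackException",
    "WIMSystemException",
    "ConnectionWaitTimeoutException",
)

def classify(event: str) -> str:
    """Mirror the SPL case(): high-priority exception -> Down,
    any other *Exception -> Warning, otherwise OK."""
    if any(exc in event for exc in HIGH_PRIORITY):
        return "Down"
    if re.search(r"\w+Exception", event):
        return "Warning"
    return "OK"

print(classify("SystemErr R com.ibm.wcc.service.intf.ProcessingException"))  # Warning
```

The order of the checks matters: the generic `\w+Exception` test must come after the high-priority test, just as in the SPL case() above.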
- Labels: timechart
04-07-2020
11:34 PM
@p_gurav, this is working and has helped me. Thanks for the help!
04-07-2020
11:03 AM
@niketnilay The field1 and field2 columns are extracted from two sources via regular expressions and contain some standard codes. One file is generated by one system and the other by another system. What I have to do is check, by date, whether each standard code is present in both files. For example, today I received 20 values in field1 and 20 or more values in field2; I have to compare them and show the matched and unmatched values as you posted. By "uncommon" I mean that if field2 has more than 20 values, the extras will not match field1, and in that case I have to show them as unmatched.
Please let me know if the requirement is still not clear; I will try to restate it along with the query I tried.
04-07-2020
12:14 AM
Hello,
I have data from two different sourcetypes. In that data there are two specific columns, and I have to check whether there are common values in both fields. If there are common values in both fields, I have to show them on the same row in their respective fields, with the uncommon values on the rows after the common ones. For the common values the status should be Yes, otherwise No.
The data is like below:
Field1 Field2
A B
C D
Z L
L A
B K
S C
D M
Expected Output:
Field1 Field2 Status
A A Yes
C C Yes
L L Yes
L Z No
B K No
S S Yes
D M NO
Please help me... I have used join, but it gives blank values in the middle of the table.
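For illustration only (in SPL this is usually approached with stats values() per sourcetype rather than join), here is a Python sketch of the matched/unmatched pairing described above, using my reading of the expected output; the sample values are hypothetical stand-ins for the two extracted fields:

```python
# Hypothetical sample values standing in for field1 and field2.
field1 = ["A", "C", "Z", "L", "B", "S", "D"]
field2 = ["B", "D", "L", "A", "K", "C", "M"]

common = [v for v in field1 if v in set(field2)]      # present in both fields
only1 = [v for v in field1 if v not in set(field2)]   # unmatched from field1
only2 = [v for v in field2 if v not in set(field1)]   # unmatched from field2

# Common values sit on the same row with Status Yes; the leftovers are
# paired side by side with Status No, as in the expected output above.
rows = [(v, v, "Yes") for v in common] + [(a, b, "No") for a, b in zip(only1, only2)]
for row in rows:
    print(row)
```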
08-04-2019
10:52 PM
Thanks for your response!!
There are a lot of "sended and deleted" and "processed and deleted" statements in my log; I gave only one simple stanza of data.
I tried your comments, but it takes all 4394 events and shows a status whenever either "processed and deleted" or "sended and deleted" is present. That is not what I need.
The condition is: when both statements are present (a "processed and deleted" line whose path, i.e. the file name, contains the MA07 code, and a "sended and deleted" line with the MA07 file name), then the file was received. If only "processed and deleted" is present, the file is stuck somewhere.
For reference, here is one more stanza from the log.
Note: there are a lot of files like this, but only MA07 needs to be checked.
19:20:11 C:\Pelibib\MBX\20190618192001754_MA09.MBX processed and deleted
19:20:11 === Step #05 - Calling C:\Pelibib\SEM\AFTER.pl
19:20:11 - Running Program AFTER.pl
19:20:11 PELAP.pl Unable access to server C:\PELIDATA\F05X\REC\F0566811.TXT
19:20:12 PELAP.pl Unable access to server C:\PELIDATA\MC25\REC\F0575784.TXT
19:20:12 PELAP.pl Unable access to server C:\PELIDATA\MC24\REC\F0586633.TXT
19:20:12 PELAP.pl Unable access to server C:\PELIDATA\MC28\REC\F0586634.TXT
19:20:12 PELAP.pl Unable access to server C:\PELIDATA\MC24\REC\F0586635.TXT
19:20:12 PELAP.pl Unable access to server C:\PELIDATA\MC28\REC\F0586636.TXT
19:20:12 PELAP.pl Unable access to server C:\PELIDATA\MC25\REC\F0586637.TXT
19:20:12 PELAP.pl Unable access to server C:\PELIDATA\MC30\REC\F0586638.TXT
19:20:13 PELAP.pl Unable access to server C:\PELIDATA\MB97\REC\F0586639.TXT
19:20:13 PELAP.pl Copy file C:\PELIBIB\STD\RECEP\F0601320 on C:\PELIDATA\F05X\REC\F0601320.TXT
19:20:15 === Step #06 - Calling C:\Pelibib\UTI\SCANNEREXIT.pl C:\Pelibib\LOGSCANNER\20190618.LOG
19:20:16 === Step #07 - PSG2N802 End of Pelsem.bat ***
19:20:16 Step #4 - PSG2N802 End of Pelsem.bat ***
19:20:28 BEFORE call C:\Pelint\Server\run_time\tmp\TR601322.BAT
19:20:28 O 601322 PAHKP102 PMGTN901 C:\PELIBIB\STD\EMI\PMGTN901-MA09-MA09JPJDE_AL20190618.TXT_20190618192001754 MA09-169000397PSG2N802
19:20:28 AFTER call C:\Pelint\Server\run_time\tmp\TR601322.BAT
19:20:28 O 601322 FILE C:\PELIBIB\STD\EMI\PMGTN901-MA09-MA09JPJDE_AL20190618.TXT_20190618192001754 sended and deleted
19:20:29 le fichier STDOUT=c:\pelint\server\run_time\tmp\XPR601322.out
*---------------------------------------------------------------------**
------------------
19:30:05 POST -bd "" PMGTN901 MA07 C:\PELIBIB\STD\EMI\PMGTN901-MA07-2019618.TXT_20190618193001755 PAHKP102 "LFI16P_AE" ""
19:30:06 C:\Pelibib\MBX\20190618193001755_MA07.MBX processed and deleted
19:30:06 === Step #05 - Calling C:\Pelibib\SEM\AFTER.pl
19:30:06 - Running Program AFTER.pl
19:30:06 PELAP.pl Unable access to server C:\PELIDATA\F05X\REC\F0566811.TXT
19:30:07 PELAP.pl Unable access to server C:\PELIDATA\MC25\REC\F0575784.TXT
19:30:07 PELAP.pl Unable access to server C:\PELIDATA\MC24\REC\F0586633.TXT
19:30:07 PELAP.pl Unable access to server C:\PELIDATA\MC25\REC\F0586637.TXT
19:30:07 PELAP.pl Unable access to server C:\PELIDATA\MC30\REC\F0586638.TXT
19:30:07 PELAP.pl Unable access to server C:\PELIDATA\MB97\REC\F0586639.TXT
19:30:09 BEFORE call C:\Pelint\Server\run_time\tmp\TR601323.BAT
19:30:09 O 601323 PAHKP102 PMGTN901 C:\PELIBIB\STD\EMI\PMGTN901-
19:30:09 O 601323 TRTTAB.pl MA07 PAHKP102 2 => CALL EXITS\E_XXXX_E.CMD
19:30:09 AFTER call C:\Pelint\Server\run_time\tmp\TR601323.BAT
19:30:09 O 601323 FILE C:\PELIBIB\STD\EMI\PMGTN901-MA07-MA07AE_2019618.TXT_20190618193001755 sended and deleted
Thanks
08-04-2019
11:41 AM
Hello All,
Here is my sample data.
19:30:06 C:\Pelibib\MBX\20190618193001755_MA07.MBX processed and deleted
19:30:06 === Step #05 - Calling C:\Pelibib\SEM\AFTER.pl
19:30:06 - Running Program AFTER.pl
19:30:06 PELAP.pl Unable access to server C:\PELIDATA\F05X\REC\F0566811.TXT
19:30:09 BEFORE call C:\Pelint\Server\run_time\tmp\TR601323.BAT
19:30:09 O 601323 PAHKP102 PMGTN901 C:\PELIBIB\STD\EMI\PMGTN901-MA07-MA07AE_2019618.TXT_20190618193001755 MA07-169000398PSG2N802
19:30:09 AFTER call C:\Pelint\Server\run_time\tmp\TR601323.BAT
19:30:09 O 601323 FILE C:\PELIBIB\STD\EMI\PMGTN901-MA07-MA07AE_2019618.TXT_20190618193001755 sended and deleted
My case is: if the two lines below
19:30:06 C:\Pelibib\MBX\20190618193001755_MA07.MBX processed and deleted
19:30:09 O 601323 FILE C:\PELIBIB\STD\EMI\PMGTN901-MA07-MA07AE_2019618.TXT_20190618193001755 sended and deleted
are both present in one log, then I have to show "File Received"; if only the "processed and deleted" line is there and there is no "sended and deleted" line in the log, then I have to show "File stuck in XYZ folder".
But I am not able to work out how to express this in Splunk.
Please help me with your thoughts.
I have tried the below, but it is of no use:
index=idx_rfs7 sourcetype=st_fs7_pelican_logs | regex "FILE\s+C\W+PELIBIB\WSTD\WEMI\W.*\W(?P<AVIEXP_SENT>MA07)\W[A-Z]{2}\d{2}[A-Z]{2}\S\d{6,8}" | stats count as "File_Received" by _time
| append [search index=idx_rfs7 sourcetype=st_fs7_pelican_logs
| regex "\d{1,2}:\d{1,2}:\d{1,2}\sC\W+Pelibib\WMBX\W\d{16,18}_(?P<AVIEXP_PROCESSED>MA07)\WMBX\sprocessed\sand\sdeleted"
| stats count as "File_Processed" by _time]
| eval Status=if(File_Received=File_Processed, "File Received", "File Stuck in INFERTEXT Folder")
Many Thanks in Advance!!
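As a cross-check of the intended condition (not the SPL itself; in SPL one would typically stats both counts by an extracted file id rather than append two searches), here is a hedged Python sketch. It pairs each MA07 "processed and deleted" line with a matching "sended and deleted" line by the embedded timestamp id; the regexes are my own approximations from the stanzas above:

```python
import re

# Approximate patterns for the two MA07 lines, keyed on the timestamp id
# that appears in both the .MBX name and the sended file name.
PROCESSED_RE = re.compile(r"(\d{17,18})_MA07\.MBX processed and deleted")
SENDED_RE = re.compile(r"MA07.*_(\d{17,18}) sended and deleted")

def ma07_status(log_text: str) -> str:
    """File Received if an MA07 file was both processed and sended,
    stuck if only processed, otherwise no MA07 activity at all."""
    processed = set(PROCESSED_RE.findall(log_text))
    sended = set(SENDED_RE.findall(log_text))
    if processed & sended:
        return "File Received"
    if processed:
        return "File stuck in XYZ folder"
    return "No MA07 file seen"
```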
07-27-2019
12:30 PM
Hi,
It would be great if I could at least get a solution for the below:
In my dashboard, whenever my regex matches, the values are shown; when there is no match, null values are shown, since I did stats count by _time. I want another color shown instead of null.
For example, on 27th July there was a match in my log file, so the dashboard shows one colored column; on 28th July I did not receive a file, so it shows null. I want a different color where the value is null.
I tried this in the source code, but it is not working. Please help me.
<colorPalette type="expression">if(match(value,"FILE\s+C\W+PELIBIB\WSTD\WEMI\W[A-Z]{5}\d{3}\WMA07\W[A-Z]{2}\d{2}[A-Z]{2}\S\d{6,8}"),"#555555","#D93F3C")</colorPalette>
</format>
Thanks in advance.
07-27-2019
03:52 AM
I have changed my default time zone and it is working fine now.
But...
when I use the above query, it only takes the matched events from the sources; when there is no match in a file/source, it gives no value, no custom message, and not even that day's date with a "file not received" message. In the visualization it shows values for all the days that matched, and null where there is no match.
For example, this month we have received the file every day except 9th July and 15th July. Since no file was received on the 9th and 15th, the table should show the custom message "file not received", but it shows only "File received" values.
This is not helpful for the business.
Also, the dashboard shows a blank, but I need a column in the column chart saying "file not received".
Please help me with your guidance.
Thanks in Advance!!