All Topics

I have the sample events below:

type=2, time=04/03/2020 01:01:000
type=3, time=04/03/2020 01:16:000
type=3, time=04/03/2020 01:22:000
type=2, time=04/03/2020 02:20:000
type=4, time=04/03/2020 03:00:000

I want the duration of each transaction that starts with type=2 and ends with type=3 or type=4, without using the transaction command, since the transaction query becomes very slow. Can I achieve this with streamstats? Thanks,
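Not a definitive answer, but one common streamstats pattern, sketched under the assumption that the type field is extracted and the event time is in _time (the index name is a placeholder): start a new group at every type=2 event, then take the first type=3/type=4 time within each group.

```spl
index=your_index
| sort 0 _time
| streamstats count(eval(if(type==2, 1, null()))) as group_id
| stats min(_time) as start_time,
        min(eval(if(type==3 OR type==4, _time, null()))) as end_time
        by group_id
| eval duration = end_time - start_time
```

The count(eval(...)) trick increments group_id only on type=2 events, so every event between one type=2 and the next shares a group_id, and the stats that follows can pick the start and the first qualifying end per group.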
I am trying to use Splunk in a Docker container behind an NGINX proxy. It is working for the most part, but I get an error when trying to upload lookup files or modify any server settings: "Your entry was not saved. The following error was reported: SyntaxError: Unexpected token < in JSON at position 0." Has anyone seen this before, or have experience running Splunk behind an NGINX proxy? Note: I have web.conf configured with "root_endpoint = /splunk". Thanks!
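For what it's worth, a minimal proxy block of the kind usually used for Splunk Web — a sketch only, with the upstream host name assumed and root_endpoint = /splunk as described above. The "Unexpected token < in JSON" message typically means an HTML error page came back where the UI expected JSON, often because the proxy rejected or rewrote a request (for example a large lookup-file POST):

```nginx
location /splunk/ {
    proxy_pass http://splunk-host:8000;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    # lookup uploads can be large; the default client_max_body_size (1m) would
    # make NGINX answer with an HTML 413 page instead of JSON
    client_max_body_size 100m;
    proxy_http_version 1.1;
    proxy_read_timeout 300;
}
```

Checking the NGINX access/error log for a non-200 status at the moment the save fails would confirm whether the proxy, rather than Splunk, produced the HTML response.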
I have a kvstore collection with two columns: "_key" and "last_online". The idea is that a search to update the values, manually specifying the key, is scheduled and only needs to run with a time range as long as the schedule interval. In general this is so that other searches can access this information quickly via a lookup rather than running over an extended time range to find out when the device was last online. I have the searches working properly, but using a workaround that I'm trying to avoid. When I use the lookup, I'm unable to match an event field with the key field like so:

| makeresults count=1
| eval id = 1234
| lookup last_online_lookup _key as id OUTPUTNEW

One of the possible workarounds I've found is duplicating the key field in the collection so that it is accessible under another name. But the one I've opted for is changing the search to look like this:

| makeresults count=1
| eval id = 1234
| join type=left id
    [ | inputlookup last_online_lookup
      | eval id = _key ]

This achieves the desired result, but I want to know if it's possible to match an event field to the internal _key field directly. Any ideas? Thanks.
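Not certain this is definitive, but one thing worth trying: when the KV store lookup is defined in transforms.conf, listing _key in fields_list is supposed to make the key matchable like any other lookup field. A sketch, with the stanza and collection names assumed to match the lookup above:

```
[last_online_lookup]
external_type = kvstore
collection    = last_online
fields_list   = _key, last_online
```

With that in place, | lookup last_online_lookup _key AS id OUTPUTNEW last_online should match directly. And when the key is known up front, | inputlookup last_online_lookup where _key=1234 avoids both the join and the full collection scan.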
Hi All, I've installed the latest SolarWinds Add-on for Splunk (https://splunkbase.splunk.com/app/3584/#/details). After installing and restarting I'm getting the following error:

Unable to initialize modular input "solwarwinds_query" defined in the app "Splunk_TA_SolarWinds": Introspecting scheme=solwarwinds_query: script running failed (exited with code 1).

When opening the app it just sits there with its logo spinning. Any ideas? Splunk Enterprise v8 on Windows.
I have been following this example (https://answers.splunk.com/answers/683820/why-is-the-custom-alert-script-failing-with-sendal.html) and encountered an error in the showconfiguration.py script.

import pprint, json, sys

if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] = "--execute":
        f.open("/tmp/splunktest.txt", "w")
        f.write("Here's the info we get from splunk:")
        f.write(pprint.pprint(json.loads(sys.stdin.read())))
        f.close()

Errors in the log (host = 6c83f2e55cd4, source = /opt/splunk/var/log/splunk/splunkd.log, sourcetype = splunkd):

04-02-2020 07:30:11.193 +0000 WARN sendmodalert - action=showconfiguration - Alert action script returned error code=1
04-02-2020 07:30:11.193 +0000 INFO sendmodalert - action=showconfiguration - Alert action script completed in duration=11 ms with exit code=1
04-02-2020 07:30:11.191 +0000 ERROR sendmodalert - action=showconfiguration STDERR - SyntaxError: invalid syntax
04-02-2020 07:30:11.191 +0000 ERROR sendmodalert - action=showconfiguration STDERR - ^
04-02-2020 07:30:11.191 +0000 ERROR sendmodalert - action=showconfiguration STDERR - if len(sys.argv) > 1 and sys.argv[1] = "--execute":
04-02-2020 07:30:11.191 +0000 ERROR sendmodalert - action=showconfiguration STDERR - File "/opt/splunk/etc/apps/showconfiguration/bin/showconfiguration.py", line 4
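For reference, a corrected version of the script: the comparison on line 4 needs == rather than = (which is what the SyntaxError points at), the file handle comes from the builtin open() rather than f.open(), and pprint.pprint() prints and returns None, so pprint.pformat() is what should be passed to write():

```python
import pprint, json, sys

if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] == "--execute":
        # open() (not f.open()) returns the file handle
        f = open("/tmp/splunktest.txt", "w")
        f.write("Here's the info we get from splunk:")
        # pformat() returns a string suitable for write(); pprint() returns None
        f.write(pprint.pformat(json.loads(sys.stdin.read())))
        f.close()
```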
While saving a connection I get the error: There was an error processing your request. It has been logged (ID 57cb497db88fde58). I am trying to connect to a Microsoft SQL DB with Windows authentication. I have installed Java 8 and set the environment variables, and placed the DB drivers (installed the SQL Server using the MS Generic driver with Windows authentication) in Splunk\etc\apps\splunk_app_db_connect\drivers. I created a new identity, and when trying to set up the connection it throws the above error and does not let me query. Splunk DB Connect version 3.3.0; port and IP verified. Any help is appreciated!!!
I am doing an experiment at home to capture Internet traffic for all of the devices in my house connected to my home wi-fi. I heard at a conference that a guy set up Splunk Stream on his Splunk instance and was able to capture all traffic between his wireless router and any device in his house.

sourcetype="stream:ip" src_ip="192.168.1.16"
| stats count by dest_ip

I put this quick query together, but I don't think I'm capturing everything, and I'd also like to have Splunk resolve the dest_ip. For example, if I pull up Google.com, I'd like to see "google.com" in a Splunk table and not "172.217.5.78". My results are as follows
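On the name-resolution part: Splunk ships a built-in external lookup called dnslookup (fields clientip / clienthost) that can do reverse DNS on the results — a sketch; since it issues a live DNS query per row, it belongs after the stats so it only runs once per distinct destination:

```spl
sourcetype="stream:ip" src_ip="192.168.1.16"
| stats count by dest_ip
| lookup dnslookup clientip AS dest_ip OUTPUT clienthost AS dest_host
```

One caveat: reverse DNS often returns the CDN hostname rather than "google.com", so for browsing traffic the stream:dns or stream:http sourcetypes (which carry the queried/requested hostname directly) may give cleaner names.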
I would like to change the dashboards in a vendor's app. How do I accomplish this? Some apps I can edit, but others I cannot.
After updating a bucket replication policy and doing a rolling restart of the cluster indexers, one of the indexers seems stuck in this state. Question: where do I go, and what do I do, to figure out the root cause and how to fix it?

Cluster status in plaintext:
- Search Factor Not Met
- Replication Factor Not Met
- One of three indexers: Fully Searchable: No, Status: Pending.
- One out of 12 indexes shows with Searchable and Replicated Data Copies (the rest seem fine)

Under "Indexer Clustering: Service Activity", "Snapshots" - a number of "pending" tasks that seem to be stuck and never move to "in progress" status:
- "Fixup Tasks - In Progress (0)"
- "Fixup Tasks - Pending":
  -- Tasks to Meet Search Factor (4)
  -- Tasks to Meet Replication Factor (6)
  -- Tasks to Meet Generation (6)

Tasks to Meet Search Factor (4)
(columns: Bucket / Index / Trigger Condition / Current State)
_metrics~34~4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 / _metrics / does not meet: sf & rf / Waiting 'target_wait_time' before search factor fixup
_metrics~34~64AE7236-EE5E-4EEE-AEBF-203F149FCB61 / _metrics / does not meet: primality & sf & rf / Waiting 'target_wait_time' before search factor fixup
_metrics~35~4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 / _metrics / does not meet: sf & rf / Waiting 'target_wait_time' before search factor fixup
_metrics~35~64AE7236-EE5E-4EEE-AEBF-203F149FCB61 / _metrics / does not meet: sf & rf / Waiting 'target_wait_time' before search factor fixup

Tasks to Meet Replication Factor (6)
(columns: Bucket / Index / Trigger Condition / Current State)
_metrics~34~4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 / _metrics / does not meet: sf & rf / Waiting 'target_wait_time' before replicating bucket
_metrics~34~64AE7236-EE5E-4EEE-AEBF-203F149FCB61 / _metrics / does not meet: primality & sf & rf / Waiting 'target_wait_time' before replicating bucket
_metrics~35~4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 / _metrics / does not meet: sf & rf / Waiting 'target_wait_time' before replicating bucket
_metrics~35~64AE7236-EE5E-4EEE-AEBF-203F149FCB61 / _metrics / does not meet: sf & rf / Waiting 'target_wait_time' before replicating bucket
_metrics~36~4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 / _metrics / non-streaming failure - src=64AE7236-EE5E-4EEE-AEBF-203F149FCB61 tgt=4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 failing=tgt
_metrics~37~4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 / _metrics / non-streaming failure - src=9B5D3504-81B2-4DCC-BF4D-F7ED811A3571 tgt=4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 failing=tgt
... etc.

Some of the errors on the indexer(s) (all from source=/opt/splunk/var/log/splunk/splunkd.log, sourcetype=splunkd):

04-03-2020 07:55:10.100 -0700 ERROR TcpInputProc - event=replicationData status=failed err="Close failed" (host=bvl-mit-splkin2)
04-03-2020 07:55:10.100 -0700 WARN BucketReplicator - Failed to replicate warm bucket bid=_metrics~37~4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 to guid=4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 host=10.101.128.89 s2sport=9887. Connection closed. (host=bvl-mit-splkin1)
04-03-2020 07:55:10.097 -0700 WARN S2SFileReceiver - event=processFileSlice bid=_metrics~37~4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 msg='aborting on local error' (host=bvl-mit-splkin2)
04-03-2020 07:55:10.097 -0700 ERROR S2SFileReceiver - event=onFileClosed replicationType=eJournalReplication bid=_metrics~37~4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 state=eComplete src=64AE7236-EE5E-4EEE-AEBF-203F149FCB61 bucketType=warm status=failed err="bucket is already registered, registered not as a streaming hot target (SPL-90606)" (host=bvl-mit-splkin2)
04-03-2020 07:55:10.089 -0700 WARN BucketReplicator - Failed to replicate warm bucket bid=_metrics~36~4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 to guid=4C2AF0DE-E42F-489B-92FB-2CA3FC68AC85 host=10.101.128.89 s2sport=9887. Connection closed. (host=bvl-mit-splkin3)

Additional notes: output of /opt/splunk/bin/splunk list peer-info on the peer:

slave
base_generation_id:651
is_registered:1
last_heartbeat_attempt:0
maintenance_mode:0
registered_summary_state:3
restart_state:NoRestart
site:default
status:Up

/opt/splunk/etc/master-apps/_cluster/local/indexes.conf on the CM (successfully replicated to peers via /opt/splunk/bin/splunk apply cluster-bundle):

[default]
repFactor = auto

[_introspection]
repFactor = 0

[windows]
frozenTimePeriodInSecs = 31536000
coldToFrozenDir = $SPLUNK_DB/$_index_name/frozendb

[wineventlog]
frozenTimePeriodInSecs = 31536000
coldToFrozenDir = $SPLUNK_DB/$_index_name/frozendb

Details: Splunk Enterprise 8.0.2, mostly default settings.

Thank you!
Hello everyone. I need to index the logs below (and the example at my Dropbox link) under a new sourcetype. The event line break occurs at the timestamp at the beginning of each entry, e.g. "2020-04-02 22:09:52,416"; this is the time format of my log. Another point is that the timestamps are written three hours ahead, so, for example, the log time "2020-04-02 22:09:52,416" should be indexed in Splunk with the time "2020-04-02 19:09:52,416". If that is not clear I will explain it again. Can you help me set up this sourcetype in props.conf? Link Dropbox: https://www.dropbox.com/s/qn2b2vnjyo1t0mj/server.txt?dl=0

2020-04-02 21:57:38,063 INFO ecp-1-1784929 25000 ExtractWindow: CFG, [2020-02-28 05:53:42,2020-04-02 21:57:14(1582869222,1585864634)] *** SESSIONS(2):
2020-04-02 21:32:52,779 DEBUG ecp-1-872908 35000 SQLUtils.queryAndScan - exit(elapsed 47 ms) returning 43
2020-04-02 21:32:53,278 WARN ecp-1-872908 20000 User data mapping and data base schema validation warnings: Default value in data base schema for user dimension column USER_DATA_CUST_DIM_2.SEGMENTO is empty, will use hardcoded default: none Default value in mapping for user dimension column USER_DATA_CUST_DIM_13.CORRENTISTAS is empty, will use hardcoded default: none Default value in mapping for user dimension column USER_DATA_CUST_DIM_13.MULTIPLO is empty, will use hardcoded default: none Default value in mapping for user dimension column USER_DATA_CUST_DIM_13.TPESSOA is empty, will use hardcoded default: none Default value in mapping for user dimension column USER_DATA_CUST_DIM_13.AVI is empty, will use hardcoded default: none Default value in mapping for user dimension column USER_DATA_CUST_DIM_13.ELEG is empty, will use hardcoded default: none Default value in mapping for user dimension column USER_DATA_CUST_DIM_14.ASSUNTO is empty, will use hardcoded default: none Default value in mapping for user dimension column USER_DATA_CUST_DIM_14.PRODUTODN is empty, will use hardcoded
default: none Default value in mapping for user dimension column USER_DATA_CUST_DIM_14.CONPO is empty, will use hardcoded default: none Default value in mapping for user dimension column USER_DATA_CUST_DIM_15.FIDELIZA is empty, will use hardcoded default: none 2020-04-02 21:32:53,278 INFO ecp-1-872908 30000 JobTransform: SERVICE_OBJECTIVE default is = -1 2020-04-02 21:32:53,278 INFO ecp-1-872908 30000 JobTransform: creating Lookup ... 2020-04-02 21:32:53,278 DEBUG ecp-1-872908 35000 COMMIT: 2004326974; called by com.genesyslab.gim.etl.jobs.transform.JobTransform.init(JobTransform.java:269) 2020-04-02 21:32:53,278 INFO ecp-1-872908 30000 JobTransform: initialized 2020-04-02 21:32:53,309 INFO ecp-1-872908 30042 Job step INIT completed successfully. 2020-04-02 21:32:53,309 INFO ecp-1-885538 30041 Job step AGENTtoRESOURCE started. 2020-04-02 21:32:51,999 DEBUG ecp-1-872908 35000 ConcurrentUtils.shutdown: all tasks completed, executor terminated 2020-04-02 21:32:51,999 INFO ecp-1-872908 20104 Job 'Job_ExtractICON' completed successfully. 2020-04-02 21:32:51,999 INFO ecp-1-872908 25000 Execution Info +[Job_ExtractICON].....................................................21653 ms. Invocations 1 |-- [INIT]................................................................15 ms. Invocations 1 |--+[RUN]..............................................................20905 ms. Invocations 1 |-- [TRUNCATE_TMP]....................................................203 ms. Invocations 1 |--+[EXTRACT].......................................................20670 ms. Invocations 1 |--+[ExtractAndMerge]............................................13417 ms. Invocations 1 |--+[ExtractTriplets]..........................................2886 ms. Invocations 1 |-- [G_IR]..................................................3181 ms. Invocations 7 |-- [G_CALL]................................................8859 ms. Invocations 7 |-- [G_IS_LINK].............................................3135 ms. 
Invocations 7 |--+[MergeMove]...............................................10359 ms. Invocations 1 |-- [insertIRs4ConCalls2TmpMerge](0).........................499 ms. Invocations 1 |--+[insertClassifiedIsLinks](5993)..........................561 ms. Invocations 1 |-- [classify links - join](5993).........................234 ms. Invocations 1 |-- [classify links - insert](5993).......................171 ms. Invocations 1 |-- [insertIsLinks2TmpMerge](1058)............................93 ms. Invocations 1 |-- [update_G_CALL_ROOTIRID](1533)...........................561 ms. Invocations 2 |-- [update_G_IR_ROOTIRID](1096).............................156 ms. Invocations 2 |-- [insertIRs2TmpMerge](38).................................218 ms. Invocations 1 |-- [updateTO_CYCLE](0)......................................234 ms. Invocations 4 |-- [updateRootIrInTmpMerge](41).............................156 ms. Invocations 4 |-- [updateRootirInTmpMerge2](3937)..........................141 ms. Invocations 1 |-- [insertPendingRootIrs2TmpMerge](1881).....................47 ms. Invocations 1 |-- [insertNotPendingRootIrs2TmpMerge](3648).................187 ms. Invocations 1 |-- [insertNotPendingLinks](2116)............................593 ms. Invocations 1 |-- [deleteNotPendingLinks](2116)............................124 ms. Invocations 1 |-- [G_IR_copyMerged](4696)..................................531 ms. Invocations 1 |-- [G_IR_deleteMerged](4696)................................125 ms. Invocations 1 |-- [G_CALL_copyMerged](6846)...............................1263 ms. Invocations 1 |-- [G_CALL_deleteMerged](6846).............................2543 ms. Invocations 1 |-- [G_IR_copyStuckRecords](0)...............................187 ms. Invocations 1 |-- [G_CALL_copyStuckRecords](0).............................422 ms. Invocations 1 |--+[Extract].....................................................6833 ms. Invocations 1 |-- [G_IR]......................................................578 ms. 
Invocations 1 |-- [G_VIRTUAL_QUEUE]..........................................2059 ms. Invocations 8 |-- [GC_LOGIN].................................................1014 ms. Invocations 1 |-- [GC_BUS_ATTRIBUTE].........................................1357 ms. Invocations 1 |-- [G_USERDATA_HISTORY]......................................15678 ms. Invocations 8 |-- [GC_IVRPORT]................................................983 ms. Invocations 1 |-- [GC_TREATMENT]..............................................858 ms. Invocations 1 |-- [GC_SKILL]..................................................982 ms. Invocations 1 |-- [GCX_GROUP_PLACE]...........................................749 ms. Invocations 1 |-- [G_SECURE_USERDATA_HISTORY].................................811 ms. Invocations 8 |-- [GC_PLACE]..................................................999 ms. Invocations 1 |-- [GC_ANNEX]..................................................265 ms. Invocations 1 |-- [GCX_GROUP_ROUTEDN]........................................1482 ms. Invocations 1 |-- [G_DND_HISTORY].............................................750 ms. Invocations 6 |-- [G_ROUTE_RESULT]...........................................2416 ms. Invocations 8 |-- [G_AGENT_STATE_RC].........................................1186 ms. Invocations 6 |-- [GC_SWITCH].................................................952 ms. Invocations 1 |-- [GC_ATTR_VALUE].............................................921 ms. Invocations 1 |-- [GM_L_USERDATA].............................................625 ms. Invocations 1 |-- [GCX_FORMAT_FIELD].........................................1498 ms. Invocations 1 |-- [G_CUSTOM_DATA_S]...........................................920 ms. Invocations 8 |-- [GM_F_USERDATA]..............................................94 ms. Invocations 1 |-- [G_PARTY].................................................10294 ms. Invocations 8 |-- [G_CALL_STAT]..............................................1387 ms. 
Invocations 7 |-- [GCX_ENDPOINT_PLACE].......................................1310 ms. Invocations 1 |-- [GCX_GROUP_AGENT]..........................................1326 ms. Invocations 1 |-- [GCX_CAMPGROUP_INFO].......................................1045 ms. Invocations 1 |-- [GCX_SKILL_LEVEL]..........................................1435 ms. Invocations 1 |-- [GCX_LOGIN_INFO]............................................999 ms. Invocations 1 |-- [G_ROUTE_RES_VQ_HIST]......................................1295 ms. Invocations 8 |-- [GC_AGENT]..................................................858 ms. Invocations 1 |-- [GCX_AGENT_PLACE]...........................................983 ms. Invocations 1 |-- [GC_CAMPAIGN]...............................................749 ms. Invocations 1 |-- [GC_CALLING_LIST]..........................................1061 ms. Invocations 1 |-- [G_LOGIN_SESSION]..........................................3900 ms. Invocations 6 |-- [GC_TENANT].................................................748 ms. Invocations 1 |-- [G_IR_HISTORY].............................................2668 ms. Invocations 8 |-- [GCX_CAMPLIST_INFO]........................................1372 ms. Invocations 1 |-- [GC_FILTER]................................................1217 ms. Invocations 1 |-- [G_CALL]....................................................639 ms. Invocations 1 |-- [GC_TIME_ZONE]..............................................655 ms. Invocations 1 |-- [GC_OBJ_TABLE]..............................................624 ms. Invocations 1 |-- [GC_VOICE_PROMPT]...........................................593 ms. Invocations 1 |-- [GC_GROUP].................................................1030 ms. Invocations 1 |-- [GC_SCRIPT]................................................1186 ms. Invocations 1 |-- [GC_ACTION_CODE]...........................................1311 ms. Invocations 1 |-- [GC_ENDPOINT]...............................................811 ms. 
Invocations 1 |-- [G_AGENT_STATE_HISTORY]...................................15163 ms. Invocations 6 |-- [GCX_SUBCODE]..............................................1373 ms. Invocations 1 |-- [GC_TABLE_ACCESS]..........................................1061 ms. Invocations 1 |-- [GCX_GROUP_ENDPOINT].......................................1467 ms. Invocations 1 |-- [GC_IVR]....................................................655 ms. Invocations 1 |-- [G_PARTY_HISTORY].........................................23727 ms. Invocations 8 |-- [GC_FORMAT].................................................780 ms. Invocations 1 |-- [GC_FOLDER]................................................1076 ms. Invocations 1 |-- [GC_FIELD]..................................................312 ms. Invocations 1 |-- [GC_APPLICATION]...........................................1061 ms. Invocations 1 |-- [GX_SESSION_ENDPOINT]......................................5601 ms. Invocations 6 |-- [GCX_LIST_TREATMENT].......................................1389 ms. Invocations 1 |-- [G_IS_LINK_HISTORY]........................................1092 ms. Invocations 7 |--+[MergeMove]...................................................3089 ms. Invocations 1 |-- [insertIRs4ConCalls2TmpMerge](0)............................421 ms. Invocations 1 |--+[insertClassifiedIsLinks](3877).............................187 ms. Invocations 1 |-- [classify links - join](3877).............................78 ms. Invocations 1 |-- [classify links - insert](3877)...........................93 ms. Invocations 1 |-- [insertIsLinks2TmpMerge](0)..................................62 ms. Invocations 1 |-- [insertIRs2TmpMerge](0).....................................234 ms. Invocations 1 |-- [updateRootirInTmpMerge2](1821).............................172 ms. Invocations 1 |-- [insertPendingRootIrs2TmpMerge](1757).......................141 ms. Invocations 1 |-- [insertNotPendingRootIrs2TmpMerge](23).......................63 ms. 
Invocations 1 |-- [insertNotPendingLinks](24).................................265 ms. Invocations 1 |-- [deleteNotPendingLinks](24)..................................31 ms. Invocations 1 |-- [G_IR_copyMerged](23)........................................78 ms. Invocations 1 |-- [G_IR_deleteMerged](23)......................................47 ms. Invocations 1 |-- [G_CALL_copyMerged](24).....................................297 ms. Invocations 1 |-- [G_CALL_deleteMerged](24)....................................46 ms. Invocations 1 |-- [G_IR_copyStuckRecords](0)..................................188 ms. Invocations 1 |-- [G_CALL_copyStuckRecords](0)................................312 ms. Invocations 1 |-- [DESTROY]............................................................733 ms. Invocations 1 2020-04-02 21:32:51,999 DEBUG main 35000 SQLUtils.queryAndScan (SELECT CTL_SCHEMA_INFO.SCHEMA_VERSION FROM (select 1 as dummy from DUAL) DUAL LEFT OUTER JOIN ginfo.CTL_SCHEMA_INFO ON CTL_SCHEMA_INFO.SCHEMA_NAME = 'Genesys Info Mart',302366050) - enter 2020-04-02 21:32:51,999 DEBUG main 35000 SQLUtils.queryAndScan - exit(elapsed 0 ms) returning 1 2020-04-02 21:32:51,999 INFO main 25000 Reading CTL_SCHEMA_INFO.Genesys Info Mart=8.1.402.01 2020-04-02 21:32:51,999 DEBUG main 35000 SQLUtils.queryAndScan (SELECT CTL_SCHEMA_INFO.SCHEMA_VERSION FROM (select 1 as dummy from DUAL) DUAL LEFT OUTER JOIN ginfo.CTL_SCHEMA_INFO ON CTL_SCHEMA_INFO.SCHEMA_NAME = 'UPDATE_IDB_FOR_GIM',302366050) - enter 2020-04-02 21:32:51,999 DEBUG main 35000 SQLUtils.queryAndScan - exit(elapsed 0 ms) returning 1 2020-04-02 21:32:51,999 INFO main 25000 Reading CTL_SCHEMA_INFO.UPDATE_IDB_FOR_GIM=8.1.400.01 2020-04-02 21:32:51,999 INFO main 31201 GIM Server - current state is TRANSFORM. 2020-04-02 21:32:51,999 INFO ecp-1-872908 20103 Job 'Job_TransformGIM' started. Version='8.1.402.08' built '2015-03-11 18:50:32 UTC'. 2020-04-02 21:32:51,999 INFO ecp-1-872908 30041 Job step INIT started. 
2020-04-02 21:32:51,999 INFO ecp-1-872908 30000 JobTransform: initializing... 2020-04-02 21:32:52,093 DEBUG ecp-1-872908 35000 Executing {call DBMS_LOCK.ALLOCATE_UNIQUE(?,?,864000)} 2020-04-02 21:32:52,108 DEBUG ecp-1-872908 35000 Executing {?=call DBMS_LOCK.REQUEST(?,?,10,false)} 2020-04-02 21:32:52,108 DEBUG ecp-1-872908 35000 OPEN: 2004326974; count 1 2020-04-02 21:32:52,108 INFO ecp-1-872908 30000 JobTransform: reading extract HWM info... 2020-04-02 21:32:52,108 DEBUG ecp-1-872908 35000 SQLUtils.queryAndScan (SELECT CTL_EXTRACT_HWM.TABLE_NAME,CTL_EXTRACT_HWM.DATA_SOURCE_KEY,CTL_EXTRACT_HWM.DATA_SOURCE_TYPE,CTL_EXTRACT_HWM.EXTRACT_START_TIME,CTL_EXTRACT_HWM.EXTRACT_END_TIME,CTL_EXTRACT_HWM.ROW_COUNT,CTL_EXTRACT_HWM.MAX_TS,CTL_EXTRACT_HWM.JOB_ID,CTL_EXTRACT_HWM.JOB_NAME,CTL_EXTRACT_HWM.JOB_VERSION,CTL_EXTRACT_HWM.DAP_NAME,CTL_EXTRACT_HWM.DSS_ID,CTL_EXTRACT_HWM.ICON_DBID,CTL_EXTRACT_HWM.PROVIDERTAG FROM ginfo.CTL_EXTRACT_HWM WHERE CTL_EXTRACT_HWM.DATA_SOURCE_KEY > 1 AND ( NOT EXISTS (SELECT 1 FROM ginfo.CTL_DS WHERE CTL_DS.DATA_SOURCE_KEY = CTL_EXTRACT_HWM.DATA_SOURCE_KEY)),2004326974) - enter 2020-04-02 21:32:52,249 DEBUG ecp-1-872908 35000 SQLUtils.queryAndScan - exit(elapsed 141 ms) returning 0 2020-04-02 21:32:52,249 DEBUG ecp-1-872908 35000 SQLUtils.queryAndScan (SELECT COALESCE(MIN(G_IR.TERMINATED_TS),0) FROM ginfo.G_IR,2004326974) - enter 2020-04-02 21:32:52,405 DEBUG ecp-1-872908 35000 SQLUtils.queryAndScan - exit(elapsed 156 ms) returning 1, (1585856618) 2020-04-02 21:32:52,405 DEBUG ecp-1-872908 35000 SQLUtils.queryAndScan (SELECT CTL_EXTRACT_HWM_JOIN_CTL_DS.DATA_SOURCE_TYPE,MIN(CTL_EXTRACT_HWM_JOIN_CTL_DS.MAX_TS) FROM (SELECT TABLE_NAME,(DS_KEY) DATA_SOURCE_KEY,(MAX(MAX_TS)) MAX_TS,(MAX(MAX_TIME)) MAX_TIME,(MAX(DATA_SOURCE_TYPE)) DATA_SOURCE_TYPE,(MAX(DS_DBID)) DS_DBID,(MAX(DS_DBID_PRIM)) DS_DBID_PRIM,(MAX(DS2_DBID)) DS2_DBID FROM (SELECT CTL_EXTRACT_HWM.*,(CTL_EXTRACT_HWM.DATA_SOURCE_KEY) DS_KEY,CTL_DS.DS_DBID,CTL_DS.DS_DBID_PRIM,CTL_DS.DS2_DBID FROM 
ginfo.CTL_EXTRACT_HWM INNER JOIN ginfo.CTL_DS ON CTL_EXTRACT_HWM.DATA_SOURCE_KEY = CTL_DS.DATA_SOURCE_KEY WHERE CTL_EXTRACT_HWM.DATA_SOURCE_TYPE <> 4 UNION ALL SELECT CTL_EXTRACT_HWM.*,(99) DS_KEY,(99) DS_DBID,(0) DS_DBID_PRIM,(0) DS2_DBID FROM ginfo.CTL_EXTRACT_HWM WHERE CTL_EXTRACT_HWM.DATA_SOURCE_TYPE = 4) CTL_EXTRACT_HWM_JOIN_CTL_DS GROUP BY TABLE_NAME,DS_KEY) CTL_EXTRACT_HWM_JOIN_CTL_DS GROUP BY CTL_EXTRACT_HWM_JOIN_CTL_DS.DATA_SOURCE_TYPE,2004326974) - enter 2020-04-02 21:32:52,529 DEBUG ecp-1-872908 35000 SQLUtils.queryAndScan - exit(elapsed 124 ms) returning 3 2020-04-02 21:37:54,411 INFO Agg.NewData 25000 Got addFactAvailNotification3: 1,585,860,300 1,585,863,000 INTERACTION_RESOURCE_FACT false true 2020-04-02 21:37:54,411 INFO ecp-1-885730 25000 notifyFactAvailable: INTERACTION_RESOURCE_FACT , online_media=false, interval_agg= true, current_time=1585863474, start=1585849500, end=1585863000, range= 14400, delay= 124 (List item=1) 2020-04-02 21:37:54,411 INFO Agg.NewData 25000 Got addFactAvailNotification3: 1,585,849,500 1,585,863,000 INTERACTION_RESOURCE_FACT true false 2020-04-02 21:37:54,411 INFO ecp-1-885730 25000 notifyFactAvailable: INTERACTION_RESOURCE_FACT , online_media=false, interval_agg=false, current_time=1585863474, start=1585849500, end=1585849500, range= 900, delay= 124 (List item=1) 2020-04-02 21:37:54,411 INFO Agg.NewData 25000 Got addFactAvailNotification3: 1,585,849,500 1,585,849,500 INTERACTION_RESOURCE_FACT false false 2020-04-02 21:37:54,411 INFO ecp-1-885730 25000 notifyFactAvailable: INTERACTION_RESOURCE_FACT , online_media=false, interval_agg=false, current_time=1585863474, start=1585861200, end=1585863000, range= 2700, delay= 124 (List item=2) 2020-04-02 21:37:54,411 INFO Agg.NewData 25000 Got addFactAvailNotification3: 1,585,861,200 1,585,863,000 INTERACTION_RESOURCE_FACT false false 2020-04-02 21:37:54,411 DEBUG ecp-1-885730 35000 SQLUtils.executeUpdate (INSERT INTO ginfo.CTL_AUDIT_LOG 
(AUDIT_KEY,JOB_ID,CREATED_TS,CREATED,PROCESSING_STATUS_KEY,MIN_START_DATE_TIME_KEY,MAX_START_DATE_TIME_KEY,MAX_CHUNK_TS,DATA_SOURCE_KEY,ROW_COUNT,INSERTED) VALUES (?,?,?,?,?,?,?,?,?,?,?),317443306,[CONSISTENT_READ_FAILURE]) - enter 2020-04-02 21:37:54,411 DEBUG ecp-1-885730 35000 SQLUtils.executeUpdate - exit(elapsed 0 ms) returning 1 2020-04-02 21:37:54,426 DEBUG ecp-1-885730 35000 COMMIT: 317443306; called by com.genesyslab.gim.etl.jobs.transform.TransformTask.commitAndRelease(TransformTask.java:165)
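Coming back to the question above the log sample: a props.conf sketch for this format, with the caveat that the stanza name is a placeholder and the timezone is an assumption. Etc/GMT-3 is POSIX notation for UTC+3 (the sign is inverted), which tells Splunk the events were written three hours ahead, so 22:09:52 would be indexed as 19:09:52 UTC:

```
[genesys_gim]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S,%3N
MAX_TIMESTAMP_LOOKAHEAD = 23
TZ = Etc/GMT-3
```

The lookahead in LINE_BREAKER keeps the timestamp attached to the event it starts, so multi-line entries (like the warnings and the Execution Info tree above) stay in one event. If the three-hour offset is actually a fixed zone like America/Sao_Paulo, use that IANA name in TZ instead.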
What query should I use to look for a certain directory on Linux servers where the data is mounted? Suppose the Linux server name is abdhw003...; my base query is: index=*_nix_xxxx sourcetype=df host=abdhw003. In this case I want to find the "/doc" folder on that server. What would be the query for that? Any help is appreciated, thanks.
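A sketch of one way to filter, assuming the df sourcetype from the Splunk Add-on for Unix and Linux; the field name for the mount point (mount here) is an assumption and depends on the add-on's extractions, so it's worth checking the field list on one raw event first:

```spl
index=*_nix_xxxx sourcetype=df host=abdhw003*
| search mount="/doc"
| table host mount
```

If the mount point isn't extracted as a field, a raw-text fallback like index=*_nix_xxxx sourcetype=df host=abdhw003* "/doc" will still surface the matching events.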
I am writing a modular input using the App Builder (which is super cool), and I need my API calls to be in sync with when Splunk schedules the script to run. I notice there are a lot of helper functions for mining data from the config you create; however, I do not see one for the interval. Am I missing it?
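If no helper exposes it, the interval may still be recoverable from the input-configuration XML that splunkd hands a modular input on stdin — a sketch under the assumption that interval appears as a stanza param there; the stanza name, param layout, and sample values below are illustrative, not taken from App Builder:

```python
import xml.etree.ElementTree as ET

# Illustrative config XML shaped like what splunkd writes to a modular
# input's stdin; the stanza and param names here are made up.
SAMPLE_CONFIG = """<input>
  <configuration>
    <stanza name="my_input://job1">
      <param name="interval">300</param>
      <param name="api_url">https://example.com/api</param>
    </stanza>
  </configuration>
</input>"""

def get_interval(config_xml, default=60):
    """Return the interval param (seconds) from the first stanza, else default."""
    root = ET.fromstring(config_xml)
    for param in root.iter("param"):
        if param.get("name") == "interval":
            return int(param.text)
    return default

print(get_interval(SAMPLE_CONFIG))  # prints 300
```

As a fallback, the interval can also be read from the input's own stanza via the REST endpoint /servicesNS/-/<app>/data/inputs/... at runtime.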
We are looking for the best approach and steps for migrating Splunk 6.2.15 on Solaris to Splunk 8.x on a Linux server. Can you please provide the options to upgrade, with steps? Also, is any change needed to the Splunk universal forwarder agents to point them at the Linux Splunk server? Thank you.
Does AppDynamics integrate with CA Spectrum? CA Spectrum is the application used in our Data Center to receive application alerts and CA Spectrum integrates with ServiceNow and create incidents for the alerts/events. I was wondering if AppDynamics can send SNMP traps or emails to Spectrum?  Thank you. Ferhana ^ Edited by @Ryan.Paredez to make the title clear
I'm trying to troubleshoot repeated authentication failures by specific user(s). When I filter the search with (user account status=failure), I see all failure events with the details below.

host - domain controllers (AD servers) of the region the user is in
source - WinEventLog:Security
sourcetype - WinEventLog:Security

My only concern is that the src_ip field doesn't show any IP address; it just has a dash (hyphen). The other fields are fine, as listed below.

src - domain controllers
src_ip - dash (hyphen)
src_user - account name of the user in AD

Can someone help with why src_ip isn't populating? Is it because Splunk couldn't determine it, or does some config tweak have to be done? Please help!
Hi there, I have just had a trial version of Splunk Enterprise installed on my laptop. The install went OK, but I can't get to the login page or open the application; see the error below. Any ideas? It's on a corporate network.

Can't reach this page
• Make sure the web address http://localhost:8000 is correct
• Search for this site on Bing
• Refresh the page
More information
The connection to the website was reset. Error Code: INET_E_DOWNLOAD_FAILURE
I have 3 types of events, as below:

Apr 2 11:35:28 vg1 : %ASA-4-113019: Group = EMPLOYEE, Username = karrc03, IP = ..., Session disconnected. Session Type: SSL, Duration: 2h:15m:12s, Bytes xmt: 59389646, Bytes rcv: 14229526, Reason: Idle Timeout
Apr 2 11:35:23 vg1 : %ASA-4-722051: Group User IP <...> IPv4 Address <...> IPv6 address <::> assigned to session
Apr 2 11:03:47 vg2 : %ASA-4-113005: AAA user authentication Rejected : reason = Invalid password : server = ... : user = SHAFED61 : user IP = ...

Now I would like to fetch the events, by user, into a separate column "EventType" based on the words "Session disconnected", "assigned to session", and "Rejected". Your help would be appreciated. Thanks in advance.
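A sketch of one approach using eval/case with match() against the raw event. The index, sourcetype, and the user field name are assumptions; extracting a single user field across all three message formats may also need its own rex, since the events name the user differently (Username =, user =):

```
index=network sourcetype=cisco:asa ("Session disconnected" OR "assigned to session" OR "Rejected")
| eval EventType=case(
    match(_raw, "Session disconnected"), "Session disconnected",
    match(_raw, "assigned to session"),  "Session assigned",
    match(_raw, "Rejected"),             "Authentication rejected")
| stats count by user EventType
```

Swap the final stats for a table or timechart depending on how you want it broken out by user.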
Good evening,

How do I extract a couple of email subject keywords from the specific field "message_subject"?

Consider the three sample email subjects below that users receive/send:

CEO urgent email for the invitation
CEO need the request urgently
secret national project

I want a count, per hour, not of the whole email subject but only of the number of "secret" and "urgent" hits, without the full subject.

My query:

index=mail-pri sourcetype="MSExchange*" sender=* OR recipient=* | search message_subject IN ("*secret*","*urgent*") | search NOT sender IN ("noreply@xyz.com","info@xyz.com") | timechart span=1h count by Message_subject

The count I get (for example) is three:

CEO urgent email for the invitation
CEO need the request urgently
secret national project

What I want to achieve is a count like:

Urgent 2
Secret 1

Thanks for your kind support.
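A sketch of one way to do this: classify each message into a keyword before the timechart, so the count is per keyword rather than per distinct subject. Note the caveat that a subject containing both words is counted only once, under the first matching branch of case():

```
index=mail-pri sourcetype="MSExchange*" (sender=* OR recipient=*)
| search NOT sender IN ("noreply@xyz.com","info@xyz.com")
| eval keyword=case(
    match(message_subject, "(?i)urgent"), "Urgent",
    match(message_subject, "(?i)secret"), "Secret")
| where isnotnull(keyword)
| timechart span=1h count by keyword
```

If double-counting a both-words subject is desired, building a multivalue keyword field (mvappend per match, then mvexpand) before the timechart is an alternative.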
In our inputs.conf we have a default ignoreOlderThan = 3d value set, but I would like to override that default for a specific monitor (source) stanza. I have tried ignoreOlderThan = 0d as well as ignoreOlderThan = disabled, but both result in an "Invalid value for parameter 'ignoreOlderThan'" error in splunkd.log. Is there a value that disables the ignoreOlderThan setting for a specific monitor/source without applying it to the entire inputs.conf?
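For what it's worth, the inputs.conf spec lists the default for ignoreOlderThan as 0, meaning the check is disabled, so a bare 0 with no time-unit suffix may be accepted where "0d" or "disabled" are rejected. An untested sketch; the monitor path is a placeholder:

```
[monitor:///var/log/keep_everything]
# Bare 0 (no s/m/h/d suffix) is the documented "disabled" default,
# so it may pass validation where "0d" or "disabled" do not.
ignoreOlderThan = 0
```

Verify the effective value afterwards with `splunk btool inputs list --debug` and watch splunkd.log for the validation error.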
I need to set up a TCP data input and I need to ensure that it uses SSL/TLS. I understand that I can add a stanza to an inputs.conf file, as referenced in this post:

https://answers.splunk.com/answers/684045/how-to-enable-tcp-data-input-with-ssl.html?utm_source=typeahead&utm_medium=newquestion&utm_campaign=no_votes_sort_relev

My question is: which inputs.conf file? The data is coming in to my search head server, and there are a bunch of apps installed there, each with its own inputs.conf file. Which one controls the TCP data inputs? Thanks.
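Not a definitive answer, but the stanza does not have to live in any of the existing apps' inputs.conf files; a common choice is $SPLUNK_HOME/etc/system/local/inputs.conf (or a small dedicated app's local directory) on the instance that receives the data. A sketch, with the port, sourcetype, certificate path, and password as placeholders:

```
# $SPLUNK_HOME/etc/system/local/inputs.conf on the receiving instance
[tcp-ssl://5140]
sourcetype = my_tcp_data

[SSL]
serverCert = $SPLUNK_HOME/etc/auth/server.pem
sslPassword = <certificate password>
```

Splunk merges all inputs.conf files by precedence, so where the stanza lives mainly affects manageability; `splunk btool inputs list --debug` shows which file each effective setting came from.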