All Posts


Resolved!  Two issues: (1) don't trust regex you find on the Internet, (2) trust but verify.

It turns out I had assumed that what I see in "Source" would line up with the data Splunk processed. It lined up with the regex ("New Process Name:" followed by a space), but in actuality the separator is a tab. I'm using this now. I could probably use "\t", but I'm playing it safe and allowing one or more whitespace characters:

blacklist2 = EventCode="4688" Message="New Process Name:\s+C:\\Program Files\\SplunkUniversalForwarder\\bin\\splunk-(?:powershell|regmon|admon|netmon|MonitorNoHandle).exe"

Above is what I ended up with. Not perfect, but good enough for a POC, and it actually works, at least in the current environment. Cheers!
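The whitespace pitfall described above is easy to reproduce outside Splunk. A minimal Python sketch (the event string below is an invented stand-in for the raw Windows event, not data from the thread):

```python
import re

# The raw event uses a tab after the colon, even though it renders as spaces in the UI.
event = "New Process Name:\tC:\\Program Files\\SplunkUniversalForwarder\\bin\\splunk-regmon.exe"

# Pattern with a literal space: misses the tab-separated value.
space_match = re.search(r"New Process Name: +C:", event)

# Pattern with \s+: matches any run of whitespace, including tabs.
ws_match = re.search(r"New Process Name:\s+C:", event)

print(space_match)           # None
print(ws_match is not None)  # True
```

This is exactly why \s+ is the safer choice here: it covers spaces, tabs, and anything else the UI renders as blank.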
Nice solution. I'm working with a similar situation. What does this look like with a checkbox?
I have two query tables.

Table 1:

index="k8s_main" namespace="app02013" "EConcessionItemProcessingStartedHandler.createRma PH successfully created RMA" NOT [search index="k8s_main" namespace="app02013" "NonCustomerOrderShippingLabelGeneratedEventsUtil.processShippingLabelEvent Successfully published" | fields LPN]
| rex "LPN\": \"(?<LPN>[^,]+)\"\,"
| rex "location\": \"(?<location>[^,]+)\"\,"
| rex "orderNumber\": \"(?<orderNumber>[^,]+)\"\,"
| rex "orderLineId\": \"(?<orderLineId>[^,]+)\"\,"
| dedup orderLineId
| eval LPN = replace(LPN, "\\[|\\]", "")
| eval location = replace(location, "\\[|\\]", "")
| eval orderNumber = replace(orderNumber, "\\[|\\]", "")
| eval orderLineId = replace(orderLineId, "\\[|\\]", "")
| table LPN location orderNumber orderLineId

Table 2:

index="k8s_main" namespace="app02013" "Published successfully event=[order-events-avro / com.nordstrom.customer.event.OrderLineReturnReceived]" ECONCESSION
| rex "orderLineId\": \"(?<orderLineId>[^,]+)\"\,"
| rex "orderNumber\": \"(?<orderNumber>[^,]+)\"\,"
| dedup orderLineId
| eval orderNumber = replace(orderNumber, "\"", "")
| eval orderLineId = replace(orderLineId, "\"", "")
| table orderNumber orderLineId

Here is my join query:

index="k8s_main" namespace="app02013" "EConcessionItemProcessingStartedHandler.createRma PH successfully created RMA" NOT [search index="k8s_main" namespace="app02013" "NonCustomerOrderShippingLabelGeneratedEventsUtil.processShippingLabelEvent Successfully published" | fields LPN]
| rex "LPN\": \"(?<LPN>[^,]+)\"\,"
| rex "location\": \"(?<location>[^,]+)\"\,"
| rex "orderNumber\": \"(?<orderNumber>[^,]+)\"\,"
| rex "orderLineId\": \"(?<orderLineId>[^,]+)\"\,"
| dedup orderLineId
| eval LPN = replace(LPN, "\\[|\\]", "")
| eval location = replace(location, "\\[|\\]", "")
| eval orderNumber = replace(orderNumber, "\\[|\\]", "")
| eval orderLineId = replace(orderLineId, "\\[|\\]", "")
| table LPN location orderNumber orderLineId
| join left=L right=R where L.orderLineId = R.orderLineId [search index="k8s_main" namespace="app02013" "Published successfully event=[order-events-avro / com.nordstrom.customer.event.OrderLineReturnReceived]" ECONCESSION
    | rex "orderLineId\": \"(?<orderLineId>[^,]+)\"\,"
    | rex "orderNumber\": \"(?<orderNumber>[^,]+)\"\,"
    | dedup orderLineId
    | eval orderNumber = replace(orderNumber, "\"", "")
    | eval orderLineId = replace(orderLineId, "\"", "")
    | table orderNumber orderLineId]

Each table returns unique rows, but the join query returns fewer rows than either table. Please help me find the problem.
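As a general note on join behavior: an inner-style join drops every left-side row whose key has no match on the right, which is one common reason a joined result has fewer rows than either input (subsearch result limits are another thing worth checking). A small Python sketch with hypothetical orderLineId values illustrates the row-count difference:

```python
# Hypothetical orderLineId values, invented for illustration only.
left = [{"orderLineId": k} for k in ("a", "b", "c", "d")]
right = [{"orderLineId": k, "orderNumber": n} for k, n in (("a", 1), ("c", 3))]

right_by_key = {r["orderLineId"]: r for r in right}

# Inner-style join: only rows whose key exists on both sides survive.
inner = [{**l, **right_by_key[l["orderLineId"]]}
         for l in left if l["orderLineId"] in right_by_key]

# Left join: every left row is kept, matched or not.
left_join = [{**l, **right_by_key.get(l["orderLineId"], {})} for l in left]

print(len(inner))      # 2 -- unmatched left rows dropped
print(len(left_join))  # 4 -- all left rows kept
```

If the two Splunk tables really do share every orderLineId, the next things to look at are truncation of the subsearch and differences introduced by the dedup/replace steps.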
God bless it, I feel so dumb now, lol. Fixing the "a" from upper to lowercase was all I needed to do. Thank you for catching that; I didn't realize that capitalization would have an effect, but I see now why it does. Thanks again, everyone. It works great now.
Using YYYY-MM-DD HH:MM:SS will yield incorrect results in the current Dashboard Studio version because of the overlap between Month (MM) and Minute (mm). The correct format is: YYYY-MM-DD HH:mm:ss. @sbarnes_nj was correct in pointing to the format reference here: https://momentjs.com/docs/#/displaying/
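The month-versus-minute trap exists in other formatting systems too. As an analogy (not Moment.js itself), Python's strftime distinguishes %m (month) from %M (minute), so swapping the case silently corrupts the output just as MM/mm does in Moment.js:

```python
from datetime import datetime

ts = datetime(2024, 6, 6, 4, 10, 16)

# Month token in the date part, minute token in the time part.
correct = ts.strftime("%Y-%m-%d %H:%M:%S")

# Tokens swapped: minute where the month belongs, and vice versa.
swapped = ts.strftime("%Y-%M-%d %H:%m:%S")

print(correct)  # 2024-06-06 04:10:16
print(swapped)  # 2024-10-06 04:06:16  -- no error, just wrong values
```

The failure mode is the same in both systems: no error is raised, the values are simply wrong, which is why it is easy to miss in a dashboard.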
Since multiple lines within the event start with "Importer", we can't use that to break the event. I suggest breaking after "Elapsed Time". Try these settings:

SHOULD_LINEMERGE = false
LINE_BREAKER = Elapsed Time:\d+\/\d+\/\d+ \d+:\d+:\d+ \w\w([\r\n]+)
MAX_TIMESTAMP_LOOKAHEAD = 23
TIME_PREFIX = Started:\s+
TIME_FORMAT = %m/%d/%Y %I:%M:%S %p
EVENT_BREAKER = Elapsed Time:\d+\/\d+\/\d+ \d+:\d+:\d+ \w\w([\r\n]+)
EVENT_BREAKER_ENABLE = true
KV_MODE = none
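One way to sanity-check an event boundary before touching props.conf is to prototype it in a regex tester or a few lines of Python. Note that in the sample log, "Elapsed Time:" is followed by a duration (00:00:07.4098788) rather than a date, so the sketch below matches a duration; treat that pattern as an assumption to verify against the raw file. The two events here are abbreviated stand-ins for the real log:

```python
import re

# Condensed stand-in for two consecutive events in the sample log.
raw = (
    "Importer: A Started : 6/6/2024 4:10:16 AM\n"
    "Importer: A Elapsed Time: 00:00:07.4098788\n"
    "Importer: B Started : 6/6/2024 4:10:24 AM\n"
    "Importer: B Elapsed Time: 00:00:00.9732853\n"
)

# Each event runs from "Importer:" up to an "Elapsed Time: <duration>" line.
# The duration-style pattern is an assumption based on the sample, not the
# date-style pattern from the answer above.
event_re = re.compile(r"Importer:.*?Elapsed Time: \d+:\d+:\d+\.\d+", re.S)

events = event_re.findall(raw)
print(len(events))  # 2
```

If the prototype splits the sample correctly, the same boundary regex can be translated into LINE_BREAKER/EVENT_BREAKER form with the newline run in a capture group.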
I'm trying to allow users to run a limited search against indexes they don't have access to. This might very well be the problem (and maybe it's not possible), but I'm hoping the solution below should work and I'm simply missing a user capability/permission (unrelated to the index access) somewhere.

I set up a saved search (using variables) to run as the owner (user 'A', who does have access to the indexes). I then set up a dashboard to receive those variables and pass them along to a search panel using a search similar to '| savedsearch searchname var1=$v1$ var2=$v2$'. The dashboard works when running as the user with access to the indexes (user 'A'), so the search and variable passthrough appear to be working. When I run as a test user (with only the default 'user' Splunk capabilities and no index access), I get no results.

Is what I am trying to accomplish possible? If it is, does anyone have guidance on what I might be doing wrong? I asked this in the community Slack as well. I'm trying to avoid a summary index if possible, as the long-term goal is to have multiple users (without index permissions) each run the search specific to them, without allowing any user access to other users' searches.

An example scenario is viewing a user's web history as seen from a firewall or secure web gateway (allows vs. blocks), limiting the search to the logged-in user ($env:user$). This could also be used by a support center (a group of users) doing first-level troubleshooting who might not need access to all the logs available in an index.
Unfortunately, as you can see, it's still splitting the two lines.
Can you build your dashboard in SimpleXML / Classic (as token management is a little better there)?
Hi team, I am not getting the event break where required. My requirement is to break events from a log file where each event starts with "Importer:" and ends with "Elapsed Time:". Below is the config I used. Please suggest any change to the props config, or confirm I am good to go.

SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\S+\s\S+\s\W+
MAX_TIMESTAMP_LOOKAHEAD = -1
TIME_PREFIX = ^\*Importer:\s+
TIME_FORMAT = %m/%d/%Y %I:%M:%S %p
EVENT_BREAKER = ([\n\r]*Elapsed Time:\.)
EVENT_BREAKER_ENABLE = true
KV_MODE = none

Sample log:

Importer: DealerLoansImporter Started : 6/6/2024 4:10:16 AM
Begin Reading Data File: \\nao.global.gmacfs.com\AllyApps\Ipartners.Pd\Facts_to_Carrs\GC01RD21.DLR_LOAN_20240605223729.DAT : 6/6/2024 4:10:16 AM
End Reading Data File: \\nao.global.gmacfs.com\AllyApps\Ipartners.Pd\Facts_to_Carrs\GC01RD21.DLR_LOAN_20240605223729.DAT : 6/6/2024 4:10:16 AM
Beginning Dealer Loans truncate table : 6/6/2024 4:10:16 AM
Completed Dealer Loans truncate table : 6/6/2024 4:10:16 AM
Begin Loading Database : 6/6/2024 4:10:16 AM
1757 Total Records Inserted : 6/6/2024 4:10:17 AM
Beginning RefreshDealerLoansMonthEnd : 6/6/2024 4:10:17 AM
Completed RefreshDealerLoansMonthEnd : 6/6/2024 4:10:18 AM
Beginning RefreshDealerLoan : 6/6/2024 4:10:18 AM
Completed RefreshDealerLoan : 6/6/2024 4:10:21 AM
Beginning Adv_RefreshProposalCreditLineSummaryFromDealerLoan : 6/6/2024 4:10:21 AM
Completed Adv_RefreshProposalCreditLineSummaryFromDealerLoan : 6/6/2024 4:10:22 AM
Beginning RefreshBorrowerLoanForDefault : 6/6/2024 4:10:22 AM
Completed RefreshBorrowerLoanForDefault : 6/6/2024 4:10:22 AM
Beginning RefreshBorrowerLoanForDCVR : 6/6/2024 4:10:22 AM
Completed RefreshBorrowerLoanForDCVR : 6/6/2024 4:10:23 AM
Importer: DealerLoansImporter Ended : 6/6/2024 4:10:24 AM
Importer: DealerLoansImporter Elapsed Time: 00:00:07.4098788
****************************************************************************************************
****************************************************************************************************
Importer: AdvantageDimensionImporter Started : 6/6/2024 4:10:24 AM
Begin Reading Data File: \\nao.global.gmacfs.com\AllyApps\Ipartners.Pd\Facts_to_Carrs\ADV_DIM_20240606030006.DAT : 6/6/2024 4:10:24 AM
End Reading Data File: \\nao.global.gmacfs.com\AllyApps\Ipartners.Pd\Facts_to_Carrs\ADV_DIM_20240606030006.DAT : 6/6/2024 4:10:24 AM
Beginning AdvantageDimension truncate table : 6/6/2024 4:10:24 AM
Completed AdvantageDimension truncate table : 6/6/2024 4:10:24 AM
Begin Loading Database : 6/6/2024 4:10:24 AM
411 Total Records Inserted : 6/6/2024 4:10:24 AM
Beginning refreshing Dimensions : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshFranchiseFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshFranchiseFromDimension : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshDealerCommercialPrivilegesTypeFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshDealerCommercialPrivilegesTypeFromDimension : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshBACManufacturerType : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshBACManufacturerType : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshStateFromDimensions : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshStateFromDimensions : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshFormOfBusinessTypeFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshFormOfBusinessTypeFromDimension : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshTAATypeFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshTAATypeFromDimension : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshGuarantorAssociationTypeFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshGuarantorAssociationTypeFromDimension : 6/6/2024 4:10:24 AM
Beginning FetchNewDealerStatusAdvantage : 6/6/2024 4:10:24 AM
Completed FetchNewDealerStatusAdvantage : 6/6/2024 4:10:24 AM
Beginning FetchDeletedDealerStatusAdvantage : 6/6/2024 4:10:24 AM
Completed FetchDeletedDealerStatusAdvantage : 6/6/2024 4:10:24 AM
Beginning FetchDealerStatusAdvantageChanges : 6/6/2024 4:10:24 AM
Completed FetchDealerStatusAdvantageChanges : 6/6/2024 4:10:25 AM
Completed refreshing Dimensions : 6/6/2024 4:10:25 AM
Importer: AdvantageDimensionImporter Ended : 6/6/2024 4:10:25 AM
Importer: AdvantageDimensionImporter Elapsed Time: 00:00:00.9732853
****************************************************************************************************
****************************************************************************************************
Importer: SmartAuctionImporter Started : 6/6/2024 4:10:25 AM
Importer: SmartAuctionImporter Ended : 6/6/2024 4:10:25 AM
Importer: SmartAuctionImporter Elapsed Time: 00:00:00.0312581
****************************************************************************************************
The query looks OK, but its speed also depends on how many events it is processing.

Try running the search more often over a smaller time range.
Try to reduce "indexes" to the smallest set of indexes that contain relevant Windows events.
Consider removing the lookup and hard-coding the relevant event codes:

index IN (indexes) sourcetype=xmlwineventlog sAMAccountName IN (_x*, x_*, lx*, hh*)
| eval action = case(event_code=x, "login_failure", event_code=y, "lockout")
| stats count(eval(action=="login_failure")) as failure_count, count(eval(action=="lockout")) as lockout_count by sAMAccountName
| where failure_count >= 3 OR lockout_count > 0
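To double-check the threshold logic while testing, the stats/where pipeline above can be mirrored in plain Python (the event codes "x"/"y" and the account names below are placeholders, as in the SPL):

```python
from collections import defaultdict

# Placeholder events; "x" stands in for the failure code, "y" for the lockout code.
events = [
    {"sAMAccountName": "lx_alice", "event_code": "x"},
    {"sAMAccountName": "lx_alice", "event_code": "x"},
    {"sAMAccountName": "lx_alice", "event_code": "x"},
    {"sAMAccountName": "hh_bob",   "event_code": "y"},
    {"sAMAccountName": "x_carol",  "event_code": "x"},
]

# Mirror of: stats count(eval(...)) ... by sAMAccountName
counts = defaultdict(lambda: {"failure_count": 0, "lockout_count": 0})
for e in events:
    field = {"x": "failure_count", "y": "lockout_count"}.get(e["event_code"])
    if field:
        counts[e["sAMAccountName"]][field] += 1

# Mirror of: where failure_count >= 3 OR lockout_count > 0
flagged = {name for name, c in counts.items()
           if c["failure_count"] >= 3 or c["lockout_count"] > 0}
print(flagged)  # {'lx_alice', 'hh_bob'}
```

Having a tiny reference implementation like this makes it easy to verify the SPL thresholds behave as intended before optimizing the search itself.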
Hi Dhana, I'm currently facing the same issue. Did you find a way to solve this issue without using a load balancer? Thanks.
Yes. You can name multiple capture groups in one rex statement, e.g.

| rex field=my_field "foo:\s+\"(?<first_capture>[^\"]+)\",\s+bar:\s+(?<second_capture>[^\"]+)"
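The same multi-group pattern can be checked in any PCRE-style engine. A Python sketch (Python spells named groups (?P<name>...), and the sample string below is invented):

```python
import re

# Invented sample value for my_field.
my_field = 'foo: "hello", bar: world'

# Same shape as the rex above; note Python's (?P<name>...) named-group syntax.
m = re.search(r'foo:\s+"(?P<first_capture>[^"]+)",\s+bar:\s+(?P<second_capture>[^"]+)',
              my_field)

print(m.group("first_capture"))   # hello
print(m.group("second_capture"))  # world
```

Both groups come back from the single search, just as a single rex emits both fields.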
index="webmethods_prd" source="/apps/WebMethods/IntegrationServer/instances/default/logs/DFO.log"
| eval timestamp=strftime(_time, "%Y-%m-%d %H:00")
| stats values(B2BUnknownTrxCount) by timestamp
index="webmethods_prd" source="/apps/WebMethods/IntegrationServer/instances/default/logs/DFO.log"
| eval timestamp=strftime(_time, "%F"), hour=strftime(_time, "%H,%M")
| stats list(hour) as hour, list(B2BUnknownTrxCount) by timestamp
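The strftime tokens used here behave the same way in Python, which makes it easy to preview the bucket labels. Note that %F is shorthand for %Y-%m-%d; Splunk's strftime supports %F directly, but Python's support for it is platform-dependent, so the expanded form is used below:

```python
from datetime import datetime

t = datetime(2024, 6, 6, 4, 10, 16)

day_bucket = t.strftime("%Y-%m-%d")  # what %F expands to
hour_min = t.strftime("%H,%M")       # the per-day "hour" label in the search

print(day_bucket, hour_min)  # 2024-06-06 04,10
```

Each event thus contributes one day-level row key and one "HH,MM" label, which stats then gathers with list() per day.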
This is the log data (screenshot omitted). I want a report like this (screenshot omitted). My current query is:

index="webmethods_prd" source="/apps/WebMethods/IntegrationServer/instances/default/logs/DFO.log"
| eval timestamp=strftime(_time, "%F")
| stats values(B2BUnknownTrxCount) by timestamp

It gives a report like this (screenshot omitted). I need to add the time in hh:mm to the chart. Please help me update my query.
Hello, I have been asked to optimize this search because it is taking too long to run. I am not sure how else I can write it to make it run faster. It's not throwing any errors; it just takes a long time to run. Any help would be highly appreciated. Thanks!

index IN (indexes) sourcetype=xmlwineventlog sAMAccountName IN (_x*, x_*, lx*, hh*)
| lookup mas_pam_eventcode.csv event_code AS EventCode OUTPUT action
| stats count(eval(action=="login_failure")) as failure_count, count(eval(action=="lockout")) as lockout_count by sAMAccountName
| where failure_count >= 3 OR lockout_count > 0
Sample dashboard:

{
  "visualizations": {
    "viz_3hKq7uoX": {
      "type": "splunk.singlevalue",
      "options": {},
      "dataSources": { "primary": "ds_a2mWNgri" }
    },
    "viz_JUFcFWVl": {
      "type": "splunk.singlevalue",
      "options": {},
      "dataSources": { "primary": "ds_LbaP4o2H" }
    }
  },
  "dataSources": {
    "ds_a2mWNgri": {
      "type": "ds.search",
      "options": {
        "query": "| makeresults\n| eval selected_total = count($element$)\n| table selected_total"
      },
      "name": "Search_eval"
    },
    "ds_LbaP4o2H": {
      "type": "ds.search",
      "options": {
        "query": "| makeresults\n| stats count($element$) as selected_total\n| table selected_total"
      },
      "name": "Search_stats"
    }
  },
  "defaults": {
    "dataSources": {
      "ds.search": {
        "options": {
          "queryParameters": {
            "latest": "$global_time.latest$",
            "earliest": "$global_time.earliest$"
          }
        }
      }
    }
  },
  "inputs": {
    "input_global_trp": {
      "type": "input.timerange",
      "options": { "token": "global_time", "defaultValue": "-24h@h,now" },
      "title": "Global Time Range"
    },
    "input_QDGCxYVt": {
      "options": {
        "items": [
          { "label": "Apple", "value": "a" },
          { "label": "Banana", "value": "b" },
          { "label": "Coconut", "value": "c" },
          { "label": "Dragonfruit", "value": "d" },
          { "label": "Elderberry", "value": "e" },
          { "label": "Fig", "value": "f" },
          { "label": "Grape", "value": "g" }
        ],
        "token": "element"
      },
      "title": "Select something",
      "type": "input.multiselect"
    }
  },
  "layout": {
    "type": "absolute",
    "options": { "width": 1440, "height": 960, "display": "auto" },
    "structure": [
      { "item": "viz_3hKq7uoX", "type": "block", "position": { "x": 20, "y": 20, "w": 250, "h": 250 } },
      { "item": "viz_JUFcFWVl", "type": "block", "position": { "x": 290, "y": 20, "w": 250, "h": 250 } }
    ],
    "globalInputs": [ "input_global_trp", "input_QDGCxYVt" ]
  },
  "description": "test",
  "title": "test"
}
Yes, column 3 should equal column 5.

*** The last row contained a mistake, which I have corrected. Abc should always equal abc.

That's how the clean tables look:

ID  col5
a   abc
b   abc
b   xyz
f   abc
i   abc
i   xyz

ID  Col3  col4
a   abc   No
a   xyz   Yes
b   xyz   No
b   fgh   Yes
b   abc   No
f   abc   No
f   xyz   No
i   xyz   Yes
i   abc   No
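For a quick cross-check of the cleaned tables, here is a Python sketch that flags rows of the second table whose (ID, Col3) pair does or does not appear as (ID, col5) in the first table. The matching rule (same ID, Col3 equal to col5) is my reading of the thread, so treat it as an assumption:

```python
# Rows copied from the cleaned tables above.
col5_rows = {("a", "abc"), ("b", "abc"), ("b", "xyz"),
             ("f", "abc"), ("i", "abc"), ("i", "xyz")}

col3_rows = [("a", "abc", "No"), ("a", "xyz", "Yes"), ("b", "xyz", "No"),
             ("b", "fgh", "Yes"), ("b", "abc", "No"), ("f", "abc", "No"),
             ("f", "xyz", "No"), ("i", "xyz", "Yes"), ("i", "abc", "No")]

# Rows whose (ID, Col3) pair also appears as (ID, col5), and those that don't.
matched = [r for r in col3_rows if (r[0], r[1]) in col5_rows]
unmatched = [r for r in col3_rows if (r[0], r[1]) not in col5_rows]

print(len(matched), len(unmatched))  # 6 3
```

With these values, six rows of the second table have a matching (ID, col5) pair and three do not, which is the kind of discrepancy the comparison should surface.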
Hi @Scott.Lucier, I wanted to share this AppD Documentation. Let me know if it helps out.  https://docs.appdynamics.com/appd/23.x/23.11/en/appdynamics-essentials/alert-and-respond/health-rules/health-rule-schedules