All Posts

You can give this a try: https://github.com/dtburrows3/Splunk_Expand_Macros_Command It's a work in progress, but it seems to cover the majority of things I have thrown at it so far.
Thanks a million!
Thanks Yuan. The issue I am seeing is that the value for "location" is coming back empty, whereas I can see there is data for location in the raw events. What could be the issue? Thanks!
There is more than one way to do that. If you want to create a new field, use eval with relative_time and strptime.

<<base search>>
| eval SummaryDate = relative_time(strptime(Date, "%m/%d/%Y"), "@w+1d")
| chart sum(results) over SummaryDate

Since "@w" snaps to Sunday, we add "+1d" to start the week on Monday. Here's another way, using the bin command.

<<base search>>
| bin _time span=1w
``` Convert _time from Sundays to Mondays ```
| eval SummaryDate = _time+86400
| chart sum(results) over SummaryDate
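A quick, hedged way to sanity-check the first approach against the sample dates from the question (makeresults is used here only to fabricate test rows; the Date values are assumed to be month/day/year):

| makeresults count=2
| streamstats count as n
| eval Date=if(n=1, "6/15/2024", "6/18/2024")
``` snap to the previous Sunday, then add a day to land on Monday ```
| eval SummaryDate = strftime(relative_time(strptime(Date, "%m/%d/%Y"), "@w+1d"), "%m/%d/%Y")
| table Date SummaryDate

This should return 6/10/2024 and 6/17/2024 as the Summary Dates, matching the Monday-of-week values the question asks for.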
I am trying to write a Splunk search to pull which rules a particular user is hitting. This search is helping with that, BUT everything is coming through as a urlrulelabel. When I move apprulelabel to the start of the line, everything comes through as an apprulelabel. When I dive into the events, I see there are other rules showing, but they aren't populating in the statistics table. I would like to have each rule come through as its own.

index=zscaler sourcetype=zscalernss-web user=*
| eval rule_type=case(isnotnull(urlrulelabel), "urlrulelabel", isnotnull(apprulelabel), "apprulelabel", isnotnull(rulelabel), "rulelabel", true(), "unknown")
| eval rule=coalesce(apprulelabel, urlrulelabel, rulelabel)
| stats count by rule, rule_type
| rename rule as Rule, rule_type as "Type of Rule", count as "Hit Count"
| sort - "Hit Count"

Thank you in advance
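For what it's worth, a hedged sketch (not from the thread) of one way to count every populated rule field on an event, rather than only the first non-null one that case()/coalesce picks; the field names are taken from the query above:

index=zscaler sourcetype=zscalernss-web user=*
``` build one multivalue entry per rule field present on the event; mvappend skips nulls ```
| eval rules=mvappend(
    if(isnotnull(urlrulelabel), "urlrulelabel=".urlrulelabel, null()),
    if(isnotnull(apprulelabel), "apprulelabel=".apprulelabel, null()),
    if(isnotnull(rulelabel), "rulelabel=".rulelabel, null()))
``` one row per rule field per event ```
| mvexpand rules
| rex field=rules "^(?<rule_type>[^=]+)=(?<rule>.+)$"
| stats count by rule, rule_type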
eval is the command to use to add a new field to an event. Use the relative_time function to help set the value.

| eval newField = relative_time(now(), "-7d@d")
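A minimal, hedged sketch for trying this out (makeresults and the newField_display name are just for illustration): relative_time returns epoch seconds, so strftime can make the value readable.

| makeresults
| eval newField = relative_time(now(), "-7d@d")
``` format the epoch value for display ```
| eval newField_display = strftime(newField, "%Y-%m-%d %H:%M:%S")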
Here's what I found (with the help of the Perplexity engine) - it saved me...: the fields_list in the transforms.conf stanza should match the column names in your CSV file.
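For reference, a hedged sketch of the kind of stanza being described, assuming an external lookup (the stanza name, script name, and field names here are made up for illustration): fields_list enumerates the lookup fields, and the header row of the CSV the script reads and writes has to carry the same names.

[my_external_lookup]
external_cmd = my_lookup.py host status
fields_list = host, status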
I am analyzing some .csvs which have a "date" field present. The .csvs are indexed, but the index time is pretty irrelevant; the "date" field, however, is important.

I am trying to create a new field which would represent the first day of the week relative to the "date" field in my data. Ultimately I am going to create some charts over time which will use this new field.

Below is an example of my desired outcome - from the date present as a field in the .csv, create a new field (Summary Date) which shows the date of the Monday for that week.

Date (present in .csv)   Summary Date (new field)
6/15/2024                6/10/2024
6/16/2024                6/10/2024
6/18/2024                6/17/2024

* Realizing there may be more than one way to skin the cat, ultimately I am looking to group results by week in line charts. The query will be very basic, something like this:

<base search>
| chart sum(results) by "Summary Date"

And I want the date shown on the X-axis to be the first day (Monday in my case) of every week. Maybe there is an easier solution than creating a new "Summary Date" field via an eval expression, but that is where my head goes first. Any suggestions are appreciated!
How do I add a new field and set its value to seven days ago from the current date, snapped to the beginning of the day? I know the time-modifier syntax would be "earliest=-7d@d", but I am unsure whether I should use the eval command to add the field, and of the specific syntax. Thanks.
Greetings to you!! I have a file with the following content:

My city is very good
your city is also very good
but
but
but
but

Now, I want only three lines to be indexed in Splunk:

My city is very good
your city is also very good
but

Since "but" appears multiple times, we want to keep only one "but" out of many. I want to write props (or any other kind of configuration) so that I can achieve this result. Kindly help!!
Resolved!  Two issues: (1) Don't trust regex you find on the Internet, (2) Trust, but verify.

Turns out, I had assumed what I see in "Source" would line up with the data Splunk processed. It lined up with the regex ("New Process Name:  " has a space after the colon). In actuality, this is a tab. I'm using this now. I could probably use "\t", but I'm playing it safe and allowing one or more whitespace characters.

blacklist2 = EventCode="4688" Message="New Process Name:\s+C:\\Program Files\\SplunkUniversalForwarder\\bin\\splunk-(?:powershell|regmon|admon|netmon|MonitorNoHandle).exe"

Above is what I ended up with. Not perfect, but good enough for a POC, and it actually works, at least in the current environment. Cheers!
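For context, a minimal sketch of where a blacklist setting like this typically lives, assuming the standard Windows Security event log input in inputs.conf (the stanza name and disabled line are assumptions; the blacklist2 line is the one from this post):

[WinEventLog://Security]
disabled = 0
blacklist2 = EventCode="4688" Message="New Process Name:\s+C:\\Program Files\\SplunkUniversalForwarder\\bin\\splunk-(?:powershell|regmon|admon|netmon|MonitorNoHandle).exe"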
Nice solution. I'm working with a similar situation. What does this look like with a checkbox?
I have two query tables.

Table 1:

index="k8s_main" namespace="app02013" "EConcessionItemProcessingStartedHandler.createRma PH successfully created RMA" NOT [search index="k8s_main" namespace="app02013" "NonCustomerOrderShippingLabelGeneratedEventsUtil.processShippingLabelEvent Successfully published" | fields LPN]
| rex "LPN\": \"(?<LPN>[^,]+)\"\,"
| rex "location\": \"(?<location>[^,]+)\"\,"
| rex "orderNumber\": \"(?<orderNumber>[^,]+)\"\,"
| rex "orderLineId\": \"(?<orderLineId>[^,]+)\"\,"
| dedup orderLineId
| eval LPN = replace(LPN, "\\[|\\]", "")
| eval location = replace(location, "\\[|\\]", "")
| eval orderNumber = replace(orderNumber, "\\[|\\]", "")
| eval orderLineId = replace(orderLineId, "\\[|\\]", "")
| table LPN location orderNumber orderLineId

Table 2:

index="k8s_main" namespace="app02013" "Published successfully event=[order-events-avro / com.nordstrom.customer.event.OrderLineReturnReceived]" ECONCESSION
| rex "orderLineId\": \"(?<orderLineId>[^,]+)\"\,"
| rex "orderNumber\": \"(?<orderNumber>[^,]+)\"\,"
| dedup orderLineId
| eval orderNumber = replace(orderNumber, "\"", "")
| eval orderLineId = replace(orderLineId, "\"", "")
| table orderNumber orderLineId

Here is my join query:

index="k8s_main" namespace="app02013" "EConcessionItemProcessingStartedHandler.createRma PH successfully created RMA" NOT [search index="k8s_main" namespace="app02013" "NonCustomerOrderShippingLabelGeneratedEventsUtil.processShippingLabelEvent Successfully published" | fields LPN]
| rex "LPN\": \"(?<LPN>[^,]+)\"\,"
| rex "location\": \"(?<location>[^,]+)\"\,"
| rex "orderNumber\": \"(?<orderNumber>[^,]+)\"\,"
| rex "orderLineId\": \"(?<orderLineId>[^,]+)\"\,"
| dedup orderLineId
| eval LPN = replace(LPN, "\\[|\\]", "")
| eval location = replace(location, "\\[|\\]", "")
| eval orderNumber = replace(orderNumber, "\\[|\\]", "")
| eval orderLineId = replace(orderLineId, "\\[|\\]", "")
| table LPN location orderNumber orderLineId
| join left=L right=R where L.orderLineId = R.orderLineId
    [search index="k8s_main" namespace="app02013" "Published successfully event=[order-events-avro / com.nordstrom.customer.event.OrderLineReturnReceived]" ECONCESSION
    | rex "orderLineId\": \"(?<orderLineId>[^,]+)\"\,"
    | rex "orderNumber\": \"(?<orderNumber>[^,]+)\"\,"
    | dedup orderLineId
    | eval orderNumber = replace(orderNumber, "\"", "")
    | eval orderLineId = replace(orderLineId, "\"", "")
    | table orderNumber orderLineId]

Each table returns unique rows, but the result of the above query returns less data. Please help me find the problem.
God bless it, I feel so dumb now, lol.  Fixing the "a" from uppercase to lowercase was all I needed to do.  Thank you for catching that; I didn't realize that the capitalization would have an effect, but I see now why it does.  Thanks again, everyone - it works great now.
Using YYYY-MM-DD HH:MM:SS will yield incorrect results with the current Dashboard Studio version, because the uppercase MM token means Month while Minute is lowercase mm. The correct format would be: YYYY-MM-DD HH:mm:ss. @sbarnes_nj was correct in citing the format reference here: https://momentjs.com/docs/#/displaying/
Since multiple lines within the event start with "Importer", we can't use that to break the event. I suggest breaking after "Elapsed Time".  Try these settings:

SHOULD_LINEMERGE = false
LINE_BREAKER = Elapsed Time:\d+\/\d+\/\d+ \d+:\d+:\d+ \w\w([\r\n]+)
MAX_TIMESTAMP_LOOKAHEAD = 23
TIME_PREFIX = Started:\s+
TIME_FORMAT = %m/%d/%Y %I:%M:%S %p
EVENT_BREAKER = Elapsed Time:\d+\/\d+\/\d+ \d+:\d+:\d+ \w\w([\r\n]+)
EVENT_BREAKER_ENABLE = true
KV_MODE = none
I'm trying to allow users to have a limited search against indexes they don't have access to. This might very well be the problem (and maybe it's not possible), but I'm hoping the solution below should work and I'm simply missing a user capability/permission (unrelated to the index access) somewhere.

Set up a saved search (using variables) to run as the owner (user 'A', who does have access to the indexes). Set up a dashboard to receive those variables and pass them along to a search panel using a search similar to '| savedsearch searchname var1=$v1$ var2=$v2$'. The dashboard works when running as the user with access to the indexes (user 'A'), so the search and variable passthrough appear to be working. When I run as a test user (with only default 'user' Splunk capabilities and no index access) I get no results.

Is what I am trying to accomplish possible? If it is, does anyone have any guidance on what I might be doing wrong? I asked this in the community Slack as well. I'm trying to avoid a summary index if possible, as the long-term goal is to have multiple users (without index permissions) be able to run the search specific to them without allowing each user access to all other users' searches.

An example scenario is viewing a user's web history as seen from a firewall or secure web gateway (allows vs. blocks), and limiting the search to the logged-in user ($env:user$). This could also be used by a support center (a group of users) doing first-level troubleshooting who might not need access to all the logs available in an index.
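For context, a minimal SimpleXML sketch of the passthrough being described, assuming two text inputs feeding the $v1$/$v2$ tokens (the form layout, labels, and input types are assumptions; the savedsearch line is the one from the post):

<form>
  <label>Saved search passthrough</label>
  <fieldset submitButton="true">
    <input type="text" token="v1">
      <label>var1</label>
    </input>
    <input type="text" token="v2">
      <label>var2</label>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search>
          <query>| savedsearch searchname var1=$v1$ var2=$v2$</query>
        </search>
      </table>
    </panel>
  </row>
</form>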
Unfortunately, as you can see, it's still splitting the two lines.
Can you build your dashboard in SimpleXML / Classic (as token management is a little better there)?
Hi team, I am not getting the event break I require. My requirement is to break events from a log file which start with "Importer:" and end with "Elapsed Time:". Below is the config I did. Please suggest any change needed in the props config, or whether I am good to go.

SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)\S+\s\S+\s\W+
MAX_TIMESTAMP_LOOKAHEAD=-1
TIME_PREFIX=^\*Importer:\s+
TIME_FORMAT=%m/%d/%Y %I:%M:%S %p
EVENT_BREAKER = ([\n\r]*Elapsed Time:\.)
EVENT_BREAKER_ENABLE = true
KV_MODE=none

Sample log:

Importer: DealerLoansImporter Started : 6/6/2024 4:10:16 AM
Begin Reading Data File: \\nao.global.gmacfs.com\AllyApps\Ipartners.Pd\Facts_to_Carrs\GC01RD21.DLR_LOAN_20240605223729.DAT : 6/6/2024 4:10:16 AM
End Reading Data File: \\nao.global.gmacfs.com\AllyApps\Ipartners.Pd\Facts_to_Carrs\GC01RD21.DLR_LOAN_20240605223729.DAT : 6/6/2024 4:10:16 AM
Beginning Dealer Loans truncate table : 6/6/2024 4:10:16 AM
Completed Dealer Loans truncate table : 6/6/2024 4:10:16 AM
Begin Loading Database : 6/6/2024 4:10:16 AM
1757 Total Records Inserted : 6/6/2024 4:10:17 AM
Beginning RefreshDealerLoansMonthEnd : 6/6/2024 4:10:17 AM
Completed RefreshDealerLoansMonthEnd : 6/6/2024 4:10:18 AM
Beginning RefreshDealerLoan : 6/6/2024 4:10:18 AM
Completed RefreshDealerLoan : 6/6/2024 4:10:21 AM
Beginning Adv_RefreshProposalCreditLineSummaryFromDealerLoan : 6/6/2024 4:10:21 AM
Completed Adv_RefreshProposalCreditLineSummaryFromDealerLoan : 6/6/2024 4:10:22 AM
Beginning RefreshBorrowerLoanForDefault : 6/6/2024 4:10:22 AM
Completed RefreshBorrowerLoanForDefault : 6/6/2024 4:10:22 AM
Beginning RefreshBorrowerLoanForDCVR : 6/6/2024 4:10:22 AM
Completed RefreshBorrowerLoanForDCVR : 6/6/2024 4:10:23 AM
Importer: DealerLoansImporter Ended : 6/6/2024 4:10:24 AM
Importer: DealerLoansImporter Elapsed Time: 00:00:07.4098788
****************************************************************************************************
****************************************************************************************************
Importer: AdvantageDimensionImporter Started : 6/6/2024 4:10:24 AM
Begin Reading Data File: \\nao.global.gmacfs.com\AllyApps\Ipartners.Pd\Facts_to_Carrs\ADV_DIM_20240606030006.DAT : 6/6/2024 4:10:24 AM
End Reading Data File: \\nao.global.gmacfs.com\AllyApps\Ipartners.Pd\Facts_to_Carrs\ADV_DIM_20240606030006.DAT : 6/6/2024 4:10:24 AM
Beginning AdvantageDimension truncate table : 6/6/2024 4:10:24 AM
Completed AdvantageDimension truncate table : 6/6/2024 4:10:24 AM
Begin Loading Database : 6/6/2024 4:10:24 AM
411 Total Records Inserted : 6/6/2024 4:10:24 AM
Beginning refreshing Dimensions : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshFranchiseFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshFranchiseFromDimension : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshDealerCommercialPrivilegesTypeFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshDealerCommercialPrivilegesTypeFromDimension : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshBACManufacturerType : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshBACManufacturerType : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshStateFromDimensions : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshStateFromDimensions : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshFormOfBusinessTypeFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshFormOfBusinessTypeFromDimension : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshTAATypeFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshTAATypeFromDimension : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshGuarantorAssociationTypeFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshGuarantorAssociationTypeFromDimension : 6/6/2024 4:10:24 AM
Beginning FetchNewDealerStatusAdvantage : 6/6/2024 4:10:24 AM
Completed FetchNewDealerStatusAdvantage : 6/6/2024 4:10:24 AM
Beginning FetchDeletedDealerStatusAdvantage : 6/6/2024 4:10:24 AM
Completed FetchDeletedDealerStatusAdvantage : 6/6/2024 4:10:24 AM
Beginning FetchDealerStatusAdvantageChanges : 6/6/2024 4:10:24 AM
Completed FetchDealerStatusAdvantageChanges : 6/6/2024 4:10:25 AM
Completed refreshing Dimensions : 6/6/2024 4:10:25 AM
Importer: AdvantageDimensionImporter Ended : 6/6/2024 4:10:25 AM
Importer: AdvantageDimensionImporter Elapsed Time: 00:00:00.9732853
****************************************************************************************************
****************************************************************************************************
Importer: SmartAuctionImporter Started : 6/6/2024 4:10:25 AM
Importer: SmartAuctionImporter Ended : 6/6/2024 4:10:25 AM
Importer: SmartAuctionImporter Elapsed Time: 00:00:00.0312581
****************************************************************************************************