All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi Splunk community, is there documentation that provides step-by-step instructions on how I can ingest data and logs from my MongoDB Atlas cluster into Splunk using the API?
Is there a CSS element that can help move the "really bad" button so it is on the same line as the rest?
I am trying to write a Splunk search to pull what rules a particular user is hitting. This search is helping with that, BUT everything is coming through as a urlrulelabel. When I move apprulelabel to the start of the line, everything comes through as an apprulelabel. When I dive into the events, I see there are other rules showing, but they aren't populating in the statistics table. I would like to have each rule come through as its own row.

index=zscaler sourcetype=zscalernss-web user=*
| eval rule_type=case(isnotnull(urlrulelabel), "urlrulelabel", isnotnull(apprulelabel), "apprulelabel", isnotnull(rulelabel), "rulelabel", true(), "unknown")
| eval rule=coalesce(apprulelabel, urlrulelabel, rulelabel)
| stats count by rule, rule_type
| rename rule as Rule, rule_type as "Type of Rule", count as "Hit Count"
| sort - "Hit Count"

Thank you in advance
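One hedged way to count each rule type independently (a sketch, assuming a single event can carry more than one of the three label fields, which would explain why coalesce only ever surfaces the first non-null one) is to expand every label into its own row before the stats:

```spl
index=zscaler sourcetype=zscalernss-web user=*
| eval rules=mvappend(
      if(isnotnull(urlrulelabel), "urlrulelabel=" . urlrulelabel, null()),
      if(isnotnull(apprulelabel), "apprulelabel=" . apprulelabel, null()),
      if(isnotnull(rulelabel),    "rulelabel=" . rulelabel, null()))
| mvexpand rules
| rex field=rules "^(?<rule_type>[^=]+)=(?<rule>.+)"
| stats count by rule, rule_type
| rename rule as Rule, rule_type as "Type of Rule", count as "Hit Count"
| sort - "Hit Count"
```

mvexpand turns each label present on an event into a separate row, so one event can contribute a hit to each rule type it matched.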
I am analyzing some .csvs which have a "date" field present. The .csvs are indexed, but the index time is pretty irrelevant; the "date" field, however, is important. I am trying to create a new field which would represent the first day of the week relative to the "date" field in my data. Ultimately I am going to create some charts over time which will use this new field. Below is an example of my desired outcome - from the date present as a field in the .csv, create a new field (Summary Date) which shows the date of Monday for that week.

Date (present in .csv)    Summary Date (new field)
6/15/2024                 6/10/2024
6/16/2024                 6/10/2024
6/18/2024                 6/17/2024

* Realizing there may be more than one way to skin the cat, ultimately I am looking to group results by week in Line Charts. The query will be very basic, something like this:

<base search> | chart sum(results) by "Summary Date"

And I want the date shown on the X-axis to be the first day (Monday in my case) of every week. Maybe there is an easier solution than creating a new "Summary Date" field via an eval expression, but that is where my head goes first. Any suggestions are appreciated!
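One way to derive the Monday of each week (a sketch, assuming the field is named "date" and formatted %m/%d/%Y) is relative_time() with the @w1 snap modifier, which snaps an epoch time back to the most recent Monday:

```spl
<base search>
| eval date_epoch=strptime(date, "%m/%d/%Y")
| eval "Summary Date"=strftime(relative_time(date_epoch, "@w1"), "%m/%d/%Y")
| chart sum(results) by "Summary Date"
```

@w1 means "snap to the week, with Monday as day 1" (@w0 would snap to Sunday), so 6/16/2024 (a Sunday) snaps back to 6/10/2024, matching the table above.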
How do I add a new field and set the value to seven days ago from the current date, snapped to the beginning of that day? I know the time-modifier syntax would be "earliest=-7d@d", but I am unsure whether I should use the eval command to add the field, and what the specific syntax is. Thanks.
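earliest=-7d@d is search time-range syntax and cannot be used inside eval, but the relative_time() eval function accepts the same modifier string. A sketch (the field names here are arbitrary):

```spl
| eval seven_days_ago=relative_time(now(), "-7d@d")
| eval seven_days_ago_readable=strftime(seven_days_ago, "%Y-%m-%d %H:%M:%S")
```

relative_time() returns epoch seconds, so the second eval is only needed if a human-readable value is wanted.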
Greetings to you!! I have a file with the following content:

My city is very good
your city is also very good
but
but
but
but

Now, I want only three lines to be indexed in Splunk:

My city is very good
your city is also very good
but

Since "but" appears multiple times, we want to keep only one "but" out of the many. I want to write props (or any other kind of configuration) so that I can achieve this result. Kindly help!!
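For what it's worth, Splunk has no index-time "deduplicate repeated lines" setting, and SEDCMD only rewrites text within a single event, so a props-only approach works only if the repeated lines land inside the same event. A sketch under that assumption, with a hypothetical sourcetype name:

```ini
# props.conf (hypothetical sourcetype; assumes the duplicate "but" lines
# are adjacent and end up inside one event, since SEDCMD cannot operate
# across event boundaries)
[my_city_file]
SHOULD_LINEMERGE = true
# collapse two or more consecutive "but" lines into a single one
SEDCMD-collapse_but = s/(but[\r\n]+){2,}/but\n/g
```

If each line must remain its own event, the deduplication would have to happen before Splunk (e.g. in a scripted input) rather than in props/transforms.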
I have two query tables.

table 1:

index="k8s_main" namespace="app02013" "EConcessionItemProcessingStartedHandler.createRma PH successfully created RMA" NOT [search index="k8s_main" namespace="app02013" "NonCustomerOrderShippingLabelGeneratedEventsUtil.processShippingLabelEvent Successfully published" | fields LPN]
| rex "LPN\": \"(?<LPN>[^,]+)\"\,"
| rex "location\": \"(?<location>[^,]+)\"\,"
| rex "orderNumber\": \"(?<orderNumber>[^,]+)\"\,"
| rex "orderLineId\": \"(?<orderLineId>[^,]+)\"\,"
| dedup orderLineId
| eval LPN = replace(LPN, "\\[|\\]", "")
| eval location = replace(location, "\\[|\\]", "")
| eval orderNumber = replace(orderNumber, "\\[|\\]", "")
| eval orderLineId = replace(orderLineId, "\\[|\\]", "")
| table LPN location orderNumber orderLineId

table 2:

index="k8s_main" namespace="app02013" "Published successfully event=[order-events-avro / com.nordstrom.customer.event.OrderLineReturnReceived]" ECONCESSION
| rex "orderLineId\": \"(?<orderLineId>[^,]+)\"\,"
| rex "orderNumber\": \"(?<orderNumber>[^,]+)\"\,"
| dedup orderLineId
| eval orderNumber = replace(orderNumber, "\"", "")
| eval orderLineId = replace(orderLineId, "\"", "")
| table orderNumber orderLineId

here is my join query:

index="k8s_main" namespace="app02013" "EConcessionItemProcessingStartedHandler.createRma PH successfully created RMA" NOT [search index="k8s_main" namespace="app02013" "NonCustomerOrderShippingLabelGeneratedEventsUtil.processShippingLabelEvent Successfully published" | fields LPN]
| rex "LPN\": \"(?<LPN>[^,]+)\"\,"
| rex "location\": \"(?<location>[^,]+)\"\,"
| rex "orderNumber\": \"(?<orderNumber>[^,]+)\"\,"
| rex "orderLineId\": \"(?<orderLineId>[^,]+)\"\,"
| dedup orderLineId
| eval LPN = replace(LPN, "\\[|\\]", "")
| eval location = replace(location, "\\[|\\]", "")
| eval orderNumber = replace(orderNumber, "\\[|\\]", "")
| eval orderLineId = replace(orderLineId, "\\[|\\]", "")
| table LPN location orderNumber orderLineId
| join left=L right=R where L.orderLineId =
R.orderLineId [search index="k8s_main" namespace="app02013" "Published successfully event=[order-events-avro / com.nordstrom.customer.event.OrderLineReturnReceived]" ECONCESSION
| rex "orderLineId\": \"(?<orderLineId>[^,]+)\"\,"
| rex "orderNumber\": \"(?<orderNumber>[^,]+)\"\,"
| dedup orderLineId
| eval orderNumber = replace(orderNumber, "\"", "")
| eval orderLineId = replace(orderLineId, "\"", "")
| table orderNumber orderLineId]

Each table on its own returns unique rows, but the join query above returns fewer rows. Please help me find the problem.
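Not a definitive diagnosis, but join's subsearch is subject to default limits (a result-count cap and a runtime limit) that silently truncate rows, which commonly makes joins return less data than either side alone. A common workaround is to drop join entirely, retrieve both event sets in one search, and correlate with stats; a sketch along these lines:

```spl
index="k8s_main" namespace="app02013"
    ("EConcessionItemProcessingStartedHandler.createRma PH successfully created RMA"
     OR "Published successfully event=[order-events-avro / com.nordstrom.customer.event.OrderLineReturnReceived]")
| rex "orderLineId\": \"(?<orderLineId>[^,]+)\"\,"
| rex "orderNumber\": \"(?<orderNumber>[^,]+)\"\,"
| rex "LPN\": \"(?<LPN>[^,]+)\"\,"
| rex "location\": \"(?<location>[^,]+)\"\,"
| stats values(LPN) as LPN values(location) as location values(orderNumber) as orderNumber by orderLineId
```

This sketch omits the NOT-subsearch exclusion and the bracket cleanup from the original for brevity; those would still need to be folded back in.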
I'm trying to allow users to have a limited search against indexes they don't have access to. This might very well be the problem (and maybe it's not possible), but I'm hoping the solution below should work and I'm simply missing a user capability/permission (unrelated to the index access) somewhere.

Set up a saved search (using variables) to run as the owner (user 'A', who does have access to the indexes). Set up a dashboard to receive those variables and pass them along to a search panel using a search similar to '| savedsearch searchname var1=$v1$ var2=$v2$'. The dashboard works when running as the user with access to the indexes (user 'A'), so the search and variable passthrough appear to be working. When I run as a test user (with only the default 'user' Splunk capabilities and no index access) I get no results.

Is what I am trying to accomplish possible? If it is, does anyone have any guidance on what I might be doing wrong? I asked this in the community Slack as well. I'm trying to avoid a summary index if possible, as the long-term goal is to have multiple users (without index permissions) be able to run the search specific to them, without allowing each user access to all other users' searches. An example scenario is viewing a user's web history as seen from a firewall or secure web gateway (allows vs. blocks), limiting the search to the logged-in user ($env:user$). This could also be used by a support center (a group of users) doing first-level troubleshooting who might not need access to all the logs available in an index.
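One setting worth checking (a sketch, not a guaranteed fix): a saved search only carries the owner's privileges when it is dispatched as the owner, which is controlled per stanza:

```ini
# savedsearches.conf (shared at app level with the test users' role)
[searchname]
dispatchAs = owner   ; run with the owner's (user A's) index access
```

Caveat: dispatchAs governs scheduled/REST dispatches, and the | savedsearch command may still execute inline with the invoking user's permissions, so it is worth testing whether a scheduled run plus loadjob behaves differently from the inline | savedsearch invocation.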
Hi team, I am not getting the event break as required. My requirement is to break events from a log file; each event starts with "Importer:" and ends with "Elapsed Time:". Below is the config I did. Please suggest any change to the props config, or whether I am good to go.

SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)\S+\s\S+\s\W+
MAX_TIMESTAMP_LOOKAHEAD=-1
TIME_PREFIX=^\*Importer:\s+
TIME_FORMAT=%m/%d/%Y %I:%M:%S %p
EVENT_BREAKER = ([\n\r]*Elapsed Time:\.)
EVENT_BREAKER_ENABLE = true
KV_MODE=none

sample log:

Importer: DealerLoansImporter Started : 6/6/2024 4:10:16 AM
Begin Reading Data File: \\nao.global.gmacfs.com\AllyApps\Ipartners.Pd\Facts_to_Carrs\GC01RD21.DLR_LOAN_20240605223729.DAT : 6/6/2024 4:10:16 AM
End Reading Data File: \\nao.global.gmacfs.com\AllyApps\Ipartners.Pd\Facts_to_Carrs\GC01RD21.DLR_LOAN_20240605223729.DAT : 6/6/2024 4:10:16 AM
Beginning Dealer Loans truncate table : 6/6/2024 4:10:16 AM
Completed Dealer Loans truncate table : 6/6/2024 4:10:16 AM
Begin Loading Database : 6/6/2024 4:10:16 AM
1757 Total Records Inserted : 6/6/2024 4:10:17 AM
Beginning RefreshDealerLoansMonthEnd : 6/6/2024 4:10:17 AM
Completed RefreshDealerLoansMonthEnd : 6/6/2024 4:10:18 AM
Beginning RefreshDealerLoan : 6/6/2024 4:10:18 AM
Completed RefreshDealerLoan : 6/6/2024 4:10:21 AM
Beginning Adv_RefreshProposalCreditLineSummaryFromDealerLoan : 6/6/2024 4:10:21 AM
Completed Adv_RefreshProposalCreditLineSummaryFromDealerLoan : 6/6/2024 4:10:22 AM
Beginning RefreshBorrowerLoanForDefault : 6/6/2024 4:10:22 AM
Completed RefreshBorrowerLoanForDefault : 6/6/2024 4:10:22 AM
Beginning RefreshBorrowerLoanForDCVR : 6/6/2024 4:10:22 AM
Completed RefreshBorrowerLoanForDCVR : 6/6/2024 4:10:23 AM
Importer: DealerLoansImporter Ended : 6/6/2024 4:10:24 AM
Importer: DealerLoansImporter Elapsed Time: 00:00:07.4098788
****************************************************************************************************
****************************************************************************************************
Importer: AdvantageDimensionImporter Started : 6/6/2024 4:10:24 AM
Begin Reading Data File: \\nao.global.gmacfs.com\AllyApps\Ipartners.Pd\Facts_to_Carrs\ADV_DIM_20240606030006.DAT : 6/6/2024 4:10:24 AM
End Reading Data File: \\nao.global.gmacfs.com\AllyApps\Ipartners.Pd\Facts_to_Carrs\ADV_DIM_20240606030006.DAT : 6/6/2024 4:10:24 AM
Beginning AdvantageDimension truncate table : 6/6/2024 4:10:24 AM
Completed AdvantageDimension truncate table : 6/6/2024 4:10:24 AM
Begin Loading Database : 6/6/2024 4:10:24 AM
411 Total Records Inserted : 6/6/2024 4:10:24 AM
Beginning refreshing Dimensions : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshFranchiseFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshFranchiseFromDimension : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshDealerCommercialPrivilegesTypeFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshDealerCommercialPrivilegesTypeFromDimension : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshBACManufacturerType : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshBACManufacturerType : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshStateFromDimensions : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshStateFromDimensions : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshFormOfBusinessTypeFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshFormOfBusinessTypeFromDimension : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshTAATypeFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshTAATypeFromDimension : 6/6/2024 4:10:24 AM
Beginning Refreshing Adv_RefreshGuarantorAssociationTypeFromDimension : 6/6/2024 4:10:24 AM
Completed Refreshing Adv_RefreshGuarantorAssociationTypeFromDimension : 6/6/2024 4:10:24 AM
Beginning FetchNewDealerStatusAdvantage : 6/6/2024 4:10:24 AM
Completed FetchNewDealerStatusAdvantage : 6/6/2024 4:10:24 AM
Beginning FetchDeletedDealerStatusAdvantage : 6/6/2024 4:10:24 AM
Completed FetchDeletedDealerStatusAdvantage : 6/6/2024 4:10:24 AM
Beginning FetchDealerStatusAdvantageChanges : 6/6/2024 4:10:24 AM
Completed FetchDealerStatusAdvantageChanges : 6/6/2024 4:10:25 AM
Completed refreshing Dimensions : 6/6/2024 4:10:25 AM
Importer: AdvantageDimensionImporter Ended : 6/6/2024 4:10:25 AM
Importer: AdvantageDimensionImporter Elapsed Time: 00:00:00.9732853
****************************************************************************************************
****************************************************************************************************
Importer: SmartAuctionImporter Started : 6/6/2024 4:10:25 AM
Importer: SmartAuctionImporter Ended : 6/6/2024 4:10:25 AM
Importer: SmartAuctionImporter Elapsed Time: 00:00:00.0312581
****************************************************************************************************
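For comparison, here is a props.conf sketch that breaks just before the start of each importer block instead of after "Elapsed Time:" (hypothetical sourcetype name; note that EVENT_BREAKER is only honored by universal forwarders, and in this sample the timestamp follows "Started :" rather than a leading "*Importer:"):

```ini
[importer_log]
SHOULD_LINEMERGE = false
; break before each "Importer: ... Started" line, consuming the asterisk
; separator lines between blocks as part of the breaker
LINE_BREAKER = ([\r\n]+(?:\*+[\r\n]+)*)(?=Importer:\s+\S+\s+Started)
TIME_PREFIX = Started\s*:\s*
TIME_FORMAT = %m/%d/%Y %I:%M:%S %p
MAX_TIMESTAMP_LOOKAHEAD = 25
```

This is a sketch to validate against real data (e.g. with btool and a test index), not a drop-in replacement.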
This is the log data (screenshot omitted), and I want a report like this (screenshot omitted). My current query is:

index="webmethods_prd" source="/apps/WebMethods/IntegrationServer/instances/default/logs/DFO.log"
| eval timestamp=strftime(_time, "%F")
| stats values(B2BUnknownTrxCount) by timestamp

It is giving a report like this (screenshot omitted). I need to add the time in hh:mm to the chart. Please help me update my query.
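To get hour-and-minute granularity, the %F date format just needs the time components added (a sketch, keeping the rest of the query unchanged):

```spl
index="webmethods_prd" source="/apps/WebMethods/IntegrationServer/instances/default/logs/DFO.log"
| eval timestamp=strftime(_time, "%F %H:%M")
| stats values(B2BUnknownTrxCount) by timestamp
```

Alternatively, for a native time-series chart, `| timechart span=1m values(B2BUnknownTrxCount)` lets Splunk bucket and label the X-axis itself.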
Hello, I have been asked to optimize this logic because it is taking too long to run. I am not sure how else I can write it to make it run faster. It's not throwing any errors; it just takes a long time to run. Any help would be highly appreciated. Thanks!

index IN (indexes) sourcetype=xmlwineventlog sAMAccountName IN (_x*, x_*, lx*, hh*)
| lookup mas_pam_eventcode.csv event_code AS EventCode OUTPUT action
| stats count(eval(action=="login_failure")) as failure_count, count(eval(action=="lockout")) as lockout_count by sAMAccountName
| where failure_count >= 3 OR lockout_count > 0
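One hedged optimization: the search currently pulls every event for those accounts and only filters by action after the lookup runs. Pushing the interesting EventCodes into the base search via an inputlookup subsearch lets the indexers discard irrelevant events early. A sketch, assuming mas_pam_eventcode.csv is small enough to expand into a filter:

```spl
index IN (indexes) sourcetype=xmlwineventlog sAMAccountName IN (_x*, x_*, lx*, hh*)
    [| inputlookup mas_pam_eventcode.csv | fields event_code | rename event_code as EventCode ]
| lookup mas_pam_eventcode.csv event_code AS EventCode OUTPUT action
| stats count(eval(action=="login_failure")) as failure_count,
        count(eval(action=="lockout")) as lockout_count by sAMAccountName
| where failure_count >= 3 OR lockout_count > 0
```

The subsearch expands to (EventCode=... OR EventCode=...), so only events with those codes are retrieved at all; the lookup then only has to tag the survivors.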
Hi, I have the results of an append operation as follows:

ID  Col3  col4  col5
a               abc
a   abc   No
a   xyz   Yes
b               abc
b               xyz
b   xyz   No
b   fgh   Yes
b   abc   No
f               abc
f   abc   No
f   xyz   No
i               abc
i               xyz
i   xyz   Yes
i   abc   No

The result from the first table and the result from the second should be merged respectively. I cannot use | stats values(col1) values(col2) values(col3) by ID because I cannot lose the distinction between "No" and "Yes" for Col3. I want to create a result as follows:

ID  Col3  col4  col5
a   abc   No    abc
a   xyz   Yes
b   xyz   No    xyz
b   fgh   Yes
b   abc   No    abc
f   abc   No    abc
f   xyz   No
i   xyz   Yes   xyz
i   abc   No    abc

I think something like SQL's full join would do the trick, but I am totally stuck.
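A sketch of a full-join-style merge that keeps the Yes/No distinction: key each row on whichever value it carries (Col3 from the second table, col5 from the first), then aggregate by ID plus that key (field names assumed as shown above):

```spl
<first search>
| append [ <second search> ]
| eval join_key=coalesce(Col3, col5)
| stats values(col4) as col4 values(col5) as col5 by ID, join_key
| rename join_key as Col3
| table ID Col3 col4 col5
```

Rows from both sides that share an ID and value collapse into one row (e.g. a/abc gets both No and abc), while unmatched rows survive with blanks, which is the full-join behavior wanted here.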
Hi all,

I have an add-on plugin that uses a REST API to obtain specific logs; each generated event has fixed values for both source and sourcetype. Some customers use props.conf and transforms.conf to change the value of the source according to a particular column within an event; for instance, if the service is 'a', the source changes to 'service_a'; if the service is 'b', it changes to 'service_b'. The current problem is that obtaining the logs works fine, and content can always be found using sourcetype. But when searching with the transformed source, events cannot be found, even though events with source 'service_a' and 'service_b' are visible.

How should I adjust the add-on, or how should I configure local settings, so that I can search using source?

Regards,
Emily
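Not a definitive diagnosis, but a common cause of exactly this symptom is a transform that writes MetaData:Source without the required source:: prefix in FORMAT, which changes the displayed source while leaving the indexed metadata term unsearchable. A sketch of the expected shape (hypothetical stanza names):

```ini
; transforms.conf
[set_source_service_a]
REGEX    = "service"\s*:\s*"a"
DEST_KEY = MetaData:Source
FORMAT   = source::service_a   ; the source:: prefix is required

; props.conf -- must apply at parse time, i.e. on the indexer or heavy forwarder
[my_addon_sourcetype]
TRANSFORMS-set_source = set_source_service_a
```

It is worth comparing the customers' transforms against this shape, and confirming the transform actually runs where parsing happens; already-indexed events keep their old source either way.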
I want to extract Jan from Jan-24.
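Assuming the value lives in a field (hypothetically named month_field here), splitting on the hyphen does it:

```spl
| eval month=mvindex(split(month_field, "-"), 0)
```

An equivalent alternative is `| rex field=month_field "^(?<month>[A-Za-z]{3})"`, which grabs the leading three letters.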
Hello, is it possible to define the retention duration of logs (hot, warm, and cold)? If yes, how can this be done? Or do we only have the option to define frozenTimePeriodInSecs?
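Retention in Splunk is configured per index, and the hot/warm/cold tiers roll on size and bucket counts rather than on separate per-tier time settings; overall time-based retention is indeed frozenTimePeriodInSecs. A sketch with a hypothetical index name and illustrative sizes:

```ini
; indexes.conf
[my_index]
frozenTimePeriodInSecs = 7776000   ; ~90 days total retention, then freeze (deleted by default)
maxTotalDataSizeMB = 500000        ; size cap; whichever limit is hit first triggers freezing
maxWarmDBCount = 300               ; warm buckets beyond this count roll to cold
homePath.maxDataSizeMB = 200000    ; cap on the hot+warm (homePath) volume
```

So the hot-to-warm and warm-to-cold transitions are tuned via bucket counts and path sizes, while frozenTimePeriodInSecs governs how old data may get before leaving cold.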
Hey, not sure if anyone can help: I am trying to sort the columns in numerical order (screenshot omitted). Thanks in advance.
Hello, I have a date 2024-06. How can I convert it to 06/2024? And how can I convert 2023/Q4 to Q4/2023?
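Both conversions are just swaps of the two halves around the separator, so plain string functions work without any timestamp parsing (field names here are hypothetical):

```spl
| eval ym_parts=split(ym_field, "-"),      ym_out=mvindex(ym_parts, 1) . "/" . mvindex(ym_parts, 0)
| eval q_parts=split(quarter_field, "/"),  q_out=mvindex(q_parts, 1) . "/" . mvindex(q_parts, 0)
```

For ym_field="2024-06" this yields ym_out="06/2024", and for quarter_field="2023/Q4" it yields q_out="Q4/2023".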
My application is a backend web service. All events in a request contain the same value for a "req_id" field. I have a use-case where I want to look at all the events that occurred for requests, but only when a particular log line is present. My first query would be this:

index="myapp" AND "some log line"
| rex field=_raw "req_id=(?<req_id>[a-zA-Z0-9]*)"

And then my second query would be:

index="myapp" AND "$req_id"
| transaction req_id

where the $req_id would be fed in by the first query. How do I join these two queries? Thanks in advance!
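The usual pattern is to make the first query a subsearch of the second. A sketch (renaming the field to "search" makes the subsearch emit the values as bare search terms, so the outer search matches them in _raw without needing req_id extracted first; note the default subsearch caps of roughly 10,000 results and a runtime limit):

```spl
index="myapp"
    [ search index="myapp" "some log line"
      | rex field=_raw "req_id=(?<req_id>[a-zA-Z0-9]*)"
      | dedup req_id
      | fields req_id
      | rename req_id as search
      | format ]
| rex field=_raw "req_id=(?<req_id>[a-zA-Z0-9]*)"
| transaction req_id
```

If req_id cardinality may exceed the subsearch limits, a single combined search with `stats` (or `eventstats`) over req_id is the more scalable alternative.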
my SAML Response to Splunk.   <?xml version="1.0" encoding="UTF-8" standalone="no"?><samlp:Response xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol" Destination="http://RTNB336:8000/saml/acs" ID="_4c16f9be1c813c774f2f9111fd5602f6" InResponseTo="RTNB336.21.0882C4AC-681F-4648-AD0F-FDD9F4BE114B" IssueInstant="2024-06-20T01:56:14.199Z" Version="2.0"><saml2:Issuer xmlns:saml2="urn:oasis:names:tc:SAML:2.0:assertion">http://hive.dreamidaas.com</saml2:Issuer><ds:Signature xmlns:ds="http://www.w3.org/2000/09/xmldsig#"><ds:SignedInfo><ds:CanonicalizationMethod Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/><ds:SignatureMethod Algorithm="http://www.w3.org/2001/04/xmldsig-more#rsa-sha256"/><ds:Reference URI="#_4c16f9be1c813c774f2f9111fd5602f6"><ds:Transforms><ds:Transform Algorithm="http://www.w3.org/2000/09/xmldsig#enveloped-signature"/><ds:Transform Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/></ds:Transforms><ds:DigestMethod Algorithm="http://www.w3.org/2001/04/xmlenc#sha256"/><ds:DigestValue>Wjlp0IBLeluYep7QMphL/ZBkVsDqxbrFcgSDFiFxQBo=</ds:DigestValue></ds:Reference></ds:SignedInfo><ds:SignatureValue>Y0Lp7OR2BWIie+F60hJUhNdOLKhWlXnjLyD0Y7Ut1lPIYfL9uoClcQA98Ge961M7FjrC/uDA8yxGYKvApU4VOYzy7kLM0wbxFKUVXAuPAl5of0WWrMV8QMSWfCq8/ensPzlzsqg84tf86UgMZ2PodD6WOM9SIIW+izBPOP3emuv2c+UrvR2eyp1s+ItWn0AUB+0R0l+iqd+sNE/Gb+l9THlJYm68yLr2DY0nT66dOLKS3Q3jnMox6xrzsSnwaF6+H+dSnvd5YeBIMyjTC1bF6GjQpdudTNz8162TvtJjvAcTUOwhUmLyY4ytTvL+lHKOsDh57wZenvB4gVYzoF6T+A==</ds:SignatureValue><ds:KeyInfo><ds:X509Data><ds:X509Certificate>MIIDtDCCApygAwIBAgIKJxHdhEoMRRD/JjANBgkqhkiG9w0BAQsFADBCMQswCQYDVQQGEwJLUjEW MBQGA1UECgwNRHJlYW1TZWN1cml0eTEMMAoGA1UECwwDU1NPMQ0wCwYDVQQDDARST09UMB4XDTIy MDIyMzIzNTY1NFoXDTMyMDIyMzIzNTY1NFowTzELMAkGA1UEBhMCS1IxFjAUBgNVBAoMDURyZWFt U2VjdXJpdHkxDDAKBgNVBAsMA1NTTzEaMBgGA1UEAwwRTUFHSUNfU1NPX0lEUF9TaWcwggEiMA0G CSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCGEA5RIOlCH/xJX5qnAQRixJfuUhv2dBoGyCjO1qbJ GuJh6lCF7mwsJbS+PStFrFvXBrfFt8S2QU7hndK5aj3f83IJiiv6y+a26/4xNf19sp6AtafAmWr9 
kkI5AH51/9l8ypzf67OAUfrJxFPW6ZKgWiGp5yjrensl1IKxwP0joxUQXISI+epu07XpdWF2SJQ7 rVRNPZUP6sA+lNQsFDznN7moWFcU+UyrTJHDkgj/2qw4QvucNBY7Hj/bC/6KX1d0XSKfvQCfI4gu Gd/4FL1ApnyTvZ/tnbcbl420NWbKgtn19Q4ZIqhj10ruTzVn1YOpwqBGP/NlKDVmKOCem7tvAgMB AAGjgZ4wgZswagYDVR0jBGMwYYAULffLTJtBlWrpR2I1Coc4OG3funyhRqREMEIxCzAJBgNVBAYT AktSMRYwFAYDVQQKDA1EcmVhbVNlY3VyaXR5MQwwCgYDVQQLDANTU08xDTALBgNVBAMMBFJPT1SC AQEwHQYDVR0OBBYEFPvKSaxuZMLnM8ZqaFFkw0xeDp8CMA4GA1UdDwEB/wQEAwIGwDANBgkqhkiG 9w0BAQsFAAOCAQEAflCL2e6ZHxGDtK6PSjrGrhrkJ4XYHvKGnEWWajhL0RqqDoQhEAOAzEyGMcpQ zWBF6etI+uDlnr7EfPCAojvwfcQCEGK4gCHskHHDkXNz5MAC2sSHqVEn/ChAO9nRnTRo4EZlFVgH SXIDJqeWZd2wJ86u9cqA6XTyB/KuVwnTD2U/1W87ERpKlXtDNnC5hB3cp1ONaW+0+Fnn4NdSgMQd SwteL/CtU+q/gcYt1izy1RGdcDRR11+nmfkZT6UYCyKj0ea0yc4SbRjGIEOgJExDJBL8eyc4X2D3 4k6B4rhPzx+vF1OB1esHB69T6Vlo+iUM+XtoLFUOhloNiDzXq+2Hgg==</ds:X509Certificate></ds:X509Data></ds:KeyInfo></ds:Signature><saml2p:Status xmlns:saml2p="urn:oasis:names:tc:SAML:2.0:protocol"><saml2p:StatusCode Value="urn:oasis:names:tc:SAML:2.0:status:Success"/></saml2p:Status><saml2:Assertion xmlns:saml2="urn:oasis:names:tc:SAML:2.0:assertion" ID="_93ae10442348482eb51b04051c58267a" IssueInstant="2024-06-20T01:56:14.199Z" Version="2.0"><saml2:Issuer>http://hive.dreamidaas.com</saml2:Issuer><saml2:Subject><saml2:NameID Format="urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress" NameQualifier="http://hive.dreamidaas.com" SPNameQualifier="RTNB336">rladnrud@devdreamsso.site</saml2:NameID><saml2:SubjectConfirmation Method="urn:oasis:names:tc:SAML:2.0:cm:bearer"><saml2:SubjectConfirmationData InResponseTo="RTNB336.21.0882C4AC-681F-4648-AD0F-FDD9F4BE114B" NotOnOrAfter="2024-06-20T02:01:14.199Z" Recipient="http://RTNB336:8000/saml/acs"/></saml2:SubjectConfirmation></saml2:Subject><saml2:Conditions NotBefore="2024-06-20T01:56:14.199Z" NotOnOrAfter="2024-06-20T02:01:14.199Z"><saml2:AudienceRestriction><saml2:Audience>RTNB336</saml2:Audience></saml2:AudienceRestriction></saml2:Conditions><saml2:AuthnStatement 
AuthnInstant="2024-06-20T01:55:52.000Z" SessionIndex="_8028c81d727dcc5a423afa58c645b8c5"><saml2:AuthnContext><saml2:AuthnContextClassRef>urn:oasis:names:tc:SAML:2.0:ac:classes:InternetProtocol</saml2:AuthnContextClassRef></saml2:AuthnContext></saml2:AuthnStatement></saml2:Assertion></samlp:Response>

There's no problem on my IdP side. I don't know why Splunk can't verify the signature properly.
It says, "If you save your IdP certificate under $SPLUNK_HOME/etc/auth/idpCerts, please leave it blank." If you don't type idpCert.pem, it won't save. I don't think that's the cause, but what more information do I need to provide to make this error clearer? How should I solve this problem? I searched for the same error and found that I may need to register IdP certificate chains, but that's not a requirement, is it?

Splunk 9.2.1
Windows 11
Our company IdP