Security

Constant re-authentication after SSO migration

Communicator

After our SSO migration, users have reported instances where a single tab re-authenticates, which then triggers a cascading re-authentication across all open tabs.

This wouldn't be so bad if it happened once, but for some reason the re-authentication cycles between the tabs and repeats continually. Here is a sample of the events that show up in the audit log:

5/15/19 9:52:05.198 AM  127.0.0.1 - 111111 [15/May/2019:15:52:05.198 +0000] "GET /en-US/splunkd/__raw/services/authentication/users/111111?output_mode=json&_=111222333 HTTP/1.0" 200 1904 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.111.111.111 Safari/537.36" - 111aaa222bbb333ccc 0ms
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd_ui_access.log sourcetype =   splunkd_ui_access

5/15/19 9:52:05.160 AM  127.0.0.1 - 111111 [15/May/2019:15:52:05.160 +0000] "GET /en-US/splunkd/__raw/services/authorization/roles?output_mode=json&count=0&_=111222333 HTTP/1.0" 200 6593 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.111.111.111 Safari/537.36" - 111aaa222bbb333ccc 0ms
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd_ui_access.log sourcetype =   splunkd_ui_access

5/15/19 9:52:05.039 AM  Audit:[timestamp=05-15-2019 15:52:05.039, user=111111, action=change_authentication, info=denied ][n/a]
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    audittrail sourcetype = audittrail

5/15/19 9:51:57.179 AM  05-15-2019 15:51:57.179 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/servicesNS/111111/aaabbbcccddd/search/jobs" failed CSRF validation -- expected "111aaa222bbb333ccc444ddd", but instead cookie had "111aaa222bbb333ccc444ddd" and header had "555eee666fff777ggg888hhh"
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd

5/15/19 9:51:57.123 AM  05-15-2019 15:51:57.123 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/servicesNS/111111/aaabbbcccddd/search/jobs" failed CSRF validation -- expected "111aaa222bbb333ccc444ddd", but instead cookie had "111aaa222bbb333ccc444ddd" and header had "555eee666fff777ggg888hhh"
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd

5/15/19 9:51:57.071 AM  05-15-2019 15:51:57.071 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/servicesNS/111111/aaabbbcccddd/search/jobs" failed CSRF validation -- expected "111aaa222bbb333ccc444ddd", but instead cookie had "111aaa222bbb333ccc444ddd" and header had "555eee666fff777ggg888hhh"
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd

5/15/19 9:49:49.400 AM  Audit:[timestamp=05-15-2019 15:49:49.400, user=111111, action=change_authentication, info=denied ][n/a]
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    audittrail sourcetype = audittrail

5/15/19 9:49:42.259 AM  05-15-2019 15:49:42.259 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/services/search/jobs/111111__111111__aaabbbcccddd__search3_111111111.145588_0000-000-000-000/control" failed CSRF validation -- expected "999iii000jjj111kkk222lll", but instead cookie had "999iii000jjj111kkk222lll" and header had "333mmm444nnn555ooo666ppp"
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd

5/15/19 9:49:42.252 AM  05-15-2019 15:49:42.252 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/services/search/jobs/111111__111111__aaabbbcccddd__search9_222222222.145594_0000-000-000-000/control" failed CSRF validation -- expected "999iii000jjj111kkk222lll", but instead cookie had "999iii000jjj111kkk222lll" and header had "333mmm444nnn555ooo666ppp"
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd

This cycle of Change Auth -> Fail CSRF Validation -> Change Auth just repeats until the user closes all tabs and starts over. Somehow one of the tabs gets a new token and stores it in the cookie without telling the others. Then, instead of reusing the updated token, the other tabs each try to get their own token and store it in the cookie.
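
To illustrate what I think is happening, here's a sketch (this is not Splunk's actual client code; the cookie and token names are made up):

```typescript
// Parse a token out of a Cookie-style string. Re-reading the cookie on
// every request is what would keep all the tabs in sync.
function csrfFromCookie(cookie: string, name: string): string | undefined {
  for (const part of cookie.split(";")) {
    const [key, ...rest] = part.trim().split("=");
    if (key === name) return rest.join("=");
  }
  return undefined;
}

// Two tabs share one cookie jar (simulated here as a variable).
let sharedCookie = "csrf_token=AAA";

// Tab 1 caches the token once at page load -- the suspected bug.
const tab1CachedToken = csrfFromCookie(sharedCookie, "csrf_token");

// Tab 2 re-authenticates, and the token in the shared cookie rotates.
sharedCookie = "csrf_token=BBB";

// Tab 1's next request now sends header "AAA" while the cookie says
// "BBB" -- exactly the cookie/header mismatch in the splunkd.log errors.
console.log(tab1CachedToken);                            // stale: AAA
console.log(csrfFromCookie(sharedCookie, "csrf_token")); // fresh: BBB
```

Every CSRF failure in the log above shows a pair like that: the cookie value is current, and the header value is whatever the tab cached.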

Certainly sounds like a defect, but looking for guidance. Thanks.

-Tyler

Re: Constant re-authentication after SSO migration

SplunkTrust

Sounds like you've got a potential bug going on... which Splunk version are you using?
I see you're on Chrome. Has anything been modified there?

Re: Constant re-authentication after SSO migration

Communicator

Happens regardless of the browser, so definitely a Splunk issue.

Opened a case, and they said our SSO configuration didn't supply a SAML logout URL. Working on adding that and will update with the result. Thanks.
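
For anyone following along, the logout URL goes in the [saml] stanza of authentication.conf. Something like this, with placeholder endpoints in place of our real ADFS URLs (idpSLOUrl is the single-logout setting, as I understand the spec):

```ini
[saml]
idpSSOUrl = https://idp.example.com/adfs/ls
idpSLOUrl = https://idp.example.com/adfs/ls/?wa=wsignout1.0
entityId = splunk-searchhead
```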

Re: Constant re-authentication after SSO migration

Path Finder

Have you been able to resolve your issue?

Re: Constant re-authentication after SSO migration

Communicator

No, this is still a problem. Things we've done so far:

  • Increase ADFS Timeout to 9hrs
  • Increase SAML Timeout to 9hrs
  • Increase Web Timeout to 9hrs
  • Provide SAML Logout URL

Recently I recorded a video of the issue happening and did a network capture with Fiddler. The case worker was pretty confused about what was happening and escalated to the developers.

I still very much believe the tab session is caching the token value instead of dynamically retrieving it from the cookie on every request.
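
If that's right, the client-side fix would be a wrapper that reads the cookie immediately before each request rather than once at page load. Hypothetical sketch (the X-Splunk-Form-Key header name is my assumption from the traces, and the cookie name is simplified):

```typescript
// Build the CSRF header by reading the cookie at call time, so a
// token rotation by another tab is picked up on the very next request.
function csrfHeader(cookie: string): Record<string, string> {
  const match = cookie.match(/(?:^|;\s*)csrf_token=([^;]*)/);
  return match ? { "X-Splunk-Form-Key": match[1] } : {};
}

// After another tab rotates the token, the next call sees the new value.
console.log(csrfHeader("csrf_token=BBB")); // -> header carrying the current token
```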

Also, we have this issue across both our Search Head Cluster and single Search Head instances, so that is not a factor.

Re: Constant re-authentication after SSO migration

SplunkTrust

Is your sessionid_PORT cookie expiring that often?

Your load balancing should be sticking you to a search head based on that cookie.

Re: Constant re-authentication after SSO migration

SplunkTrust

If that cookie changes you get kicked out of the UI and have to reauthenticate.

Also, if there is a proxy on your network, it could cause this if it's configured improperly.

Re: Constant re-authentication after SSO migration

Communicator

Yes, there is a proxy, but watching the network trace in the browser, it doesn't look like the culprit. The tab sends the wrong token, so the request will never succeed.

Re: Constant re-authentication after SSO migration

SplunkTrust

I know this sounds janky, but they really just need to close all the tabs except one and it will stop. If they open additional tabs it won't recur. Even re-opening the tabs from the history won't cause it to happen again, and they can recover their closed searches that way 😛

Re: Constant re-authentication after SSO migration

Path Finder

I'm seeing the same issue, and the workaround works, but it's annoying. I also think it's a bug and hope Splunk will fix it.
