Security

Constant re-authentication after SSO migration

Communicator

After our SSO migration, users have reported instances where a single tab will re-authenticate, which causes cascading re-authentication across all open tabs.

This wouldn't be so bad if it happened once, but for some reason the re-authentication cycles between the tabs and happens continually. Here is a sample of the events that show up in the audit log:

5/15/19 9:52:05.198 AM  127.0.0.1 - 111111 [15/May/2019:15:52:05.198 +0000] "GET /en-US/splunkd/__raw/services/authentication/users/111111?output_mode=json&_=111222333 HTTP/1.0" 200 1904 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.111.111.111 Safari/537.36" - 111aaa222bbb333ccc 0ms
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd_ui_access.log sourcetype =   splunkd_ui_access

5/15/19 9:52:05.160 AM  127.0.0.1 - 111111 [15/May/2019:15:52:05.160 +0000] "GET /en-US/splunkd/__raw/services/authorization/roles?output_mode=json&count=0&_=111222333 HTTP/1.0" 200 6593 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.111.111.111 Safari/537.36" - 111aaa222bbb333ccc 0ms
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd_ui_access.log sourcetype =   splunkd_ui_access

5/15/19 9:52:05.039 AM  Audit:[timestamp=05-15-2019 15:52:05.039, user=111111, action=change_authentication, info=denied ][n/a]
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    audittrail sourcetype = audittrail

5/15/19 9:51:57.179 AM  05-15-2019 15:51:57.179 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/servicesNS/111111/aaabbbcccddd/search/jobs" failed CSRF validation -- expected "111aaa222bbb333ccc444ddd", but instead cookie had "111aaa222bbb333ccc444ddd" and header had "555eee666fff777ggg888hhh"
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd

5/15/19 9:51:57.123 AM  05-15-2019 15:51:57.123 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/servicesNS/111111/aaabbbcccddd/search/jobs" failed CSRF validation -- expected "111aaa222bbb333ccc444ddd", but instead cookie had "111aaa222bbb333ccc444ddd" and header had "555eee666fff777ggg888hhh"
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd

5/15/19 9:51:57.071 AM  05-15-2019 15:51:57.071 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/servicesNS/111111/aaabbbcccddd/search/jobs" failed CSRF validation -- expected "111aaa222bbb333ccc444ddd", but instead cookie had "111aaa222bbb333ccc444ddd" and header had "555eee666fff777ggg888hhh"
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd

5/15/19 9:49:49.400 AM  Audit:[timestamp=05-15-2019 15:49:49.400, user=111111, action=change_authentication, info=denied ][n/a]
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    audittrail sourcetype = audittrail

5/15/19 9:49:42.259 AM  05-15-2019 15:49:42.259 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/services/search/jobs/111111__111111__aaabbbcccddd__search3_111111111.145588_0000-000-000-000/control" failed CSRF validation -- expected "999iii000jjj111kkk222lll", but instead cookie had "999iii000jjj111kkk222lll" and header had "333mmm444nnn555ooo666ppp"
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd

5/15/19 9:49:42.252 AM  05-15-2019 15:49:42.252 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/services/search/jobs/111111__111111__aaabbbcccddd__search9_222222222.145594_0000-000-000-000/control" failed CSRF validation -- expected "999iii000jjj111kkk222lll", but instead cookie had "999iii000jjj111kkk222lll" and header had "333mmm444nnn555ooo666ppp"
host =  sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source =    /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd

This cycle of Change Auth -> Fail -> CSRF Validation -> Change Auth will repeat continually until the user closes all tabs and starts over. Somehow one of the tabs gets a new token and stores it in the cookie without telling the other tabs. Then, instead of reusing the updated token, each of the other tabs tries to get its own token and store it in the cookie.
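To make the suspected failure mode concrete, here is a minimal sketch of it. All names and mechanics below are hypothetical stand-ins, not Splunk's actual code: two "tabs" share one browser cookie jar, but each caches the CSRF token in its own JS variable at load time.

```javascript
// Illustrative sketch only (hypothetical names, not Splunk's actual code):
// every tab shares one browser cookie jar, but each tab caches the CSRF
// token in its own JS variable when the page loads.

const sharedCookies = { csrfToken: "tokenA" }; // one cookie jar per browser

function makeTab() {
  let cachedToken = sharedCookies.csrfToken; // cached once: the suspected bug
  return {
    reauth(newToken) {
      sharedCookies.csrfToken = newToken; // updates the shared cookie...
      cachedToken = newToken;             // ...but only this tab's variable
    },
    // server-side CSRF check: the header (from the cached variable) must match the cookie
    request() {
      return sharedCookies.csrfToken === cachedToken ? "200 OK" : "CSRF fail";
    },
  };
}

const tab1 = makeTab();
const tab2 = makeTab();

tab1.reauth("tokenB");       // tab1's session renews and it gets a fresh token
console.log(tab1.request()); // "200 OK"    -- tab1 updated its own variable
console.log(tab2.request()); // "CSRF fail" -- tab2 still sends the stale token
tab2.reauth("tokenC");       // tab2 re-authenticates in response...
console.log(tab1.request()); // "CSRF fail" -- ...which now breaks tab1: the cycle
```

Re-reading the shared cookie on every request, instead of caching it per tab, would break this cycle.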

Certainly sounds like a defect, but looking for guidance. Thanks.

-Tyler

1 Solution

Communicator

Success!! Well, at least an answer if not a solution...yet.

Got this note from Splunk regarding case number 1394793 :

Hey Tyler,
Update for you here: dev is working on a fix that will:

"set the X-Splunk-Form-Key header directly reading from the splunkwebcsrftoken_ cookie every time instead of storing it in the variable."

They are building and testing this out currently, and I will let you know which upcoming version this will be released in once I get a hard commitment.

While it was fun to tell the support folks, "I told you so", it is far better to see that the issue was reproducible and is being fixed. When I get word on the version of Splunk that contains the fix I'll post back. Progress!!


Contributor

Dude, I've been ignoring this issue for years. You chased it down. Here, take my karma!!

Communicator

Quick update: working on the version upgrade from 6.8 to 7.2. Got the dev cluster done a couple of weeks ago; unfortunately, it was no help with this issue. We still get the funky continuous re-authentication when multiple tabs/windows are left open and then expire.

Also, the case is still being worked by Splunk. They increased the debugging on our dev cluster and are analyzing the data.

SplunkTrust

I know this sounds janky... but they really just need to close all the tabs except one and it will stop. If they open additional tabs it won't recur. Even re-opening the tabs from the history won't cause it to happen again, and they will be able to recover their closed searches that way 😛

Communicator

Yep, that is janky. 😉

Our users are pretty resilient to the abuse and have found ways around it. By increasing timeouts, we've limited the issue to just the people who leave sessions open overnight.

What I can't stomach is that, as a web developer, I know the issue is bad session management and should be relatively easy to fix.
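For anyone wanting to try the same timeout mitigation: on a self-managed search head, the Splunk Web session timeout lives in web.conf (on Splunk Cloud you would likely have to go through support). The value below is just an example; check the web.conf spec for your version:

```ini
# $SPLUNK_HOME/etc/system/local/web.conf
[settings]
# Splunk Web session timeout, in minutes; the default is 60.
# Raising it reduces how often idle tabs hit the re-auth cycle.
tools.sessions.timeout = 480
```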

SplunkTrust

If you leave your browser open overnight, of course your session is going to expire; the sessionid_PORT cookie will no longer be valid and you'll need to log in and refresh every page...

SplunkTrust

That's good session management.

Path Finder

I don't have an issue with the session expiring. The issue is that refreshing the page triggers the behavior described by the original poster. Each window tries to auto-refresh every 15 seconds or so. The only way to stop this continuous reauth is to close all the windows except one, as the poster above mentioned. If you want your exact searches back, you have to go into your browser history and reopen them.

SplunkTrust

When you close down to just one tab you don't have to login again?

Path Finder

Correct. To be more specific, I don't have to log in again. The window refreshes, I can see the SSO screen flash for a second, and then the screen comes back up. It happens over and over, and it's very annoying when it refreshes while I'm typing a change to my query and I have to type it in again. The only fix is to shut down all the windows (or all except one) and open up new windows.

SplunkTrust

Ok, that sounds like the session is expiring. I'd be interested in seeing the sessionid cookie before and after this happens on your end.

I'm certain it's changing because you've been time-based load balanced or similar. That would be bad, and maybe we can show the cloud team how to fix it.

Communicator

You are correct about the cookie changing; you can see this behavior in the original post, where there are a lot of messages like "sent 1234, had 0987, expected xyz". I very much believe there is a JS variable holding the token instead of going to the cookie every time. This means that if the cookie changes under your tab, the JS variable will be out of sync with the cookie and cause another re-authentication.
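If that's right, the fix is basically to re-read the cookie at send time. A sketch of that pattern (illustrative only, not Splunk's source; the cookie name below is an assumption based on the log messages, where the real name is suffixed with the web port):

```javascript
// Illustrative sketch: build the CSRF header by re-reading the cookie on
// every request instead of caching the token in a variable.
// The cookie name here is an assumption, not necessarily Splunk's exact name.

function readCookie(cookieString, name) {
  // parse an "a=1; b=2" style cookie string
  for (const part of cookieString.split(";")) {
    const [key, ...rest] = part.trim().split("=");
    if (key === name) return rest.join("=");
  }
  return null;
}

function csrfHeaders(cookieString) {
  // re-read at call time, so a token refreshed by another tab is picked up
  return { "X-Splunk-Form-Key": readCookie(cookieString, "splunkweb_csrf_token_8000") };
}
```

In a browser, cookieString would be document.cookie; because every tab reads the same jar, their headers can no longer drift out of sync with the cookie.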

Path Finder

I am seeing the same issue and I also believe your workaround works, but it's annoying. I also think it's a bug and hope Splunk will fix it.

SplunkTrust

Sounds like you've got a potential bug going on... which Splunk version are you using?
I see that you're on Chrome; has anything been modified there?

SplunkTrust

Is your sessionid_PORT cookie expiring that often?

Your load balancing should be sticking you to a search head based on that cookie.

SplunkTrust

If that cookie changes, you get kicked out of the UI and have to reauthenticate.

Also, if there is a proxy on your network, it could cause this if configured improperly.

Communicator

Yes, there is a proxy, but watching the network trace in the browser, it doesn't look like the culprit. The tab sends the wrong token... so the request will never succeed.

Communicator

Happens regardless of the browser, so it's definitely a Splunk issue.

Opened a case, and they said that in our SSO configuration we didn't supply a SAML logout URL. Working on adding that and will update with the result. Thanks.

Path Finder

Have you been able to resolve your issue?
