After our SSO migration, users have reported instances where a single tab re-authenticates, which triggers a cascading re-authentication across all open tabs.
This wouldn't be so bad if it happened once, but for some reason the re-authentication cycles between the tabs continually. Here is a sample of the events that show up in the audit log:
5/15/19 9:52:05.198 AM 127.0.0.1 - 111111 [15/May/2019:15:52:05.198 +0000] "GET /en-US/splunkd/__raw/services/authentication/users/111111?output_mode=json&_=111222333 HTTP/1.0" 200 1904 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.111.111.111 Safari/537.36" - 111aaa222bbb333ccc 0ms
host = sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source = /opt/splunk/var/log/splunk/splunkd_ui_access.log sourcetype = splunkd_ui_access
5/15/19 9:52:05.160 AM 127.0.0.1 - 111111 [15/May/2019:15:52:05.160 +0000] "GET /en-US/splunkd/__raw/services/authorization/roles?output_mode=json&count=0&_=111222333 HTTP/1.0" 200 6593 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.111.111.111 Safari/537.36" - 111aaa222bbb333ccc 0ms
host = sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source = /opt/splunk/var/log/splunk/splunkd_ui_access.log sourcetype = splunkd_ui_access
5/15/19 9:52:05.039 AM Audit:[timestamp=05-15-2019 15:52:05.039, user=111111, action=change_authentication, info=denied ][n/a]
host = sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source = audittrail sourcetype = audittrail
5/15/19 9:51:57.179 AM 05-15-2019 15:51:57.179 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/servicesNS/111111/aaabbbcccddd/search/jobs" failed CSRF validation -- expected "111aaa222bbb333ccc444ddd", but instead cookie had "111aaa222bbb333ccc444ddd" and header had "555eee666fff777ggg888hhh"
host = sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source = /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd
5/15/19 9:51:57.123 AM 05-15-2019 15:51:57.123 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/servicesNS/111111/aaabbbcccddd/search/jobs" failed CSRF validation -- expected "111aaa222bbb333ccc444ddd", but instead cookie had "111aaa222bbb333ccc444ddd" and header had "555eee666fff777ggg888hhh"
host = sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source = /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd
5/15/19 9:51:57.071 AM 05-15-2019 15:51:57.071 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/servicesNS/111111/aaabbbcccddd/search/jobs" failed CSRF validation -- expected "111aaa222bbb333ccc444ddd", but instead cookie had "111aaa222bbb333ccc444ddd" and header had "555eee666fff777ggg888hhh"
host = sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source = /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd
5/15/19 9:49:49.400 AM Audit:[timestamp=05-15-2019 15:49:49.400, user=111111, action=change_authentication, info=denied ][n/a]
host = sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source = audittrail sourcetype = audittrail
5/15/19 9:49:42.259 AM 05-15-2019 15:49:42.259 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/services/search/jobs/111111__111111__aaabbbcccddd__search3_111111111.145588_0000-000-000-000/control" failed CSRF validation -- expected "999iii000jjj111kkk222lll", but instead cookie had "999iii000jjj111kkk222lll" and header had "333mmm444nnn555ooo666ppp"
host = sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source = /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd
5/15/19 9:49:42.252 AM 05-15-2019 15:49:42.252 +0000 ERROR UiAuth - Request from 127.0.0.1 to "/en-US/splunkd/__raw/services/search/jobs/111111__111111__aaabbbcccddd__search9_222222222.145594_0000-000-000-000/control" failed CSRF validation -- expected "999iii000jjj111kkk222lll", but instead cookie had "999iii000jjj111kkk222lll" and header had "333mmm444nnn555ooo666ppp"
host = sh-i-111aaa222bbb.aaabbbccc.splunkcloud.com source = /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd
This cycle of Change Auth -> Fail -> CSRF Validation -> Change Auth will just repeat continually until the user closes all tabs and starts over. Somehow one of the tabs gets a new token and stores it in the cookie without telling the other tabs. Then, instead of reusing the updated token, each of the other tabs tries to get its own token and stores it in the cookie.
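For anyone trying to picture the suspected failure mode, here is a minimal sketch. All names are illustrative, not Splunk's actual code: each "tab" snapshots the CSRF token once at load time, while the cookie jar is shared across tabs, so a re-auth in one tab leaves every other tab sending a stale header.

```typescript
// Hypothetical model of the cross-tab CSRF race (illustrative names only).
type CookieJar = Map<string, string>;                 // shared across all tabs
const jar: CookieJar = new Map([["csrf_token", "tokenA"]]);

class Tab {
  private cachedToken: string;                        // snapshot taken at page load
  constructor() {
    this.cachedToken = jar.get("csrf_token")!;
  }
  // Re-auth rotates the token in the SHARED cookie jar,
  // but only this tab's cached copy gets updated.
  reauthenticate(newToken: string): void {
    jar.set("csrf_token", newToken);
    this.cachedToken = newToken;
  }
  // Double-submit check: the cookie value must equal the header value.
  request(): boolean {
    const cookie = jar.get("csrf_token")!;
    const header = this.cachedToken;                  // stale in the other tabs
    return cookie === header;                         // false => "failed CSRF validation"
  }
}

const tab1 = new Tab();
const tab2 = new Tab();
tab1.reauthenticate("tokenB");                        // tab1 rotates the token
console.log(tab1.request());                          // true  - tab1 is consistent
console.log(tab2.request());                          // false - tab2 sends a stale header
```

When tab2's request is denied it re-authenticates too, rotating the token again and invalidating tab1's cached copy, which is exactly the loop in the audit log above.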
Certainly sounds like a defect, but looking for guidance. Thanks.
-Tyler
Success!! Well, at least an answer if not a solution...yet.
Got this note from Splunk regarding case number 1394793 :
Hey Tyler,
Update for you here: dev is working on a fix that will "set the X-Splunk-Form-Key header directly reading from the splunkweb_csrf_token_ cookie every time instead of storing it in the variable."
They are building and testing this out currently, and I will let you know which upcoming version this will be released in once I get a hard commitment.
While it was fun to tell the support folks, "I told you so", it is far better to see that the issue was reproducible and is being fixed. When I get word on the version of Splunk that contains the fix I'll post back. Progress!!
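In case it helps anyone else, here is a rough sketch of what the described fix amounts to: derive the header from the cookie at send time instead of caching it in a variable. This is my own illustration, not Splunk's code, and the real cookie name carries a port suffix that I've left out.

```typescript
// Minimal document.cookie-style parser (illustrative, not Splunk's implementation).
function getCookie(name: string, cookieHeader: string): string | undefined {
  for (const pair of cookieHeader.split("; ")) {
    const [k, ...v] = pair.split("=");
    if (k === name) return v.join("=");
  }
  return undefined;
}

// Before the fix: the token was captured once and reused, going stale after a
// re-auth in another tab. After: the header is read from the cookie on every
// request, so a token rotated by another tab is picked up automatically.
function buildHeaders(cookieHeader: string): Record<string, string> {
  const token = getCookie("splunkweb_csrf_token", cookieHeader) ?? "";
  return { "X-Splunk-Form-Key": token };   // always matches the current cookie
}

const headers = buildHeaders("other=1; splunkweb_csrf_token=tokenB");
console.log(headers["X-Splunk-Form-Key"]); // the freshly rotated "tokenB"
```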
No, this is still a problem. Things we've done so far:
Recently I recorded a video of the issue happening and did a network capture with Fiddler. The Case Worker was pretty confused about what was happening and escalated to the Developers.
I still very much believe that something in the tab session is caching the token value instead of dynamically retrieving it from the cookie every time.
Also, we see this issue on both our Search Head Cluster and our single Search Head instances, so clustering is not a factor.