Security

Authenticated API call over the web port redirects (303) to login?session_expired=1

vckeofgjsolri
Explorer

Our network uses a PKI (client and server certificate) authentication system.  The Splunk administrators are not allowed to open the management port (8089) for API queries, so I have been trying to use the web port (8000) to mimic the browser interaction and create a search job.

 I use the Developer Tools in the browser to watch the API calls, grab the session ID and cookies/tokens, and pass them to the same endpoints the browser uses.  (I'm using the Python requests library.)
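 
In case it helps anyone replicating this, a minimal sketch of attaching the PKI client certificate and internal CA bundle to a requests session (the file paths are placeholders for whatever your PKI issues; the code further down omits this step):
 
import requests

session = requests.Session()
# Placeholder paths: the client certificate/key pair and internal CA bundle issued by your PKI
session.cert = ('/path/to/client.crt', '/path/to/client.key')
session.verify = '/path/to/internal_ca_bundle.pem'

resp = session.get('https://the.splunk.domain/en-US/account/login')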
 
No matter what I do, each time I GET or POST I am redirected to a 'login?session_expired=1' endpoint, which then redirects me back to the original endpoint I intended to reach.  This works fine for a GET, since I still end up at the resource I wanted.  With a POST (creating the search job) the redirect converts the POST into a GET, so I can only retrieve the status of existing jobs instead of creating a new one.
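 
For anyone debugging the same behaviour: requests follows redirects automatically and, like a browser, turns the POST into a GET when it follows the 303.  A minimal sketch with redirect-following disabled (the payload here is just a placeholder) makes the redirect visible instead of silent:
 
import requests

session = requests.Session()
url = 'https://the.splunk.domain/en-US/splunkd/__raw/servicesNS/myusername/search/search/jobs'

# allow_redirects=False stops requests from following the 303 and converting the POST to a GET
resp = session.post(url, data={'search': 'search index="*" | head 1'}, allow_redirects=False)
print(resp.status_code)             # e.g. 303 when the session is not accepted
print(resp.headers.get('Location')) # where the server is sending us (login?session_expired=1)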
 
If it adds additional context, the paths are built like this (scoped to name space):
When I'm trying to GET a job slot:
httpx://the.splunk.domain/en-US/splunkd/__raw/servicesNS/myusername/search/saved/searches/_new
 
When I'm trying to POST a new job:
 
httpx://the.splunk.domain/en-US/splunkd/__raw/servicesNS/myusername/search/search/jobs
 
I pass all of the headers I see in the browser request, including X-Splunk-Form-Key and the Cookie field.  If I don't include those fields the connection is rejected, so I know it is checking them for validity.  When I include the header {'X-Requested-With': 'XMLHttpRequest'} I get a 401 Denied error every time (even though the browser sends it).
 
I tried REPEATEDLY to use the code block feature but it failed every time.  Code block listed in my own reply below.
 
The request follows the redirect and ends up making a GET to the jobs endpoint, which returns the existing jobs instead of POSTing the new one.
 
I don't know what else it wants in order to accept the POST data without first redirecting me through the login endpoint.
 
I need to understand what a Splunk instance in our environment requires in order to accept an authenticated connection without first redirecting me to a login endpoint.  If anyone has experience with this server configuration and can help, I would really appreciate it.  Thank you!
1 Solution

vckeofgjsolri
Explorer

I tried using Selenium to replicate browser functionality, which worked, but ended up being extremely fragile and not suitable for long-term use.  After a renewed investigation I have finally found the solution:

import json
import requests
from datetime import datetime
from dateutil import tz
tz_est = tz.gettz('US/Eastern')
tz_utc = tz.UTC

session = requests.Session()
url_initial = 'https://the.splunk.domain/en-US/account/login?session_expired=1'
resp_initial = session.get(url_initial) # Response is good (200)

str_session_id = session.cookies['session_id_8000']
str_token = session.cookies['splunkweb_csrf_token_8000']
str_splunkd = session.cookies['splunkd_8000']
str_json_cookies = json.dumps(session.cookies.get_dict())

# Create headers and params
dict_headers = {
  'Accept': 'text/javascript, text/html, application/xml, text/xml, */*',
  'Accept-encoding': 'gzip, deflate, br',
  'Accept-Language': 'en-US,en;q=0.9',
  'Cache-Control': 'no-cache',
  'Connection': 'keep-alive',
  'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8',
  'Host': 'the.splunk.domain',
  'Origin': 'https://the.splunk.domain',
  'Pragma': 'no-cache',
  'sec-ch-ua-mobile': '?0',
  'sec-ch-ua-platform': 'Windows',
  'Sec-Fetch-Dest': 'empty',
  'Sec-Fetch-Mode': 'cors',
  'Sec-Fetch-Site': 'same-origin',
  'X-Requested-With': 'XMLHttpRequest',
  'X-Splunk-Form-Key': str_token
  }

dict_params = {'output_mode': 'json'}

# Create a search job
url_post = 'https://the.splunk.domain/en-US/splunkd/__raw/servicesNS/myusername/search/search/jobs'
str_query = 'index="*" | head 8'
dt_start_date = datetime(2021, 12, 15).astimezone(tz_utc)
dt_start_ms = int(datetime.timestamp(dt_start_date)) # epoch seconds (despite the _ms name), which is what earliest_time expects
dt_end_date = datetime(2021, 12, 16).astimezone(tz_utc)
dt_end_ms = int(datetime.timestamp(dt_end_date)) # epoch seconds

dict_data = {
  'rf': '*',
  'auto_cancel': '30',
  'status_buckets': '300',
  'output_mode': 'json',
  'custom.display.page.search.mode': 'fast',
  'custom.dispatch.sample_ratio': 1,
  'custom.search': str_query,
  'custom.dispatch.earliest_time': dt_start_ms,
  'custom.dispatch.latest_time': dt_end_ms,
  'name': 'jobs',
  'search': f'search {str_query}',
  'earliest_time': dt_start_ms,
  'latest_time': dt_end_ms,
  'ui_dispatch_app': 'search',
  'preview': 1,
  'adhoc_search_level': 'fast',
  'sample_ratio': 1,
  'check_risky_command': False,
  'provenance': 'UI:Search'
  }

resp_post = session.post(url_post, headers=dict_headers, params=dict_params, data=dict_data)
resp_post.ok # True
resp_post.json() # {'sid': '123456789.0987654'}
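 
As a sketch of what you can do with the sid (the endpoint path here is assumed from the standard search/jobs REST layout, proxied through the same splunkd/__raw prefix), the job can be polled with the same session and headers:
 
str_sid = resp_post.json()['sid']
url_job = f'https://the.splunk.domain/en-US/splunkd/__raw/servicesNS/myusername/search/search/jobs/{str_sid}'
resp_job = session.get(url_job, headers=dict_headers, params={'output_mode': 'json'})
# Standard Atom-style JSON wrapper: entry[0].content holds the job properties
resp_job.json()['entry'][0]['content']['dispatchState'] # e.g. 'RUNNING' or 'DONE'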

 

The most notable points are:

  • Ensure the headers include: 
    • 'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8'
    • 'X-Splunk-Form-Key': str_token (the splunkweb_csrf_token_8000 cookie value)
  • When POSTing, pass your data dictionary with the data keyword instead of json. The json keyword serializes the dictionary as a JSON body, while data submits it form-encoded, which is apparently what this endpoint requires (see the short illustration below).
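 
A quick illustration of the difference (placeholder URL; requests.Request(...).prepare() just shows what would go on the wire without sending anything):
 
import requests

payload = {'search': 'search index="*" | head 8', 'output_mode': 'json'}

# data= form-encodes the dict and requests defaults the Content-Type header to
# application/x-www-form-urlencoded, e.g. 'search=search+index...&output_mode=json'
print(requests.Request('POST', 'https://example.invalid/jobs', data=payload).prepare().body)

# json= serializes the same dict to a JSON document and sets Content-Type: application/json,
# which apparently is not what this endpoint accepts
print(requests.Request('POST', 'https://example.invalid/jobs', json=payload).prepare().body)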

Hooray!


vckeofgjsolri
Explorer

I could not get the code block to render properly in my first post, but apparently the exact same code will work as a reply.  Here it goes:

 

import json
import requests
from datetime import datetime
from dateutil import tz
tz_est = tz.gettz('US/Eastern')
tz_utc = tz.UTC

session = requests.Session()
url_initial = 'https://the.splunk.domain/en-US/account/login?session_expired=1'
resp_initial = session.get(url_initial) # Response is good (200)


str_session_id = session.cookies['session_id_8000']
str_token = session.cookies['splunkweb_csrf_token_8000']
str_splunkd = session.cookies['splunkd_8000']
str_json_cookies = json.dumps(session.cookies.get_dict())

# Create a search slot
dt_now_est = datetime.now(tz=tz_est) # Timezone aware
dt_now_utc = dt_now_est.astimezone(tz_utc)
dt_now_utc_ms = int(datetime.timestamp(dt_now_utc) * 1000)

url_slot = 'https://the.splunk.domain/en-US/splunkd/__raw/servicesNS/myusername/search/saved/searches/_new'
params_slot = {'output_mode': 'json', '_': dt_now_utc_ms}
headers_slot = {'X-Splunk-Form-Key': str_token, 'Cookie': str_json_cookies}
resp_slot = session.get(url_slot, params=params_slot, headers=headers_slot) # Response is good (200)

# Create a search job
str_user_agent = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.131 Safari/537.36'
url_post = 'https://the.splunk.domain/en-US/splunkd/__raw/servicesNS/myusername/search/search/jobs'
str_query = 'index="*" | head 8'
dt_start_date = datetime(2021, 7, 13).astimezone(tz_utc)
dt_start_ms = int(datetime.timestamp(dt_start_date) * 1000)
dt_end_date = datetime(2021, 7, 14).astimezone(tz_utc)
dt_end_ms = int(datetime.timestamp(dt_end_date) * 1000)

headers_post = {'Cookie': str_json_cookies,
                'Accept': 'text/javascript, text/html, application/xml, text/xml, */*',
                'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8',
                'host': 'the.splunk.domain',
                'Origin': 'https://the.splunk.domain',
                'X-Splunk-Form-Key': str_token,
                'User-Agent': str_user_agent,
                'Pragma': 'no-cache',
                'sec-ch-ua': '" Not;A brand";v="99", "Google Chrome";v="92", "Chromium";v="92"',
                'sec-ch-ua-mobile': '?0',
                'Sec-Fetch-Dest': 'empty',
                'Sec-Fetch-Mode': 'cors',
                'Sec-Fetch-Site': 'same-origin'
                }
dict_post = {'rf': '*',
             'auto_cancel': 30,
             'status_buckets': 300,
             'output_mode': 'json',
             'custom.display.page.search.mode': 'fast',
             'custom.dispatch.sample_ratio': 1,
             'custom.search': str_query,
             'custom.dispatch.earliest_time': dt_start_ms,
             'custom.dispatch.latest_time': dt_end_ms,
             'name': 'jobs',
             'search': f'search {str_query}',
             'earliest_time': dt_start_ms,
             'latest_time': dt_end_ms,
             'ui_dispatch_app': 'search',
             'preview': 1,
             'adhoc_search_level': 'fast',
             'sample_ratio': 1,
             'check_risky_command': False,
             'provenance': 'UI:Search'
             }

resp_post = session.post(url_post, headers=headers_post, json=dict_post) # NOTE: json= is what the accepted solution changes to data= (form-encoded)

 

