Getting Data In

Splunk Add-On for ServiceNow not capturing all the selected tables

artelia
Explorer

Hi, 

We are trying to pull data from some database tables in ServiceNow into our Splunk Enterprise environment using the add-on. Since the tables are fairly large, we can't get everything working: some of the tables fail with the following error message:

2022-02-10 09:08:31,159 ERROR pid=12171 tid=Thread-20 file=snow_data_loader.py:collect_data:181 | Failure occurred while getting records for the table: syslog_transaction from https://---.net/. The reason for failure= {'message': 'Transaction cancelled: maximum execution time exceeded', 'detail': 'maximum execution time exceeded Check logs for error trace or enable glide.rest.debug property to verify REST request processing'}. Contact Splunk administrator for further information.

Now, ServiceNow support told me we might be able to prevent this from happening (and hence get the collection working) by introducing query parameters. Does anyone have experience with configuring the add-on to do that?

As a reference, the ServiceNow support sent me this:

"
I'm not familiar with the configuration options for the Splunk add-on. However, if you would like your API requests to take less time, I would suggest that you limit the number of records you fetch per request, use pagination, and limit the number of columns you select.

a) You can implement pagination using the URL parameter sysparm_offset. For example, in the initial request you can configure sysparm_offset=0&sysparm_limit=100; on the next call you increment the offset by 100 to sysparm_offset=100&sysparm_limit=100.
Keep incrementing the offset after each response until you reach the limit of 25000.

b) To limit the number of columns, use the URL parameter sysparm_fields. For example, if you only require the task number and short description, configure the URL parameter as sysparm_fields=number,short_description&sysparm_limit=100.

Below is an example of a complete URL with both sysparm_fields and sysparm_offset configured.

api/now/table/task?sysparm_limit=100&sysparm_query=ORDERBYDESCsys_created_on&sysparm_fields=number,short_description&sysparm_offset=0
"
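The pagination scheme the support reply describes can be sketched as a small standalone Python client. This is only an illustration of the REST calls, not part of the Splunk add-on; the instance name, table, and credentials below are placeholders.

```python
# Sketch of the pagination loop ServiceNow support describes: request
# pages of `limit` records, advancing sysparm_offset after each page.
# Instance name and credentials are placeholders, not real values.
import json
import urllib.request

def build_page_url(instance, table, offset, limit, fields=None):
    """Build a ServiceNow Table API URL for one page of results."""
    url = (f"https://{instance}/api/now/table/{table}"
           f"?sysparm_offset={offset}&sysparm_limit={limit}")
    if fields:
        url += "&sysparm_fields=" + ",".join(fields)
    return url

def fetch_all(instance, table, opener, limit=100, max_records=25000,
              fields=None):
    """Page through a table until an empty page or max_records is reached.

    `opener` is expected to be a urllib opener with authentication
    already configured (e.g. via HTTPBasicAuthHandler).
    """
    records = []
    offset = 0
    while offset < max_records:
        url = build_page_url(instance, table, offset, limit, fields)
        req = urllib.request.Request(url,
                                     headers={"Accept": "application/json"})
        with opener.open(req) as resp:
            page = json.load(resp).get("result", [])
        if not page:          # no more records
            break
        records.extend(page)
        offset += limit       # advance to the next page
    return records
```

Smaller pages keep each request well under the instance's transaction timeout; combined with sysparm_fields, this is exactly the advice in the quoted reply.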

Does anyone have an idea of how to proceed? Any ideas/suggestions would be really helpful.

Thanks,
Artelia

1 Solution

VatsalJagani
SplunkTrust

In the ServiceNow Add-on, on the account page, there is a parameter called "record_count" which, as far as I know, is used for exactly this purpose (pagination, i.e. limiting the number of results returned per call).

Try reducing that number (the minimum value on the add-on side is 1000; the maximum is 10000).
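If you prefer to manage this outside the UI, account settings like record_count are persisted in a .conf file under the add-on's local directory. The exact filename and stanza layout vary by add-on version, so the fragment below is only an illustrative sketch, not a verified configuration:

```ini
# Illustrative only -- check your add-on version's docs for the
# actual .conf filename and stanza name before editing anything.
[my_snow_account]
record_count = 1000
```

Restart or reload the add-on after changing it so the new page size takes effect.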



artelia
Explorer

Reducing it to 1000 seems to have done the magic for us. We are still evaluating, but we don't seem to see those error messages frequently anymore. Thanks!

VatsalJagani
SplunkTrust

@artelia - Please accept the solution if this was helpful.
