<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Issue with SNOW | Data is not getting in correctly in Getting Data In</title>
    <link>https://community.splunk.com/t5/Getting-Data-In/Issue-with-SNOW-Data-is-not-getting-in-correctly/m-p/524366#M88525</link>
    <description>&lt;P&gt;We have enabled the jobs to pull records from each of the tables, after which we created a report/dashboard as per our requirements. We can see that for a few tickets the data is not being indexed with the latest details. For example, ticket INC101023 was closed in ServiceNow on 20-Apr-2020, but in the Splunk index it still shows as "Work in Progress" with the date 15-Apr-2020.&lt;BR /&gt;Can you please let us know how to retrieve the missing data?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I am getting the errors below in the Splunk Add-on for ServiceNow (snow) logs:&lt;/P&gt;
&lt;P&gt;ValueError: Expecting : delimiter: line 1 column 1791875 (char 1791874)&lt;/P&gt;
&lt;P&gt;2020-10-10 14:30:59,831 ERROR pid=20332 tid=Thread-29 file=snow_data_loader.py:collect_data:170 | Failure occurred while getting records from &lt;A href="https://henkelprod.service-now.com/change_request" target="_blank" rel="noopener"&gt;https://henkelprod.service-now.com/change_request&lt;/A&gt;. The reason for failure= , u'detail': u'maximum execution time exceeded Check logs for error trace or enable glide.rest.debug property to verify REST request processing'}. Contact Splunk administrator for further information.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;2020-10-10 14:31:34,604 ERROR pid=20332 tid=Thread-24 file=snow_data_loader.py:collect_data:170 | Failure occurred while getting records from &lt;A href="https://henkelprod.service-now.com/rm_defect" target="_blank" rel="noopener"&gt;https://henkelprod.service-now.com/rm_defect&lt;/A&gt;. The reason for failure= {u'message': u'Transaction cancelled: maximum execution time exceeded', u'detail': u'maximum execution time exceeded Check logs for error trace or enable glide.rest.debug property to verify REST request processing'}. Contact Splunk administrator for further information.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;2020-10-10 15:00:59,950 ERROR pid=20332 tid=Thread-29 file=snow_data_loader.py:collect_data:170 | Failure occurred while getting records from &lt;A href="https://henkelprod.service-now.com/change_request" target="_blank" rel="noopener"&gt;https://henkelprod.service-now.com/change_request&lt;/A&gt;. The reason for failure= {u'message': u'Transaction cancelled: maximum execution time exceeded', u'detail': u'Transaction cancelled: maximum execution time exceeded Check logs for error trace or enable glide.rest.debug property to verify REST request processing'}. Contact Splunk administrator for further information.&amp;nbsp;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;2020-10-13 12:28:51,678 ERROR pid=8752 tid=Thread-39 file=snow_data_loader.py:_json_to_objects:268 | Obtained an invalid json string while parsing.Got value of type &amp;lt;type 'str'&amp;gt;. Traceback : Traceback (most recent call last):&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;File "D:\SplunkProgramFiles\etc\apps\Splunk_TA_snow\bin\snow_data_loader.py", line 265, in _json_to_objects&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;json_object = json.loads(json_str)&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;File "D:\SplunkProgramFiles\Python-2.7\Lib\json\__init__.py", line 339, in loads&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;return _default_decoder.decode(s)&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;File "D:\SplunkProgramFiles\Python-2.7\Lib\json\decoder.py", line 364, in decode&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;obj, end = self.raw_decode(s, idx=_w(s, 0).end())&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;File "D:\SplunkProgramFiles\Python-2.7\Lib\json\decoder.py", line 380, in raw_decode&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;obj, end = self.scan_once(s, idx)&lt;/P&gt;
&lt;P&gt;ValueError: Expecting : delimiter: line 1 column 972953 (char 972952)&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Mon, 14 Feb 2022 14:59:09 GMT</pubDate>
    <dc:creator>kittu1</dc:creator>
    <dc:date>2022-02-14T14:59:09Z</dc:date>
    <item>
      <title>Issue with SNOW | Data is not getting in correctly</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Issue-with-SNOW-Data-is-not-getting-in-correctly/m-p/524366#M88525</link>
      <description>&lt;P&gt;We have enabled the jobs to pull records from each of the tables, after which we created a report/dashboard as per our requirements. We can see that for a few tickets the data is not being indexed with the latest details. For example, ticket INC101023 was closed in ServiceNow on 20-Apr-2020, but in the Splunk index it still shows as "Work in Progress" with the date 15-Apr-2020.&lt;BR /&gt;Can you please let us know how to retrieve the missing data?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I am getting the errors below in the Splunk Add-on for ServiceNow (snow) logs:&lt;/P&gt;
&lt;P&gt;ValueError: Expecting : delimiter: line 1 column 1791875 (char 1791874)&lt;/P&gt;
&lt;P&gt;2020-10-10 14:30:59,831 ERROR pid=20332 tid=Thread-29 file=snow_data_loader.py:collect_data:170 | Failure occurred while getting records from &lt;A href="https://henkelprod.service-now.com/change_request" target="_blank" rel="noopener"&gt;https://henkelprod.service-now.com/change_request&lt;/A&gt;. The reason for failure= , u'detail': u'maximum execution time exceeded Check logs for error trace or enable glide.rest.debug property to verify REST request processing'}. Contact Splunk administrator for further information.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;2020-10-10 14:31:34,604 ERROR pid=20332 tid=Thread-24 file=snow_data_loader.py:collect_data:170 | Failure occurred while getting records from &lt;A href="https://henkelprod.service-now.com/rm_defect" target="_blank" rel="noopener"&gt;https://henkelprod.service-now.com/rm_defect&lt;/A&gt;. The reason for failure= {u'message': u'Transaction cancelled: maximum execution time exceeded', u'detail': u'maximum execution time exceeded Check logs for error trace or enable glide.rest.debug property to verify REST request processing'}. Contact Splunk administrator for further information.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;2020-10-10 15:00:59,950 ERROR pid=20332 tid=Thread-29 file=snow_data_loader.py:collect_data:170 | Failure occurred while getting records from &lt;A href="https://henkelprod.service-now.com/change_request" target="_blank" rel="noopener"&gt;https://henkelprod.service-now.com/change_request&lt;/A&gt;. The reason for failure= {u'message': u'Transaction cancelled: maximum execution time exceeded', u'detail': u'Transaction cancelled: maximum execution time exceeded Check logs for error trace or enable glide.rest.debug property to verify REST request processing'}. Contact Splunk administrator for further information.&amp;nbsp;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;2020-10-13 12:28:51,678 ERROR pid=8752 tid=Thread-39 file=snow_data_loader.py:_json_to_objects:268 | Obtained an invalid json string while parsing.Got value of type &amp;lt;type 'str'&amp;gt;. Traceback : Traceback (most recent call last):&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;File "D:\SplunkProgramFiles\etc\apps\Splunk_TA_snow\bin\snow_data_loader.py", line 265, in _json_to_objects&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;json_object = json.loads(json_str)&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;File "D:\SplunkProgramFiles\Python-2.7\Lib\json\__init__.py", line 339, in loads&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;return _default_decoder.decode(s)&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;File "D:\SplunkProgramFiles\Python-2.7\Lib\json\decoder.py", line 364, in decode&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;obj, end = self.raw_decode(s, idx=_w(s, 0).end())&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;File "D:\SplunkProgramFiles\Python-2.7\Lib\json\decoder.py", line 380, in raw_decode&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;obj, end = self.scan_once(s, idx)&lt;/P&gt;
&lt;P&gt;ValueError: Expecting : delimiter: line 1 column 972953 (char 972952)&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 14 Feb 2022 14:59:09 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Issue-with-SNOW-Data-is-not-getting-in-correctly/m-p/524366#M88525</guid>
      <dc:creator>kittu1</dc:creator>
      <dc:date>2022-02-14T14:59:09Z</dc:date>
    </item>
    <item>
      <title>Re: Issue with SNOW | Data is not getting correctly</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Issue-with-SNOW-Data-is-not-getting-in-correctly/m-p/584749#M102897</link>
      <description>&lt;P&gt;Hi kittu1!&lt;BR /&gt;&lt;BR /&gt;Were you able to solve this? We are experiencing the same error message.&lt;BR /&gt;We may have to use query parameters to receive smaller chunks of information at a time, but I haven't yet figured out an easy way to do so. Any advice would therefore be much appreciated.&lt;/P&gt;</description>
      <pubDate>Fri, 11 Feb 2022 15:02:02 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Issue-with-SNOW-Data-is-not-getting-in-correctly/m-p/584749#M102897</guid>
      <dc:creator>artelia</dc:creator>
      <dc:date>2022-02-11T15:02:02Z</dc:date>
    </item>
    <item>
      <title>Re: Issue with SNOW | Data is not getting correctly</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Issue-with-SNOW-Data-is-not-getting-in-correctly/m-p/584873#M102914</link>
      <description>&lt;P&gt;In the ServiceNow Add-on, on the account page, there is a parameter called "record_count" which I believe is used for exactly this purpose (pagination, i.e. limiting the number of results returned on each call).&lt;/P&gt;&lt;P&gt;Try reducing that number (the minimum value on the Add-on side is 1000, the maximum is 10000).&lt;/P&gt;</description>
      <pubDate>Sun, 13 Feb 2022 08:06:58 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Issue-with-SNOW-Data-is-not-getting-in-correctly/m-p/584873#M102914</guid>
      <dc:creator>VatsalJagani</dc:creator>
      <dc:date>2022-02-13T08:06:58Z</dc:date>
    </item>
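      <!-- Editor's note: the record_count advice above amounts to limit/offset pagination. The ServiceNow Table API exposes this via the sysparm_limit and sysparm_offset query parameters. Below is a minimal sketch of the idea, not the add-on's actual code; fetch_page is a hypothetical stand-in for an HTTP call such as GET https://&lt;instance&gt;.service-now.com/api/now/table/change_request?sysparm_limit=1000&amp;sysparm_offset=0. -->

```python
def collect_records(fetch_page, page_size=1000):
    """Pull all records in chunks of page_size until a short/empty page.

    fetch_page(limit, offset) is assumed to return a list of records,
    e.g. a wrapper around requests.get(..., params={
        "sysparm_limit": limit, "sysparm_offset": offset}).
    """
    records = []
    offset = 0
    while True:
        page = fetch_page(limit=page_size, offset=offset)
        records.extend(page)
        if len(page) < page_size:  # short page means we reached the end
            break
        offset += page_size
    return records
```

      <!-- A smaller page_size (e.g. 1000 instead of 3000) keeps each individual query cheaper, which is why lowering record_count avoids the "Transaction cancelled: maximum execution time exceeded" errors quoted in this thread. -->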
    <item>
      <title>Re: Issue with SNOW | Data is not getting correctly</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Issue-with-SNOW-Data-is-not-getting-in-correctly/m-p/585250#M102973</link>
      <description>&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/227493"&gt;@kittu1&lt;/a&gt;, &lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/243002"&gt;@artelia&lt;/a&gt; - was the answer helpful? If so, please accept it as the solution.&lt;/P&gt;</description>
      <pubDate>Wed, 16 Feb 2022 04:32:16 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Issue-with-SNOW-Data-is-not-getting-in-correctly/m-p/585250#M102973</guid>
      <dc:creator>VatsalJagani</dc:creator>
      <dc:date>2022-02-16T04:32:16Z</dc:date>
    </item>
    <item>
      <title>Re: Issue with SNOW | Data is not getting in correctly</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Issue-with-SNOW-Data-is-not-getting-in-correctly/m-p/601116#M104747</link>
      <description>&lt;P&gt;I was getting the same error but&amp;nbsp;changing&amp;nbsp;&lt;STRONG&gt;record_count&lt;/STRONG&gt; from 3000 to 1000 fixed the issue for me.&lt;/P&gt;</description>
      <pubDate>Thu, 09 Jun 2022 06:29:38 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Issue-with-SNOW-Data-is-not-getting-in-correctly/m-p/601116#M104747</guid>
      <dc:creator>Jasdeep</dc:creator>
      <dc:date>2022-06-09T06:29:38Z</dc:date>
    </item>
    <item>
      <title>Re: Issue with SNOW | Data is not getting in correctly</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Issue-with-SNOW-Data-is-not-getting-in-correctly/m-p/638708#M109067</link>
      <description>&lt;P&gt;I had a similar issue; the difference is that my TA has inputs for multiple tables. All the inputs work fine with a record count of 3000 except one of them. Once I adjusted the problematic input to a record count of 1000, events started to be ingested. I am not sure why only one of the inputs is affected.&lt;/P&gt;&lt;P&gt;The error reported for this input is:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;The reason for failure= {'message': 'Transaction cancelled: maximum execution time exceeded', 'detail': 'maximum execution time exceeded Check logs for error trace or enable glide.rest.debug property to verify REST request processing'}.&lt;/LI-CODE&gt;&lt;P&gt;Any suggestions will be appreciated.&lt;/P&gt;</description>
      <pubDate>Tue, 04 Apr 2023 20:27:47 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Issue-with-SNOW-Data-is-not-getting-in-correctly/m-p/638708#M109067</guid>
      <dc:creator>Juan_Leon</dc:creator>
      <dc:date>2023-04-04T20:27:47Z</dc:date>
    </item>
  </channel>
</rss>

