
What type of data can we send to ServiceNow, and how do we forward it?

vamsi92
Explorer

Hi there,
I am using a Linux installation of Splunk Enterprise, configured to work with ServiceNow by adding the "Splunk Add-on for ServiceNow" and the "Splunk App for ServiceNow".
I am now able to create incidents in ServiceNow using the Splunk "| snowincident" command.
What I want to know is what kind of data we can send to ServiceNow (e.g. logs, search results, or at least some numeric output) and how to send it (using forwarders or otherwise).
Please also indicate whether it is possible to get data from ServiceNow into Splunk, and if so, how.
(Please mention if any configuration or app needs to be installed, or the syntax of any custom search.)


jkat54
SplunkTrust

As for sending data to SNOW:

You're creating an incident in SNOW, and that is sending data to SNOW. It sounds like you want to arbitrarily upload data files when you create incidents. This isn't possible with the Splunk App for ServiceNow. Instead, you'd have to write your own command for appending files to incidents, or log into ServiceNow after creating the incident and upload the data manually. A rough sketch of the scripted approach follows.
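Here's a minimal, untested sketch of what such a script might look like, assuming your instance exposes the ServiceNow attachment REST API (/api/now/attachment/file). The instance name, credentials, sys_id, and file path are all placeholders you'd replace:

#!/usr/bin/python
# attachToIncident.py - rough sketch (untested): upload a file to an existing
# incident via the ServiceNow attachment REST API. All values below are
# placeholders; look up the incident's sys_id first (it is not the INC number).
import requests

snowInstance = 'yourSnowInstanceName'
snowUser = 'user@domain.com'
snowPass = 'password'
incidentSysId = 'sys_id_of_the_incident'
filePath = '/tmp/search_results.csv'

uri = ('https://' + snowInstance + '.service-now.com/api/now/attachment/file'
       '?table_name=incident&table_sys_id=' + incidentSysId +
       '&file_name=search_results.csv')

with open(filePath, 'rb') as f:
    r = requests.post(uri, auth=(snowUser, snowPass),
                      headers={'Content-Type': 'text/csv'}, data=f)
print('Attachment upload returned: ' + str(r.status_code))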

Perhaps you can take this script for pulling data from SNOW and modify it:

I wrote a Python script that pulls SNOW data as XML and converts it to CSV. It's not the best, but we schedule it so that it runs every day. You'll want to remove all references to Elasticsearch (getESHealth, snowElasticUri, etc.). The script writes a delta file when it runs; the delta file stays in the specified data folder so that the script only retrieves incidents updated since the delta date it contains. You may wish to remove the delta functionality.

#!/usr/bin/python
# getSNOWCSV.py
# Downloads SNOW xml, converts to CSV (removing non-ascii, extra spaces, transforming \n\t\r & double quotes)
# CAUTION: deletes records that already exist in elasticsearch
# Requires the snow_incident index and hard-coded URIs

import csv
from datetime import date
import logging
import os
import re
import requests
import subprocess
import time
import urllib3
import xml.etree.ElementTree as ET

# hard-coded user/pass/instance name; update with your SNOW credentials
snowUser = 'user@domain.com'
snowPass = 'password'
snowInstance = 'yourSnowInstanceName'  #see your SNOW URL: https://yourSnowInstanceName.servicenow.com
snowElasticUri = 'http://elk-poc-e:23001' #elasticsearch instance and port
snowEndpoints = 'incident' #if you need something besides incident, specify just the endpoint name here. Most *.do pages work, but leave the .do off. Comma-separate multiple values, e.g. "incident,problem,change_request"
dataFolder = '/opt/snow/data'  #folder to keep data

# timestamp of execution, converted to int to remove the decimal, then to string so it can be appended to other strings as needed
timestamp = str(int(time.time()))

# create logger object for logging - see https://docs.python.org/2/howto/logging.html for details
logger = logging.getLogger('getSNOWCSV.py')
logger.setLevel(logging.DEBUG)
fh = logging.FileHandler('/var/log/snow')
fh.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(asctime)s | src="%(name)s" | lvl="%(levelname)s" | msg="%(message)s"')
fh.setFormatter(formatter)
logger.addHandler(fh)

###################### FUNCTIONS #############################
def getESHealth(uri):
    # expects uri to be http://host:port/_cat/health
    # returns True if cluster health is green, False otherwise
    try:
        return requests.get(uri).text.split(' ')[3] == "green"
    except OSError as e:
        print(e)
        return False

def getDeltaDate(datapath):
    try:
        if os.path.exists(datapath + '/.delta_date'):
            # read the date of the previous run from the delta file
            with open(datapath + '/.delta_date', 'r') as deltafile:
                lastrundate = deltafile.readline()
            # write today's date back, overwriting the original, so the next
            # run only retrieves records updated since today
            with open(datapath + '/.delta_date', 'w') as deltafile:
                deltafile.write(str(date.today()))
            return lastrundate
        else:
            # first run: create the delta file and return the epoch so everything is retrieved
            with open(datapath + '/.delta_date', 'w') as deltafile:
                deltafile.write(str(date.today()))
            return "1970-01-01"
    except OSError as e:
        logger.critical('Function: getDeltaDate failed due to the following error(s): ' + str(e))
        print('Function: getDeltaDate failed due to the following error(s): ' + str(e))

def getSNOWCSV(doPage, username, password, instance, datapath, elasticUri):
    # deltaDate will either be "1970-01-01" (first run) or a date string in YYYY-MM-DD format
    deltaDate = str(getDeltaDate(datapath)).replace("\n", '')
    # Create URI with Service Now xml URL syntax; credentials are passed via HTTP basic auth, not embedded in the URL
    uri = 'https://' + instance + '.service-now.com/' + doPage + '.do?XML&sysparm_query=sys_updated_on>=' + deltaDate
    # try to get XML using requests
    try:
        r = requests.get(uri, auth=(username, password))
        print('Downloaded ' + str(doPage) + ' with length of: ' + str(len(r.text)))
        logger.info('Downloaded ' + str(doPage) + ' with length of: ' + str(len(r.text)))
        page = r.text.encode('utf-8', 'ignore')
        print('Encoded ' + str(doPage) + ' with length of: ' + str(len(page)))
        logger.info('Encoded ' + str(doPage) + ' with length of: ' + str(len(page)))
    except OSError as e:
        logger.critical('Function: getSNOWCSV could not download xml file from endpoint ' + doPage + ' due to the following error(s): ' + str(e))
        print('Function: getSNOWCSV could not download xml file from endpoint ' + doPage + ' due to the following error(s): ' + str(e))
        return
    # try to create the data directory if it does not exist
    try:
        if not os.path.isdir(datapath):
            os.makedirs(datapath)
    except OSError as e:
        logger.critical('Function: getSNOWCSV could not create ' + datapath + ' due to the following error(s): ' + str(e))
        print('Function: getSNOWCSV could not create ' + datapath + ' due to the following error(s): ' + str(e))
        return
    # try to write the xml file
    try:
        with open(datapath + '/' + doPage + '_' + timestamp + '.xml', 'w') as xmlfile:
            xmlfile.write(page)
    except OSError as e:
        logger.critical('Function: getSNOWCSV could not write xml file ' + datapath + '/' + doPage + '_' + timestamp + '.xml due to the following error(s): ' + str(e))
        print('Function: getSNOWCSV could not write xml file ' + datapath + '/' + doPage + '_' + timestamp + '.xml due to the following error(s): ' + str(e))
        return
    # convert the downloaded xml to csv
    try:
        xmlFile = datapath + '/' + doPage + '_' + timestamp + '.xml'
        csvFile = datapath + '/' + doPage + '_' + timestamp + '.csv'
        convertSNOWXMLtoCSV(xmlFile, csvFile, datapath, doPage, elasticUri)
    except OSError as e:
        logger.critical('Function: getSNOWCSV could not convert XML file to CSV file named ' + csvFile + ' due to the following error(s): ' + str(e))
        print('Function: getSNOWCSV could not convert XML file to CSV file named ' + csvFile + ' due to the following error(s): ' + str(e))

def getSNOWXMLFields(rootObj):
    # returns the list of field (tag) names from the first record, or [] if there are no records
    try:
        if len(rootObj) > 0:
            fields = [child.tag for child in rootObj[0]]
        else:
            fields = []
    except OSError as e:
        logger.critical('Function: getSNOWXMLFields failed due to the following error(s): ' + str(e))
        print('Function: getSNOWXMLFields failed due to the following error(s): ' + str(e))
        fields = []
    return fields

def convertSNOWXMLtoCSV(xmlInputFile, csvOutputFile, dataPath, filePrefix, elasticUri):
    try:
        tree = ET.parse(xmlInputFile)
        root = tree.getroot()
        fields = getSNOWXMLFields(root)
        print('XML Input File: ' + str(xmlInputFile))
        logger.info('XML Input File: ' + str(xmlInputFile))
        print('Discovered Field Count: ' + str(len(fields)))
        logger.info('Discovered Field Count: ' + str(len(fields)))
        if len(fields) > 0:
            print('Discovered Field List: ' + str(fields))
            logger.info('Discovered Field List: ' + str(fields))
        i = 0
        recordcount = len(root)
        print('Total Records: ' + str(recordcount))
        logger.info('Total Records: ' + str(recordcount))
        if recordcount > 0:
            while i < recordcount:
                # Due to the delta, we have to delete previous copies of existing incidents.
                # First check whether the index exists and, to reduce API calls, memorize the answer.
                if 'indexFound' not in locals():
                    if requests.get(elasticUri + '/snow_incident').status_code == 200:
                        indexFound = 1
                    else:
                        indexFound = 0
                # If the index exists, make two API calls per record:
                # 1. check whether this incident number already exists in elasticsearch
                # 2. delete the old record if it does, so the re-imported copy is not duplicated
                if indexFound == 1:
                    if requests.get(elasticUri + '/snow_incident/_search/exists?q=number:' + str(root[i].find('number').text)).status_code == 200:
                        with open(os.devnull, 'wb') as devnull:
                            deleteUri = elasticUri + '/snow_incident/_query?q=number:' + str(root[i].find('number').text)
                            subprocess.check_call(['/usr/bin/curl', '-XDELETE', deleteUri], stdout=devnull, stderr=subprocess.STDOUT)
                # for each field, append the sanitized field text to the row
                cells = []
                for field in fields:
                    value = root[i].find(field).text
                    if value:
                        # drop non-ascii characters, squash runs of spaces, and replace \t \n \r and double quotes
                        cells.append('"' + re.sub(r'  +', '', re.sub(r'[^\x00-\x7f]', '', value.replace('"', '-').replace('\t', ' ').replace('\n', ' ').replace('\r', ' ').encode('utf-8', 'replace'))) + '"')
                    else:
                        cells.append('"null"')
                with open(dataPath + '/' + filePrefix + '_' + timestamp + '.csv__temp', 'a') as csvfile:
                    csvfile.write(','.join(cells) + '\n')
                i = i + 1
            # re-read the temp file with the csv module and re-write it so the quoting is normalized
            with open(dataPath + '/' + filePrefix + '_' + timestamp + '.csv__temp', 'r') as tempfile, \
                 open(dataPath + '/' + filePrefix + '_' + timestamp + '.csv', 'a') as csvfile:
                cw = csv.writer(csvfile)
                for row in csv.reader(tempfile):
                    cw.writerow(row)
        # clean up the intermediate files (the temp csv only exists if there were records)
        os.remove(dataPath + '/' + filePrefix + '_' + timestamp + '.xml')
        if os.path.exists(dataPath + '/' + filePrefix + '_' + timestamp + '.csv__temp'):
            os.remove(dataPath + '/' + filePrefix + '_' + timestamp + '.csv__temp')
    except OSError as e:
        logger.critical('Function: convertSNOWXMLtoCSV failed due to the following error(s): ' + str(e))
        print('Function: convertSNOWXMLtoCSV failed due to the following error(s): ' + str(e))

##################### EXECUTION #################################


# Disables SSL Warning Message
urllib3.disable_warnings()

# Log program execution
print('getSNOWCSV.py has been executed')
logger.info('getSNOWCSV.py has been executed')

# Ensure cluster health is good before proceeding
if getESHealth(snowElasticUri + "/_cat/health"):
    # foreach endpoint specified, getSNOWCSV to create a csv file
    try:
        for endpoint in snowEndpoints.split(','):
            getSNOWCSV(endpoint, snowUser, snowPass, snowInstance, dataFolder, snowElasticUri)
    except OSError as e:
        logger.critical('getSNOWCSV failed due to the following error(s): ' + str(e))
        print('getSNOWCSV failed due to the following error(s): ' + str(e))
else:
    print('Please check the snowElasticUri variable and the cluster health. You should be able to open the URI with curl/wget and see the health of the elasticsearch cluster.')
    logger.critical('Please check the snowElasticUri variable and the cluster health. You should be able to open the URI with curl/wget and see the health of the elasticsearch cluster.')

# Log program completion
print('getSNOWCSV.py has completed')
logger.info('getSNOWCSV.py has completed')
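Once the CSVs are on disk, getting them into Splunk is just a cron entry plus a file monitor. Something like the below, where the script path, index, and sourcetype are examples you'd adjust to your environment:

# crontab entry: run the pull once a day at 01:00 (script path is an example)
0 1 * * * /usr/bin/python /opt/snow/getSNOWCSV.py

# inputs.conf on whichever Splunk instance or forwarder can read /opt/snow/data
[monitor:///opt/snow/data/*.csv]
sourcetype = snow:incident:csv
index = main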