
How can one get Google BigQuery data in Splunk?

Gchouane
Engager

Hello,

How can one get Google BigQuery data into Splunk?
At the moment we use a script that extracts the necessary parameters out of Google Analytics into a CSV file, which is then indexed in Splunk.

Should we use the BigQuery API and the REST API Modular Input (https://apps.splunk.com/app/1546/)?

If yes, can you give me advice on how to configure this connection?
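For reference, this is roughly the kind of pull we are after (a minimal sketch only, assuming the official google-cloud-bigquery client and Splunk's HTTP Event Collector; the project, query, HEC URL and token below are placeholders, not working values):

import json
import requests
from google.cloud import bigquery

BQ_PROJECT = "my-gcp-project"                       # placeholder GCP project
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder HEC token

# Placeholder query against a Google Analytics BigQuery export table
SQL = """
SELECT date, totals.visits AS visits, trafficSource.source AS source
FROM `my-gcp-project.my_dataset.ga_sessions_20180101`
LIMIT 100
"""

def main():
    client = bigquery.Client(project=BQ_PROJECT)    # uses application default credentials
    for row in client.query(SQL).result():
        # Nested/repeated BigQuery fields come back as nested structures;
        # default=str keeps dates and decimals JSON-serialisable.
        event = {"event": dict(row.items()), "sourcetype": "google:bigquery"}
        resp = requests.post(
            HEC_URL,
            headers={"Authorization": "Splunk %s" % HEC_TOKEN},
            data=json.dumps(event, default=str),
            verify=False,                           # use True with a valid certificate
        )
        resp.raise_for_status()

if __name__ == "__main__":
    main()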

Thank you.

agoodall_splunk
Splunk Employee

There's a new Add-on that might be of interest:

Splunk-GABQ-Addon on GitHub
https://github.com/AusDTO/Splunk-GABQ-Addon


tpetersonalpine
Explorer

Thank you for the share. This code was a great head start.

I've been able to harvest the majority of the 50 fields.

I've got a few that are giving me grief.
Here's one: message_info.triggered_rule_info.consequence.action, which holds an integer.

The problem is focused in this code:

def buildStruct(dataRow, fieldList):
    returnVal = {}
    for field in fieldList:
        ew.log(EventWriter.INFO, "Debug field: %s " % field)
        if '.' in field:
            record, recfield = field.split(".", 1)
            ew.log(EventWriter.INFO, "Debug datarow record: %s and recfield %s " % (dataRow, recfield))
            if len(dataRow) != 0:
                returnVal[field] = buildStruct(dataRow[record], [recfield])
        else:
            if type(dataRow) is list:
                ew.log(EventWriter.INFO, "Debug hit the dataRow is a list ")
                ew.log(EventWriter.INFO, "Debug dataRow: %s " % dataRow)
                if len(dataRow) == 1:
                    returnVal[field] = dataRow[0][field]
                else:
                    returnVal[field] = []
                    for x in dataRow:
                        ew.log(EventWriter.INFO, "Thayne Debug dataRow: %s " % x)
                        returnVal[field].append(x[field])
            else:
                returnVal[field] = dataRow[field]
    return returnVal

Debug datarow record: [{u'consequence': [{u'action': u'17', u'subconsequence': [], u'reason': u'Triggered by CONTENT_COMPLIANCE rule. Rule description: GA GS Keyword Test'}], u'spam_label_modifier': None, u'policy_holder_address': u'dipak.samanta@dev.megadiamond.com', u'string_match': [{u'predefined_detector_name': None, u'matched_string': u'\nInternal use only\r', u'type': u'1', u'source': u'1', u'match_expression': u'(?i)(\W|^)(Not\s*for\s*Distribution|Do\s*Not\s*Distribute|Internal\s*Use\s*Only|IUO|Confidential|(?i', u'attachment_name': None}, {u'predefined_detector_name': None, u'matched_string': u'GS\r', u'type': u'1', u'source': u'1', u'match_expression': u'(?i)(\W|^)(Gold\s*Story|GS|Gold)(\W|$)', u'attachment_name': None}], u'rule_name': u'GA GS Keyword Test', u'rule_type': u'8'}] and recfield action

07-27-2018 18:27:42.515 -0400 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/GoogleAnalyticsBQ/bin/gabq.py" File "/opt/splunk/etc/apps/GoogleAnalyticsBQ/bin/gabq.py", line 104, in buildStruct
07-27-2018 18:27:42.515 -0400 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/GoogleAnalyticsBQ/bin/gabq.py" returnVal[field] = buildStruct(dataRow[record], [recfield])

07-27-2018 18:27:42.515 -0400 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/GoogleAnalyticsBQ/bin/gabq.py" TypeError: list indices must be integers, not str
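Looking at the trace, the dotted-field branch seems to be hitting a repeated record (consequence above is a list of dicts), so dataRow[record] ends up indexing a list with a string key. A rough sketch of one way around it, adding the list check to that branch as well (debug logging stripped, and not tested against the full schema):

def buildStruct(dataRow, fieldList):
    returnVal = {}
    for field in fieldList:
        if '.' in field:
            record, recfield = field.split(".", 1)
            if isinstance(dataRow, list):
                # Repeated record at this level: recurse into each element
                # instead of doing dataRow[record] on a list, which is what
                # raises the TypeError.
                vals = [buildStruct(item, [field]).get(field) for item in dataRow]
                returnVal[field] = vals[0] if len(vals) == 1 else vals
            elif len(dataRow) != 0:
                returnVal[field] = buildStruct(dataRow[record], [recfield])
        else:
            if isinstance(dataRow, list):
                vals = [item[field] for item in dataRow]
                returnVal[field] = vals[0] if len(vals) == 1 else vals
            else:
                returnVal[field] = dataRow[field]
    return returnVal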


daviduslan
Path Finder

Has anyone tried this add-on? We're interested in querying BigQuery tables much as we would with DB Connect (without having to index the data).
