
Splunk Add-on for Bamboo: Is there any more documentation for this add-on?

miro_hiscox
Engager

Hi,

  • Is there any more documentation for the Splunk Add-on for Bamboo?
  • Which index does it forward data to by default?
  • Do I need to create a "bamboo" index on the Splunk indexer manually?
  • Can I, or do I have to, set the index in the $SPLUNK_HOME/etc/apps/ta-bamboo/local/inputs.conf file (e.g. index = bamboo)?

Thanks,
Miro


twinpeakslog
Explorer

I have not found any further documentation; I was looking because of an SSL error when trying to connect.

The script looks like it sends data to either the "main" index or whatever you have set in inputs.conf, according to line 218 of the bin/bamboo.py script. I haven't been able to verify this because I'm not getting logs in, but I'm fairly certain that's how it works.

200     def extract(self, inputs):
201         """
202         Extract data from provided inputs
203         :param inputs: inputs_items object
204         """
205         log.info("Inside extract ...")
206         self.input_name, self.input_items = inputs.inputs.popitem()
207         self.server = self.input_items['server']
208         self.protocol = self.input_items['protocol']
209         self.port = self.input_items['port']
210         self.username = self.input_items['username']
211         self.password = self.input_items['password']
212         self.bamboo_service = BambooService(self.username, self.password, self.server, self.port, self.protocol)
213         #self.jql = self.input_items['jql']
214         post_endpoint = '%s://%s:%s/rest/api/latest/plan' % (
215             self.protocol, self.server, self.port)
216         self.post_url = _get_url(post_endpoint)
217         log.info("PostURL: " + self.post_url)
218         self.output_index = self.input_items['index'] or 'main'
219         self.output_sourcetype = self.input_items['sourcetype'] or 'bamboo'
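
Based on the keys that method reads from input_items, the inputs.conf stanza it expects probably looks something like the sketch below. This is my guess from reading the script, not from any documentation, and the stanza name and values are placeholders:

    [bamboo://my_bamboo_server]
    server = bamboo.example.com
    protocol = https
    port = 8443
    username = splunk_svc
    password = <your password>
    index = bamboo
    sourcetype = bamboo

If index or sourcetype are left empty, lines 218 and 219 fall back to "main" and "bamboo" respectively.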

simon_branton_h
New Member

I wish I knew the answers to this.

I did add index=something_something to the local/inputs.conf file on the heavy forwarder, and the data was ingested into that index.
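
To illustrate, the override was roughly along these lines (the stanza name here is a placeholder; yours will match however the input was originally configured):

    # $SPLUNK_HOME/etc/apps/ta-bamboo/local/inputs.conf
    [bamboo://my_bamboo_input]
    index = something_something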

Also, this will not run on a universal forwarder, since it needs Splunk's Python to operate.
