Getting Data In

CarbonBlack/Splunk Integration

fdarrigo
Path Finder

I am having difficulty configuring the Cb Defense Add-On for Splunk on a heavy forwarder, which forwards to my Splunk Cloud environment.

I have followed the configuration guides and created a Carbon Black API key, but I see the following entries (including a WARNING) in the ta-cb_defense-carbonblack_defense_.... log file:

2020-09-23 23:44:39,961 +0000 log_level=INFO, pid=11719, tid=MainThread, file=ta_config.py, func_name=set_logging, code_line_no=77 | Set log_level=INFO
2020-09-23 23:44:39,961 +0000 log_level=INFO, pid=11719, tid=MainThread, file=ta_config.py, func_name=set_logging, code_line_no=78 | Start CarbonBlack task
2020-09-23 23:44:39,962 +0000 log_level=INFO, pid=11719, tid=MainThread, file=ta_checkpoint_manager.py, func_name=_use_cache_file, code_line_no=76 | Stanza=CarbonBlack using cached file store to create checkpoint
2020-09-23 23:44:39,962 +0000 log_level=INFO, pid=11719, tid=MainThread, file=event_writer.py, func_name=start, code_line_no=28 | Event writer started.
2020-09-23 23:44:39,962 +0000 log_level=INFO, pid=11719, tid=MainThread, file=thread_pool.py, func_name=start, code_line_no=66 | ThreadPool started.
2020-09-23 23:44:39,963 +0000 log_level=INFO, pid=11719, tid=MainThread, file=timer_queue.py, func_name=start, code_line_no=39 | TimerQueue started.
2020-09-23 23:44:39,963 +0000 log_level=INFO, pid=11719, tid=MainThread, file=ta_data_loader.py, func_name=run, code_line_no=48 | TADataLoader started.
2020-09-23 23:44:39,964 +0000 log_level=INFO, pid=11719, tid=Thread-2, file=scheduler.py, func_name=get_ready_jobs, code_line_no=100 | Get 1 ready jobs, next duration is 119.999002, and there are 1 jobs scheduling
2020-09-23 23:44:40,010 +0000 log_level=WARNING, pid=11719, tid=Thread-4, file=loader.py, func_name=_get_log_level, code_line_no=133 | [stanza_name="CarbonBlack"] The log level "" is invalid, set it to default: "INFO"
2020-09-23 23:44:40,016 +0000 log_level=INFO, pid=11719, tid=Thread-4, file=engine.py, func_name=start, code_line_no=36 | [stanza_name="CarbonBlack"] Start to execute requests jobs.
2020-09-23 23:44:40,016 +0000 log_level=INFO, pid=11719, tid=Thread-4, file=engine.py, func_name=run, code_line_no=219 | [stanza_name="CarbonBlack"] Start to process job
2020-09-23 23:44:40,016 +0000 log_level=INFO, pid=11719, tid=Thread-4, file=engine.py, func_name=_get_checkpoint, code_line_no=189 | [stanza_name="CarbonBlack"] Checkpoint not specified, do not read it.
2020-09-23 23:44:40,016 +0000 log_level=INFO, pid=11719, tid=Thread-4, file=http.py, func_name=request, code_line_no=165 | [stanza_name="CarbonBlack"] Preparing to invoke request to [https://defense-prod05.conferdeploy.net/integrati$
2020-09-23 23:44:40,325 +0000 log_level=INFO, pid=11719, tid=Thread-4, file=http.py, func_name=_decode_content, code_line_no=36 | [stanza_name="CarbonBlack"] Unable to find charset in response headers, set it to default "utf-8"
2020-09-23 23:44:40,325 +0000 log_level=INFO, pid=11719, tid=Thread-4, file=http.py, func_name=_decode_content, code_line_no=39 | [stanza_name="CarbonBlack"] Decoding response content with charset=utf-8
2020-09-23 23:44:40,325 +0000 log_level=INFO, pid=11719, tid=Thread-4, file=http.py, func_name=request, code_line_no=169 | [stanza_name="CarbonBlack"] Invoking request to [https://defense-prod05.conferdeploy.net/integrationServices/$
2020-09-23 23:44:40,335 +0000 log_level=INFO, pid=11719, tid=Thread-4, file=engine.py, func_name=_on_post_process, code_line_no=164 | [stanza_name="CarbonBlack"] Skip post process condition satisfied, do nothing
2020-09-23 23:44:40,335 +0000 log_level=INFO, pid=11719, tid=Thread-4, file=engine.py, func_name=_update_checkpoint, code_line_no=178 | [stanza_name="CarbonBlack"] Checkpoint not specified, do not update it.
I am curious whether the missing checkpoint is what is preventing the API data from reaching the TA, but I couldn't find any documentation on the checkpoint mechanism or on how to debug the TA log file.
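To rule out the credential/API side before digging into the checkpoint, one sanity check is to call the Carbon Black Cloud notifications endpoint directly from the heavy forwarder. This is a sketch, not the TA's exact request: the connector ID and API key below are placeholders, and the endpoint path is an assumption based on the truncated /integrationServices/ URL in the log. The X-Auth-Token header is the documented "API key / connector ID" pair for the Cb Defense integration API.

```shell
#!/bin/sh
# Placeholder credentials -- substitute your own connector ID and API key
# from the Carbon Black Cloud console (these values are hypothetical).
CONNECTOR_ID="EXAMPLE1234"
API_KEY="XXXXXXXXXXXXXXXXXXXXXXXX"
BASE_URL="https://defense-prod05.conferdeploy.net"

# Dry run: print the curl command so the token format is visible.
# Remove the leading "echo" to actually send the request.
echo curl -s -H "X-Auth-Token: ${API_KEY}/${CONNECTOR_ID}" \
  "${BASE_URL}/integrationServices/v3/notification"
```

If the real request returns success with an empty notifications list, the credentials work and there are simply no new alerts to pull, which would point back at the TA rather than the API.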

Where do I go from here?
