Activity Feed
- Karma Re: What is the best practices to collect data (high frequently) out of the Azure Monitor? for jconger. 06-05-2020 12:49 AM
- Got Karma for What is the best practices to collect data (high frequently) out of the Azure Monitor?. 06-05-2020 12:49 AM
- Posted Why is my node_modules folder deleted after docker build? on Getting Data In. 05-29-2018 08:11 AM
- Tagged Why is my node_modules folder deleted after docker build? on Getting Data In. 05-29-2018 08:11 AM
- Posted What is the best practices to collect data (high frequently) out of the Azure Monitor? on All Apps and Add-ons. 03-01-2018 04:48 AM
- Tagged What is the best practices to collect data (high frequently) out of the Azure Monitor? on All Apps and Add-ons. 03-01-2018 04:48 AM
Topics I've Started
05-29-2018 08:11 AM
Hi guys!
I'm currently building a Docker container with splunk/universalforwarder:7.0.0-monitor-k8s-logs as the base image. During my docker build I install Node.js and want to add my own scripted input scripts.
My eventhub folder:
/bin/myscript
/bin/package.json
Dockerfile:
FROM splunk/universalforwarder:7.0.0-monitor-k8s-logs
RUN apt-get update \
&& apt-get install curl -y \
&& curl -sL https://deb.nodesource.com/setup_8.x | bash \
&& apt-get install -y nodejs
COPY eventhub /opt/splunk/etc/apps/eventhub/
WORKDIR /opt/splunk/etc/apps/eventhub/bin/
RUN npm i --production && ls -al
The ls -al at the end of the build shows that everything was created successfully:
myscript
node_modules
package.json
package-lock.json
However, when I run the new image, the node_modules folder and package-lock.json are missing. After a manual npm install inside the container, everything works.
If I instead copy the files to /opt/splunk/ (not /opt/splunk/etc) and run the npm install there, the files are still there.
I think it has something to do with the volumes the base image declares on /opt/splunk/etc and /opt/splunk/var, see: https://github.com/splunk/docker-splunk/blob/master/enterprise/Dockerfile. But I can't figure out what... I don't start Splunk when running the image, so I don't know how the volumes are being manipulated, and also why only the files from npm install get deleted.
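If it really is the volumes, one workaround I could try would be to run the npm install outside the volume path during the build and copy the result in at container start, since Docker discards build-time changes to paths a parent image has already declared as a VOLUME. A rough sketch (the /opt/staging path is a placeholder of my own choosing):

```dockerfile
FROM splunk/universalforwarder:7.0.0-monitor-k8s-logs
RUN apt-get update \
    && apt-get install curl -y \
    && curl -sL https://deb.nodesource.com/setup_8.x | bash \
    && apt-get install -y nodejs
# Build the app OUTSIDE /opt/splunk/etc: that path is declared as a VOLUME
# in the base image, and build-time writes to a declared volume are discarded.
COPY eventhub /opt/staging/eventhub/
WORKDIR /opt/staging/eventhub/bin/
RUN npm i --production
# At container start, copy the built app (node_modules included) into the
# volume path before normal startup, e.g. via a small wrapper entrypoint:
#   cp -r /opt/staging/eventhub /opt/splunk/etc/apps/
```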
Thanks for your help!
03-01-2018 04:48 AM
1 Karma
Hi everyone :),
At the moment I am building a service on Azure cloud infrastructure. I am not very happy with the monitoring solutions Microsoft Azure provides, like Azure Application Insights, in terms of performance and dashboard usability. What I came up with is using Azure Monitor to collect diagnostic logs and metrics from my resources, e.g. SQL databases and storage blobs (no App Service, because collecting those logs via Azure Monitor is not supported at the moment). Now I would love to know how I can get this data near-real-time into Splunk. I have already done some research and found mainly two solutions:
1. From Azure Monitor directly to an Event Hub, then to a bound Azure Function which sends the log data via HEC into Splunk. Described here: https://github.com/sebastus/AzureFunctionForSplunkCSX
2. From Azure Monitor directly to Azure blob/table storage, and then periodically into Splunk via the Splunk Add-on for Microsoft Cloud Services.
Solution 1: I mainly don't like the fact that I need an extra function to send data to the HEC. I would prefer to talk to the Event Hub directly via AMQP. I know this is possible, but I didn't find a, let's call it, "trusted" Splunk add-on for it, and I don't want to write one on my own.
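For context, what the Azure Function in solution 1 boils down to is a single HTTP POST per batch to Splunk's HTTP Event Collector. A minimal sketch of that request (the host, token, sourcetype, and event body below are placeholders I made up, not values from the function):

```shell
# Placeholder HEC endpoint and token -- substitute your own.
SPLUNK_HEC_URL="https://splunk.example.com:8088/services/collector/event"
SPLUNK_HEC_TOKEN="00000000-0000-0000-0000-000000000000"

# A minimal HEC event payload; the sourcetype and event body are
# made-up examples of what an Azure Monitor record might look like.
payload='{"sourcetype": "azure:monitor", "event": {"metricName": "Percentage CPU", "value": 42}}'

# -k skips TLS verification, since HEC often runs with a self-signed cert.
curl -k "$SPLUNK_HEC_URL" \
    -H "Authorization: Splunk $SPLUNK_HEC_TOKEN" \
    -d "$payload" \
    || echo "request failed (placeholder endpoint)"
```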
Solution 2: I am not quite sure whether this is practical for my near-real-time needs. I don't like the fact that I would have to poll the data, and I don't know how this would behave with a very large amount of data (to be clear, I haven't tried it).
Is there anything I am getting wrong, or is there a better way to do this?
Thx for your help!