All Topics


It always amazes me how much time people waste searching endlessly for magic shortcuts to success and fulfilment when the only real path is staring them right in the face. Yes, I know that's hard. It's a lot of work. What can I say, that's life. Besides, look on the bright side: you get to do what you want, and you get to do it your way. There's just one catch. You've got to start somewhere. Ideas and opportunities don't just materialize out of thin air.   I started developing my interest in Data Analytics and Splunk early, when I joined an energetic start-up, Avotrix. The only way I know to get started is by learning a marketable skill and getting to work. I completed my initial courses from Splunk and started exploring its advanced features. My motivation is simple, really: I just want to secure a well-paying career that allows me to support my family and take them on good holidays. I'm willing to put in the time to achieve that.   I started giving courses to the unemployed and to new graduates from various countries. My passion is giving young people knowledge of the tools that support their development and their career aspirations. Because my company believes that individuals work optimally in different ways, it allows me to pursue my own interests in Splunk technology. With this inspiration I initiated the "Mumbai Splunk User Group" to gather Splunk Ninjas from various locations to share thoughts and grow together. This conviction and determination helped us overcome long odds and become a vibrant group in the Splunk community. I believe accumulating a diverse set of experiences can help you run your business best.   I always explore ideas for apps that can make day-to-day data analysis easier for everyone. As an outcome, I developed the "Gogs App for Splunk" for the open-source version control service. I am glad to share the recognition and reward from Splunk for this work.
I would like to take this opportunity to thank Splunk for recognising my contribution to the Splunk community.   You never know what journey life is going to take you on, or how technology could end up allowing you to share your passions with the world. The journey I went on may have taken years, but it led me to create an app with more than 100 downloads from users worldwide. I will be happy to connect with you all. LinkedIn Facebook Instagram
Could someone please help me convert epoch time to human-readable time? "Date":1605030538646  
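A value like 1605030538646 is epoch time in milliseconds, so it needs to be divided by 1000 before formatting. A minimal SPL sketch (the field name Date and the output format are taken from the question; makeresults is only there to make the example self-contained):

```spl
| makeresults
| eval Date=1605030538646
| eval human_time=strftime(Date/1000, "%Y-%m-%d %H:%M:%S")
```

With this input, human_time comes out around 2020-11-10 17:08:58 UTC; the exact rendering depends on the timezone configured for the searching user.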
I have 2 searches: 1) |dbxquery query="select member, gate, port from fo.member connection=fo_member" 2) |dbxquery query="select description from fo.date connection=datelog" How can I union them into one search? I tried to use |union and |join between them, but it didn't work.
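One pattern that usually works here is append, which runs the second query as a subsearch and concatenates its rows onto the first. A sketch, assuming both dbxquery calls run correctly on their own (note that dbxquery normally takes connection= as its own argument, outside the quoted SQL):

```spl
| dbxquery connection=fo_member query="select member, gate, port from fo.member"
| append
    [| dbxquery connection=datelog query="select description from fo.date"]
```

Since the two result sets share no columns, append simply stacks the rows; if the goal is instead to combine columns side by side, a join key would be needed.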
Hi all, Is there a way to properly uninstall an EUM server? I didn't find any documentation on it: for a controller yes, but for EUM no. My EUM is hosted with an events service Linux server, and I want to uninstall it properly and install it on another Linux server.  Thanks a lot,  ludo
Hi, I have thousands of CSV files in my Splunk instance from monitoring a local share. Each day the folder is replenished with new CSV files (for the current day). The CSV files are about switches; they are all different, and each one contains some information about the switch's interfaces (IP, interface name...). In my query I filter on some fields to include only the interfaces that I need. IP and interface name should be unique, so I think I should dedup like | dedup IP, Interface. Then I want to extract, for each month, the max count of those filtered interfaces. My query is like this, but it's incomplete: index=appliance sourcetype=new field1=........ field2=........ field3=........ | dedup Ip, Interface (I don't know if this is correct and if it is what I need) | ...?
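One hedged way to get a per-month count of unique interfaces is to combine IP and interface into a single key, bucket events by month, and take a distinct count. This avoids a global dedup, which would keep only the first occurrence of each pair across all months. A sketch, with the field names (Ip, Interface) taken from the question and the extra filters left out:

```spl
index=appliance sourcetype=new
| eval key=Ip . ":" . Interface
| bin _time span=1mon
| stats dc(key) AS unique_interfaces BY _time
```

dc() counts each Ip:Interface pair once per month, which should give the monthly maximum of distinct filtered interfaces.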
..........NOT [search logLevel IN (DEBUG,INFO)]........... it is not giving the desired results.   How can I search NOT IN? I am building a solution that needs to exclude multiple parameters.
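The IN operator can be negated directly in the base search, with no subsearch at all. A sketch (index and sourcetype are placeholders):

```spl
index=main sourcetype=app_logs NOT logLevel IN (DEBUG, INFO)
```

Note that NOT also matches events where logLevel is missing entirely; adding logLevel=* to the search restricts it to events that have the field.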
In my dashboard there's a panel that changes a token when I use a drilldown.  The same token can also be changed by a global dropdown on the dashboard. Unfortunately, the dropdown doesn't change visibly when I use the drilldown option. Is this possible? Maybe by refreshing only the dropdown while drilling down?
Hello, I'm trying to pull logs from a Prometheus exporter into Splunk. I installed the "Prometheus Metrics for Splunk" app, but I can't see where and what I should configure.  Has anyone tried to connect Prometheus to Splunk?   Thanks
After looking at the "Data Model Audit" dashboard in Splunk ES, in the "Acceleration Details" panel, we saw that some of the datamodels had their "earliest" time set to "01/01/1970 01:00:00". We found out that this comes from the command "| rest /services/admin/summarization by_tstats=t splunk_server=local count=0", which sets the field "summary.earliest_time" for some of our datamodels to "0". If we run a tstats search on the datamodels for "all time", there are no events with this timestamp, not even close. As far as I can tell, the parsing works just fine, and no events in the datamodels are older than 3 months. Why is the REST API still saying that the earliest time is epoch 0?  
How do I count the number of files transferred within one month by a user ID using a particular file, and display events in the manner below: username filename count_of_files_transferred_in_1day count_of_files_transferred_in_1week count_of_files_transferred_in_1mon
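A common idiom for several time windows in one result row is count(eval(...)) inside stats: a comparison that evaluates to false yields null and is not counted. A sketch over the last month; the index name and field names are assumptions based on the question:

```spl
index=file_transfers earliest=-1mon@mon
| stats count(eval(_time >= relative_time(now(), "-1d"))) AS count_of_files_transferred_in_1day
        count(eval(_time >= relative_time(now(), "-7d"))) AS count_of_files_transferred_in_1week
        count AS count_of_files_transferred_in_1mon
  BY username, filename
```

The plain count covers the full one-month search window, while the two eval expressions narrow it to the last day and last week.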
Hello guys, is there a way to query all data inputs as seen below? Thanks.
I processed 100 million events, and from that data I present several kinds of visualizations. But this runs into a memory limit, so an error like the following appears: Dag Execution Exception: Search has been cancelled. Search auto-canceled.    
Hi everyone, I am having issues with the configuration of the AlienVault OTX feed in Splunk ES and would appreciate any help. I have my AlienVault OTX key ready but need help with the Threat Intel TAXII feed settings in the web GUI. Data inputs » Intelligence Downloads » Type: taxii URL: https://otx.alienvault.com/taxii/discovery POST Arguments: <this is where my key should be placed but how is this formatted??> -> I have tried taxii_username="my_key" in the POST arguments to no avail. I just keep seeing the "TAXII feed polling starting" message on the "Threat Intelligence Audit" page. Any help is greatly appreciated. Cheers
I am trying to export Azure Application Insights (custom events) via Azure blob storage as a continuous export to Splunk. The add-on for connecting Azure and Splunk is the "Splunk Add-on for Microsoft Cloud Services", which allows you to define inputs and pull in data from Azure blob storage.  So far so good: I am able to fetch the data. The problem is that the exported data comes in the form of a raw JSON dump, which is not very useful for any sort of visualization in Splunk. Can anyone suggest a better way of handling this?
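If the JSON dump lands in _raw, spath can auto-extract its fields at search time, which already makes the data usable for visualization. A sketch; the index name and the field names in the table are hypothetical stand-ins for whatever the extraction produces:

```spl
index=azure_insights
| spath
| table event_name, event_time
```

For deeply nested payloads, spath also accepts an explicit path= argument to pull out a single field.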
Hi Splunkers, I just want to know if any of you have experience ingesting data from a black box. I want to know how to integrate the telemetry of a black box with Splunk and use it to create a useful dashboard. 
Hi, I need to format the background of an <h1> tag and <p> tags in my XML:   <row> <panel> <html> <h1> <center>DEVICE</center> </h1> </html> </panel> </row> <row> <panel> <html> <p> abcdefg..... </p> </html> </panel> </row>   In my CSS file there is:    .h1 { background: #848484 !important; } .p { background: #848484 !important; }   I get a background for <h1>, but not on the entire row, just behind the h1 text... How do I add a background to an <h1> and a <p> tag, please?
While installing the Splunk forwarder on Windows, what IP address should be used for the receiving indexer? Is it my Windows IP address? The browser for Splunk is showing the localhost address 127.0.0.1  
Hi, From the GitHub link below, I see this Python code, which works on the search head (Linux) when I use this command: /opt/splunk/bin/splunk cmd python ko_change.py https://github.com/harsmarvania57/splunk-ko-change/blob/master/ko_change.py I see the import statements below: import splunk.rest as rest import splunk.auth as auth When I use these import statements in VS Code on my laptop, I get the errors below: "ModuleNotFoundError: No module named "splunk.auth";  'splunk' is not a package" "ModuleNotFoundError: No module named "splunk.rest";  'splunk' is not a package" I have splunk-sdk installed in my VS Code environment, and splunklib works. So the question is: how come the above import statements work on the Splunk search head (Linux) but not in VS Code (on my Windows laptop), even though I have splunk-sdk installed? What library do I need to install to make splunk.auth and splunk.rest work in VS Code (on my Windows laptop)?  
Hello, On a Universal Forwarder can someone tell me where the config is that tells the universal forwarder where to send the logs? I need this for Windows and Linux. Thank you
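On a Universal Forwarder, the destination indexers are set in outputs.conf on both Windows and Linux; typical locations are /opt/splunkforwarder/etc/system/local/outputs.conf on Linux and C:\Program Files\SplunkUniversalForwarder\etc\system\local\outputs.conf on Windows. A minimal sketch (the group name, IP, and port are placeholders; 9997 is the conventional receiving port):

```
[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
server = 192.0.2.10:9997
```

The same setting can also be written via the CLI with splunk add forward-server <host>:<port>, which generates an equivalent outputs.conf stanza.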