All Posts

Great, I'm glad to hear that this solution was helpful for your use case. Happy splunking and best regards ; ) P.S.: Karma Points are always appreciated
So we have an internal load balancer that distributes HEC requests between 2 heavy forwarders. HEC is mostly working fine, but a small fraction of the requests are not making it to the heavy forwarders. The sender of the events gets the 503 error below:

upstream connect error or disconnect/reset before headers. reset reason: connection termination

while the internal load balancer gets this error:

backend_connection_closed_before_data_sent_to_client

What really baffles me is that I couldn't find any error logs in Splunk that might be connected to this issue. There's also no indication that our heavy forwarders are hitting their queue limits. I even tried increasing the max queue size of certain queues, including that of the HEC input in question, but even that didn't help at all. Is there anything else I can check to help me pinpoint the cause of this problem?
Thanks!! It's working.
Hi, so I also had the same problem. I tested several setups and what worked was the solution provided by MaverickT: just create a GPO and add the virtual account to the "Event Log Readers" group. This does the trick. It seems that the "SeSecurityPrivilege" privilege isn't enough to read the Sysmon event log. Which is weird, because all the other logs are readable. I can read PowerShell logs with these settings, but not the Sysmon logs.
We noticed this morning that all the certificates for our Splunk servers expired a week ago (discovered whilst investigating why KVStore stopped this weekend). I followed the recommendation from another community post by renaming server.pem to server.pem.old and restarting the Splunk service to create a new one. It correctly creates a new server.pem with a valid expiration date; however, my browser still displays the old certificate. I already checked with btool, and it seems fine (pointing to server.pem). I also checked web.conf and tried to manually specify the file path, but it's still not working... Am I missing something?
Hi @woodlandrelic, hope this message finds you well. I have recently moved from a Splunk developer role to an admin role, and I have to build a cluster environment from scratch. I have a basic understanding of a clustered environment but haven't set one up yet. Could you please guide me on how to start? For example, what kind of knowledge/information gathering needs to be done with the client or customer beforehand, and whether there is a procedure/order of components to follow. It would be really helpful for me. Thanks in advance
Any ideas? I am still not able to make it work.
Hi @pavithra, you could try the ceiling function (ceiling or ceil). Example:

| eval day_of_month=strftime(_time, "%d")
| eval day_of_week=strftime(_time, "%A")
| eval week_of_month=ceil(day_of_month/7)
| where day_of_week="Tuesday" AND week_of_month=2

Best regards,
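To sanity-check the arithmetic outside Splunk: the same ceil(day_of_month/7) calculation can be sketched in Python (Python is only an illustration here; the SPL above is what runs in Splunk):

```python
import math
from datetime import date

def week_of_month(d: date) -> int:
    """Same arithmetic as the SPL eval: ceil(day_of_month / 7)."""
    return math.ceil(d.day / 7)

# 2024-06-11 was the second Tuesday of June 2024
print(week_of_month(date(2024, 6, 11)))  # 2
```

Any day of the month from the 8th through the 14th lands in week 2, which is exactly the window a "second Tuesday" can fall in.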
Hi @Kaushaas, all the hints I can think of have been checked; open a case with Splunk Support, there isn't any other solution. Ciao. Giuseppe
Hi @gcusello, I have the Power User role but I still don't see the Edit permissions option. Anything you can suggest?
Hello @mahesh27, you can write an eval condition to rename the NULL value to whatever string you wish:

| eval service_code=if(service_code="NULL","Non-Servicecode",service_code)

Just append the above eval condition to your SPL query and it should work. Thanks, Tejas.
Hi @joock3r, it depends on the data source: if you have a lookup containing two columns (country and campus), you can filter the second dropdown using the choice in the first, something like this:

| inputlookup your_lookup.csv WHERE country=$token1$
| fields campus

If instead you have only one list (USA 1, USA 2, Romania 1, Romania 2, Turkey 1, Turkey 2), you should extract the country from the list using a regex, e.g. something like this (having only one column called campus, always containing the country and a number):

first dropdown:

| inputlookup your_lookup.csv
| rex field=campus "^(?<country>[^0-9]+)\d+"
| fields country

second dropdown:

| inputlookup your_lookup.csv
| rex field=campus "^(?<country>[^0-9]+)\d+"
| search country="$token1$"
| fields campus

Ciao. Giuseppe
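If you want to test the country-extraction regex outside Splunk first, here is a small Python sketch of the same pattern (the campus values are made up from the example list; note the match keeps a trailing space, hence the strip):

```python
import re

# Hypothetical campus values, each "country + number" as in the example list
campuses = ["USA 1", "USA 2", "Romania 1", "Romania 2", "Turkey 1", "Turkey 2"]

# Same pattern as the SPL rex: ^(?<country>[^0-9]+)\d+
pattern = re.compile(r"^(?P<country>[^0-9]+)\d+")

countries = sorted({m.group("country").strip()
                    for c in campuses
                    if (m := pattern.match(c))})
print(countries)  # ['Romania', 'Turkey', 'USA']
```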
Hi @paragg, is your main dashboard only a menu, or does it also contain panels with data to drill down into the other dashboards? In the first case, you have to use HTML tags to create links to the other dashboards, something like this:

<dashboard version="1.1">
  <label>Home Page</label>
  <row>
    <panel>
      <html>
        <h1>Title Panel 1</h1>
        <table border="0" cellpadding="10" align="center">
          <tr>
            <td align="center">
              <a href="/app/my_app/dashboard1">
                <img style="width:80px;border:0;" src="/static/app/my_app/image1.png"/>
              </a>
            </td>
            <td align="center">
              <a href="/app/my_app/dashboard2">
                <img style="width:80px;border:0;" src="/static/app/my_app/image2.png"/>
              </a>
            </td>
          </tr>
          <tr>
            <td align="center">
              <a href="/app/my_app/dashboard1">Title Dashboard 1</a>
            </td>
            <td align="center">
              <a href="/app/my_app/dashboard2">Title Dashboard 2</a>
            </td>
          </tr>
        </table>
      </html>
    </panel>
  </row>
</dashboard>

If instead you have to create a drilldown, please follow the instructions at https://docs.splunk.com/Documentation/Splunk/latest/Viz/DrilldownIntro

Ciao. Giuseppe
Hi @payl_chdhry, good for you, see you next time! Ciao and happy splunking. Giuseppe. P.S.: Karma Points are appreciated
Hi @pavithra, at first it isn't so clear which timeframe you want to display, but that is inside your search; if it's the previous month, you could run something like this:

<your_search> earliest=-mon@mon latest=@mon
| stats count BY key

and schedule your search using this cron (day-of-month 8-14 covers the second Tuesday of a month; Tuesday is 2 in the cron day-of-week field):

0 0 8-14 * 2

One caveat: in many cron implementations, when both the day-of-month and day-of-week fields are restricted, the job fires when either one matches, so it is safest to also keep the Tuesday/second-week check inside the search itself.

Ciao. Giuseppe
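A quick calendar sanity check (Python used only for illustration): the second Tuesday of any month always falls between the 8th and the 14th, which is the day-of-month window a cron schedule for "second Tuesday" relies on.

```python
from datetime import date, timedelta

def second_tuesday(year: int, month: int) -> date:
    """Scan forward from the 1st; Tuesday is weekday() == 1 in Python."""
    d = date(year, month, 1)
    seen = 0
    while True:
        if d.weekday() == 1:
            seen += 1
            if seen == 2:
                return d
        d += timedelta(days=1)

# Across all of 2024, the second Tuesday only lands on days 8 through 14
days = {second_tuesday(2024, m).day for m in range(1, 13)}
print(min(days), max(days))  # 8 14
```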
Thank you for your reply, but this will double the consumption of CPU and memory resources.
Just pointing out here that the statement | inputlookup test.csv OR inputlookup test2.csv is not valid Splunk - you cannot do two inputlookup commands like that.
It's pretty straightforward to do that:

| makeresults format=csv data="VIN,MAKE,MODEL
1234ABCD,FORD,GT
ABCD1234,DODGE,VIPER
1A2B3C4D,CHEVROLET,CORVETTE
A1B2C3D4,AUDI,"
| eval sourcetype="autos"
| append [
  | makeresults format=csv data="SN,MANUFACTURER,PRODUCT
1234ABCD,FORD,GT
ABCD1234,DODGE,CARAVAN
1A2B3C4D,CHEVY,CORVETTE
A1B2C3D4, ,A8"
  | eval sourcetype="cars" ]
``` Above is sample data setup, but imagine your data has come from index=your_index sourcetype=autos OR sourcetype=cars ```
``` Now use VIN as the common field - there are actually many ways to do the same thing, but what you are doing here is to make the dc_XXX fields the ones to be counted for uniqueness. ```
| eval VIN=coalesce(VIN, SN), dc_makes=coalesce(MAKE, MANUFACTURER), dc_models=coalesce(MODEL, PRODUCT)
``` Here the stats values(*) collects all the original data - you may want to add a | fields statement here to limit to the fields you want. It also counts the unique values of the dc_* fields, which is the make and model from whichever sourcetype. ```
| stats values(*) as * dc(dc_*) as dc_* by VIN
``` And now this will find your mismatched items ```
| where dc_makes>1 OR dc_models>1
| fields - sourcetype dc_*

Hope this helps
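For anyone who finds it easier to trace the logic outside SPL, here is a Python sketch of the same coalesce-then-distinct-count approach, using a couple of made-up rows standing in for the two sourcetypes:

```python
# Hypothetical rows standing in for the two sourcetypes; VIN/SN is the join key
autos = [
    {"VIN": "1234ABCD", "MAKE": "FORD", "MODEL": "GT"},
    {"VIN": "ABCD1234", "MAKE": "DODGE", "MODEL": "VIPER"},
]
cars = [
    {"SN": "1234ABCD", "MANUFACTURER": "FORD", "PRODUCT": "GT"},
    {"SN": "ABCD1234", "MANUFACTURER": "DODGE", "PRODUCT": "CARAVAN"},
]

merged = {}
for row in autos + cars:
    vin = row.get("VIN") or row.get("SN")              # coalesce(VIN, SN)
    make = row.get("MAKE") or row.get("MANUFACTURER")  # coalesce(MAKE, MANUFACTURER)
    model = row.get("MODEL") or row.get("PRODUCT")     # coalesce(MODEL, PRODUCT)
    entry = merged.setdefault(vin, {"makes": set(), "models": set()})
    entry["makes"].add(make)
    entry["models"].add(model)

# where dc_makes>1 OR dc_models>1
mismatches = [vin for vin, e in merged.items()
              if len(e["makes"]) > 1 or len(e["models"]) > 1]
print(mismatches)  # ['ABCD1234'] (VIPER vs CARAVAN)
```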
In case you haven't noticed, this is a really old thread. Your question might not get the visibility you want in this thread. Try starting a new thread describing your problem in the Getting Data In section of this forum.