All Posts

I am facing the same issue and am not able to see any of the data in Splunk. Did you find a solution to get the data in? Please let me know how to resolve this issue.
Hi Team, one of the endpoints (Get Shipping) under the business transaction is not being captured consistently, and I don't know why it's behaving this way. Can anyone help me with this issue? I just selected 8 days of data.
Thanks - tokens look promising enough. I can check whether it is Cloud and, if so, set the token to one value; if not, to another.
In a multi-site cluster, if the replication factor was initially

site_replication_factor = origin:2,total:2
site_search_factor = origin:1,total:1

and I later change it to

site_replication_factor = origin:2,total:3
site_search_factor = origin:1,total:2

will the old data also be replicated according to the new replication and search factors, or will only the new data have replication copies per the new factors?
Hi Team, we have deployed Splunk Cloud in our environment and currently have a requirement to generate monthly report statistics separately by Index, Host, Source, and Sourcetype. Could you please provide the queries to pull the required statistics in Splunk? We need separate reports for the top 10 in GB, excluding internal indexes and their sourcetypes. Your assistance with the query is much appreciated.
@gcusello When I navigate to the Cloud Monitoring Console --> License Usage --> Workload, I can see Indexing Process --> Peak SVC usage per hour split by indexing source. When I open that query in another search window, I see the following:

index=summary source="splunk-ingestion" [`sim_get_local_stack` | eval host="*.".stack.".*splunk*" | fields host]
| dedup keepempty=t _time idx st
| stats sum(ingestion_gb) as ingestion_gb by _time idx
| eventstats sum(ingestion_gb) as total_gb by _time
| eval pct=ingestion_gb/total_gb
| bin _time span=1h
| join _time
    [ search index=summary source="splunk-svc-consumer" svc_consumer="data services" svc_usage=*
    | fillnull value="" svc_consumer process_type search_provenances search_type search_app search_label search_user unified_sid search_modes labels search_head_names usage_source
    | eval unified_sid=if(unified_sid="",usage_source,unified_sid)
    | stats max(svc_usage) as utilized_svc by _time svc_consumer search_type search_app search_label search_user search_head_names unified_sid process_type
    | timechart span=1h sum(utilized_svc) as svc_usage ]
| eval svc_usage=svc_usage*pct
| timechart useother=false span=1h sum(svc_usage) by idx

I need to generate three separate reports, each identifying the top 10 items by license usage in GB over the last 30 days. Specifically, I want to pull the following:
1. The top 10 indexes (excluding internal indexes).
2. The top 10 sourcetypes (excluding internal index sourcetypes).
3. The top 10 sources.
These reports need to be scheduled to run every month. Could you please provide the queries for these three requirements?
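As a rough sketch (not from this thread): on Splunk Enterprise, reports like these often start from the internal license usage log; whether that log is searchable, and how far back it is retained, may differ on Splunk Cloud. Assuming it is available, the top 10 indexes by ingested GB over 30 days could look something like:

```
index=_internal source=*license_usage.log* type=Usage idx!=_* earliest=-30d
| stats sum(b) as bytes by idx
| eval GB=round(bytes/1024/1024/1024, 2)
| sort - GB
| head 10
| table idx GB
```

Swapping `by idx` for `by st` or `by s` gives sourcetype and source variants; scheduling each search with a monthly cron in the report settings covers the cadence requirement.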
That's my outputs.conf:

[syslog]
defaultGroup = group2

[syslog:remote_siem]
server = xx.xx.xx.xx:514
sendCookedData = false
Hello Community, I am forwarding the logs using syslog instead of TCP. I can see the packets arriving using tcpdump, but the data is not showing up at the destination. This is my configuration on the HF:

outputs.conf

[syslog]
defaultGroup = group2

[syslog:remote_siem]
server = xx.xx.xx.xx:514
sendCookedData = false

transforms.conf

[send_tmds_to_remote_siem]
REGEX = .
SOURCE_KEY = _MetaData:Index
DEST_KEY = _SYSLOG_ROUTING
FORMAT = remote_siem

[send_tmao_to_remote_siem]
REGEX = .
SOURCE_KEY = _MetaData:Index
DEST_KEY = _SYSLOG_ROUTING
FORMAT = remote_siem

props.conf

[source::udp:1518]
TRANSFORMS-send_tmds_to_remote_siem = send_tmds_to_remote_siem

[source::udp:1517]
TRANSFORMS-send_tmao_to_remote_siem = send_tmao_to_remote_siem

Is this fine, or is something not correct? Please help.
Hi @marysan - please see attached screenshots for Alert Configuration.          
Thank you so much for the response @tscroggins. I will validate using your math; looking at it suggests it may not be a negative number, but I will definitely double-check. I will also reach out to our support. Thank you so much.
Thank you for your help.
I am trying to create a Splunk alert to monitor heap utilization and alert when it exceeds 85 percent. Can anyone please help? heap.used would be the keyword below.

The raw data:

PLATFORMINSTRUMENTS {"timestamp":"1717989699","instrumentList":[{"name":"sr.jql-functions.linkedIssuesOf","value":"2703057"},{"name":"writer.lucene.commit","value":"72497"},{"name":"index.writes","value":"46672292"},{"name":"cache.JiraOsgiContainerManager.hitCount","value":"0"},{"name":"cache.VelocityTemplateCache.totalLoadTime","value":"0"},{"name":"cache.VelocityTemplateCache.directives.evictionCount","value":"0"},{"name":"entity.workflows.total","value":"186"},{"name":"jmx.class.loaded.total","value":"209062"},{"name":"db.conns.time.to.borrow","value":"0"},{"name":"entity.attachments.total","value":"7707192"},{"name":"jmx.thread.cpu.wait.count","value":"0"},{"name":"issue.index.reads","value":"108244292"},{"name":"entity.projects.total","value":"2315"},{"name":"issue.worklogged.count","value":"2938"},{"name":"sr.jql-functions.addedAfterSprintStart","value":"7490"},{"name":"jira.license","value":"0"},{"name":"jmx.thread.ever.count","value":"329348"},{"name":"db.conns","value":"433059620"},{"name":"cache.i18n.CachingI18nFactory.missCount","value":"0"},{"name":"dbcp.maxActive","value":"-1"},{"name":"concurrent.requests","value":"0"},{"name":"jmx.memory.nonheap.committed","value":"1913360384"},{"name":"replicated.index.operations.total","value":"1390921"},{"name":"sr.jql-functions.removedAfterSprintStart","value":"1841"},{"name":"dbcp.numIdle","value":"31"},{"name":"sr.jql-functions.releaseDate","value":"32988"},{"name":"sr.jql-functions.linkedIssuesOfAllRecursive","value":"2169"},{"name":"entity.versions.total","value":"88928"},{"name":"jmx.memory.nonheap.used","value":"1783621536"},{"name":"cache.VelocityTemplateCache.missCount","value":"0"},{"name":"cache.VelocityTemplateCache.directives.loadSuccessCount","value":"0"},{"name":"cache.JiraOsgiContainerManager.size","value":"23"},{"name":"entity.issues.total","value":"12654909"},{"name":"jmx.memory.heap.used","value":"14251500568"},{"name":"sr.jql-functions.epicsOf","value":"301596"},{"name":"sr.jql-functions.aggregateExpression","value":"10"},
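One hedged sketch of an alert search over events like the one above: the `index=your_index` placeholder and the existence of a `jmx.memory.heap.max` instrument (needed for the denominator, and not visible in the truncated sample) are assumptions to verify against the full data.

```
index=your_index "PLATFORMINSTRUMENTS"
| rex "\"name\":\"jmx\.memory\.heap\.used\",\"value\":\"(?<heap_used>\d+)\""
| rex "\"name\":\"jmx\.memory\.heap\.max\",\"value\":\"(?<heap_max>\d+)\""
| eval heap_pct=round(heap_used*100/heap_max, 2)
| where heap_pct > 85
```

Saved as an alert that triggers when the number of results is greater than zero, this would fire whenever heap usage exceeds 85 percent of the maximum.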
Below is my dashboard XML code. The behavior I want to implement is to have the user's selection of values in the table's columns automatically populate the multiselect input. I don't know what to do. Does anybody know how I can do this? Please help!

<form version="1.1" theme="dark">
  <label>Sales DashBoard</label>
  <fieldset submitButton="true" autoRun="false">
    <input type="time" token="globalTime" searchWhenChanged="true">
      <label>Select Time Range</label>
      <default>
        <earliest>0</earliest>
        <latest></latest>
      </default>
    </input>
    <input type="text" token="country" searchWhenChanged="true">
      <label>select Country</label>
      <default>*</default>
    </input>
    <input type="multiselect" token="client_token">
      <label>client_token</label>
      <choice value="*">ALL</choice>
      <prefix>(</prefix>
      <suffix>)</suffix>
      <valuePrefix>clientip="</valuePrefix>
      <valueSuffix>"</valueSuffix>
      <delimiter> OR </delimiter>
      <fieldForLabel>clientip</fieldForLabel>
      <fieldForValue>clientip</fieldForValue>
      <search>
        <query>index=main | stats count by clientip</query>
      </search>
      <default>*</default>
    </input>
    <input type="multiselect" token="field1" searchWhenChanged="true">
      <label>field1 $clicked_value$</label>
      <choice value="*">all</choice>
      <choice value="clicked_value">choice</choice>
      <default>*</default>
      <initialValue>*</initialValue>
      <fieldForLabel>products</fieldForLabel>
      <fieldForValue>products</fieldForValue>
      <search>
        <query>index=main productName=$clicked_value$ | stats count by productName</query>
      </search>
      <delimiter> </delimiter>
    </input>
    <input type="text" token="input_02" searchWhenChanged="true">
      <label></label>
      <default>$clicked_value$</default>
      <initialValue>$clicked_value$</initialValue>
    </input>
  </fieldset>
  <row>
    <panel>
      <title>test demo</title>
      <table>
        <title>Celltrion assignment $clicked_value$</title>
        <search>
          <query>index=main sourcetype="access*" action=purchase $client_token$ | stats values(productName) as products by clientip</query>
          <earliest>$globalTime.earliest$</earliest>
          <latest>$globalTime.latest$</latest>
        </search>
        <option name="drilldown">cell</option>
        <format type="color" field="clientips">
          <colorPalette type="minMidMax" maxColor="#118832" minColor="#FFFFFF"></colorPalette>
          <scale type="minMidMax"></scale>
        </format>
        <format type="number" field="clientips"></format>
        <drilldown>
          <set token="clicked_value">$click.value2$</set>
        </drilldown>
      </table>
    </panel>
  </row>
  <row>
    <panel>
      <title>Actual Purchase Rate</title>
      <single>
        <title>transition from shopping cart to actual purchase</title>
        <search>
          <query>index=main sourcetype="access_combined_wcookie" status=200 action IN(addtocart, purchase) | iplocation clientip | search Country="$country$" | eval action_type=if(action="addtocart", "cart", if(action="purchase", "purchase", "other")) | stats count(eval(action_type="cart")) as cart_count count(eval(action_type="purchase")) as purchase_count | eval rate=round(purchase_count*100/cart_count, 2) | table rate</query>
          <earliest>$globalTime.earliest$</earliest>
          <latest>$globalTime.latest$</latest>
        </search>
        <option name="colorMode">block</option>
        <option name="drilldown">none</option>
        <option name="numberPrecision">0.00</option>
        <option name="rangeColors">["0xd41f1f","0xd94e17","0xf8be34","0x1182f3","0x118832"]</option>
        <option name="rangeValues">[60,70,85,90]</option>
        <option name="refresh.display">progressbar</option>
        <option name="useColors">1</option>
      </single>
    </panel>
  </row>
</form>
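For what it's worth, a minimal sketch of one way to push a clicked table value into a multiselect in Simple XML: setting a token with the `form.` prefix updates the input itself, not just the token. The token and field names here are taken from the dashboard above; whether `$click.value2$` holds the value you want depends on which column is clicked.

```xml
<drilldown>
  <!-- Writing to form.client_token updates the multiselect input's
       selection, so the clicked clientip appears as a chosen value -->
  <set token="form.client_token">$click.value2$</set>
</drilldown>
```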
Hi @KhalidAlharthi, Advice and answers have been provided in three similar, previously asked questions. If you know the format the other SIEM (QRadar?) expects, please add it here, and we can help you tailor the Splunk output. Since this is a Splunk community forum, however, you're more likely to find expertise in Splunk Enterprise Security than QRadar. If you see the data arriving on the remote system using tcpdump, Splunk has already successfully forwarded the data irrespective of the format.
Hi @jasantor,

The implementation is in $SPLUNK_HOME/etc/apps/Splunk_ML_Toolkit/bin/algos_support/density_function/beta_distribution.py:

1. Sample min(data.shape[0], 10000) elements from field using numpy.random.choice.
2. Normalize the sample to [0..1] using (data - data.min()) / (data.max() - data.min()).
3. Fit the normalized sample to Beta using scipy.stats.beta.fit.
4. If either alpha <= 0 or beta <= 0, estimate parameters using the normalized sample mean and variance.

The return values for scipy.stats.beta.fit are alpha, beta, loc, and scale. MLTK's implementation of dist=beta either misinterprets or mislabels loc and scale as mean and standard deviation, respectively. You could compute the values yourself:

| summary MyModel2
| rex field=other "Alpha: (?<alpha>[^,]+), Beta: (?<beta>.+)"
| eval mean=alpha/(alpha+beta), std=sqrt((alpha*beta)/(pow(alpha+beta,2)*(alpha+beta+1)))

However, this will give you the approximate mean and standard deviation of the normalized sample, not the original data. The dist=beta implementation is a little over four years old now, and something tells me no one has validated it. At the risk of being overly critical, the code looks suspiciously like it was copied from Stack Overflow. I don't have a personal Splunk support account, so I can't report the issue. If you have support, I recommend opening a support case.
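To make the mismatch concrete, here is a rough standalone reproduction of steps 1-3 above in numpy/scipy. The synthetic gamma-distributed input, the clipping away of exact 0/1 values, and the omission of the step-4 fallback are my own simplifications, not the MLTK code.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.gamma(shape=2.0, scale=3.0, size=50_000)  # stand-in for the field's values

# 1. Sample at most 10,000 elements, mirroring the MLTK step
sample = np.random.choice(data, size=min(data.shape[0], 10_000), replace=False)

# 2. Normalize the sample to [0..1]
norm = (sample - sample.min()) / (sample.max() - sample.min())
# Clip away exact 0/1 so the MLE stays finite (a deviation from MLTK)
norm = norm.clip(1e-9, 1 - 1e-9)

# 3. Fit a Beta distribution; fit returns alpha, beta, loc, scale
alpha, beta, loc, scale = stats.beta.fit(norm)

# Mean/std of the *normalized* sample implied by alpha and beta --
# not the mean/std of the original data, which is the point above
mean = alpha / (alpha + beta)
std = np.sqrt((alpha * beta) / ((alpha + beta) ** 2 * (alpha + beta + 1)))
```

Comparing `(loc, scale)` against `(mean, std)` shows they are different quantities, which is the mislabeling described above.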
I suspect something on your end - either an outdated trusted RootCA store or some TLS-decrypting appliance doing MitM. I see a perfectly good DigiCert-issued certificate when curling download.splunk.com.
Looks like the issue is still there. OK, as it is a simple issue and considering the above workaround, I am closing this post. Thanks.
Hi @VijaySrrie ... you have given very few details. Please provide us the modular input script and the config files. Was the modular input working fine previously, and did the duplicates only start recently?
Guys, I have set up routing through the syslog method and faced a problem: the logs are arriving when I run tcpdump on the third-party system, but I can't see them in the other SIEM. How can I solve this issue? Help!
@gcusello I sent you a private message.