All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hi, I'm new to creating custom search commands and haven't been able to understand the examples in the SDK repo on GitHub. If someone can point me to, or provide, a step-by-step guide to achieve the below, that'd be really appreciated.

Problem: I need to filter my results to only include events where a field (cidr) is a subnet of a range entered on a dashboard ($search_cidr$). As the built-in cidrmatch(CIDR, IP) only works with an IP and a CIDR range, I don't believe I can do this natively, so I'm thinking a custom where command is the way to go, so I can do something like this in SPL:

| from datamodel:mymodel | where SubnetOf(cidr, "$search_cidr$")

I've written a basic Python function in my app's bin directory to do the comparison. It takes either (IP, Subnet) or (Subnet, Subnet) and returns a boolean, but I don't know how to turn this into a filter for the where command to use.

#!/usr/bin/env python3
from ipaddress import ip_network

def SubnetOf(Needle, Haystack):
    try:
        Needle = ip_network(Needle)
        Haystack = ip_network(Haystack)
        return Needle.subnet_of(Haystack)
    except ValueError:
        return False

Any guidance to get me started would be great.
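As a standalone check, the comparison logic above can be exercised in plain Python before wiring it into a command; the event list and field names below are illustrative only:

```python
from ipaddress import ip_network

def subnet_of(needle, haystack):
    """Return True if needle (a bare IP or a CIDR) falls within haystack (a CIDR)."""
    try:
        return ip_network(needle, strict=False).subnet_of(
            ip_network(haystack, strict=False))
    except ValueError:
        # Unparseable input is treated as a non-match.
        return False

# Simulated events, as a streaming command would receive them row by row.
events = [{"cidr": "10.0.1.0/24"}, {"cidr": "192.168.0.5"}, {"cidr": "bogus"}]
matches = [e for e in events if subnet_of(e["cidr"], "10.0.0.0/16")]
print(matches)  # only the 10.0.1.0/24 event survives
```

For the Splunk side, the splunk-sdk-python's searchcommands module (StreamingCommand and friends) is the usual wrapper for this kind of per-event filter; the SDK documentation covers the decorator and commands.conf boilerplate.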
Hi guys - relative newbie, so apologies if this has been answered; I did look around but couldn't find anything. I have built a dashboard with a time input that generates values like earliest = -15m and latest = now. How can I turn these into SQL-compatible datetime values so that when I execute:

| dbxquery query="SELECT * FROM tableName WHERE created BETWEEN $EARLIEST$ AND $LATEST$" connection="dbConnectionName"

(where my time picker tokens are $tok.timeWindow.earliest$ and $tok.timeWindow.latest$) the earliest and latest values can sit within the BETWEEN clause. Thanks.
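Assuming the tokens can first be resolved to epoch seconds (for example via a hidden search using addinfo), formatting them as SQL timestamp literals is straightforward; this Python sketch shows the conversion, with an illustrative function name:

```python
from datetime import datetime, timezone

def epoch_to_sql(epoch_seconds):
    """Format epoch seconds as a quoted, SQL-friendly UTC timestamp literal."""
    ts = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)
    return ts.strftime("'%Y-%m-%d %H:%M:%S'")

print(epoch_to_sql(0))  # → '1970-01-01 00:00:00'
```

The exact literal format (and whether quoting is needed) depends on the target database and JDBC driver.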
Hello Splunkers! We have deployed SC4S and it works fine for Trend, but we're now using it for VPN (Aviatrix), which doesn't have a prebuilt source. Data is coming into main on the fallback, so we're good to go, but we're looking for details on HOW to add custom sources. I've been through https://splunk-connect-for-syslog.readthedocs.io/en/master/ many times but nothing really explains it. We've deployed Bring Your Own Environment and everything is under /etc/syslog-ng. Would really appreciate some steps on how to add a new source! Thanks
I have used the Splunk Add-on for AWS and configured the billing inputs (cost and usage reports). I was able to get the reports under my main index with sourcetype=aws:billing:cur and could see them under the detailed_billing_cur data model. However, when I try to use the historical monthly billing and historical daily billing dashboards, I get no billing data. I have configured the dashboards to use the billing cost and usage report under the Configure section of the Splunk app, and I have selected the required tags there as well. When I dig into the searches behind the dashboards, there is source="*20200801-*" AND (source="*12463346-sf456-235fsd-dh4-ea45gb4675774*"). I can find source=20200801, as this is the month of the report, but I couldn't find any source matching "*12463346-sf456-235fsd-dh4-ea45gb4675774*" in the data model. I could find this value in the billing_report_assemblyid_cur input lookup as a different field named assemblyid. The other source I could find is the CSV file itself, which looks like source=daily-aws-cost-usage-report-0230-csv.zip. Can you help me figure out whether I should change anything?
Hello! I'm a new member of an IT Department at a large company in the United States, and I've been tasked with learning about the Splunk program and what it can do. To that end, a lot of my job currently is research, and I've been given a few questions to track down answers to specifically. One of which is this: Are there any ways to share Splunk data with people in our company who don't have access to Splunk? I mean, setting aside the question of security clearance, just... like is there a way to make Dashboards or Pivots available to employees who don't have a way to log into the Splunk Application. I'm currently taking the Splunk Fundamentals 101 course, so I'm not exactly an expert, but I figured this is a good question to ask the community. Forgive me if I'm ignorant, I'm still learning.
Hello folks, I have a multiselect for my clients, and a timechart where I can select one or more clients to get their average download speeds over a period of time: days, weeks, months, etc.

| timechart span=1d avg(DN_Mbps) as Avg_Download by client_name

I would like to show spikes (would that be a trendline?) in the data for each of the clients I choose, whether it is 1 or 2 or 10 clients. How can I do this?

Smiddy
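In SPL, the trendline command can overlay a moving average on a series; the underlying idea (flagging points well above a trailing mean) can be sketched in Python, with made-up sample numbers:

```python
def spikes(values, window=3, factor=2.0):
    """Flag indices whose value exceeds factor times the trailing window mean."""
    flagged = []
    for i in range(window, len(values)):
        baseline = sum(values[i - window:i]) / window
        if values[i] > factor * baseline:
            flagged.append(i)
    return flagged

data = [10, 11, 9, 10, 45, 10, 12]   # daily Avg_Download for one client
print(spikes(data))  # → [4], the 45 Mbps day
```

The window size and factor are tuning knobs; the same trailing-mean comparison can be expressed in SPL with streamstats.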
Hello, I've been trying the Splunk Security Essentials app on a test instance of Splunk, and I'm having difficulty setting content/use-cases as "active". My main goal is to have a representation of all my existing production alerts on the MITRE ATT&CK matrix. I created use-cases in "Custom Content" and enabled existing ones. On the "Manage Bookmarks" page I have a few use-cases, all "Successfully Implemented", but on the MITRE Overview page none is active; all content has "needs data" status. I'm sorry if I'm missing something obvious, and thank you in advance for your support. Kind regards.
Hello, I am working with a GPX file and installed the haversine app to calculate the distance between lat/lon coordinates in each event. This is the latest search:

index="gpx" source="11.22.20-Long-7.5.gpx"
| eval tlat='trkpt{@lat}'
| eval tlon='trkpt{@lon}'
| haversine originfield="tlat,tlon" outputField=d lat lon
| table trkpt.time, trkpt{@lat}, lat, trkpt{@lon}, lon, d, trkpt.ele

I know I am missing something so minor, but I can't see it. Thanks in advance for any assistance you can provide. Safe and healthy to you and yours. Happy and Blessed Thanksgiving, Genesius
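For cross-checking the app's output, the great-circle distance can be recomputed with the standard haversine formula; Earth's radius is taken as 6371 km here, and the function name is illustrative:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

print(round(haversine_km(0, 0, 0, 1), 1))  # one degree of longitude at the equator, ≈ 111.2 km
```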
Hi, according to this answer it is not possible to put a macro in an eval, but is that still valid? https://community.splunk.com/t5/Splunk-Search/Conditional-Macros/mp/194852 Thank you in advance
Dears,

When I manually inject the JavaScript Agent, I run into a problem. It seems to conflict with Dexie.js and window.Promise. (This might be helpful: https://dexie.org/docs/Promise/Promise)

The script is below; Dexie's version is 3.0.1.

<script charset="UTF-8" type="text/javascript">
window["adrum-start-time"] = new Date().getTime();
(function(config){
    config.appKey = "*";
    config.adrumExtUrlHttp = "http://cdn.appdynamics.com";
    config.adrumExtUrlHttps = "https://cdn.appdynamics.com";
    config.beaconUrlHttp = "http://pdx-col.eum-appdynamics.com";
    config.beaconUrlHttps = "https://pdx-col.eum-appdynamics.com";
    config.resTiming = {"bufSize":200,"clearResTimingOnBeaconSend":true};
    config.maxUrlLength = 512;
})(window["adrum-config"] || (window["adrum-config"] = {}));
</script>
<script src="//cdn.appdynamics.com/adrum/adrum-20.9.0.3268.js"></script>
<script>

In addition, it leads to thousands of requests that don't stop on their own; we have to close the page in the end. If you have any ideas, please let me know. Thanks a lot. (Screenshots attached: the error, and the flood of requests within seconds.)
Hello, I want to use the IN operator with a subsearch, as in the query below:

| tstats summariesonly=true allow_old_summaries=true max(_time) as _time, values("events.eventtype") as eventtype FROM datamodel=events_prod WHERE "events.kafka_uuid" IN ("search= [ | inputlookup kv_alerts_prod where _key="5f" | table uuids]") BY "events.kafka_uuid", "events.tail_id", "events._indextime", "events._raw", source, sourcetype

This query returns no results. What am I missing?
Does anybody know how to find out the actual WCF operation name when we have a custom detection rule that groups a bunch of operations together because they are part of the same service? For instance, let's say we have a WCF service called appd with a bunch of operations called op1, op2, and so on. The automatic rule will create a single BT for each of the operations and name them like "appd.op1", "appd.op2", etc. If we create a custom detection rule to include all of them in a single BT, we end up with one BT named, say, "appd operations". Here is the issue: after this new, single BT is created, I am no longer able to see the operation names. I would like to somehow display the op1, op2, ... names so I can check a few things. I tried snapshots and looked everywhere else; it seems the only way to fetch these operation names again is to delete the custom rule and revert to the auto-detection rule. Any ideas?
Hi, I have a query:

index=network_appliance Hostname=* (sourcetype="old" OR sourcetype="new") Interface=* field=......................

and I want to display in a column chart the count of interfaces on the last day of each month. The result should look something like this:

October        1000
November    1100

Here 1000 and 1100 would be the counts of interfaces on 31 October and 25 November (the latest day so far of the current month). Thank you in advance!
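One building block here, finding the last day of an arbitrary month, can be computed generically; a Python sketch (in SPL the equivalent snapping is usually done with relative_time):

```python
import calendar
from datetime import date

def last_day(year, month):
    """Return the final calendar day of the given month (leap-year aware)."""
    return date(year, month, calendar.monthrange(year, month)[1])

print(last_day(2020, 10))  # → 2020-10-31
```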
Hi all, I have been trying to use a search to compare two result sets: one from my lookup and one from an ldapsearch. I am trying to keep only the records of users who are actually still in AD. My lookup contains the usernames and latest login times, and my ldapsearch obviously has the updated list of every account still in AD. My goal is to crosscheck whether there are rows in the lookup that could be deleted. So far I have the following, but I am unable to append the lookup file. Can anyone help me achieve my goal, tell me if this is the best approach, and correct my search if needed?

| ldapsearch domain="default" search="(&(objectClass=user))" attrs="sAMAccountName, distinguishedName"
| append [| inputlookup account_status_tracker | fields Latest, user]
| eval match = if(user==sAMAccountName, "MATCH", "NOMATCH")
| table _time sAMAccountName Latest user match

The table displays the values related to the ldapsearch but not the ones from the lookup file. Thanks anyway, Sasquatchatmars
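Conceptually, the crosscheck is a set difference: users in the lookup minus users in AD. A minimal Python illustration with made-up names:

```python
lookup_users = {"alice", "bob", "mallory"}   # usernames from inputlookup
ad_users = {"alice", "bob"}                  # sAMAccountName values from ldapsearch

stale = lookup_users - ad_users              # rows safe to delete from the lookup
print(sorted(stale))  # → ['mallory']
```

In SPL, the same effect is often achieved by joining the two sets on the username and keeping rows with no AD match.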
I have a JSON file like below, and I need to break it up into events:

{"env":"UAT","label":"jenkins-17887.api.v2.dm.btc","App":"dm-d-services","rlmtemplate":"f2_api_fed","lastupdate":2020-11-23 11:09:78:455,"region":"APAC"},{"env":"UAT","label":"jenkins-17687.api.v2.dm.btc","App":"dt-s-services","rlmtemplate":"f3_api_fed","lastupdate":2020-11-23 11:025:79:475,"region":"APAC"},{"env":"UAT","label":"jenkins-18657.api.v2.dm.btc","App":"dt-s-services","rlmtemplate":"f3_api_fed","lastupdate":2020-11-23 11:025:79:475,"region":"APAC"},{"env":"UAT","label":"jenkins-17637.api.v2.dm.btc","App":"dt-s-services","rlmtemplate":"f3_api_fed","lastupdate":2020-11-23 11:025:79:475,"region":"APAC"}

I'm trying to forward it to Splunk and have modified the props.conf file like below:

[test_json]
INDEXED_EXTRACTIONS = JSON
LINEBREAKER = }(,){"env":
SHOULD_LINEMERGE = false
NO_BINARY_CHECK = true
TRUNCATE = 0
TZ = Asia/Singapore

But I'm only getting the first line of the JSON as an event; the remaining data is not coming into Splunk. First line:

"env":"UAT","label":"jenkins-17887.api.v2.dm.btc","App":"dm-d-services","rlmtemplate":"f2_api_fed","lastupdate":2020-11-23 11:09:78:455,"region":"APAC"

Can anyone suggest what's going wrong?
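For what it's worth, the intended record boundary (the comma sitting between a closing and an opening brace) can be checked with a regex outside Splunk; note that LINEBREAKER's first capture group is consumed as the event boundary, so the split should land exactly on that comma. The sample data below is simplified:

```python
import re

raw = '{"env":"UAT","label":"a"},{"env":"UAT","label":"b"},{"env":"UAT","label":"c"}'
# Split on a comma that sits between } and {, using lookarounds so the
# braces themselves stay with their records.
records = re.split(r'(?<=}),(?={)', raw)
print(len(records))  # → 3
```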
Hello, I have made a new app under deployment-apps with the following inputs.conf:

[monitor:///root/something/something/something/something/]
index = test
whitelist = console-202[\S\s]+\.log$

The whitelist is written to match input filenames such as console-2020-06-02.log. I have not created any sourcetype for the index, so I do not have a props.conf file in the deployment app, nor on the search heads. I have reloaded the server class linked to the host and app, but I do not see any attempts to monitor the path I have given when running the following SPL query: index=_internal sourcetype=splunkd *something*

Am I missing something in the inputs.conf? Am I forced to set a sourcetype? Can't I create my own custom sourcetype via the GUI, or do I have to create props and transforms conf files for a sourcetype that does not exist? Any help is appreciated. Regards,
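Since the whitelist is a regex matched against the monitored file's path, the pattern itself can be sanity-checked outside Splunk; the sample paths below are made up:

```python
import re

pattern = re.compile(r'console-202[\S\s]+\.log$')
paths = [
    "/root/a/console-2020-06-02.log",   # should match
    "/root/a/console-2019-06-02.log",   # wrong year prefix
    "/root/a/console-2020-06-02.txt",   # wrong extension
]
matched = [p for p in paths if pattern.search(p)]
print(matched)  # only the 2020 .log file
```

This at least rules out the regex as the cause; note [\S\s]+ is equivalent to the simpler .+ with DOTALL, and filenames never contain newlines anyway.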
Is there a way to capture some part of the logged messages and ignore the rest? I have errors which are logged via a logger, and AppDynamics is capturing them but including some other information which is not required. Regards, Gopikrishnan
Hello, I have installed the TA-ciscoaxl app on Splunk, but I don't know where to add the CUCM IP so that it can monitor through the AXL API and get the data into Splunk. I tried to configure an HTTP Event Collector from data inputs, but somehow it did not work. Could you please share the process for getting CallManager integrated with Splunk? Thank you.
| mstats avg(_value) as current where data.consumerName IN("XXXXXXX*") AND NOT (data.consumerName="*OH") AND host="*" AND `delphix_metrics_index` metric_name="system.capacity.consumer.size" span=30s by data.totalSpace, data.consumerName
| eval usage = round((current) / 1024 / 1024 / 1024, 3)
| eval totalGB = round('data.totalSpace' / 1024 / 1024 / 1024, 3)
| eval name = 'data.consumerName'
| eval free = (totalGB - usage)

I want to pass the value on to calculate the free space, but the statement below is not working. Can you please help?

| chart avg(usage) as "Usage (GB)" avg(totalGB) as "Total (GB)" avg(free GB) as "FREE MB" by name
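As a sanity check on the arithmetic itself (bytes converted to GB, then free = total minus used), a plain Python illustration with made-up sizes:

```python
def to_gb(nbytes):
    """Convert bytes to gigabytes, rounded to 3 decimal places."""
    return round(nbytes / 1024 / 1024 / 1024, 3)

total = to_gb(2 * 1024 ** 3)    # 2 GB of total space
used = to_gb(1024 ** 3 // 2)    # 0.5 GB currently used
print(total - used)  # → 1.5
```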
I have a query where I am getting the count of different HTTP status codes, like below. I want to add an error-percentage column to the above chart.
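The computation itself is errors divided by total, times 100; a plain Python illustration with made-up status-code counts:

```python
counts = {"200": 900, "404": 60, "500": 40}   # hypothetical per-status counts

total = sum(counts.values())
# Treat anything outside the 2xx family as an error.
errors = sum(v for k, v in counts.items() if not k.startswith("2"))
error_pct = round(errors / total * 100, 2)
print(error_pct)  # → 10.0
```

In SPL the same idea is typically an eval over the per-status count fields after the stats or chart step.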