All Topics

Hello all, I would like to add links to 3 different dashboards in a separate dashboard panel. My current code is as below:

<row> <panel> <title></title> <html> <p> <b style="font-size:12pt">Link to other relevant Dashboards</b></p> <a href="https://******/en-US/app/******/****">Click here for Dashboard 1</a> <a href="https://******/en-US/app/*****/dlp_unstructured_data_discovery_dim?form.timePolicyDivision.earliest=-30d%40d&form.timePolicyDivision.latest=now&form.policyType=*">Click here for Dashboard 2</a> <a href="https://*****/en-US/app/********/symantec_dim__blocked_detail?form.blockedTime.earliest=-7d%40h&form.blockedTime.latest=now&form.businessUnitSearch=*&form.emplidSearch=*&form.policyCategorySearch=*&form.lanidSearch=*">Click here for Dashboard 3</a> </html> </panel> </row>

The URL for Dashboard 1, which contains only letters, dots, and underscores, works fine. But Dashboards 2 and 3 give me an error, as their URLs contain additional characters. How can I resolve this issue? I don't have a search element in my XML, so I can't solve it at the UI or search level. Any help would be appreciated!
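The failing URLs are the ones containing a literal &. A Simple XML <html> panel must remain well-formed XML, so each & inside an href has to be escaped as &amp; (URL-encoding it as %26 would change the meaning of the parameter separators). A minimal sketch using the Dashboard 2 link from the question, with the host and app names left masked as posted:

```
<a href="https://******/en-US/app/*****/dlp_unstructured_data_discovery_dim?form.timePolicyDivision.earliest=-30d%40d&amp;form.timePolicyDivision.latest=now&amp;form.policyType=*">Click here for Dashboard 2</a>
```

The browser decodes &amp; back to & when following the link, so the target dashboard still receives the form tokens unchanged.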
Hi, I was wondering whether there would be an issue with changing the permissions on the buckets. Currently the permissions are set to 0700, with the files owned by the local Splunk account. If I change the permissions to 0705, will this cause any issues? I don't intend to keep this permission forever; I only need it to back up the primary copies of the files. The local Splunk account does not have rsync capability, and I only need to change this permission to perform rsync with an account that does. Regards, Arjun
I have a field with XML as below. Can someone help me parse out ErrorDescription?

"<?xml version="1.0" encoding="UTF-8" standalone="yes"?> <ns1:TServiceResponse version="us-3.0.0.1029" xmlns:ns1="TServiceResponse"> <ns1:ServiceRequest> <ns1:SID ns1:New="false">3b2509cd-da09-4a02-bce1-a1f5fe36b15f</ns1:SID> <ns1:CID ns1:New="false">093a83d9-35fa-49f9-bcea-cccca3ae996c</ns1:CID> <ns1:ID ns1:New="false">02625697-7fee-387e-e053-0100007fcd53</ns1:ID> <ns1:CollectionDateGMT>2020-07-16 23:06:27.816</ns1:CollectionDateGMT> <ns1:TID>4a42ca3bd5a8:02625697-7fee-387e-e053-0100007fcd53:a7a2d372-4db5-41a7-b4fa-37285302fea6:230726924</ns1:TID> <ns1:FirmwareVersion>081120</ns1:FirmwareVersion> <ns1:PN>WWWWREFE</ns1:PN> <ns1:CollectionMethodType>Auto Collection</ns1:CollectionMethodType> </ns1:ServiceRequest> <ns1:ServiceError> <ns1:ErrorCode>3.1.12.309</ns1:ErrorCode> <ns1:ErrorDescription>DeviceType not supported.:DEVICE_TYPE_UPDATE_ERROR for TID</ns1:ErrorDescription> </ns1:ServiceError> </ns1:TServiceResponse>"

Your help is appreciated.
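Assuming the XML lives in _raw (swap input= for the field name if it sits in an extracted field instead), spath can walk the namespaced path, or a rex can grab just the one element. Both are untested sketches against the sample event above:

```
... | spath input=_raw path=ns1:TServiceResponse.ns1:ServiceError.ns1:ErrorDescription output=ErrorDescription
```

or

```
... | rex "<ns1:ErrorDescription>(?<ErrorDescription>[^<]+)</ns1:ErrorDescription>"
```

spath treats the ns1: prefixes as literal parts of the element names, so they must appear in the path exactly as they do in the event.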
I have the below query, which seemingly works okay. I'm looking for ways to optimize it, ideally with fewer (or no) joins.

attrs.stack=produs line.et=model line.model="\"Unique Model2\"" OR line.model="\"Unique Model1\"" | dedup line.dev_id | rename line.customer_id as CUSTOMER_ID | join CUSTOMER_ID [search STACK=produs et=CP earliest=-1d@d latest=@d [search attrs.stack=produs line.et=model line.model="\"Unique Model2\"" OR line.model="\"Unique Model1\"" | dedup line.customer_id | rename line.customer_id as CUSTOMER_ID | fields CUSTOMER_ID]| table CUSTOMER_ID,SERVICE_PROVIDER_PATH ] | table SERVICE_PROVIDER_PATH, CUSTOMER_ID, line.model, line.serial_num

Sample et=CP log message:

2020-07-15 18:20:21.391, STACK="produs", EVENT_TYPE="et=CP", YEAR_MONTH_DAY="20200715", SERVICE_PROVIDER_PATH="xyz\partner", SERVICE_PROVIDER_NAME="partner1", PARENT_SERVICE_PROVIDER_ID="178", CUSTOMER_NAME="38091", SERVICE_PROVIDER_ID="245", CUSTOMER_ID="382091", CUSTOMER_GUID="9AB6B77D-3646-413F-8F8D-95F3833F1919", TENANT_ID="33146442", IS_ACTIVE_FL="Y", INSERT_DT="2020-07-13 20:50:43.0", UPDATE_DT="2020-07-13 21:29:49.0", PSE_ENABLED_FL="N", AMV_ENABLED_FL="N", AUTO_UPDATE_ENABLED_FL="Y", LICENSE_EXPIRATION_DT="{null}", AUTO_UPDATE_VERSION="{null}", IS_TEST_FL="{null}", INSTANT_ON_ENABLED_FL="Y", SDA_ENABLED_FL="N", DNS_PREFERRED_FL="N", REMOTE_EWS_ENABLED_FL="Y", CLOUD_PROXY_OFFLINE_SECONDS="259200", REMOTE_EWS_CSRF_ENABLED_FL="N", TOTAL_CUSTOMERS="15014596"

Sample et=model log message:

{"line":{"time":"2020-07-16T23:12:06.9099864Z","msg":"device info","currentLinkPlatformVersion":"{null}","deviceIntrinsicUuid":"31bc0037-d260-44c1-aeef-a7329c1553c3","isLinkForWebEnabled":"True","isLinkForWebSupported":"True","isLinkForDeviceEnabled":"{null}","isLinkForDeviceSupported":"False","printer_def_id":"\"Unique Model1\"","mac_address":"BxxxxxxxEF0","firmware_version":"00000000_0111","mcid":"QWDEFRV","ocv_status":"ok","mSKU":"Present","model_num":"2567DE","model_support":"DABC","serial_num":"ABCDE32345","host_name":"{null}","ipaddr":"1.1.1.1","conn_id":"1173","customer_name":"329","customer_id":"3129","model":"\"Unique Model1\"","dev_id":"1246614","cid":"8697DE5311D6","et":"B","logger":"DeviceResultLoggerContinuationWorker"},"source":"stdout","task":"wiblig","attrs":{"service":"service","stack":"produs"}}

Any help is greatly appreciated!!
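A common join-free pattern for this shape is to pull both event types in one base search (time modifiers can be scoped per clause) and merge them with stats on the shared customer ID. A hedged, untested sketch using the field names from the query above:

```
(attrs.stack=produs line.et=model (line.model="\"Unique Model1\"" OR line.model="\"Unique Model2\""))
    OR (STACK=produs et=CP earliest=-1d@d latest=@d)
| eval CUSTOMER_ID=coalesce('line.customer_id', CUSTOMER_ID)
| stats values(SERVICE_PROVIDER_PATH) as SERVICE_PROVIDER_PATH
        values(line.model) as model
        values(line.serial_num) as serial_num
        by CUSTOMER_ID
| where isnotnull(model)
```

The final where keeps only customers that actually appeared in et=model events, which roughly reproduces what the inner subsearch was filtering for; whether values() is the right aggregation (versus latest()) depends on whether a customer can legitimately have multiple models or serial numbers.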
I'm looking to move some buckets around (as a test for now) and I found this link: https://docs.splunk.com/Documentation/Splunk/8.0.5/Troubleshooting/CommandlinetoolsforusewithSupport The locktool seemed like something useful to my scenario as the description reads "If you were to write an external script to copy db buckets in and out of indexes you should acquire locks on the db colddb and thaweddb directories as you are modifying them and release the locks when you are done.". I tried to integrate it into my workflow, however, I'm not seeing the expected behaviour with it. Despite the lock being in place for db path, I found that sometimes during my copy operation the tsidx files get changed inside the buckets. I tried locking at various levels (db path, each bucket individually, just tsidx files or a combination of the 3) but I still see this behaviour every now and then. I was hoping to achieve this operation without bringing the splunkd service down on my peer nodes. Is there something I'm missing here? I would really appreciate some insight into this.
Hello, how can I create a text box that accepts user input only if the URL starts with https://somthinglike.com? If a user enters www.facebook.com, it will not be accepted or saved to the text box, because https is missing. I need to implement: 1. the URL condition; 2. allowing only 5 URLs. Below is the screenshot.
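Simple XML can't reject keystrokes outright, but a <change> handler can gate a second token on the entered value, so downstream panels only ever see a validated URL; enforcing a hard limit of 5 entries generally needs custom JavaScript. A hedged sketch (token names are made up; inside a condition's match expression, value refers to the input's current value):

```
<input type="text" token="url_raw">
  <label>Enter URL</label>
  <change>
    <condition match="match(value, &quot;^https://somthinglike\.com&quot;)">
      <set token="url_valid">$value$</set>
    </condition>
    <condition>
      <unset token="url_valid"></unset>
    </condition>
  </change>
</input>
```

Panels then reference $url_valid$ instead of $url_raw$, so an invalid entry leaves them blank rather than running a search with a bad URL.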
I know someone is going to say this has been answered many times, but... I have gone through every example I can find here in the Community and have yet to find something that works. I have a monitor that runs every 5 minutes to validate that a web URL is alive, and it records the total response time in milliseconds. My query returns two values, _time and total_time, for each 5-minute ping. I can use the "pencil" on the right to add color and ranges to the table display, but that does not flow into the visualization. My search:

index=xxx_website_monitoring sourcetype=web_ping https://my.website.com | stats values(total_time) by _time

The column chart works, but I need to make each column green, yellow, or red based on the value of total_time, like:

| eval red = if(total_time>=500,total_time,0) | eval yellow = if(total_time<500 AND total_time>=300,total_time,0) | eval green = if(total_time<300, total_time, 0)

However, none of the examples I have found work, because they expect Func(Expression) or if([bool expr], [expr], ...) or some other command combination, and not just a field. Here is the chart with the default blue, total_response on the Y and _time on the X.

Here is an example of what does not work:

index=xxx_website_monitoring sourcetype=web_ping https://my.website.com | stats values(total_time) by _time | eval red = if(total_time>=500,total_time,0) | eval yellow = if(total_time<500 AND total_time>=300,total_time,0) | eval green = if(total_time<300, total_time, 0)

<chart>
  <searchName>URLMonitor</searchName>
  <title>Clarifire URL Response Times</title>
  <option name="charting.chart">column</option>
  <option name="charting.fieldColors">{"red":0xFF0000,"yellow":0xFFFF00, "green":0x73A550}</option>
  <option name="charting.legend.placement">none</option>
  <option name="charting.axisLabelsX.majorLabelStyle.rotation">90</option>
</chart>

Error in 'eval' command: Fields cannot be assigned a boolean result. Instead, try if([bool expr], [expr], [expr]).
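One detail worth noting: stats values(total_time) by _time renames the output field to "values(total_time)", so the later evals see no total_time field at all. A hedged sketch that keeps the field name and builds one series per color (index and sourcetype taken from the question; latest() is an assumption about which value per 5-minute bucket is wanted):

```
index=xxx_website_monitoring sourcetype=web_ping https://my.website.com
| stats latest(total_time) as total_time by _time
| eval red    = if(total_time>=500, total_time, null())
| eval yellow = if(total_time<500 AND total_time>=300, total_time, null())
| eval green  = if(total_time<300, total_time, null())
| table _time green yellow red
```

Using null() instead of 0 keeps the inactive series from drawing zero-height columns, and the existing charting.fieldColors option then maps each series name to its color.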
Hello, I have events with id and status that are collected every day for all the ids. I would like to know the time (in days) taken for the status to change from one value to another for each id.

Sample data:

_time   ID    Status
t1      100   open
t1      101   In progress
t1      102   open
t1      103   closed
t2      100   open
t2      101   In progress
t2      102   In progress
t2      103   closed
t3      100   In progress
t3      101   closed
t3      102   In progress
t3      103   closed

Expected output:

ID     Time taken from open to In progress
100    t3-t1
102    t2-t1

Can you please help me on how to proceed? Kindly let me know in case of any further clarifications.
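One way to get the "open to In progress" duration is to find, per ID, the first time each status appears and subtract. A hedged sketch under the assumptions that _time is the event timestamp and the field names are ID and Status as in the sample:

```
... base search ...
| stats min(eval(if(Status="open", _time, null()))) as first_open
        min(eval(if(Status="In progress", _time, null()))) as first_inprogress
        by ID
| where isnotnull(first_open) AND isnotnull(first_inprogress) AND first_inprogress > first_open
| eval days_open_to_inprogress = round((first_inprogress - first_open) / 86400, 1)
| table ID days_open_to_inprogress
```

The where clause drops IDs that never transitioned (like 101 and 103 above, which were never seen as open) and IDs still waiting to move out of open.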
Hi all, I have a situation where there are servers from which we wish to get logs into Splunk. However, we cannot use the traditional Universal Forwarder because these servers are not allowed to connect through the firewall to the indexer. We can go from "inside" the firewalled network to these servers "outside" the network. So we could perform some sort of API calls if necessary. Is there a way to set up a Heavy Forwarder to poll the servers "outside" the network and pull in the logs? Does the UF support being a "server" instead of a "client", meaning UF would expose an API that could be queried by an HF? Any and all suggestions (aside from reconfiguring firewall) are welcome, thanks!
Hello, I'm still rather new at Splunk. I have 4 hosts, and I need to add together the values of 3 different graphs I obtained from Analytics. I'm not entirely sure how to do that. I tried putting them all together and opening search to see if I could work out how to add the values, but I'm pretty lost on this. Is there a way to do this? If so, how?
I would like to retrieve data from Epic Hyperspace logs via syslog. I know you can use the Epic APIs like FHIR, but I would like to use syslog instead.
Using a base query, I am able to create a table with various fields like this:

field1   field2
32       63.68
90       449.1
75       149.25
60       299.4
56       167.44
27       539.73
36       179.64

Now I need to compute various stats for each field in an efficient way, as the base query is quite heavy. I need output something like:

         average   90th percentile   95th percentile
field1   50        60.6              80.2
field2   150.2     190.3             210.2
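Since the base query is heavy, a single stats pass over all fields keeps it to one execution, and transpose then flips the single result row so each field/metric pair becomes a row. A hedged sketch (the wildcarded avg(*) form assumes only field1 and field2 survive the fields command; list fields explicitly otherwise):

```
<heavy base query>
| fields field1 field2
| stats avg(*) as avg_* perc90(*) as p90_* perc95(*) as p95_*
| transpose column_name=metric
```

If this feeds several dashboard panels, a Simple XML base search with post-process searches (or a scheduled saved search read back with loadjob) is another way to avoid re-running the heavy part.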
I have variables that I am trying to use in a search with a foreach loop. For example, I have customers: a, b, c, d, e

|makeresults | eval customer=a,b,c,d,e | foreach customer search index=main customer
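foreach iterates over field names, not field values, so it can't fan a list of customers out into separate searches. If the goal is just to match any of several values, the IN operator in the base search may be closer (the field name customer is an assumption):

```
index=main customer IN (a, b, c, d, e)
```

If the list has to be built dynamically, a subsearch that emits one customer value per result gets expanded into (customer="a" OR customer="b" OR ...) automatically:

```
index=main [ | makeresults
  | eval customer=split("a,b,c,d,e", ",")
  | mvexpand customer
  | fields customer ]
```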
Hi, I am unable to figure out a regex that matches the key-value pairs in my data; I think the transforms.conf REGEX and FORMAT settings would help here. I am posting a sample event:

SAEGW-SGW10,sdfsd-sdfafsadf:1,sdafsdf:3,asdfsdf:3,dsfgdsfgretewq:0

It is just FIELD_NAME:FIELD_VALUE pairs; only the first word of the event has no value associated with it. I have tried ([^\:]+)\:([^\,]+)\, but it is not 100% accurate. Looking for more accuracy. Thanks
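Two likely reasons the posted regex loses pairs: the trailing \, means the last pair (which has no trailing comma) never matches, and the first key capture can swallow the leading SAEGW-SGW10 token. A hedged transforms.conf sketch (stanza and sourcetype names are placeholders):

```
# transforms.conf
[colon_kv_pairs]
REGEX  = (?:^|,)([^,:]+):([^,]+)
FORMAT = $1::$2

# props.conf
[your:sourcetype]
REPORT-colon_kv = colon_kv_pairs
```

(?:^|,) anchors each pair to a comma boundary, [^,:]+ prevents the colon-free leading word from being absorbed into a key, and dropping the trailing-comma requirement lets the final pair match. My understanding is that REPORT-based extractions with a $1::$2 FORMAT are applied repeatedly across the event, so all pairs come out in one pass.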
I'm currently trying to use the results of my eval fields in my base search. For example, I would like my search to be index=rapid7 sourcetype=rapid7:nexpose:vuln vuln_age > 30 ratings=high. Is there a way I can do this? Here's the current code: index=rapid7 sourcetype=rapid7:nexpose:vuln | eval p_date=strptime(vulnerability_date_published, "%Y-%m-%d") | eval t_date=now() | eval vuln_age= round((t_date - p_date)/86400) | eval ratings=case(vulnerability_cvss3_score>0 AND vulnerability_cvss3_score<=3.9, "Low",vulnerability_cvss3_score>3.9 AND vulnerability_cvss3_score<=6.9, "Medium", vulnerability_cvss3_score>6.9 AND vulnerability_cvss3_score<=8.9, "High", vulnerability_cvss3_score > 8.9, "Critical")
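Calculated fields like vuln_age don't exist yet at base-search time, so they can't be filtered there; a where (or search) command after the evals does the same job. A hedged sketch reusing the evals from the question (note the case() statement produces capitalized values, so the comparison must use "High", not "high"):

```
index=rapid7 sourcetype=rapid7:nexpose:vuln
| eval p_date=strptime(vulnerability_date_published, "%Y-%m-%d")
| eval vuln_age=round((now() - p_date)/86400)
| eval ratings=case(vulnerability_cvss3_score>0 AND vulnerability_cvss3_score<=3.9, "Low",
                    vulnerability_cvss3_score>3.9 AND vulnerability_cvss3_score<=6.9, "Medium",
                    vulnerability_cvss3_score>6.9 AND vulnerability_cvss3_score<=8.9, "High",
                    vulnerability_cvss3_score>8.9, "Critical")
| where vuln_age > 30 AND ratings="High"
```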
Why do you not support syntax highlighting for SPL in the Code Sample widget here on Answers?  You do it in the main product, so you do have the logic. Looks really strange.
Hi All, I am working on Cisco Firepower field extraction. I have 2 different patterns, shown below.

1. For the one below, the action field is either "Allow" or "Block", and I am able to extract it.

Jul 16 2020 17:47:00 %FTD-1-430002: AccessControlRuleAction: Allow, SrcIP: 10.216.6.64, DstIP: 40.126.2.50, SrcPort: 57033, DstPort: 443, Protocol: tcp, IngressZone: in, EgressZone: out, ACPolicy: Azure-Policy-old, AccessControlRuleName: ASADmz_Internal_Access, Prefilter Policy: Allow_new, User: No Authentication Required, InitiatorPackets: 2, ResponderPackets: 1, InitiatorBytes: 120, ResponderBytes: 66, NAPPolicy: Unknown

2. How can I write a single extraction for the action field that also covers the log type below? I believe action should be "unknown" for this pattern.

Jul 16 17:47:00 UTC: %FTD-session-6-305011: Built dynamic TCP translation from Inside:10.216.6.64/57035 to Outside:10.216.3.10/57035

I am facing the same issue with ASA firewall logs.

Any direction would be highly appreciated.

Regards, Tejas
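Since the second pattern simply has no AccessControlRuleAction key, one approach is to extract the field when present and default it afterwards. A hedged inline sketch (the sourcetype is a placeholder):

```
sourcetype=cisco:ftd
| rex "AccessControlRuleAction:\s+(?<action>\w+)"
| fillnull value="unknown" action
```

To make this automatic at search time, the same regex can live in props.conf as an EXTRACT- setting, with a calculated field such as EVAL-action = coalesce(action, "unknown") supplying the default; I believe calculated fields are evaluated after extractions, but verify that ordering in your environment before relying on it.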
We had one search head that had to be rebuilt because of Java issues. Another search head lost its connection to the search head cluster due to a network switch outage. Since re-adding those servers to the search head cluster, we have a strange issue where custom apps can't search. They either show an "error fetching saved searches" message in the panel and then get stuck in an infinite loop of refreshing the browser tab, or they throw error 255 and can't search indexes, even though the same indexes search fine in the default Search & Reporting app. We use custom authorization.conf and authorize.conf configs in /opt/splunk/etc/system/local, and the affected servers have the latest configs copied from a healthy server. I'm working with Splunk support, but they are requesting browser HAR files, and the issue seems to be permissions-related. Has anyone else seen this issue and been able to resolve it?
Hello guys, I have a question regarding the search factor and replication factor. I currently have SF 2 and RF 2 set. Is there any way I can reduce them to SF 1 and RF 1, or SF 1 and RF 2? If so, what impact will I be facing? Any recommendations, please let me know.
Hello All, I have the Office 365 plugin and am looking to refine some alerts I have set up. The alert notifies me of attempted logins from outside the United States, except for a few users with specific user IDs, and excluding a specific domain. Everything works, with the exception of excluding the specific domain. Here is the search I have set up (sensitive information removed):

`m365_default_index` sourcetype="o365:management:activity" Workload=AzureActiveDirectory Operation=UserLoggedIn | iplocation ClientIP | where Country !="United States" AND NOT UserId="user1@test.com" AND NOT UserId="user2@test.com" AND NOT UserId="user3@test.com" AND NOT UserId="user1@test2.com" AND NOT UserId="user2@test2.com" AND NOT UserId="user3@test2.com" AND NOT UserId="user4@test2.com" AND NOT UserId="user5@test2.com" AND NOT !UserId="*@test3.com" | table _time UserId LogonError ClientIP Country | rename app AS App UserId AS User ExtendedProperties{}.Value AS Reason ClientIP AS "Client IP" | sort - _time, user

The parts that work are highlighted in green; the part that doesn't work is highlighted in red.

Side note: my goal with the item in red is to exclude the entire test3.com domain for any user. For example, user1@test3.com appears under the "User ID" field, and I just want anything at test3.com to be excluded. That is not working for some reason.

Thank you all!
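The red clause fails for two reasons: !UserId= is not valid eval syntax (the double negative cancels the NOT anyway), and where does not treat * as a wildcard in an = comparison. A hedged sketch using match() for the domain and in() for the user list (the addresses are the placeholders from the question):

```
`m365_default_index` sourcetype="o365:management:activity" Workload=AzureActiveDirectory Operation=UserLoggedIn
| iplocation ClientIP
| where Country!="United States"
    AND NOT in(UserId, "user1@test.com", "user2@test.com", "user3@test.com",
               "user1@test2.com", "user2@test2.com", "user3@test2.com",
               "user4@test2.com", "user5@test2.com")
    AND NOT match(UserId, "@test3\.com$")
| table _time UserId LogonError ClientIP Country
```

match() takes a regex, so the dot is escaped and the $ anchors the domain to the end of the user ID.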