All Topics

We have Lookup Editor 3.4.6 on a Splunk 7.2.6 distributed architecture. When editing a KV store lookup by typing manually, the editor reflects the change live, but when we copy and paste values into the editor, the result only shows after a refresh. Until we refresh, it just shows empty cells. Any guidance, please? Thanks.
Hello, I've got an application that generates an archive file with nested archive files in it. Here is a sample of my file:

AppArchive.tar.gz
|_InsideArchive1.tar.gz
  |_InsideInsideArchive1.tar.gz
    |_filetoindex1.csv
|_InsideArchive2.tar.gz
  |_InsideInsideArchive2.tar.gz
    |_filetoindex2.csv

When I upload my archive file to Splunk via the web UI, Splunk doesn't seem to find and extract all the files. I would like to replace the default .tar.gz handling with my own unarchive_cmd, but my app's props.conf never seems to be read. Is there a way to override the system unarchive_cmd (/opt/splunk/etc/system/default/props.conf) by changing only my app's configuration? Currently I'm trying this in my app configuration, but it doesn't work and my script (myscript.py) is never called:

props.conf:
[source::...myapp.tar.gz]
invalid_cause = archive
unarchiv_cmd = /opt/splunk/etc/apps/myapp/bin/myscript.py
NO_BINARY_CHECK = true
sourcetype = myapparchive
priority = 10002

Thank you for your help!
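Before overriding anything system-wide, one detail in the stanza above is worth checking: the setting name is unarchive_cmd, but the quoted props.conf spells it unarchiv_cmd, and Splunk silently ignores unknown attribute names. A corrected sketch, using the same paths as the question (an app-level props.conf stanza takes precedence over system defaults for a matching source, so no edit under etc/system should be needed):

```ini
# /opt/splunk/etc/apps/myapp/local/props.conf
[source::...myapp.tar.gz]
invalid_cause = archive
# note the spelling: unarchive_cmd, not unarchiv_cmd
unarchive_cmd = /opt/splunk/etc/apps/myapp/bin/myscript.py
NO_BINARY_CHECK = true
sourcetype = myapparchive
priority = 10002
```

Also worth testing separately: unarchive_cmd applies when a file is consumed by a file input; an upload through the web UI may take a different code path.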
Hi all, I have made a search that gives me every user whose password expires in less than 10 days. Is there a way to send a daily email to each of those users instead of to the IT department? That way I can fully automate this process and the users themselves are notified when their password is about to expire. Thank you, Sasquatchatmars
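One pattern that can do this without scripting (a sketch; it assumes the search returns one row per user with the address in an email field, and the field names here are placeholders): schedule the alert in non-digest mode so the email action fires once per result row, and use the $result.&lt;field&gt;$ token in the To line:

```ini
# savedsearches.conf sketch -- field names "email", "user", "days_left"
# are assumptions about what the search returns
[Password expiry notification]
enableSched = 1
cron_schedule = 0 8 * * *
counttype = number of events
relation = greater than
quantity = 0
# 0 = one email per result row instead of a single digest email
alert.digest_mode = 0
action.email = 1
action.email.to = $result.email$
action.email.subject = Your password expires in $result.days_left$ days
```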
Hello, I am following the doc below to install SSL certificates on Splunk Web: https://docs.splunk.com/Documentation/Splunk/7.2.8/Security/SecureSplunkWebusingasignedcertificate

With these web.conf settings, the web console comes up:

[settings]
enableSplunkWebSSL = 0

With these web.conf settings, the web console does not come up:

[settings]
startwebserver = 1
enableSplunkWebSSL = true
privKeyPath = D:\Splunk\etc\auth\MyOrg\privatekey.key
serverCert = D:\Splunk\etc\auth\MyOrg\xxxxxxxxx.pem

I only have the certificate in .pem format, so I just renamed a copy of the .pem certificate to privatekey.key. Is this the right way to get the private key, and will it work for installing the SSL certs?
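For the failing case, a commonly tripped-over point: privKeyPath must point to the actual private key that was generated alongside the certificate request; a copy of the certificate renamed to .key is not a private key, and Splunk Web will fail to start. A sketch (the certificate filename here is a placeholder for the redacted one):

```ini
[settings]
startwebserver = 1
enableSplunkWebSSL = true
# this file must contain a "BEGIN PRIVATE KEY" / "BEGIN RSA PRIVATE KEY"
# block, not the certificate itself
privKeyPath = D:\Splunk\etc\auth\MyOrg\privatekey.key
serverCert = D:\Splunk\etc\auth\MyOrg\mycert.pem
```

Opening privatekey.key in a text editor should show a PRIVATE KEY block; web_service.log under var/log/splunk usually shows the exact SSL error when the console will not start.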
Hi, I am not able to log in to my AppD Lite account. It gives the error "Error Checking Security provider", and my applications are also not able to send metrics to the AppD Lite account, failing with a not-accessible error. Why is this happening?
I have CSV data in the following format, and I have written props and transforms to extract the fields. Somehow the "Summary|vSphere Tag" field values are not getting extracted, even though I have written a transform for them. Below are my configuration files (inputs, props, and transforms).

CSV Data:

"Name","Summary|vSphere Tag"
"DC4VPWSAM","[<Application_category-Software Asset Management>, <Sub_class-Facilities>, <Department-Infrastructure & Operations>, <Primary_System_Owner-Pankaj Gadhari>, <Section-SM Service Support_Sec>, <Organisation-Technology & Infrastructure>, <Division-I&O Service Management>, <Application_Name-Manager Suite>, <Unit-SM Service Support>, <Class-Line of Business>]"
"DC1VPWSAM","[<Application_category-Software Asset Management>, <Sub_class-Facilities>, <Department-Infrastructure & Operations>, <Primary_System_Owner-Pankaj Gadhari>, <Section-SM Service Support_Sec>, <Organisation-Technology & Infrastructure>, <Division-I&O Service Management>, <Application_Name-Manager Suite>, <Unit-SM Service Support>, <Class-Line of Business>]"
"DC3VPWSAM","[<Application_category-Software Asset Management>, <Sub_class-Facilities>, <Department-Infrastructure & Operations>, <Primary_System_Owner-Pankaj Gadhari>, <Section-SM Service Support_Sec>, <Organisation-Technology & Infrastructure>, <Division-I&O Service Management>, <Application_Name-Manager Suite>, <Unit-SM Service Support>, <Class-Line of Business>]"
"DCVPWSCCM","[<Primary_System_Owner-Pankaj Gadhari>]"
"witsql-esx","none"

inputs.conf:
[monitor://C:\VMware-Tags\tagsplit\*.csv]
disabled = false
index = vmware
sourcetype = vmware-tags-csv
crcSalt = <SOURCE>

props.conf:
[vmware-tags-csv]
DATETIME_CONFIG = CURRENT
INDEXED_EXTRACTIONS = csv
KV_MODE = none
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Structured
disabled = false
pulldown_type = true
REPORT-vmtags = myplaintransform1
EXTRACT-vmname = (?<vmname>[A-Za-z0-9]+),

transforms.conf:
[myplaintransform1]
REGEX=(?<vmname>[A-Za-z0-9]+),\<(.*?)-(.*?)\>
FORMAT=$1::$2

Somehow the transform is not working and the fields are not getting extracted. I want to extract key-value pairs from the "Summary|vSphere Tag" field so that search shows vmname, Application_category, Primary_System_Owner, and so on. Please help resolve the issue.
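A possible reason the transform does not fire: a REPORT transform runs against _raw by default, but with INDEXED_EXTRACTIONS = csv the tag text already lives in the extracted "Summary|vSphere Tag" field, and mixing a named group with unnamed groups plus FORMAT can conflict. A sketch to try (the SOURCE_KEY value and MV_ADD behavior here are assumptions to verify, not a confirmed fix):

```ini
# transforms.conf -- run the regex over the extracted field, not _raw,
# and let one event emit many key::value pairs
[myplaintransform1]
SOURCE_KEY = Summary|vSphere Tag
REGEX = <([^->]+)-([^>]+)>
FORMAT = $1::$2
MV_ADD = true
```

The existing EXTRACT-vmname in props.conf can keep handling the vmname field separately.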
Hi, I have data like this, which shows the hop servers between two destinations and certain metrics for each hop:

source,final_destination,hop_number,hop_host,loss_perc
1.1.1.1,4.4.4.4,1,2.2.2.2,3
1.1.1.1,4.4.4.4,2,3.3.3.3,2

I want to show a visualization like this:

1.1.1.1 (source) --- 3 (loss perc) ---> 2.2.2.2 --- 2 (loss perc) ---> 3.3.3.3 ---> 4.4.4.4 (final destination)

@niketn
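A sketch of building that path string in SPL (the index/sourcetype and the exact arrow formatting are placeholders; a Sankey diagram custom visualization would be another way to render hops):

```
index=network sourcetype=hops
| sort 0 source final_destination hop_number
| eval hop_label=" --- ".loss_perc." (loss perc) ---> ".hop_host
| stats list(hop_label) as hops by source final_destination
| eval path=source." (source)".mvjoin(hops,"")." ---> ".final_destination." (final destination)"
| table path
```

With the two sample rows above, this yields one row per source/destination pair with the hops concatenated in hop_number order.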
Hi everyone, below are my logs:

2020-10-12 23:52:22,228 INFO [Web Server-25646] o.a.n.w.s.Filter Attempting request for (<drath20><CN=50088.phx.aexp.com, OU=Middleware Utilities, L=Phoenix, ST=Arizona, C=US>) GET https://abcdefghk50088.phx.exp.com:9091/api/flow/process-groups/ef451556-016d-1000-0000-00005025535d (source ip)
2020-10-12 23:52:22,228 INFO [Web Server-25646] o.a.n.w.s.Filter Attempting request for (<drath20><CN=50088.phx.aexp.com, OU=Middleware Utilities, L=Phoenix, ST=Arizona, C=US>) GET https://abcdefghk50088.phx.exp.com:9091/api/flow/process-groups/22b93621-b347-1f81-964a-a87c2019828c (source ip:)

I want to extract the URL as a field named Request URL. Can someone guide me on how to extract it? Thanks in advance.
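Assuming the target is the URL that follows GET (the original post's highlighting does not survive here), a search-time extraction sketch:

```
... | rex field=_raw "GET\s+(?<Request_URL>https?://\S+)"
| table _time Request_URL
```

For a permanent field, the same regex can go into an EXTRACT- setting in props.conf for this sourcetype.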
We need to onboard a honeypot in our Splunk ES instance. Can you please help with how we can proceed? I can also see there is a Canary app and add-on: https://help.canary.tools/hc/en-gb/articles/360002432418-Installing-the-Canary-Splunk-App-and-Add-on Is this a fine approach? Please advise.
Hi, how do I get the affected tier name in the HTTP email template for an Anomaly Detection alert? It would be great if someone could help me with it. I tried ${latestEvent.tier.name} but it is not working.

^ Edited by @Ryan.Paredez to improve the title.
Hi Team, I want to upgrade Splunk Enterprise from version 8.0.1 to 8.0.6 in a Linux environment:

1. Indexer upgrade
2. Search head upgrade
3. Heavy forwarder upgrade

Please share the step-by-step process with commands, from backing up the current version's configurations to backing up the indexers' buckets as well.

Thanks, Amarbabu
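The per-instance mechanics look roughly like this (a sketch, not a full runbook; paths assume a default /opt/splunk tarball install, the tarball name is a placeholder, and the order in which you upgrade the tiers should follow the upgrade-order topic in the docs for your architecture):

```
# stop Splunk and back up the configuration
/opt/splunk/bin/splunk stop
tar -czf /backup/splunk-etc-$(hostname)-$(date +%F).tar.gz /opt/splunk/etc

# index buckets live under /opt/splunk/var/lib/splunk by default;
# back them up per your storage strategy before upgrading indexers

# untar the new release over the existing install, then start
tar -xzf splunk-8.0.6-<build>-Linux-x86_64.tgz -C /opt
/opt/splunk/bin/splunk start --accept-license --answer-yes
```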
How do I execute macros via the REST API? Example:

curl -ku user:pass https://<url> -d search="`macro name` | table data1 data2"
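One gotcha worth noting (a sketch; the host, credentials, and macro name are placeholders): inside double quotes the shell treats backticks as command substitution, so single-quote the search string, and the string sent to the search jobs endpoint must begin with a generating command such as search unless the macro itself expands to one:

```
curl -ku user:pass https://localhost:8089/services/search/jobs \
  --data-urlencode 'search=search `macro_name` | table data1 data2'
```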
Can we install Splunk Enterprise as the root user using the tar method?
Hi all, I'm in trouble because the value of the fields tag doesn't change when the dropdown changes.

<input type="dropdown" token="tk_report">
  <label>month</label>
  <fieldForLabel>reporting_month</fieldForLabel>
  <fieldForValue>reporting_month</fieldForValue>
  <search>
    <query>| makeresults | eval reporting_month=mvrange(relative_time(now(),"-5mon@mon"),now(),"1mon") | table reporting_month | mvexpand reporting_month | eval reporting_month=strftime(reporting_month,"%Y-%m") | sort - reporting_month</query>
    <earliest>@d</earliest>
    <latest>now</latest>
  </search>
  <change>
    <eval token="reporting_month1">strftime(relative_time(strptime($value$."-01","%Y-%m-%d"),"-1mon"),"%Y-%m")</eval>
    <eval token="reporting_month2">strftime(relative_time(strptime($value$."-01","%Y-%m-%d"),"-2mon"),"%Y-%m")</eval>
  </change>
  <selectFirstChoice>true</selectFirstChoice>
</input>

These are the table options:

<option name="count">10</option>
<option name="drilldown">row</option>
<option name="refresh.display">progressbar</option>
<fields["$reporting_month2$","$reporting_month1$","$tk_report$"]</fields>

I think the first token's value remains. I have no idea how to solve this problem. Does anyone know how? Thank you for helping.
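One detail worth checking first (a sketch, not a confirmed fix for the token-refresh behavior): the fields element as pasted is missing its closing angle bracket after "fields", so the dashboard may be ignoring it or failing to parse. The well-formed Simple XML shape is:

```xml
<table>
  <option name="count">10</option>
  <option name="drilldown">row</option>
  <option name="refresh.display">progressbar</option>
  <fields>["$reporting_month2$","$reporting_month1$","$tk_report$"]</fields>
</table>
```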
Hi all, can you please suggest whether we can leverage any Splunk app for Google Firebase? I wasn't able to find any relevant apps on Splunkbase. Can you please advise which Splunk apps you are using within your organization to ingest Google Firebase data? Please let me know in case of any queries. Thanks and regards, Arjit Goswami
Hi, I want to add an external link to my dashboard description. I saw a similar question, but can't find the correct answer: https://community.splunk.com/t5/Dashboards-Visualizations/Add-link-to-external-site-in-dashboard-description/td-p/252640 I tried adding an <html> tag inside the dashboard's <description>, but I got a warning that <description> is not supposed to have a child tag. EDIT (so that the accepted answer directly correlates to this question): I want the link in the dashboard description because I want it to appear under the title and above the inputs. If I just add an <html> tag under <panel>, it appears after the inputs.
I am new to Splunk and need some guidance. I have installed RWI and the required add-ons. I would like to pull the active VPN sessions and the number of VPN logins from my SonicWall firewall, but I'm not sure where to start so that I can see this information in the RWI dashboard. Regards, Z_Kat
Evening Splunk community, my organization practices blue/green data centers and requires us to switch production data centers every quarter. In my environment I manage two standalone search heads, one in each data center, separated by region. I'm trying to determine a clean solution for keeping user knowledge artifacts (saved searches, reports, alerts, etc.) synced across the two search heads without having to implement search head cluster replication. Does anyone have tips, advice, or general best practices for keeping knowledge objects synced between two or more standalone search heads? I've read a few forum posts that cover this topic and have detailed the solutions I'm brainstorming below, but wanted to get everyone's opinion before I start down the wrong path.

For starters, I believe the knowledge artifacts on a search head reside in the following locations (not including the saved searches within /etc/apps):

$SPLUNK_HOME/etc/system/local/authentication.conf
$SPLUNK_HOME/etc/system/local/authorize.conf
$SPLUNK_HOME/etc/users/*
Everything in local folders and local.meta files under $SPLUNK_HOME/etc/apps

Options I'm considering:

- Take a complete backup of /opt/splunk and restore it to the standby search head as needed.
- Implement an rsync script on each search head. I'd create an environment variable that indicates whether the search head is currently the active one for production workloads; if yes, push that search head's configurations and changes to the standby.
- Implement search head clustering. While I believe this is the intended solution for what I need to achieve, I'd like to avoid it if possible; I'm our only Splunk administrator, and from what I've been told there's a fair bit more management overhead to tackle.
- Implement a CI/CD pipeline that checks user knowledge artifacts into git as they change and then pushes them out to the environment as needed. I think this would be the ultimate goal for me, as we're working to eliminate as much toil and as many landmines in our environment as possible. Are there any good blog posts or guides on managing and deploying Splunk via CI/CD?
- Create an EFS mount on both search heads and point the directories that contain Splunk user data at the EFS mount for both search heads to share.

References:
- https://community.splunk.com/t5/Deployment-Architecture/What-is-the-best-practice-to-backup-Splunk/m-p/201159
- https://community.splunk.com/t5/Splunk-Enterprise-Security/What-is-considered-a-full-back-up-of-Search-Head/m-p/351484
- https://lantern.splunk.com/hc/en-us/articles/360043720673-Managing-backup-and-restore-processes-
- https://community.splunk.com/t5/All-Apps-and-Add-ons/Has-anyone-attempted-to-run-Splunk-with-indexes-stored-on-AWS/td-p/274270
- https://community.splunk.com/t5/Splunk-Search/How-to-get-all-users-searches-and-lookups-synchronized-to-both/td-p/394614
- https://community.splunk.com/t5/Deployment-Architecture/I-have-a-Splunk-Enterprise-Search-Head-in-a-Production-and-a/td-p/507154
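For the rsync option, a minimal one-way push sketch (the standby hostname, the flag file, and the paths are assumptions for illustration; test outside production first):

```
#!/bin/bash
# Push user knowledge objects from the active search head to the standby.
# /opt/splunk/etc/ACTIVE_FLAG is a hypothetical marker identifying this
# host as the current production search head.
if [ -f /opt/splunk/etc/ACTIVE_FLAG ]; then
    rsync -az --delete /opt/splunk/etc/users/ splunk@standby-sh:/opt/splunk/etc/users/
    rsync -az /opt/splunk/etc/apps/ splunk@standby-sh:/opt/splunk/etc/apps/
fi
# The standby may need a refresh or restart to pick up the new objects.
```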
Does Splunk Cloud 8.0.x.x support two-factor authentication? If so, what are the options? Is there a way to access Splunk Cloud through the Azure portal? Thank you!