All Topics

Can we populate logs from the primary index into a summary index? If so, how do we populate them?
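One common approach is a scheduled search that writes aggregated results into the summary index with the `collect` command; a minimal sketch, where the index, sourcetype, and field names are assumptions:

```spl
index=primary_index sourcetype=my_sourcetype earliest=-1h@h latest=@h
| stats count AS event_count BY host
| collect index=my_summary_index
```

Scheduling a search like this (for example, hourly) keeps the summary index populated incrementally.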
Hello Splunkers, I have installed the Splunk universal forwarder (ARMv6) on a Raspberry Pi (running Raspberry Pi OS 32-bit). I enabled the splunkd instance for the user splunk (not an admin user) with the following commands:

1. sudo chown -R splunk: /opt/splunkforwarder
2. sudo /opt/splunkforwarder/bin/splunk enable boot-start -systemd-managed 1 -user splunk -group splunk
3. sudo su splunk
4. sudo systemctl start SplunkForwarder

Unfortunately, I get the following error:

Job for SplunkForwarder.service failed because the control process exited with error code. See "systemctl status SplunkForwarder.service" and "journalctl -xe" for details.

SplunkForwarder.service:

#This unit file replaces the traditional start-up script for systemd
#configurations, and is used when enabling boot-start for Splunk on
#systemd-based Linux distributions.
[Unit]
Description=Systemd service file for Splunk, generated by 'splunk enable boot-start'
After=network.target

[Service]
Type=simple
Restart=always
ExecStart=/opt/splunkforwarder/bin/splunk _internal_launch_under_systemd
KillMode=mixed
KillSignal=SIGINT
TimeoutStopSec=360
LimitNOFILE=65536
SuccessExitStatus=51 52
RestartPreventExitStatus=51
RestartForceExitStatus=52
User=splunk
Group=splunk
Delegate=true
CPUShares=1024
MemoryLimit=1963114496
PermissionsStartOnly=true
ExecStartPost=/bin/bash -c "chown -R splunk:splunk /sys/fs/cgroup/cpu/system.slice/%n"
ExecStartPost=/bin/bash -c "chown -R splunk:splunk /sys/fs/cgroup/memory/system.slice/%n"

[Install]
WantedBy=multi-user.target

Does anyone have an idea how to enable the systemd service on a Raspberry Pi so that Splunk starts automatically even after a reboot?
Hello guys, I need help with a dropdown. I have a dropdown whose choices (hostnames) are based on a previous dropdown. Based on the selected hostname I need to show some panels, but I would like nothing to appear in the dashboard until the user chooses an option from the dropdown. However, my token takes the "$label$" value, so the depends attribute always shows what I'm trying to hide. Below is the code:

<input depends="$switch_type$" type="dropdown" searchWhenChanged="true" token="device_name">
  <label>Switch Name</label>
  <fieldForLabel>switch</fieldForLabel>
  <fieldForValue>hostname</fieldForValue>
  <search>
    <query>"my query "</query>
  </search>
  <change>
    <set token="switch">$label$</set>
  </change>
</input>

I would like the token $switch$ to be false or unset until the user selects something from the dropdown, so that I can use it in a depends condition. The values the token can take are not fixed, so I cannot use a custom condition. Thanks for the help.
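For reference, one way to keep a panel hidden until a selection is made is to gate it on the token set in the <change> handler, since depends stays false while that token has never been set; a minimal Simple XML sketch under that assumption (the panel content is a placeholder):

```xml
<input type="dropdown" token="device_name" searchWhenChanged="true">
  <label>Switch Name</label>
  <change>
    <set token="switch">$label$</set>
  </change>
</input>
<panel depends="$switch$">
  <html>
    <p>Shown only after a switch is selected.</p>
  </html>
</panel>
```

Because $switch$ is only set inside <change>, it has no value when the dashboard first loads, so the panel stays hidden until a choice is made.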
Hi Community, Splunk newbie here. I am trying to set up a demo of Aruba/HPE ClearPass to Splunk integration. I have configured ClearPass to send syslog (UDP 514) to Splunk for audit records on ClearPass. I have also installed the ClearPass app in Splunk, set up a data input, and can see syslog events hitting the Splunk server when using Wireshark. I have also set up a new index 'aruba' and can see that it is being populated frequently; however, I do not see any events in the Splunk dashboard for the ClearPass app. Any idea what could be causing this? Splunk is installed on a Windows 2019 server in my home lab that is also my lab AD domain controller (I only have one server license). Thanks
Hi at all, I installed the Check Point App for Splunk and found some strange behaviour. First, the app name is "Check Point App for Splunk" but the folder name is "TA-check-point-app-for-splunk"; that's odd: is it an app or a TA? But that isn't my problem: after installing the app I found that, for each event, some fields (date, time and rule_action) are duplicated with the same value; in other words, each event contains the same field twice with the same value (e.g. rule_action="allowed"). Has anyone encountered this problem? Ciao and thanks, Giuseppe
I'm using Splunk Enterprise 8.2.4 with a deployment server. I want to push all config/apps to my forwarders to prevent server admins from adding config/apps locally. To date, system admins have been creating their own inputs and dumping data into main, flooding the license usage, and I need to stop this happening. I only want approved configs/inputs to be pushed to the forwarders, so I have onboarded all my forwarders to the deployment server.

Q1: How do I prevent a user on the system from creating an input and pushing data to the indexers? Is there a config item to accept only inputs deployed by the deployment server?

On a test system I pushed an app I created that disabled the collection of [WinEventLog://Security]. I found, though, that the system had received the app but was still pushing those events. Running btool on the forwarder shows:

C:\Program Files\SplunkUniversalForwarder\etc\apps\SplunkUniversalForwarder\local\inputs.conf [WinEventLog://Security]
C:\Program Files\SplunkUniversalForwarder\etc\apps\SplunkUniversalForwarder\local\inputs.conf disabled = 0

So this seems to be config from when the forwarder was installed and the Windows inputs were selected in the forwarder MSI installation UI.

Q2: How do I override this with the deployment server, i.e. a locally configured input not necessarily in the apps folder?
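For reference, a deployed app disabling that input would carry an inputs.conf like the sketch below (the app name is an assumption). Whether it actually wins over the forwarder's existing SplunkUniversalForwarder/local/inputs.conf depends on Splunk's configuration file precedence rules, which is consistent with the behaviour described above:

```ini
# deployment-apps/disable_wineventlog_security/local/inputs.conf
[WinEventLog://Security]
disabled = 1
```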
Below is the query I am trying to use to get the result, but it gives an error for the eval statement. Could anyone please help me here?

index="application_name"
| spath logger | search logger=" logging.transcation.filter "
| spath event | search event="responseActivity"
| search requestURI IN (/login, /api/v1/user/profile, /api/v1/app/version, /api/v1/user/profile/pickey, /api/v1/home/reseller/*)
| eval requestURI=case((requestURI="/api/v1/home/reseller/*"), "/api/v1/homepage")
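One likely cause is that eval's case() compares strings literally, so the wildcard in "/api/v1/home/reseller/*" is never expanded; pattern matching in eval needs like() or match(). A hedged rewrite of the final pipe, keeping the field names from the question:

```spl
| eval requestURI=case(like(requestURI, "/api/v1/home/reseller/%"), "/api/v1/homepage", true(), requestURI)
```

like() uses % as its wildcard, and the true() branch keeps all other URIs unchanged.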
Hi, I need to mask some time picker items. I have succeeded in masking "Temps réel" (real time) with the code below, but I don't succeed in masking the items marked with a yellow cross. Could you help please?

/* ---------------------------------------- hide the "Temps réel" (real-time) choices */
div[data-test="real-time-column"] { display: none; }
div[data-test="other-column"] { display: none; }
div[data-test-panel-id="realTime"] { display: none; }
Hi, I use a search which is quite verbose because it queries system events based on different tokens. By default, all these tokens are set to "*":

index=toto sourcetype="system" site="$Site$" type=$type$ name="$name$"

I would like to know if there is a solution to reduce the disk quota and the number of events without playing with the time picker or the tokens. Thanks
Hello, how can I ingest only the logs starting with a specific word?

Sample log entries:
SPLUNKD-123456: Hello World
Hello World123
Hello World456
Hello World789
SPLUNKD-0000: Hello World
SPLUNKD-0012: Hello World
Hello World0123
Hello World0456

Logs that should be ingested into Splunk:
SPLUNKD-123456: Hello World
SPLUNKD-0000: Hello World
SPLUNKD-0012: Hello World

Thanks!
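One standard way to keep only matching events at index time is a nullQueue whitelist in props.conf and transforms.conf; a minimal sketch, assuming a hypothetical sourcetype my_sourcetype:

```ini
# props.conf
[my_sourcetype]
TRANSFORMS-filter = drop_all, keep_splunkd

# transforms.conf
[drop_all]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[keep_splunkd]
REGEX = ^SPLUNKD-
DEST_KEY = queue
FORMAT = indexQueue
```

The order matters: drop_all sends every event to the nullQueue first, then keep_splunkd re-routes the events beginning with SPLUNKD- back to the indexQueue.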
We recently migrated to QRadar, so we have decided to decommission Splunk. Before decommissioning, we need to stop all users from logging in to Splunk. How can I do that for all users? Could you please suggest what actions should be taken?
Hi, why doesn't my sort on _time work, please? What is strange is that when I click directly on the field, the sort doesn't work either.

| eval _time = strftime(_time, "%d-%m-%y %H:%M:%S")
| sort _time
| stats last(host) as host, last(os) as OS by _time
| rename host as Host, _time as Date
| table Date, Host, OS
| sort - Date
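A likely cause: strftime replaces the numeric _time with a string in "%d-%m-%y" format, so sort compares text (day first) instead of time. A sketch that sorts on the epoch value and formats only at the end:

```spl
| stats last(host) AS Host, last(os) AS OS BY _time
| sort - _time
| eval Date=strftime(_time, "%d-%m-%y %H:%M:%S")
| table Date, Host, OS
```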
Suppose I had data like below in field="_raw":

afadfadfadf afadsfagafg adfafafa string1 ......... afjal;dkfhao ilhaf ajkf;haldghag;lakg akuhfajkdhfalkfha; auhaghkajdgakg jkalfagafg string2......... afdasdgadfas **bleep**adgafgafgaf agfgertfergreg

and I want to extract the data between string1 and string2.
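One way to pull out the text between the two markers is rex with a non-greedy capture; the field name extracted is an assumption:

```spl
| rex field=_raw "string1(?<extracted>[\s\S]*?)string2"
```

The [\s\S] class matches across newlines, and the lazy quantifier stops at the first occurrence of string2.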
 Ingestion Latency Root Cause(s): Events from tracker.log have not been seen for the last 74130 seconds, which is more than the red threshold (210 seconds). This typically occurs when indexing or forwarding are falling behind or are blocked.
When I run this command:

sanjay@ubuntu:~/opt/splunk$ cd var

I get this error:

bash: cd: var: Permission denied

and when I try to access opt as root:

root@ubuntu:~# cd opt

it shows the error:

bash: cd: opt: No such file or directory

So how do I access var?
help
Hello Splunkers, has anyone got Ping Identity (SaaS) data from the portal into on-prem Splunk? If you have any instructions, please share them with me. https://docs.pingidentity.com/bundle/pingoneforenterprise/page/chq1564020494373-2.html This is what I found on the Ping website, but since Ping is such a widely used product, I feel they should have developed an app or TA for data collection; the existing explanation is really poor.
After running this command:

sudo ./splunk add forward-server 10.0.0.218 :9997
[sudo] password for smsplunkforwarder:

I get this error:

sudo: ./splunk: command not found
Hi, what is the best way to edit (overwrite) values in the savedsearches.conf file in the local directory on SHC members using the deployer?
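For reference, configuration distributed by the deployer lives under etc/shcluster/apps and is pushed with apply shcluster-bundle; a minimal sketch, where the app name, stanza, target host, and credentials are all assumptions. Note that the deployer delivers pushed settings into the app's default layer on the members, so values already present in local on the members still take precedence:

```ini
# On the deployer:
# $SPLUNK_HOME/etc/shcluster/apps/my_app/default/savedsearches.conf
[My Scheduled Search]
cron_schedule = */15 * * * *
enableSched = 1
```

Then push the bundle from the deployer, for example:
$SPLUNK_HOME/bin/splunk apply shcluster-bundle -target https://sh1.example.com:8089 -auth admin:changeme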