All Topics
I have 5 separate endpoints for our Okta environment that I'm pulling into Splunk. The data is all event-driven, so if I'm trying to map user, group, and application data together and the groups or applications were created over a year ago, the search won't find the data unless I move the search window back, causing long searches. What I would like to do is create lookup tables for each of those endpoints, so I only have to run one long query, one time, for those endpoints, and then append any group, application, and user created each day via a saved search. Is this the right strategy, and could someone help me with how you would do that? I did see a few articles on appending data to a table, but they didn't seem to meet my needs for this scenario. Thanks, Joel
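A sketch of the pattern being described, with hypothetical names throughout (sourcetype okta:group, fields group_id/group_name, lookup okta_groups.csv). A one-time backfill search builds the lookup over the long window:

```
index=okta sourcetype="okta:group" earliest=-2y
| stats latest(_time) as last_seen by group_id, group_name
| outputlookup okta_groups.csv
```

Then a daily saved search merges the last 24 hours into it, deduplicating on the key fields:

```
index=okta sourcetype="okta:group" earliest=-24h
| stats latest(_time) as last_seen by group_id, group_name
| inputlookup append=true okta_groups.csv
| stats max(last_seen) as last_seen by group_id, group_name
| outputlookup okta_groups.csv
```

The same pair of searches would be repeated per endpoint (users, apps, and so on).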
Hello everyone. First of all, this was working fine using images 8.x. Here is my compose for 8.2:

```yaml
version: '3.6'
services:
  splunkuf82:
    tty: true
    image: splunk/universalforwarder:8.2
    hostname: universalforwarder82
    container_name: universalforwarder82
    environment:
      SPLUNK_START_ARGS: "--accept-license --answer-yes --no-prompt"
      SPLUNK_USER: root
      SPLUNK_GROUP: root
      SPLUNK_PASSWORD: "adminadmin"
```

Here are some commands to check if it is running:

```
jpla@rd:~/pd/rd/docker/rundeck/rd.universalforwarder82$ docker compose down
jpla@rd:~/pd/rd/docker/rundeck/rd.universalforwarder82$ docker compose up -d
[+] Running 2/2
 ⠿ Network rduniversalforwarder82_default  Created  0.1s
 ⠿ Container universalforwarder82          Started  0.4s
jpla@rd:~/pd/rd/docker/rundeck/rd.universalforwarder82$ docker exec -it universalforwarder82 bash
[ansible@universalforwarder82 splunkforwarder]$ cd bin
[ansible@universalforwarder82 bin]$ sudo ./splunk status
splunkd is running (PID: 1125).
splunk helpers are running (PIDs: 1126).
```

Here is my compose for 9.0.3:

```yaml
version: '3.6'
services:
  splunkuf903:
    tty: true
    image: splunk/universalforwarder:9.0.3
    hostname: universalforwarder903
    container_name: universalforwarder903
    environment:
      SPLUNK_START_ARGS: "--accept-license --answer-yes --no-prompt"
      SPLUNK_USER: root
      SPLUNK_GROUP: root
      SPLUNK_PASSWORD: "adminadmin"
```

Here are the same commands to check if it is running:

```
jpla@rd:~/pd/rd/docker/rundeck/rd.universalforwarder903$ docker compose down
jpla@rd:~/pd/rd/docker/rundeck/rd.universalforwarder903$ docker compose up -d
[+] Running 2/2
 ⠿ Network rduniversalforwarder903_default  Created  0.1s
 ⠿ Container universalforwarder903          Started  0.5s
jpla@rd:~/pd/rd/docker/rundeck/rd.universalforwarder903$ docker exec -it universalforwarder903 bash
[ansible@universalforwarder903 splunkforwarder]$ cd bin
[ansible@universalforwarder903 bin]$ sudo ./splunk status
Warning: Attempting to revert the SPLUNK_HOME ownership
Warning: Executing "chown -R root /opt/splunkforwarder"
Error calling execve(): No such file or directory
Error launching command: No such file or directory
execvp: No such file or directory
Do you agree with this license? [y/n]: y
This appears to be an upgrade of Splunk.
--------------------------------------------------------------------------------
Splunk has detected an older version of Splunk installed on this machine. To finish
upgrading to the new version, Splunk's installer will automatically update and alter
your current configuration files. Deprecated configuration files will be renamed with
a .deprecated extension.
You can choose to preview the changes that will be made to your configuration files
before proceeding with the migration and upgrade:
If you want to migrate and upgrade without previewing the changes that will be made
to your existing configuration files, choose 'y'.
If you want to see what changes will be made before you proceed with the upgrade,
choose 'n'.
Perform migration and upgrade without previewing configuration changes? [y/n] y
-- Migration information is being logged to '/opt/splunkforwarder/var/log/splunk/migration.log.2023-01-31.23-16-18' --
Migrating to:
VERSION=9.0.3
BUILD=dd0128b1f8cd
PRODUCT=splunk
PLATFORM=Linux-x86_64
Error calling execve(): No such file or directory
Error launching command: Invalid argument
^C
[ansible@universalforwarder903 bin]$ sudo ./splunk status
Warning: Attempting to revert the SPLUNK_HOME ownership
Warning: Executing "chown -R root /opt/splunkforwarder"
Error calling execve(): No such file or directory
Error launching command: No such file or directory
execvp: No such file or directory
Do you agree with this license? [y/n]:
```

As you can see, on 9.0.3 it asks for the license again, and again after saying yes the first time. This behaviour occurs on Docker version 20.10.23 and also on Minikube v1.29.0, on Linux Mint 21.1. I added tty: true per this recommendation, but it didn't work for me. Could anybody please confirm the issue? Thanks!
Managing ever-growing data volumes in the age of cloud, microservices, and continuous delivery requires a modern observability approach. It's time to rethink your approach to SAP monitoring. In this AppDynamics blog post, regular Community author and AppDynamics Technical Product Marketing Manager Aaron Schifman discusses the challenges that today's distributed tech stack adds to traditional SAP monitoring, with real-world examples of how AppDynamics has helped Carhartt, Cepheid, and 3M. For an informed view of the essentials of effective modern observability solutions, see Aaron's post here.
How can I combine the results of multiple fields into a single column with a common name? For example Test1, Test2, Test3 and so on up to Test20, with the common word "Test" in all the field names (either using foreach or any other solution)?

| Test1 | Test2 | Test3 | Test4 | Test5 | Test6 | Test7 | Test8 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | 6 | 11 | 16 | 21 | 26 | 31 | 36 |
| 2 | 7 | 12 | 17 | 22 | 27 | 32 | 37 |
| 3 | 8 | 13 | 18 | 23 | 28 | 33 | 38 |
| 4 | 9 | 14 | 19 | 24 | 29 | 34 | 39 |
| 5 | 10 | 15 | 20 | 25 | 30 | 35 | 40 |

Result (a single column):

| Test21 |
| --- |
| 1 |
| 2 |
| 3 |
| 4 |
| 5 |
| 6 |
| 7 |
| 8 |
| 9 |

and so on.

Any help would be appreciated.
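A sketch of one foreach-based approach; the output field name Test21 and the final numeric sort are assumptions based on the sample above (the sort reproduces the column-ordered result only because the sample values happen to be sequential):

```
| eval combined=null()
| foreach Test* [ eval combined=mvappend(combined, '<<FIELD>>') ]
| mvexpand combined
| rename combined AS Test21
| table Test21
| sort 0 num(Test21)
```

foreach iterates over every field matching Test*, mvappend collects each row's values into one multivalue field, and mvexpand turns that into one row per value.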
I'm fairly new to Splunk and I am having some trouble setting up a data input from my universal forwarder. I currently have it configured to pull Windows event files from a specific folder on the machine; the files are moved there manually. However, it only seems to pull random files, and 99% aren't getting indexed. I've tried specifying the file type to see if that was an issue, with no luck. I've also tried adding crcSalt = <string> to the inputs.conf file, no luck there either. Trying to see if I'm missing something, as I've gone through many other posts for similar issues to no avail. Any ideas are greatly appreciated.
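For reference, this is the shape of the monitor stanza in question; the path, whitelist, and index are hypothetical placeholders, not a known fix:

```ini
# inputs.conf on the universal forwarder (hypothetical path/index)
[monitor://C:\EventLogExports\]
whitelist = \.evtx$
index = main
disabled = false
# Include the full source path in the CRC so files whose first bytes
# look identical are still treated as distinct files:
crcSalt = <SOURCE>
```

One note: the literal string <SOURCE> is the documented special value for crcSalt; a fixed string like crcSalt = mysalt changes the checksum but does not help distinguish files with identical beginnings from each other.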
So I have a vSphere environment. Our indexer machines are running RHEL 8.7 and I installed Splunk Enterprise on all of them. We named them indx01, indx02, and indx03 (real creative, yep). With some googling, we turned off distributed search and disabled the firewall just to be sure. We initially had success adding peers to the index cluster master, but they were throwing an error that said "unable to connect to the cluster master" and so on about the replication factor. So then we disabled indexer clustering on all of them, and now I can't get any of them to be added.

Distributed search turned off: check
Firewall disabled: check
On the same domain and DNS: check

I am attaching an image of a warning, but I don't know what, if anything, it has to do with the problem.
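For reference, these are the CLI commands one would typically use to rebuild the cluster from scratch after disabling clustering; the secret, ports, and hostnames are placeholders, and on pre-9.x versions the options are -mode master and -master_uri instead:

```
# On the manager (cluster master) node:
splunk edit cluster-config -mode manager -replication_factor 2 -search_factor 2 -secret MySecret
splunk restart

# On each peer (indx01/02/03):
splunk edit cluster-config -mode peer -manager_uri https://indx01.example.com:8089 -replication_port 9887 -secret MySecret
splunk restart
```

If the secret does not match on every node, peers fail to join with connection errors similar to the one described.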
I've been working on a dashboard/query that takes two date/time values (UTC) from Zscaler ZPA logs and converts them to the local timezone (PST). Some entries have a blank Time_Disconnected value and I do not know why.

Original (Zscaler):

```
TimestampAuthentication=2023-01-31T16:51:09.000Z
TimestampUnAuthentication=2023-01-31T17:19:05.169Z
```

Query:

```
| rename TimestampAuthentication AS Time_Auth, TimestampUnAuthentication AS Time_Disconn
| eval Time_Authenticated=strftime(strptime(Time_Auth, "%Y-%m-%dT%H:%M:%S.%z"), "%Y-%m-%d %H:%M:%S")
| eval Time_Disconnected=strftime(strptime(Time_Disconn, "%Y-%m-%dT%H:%M:%S.%z"), "%Y-%m-%d %H:%M:%S")
| sort -_time
| table _time, Time_Auth, Time_Authenticated, Time_Disconn, Time_Disconnected
```

(Time_Auth and Time_Disconn are the raw values.)

Result: [screenshot of the results table] Why is it that the last entry does not have the Time_Disconnected field populated? I have seen a few of those conversions not working. Is my query incorrectly formatted in some way?
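One thing worth trying (an assumption, not a confirmed diagnosis): have strptime consume the milliseconds and the trailing Z explicitly, since a format that fails to match returns null and leaves the eval'd field blank:

```
| eval Time_Disconnected=strftime(strptime(Time_Disconn, "%Y-%m-%dT%H:%M:%S.%3NZ"), "%Y-%m-%d %H:%M:%S")
```

%3N is Splunk's three-digit subsecond format code. Also note that if some events simply have no TimestampUnAuthentication value yet (e.g. sessions still connected), the field would come out blank regardless of the format.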
Dear Splunkers, I would like to inform you that I am very curious to learn Splunk admin. Can anyone refer me to a good YouTube channel or any other online institute? Moreover, if possible, please provide me a list of the topics covered in a Splunk admin course. I would appreciate your kind support. Thanks in advance.
I have installed my first Splunk Enterprise on a Linux server and installed forwarders on Windows workstations using the ports as instructed. The firewall is off and SELinux is off. The forwarders are calling in. Now perhaps I am missing something: in Splunk, I select search and enter * (or index=anything; there is a long list), and the error is:

The transform ca_pam_login_auth_action_success is invalid. Its regex has no capturing groups, but its FORMAT has capturing group references.

I tried another search, and saw another error:

Error in "litsearch" command: Your splunk license expired (the license is new) or you have exceeded your license limit too many times. Renew your splunk license by visiting www.splunk/com/store or calling 866-GET-SPLUNK. The search job failed due to an error. You may be able to view the job in the job inspector.

All I want is to understand why FORMAT has capturing group references but the regex does not, and to turn my paperweight into a thriving reporting tool. Can anyone help? Thank you!
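To illustrate what the transform error means in general (hypothetical stanzas, not the actual ca_pam_login_auth_action_success transform): FORMAT may only reference $1, $2, ... if REGEX defines parenthesized capturing groups.

```ini
# transforms.conf - hypothetical illustration
# Broken: FORMAT references $1, but REGEX captures nothing
[example_broken]
REGEX = login\s+succeeded
FORMAT = action::$1

# Fixed: the capturing group gives $1 something to refer to
[example_fixed]
REGEX = login\s+(succeeded|failed)
FORMAT = action::$1
```

So the fix is either to add a capturing group to the transform's REGEX or to remove the $n references from its FORMAT.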
Query:

```
| tstats count where index=afg-juhb-appl host_ip=* source=* TERM(offer)
```

I want to get the count of each source by host_ip, as shown below.

Output:

| source | 11.56.67.12 | 11.56.67.15 | 11.56.67.18 | 11.56.67.19 |
| --- | --- | --- | --- | --- |
| /app/clts/shift.logs | 987 | 67 | 67 | 89 |
| /apps/lts/server.logs | 45 | 45 | 67 | 43 |
| /app/mts/catlog.logs | 89 | 89 | 65 | 56 |
| /var/http/show.logs | 12 | 87 | 43 | 65 |
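A sketch of one way to get that pivot, keeping the original filters (whether TERM(offer) behaves as intended inside tstats is an assumption carried over from the question):

```
| tstats count where index=afg-juhb-appl host_ip=* source=* TERM(offer) by source host_ip
| xyseries source host_ip count
```

The by clause produces one row per source/host_ip pair, and xyseries pivots the host_ip values into columns.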
Splunk Education believes in the value of training and certification in today's rapidly-changing data-driven industry. Our ethos and commitment are stronger than ever – to expand our learning opportunities and lower the barriers to entry for anyone, anywhere who wants to develop in their career and feel more confident navigating our highly-technical world.

Using Splunk to Boost Cybersecurity Knowledge

Aspiring cybersecurity analyst Marc Alicea caught our attention when he shared on the Splunk Training and Certification LinkedIn page that he was well on his way to achieving his goal of completing 25 Splunk free, self-paced training courses in 25 days. Marc is currently a student at the New Jersey Institute of Technology (NJIT), where he is enrolled in the Cybersecurity Bootcamp.

"As a cybersecurity student, I'm learning that Splunk is the backbone of SIEM and SOAR operations within the cybersecurity infrastructure," says Marc. "This made me curious enough to explore my options in Splunk training and get more in-depth knowledge of the product."

Marc said he learned about Splunk free, self-paced training from his assistant professor, who took many of the courses herself. The NJIT bootcamp offers students like Marc the opportunity to take courses at night and on weekends as a way to begin a new career in cybersecurity, or upskill into positions like IT Security Manager, Network Security Administrator, Cybersecurity Analyst, and more.

An Injury Was a Detour on His Career Path

Passionate about vintage cars, Marc started his career journey in school training to be an automotive technician, and after graduating was offered the rare opportunity to continue his education with the Mercedes-Benz Drive Program. After only two years as an automotive technician for Mercedes-Benz, Marc was badly injured during a roadside assistance call.

"My injury forced me to rethink my future," he said. "But I knew I wanted to use my skills and knowledge of advanced technologies in the automotive industry towards whatever I did next in my career."

Filling the Workforce Gap

With the pace of change in technology creating a wider and wider workforce gap, Marc saw a future of opportunity – especially since he realized how he could parlay his passion for technology and cars into a new career in cybersecurity.

"Cars are becoming more and more technologically advanced, and cybersecurity is going to be an essential piece of that evolution," noted Marc. "Splunk is already used in Formula 1 racing, so what a dream it would be if I could be a player in that space."

Bootcamp Curriculum Plus Free Splunk Training

"In addition to my bootcamp curriculum designed around security and networking, I've actually completed all the free Splunk training needed to be prepared to sit for the 'Splunk Core Certified User' exam," said Marc. "I plan to get this certification along with CompTIA SEC+, AWS Cloud Practitioner, and CompTIA NET+ certifications to prove my dedication and knowledge in the cybersecurity space."

Driven by a Good Challenge

If it's not clear already, Marc Alicea loves a good challenge. On top of the exams he's taking to get through his cybersecurity bootcamp and prepping for the Splunk Core Certified User exam, Marc also passed all the course exams that were part of the Splunk free training curriculum. "I really appreciated the learn-at-my-own-pace format of the courses, which helped me better prepare for the exams. They were challenging."

He was very impressed with the product as well.
“The training gave me an incredible inside look at the Splunk UI and how powerful it can be for data capture and analytics. I can see how Splunk could be used within the auto industry and for applications like Formula 1 racing and engine analytics.”  Marc is hopeful that his training and certifications, along with his passion and dedication, will position him strongly for a cybersecurity role at the company of his choice – and will give him the practice he needs to stay up-to-date as the technology continues to advance. ______________________________________________________ We are so grateful to Marc for sharing his experience with the Splunk Community. If you are also on a new career path and driven by what’s possible, maybe these free courses will fulfill those needs for you too! You can check out the catalog of free, self-paced courses here.  — Callie Skokos, Representing the Splunk Education Crew  
I feel like there's a simple solution to this that I just can't remember. I have a field named Domain that has 13 values, and I want to combine ones that are similar into single field values. This is how it currently looks:

| Domain | Count |
| --- | --- |
| BC | 1 |
| WIC | 3 |
| WIC, BC | 2 |
| WIC, UPnet | 3 |
| WIC, DWnet | 5 |
| WIC, DWnet, BC | 6 |
| WIC, DWnet, UPnet | 1 |
| WIC/UPnet | 3 |
| WIC/DWnet | 2 |
| UPnet | 5 |
| UPnet, SG | 6 |
| DWnet | 1 |
| DW | 1 |

I want to merge the values "WIC, UPnet" and "WIC/UPnet" into "WIC, UPnet"; "WIC, DWnet" and "WIC/DWnet" into "WIC, DWnet"; and "DWnet" and "DW" into "DWnet". The new results should read:

| Domain | Count |
| --- | --- |
| BC | 1 |
| WIC | 3 |
| WIC, BC | 2 |
| WIC, UPnet | 6 |
| WIC, DWnet | 7 |
| WIC, DWnet, BC | 6 |
| WIC, DWnet, UPnet | 1 |
| UPnet | 5 |
| UPnet, SG | 6 |
| DWnet | 2 |
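A minimal sketch of one way to do this, assuming Domain is a plain string field and the counts come from a stats run afterwards:

```
| eval Domain=replace(Domain, "/", ", ")
| eval Domain=if(Domain=="DW", "DWnet", Domain)
| stats sum(Count) as Count by Domain
```

replace() folds the slash-separated variants into the comma-space form, the if() maps the lone "DW" to "DWnet", and re-aggregating sums the merged counts.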
Hi,

Need a search query that raises an alert if either the first_find or last_find value matches the current date.

1. The first_find and last_find fields are in this format:

2020-04-30T13:18:13.000Z
2023-01-15T14:12:18.000Z

and I need them in the 2020-04-30 format.

2. Instead of receiving all the alerts, we only want an alert if today's date matches first_find or last_find. (Today's date changes every day, so do not hard-code the actual date.)

Note: last_find and first_find are multivalued fields.

Thanks...
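A sketch of one approach, assuming both fields always carry the ISO 8601 format shown above: flatten the two multivalued fields into one string and check whether today's date (recomputed on every run) appears in it.

```
| eval today=strftime(now(), "%Y-%m-%d")
| eval all_finds=mvjoin(mvappend(first_find, last_find), " ")
| where like(all_finds, "%" . today . "%")
```

Because the first ten characters of each timestamp are the %Y-%m-%d date, a substring match against today is enough; events passing the where clause are the ones that should alert.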
Hello! I am calculating utilization using the code below. Yet, I want to only account for utilization during the weekdays instead of the whole week. To do this, I filter date_wday to Monday through Friday, BUT when doing this, the utilization still accounts for the whole search time frame, when I just want it to look at the time for business weeks.

Code:

```
index=example (date_wday=monday OR date_wday=tuesday OR date_wday=wednesday OR date_wday=thursday OR date_wday=friday)
| transaction Machine maxpause=300s maxspan=1d keepevicted=T keeporphans=T
| addinfo
| eval timepast=info_max_time-info_min_time
| eventstats sum(duration) as totsum by Machine
| eval Util=min(round((totsum)/(timepast)*100,1),100)
| stats values(Util) as "Utilized" by Machine
| stats max(Utilized)
```

Can I please have some help? Thank you.
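A hedged sketch of one way to make the denominator count only weekday time; it assumes mvmap is available (Splunk 8.0+) and approximates each weekday as a full 86400 seconds:

```
| addinfo
| eval day_starts=mvrange(info_min_time, info_max_time, 86400)
| eval day_names=mvmap(day_starts, strftime(day_starts, "%a"))
| eval weekday_count=mvcount(mvfilter(match(day_names, "Mon|Tue|Wed|Thu|Fri")))
| eval timepast=weekday_count*86400
```

mvrange() enumerates one timestamp per day in the search window, mvfilter() keeps only the weekday names, and the count of those days replaces info_max_time-info_min_time as the utilization denominator.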
Hello everyone,

I have a question about base searches in Splunk Dashboard Studio. I used this option to make my parent search and my chain search. For example, I created a search which uses the base search SI_bs_nb_de_pc. However, I get errors with it: [screenshot of the errors] Can you help me please?

Another question: how do you use multiple base searches in one search? I didn't find a way to do this in Dashboard Studio.

I need your help, thank you so much.
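For context, this is the shape of the definition in question; a hedged sketch of the dataSources section of the dashboard JSON, where every name except SI_bs_nb_de_pc is hypothetical:

```json
{
  "dataSources": {
    "SI_bs_nb_de_pc": {
      "type": "ds.search",
      "options": {
        "query": "index=main sourcetype=pc_inventory | fields host, os"
      }
    },
    "chain_count_by_os": {
      "type": "ds.chain",
      "options": {
        "extend": "SI_bs_nb_de_pc",
        "query": "| stats count by os"
      }
    }
  }
}
```

On the second question: as far as I can tell, a ds.chain data source extends exactly one parent via its single "extend" option, which would explain why there is no obvious way to combine multiple base searches in one chain.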
I configured a script-based app for the databases, which brings in data as follows. When I run the script on the UF directly, I get the expected output, but when I push an application containing the same script, it fetches different output.

Expected data after running the script on the UF:

Date, datname="sql", age="00:00:00"

Output we are receiving at the Splunk SH:

Date, datname="datname", age="age"

The script is kept in these locations:

/opt/splunkforwarder/etc/apps/appname/bin - scripts
/opt/splunkforwarder/etc/apps/appname/local - inputs.conf

For troubleshooting I have followed these steps:

- Removed and pushed the app again
- Tried restarting the UF

Has anyone seen or faced a similar issue? Please help me with this.
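For reference, the shape of the inputs.conf stanza driving the script; the script name, interval, index, and sourcetype here are placeholders rather than the real values:

```ini
# inputs.conf - hypothetical scripted-input stanza
[script://$SPLUNK_HOME/etc/apps/appname/bin/db_age.sh]
interval = 300
index = main
sourcetype = db:age
disabled = false
```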
Hi,

We have multiple UFs running on an old version, and I want to upgrade them to the latest version from the deployment server using scripts. Can you please help me with how to do this? Can you please provide a PowerShell script to upgrade the UF version?
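Not an official method, just a minimal sketch under two assumptions: the MSI is reachable at a hypothetical share path, and the forwarder runs under the default SplunkForwarder service name. One caveat that I believe applies: the deployment server only distributes apps, not the forwarder binary itself, so a script like this normally has to run through some other channel (GPO, SCCM, or a scripted input shipped in a deployed app).

```powershell
# Hypothetical paths - adjust to your environment.
$msi = "\\fileshare\splunk\splunkforwarder-9.0.3-x64-release.msi"
$log = "C:\Windows\Temp\uf_upgrade.log"

# Stop the forwarder before the upgrade (default service name assumed).
Stop-Service -Name SplunkForwarder -ErrorAction SilentlyContinue

# Run the MSI silently; AGREETOLICENSE=Yes is required for unattended installs.
Start-Process msiexec.exe -ArgumentList "/i `"$msi`" AGREETOLICENSE=Yes /quiet /L*v `"$log`"" -Wait

# The installer usually restarts the service; start it if it did not.
Start-Service -Name SplunkForwarder
```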
Hi, is there an alert action to save the results of the search directly to a specified, existing index? I already tried the "Log event" alert action, but in the "Event" field that has to be specified, I did not know how to access the results of my search.   Thanks for your help!
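One approach worth considering, rather than an alert action per se: append the collect command to the alert's search, so the results are written to the index whenever the search runs. A minimal sketch, assuming an existing index named my_summary_index (a placeholder):

```
index=main sourcetype=my_source error
| stats count by host
| collect index=my_summary_index source="my_alert_results"
```

collect writes the piped results as events into the target index, so the whole result set lands there without going through the "Log event" action's Event field.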
Hi! I'm trying to detect multiple user accesses from the same source (same mobile device). I'm feeding Splunk with logs from a mobile app like this:

```
09:50:14,524 INFO [XXXXXXXXXXXX] (default task-XXXXXX) [ authTipoPassword=X, authDato=XXXXX, authTipoDato=X, nroDocEmpresa=X, tipoDocEmpresa=X, authCodCanal=XXX, authIP=XXX.XXX.XXX.XXX, esDealer=X, dispositivoID=XXXXXXXXXX, dispositivoOS=XXXXX ]
```

I'm using the following search:

```
search XXXX
| stats dc(authDato) as count, values(authDato) as AuthDato by dispositivoID dispositivoOS authIP
| where count > 1
| sort - count
```

and I get almost all the info I wanted (like two different users, authDato, from the same device ID, dispositivoID), but I would like to enrich the data with the time of the last occurrence of the event. Is there a way to include this information? Thanks in advance.
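A sketch of one way to add it, assuming _time is the timestamp of interest: take max(_time) in the same stats call and format it afterwards.

```
search XXXX
| stats dc(authDato) as count, values(authDato) as AuthDato, max(_time) as last_seen by dispositivoID dispositivoOS authIP
| where count > 1
| eval last_seen=strftime(last_seen, "%Y-%m-%d %H:%M:%S")
| sort - count
```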
I have two tables and I want to relate them by the number of events in an hour. I managed to write a SQL query, but I'm struggling to do it in Splunk. I send the data of these two tables to two different indexes (a simple copy) and want to do this:

```sql
WITH count_reserved AS (
    SELECT count(ru.id) AS reserved,
           to_char(ru.date, 'yyyy-mm-dd hh24') AS time
    FROM reserved ru
    GROUP BY to_char(ru.date, 'yyyy-mm-dd hh24')
),
count_concluid AS (
    SELECT count(u.id) AS concluid,
           to_char(u.date, 'yyyy-mm-dd hh24') AS time
    FROM concluid u
    GROUP BY to_char(u.date, 'yyyy-mm-dd hh24')
)
SELECT coalesce(concluid, 0) AS concluid,
       reserved,
       count_reserved.time,
       ((coalesce(concluid::decimal, 0) / reserved) * 100) AS percentage
FROM count_reserved
LEFT JOIN count_concluid ON count_concluid.time = count_reserved.time
ORDER BY 3 ASC
```

What I want returned is the percentage value and the time, to make an hourly bar graph.
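A sketch of how this might look in SPL, assuming the two indexes are literally named reserved and concluid and that _time carries the row date:

```
index=reserved OR index=concluid
| bin _time span=1h
| stats count(eval(index=="reserved")) as reserved, count(eval(index=="concluid")) as concluid by _time
| eval percentage=round((concluid / reserved) * 100, 2)
| sort 0 +_time
```

Searching both indexes in one pass replaces the SQL LEFT JOIN: the eval-based counts split each hourly bucket by index, and hours with no concluid events simply count as zero, mirroring the coalesce. Charting percentage over _time then gives the hourly bar graph.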