All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hello community, I have been searching for a few days for a way to display the earliest and latest values from a chart in a dashboard. Here is my query:

<search> <query>index=main Name=volume_* | chart sum("Used Capacity TB") AS "Used Capacity TB", sum("Total Capacity TB") AS "Total Capacity TB" by _time span=7d</query> <earliest>$time_token.earliest$</earliest> <latest>$time_token.latest$</latest> </search>

I would like to extract the earliest and latest values and then divide the difference (latest - earliest) by the number of days.

For example, if the earliest value is 50 and the latest is 52, the calculation would be 52 - 50 = 2, and 2 / 7 days = 0.286.

Thank you!
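A minimal SPL sketch of one way to do this, assuming the field names from the query above; earliest() and latest() take the first and last values by time, and the day count is derived from the search window via addinfo (which exposes info_min_time/info_max_time):

```spl
index=main Name=volume_*
| stats earliest("Used Capacity TB") AS first_used latest("Used Capacity TB") AS last_used
| addinfo
| eval days=round((info_max_time - info_min_time) / 86400)
| eval growth_per_day=round((last_used - first_used) / days, 3)
| table first_used last_used days growth_per_day
```

With an earliest value of 50 and a latest of 52 over a 7-day window, growth_per_day comes out to 2/7 ≈ 0.286. Note that addinfo only yields a finite window for bounded time ranges, not "All time".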
We are trying to see if there is a way to modify this TA to pull in another AD attribute. In our case it is 'department', but it could really be anything. Is there a file we could modify to bring in a new attribute from Azure AD? Regards.
Hello. How can two files be compared for identity?

file1.csv:
username  id_user
Jonh      123

file2.csv:
username  id_user
Jonh      124

How do I write the search correctly to check whether id_user is identical for the same user across the two files? If file1.csv id_user != file2.csv id_user, there should be a message that the IDs are different.
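A minimal sketch, assuming both CSVs exist as lookup files with the columns shown above (the lookup names file1.csv and file2.csv are taken from the post):

```spl
| inputlookup file1.csv
| rename id_user AS id_user_1
| join type=outer username
    [ | inputlookup file2.csv
      | rename id_user AS id_user_2 ]
| eval status=if(id_user_1=id_user_2, "identical", "id is different")
| table username id_user_1 id_user_2 status
```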
I'm trying to gather how many CPUs and cores a server has, but it seems like on most VMs the CPUs and cores report as just 1 regardless of the actual number. Here is the search I was running:

index=windows sourcetype=winhostmon source=processor
| table host cpu* Number*
| dedup host

And here is a section of the output:

host      cpu_architecture  cpu_cores  cpu_count  cpu_mhz  NumberOfCores  NumberOfProcessors
server1   x64               1          1          2397     1              1
server2   x64               1          1          2397     1              1
server3   x64               1          1          2497     1              1
server4   x64               1          1          2497     1              1
server5   x64               1          1          2397     1              1
server6   x64               1          1          2397     1              1
server7   x64               1          1          2497     1              1
server8   x64               1          1          3193     1              1
server9   x64               1          1          2594     1              1
server10  x64               1          1          2397     1              1
server11  x64               1          1          2397     1              1
server12  x64               1          1          2397     1              1
server13  x64               1          1          2497     1              1
server14  x64               1          1          2597     1              1
server15  x64               1          1          2497     1              1
server16  x64               1          1          2397     1              1
server17  x64               1          1          2397     1              1
server18  x64               1          1          2497     1              1
server19  x64               1          1          2597     1              1
server20  x64               1          1          2497     1              1
server21  x64               1          1          2397     1              1
server22  x64               1          1          2397     1              1
server23  x64               1          1          2497     1              1
server24  x64               1          1          2597     1              1
server25  x64               1          1          2397     1              1

This is what I have in my inputs.conf:

[WinHostMon://Processor]
interval = 300
disabled = 0
type = Processor

What commands or data sources are used to gather this data? I want to view this data on the server itself and see whether the server is reporting it to Splunk incorrectly (my assumption) or whether there is a bug in WinHostMon. Thanks!
I can't make this script work for forwarder deployment. It takes up a lot of time to deploy to many servers. I guess it has some obvious flaw that I can't see... My script:

[Edit]
$DEPLOYMENT_SERVER="SPLUNK-05:8089"
$RECEIVING_INDEXER="SPLUNK-05:9997"
$MONITOR_PATH="C:\Temp\"
$CERTFILE="c:\temp\cert.pfx"
$CERTPASSWORD="pass"
$LOGON_USERNAME="Admin"
$LOGON_PASSWORD="pass"
$SET_ADMIN_USER=1
$SPLUNKUSERNAME="Admin"
$SPLUNKPASSWORD="pass"
$AGREETOLICENSE="yes"
msiexec.exe /i "\\server\splunkforwarder-8.1.2-545206cc9f70-x64-release.msi" DEPLOYMENT_SERVER="$DEPLOYMENT_SERVER" RECEIVING_INDEXER="$RECEIVING_INDEXER" MONITOR_PATH="$MONITOR_PATH" CERTFILE="$CERTFILE" CERTPASSWORD="$CERTPASSWORD" SET_ADMIN_USER=$SET_ADMIN_USER SPLUNKUSERNAME="$SPLUNKUSERNAME" SPLUNKPASSWORD="$SPLUNKPASSWORD" AGREETOLICENSE="$AGREETOLICENSE" /quiet
How is _time set for azure:eventhub using TA-MS-AAD by default? I don't see any configuration for _time, but the index time and _time are not identical.

[azure:eventhub]
SHOULD_LINEMERGE = 0
category = Splunk App Add-on Builder
pulldown_type = 1
####################
# Metrics
####################

A few results for reference:

_time                    indextime            count
2021-10-14 13:00:01.431  2021-10-14 13:10:27  1
2021-10-14 13:00:01.431  2021-10-14 13:10:35  1
2021-10-14 13:00:01.431  2021-10-14 13:10:37  1
2021-10-14 13:00:01.431  2021-10-14 13:10:43
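Not an answer to where the add-on assigns _time, but a quick sketch for quantifying the gap between event time and index time (the roughly 10-minute difference visible above), assuming the azure:eventhub sourcetype name from the post:

```spl
sourcetype=azure:eventhub
| eval lag_seconds=_indextime - _time
| stats min(lag_seconds) AS min_lag max(lag_seconds) AS max_lag avg(lag_seconds) AS avg_lag
```

A consistently large, stable lag usually points to polling/collection delay rather than a timestamp-parsing problem.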
Hello, I am looking to create a report from a search. I have a requirement to track user logons to Windows machines (Active Directory). I am currently getting all the data, but I am having problems with false logons, or services using the credentials. For example, I will see people logged in at 1 am, but the logon ID is 0x0, or there is an error code 000, so that is most likely a service using someone's credentials rather than an actual logon. There are about 1500 records a day of these false logons.

I also have the requirement to track Monday - Friday from 6pm to 6am overnight, and I can't seem to get the time range right in the search. Below is the search I am currently using; any help would be appreciated, thank you!

source="WinEventLog:Security" EventCode=528 OR EventCode=540 OR EventCode=4624 OR (EventCode=4776 Error_Code=0x0) NOT Account_Name="*$" NOT Logon_Account="*$" NOT User_Name="*$"
| eval Account_Name=mvindex(Account_Name, 1)
| eval User=coalesce(Account_Name, Logon_Account, Logon_account, User_Name)
| eval User=lower(User)
| table _time, User, EventCode
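A minimal sketch of one way to restrict results to the overnight 6pm-6am window on weekdays, assuming the default date_hour/date_wday fields are present on the events (they reflect the event timestamp, and the weekday list here is an assumption about how the Friday-night-into-Saturday edge should be handled):

```spl
source="WinEventLog:Security" EventCode=4624
| where date_hour >= 18 OR date_hour < 6
| search date_wday IN ("monday", "tuesday", "wednesday", "thursday", "friday")
| table _time, User, EventCode
```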
We have two add-ons built with Add-on Builder 1.0.1, and when we scan them with the Python readiness app, these add-ons fail. We are running Splunk 8.1 and would like to know if there is any way to convert these add-ons to Python 3 without much rework.
hi, what is the rex to extract all bracketed values matching this pattern: [AB_123] [ZXY_987]?
1 - check all brackets; if one starts with AB_, extract AB_123
2 - check all brackets; if one starts with ZXY_, extract ZXY_987
3 - put all of these into a single field called "Codes"
any idea? thanks
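A minimal sketch, assuming the codes always look like the two samples above (prefix, underscore, digits); max_match=0 collects every match into the multivalue field Codes:

```spl
| rex max_match=0 field=_raw "\[(?<Codes>(?:AB_|ZXY_)\d+)\]"
```

Against the sample text [AB_123] [ZXY_987], this should yield Codes as a multivalue field containing AB_123 and ZXY_987.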
Hi, I've got a CSV file where the first line contains the field names and the rest are separate events, but the first column is year and the second is month. There are no other time-based fields (day, hour, etc.), so I am having difficulty creating a _time field. Does anyone know how I can generate a _time field using only year and month? The day, hour, etc. are irrelevant. Thanks in advance
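A minimal search-time sketch, assuming the two columns are extracted as fields named year and month; strptime defaults the unspecified day/hour components, so each event lands at the start of its month:

```spl
| eval _time=strptime(year . "-" . month, "%Y-%m")
```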
Hi all, I need some advice on what we should use to get logging into Splunk working for "Microsoft Defender for Identity". There are so many different apps on Splunkbase with differing quality, and I know that Microsoft has renamed the service. Any suggestions? //Jan
Hi, we have a clustered environment with 6 indexers. Each host has 128 GB of RAM, but as far as I can see, Splunk is using only ~4 GB. Is there any way to optimize (speed up) memory usage and let Splunk use, for example, 100 GB of RAM? I have tons of different indexers with different dashboards... If yes, how do I do it? Cheers, Konrad
Hi, how can I find events that did not occur on a given day? Here is the scenario: I have two fields in my log file, <servername> and <CLOSESESSION>. I need to know when CLOSESESSION is 0 for a day, by servername. Every day I expect CLOSESESSION to appear in my server logs; if one or more servers has no CLOSESESSION, it means something is going wrong.

I need two searches here: first, extract all server names from the file names that exist in the path, using metadata for a faster result; then, in a second query, check which servers have no CLOSESESSION. FYI: I don't want to use a lookup in a CSV file for the first step; I'd prefer to do it with multisearch and a join.

Something like this:
1 - the first search returns the list of all log files that exist (per server):
| metadata type=sources index=my_index | table source
2 - the second search filters lines containing CLOSESESSION:
index="my_index" | search CLOSESESSION | rex to extract server names from the "source" field from step 1 | rex to extract the count of CLOSESESSION
Then join them and show only the servers that have no CLOSESESSION.

Here are the logs (the server names do not exist in the log lines; they are extracted from the log file name - I put them in the log in a different color to make the goal clear):
23:54:00.957 app server 1 module: CLOSESESSION
23:54:00.958 app server 3 module: CLOSESESSION
23:54:00.959 app server 4 module: CLOSESESSION

Expected output, step 1:
servernames
server 1
server 2
server 3
server 4

Expected output, step 2:
Servername  cause
Server2     NOCLOSESESSION
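A minimal sketch of the two-step approach, assuming the index name my_index from the post and that the server name can be pulled from the source path with a rex (the pattern "(?<servername>server \d+)" is a placeholder assumption for illustration; adjust it to the real file-name layout):

```spl
| metadata type=sources index=my_index
| rex field=source "(?<servername>server \d+)"
| table servername
| join type=left servername
    [ search index=my_index CLOSESESSION
      | rex field=source "(?<servername>server \d+)"
      | stats count AS closesession_count by servername ]
| fillnull value=0 closesession_count
| where closesession_count=0
| eval cause="NOCLOSESESSION"
```

The left join keeps every server from the metadata step, so servers that logged no CLOSESESSION at all survive with a count of 0.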
I am getting this error on deployment server 8.0.9; how do I make this right?

./splunk reload deploy-server -class (i gave my serverclass name here)
An error occurred: Could not create Splunk settings directory at '/root/.splunk'.
Hello, I read my data with the inputlookup command and try to count the different occurrences of the field fields.SID, as below:

| makeresults
| eval time=relative_time(now(),"-24h")
| eval time=ceil(time)
| table time
| map [ |inputlookup incidents where alert_time > $time$ ]
| join incident_id [ |inputlookup incident_results ]
| fields fields.SID
| search fields.SID=*
| mvexpand fields.SID

Unfortunately, whatever tricks I try, I always get several SIDs packed into a single event, see the screenshot below. How would I split it so that each fields.SID is on a separate row, to be able to count them? Kind Regards, Kamil
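One common gotcha, as a sketch: mvexpand only splits true multivalue fields, but a lookup column that merely contains delimited text stays a single value. Renaming away the dot and splitting explicitly may help; the space delimiter in makemv is an assumption here, so adjust it to whatever separator actually appears between the SIDs:

```spl
| rename fields.SID AS SID
| makemv delim=" " SID
| mvexpand SID
| stats count by SID
```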
Hi, we have a status in one log type, where we would like to track whether an account is in the state "bypassed". Example:

2021-13-10 user1 bypassed
2021-13-10 user2 enabled
2021-13-09 user2 bypassed
2021-13-08 user3 bypassed
2021-13-08 user3 active
2021-13-08 user3 bypassed
2021-13-07 user3 active

How can we find the last 2 statuses for each user in a period of time, and then, based on the last bypassed/active status, get only the accounts that still have an active bypass status?
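A minimal sketch, assuming the values are extracted into fields named user and status; dedup keeps the 2 most recent events per user, latest() picks the current state by event time, and the final where keeps only accounts whose most recent status is still bypassed:

```spl
| dedup 2 user
| stats latest(status) AS last_status list(status) AS last_two_statuses by user
| where last_status="bypassed"
```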
Hi All, I'm trying to create a search, to potentially be made into a monitoring rule later on. What I am trying to achieve is a way to tell whether a user has logged into his machine from a wildly different IP address. This will be using external IP addresses only. As an example, I want to know if a user logged into the estate from an IP which wasn't the same as, or similar to, the previous day's.

User    Today        Yesterday
User A  155.123.1.1  155.123.1.1
User B  155.124.1.2  155.125.20.2
User C  155.166.2.5  22.18.254.56

In the table above, I have 3 users. Users A and B have logged in from pretty similar IPs, although user B has logged in from a different one today (this often happens in our logs). What I really want to surface is user C, who has logged in from a completely different subnet, not similar to their IP from the previous day. This is what I have so far:

index=foo (earliest=-1d@d latest=now())
| eval TempClientIP=split(ForwardedClientIpAddress,",")
| eval ClientIP=mvindex(TempClientIP,0)
| eval ClientIP1=mvindex(TempClientIP,1)
| eval ClientIP2=mvindex(TempClientIP,2)
| search NOT ClientIP=10.*
| where LIKE("ClientIP","ClientIP")
| eval when=if(_time<=relative_time(now(), "@d"), "Yesterday", "Today")
| chart values(ClientIP) over user by when
| where Yesterday!=Today

Some context regarding the search: the ForwardedClientIpAddress field has 3 items inside; ClientIP and ClientIP1 are the same address, and ClientIP2 is the end internal address. ClientIP can be an internal address, which is why there is a NOT to remove it from the searches. Any help would be very much appreciated. Thanks
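A minimal sketch of one way to flag "wildly different": compare a coarse prefix rather than the full address, reusing the when/chart shape from the search above. Comparing only the first octet is an arbitrary assumption about what counts as similar; widen the prefix to the first two or three octets for a stricter match:

```spl
index=foo earliest=-1d@d latest=now()
| eval ClientIP=mvindex(split(ForwardedClientIpAddress,","),0)
| search NOT ClientIP=10.*
| eval prefix=mvindex(split(ClientIP,"."),0)
| eval when=if(_time<=relative_time(now(), "@d"), "Yesterday", "Today")
| chart values(prefix) over user by when
| where Yesterday!=Today
```

With the sample data above, users A and B share the prefix 155 on both days and drop out, while user C (155 vs 22) survives the final where.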
Hi Guys, we have a requirement to ingest emails into Splunk. I know a couple of apps are out there, but I could not get them working. I'm also not sure how to set up/request a mail account for Splunk specifically for this purpose, e.g. what settings should be applied. I am a novice as far as mail settings are concerned, so can someone take some time to help me out here, and be as detailed as possible? We are using Splunk 8.0.0. Thanks, Neerav
Hi All, I have created a bar chart on my dashboard with the count of exceptions. Now I want to create a drilldown to a separate dashboard whenever I click on any of the bars (a separate dashboard for each bar). Can we achieve such a drilldown from bar/column charts? I tried "Link to a dashboard", "Link to a custom URL", etc., but it takes me to only one dashboard whichever bar I click in the chart. I also thought of using the "Manage tokens on this dashboard" option, but that doesn't take me to a new dashboard, as it only enables in-page drilldown actions. Please suggest a way to get my desired output. Thank you..!!
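A minimal Simple XML sketch of a per-bar drilldown using conditions on the clicked value; the exception names and dashboard paths below are placeholder assumptions, and the final bare condition acts as the fallback:

```xml
<chart>
  <!-- chart search and options here -->
  <drilldown>
    <condition match="'click.value' == &quot;NullPointerException&quot;">
      <link target="_blank">/app/search/null_pointer_dashboard</link>
    </condition>
    <condition match="'click.value' == &quot;TimeoutException&quot;">
      <link target="_blank">/app/search/timeout_dashboard</link>
    </condition>
    <condition>
      <link target="_blank">/app/search/other_exceptions_dashboard</link>
    </condition>
  </drilldown>
</chart>
```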
I used this eval statement with AND conditions, but I'm only getting the result "Public" even when the condition for the value "Private" is satisfied, i.e. I'm only getting the default result. Any idea what's wrong with this statement?

| eval perm=case(block_public_acls=true AND block_public_policy=true AND ignore_public_acls=true AND restrict_public_buckets=true,"Private",1=1,"Public")
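A sketch of one likely fix: in eval, an unquoted true is read as a field name rather than a literal, and comparing against a non-existent field never matches, so case() falls through to the default. Assuming the fields hold the literal string "true", quoting the value should work:

```spl
| eval perm=case(block_public_acls="true" AND block_public_policy="true" AND ignore_public_acls="true" AND restrict_public_buckets="true", "Private", 1=1, "Public")
```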