All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hi, how can I find events that contain non-English words? E.g., I have a log file where some lines contain German or Arabic words. How can I recognize these lines? Thanks.
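A minimal SPL sketch for this: keep only events containing a character outside the ASCII range (the index name is a placeholder):

index=your_index
| regex _raw="[^\x00-\x7F]"

This catches Arabic script, German umlauts, and other non-ASCII text, though German words spelled with plain ASCII letters would slip through.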
Trying to implement an alert to detect spikes in logged events in our Splunk deployment, and I'm not sure how to go about it. For example: we have 15 hosts with varying numbers of sources within each. One of the sources on a host averages about 5-6k events per day over the past 30 days; then, out of the blue, we're hit with 1.3 million events in a single day. I'm assuming the alert would need to be tailored to each host (or source, not sure) and would need an average number of events over a "normal" week to compare against when there's a spike? Any help would be greatly appreciated.
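A hedged sketch of a baseline-and-deviation approach, assuming a per-host daily baseline is what you want (the index name and the three-standard-deviation threshold are placeholders to tune):

index=your_index earliest=-30d@d latest=@d
| bin _time span=1d
| stats count BY _time, host
| eventstats avg(count) AS avg_count, stdev(count) AS stdev_count BY host
| where count > avg_count + (3 * stdev_count)

Saved as an alert that triggers when results are returned, this flags any host whose daily volume jumps well above its own 30-day norm; split by source (or host plus source) instead if that is the level you want to baseline.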
Hello Splunkers!! When I upgrade my web HF from 8.0.0 to 8.1.2, I get the error below. Please let me know what workaround is available for this issue. The TCP output processor has paused the data flow. Forwarding to host_dest=<indexer_name> inside output group splunkcloud from host_src=<HF_NAME> has been blocked for blocked_seconds=1970. This can stall the data flow towards indexing and other network outputs. Review the receiving system's health in the Splunk Monitoring Console. It is probably not accepting data.
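A hedged diagnostic sketch: check whether the forwarder's output queues are filling, assuming the HF's internal logs still reach your indexers (the host filter is the placeholder from the message above, and the tcpout queue name pattern may vary by version):

index=_internal host=<HF_NAME> source=*metrics.log* group=queue name=tcpout*
| timechart span=5m avg(current_size_kb) BY name

A line pinned at the queue's maximum size points at the receiving side (indexers not accepting data) rather than the upgrade itself.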
Hello, my team is interested in using Grand Central, but we'd like to compare the templates generated by Grand Central to Trumpet's to see if there are any significant differences before making the switch. We haven't been able to get the app to work when importing it to our search head, so I'm hoping you can provide something generic. We are trying to capture the following sources if we use Grand Central:
CloudTrail
Config
Notifications
GuardDuty
VPC Flow Logs
If you need any more information, please let me know. Best, Faleon
I use Ansible to install and configure Splunk Universal Forwarder on multiple servers. However, it's difficult to maintain a link to the product; the links are not consistent and contain unpredictable data, such as the git commit for the version. Is there a URL scheme that would allow me to generate a download link dynamically, knowing only stable, predictable properties of the release, such as version, platform, and package type? For example:

"https://download.splunk.com/products/universalforwarder/releases/{{ splunk_version }}/linux/splunkforwarder-{{ splunk_version }}-{{ splunk_os }}-{{ splunk_arch }}.{{ splunk_pkg }}"

Or, to get the latest version, something simple like:

https://download.splunk.com/products/universalforwarder/releases/latest
I encountered an interesting question from my client/security SME:
1. Which is better: to have Splunk Security Essentials, or to retain Enterprise Security + Content Update?
2. Where are the detection rules in Splunk Security Essentials kept?
As far as I understand, the Splunk ES Content Update is quite easy to understand, and we can customise savedsearches.conf (the rules) to fit our environment. On the other hand, with Splunk Security Essentials we couldn't figure out where the rules exist in order to modify them. Any ideas how to get at the detection rules of Splunk Security Essentials? Also, what is the future direction of these developments? We wanted to stick to one of them if possible.
Hi, I have two fields, "servername" and "code". I need to compute the percentage of each code by server.

index="my-index" | table servername code

Expected output:

servername  code  percent  count
server1     404   50%      50
            500   40%      40
            401   10%      10
server2     404   55%      55
            500   30%      30
            401   15%      15

Any idea? Thanks.
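A minimal sketch that should produce this shape, assuming one event per occurrence (only the index name is taken from the post):

index="my-index"
| stats count BY servername, code
| eventstats sum(count) AS total BY servername
| eval percent=round(100 * count / total, 0) . "%"
| fields servername, code, percent, count

eventstats writes the per-server total alongside each row, so percent is each code's share of its own server's events.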
The following does not give the IP for the Splunk Enterprise Security (ES) instance. Is there a better SPL query to provide the list of all Splunk instance names and IPs, especially the ES? Thanks a million in advance.

| rest /services/server/sysinfo splunk_server=local | table splunk_server

| rest /services/server/sysinfo splunk_server=local | table splunk_server | lookup dnslookup clienthost as splunk_server OUTPUT clienthost as ipAddress
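A hedged sketch of an alternative, assuming the other instances are reachable as search peers and the dnslookup external lookup is enabled; /services/server/info, the host_fqdn field, and the clientip output field are the parts to verify in your environment:

| rest /services/server/info splunk_server=*
| table splunk_server, host_fqdn, version
| lookup dnslookup clienthost AS host_fqdn OUTPUT clientip AS ipAddress

Note that splunk_server=local only ever returns the instance the search runs on, and the original lookup outputs clienthost (the name) rather than clientip (the address), which may be why no IP comes back.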
Hello all, I'm using Splunk Cloud Platform and I want to know how to access this URL. URL: /splunkd/__raw/services/data/lookup_edit/lookup_contents I saw it here: https://lukemurphey.net/projects/splunk-lookup-editor/wiki/REST_endpoints but I don't really understand how it works. If someone can help me! @LukeMurphey @Anonymous Thank you all
Hello, we received data from Alicloud and found there are a lot of duplicate fields populating the Interesting Fields list, like source and source_. Is there any query I can use to check how many fields and events are duplicated?
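A minimal sketch for the field side of the question, assuming the data sits in one index and sourcetype (placeholders below): fieldsummary lists every extracted field with its event count, so near-identical names show up side by side:

index=your_index sourcetype=your_sourcetype
| fieldsummary
| search field=source*
| table field, count, distinct_count

Dropping the | search line lists all fields, which makes any other duplicated pairs easy to spot.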
Hello community, I have been searching for a few days for a solution to display the earliest and latest values from a chart in a dashboard. Here is my query:

<search> <query>index=main Name=volume_* | chart sum("Used Capacity TB") AS "Used Capacity TB", sum("Total Capacity TB") AS "Total Capacity TB" by _time span=7d</query> <earliest>$time_token.earliest$</earliest> <latest>$time_token.latest$</latest> </search>

I would like to extract the earliest and latest values and then take the latest minus the earliest, divided by the number of days. Example: if the earliest value is 50 and the latest is 52, the calculation is 52 - 50 = 2, and 2 / 7 days = 0.286. Thank you!
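A hedged sketch of the calculation, reusing the index, field, and span from the query above; everything after timechart is assumption. timechart returns rows oldest first, so first() and last() pick the earliest and latest buckets:

index=main Name=volume_*
| timechart span=7d sum("Used Capacity TB") AS used
| stats first(used) AS first_val, last(used) AS last_val, first(_time) AS first_time, last(_time) AS last_time
| eval days=(last_time - first_time) / 86400
| eval per_day=round((last_val - first_val) / days, 3)

With first_val=50 and last_val=52 over 7 days this yields 0.286, matching the worked example; in the dashboard you would keep the same $time_token$ earliest/latest tags on this search.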
We are trying to see if there is a way to modify this TA to pull in another AD attribute. In our case it is 'department', but it could really be anything. Is there a file to modify where we could then bring in a new attribute from Azure AD? Regards.
Hello. How can two files be compared for identity?

file1.csv:

username  id_user
Jonh      123

file2.csv:

username  id_user
Jonh      124

How do I write the search correctly to check whether id_user is identical for a given user across the two files? Comparing the two files, if file1.csv id_user != file2.csv id_user, there should be a message that the IDs are different.
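A minimal sketch, assuming both CSVs are uploaded as lookup files with those exact names and column headers:

| inputlookup file1.csv
| eval source_file="file1"
| append [| inputlookup file2.csv | eval source_file="file2"]
| stats dc(id_user) AS distinct_ids, values(id_user) AS ids BY username
| where distinct_ids > 1
| eval message="id_user differs between files"

Any username returned has a different id_user in the two files; users whose IDs match are filtered out by the where clause.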
I'm trying to gather how many CPUs and cores a server has, but it seems like on most VMs the CPUs and cores report as just 1 regardless of the actual number. Here is the search I was running:

index=windows sourcetype=winhostmon source=processor | table host cpu* Number* | dedup host

And here is a section of the output:

host      cpu_architecture  cpu_cores  cpu_count  cpu_mhz  NumberOfCores  NumberOfProcessors
server1   x64               1          1          2397     1              1
server2   x64               1          1          2397     1              1
server3   x64               1          1          2497     1              1
server4   x64               1          1          2497     1              1
server5   x64               1          1          2397     1              1
server6   x64               1          1          2397     1              1
server7   x64               1          1          2497     1              1
server8   x64               1          1          3193     1              1
server9   x64               1          1          2594     1              1
server10  x64               1          1          2397     1              1
server11  x64               1          1          2397     1              1
server12  x64               1          1          2397     1              1
server13  x64               1          1          2497     1              1
server14  x64               1          1          2597     1              1
server15  x64               1          1          2497     1              1
server16  x64               1          1          2397     1              1
server17  x64               1          1          2397     1              1
server18  x64               1          1          2497     1              1
server19  x64               1          1          2597     1              1
server20  x64               1          1          2497     1              1
server21  x64               1          1          2397     1              1
server22  x64               1          1          2397     1              1
server23  x64               1          1          2497     1              1
server24  x64               1          1          2597     1              1
server25  x64               1          1          2397     1              1

This is what I have in my inputs.conf:

[WinHostMon://Processor]
interval = 300
disabled = 0
type = Processor

What commands or data sources are used to gather this data? I want to view this data on the server itself and see if the server is reporting it to Splunk wrong (my assumption) or if there is a bug in winhostmon. Thanks!
I can't make this script work for forwarder deployment. It takes up a lot of time to deploy to many servers. I guess it has some obvious flaw that I can't see. My script:

# Forwarder installation parameters
$DEPLOYMENT_SERVER = "SPLUNK-05:8089"
$RECEIVING_INDEXER = "SPLUNK-05:9997"
$MONITOR_PATH = "C:\Temp"    # trailing backslash removed: "C:\Temp\" escapes the closing quote on the command line
$CERTFILE = "c:\temp\cert.pfx"
$CERTPASSWORD = "pass"
$LOGON_USERNAME = "Admin"
$LOGON_PASSWORD = "pass"
$SET_ADMIN_USER = 1
$SPLUNKUSERNAME = "Admin"
$SPLUNKPASSWORD = "pass"
$AGREETOLICENSE = "yes"

# Build the full msiexec argument string, then wait for the install to
# finish so a failure surfaces before the script moves to the next server.
# The original defined LOGON_USERNAME/LOGON_PASSWORD but never passed them;
# they are included here.
$msiArgs = '/i "\\server\splunkforwarder-8.1.2-545206cc9f70-x64-release.msi" ' +
    "DEPLOYMENT_SERVER=$DEPLOYMENT_SERVER RECEIVING_INDEXER=$RECEIVING_INDEXER " +
    "MONITOR_PATH=$MONITOR_PATH CERTFILE=$CERTFILE CERTPASSWORD=$CERTPASSWORD " +
    "LOGON_USERNAME=$LOGON_USERNAME LOGON_PASSWORD=$LOGON_PASSWORD " +
    "SET_ADMIN_USER=$SET_ADMIN_USER SPLUNKUSERNAME=$SPLUNKUSERNAME " +
    "SPLUNKPASSWORD=$SPLUNKPASSWORD AGREETOLICENSE=$AGREETOLICENSE /quiet"
Start-Process -FilePath msiexec.exe -ArgumentList $msiArgs -Wait
How is _time set for azure:eventhub using TA-MS-AAD by default? I don't see any configuration for _time, but the index time and the _time are not identical.

[azure:eventhub]
SHOULD_LINEMERGE = 0
category = Splunk App Add-on Builder
pulldown_type = 1
####################
# Metrics
####################

A few results for reference:

_time                    indextime            count
2021-10-14 13:00:01.431  2021-10-14 13:10:27  1
2021-10-14 13:00:01.431  2021-10-14 13:10:35  1
2021-10-14 13:00:01.431  2021-10-14 13:10:37  1
2021-10-14 13:00:01.431  2021-10-14 13:10:43
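A small sketch to quantify the gap per event (the index name is a placeholder):

index=your_index sourcetype=azure:eventhub
| eval lag_sec = _indextime - _time
| convert ctime(_indextime) AS index_time
| table _time, index_time, lag_sec

A consistent positive lag_sec like the ~10 minutes above usually means _time is being parsed from a timestamp inside the event payload, while _indextime is simply when the event arrived, so the difference reflects collection latency rather than a timestamp-extraction setting.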
Hello, I am looking to create a report from a search. I have a requirement to track user logons to Windows machines (Active Directory). I am currently getting all the data, but I am having problems with false logons, i.e. services using the credentials. For example, I will see people logged in at 1 am, but the logon ID is 0x0, or there is an error code 000, so that is most likely a service using someone's credentials rather than an actual logon. There are about 1500 records a day of these false logons. I also have the requirement to track Monday - Friday from 6pm to 6am overnight, and I can't seem to get the time window right in the search. Below is the search I am currently using; any help would be appreciated, thank you!

source="WinEventLog:Security" EventCode=528 OR EventCode=540 OR EventCode=4624 OR (EventCode=4776 Error_Code=0x0) NOT Account_Name="*$" NOT Logon_Account="*$" NOT User_Name="*$"
| eval Account_Name=mvindex(Account_Name, 1)
| eval User=coalesce(Account_Name, Logon_Account, Logon_account, User_Name)
| eval User=lower(User)
| table _time, User, EventCode
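A hedged sketch of the two missing filters, appended to the search above; the Logon_ID field name and the weekday handling (any overnight hour on a weekday, so the Friday 6pm - Saturday 6am window is only partly covered) are assumptions to adjust:

...
| where Logon_ID!="0x0"
| eval hour=tonumber(strftime(_time, "%H")), wday=strftime(_time, "%A")
| where (hour>=18 OR hour<6) AND wday!="Saturday" AND wday!="Sunday"
| table _time, User, EventCode

strftime here formats the event's own timestamp, so the window holds regardless of when the report runs; schedule the report itself over whatever outer range you like, e.g. the past week.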
We have two add-ons built with Add-on Builder 1.0.1, and when we scan them with the Python readiness app, these add-ons fail. We are running Splunk 8.1 and would like to know if there is any way to convert these add-ons to Python 3 without much rework.
Hi, what is the rex to extract all bracketed values that match this pattern: [AB_123] [ZXY_987]
1. Check all brackets: if one starts with AB_, extract AB_123.
2. Check all brackets: if one starts with ZXY_, extract ZXY_987.
3. Put all of these in a single field called "Codes".
Any idea? Thanks.
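A minimal sketch, assuming the values always look like PREFIX_digits:

| rex max_match=0 field=_raw "\[(?<Codes>(?:AB|ZXY)_\d+)\]"

max_match=0 keeps every match instead of just the first, so Codes becomes a multivalue field holding AB_123, ZXY_987, and any further bracketed values with those prefixes; extend the (?:AB|ZXY) alternation if other prefixes should count.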
Hi, I've got a csv file where the first line contains the field names and the rest are separate events, but the first column is year and the second is month. There are no other time-based fields (i.e. day, hour, etc.), so I am having difficulty creating a _time field. Does anyone know how I can generate a _time field using only year and month? The day, hour, etc. are irrelevant. Thanks in advance
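A minimal search-time sketch, assuming the two columns come out as fields named year and month (numeric) and that pinning every event to midnight on the first of the month is acceptable:

| eval _time = strptime(year . "-" . month . "-01", "%Y-%m-%d")

strptime parses the assembled date string into epoch seconds, which is what _time expects; assigning this at index time instead would need a TIME_FORMAT/TIME_PREFIX arrangement in props.conf, which is awkward when year and month sit in separate columns.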