All Topics

Installed Splunk Add-on for Unix and Linux 9.0.0, but not getting memory data for an Ubuntu server. Checks performed: 1) Getting data for logical disk space and CPU, but not memory. 2) The sar utility is installed. 3) Enabled the hardware, CPU, and df metric stanzas and added the index details too.
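A quick check, assuming the add-on collects memory metrics through its vmstat input and that the data goes to an index named os (both names are assumptions; substitute your own):

index=os sourcetype=vmstat earliest=-60m
| stats count by host

If this returns nothing, the vmstat stanza in the add-on's inputs.conf is likely still disabled, since only the hardware, CPU, and df stanzas were mentioned as enabled.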
So, I have one source (transactions) with userNumber and another source (users) with number. I want to join both of them. In each source, they have different field names. I want my table to have the employee's name, which is in the users source and which I get in my second query in the join. Below is my SPL as of now:

index=* sourcetype=transaction
| stats dc(PARENT_ACCOUNT) as transactionMade by POSTDATE, USERNUMBER
| join left=L right=R where L.USERNUMBER=R.NUMBER
    [search sourcetype=users | stats values(NAME) as Employee by NUMBER]
| table USERNUMBER Employee PARENT_ACCOUNT POSTDATE transactionMade

What is it that I am doing wrong?
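A sketch of one common fix, assuming the goal is a left join keyed on the user number: rename the key in the subsearch so both sides share the same field name, then join on that single field (index, sourcetype, and field names are taken from the question). Note that PARENT_ACCOUNT no longer exists after the stats command, so it is dropped from the final table here.

index=* sourcetype=transaction
| stats dc(PARENT_ACCOUNT) as transactionMade by POSTDATE USERNUMBER
| join type=left USERNUMBER
    [search sourcetype=users
    | stats values(NAME) as Employee by NUMBER
    | rename NUMBER as USERNUMBER]
| table USERNUMBER Employee POSTDATE transactionMade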
Question: We are using Commvault Metallic to back up our O365 cloud-based user data in the Microsoft GCC. How can we send the Commvault transaction logs to our on-prem Splunk servers for event analysis and reporting?
Hi everyone, this is my first post in the community; I have been using it for some time and it has been great. However, I now have an issue for which I am not able to find an answer or thread, so I thought to ask if someone is able to help. I have a search which gives me names of people, email addresses and other data. I would like to know whether, when clicking on a value in the Email field, it is possible to open Outlook and fill in all the email addresses that were returned by the search. Let's say I have 10 results; I would like all 10 emails to be filled into the Outlook message. I am able to do it through drilldown - click.value2 or row.fieldname - but those only fill in one specific email. I would like this capability for group emails. Before I go and use sendemail, I was wondering if this can be done via mailto and possibly how? Hope you all have a good day!
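A sketch of one way to prepare a single mailto target in the search itself, assuming the addresses sit in a field named Email (as in the question) and that Outlook accepts semicolon-separated recipients; the resulting field could then be used as the drilldown URL:

index=your_index sourcetype=your_sourcetype
| stats values(Email) as Email
| eval mailto_link = "mailto:" . mvjoin(Email, ";")

Here your_index and your_sourcetype are placeholders for the existing base search; mvjoin() flattens the multivalue Email field into one semicolon-separated string.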
Hi, how do I extract the fields from the JSON logs below? We already have fields like content.jobName and content.region, but I need to extract the content.payload details - how do I extract those values?

"content" : {
  "jobName" : "PAY",
  "region" : "NZ",
  "payload" : [
    { "Aresults" : [ { "count" : "6", "errorMessage" : null, "filename" : "9550044.csv" } ] },
    { "Bresults" : [ { "count" : "6", "errorMessage" : null, "filename" : "9550044.csv" } ] }
  ]
}
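A sketch of one way to reach the nested values with spath, assuming the event is valid JSON with a top-level content key (path names taken from the sample above; index and sourcetype are placeholders):

index=your_index sourcetype=your_sourcetype
| spath path=content.payload{} output=payload
| mvexpand payload
| spath input=payload

After the second spath, each result row carries fields such as Aresults{}.count, Aresults{}.filename and Bresults{}.errorMessage, which can then be renamed or tabled as needed.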
Hi all, I installed and configured the FortiWeb app for Splunk. I also set a desired index on the heavy forwarder (named fortiweb). The problem is that the predefined dashboards in the app read the information from the main index. I can edit the dashboards and add index=fortiweb to each query, but that seems not optimal. How can I change the main index to the fortiweb index? Thanks
Hi Team, I want to extract the field value below; the challenge is that the error code (403) sometimes changes. "processing_stage": "Getting a response of 403 from CRM Lead" Kindly help me to extract the message using regex or any other option available.
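A sketch of a rex that captures both the changing status code and the system name, assuming the raw event contains the quoted processing_stage key exactly as shown (the capture-group names and the base search are just illustrative):

index=your_index "processing_stage"
| rex field=_raw "\"processing_stage\":\s*\"Getting a response of (?<status_code>\d+) from (?<source_system>[^\"]+)\""
| table status_code source_system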
Hi, could someone please help me with a lateral-movement search that looks for a user with remote NTLM (type 3) logons on an abnormal number of destinations? Thanks
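A sketch under the assumption that Windows Security events are indexed with EventCode 4624, logon type 3, and an NTLM authentication package; the index, field names, and the threshold of 10 destinations are all assumptions to adjust to your environment and add-on:

index=wineventlog EventCode=4624 Logon_Type=3 Authentication_Package=NTLM
| stats dc(ComputerName) as destination_count values(ComputerName) as destinations by user
| where destination_count > 10

A more robust version would baseline each user's normal destination count over time instead of using a fixed threshold.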
My apps running in docker containers currently use the 8.2.9 Splunk universal forwarder, which works fine. My images are based on the Linux Alpine image. I have for some time been trying to get the 9.x.x UF working instead, but I cannot get it to work. When it boots, it prints the following error:

"/opt/splunkforwarder/bin/splunk" start --accept-license --answer-yes --no-prompt
Warning: Attempting to revert the SPLUNK_HOME ownership
Warning: Executing "chown -R splunk:splunk /opt/splunkforwarder"
This appears to be your first time running this version of Splunk.
Creating unit file...
Error calling execve(): No such file or directory
Error launching command: No such file or directory
Failed to create the unit file. Please do it manually later.
Splunk> The Notorious B.I.G. D.A.T.A.
Checking prerequisites...
Checking mgmt port [8089]: open
Creating: /opt/splunkforwarder/var/lib/splunk
Creating: /opt/splunkforwarder/var/run/splunk
Creating: /opt/splunkforwarder/var/run/splunk/appserver/i18n
Creating: /opt/splunkforwarder/var/run/splunk/appserver/modules/static/css
Creating: /opt/splunkforwarder/var/run/splunk/upload
Creating: /opt/splunkforwarder/var/run/splunk/search_telemetry
Creating: /opt/splunkforwarder/var/run/splunk/search_log
Creating: /opt/splunkforwarder/var/spool/splunk
Creating: /opt/splunkforwarder/var/spool/dirmoncache
Creating: /opt/splunkforwarder/var/lib/splunk/authDb
Creating: /opt/splunkforwarder/var/lib/splunk/hashDb
Checking conf files for problems...
Done
Checking default conf files for edits...
Validating installed files against hashes from '/opt/splunkforwarder/splunkforwarder-9.1.2-b6b9c8185839-linux-2.6-x86_64-manifest'
All installed files intact.
Done
All preliminary checks passed.
Starting splunk server daemon (splunkd)...
PYTHONHTTPSVERIFY is set to 0 in splunk-launch.conf disabling certificate validation for the httplib and urllib libraries shipped with the embedded Python interpreter; must be set to "1" for increased security

However, it seems to start a background process, but I don't see the logs in Splunk. Using the status command kills the background process:

"/opt/splunkforwarder/bin/splunk" status
Warning: Attempting to revert the SPLUNK_HOME ownership
Warning: Executing "chown -R splunk:splunk /opt/splunkforwarder"
splunkd 165 was not running.
Stopping splunk helpers...

I have tried disabling boot start, but that gives me a similar error:

"/opt/splunkforwarder/bin/splunk" disable boot-start
Error calling execve(): No such file or directory
Error launching command: No such file or directory
execve: No such file or directory while running command /sbin/chkconfig

After researching this, I think it could be related to systemd perhaps? I don't think Alpine includes it; it uses OpenRC instead. However, I don't really have any use for this autostart feature anyway - is there a way to ignore/skip it somehow?
Hi @ITWhisperer, I need help: how can I show the event IDs in a dashboard? The base search is

index=foo_win*  (host="PC*" EventID=1068) OR (host="PR**" EventID="1") OR (host="PR*" EventID="1") OR (host="PR*" EventID="1").......

and I want to display _time, server (host), EventID, and severity (warning, critical, info). Desired to achieve something like the snap below.
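A sketch of one way to build such a table, assuming the severity is something you map yourself per EventID (the mapping below is purely illustrative; the host/EventID filters are taken from the question):

index=foo_win* (host="PC*" EventID=1068) OR (host="PR*" EventID="1")
| eval eid = tonumber(EventID)
| eval severity = case(eid=1068, "critical", eid=1, "warning", true(), "info")
| table _time host EventID severity
| sort - _time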
Hi All, I don't have many resources to build an ideal network environment to forward logs to Splunk. So I'm looking for a way to simulate, or a source to obtain, many common data sources in Splunk (like some SIEM solutions have scripts to forward syslog through port 514). Any answer will be highly appreciated. Regards.
Add observation dashboard risks to the Splunk integration as an incident. I'm in AU: https://portal.XX.xdr.trendmicro.com/#/app/sase Please advise, thanks.
Hello, I am running a search that returns IP addresses that are being sent to a WAF (web application firewall). The WAF requires all IP addresses to be written in CIDR notation. I am only returning single IPs, so I have to add a /32 to each address that I submit. I am using the stats command, looking at different parameters and then counting by IP to produce the list I am submitting. It seems like it should be straightforward using concatenation, but I haven't been able to get to a solution. eval cidr_address=remoteIP + "/32" and variants of this approach (casting to string etc.) haven't worked. Appreciate any help anyone can provide.
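A sketch of the concatenation with SPL's string operator, assuming the field is named remoteIP (as in the question); the "." operator always concatenates, avoiding the ambiguity of "+" between addition and concatenation. The index and sourcetype are placeholders for the existing base search:

index=your_index sourcetype=your_sourcetype
| stats count by remoteIP
| eval cidr_address = remoteIP . "/32"
| table cidr_address count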
With syslog-ng we hit all kinds of limitations, from the inability to support TCP, to the inability to write fast enough to disk (and therefore losing vast amounts of UDP data), to struggling to get the various F5 LBs to distribute the data evenly to the syslog servers behind the F5s... and all of this led to Kafka as a potential solution. Does anybody use Kafka effectively instead of syslog-ng? By the way, we did look at Splunk Connect for Syslog (SC4S) without much luck.
I have two very simple searches and I need to be able to get the difference. This is insanely hard for something that is so simple.

search index="first-app" sourcetype="first-app_application_log" AND "eventType=IMPORTANT_CREATE_EVENT" | stats count
^ this result is 150

search index="second-app" sourcetype="second-app_application_log" AND "eventType=IMPORTANT_CANCEL_EVENT" | stats count
^ this result is 5

I'm trying to figure out how to simply do 150 - 5 to get 145. I've tried `set diff` and `eval` a bunch of different ways with no luck. I'm going nuts. Any help would be greatly appreciated!
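A sketch of one way to get the difference in a single search: combine both sources, count each side separately, and subtract (index, sourcetype, and eventType strings are copied from the question):

(index="first-app" sourcetype="first-app_application_log" "eventType=IMPORTANT_CREATE_EVENT") OR (index="second-app" sourcetype="second-app_application_log" "eventType=IMPORTANT_CANCEL_EVENT")
| eval is_create = if(index="first-app", 1, 0)
| eval is_cancel = if(index="second-app", 1, 0)
| stats sum(is_create) as creates sum(is_cancel) as cancels
| eval difference = creates - cancels

appendcols with the second search as a subsearch would also work, but a single combined search avoids subsearch limits.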
How do I pull data from Splunk using a search and build a component in SUIT - Splunk UI Tools (@splunk/visualization/Area)?
Hi Experts, I need to compare server lists from two different CSV lookups and create a flag based on the comparison results. I have two lookups:

abc.csv - contains the list of servers being monitored in the dashboard
def.csv - contains the list of servers from another source

I need to identify servers that are: present in both abc.csv and def.csv; not found in the dashboard (i.e. abc.csv); and not found in def.csv. How do I compare them and create a flag? Any guidance or example queries would be greatly appreciated. Thank you
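A sketch of one comparison approach, assuming both lookups share a column named server (rename to match the actual headers); the flag labels are just illustrative:

| inputlookup abc.csv
| eval in_abc=1
| append [| inputlookup def.csv | eval in_def=1]
| stats max(in_abc) as in_abc max(in_def) as in_def by server
| eval flag = case(in_abc=1 AND in_def=1, "in both",
                   in_abc=1 AND isnull(in_def), "not found in def.csv",
                   isnull(in_abc) AND in_def=1, "not found in dashboard (abc.csv)")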
Please let me know the correct data extraction.

index=* "Unknown message for StatusConsumer" topicId marshall
| rex field=_raw "\"topicId\":\"(?<topicId>\d+)\""
| table topicId

The data is not getting parsed after adding the table command to the Splunk query.
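A sketch of a slightly more tolerant rex, assuming the topicId value may appear either quoted or unquoted and with optional whitespace around the colon (if the events are valid JSON, spath is another option):

index=* "Unknown message for StatusConsumer" topicId marshall
| rex field=_raw "\"topicId\"\s*:\s*\"?(?<topicId>\d+)\"?"
| table topicId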
Hi community, When using datamodels, is it possible to remove/exclude the portion of the autoextractSearch: | search (index=* OR index=_*) 
Query:

index=new "application status" AND Condition=Begin OR Condition=Done
| rex field=_raw "DIDS \s+\[(?<data>[^\]]+)"
| dedup data
| timechart span=1d count by application

Result:

_time        application1  application2
2022-01-06   10            20
2022-01-07   12            14
2022-01-08   18            30

I want to include the Condition field as well in the table. How can I do it?
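A sketch of one common workaround, since timechart only splits by a single field: combine application and Condition into one field and split by that (the search and rex are taken from the question; the combined field name is just illustrative):

index=new "application status" AND (Condition=Begin OR Condition=Done)
| rex field=_raw "DIDS \s+\[(?<data>[^\]]+)"
| dedup data
| eval app_condition = application . ":" . Condition
| timechart span=1d count by app_condition

This produces one column per application/Condition pair, for example application1:Begin and application1:Done.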