All Topics



Is it possible to implement classification using Splunk MLTK? If yes, how do I implement it? Regards, Balaji TK
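For reference, classification in MLTK is typically built with the fit and apply commands; a minimal sketch (the lookup names, the target field, and the feature fields below are hypothetical):

    | inputlookup customer_churn.csv
    | fit LogisticRegression churned from tenure monthly_charges contract_type into churn_model

    | inputlookup new_customers.csv
    | apply churn_model

The first search trains and saves the model; the second applies it to new events, adding a predicted(churned) field.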
My objective is to build a search that compares the dest_ip field of outbound traffic with the ip values in a lookup table of malicious IPs and returns any matches. The current search is something simple like: index=NetworkTraffic dest_zone="Internet" NOT src_zone="Internet" to view the outbound traffic. The output includes a dest_ip field. If I have a lookup table called maliciousIPs.csv, which contains a field called "ip", how do I compare it against the dest_ip field? Ex: if the dest_ip value of one of the NetworkTraffic events is 1.2.3.4 and the IP address 1.2.3.4 exists within maliciousIPs.csv, then the search would return that event.
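One common approach, a minimal sketch assuming maliciousIPs.csv is uploaded as a lookup table file with a field named ip: use a subsearch to turn the lookup into a filter on dest_ip.

    index=NetworkTraffic dest_zone="Internet" NOT src_zone="Internet"
        [| inputlookup maliciousIPs.csv | rename ip AS dest_ip | fields dest_ip]

The subsearch expands to (dest_ip=1.2.3.4 OR dest_ip=...), so only events whose dest_ip appears in the lookup are returned.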
I found an older discussion post that answered this question, but I want to see if anything has changed. Does Splunk offer any sort of discount or exam voucher for active duty military members or veterans? Thanks
Hi folks, I am new to Splunk and trying to get the dynamic source value from the response. Here is my query:

index="itestData" AND source="/opt/ABC/DEF/GHI/KLM/LOG*" AND "error"

Please note that the * after LOG is a dynamic value (like LOG-A.log, LOG-B.log, LOG-C.log) and there are at least 70 servers like this. When I get an error, I want to know which log it is coming from (A or B or C and so on). Let me know if there is another way to get this (I do not want to list the source names individually, as servers go up and down). Thanks in advance.
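A minimal sketch, assuming the file names follow the LOG-<something>.log pattern: extract the suffix from the source field and group by it.

    index="itestData" source="/opt/ABC/DEF/GHI/KLM/LOG*" "error"
    | rex field=source "LOG-(?<log_id>[^/.]+)\.log$"
    | stats count BY host source log_id

Since source already carries the full path, grouping by source alone also works; the rex just pulls out the A/B/C part as its own field.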
Hi, I've been learning Splunk in my free time and I'm at the part of my lesson that teaches how to add a Splunk index via the CLI. I think I made a mistake with either the stanza or the key values; can someone help me out with this one?

Splunk> 4TW
Checking prerequisites...
        Checking http port [8000]: open
        Checking mgmt port [8089]: open
        Checking appserver port [127.0.0.1:8065]: open
        Checking kvstore port [8191]: open
        Checking configuration... Done.
        Checking critical directories... Done
        Checking indexes...
Problem parsing indexes.conf: Cannot load IndexConfig: stanza=security Required parameter=homePath not configured
Validating databases (splunkd validatedb) failed with code '1'.
If you cannot resolve the issue(s) above after consulting documentation, please file a case online at http://www.splunk.com/page/submit_issue
$
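The error says the [security] stanza is missing homePath; a new index stanza in indexes.conf normally needs homePath, coldPath, and thawedPath. A minimal sketch, with paths assumed under $SPLUNK_DB:

    [security]
    homePath   = $SPLUNK_DB/security/db
    coldPath   = $SPLUNK_DB/security/colddb
    thawedPath = $SPLUNK_DB/security/thaweddb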
New to Splunk. Attempting to have Splunk monitor and index logs from a NAS. Logs from multiple clients are being centrally stored on the NAS, and I need Splunk to look at the network drive and index the logs in the shared folder, using a UNC path (\\192.168.xxx.xxx\sharefolder\filepath). Under the monitored files and directories in Splunk, I am able to see the number of log files in the selected directory, and it instantly shows changes when log files are added or removed. The Splunk account was given administrator access to rule out privilege issues, and I logged in as the Splunk service account and confirmed it can access the network location. If I take the same files, place them in a local folder, and index them using the same method, they are indexed instantly. The problem appears to be that Splunk is not ingesting and indexing the logs when pointed at the network location. Any help would be very much appreciated.
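For reference, a minimal inputs.conf sketch (the index and sourcetype names are assumptions); note that the account running splunkd must be a domain or service account with rights to the share, since the default LocalSystem account typically cannot reach UNC paths:

    [monitor://\\192.168.xxx.xxx\sharefolder\filepath]
    disabled = false
    index = main
    sourcetype = nas_logs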
I want to report success and failure per interface (DFOINTERFACE).

Success log below, where "completed successfully" is the main keyword:

2022-12-06 14:43:21:064 EST| INFO |dfo_.allocation DFOINTERFACE=dfo_.allocation START -- dfo_.allocation Execution accountNumber=%productValidationRequest/accountNumber/accountBase%%productValidationRequest/accountNumber/accountDest% completed successfully MFRESPONSETIME=96 millisec 176 microsec 997 nanosec MFPROGRAMEID=OMCRCAL1 Service Name : LowCodePlatform.RESTService.allocation:_post

And error log below, where "completed with Error" is the main keyword:

2022-12-06 13:52:38:233 EST| ERROR |dfo_.productValidation DFOINTERFACE=dfo_.productValidation START -- dfo_.productValidation Execution accountNumber=076732008 completed with Error 20120014 - CICS ECI Connection: Transformation error on reply: Invalid decimal digit: MFRESPONSETIME=411 millisec 753 microsec 627 nanosec MFPROGRAMEID=OECDFB21 Service Name : LowCodePlatform.RESTService.ProductValidation

I want a report like the one in the attachment.
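A minimal sketch of a success/failure breakdown per interface, assuming a hypothetical index name and that each event contains DFOINTERFACE= and one of the two key phrases:

    index=your_index ("completed successfully" OR "completed with Error")
    | rex "DFOINTERFACE=(?<dfo_interface>\S+)"
    | eval status=if(searchmatch("completed with Error"), "failure", "success")
    | chart count OVER dfo_interface BY status

This produces one row per DFOINTERFACE with success and failure counts, which can then be shaped to match the attached layout.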
Hi all, I'm a cloud security engineer. I recently started using Splunk. My organization is looking to use Splunk to enhance our cloud security capability. So far, we have ingested some samples of Azure logs into the dev environment. I'm looking for ideas on what to do next and how other organizations are using Splunk to work with Azure and AWS logs.   Many thanks,   Daniel
Once I installed the SA-Eventgen app and enabled the SA-Eventgen data input, it started ingesting events for the following sourcetypes, but I don't see any configuration in the eventgen.conf file. How is this happening? Thanks

bro:http:json bro:weird:json bro_conn bro_dhcp bro_ftp bro_notice bro_smtp bro_ssh bro_tunnel cisco:sourcefire eStreamer mcafee:ids oracle:alert:text oracle:audit:text oracle:connections oracle:database oracle:database:size oracle:dbFileIoPerf oracle:incident oracle:instance oracle:libraryCachePerf oracle:listener:text oracle:osPerf oracle:pool:connections oracle:query oracle:session oracle:sga oracle:sysPerf oracle:table oracle:tablespace oracle:tablespaceMetrics oracle:trace oracle:user snort sophos:appcontrol sophos:computerdata sophos:devicecontrol sophos:firewall
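Eventgen reads eventgen.conf from every installed app, not only from SA-Eventgen, so another installed app that ships sample data is most likely supplying these stanzas. A quick way to see which app each stanza comes from (a sketch assuming a default $SPLUNK_HOME):

    $SPLUNK_HOME/bin/splunk btool eventgen list --debug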
Hi Community, has anyone had a problem after installing the new Splunk 9.0.2.1? I had a problem after I finished downloading: when I wanted to accept the license, there were error messages as below.

┌──(root kali)-[/opt/splunk/bin]
└─# ./splunk start --accept-license

Splunk> 4TW
Checking prerequisites...
        Checking http port [8000]: open
        Checking mgmt port [8089]: open
        Checking appserver port [127.0.0.1:8065]: open
        Checking kvstore port [8191]: open
        Checking configuration... Done.
        Checking critical directories... Done
        Checking indexes...
                Validated: _audit _configtracker _internal _introspection _metrics _metrics_rollup _telemetry _thefishbucket history main summary
        Done
        Checking filesystem compatibility... Done
        Checking conf files for problems... Done
        Checking default conf files for edits...
        Validating installed files against hashes from '/opt/splunk/splunk-9.0.1-82c987350fde-linux-2.6-x86_64-manifest'
        All installed files intact.
        Done
All preliminary checks passed.

Starting splunk server daemon (splunkd)...
Job for Splunkd.service failed because the control process exited with error code.
See "systemctl status Splunkd.service" and "journalctl -xeu Splunkd.service" for details.

┌──(root kali)-[/opt/splunk/bin]
└─# systemctl status splunkd.service
× splunk.service - LSB: Start splunk
     Loaded: loaded (/etc/init.d/splunk; generated)
     Active: failed (Result: exit-code) since Sun 2022-12-11 06:35:55 EST; 10min ago
       Docs: man:systemd-sysv-generator(8)
    Process: 64361 ExecStart=/etc/init.d/splunk start (code=exited, status=1/FAILURE)
        CPU: 7.741s

Dec 11 06:35:55 kali splunk[64364]: Done
Dec 11 06:35:55 kali splunk[64364]: All preliminary checks passed.
Dec 11 06:35:55 kali splunk[64364]: Starting Splunk server daemon (splunkd)...
Dec 11 06:35:55 kali systemctl[64427]: Job for Splunkd.service failed because the control process exited with error code.
Dec 11 06:35:55 kali systemctl[64427]: See "systemctl status Splunkd.service" and "journalctl -xeu Splunkd.service" for details.
Dec 11 06:35:55 kali splunk[64362]: Systemd manages the Splunk service. Use 'systemctl start Splunkd' to start the service. Root permission is require>
Dec 11 06:35:55 kali systemd[1]: splunk.service: Control process exited, code=exited, status=1/FAILURE
Dec 11 06:35:55 kali systemd[1]: splunk.service: Failed with result 'exit-code'.
Dec 11 06:35:55 kali systemd[1]: Failed to start LSB: Start splunk.
Dec 11 06:35:55 kali systemd[1]: splunk.service: Consumed 7.741s CPU time.
lines 1-17/17 (END)

Can anyone help me solve this problem? I appreciate it.
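Two things stand out in the output above: the log message itself says "Systemd manages the Splunk service. Use 'systemctl start Splunkd' to start the service", and the status that was checked is the old splunk.service init script rather than the Splunkd.service unit that actually failed. A quick check, assuming the unit is named Splunkd.service as shown:

    systemctl start Splunkd
    journalctl -xeu Splunkd.service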
Hi, can someone help me provide a solution like the one shown in the attached image?
Hello, I am trying to set a static color for my single value visualization (the value is a string). I tried it with CSS, adding an ID to the single value markup, and it didn't work. I also tried to set these options:

<option name="colorMode">block</option>
<option name="colorBy">value</option>
<option name="useColors">0</option>
<option name="rangeColors">["0x0000ff"]</option>

and that didn't work either.
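One approach that is often suggested is a hidden HTML panel that injects CSS scoped to the element id; this is only a sketch, and the id and selector below are assumptions that may need adjusting to the markup of your Splunk version (e.g. set id="my_single_value" on the single element):

    <row depends="$alwaysHideCSS$">
      <panel>
        <html>
          <style>
            #my_single_value svg text.single-result { fill: #0000ff !important; }
          </style>
        </html>
      </panel>
    </row>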
Dears, I need your help extracting the domain plus top-level domain from DNS queries, where:

Query field       | Extracted field
Account.fb.com    | Fb.com
Aa.bb.cc.com      | Cc.com
Www.google.com    | Google.com

Thanks in advance
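A minimal sketch that keeps the last two labels of the query (the index and field names are assumptions; this handles simple endings like .com but not multi-part suffixes like .co.uk, for which the URL Toolbox app is often used instead):

    index=dns
    | rex field=query "(?<registered_domain>[^.]+\.[^.]+)$"
    | table query registered_domain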
I have the following main search:

index=utm sys=SecureNet action=drop
| eval protocol=case(proto==1, "ICMP", proto==6, "TCP", proto==17, "UDP", proto==132, "SCTP", 1=1,proto)
| table _time severity srcip srcport srcmac dstip dstport dstmac protocol eval action fwrule tcpflags ttl initf outitf
| sort -_time

On the existing eval, I need to modify the end that acts as the else. Right now, the else specifies a name for the numbers 1, 6, 17, and 132 in the field "proto". I need the else to take any other occurring number and look up the associated name from a csv containing 2 fields: "number" and "name". I cannot for the life of me figure out what kind of subsearch to use or the syntax... I imagine it is something like:

| inputlookup protocol_number_list.csv | search number=proto | return name

but I can't figure out how to combine the two. Any help would be greatly appreciated, thanks!
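One way to avoid a subsearch entirely, a sketch assuming protocol_number_list.csv is uploaded as a lookup table file: run lookup first, then let case() prefer the hard-coded names, fall back to the looked-up name, and finally fall back to the raw number.

    index=utm sys=SecureNet action=drop
    | lookup protocol_number_list.csv number AS proto OUTPUT name AS proto_name
    | eval protocol=case(proto==1, "ICMP", proto==6, "TCP", proto==17, "UDP", proto==132, "SCTP",
                         isnotnull(proto_name), proto_name, true(), proto)
    | table _time severity srcip srcport srcmac dstip dstport dstmac protocol action fwrule tcpflags ttl initf outitf
    | sort -_time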
Hi experts, we have Splunk Enterprise 8.2.6 on SLES 12 SP4 in GCP. There are many corrupted buckets on the indexer nodes. Has anyone experienced this kind of bucket corruption due to the OS, OS-related patches, or something else? Thank you.
Hi all, what added advantage does Splunk MLTK bring when we already have commands like predict, cluster, anomaly detection, and association in Splunk Enterprise, which use ML algorithms? In what scenarios or use cases do we really need Splunk MLTK? Regards, Balaji TK
Hi, I have 3 servers that each generate a log file daily of about 12 GB (12*3 = 36 GB). How can I gather these files on a centralized log server?

FYI1: I can't use the Splunk forwarder in this scenario.
FYI2: rsyslog, filebeat, syslog-ng, ... are the available solutions, but I can't decide which one is most suitable for this.
FYI3: the raw data is important and must not be missed.
FYI4: like the forwarder, whenever a server or the network goes down, it should continue sending data after the issue is resolved. (AFAIK rsyslog uses a tracker when the service is stopped and tries to send the remaining file after the service starts again.)

Any ideas? Thanks,
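Since rsyslog is on the candidate list, a minimal sketch of the usual pattern: imfile reads the files, omfwd forwards them, and a disk-assisted queue retries after an outage. The host name, file path, and port below are assumptions.

    module(load="imfile")

    input(type="imfile" File="/var/log/myapp/*.log" Tag="myapp:" Severity="info")

    action(type="omfwd" target="logserver.example.com" port="514" protocol="tcp"
           queue.type="LinkedList" queue.filename="myapp_fwd"
           queue.saveOnShutdown="on" action.resumeRetryCount="-1")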
Hi, hope you are doing well. I'm working on a use case that should trigger if any user tries to connect from a non-business country. Attaching a snapshot of the query. I want to optimize it further: if one user tries to log in from more than 2-3 countries, it should trigger. Can you please help me with the query? Thanks, Debjit
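A minimal sketch of the multi-country condition, assuming a hypothetical authentication index with a src_ip field; iplocation adds a Country field that can then be distinct-counted per user:

    index=auth action=success
    | iplocation src_ip
    | stats dc(Country) AS country_count values(Country) AS countries BY user
    | where country_count > 2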
I followed a tutorial on how to create an alert for a failed root login by searching for "failed password for root". The alert is created, but I want to see it be triggered. I'm working on a VM; what's the best way to see the alert in action?
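One simple way to generate matching events on the same VM, assuming SSH password authentication is enabled and Splunk is already ingesting the Linux auth logs: deliberately fail an SSH login as root a few times.

    ssh root@localhost    # enter a wrong password two or three times

Then run the alert's base search (e.g. "failed password for root" over the last 15 minutes) to confirm the events arrived, and wait for the alert's schedule or temporarily set it to a short cron interval.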
We have been experiencing unusually high memory usage on some of our domain controllers. The culprit here is the Splunk process splunk-MonitorNoHandle.exe. Here is the memory usage report from the domain controllers (process name, PID, session name, session number, memory usage):

DC1: splunk-MonitorNoHandle.exe   17724   Services   0   14,993,012 K
DC2: splunk-MonitorNoHandle.exe   53268   Services   0   38,927,688 K
DC3: splunk-MonitorNoHandle.exe   16164   Services   0   43,997,828 K
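For reference, splunk-MonitorNoHandle.exe is spawned by the MonitorNoHandle input on the Windows forwarder; a quick way to see which app enables it (a sketch, assuming the default universal forwarder install path) is to list the merged inputs on one of the DCs and review or disable the returned [MonitorNoHandle://...] stanzas:

    "C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" btool inputs list --debug | findstr /i MonitorNoHandle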