All Topics

I would like to have a graph so I can see the trend for a period, with an overlay of the running total for the day. A colleague suggested this:

index=... | timechart sum(values) span=5m by hosts limit=0 | addtotals

But it doesn't give the running total for the day; it gives the total for each measurement period.
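For reference, a minimal sketch of one way to build a per-day running total to overlay, assuming a single overall series (the by host split dropped) and a numeric field named values as in the question:

index=...
| timechart sum(values) as total span=5m
| eval day=strftime(_time, "%Y-%m-%d")
| streamstats sum(total) as running_total by day
| fields - day

Because the results are time-ordered, streamstats with a by day clause restarts the sum at each new day; running_total can then be selected as a chart overlay in the visualization options. With a by host split, addtotals only sums across the columns of each 5-minute row, which matches what was observed.
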
Hi everyone,

I have a JSON data payload like this:

{
  location: US
  all_results: {
    serial_a: { result: PASS, version: 123, data: [ data1, data2, data3 ] },
    serial_b: { result: PASS, version: 456, data: [ data4, data5 ] },
    serial_c: { result: FAIL, version: 789, data: [ data6, data7 ] }
  }
}

and I would like to use a Splunk query to make a table like:

serial_number   result   version   data
serial_a        PASS     123       data1, data2, data3
serial_b        PASS     456       data4, data5
serial_c        FAIL     789       data6, data7

How can I use a Splunk query to organize the result? I know I can grab the data with:

| spath path=all_results output=all_results
| eval all_results=json_extract(all_results)

The difficult part is the serial_number. The keys share the common prefix serial, but they are dynamic. Therefore, when I try to grab the data inside serial_number, for example version, I can't use a query like:

| spath output=version path=all_result.serial*.version

Could you give me some ideas on how to do that? Thank you!

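For reference, a minimal sketch of one approach using the JSON eval functions (Splunk 8.1+): enumerate the dynamic keys with json_keys, expand them, then extract each sub-object. Field names follow the sample payload; the mvjoin at the end is only an assumption about how you want the data array rendered:

| spath path=all_results output=all_results
| eval serial_number=json_array_to_mv(json_keys(all_results))
| mvexpand serial_number
| eval entry=json_extract(all_results, serial_number)
| eval result=json_extract(entry, "result"), version=json_extract(entry, "version"), data=mvjoin(json_array_to_mv(json_extract(entry, "data")), ", ")
| table serial_number result version data
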
Hello. I have some issues with field parsing for CSV files using a props configuration. I should be getting 11 fields for each of the events/rows, but parsing is giving me 17 fields. Here are 3 sample events (the first row is the header row) from that CSV file; the props.conf is also provided below:

Col1,Col2,Col3,Col4,Col5,Col6,Col7,Col8,Col9,Col10,Col11
APIDEV,4xs54,000916,DEV,Update,Integrate,String\,Set\,Number\,ID,Standard,2024-07-10T23:10:45.001Z,Process_TIME\,URI\,Session_Key\,Services,Hourly
APITEST,4ys34,000916,TEST,Update,Integrate,String\,Set\,Number\,String,Typicall\,Response,2024-07-10T23:10:45.021Z,CPU_TIME\,URI\,Session_Key\,Type\,Request,Monthly
APITEST,4ys34,000916,DEV,Insert,Integrate,Char\,Set\,System\,ID,On_Demand,2024-07-10T23:10:45.051Z,CPU_TIME\,URI\,Session_Key\,Services,Hourly

(The values containing backslash-escaped commas, e.g. String\,Set\,Number\,ID, should each count as one field.)

props.conf

[mypropscon]
SHOULD_LINEMERGE=False
LINE_BREAKER=([\r\n]+)
INDEXED_EXTRACTIONS=CSV
KV_MODE=none
disabled=false
TIME_FORMAT=%Y-%m-%dT%H:%M:%S.%QZ
HEARDER_FIELD_LINE_NUMBER=1

Any recommendation to resolve this issue would be highly appreciated. Thank you so much for your support, as always.

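As a point of reference only, a minimal sketch of the same stanza with the misspelled key corrected: HEARDER_FIELD_LINE_NUMBER would simply be ignored, since the supported name is HEADER_FIELD_LINE_NUMBER. Whether the structured CSV parser honors backslash-escaped commas at all is a separate question; its documented mechanism for embedded delimiters is quoting via FIELD_QUOTE.

[mypropscon]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
INDEXED_EXTRACTIONS = csv
KV_MODE = none
disabled = false
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%QZ
HEADER_FIELD_LINE_NUMBER = 1
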
Hi, thanks in advance! Just curious: has anybody configured a GitHub webhook to work with the Splunk HTTP Event Collector (HEC)?
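Not a confirmed integration, just a minimal sketch of what a webhook-style JSON payload posted to the HEC event endpoint looks like, with a placeholder host and token (note that GitHub webhooks cannot set custom headers such as Authorization, which is usually the sticking point and may require a relay in between):

curl -k https://splunk.example.com:8088/services/collector/event \
  -H "Authorization: Splunk 12345678-1234-1234-1234-123456789012" \
  -d '{"sourcetype": "github:webhook", "event": {"action": "opened", "repository": "org/repo"}}'
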
Hello guys,

I need to collect logs for when the Azure admin resets a password or excludes an account. I have tried using the Splunk Add-on for Microsoft Azure and the Splunk Add-on for Microsoft O365, but I can't receive these logs; I only receive the Microsoft Substrate Management events. Is there any other app to install to collect this?

Hello everyone, I do not know why the classification is Threat, even though I chose endpoint.
I have installed the Splunk OTel Collector on Linux (Ubuntu). But when I start the service and check its status, I see the logs below. I removed all references to environment variables in agent_config.yaml, keeping basic hostmetrics and otlp receivers and a file exporter. I urgently need to make it work; any help is appreciated.

● splunk-otel-collector.service - Splunk OpenTelemetry Collector
Loaded: loaded (/lib/systemd/system/splunk-otel-collector.service; enabled; vendor preset: enabled)
Active: failed (Result: exit-code) since Wed 2024-07-10 15:12:54 EDT; 13min ago
Process: 246312 ExecStart=/usr/bin/otelcol $OTELCOL_OPTIONS (code=exited, status=1/FAILURE)
Main PID: 246312 (code=exited, status=1/FAILURE)

Jul 10 15:12:54 patil-ntd systemd[1]: splunk-otel-collector.service: Scheduled restart job, restart counter is at 5.
Jul 10 15:12:54 patil-ntd systemd[1]: Stopped Splunk OpenTelemetry Collector.
Jul 10 15:12:54 patil-ntd systemd[1]: splunk-otel-collector.service: Start request repeated too quickly.
Jul 10 15:12:54 patil-ntd systemd[1]: splunk-otel-collector.service: Failed with result 'exit-code'.
Jul 10 15:12:54 patil-ntd systemd[1]: Failed to start Splunk OpenTelemetry Collector.

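The status output above only shows that the process exited; the collector's own parse or validation error should be in the journal. Below is the command to surface it, plus a minimal hostmetrics/otlp/file config sketch for comparison (the output path is a placeholder, and whether the file exporter is bundled in the Splunk distribution of the collector is an assumption worth verifying):

journalctl -u splunk-otel-collector.service -n 50 --no-pager

receivers:
  hostmetrics:
    collection_interval: 10s
    scrapers:
      cpu:
      memory:
      filesystem:
  otlp:
    protocols:
      grpc:
      http:

exporters:
  file:
    path: /tmp/otel-out.json

service:
  pipelines:
    metrics:
      receivers: [hostmetrics, otlp]
      exporters: [file]
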
 
Hi! I'm working on adding a holiday table as a lookup to reference for alerts based on volume, and I want to alert on different thresholds if it's a holiday. The referenced search is showing data for 7/10 as nonHoliday, even though, for a test, I have it listed as a holiday in the lookup file. It's a .csv, so no initial formatting seems to be passing through the file; I need to format the holidayDate column as mm/dd/yyyy.

index=my_index
| eval eventDate=strftime(_time, "%m/%d/%Y")
| lookup holidayLookup.csv holidayDate as eventDate OUTPUT holidayDate
| eval dateLookup = strftime(holidayDate, "%m/%d/%Y")
| eval holidayCheck=if(eventDate == dateLookup, "holiday", "nonHoliday")
| fields eventDate holidayCheck
| where holidayCheck="nonHoliday"

The screenshot shows it has captured the event date as expected and is outputting a value for holidayCheck, but based on the data file it's referencing, it should show as Holiday.

Data structure:

holidayDate    holidayName
07/10/2024     Testing Day
07/04/2024     Independence Day

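Just as a point of comparison, a minimal sketch of the same check without the second strftime: the lookup returns holidayDate as a plain string, so passing it to strftime treats it as an epoch number and yields null, which forces the comparison to nonHoliday. Testing whether the lookup returned anything at all is usually enough; field and lookup names are taken from the question:

index=my_index
| eval eventDate=strftime(_time, "%m/%d/%Y")
| lookup holidayLookup.csv holidayDate AS eventDate OUTPUT holidayName
| eval holidayCheck=if(isnotnull(holidayName), "holiday", "nonHoliday")
| fields eventDate holidayCheck
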
As in the subject: migrating the UF from 8.x to 9.x creates a systemd unit file by itself! I do not want it created during the first "restart" action, and I don't want to have to run a manual "disable boot-start" after the restart.

On the first restart after deploying the new version (9.0.6):

[DFS] Performing migration.
[DFS] Finished migration.
[Peer-apps] Performing migration.
[Peer-apps] Finished migration.
Creating unit file...
Important: splunk will start under systemd as user: root
The unit file has been created.

Loaded: loaded (/usr/lib/systemd/system/SplunkForwarder.service; enabled; vendor preset: disabled)

Is there a way to prevent the creation of the systemd unit file during the "restart" action? I'm on CentOS Stream. Thanks.

Hi all, I am monitoring a CSV file that has multiple lines and uses a pipe as the delimiter. I want to break it into different events, but instead Splunk is treating it as one event with multiple lines. I do have props.conf set on the indexers, but it didn't change anything.

#My Props.conf
[my myfake-sourcetype]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)
NO_BINARY_CHECK=true
CHARSET=UTF-8
INDEXED_EXTRACTIONS=PSV
KV_MODE=none
disabled=false
category=Structured
pulldown_type=true
FIELD_DELIMITER=|
FIELD_NAMES=eruid|description|

My inputs.conf
[monitor:///my/fake/path/hhhh.csv*]
disabled = 0
sourcetype = hhhh:csv
index = main
crcSalt = <SOURCE>

Sample data (the same block of five lines repeats through the rest of the file):

eruid|description|
batman|uses technology|
superman|flies through the air|
spiderman|uses a web|
ghostrider| rides a motorcycle

Regards

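For what it's worth, a minimal sketch of how the two files are usually aligned: the props stanza name has to match the sourcetype the input assigns (here hhhh:csv, whereas the posted stanza is named "my myfake-sourcetype"), FIELD_NAMES is a comma-separated list, and because INDEXED_EXTRACTIONS is applied at the first Splunk instance that reads the file, this props.conf needs to live on the forwarder doing the monitoring, not only on the indexers. Paths and index are placeholders from the question:

# props.conf (on the instance running the monitor input)
[hhhh:csv]
INDEXED_EXTRACTIONS = psv
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
FIELD_DELIMITER = |
FIELD_NAMES = eruid,description
NO_BINARY_CHECK = true
CHARSET = UTF-8
KV_MODE = none

# inputs.conf
[monitor:///my/fake/path/hhhh.csv*]
disabled = 0
sourcetype = hhhh:csv
index = main
crcSalt = <SOURCE>
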
Hello all, and hopefully @rechteklebe

I currently have PCAP Analyzer working to the point that I can copy a *.pcap or *.pcapng file to the monitor folder and it will run it through tshark and produce the output file that Splunk then ingests to power the dashboards. I found that Suricata on pfSense outputs the pcap files as log.pcap.<I think date data here>, for example log.pcap.1720627634. These do not get converted and ingested. I went to add a whitelist in the inputs.conf, but running a btool check it comes back as an invalid key in the stanza. I remembered that the web UI section for this also does not offer a whitelist/blacklist setting, so I guess you have that disabled somehow. I'm assuming one or more of your Python scripts filter on whether the file is a *.pcap or *.pcapng.

I'm finding it isn't an option to change the file name format in the Suricata GUI in pfSense, or to have rsync change the file names on copy, or to have a bash script do it on the Linux host the files get moved to. Is there a set of Python scripts where I can change this whitelisting? Or a way to enable whitelisting at the inputs.conf level? Or could a transforms fix this?

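In case it helps while waiting for an answer from the app side, this is the standard whitelist syntax for a plain [monitor://] stanza; whitelist/blacklist are only valid for file-monitoring inputs, so a btool "invalid key" error usually means the app's stanza is a different input type (e.g. a scripted input). The path is a placeholder:

[monitor:///opt/pcap_files]
whitelist = (\.pcap(ng)?$|log\.pcap\.\d+$)
disabled = 0
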
There was at one point an add-on for sending Splunk logs to MS Sentinel, but it appears it was deprecated some time ago. I need to incorporate data from a customer that uses Splunk into my SOC's Sentinel environment. Are there any built-in functions that can be used to forward to Sentinel? The forwarding option in Splunk appears to only work to other Splunk instances, and all the current add-ons appear to be focused on ingesting from Sentinel into Splunk. I have been researching a variety of options, but none seem to fill the void that I can find at this time, short of creating and maintaining my own Splunk add-on.

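One built-in mechanism that is not Splunk-to-Splunk is syslog output configured in outputs.conf; whether a Sentinel-side syslog/CEF collector is acceptable on your end is an open question, so treat this purely as a sketch with placeholder host, sourcetype, and routing names:

# outputs.conf
[syslog:sentinel_out]
server = sentinel-collector.example.com:514
type = tcp

# props.conf (route a sourcetype to the syslog group)
[my:sourcetype]
TRANSFORMS-route_to_sentinel = send_to_sentinel

# transforms.conf
[send_to_sentinel]
REGEX = .
DEST_KEY = _SYSLOG_ROUTING
FORMAT = sentinel_out
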
How can I match the IPs from one CSV file with the CIDR ranges in another CSV? If no CIDR matches, I want to return "NoMatch", and if an IP and a CIDR do match, return the CIDR.

I tried the approach below, but I keep getting "No Match" for all entries, even though I have proper CIDR ranges:

| inputlookup IP_add.csv
| rename "IP Address" as ip
| appendcols [| inputlookup cidr.csv]
| foreach cidr [ eval match=if(cidrmatch('<<FIELD>>', ip), cidr, "No Match")]

Note: I can't use join, as I don't have an IP field or IPs in the cidr CSV. Any help would be greatly appreciated. Thank you.

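A minimal sketch of the usual alternative: appendcols only pairs row N with row N, so each IP is tested against a single CIDR. Defining a lookup on cidr.csv whose match type is CIDR(cidr) lets every IP be checked against every range; the lookup definition name cidr_lookup and the output field matched_cidr are assumptions:

# transforms.conf (or Settings > Lookups > Lookup definitions > Advanced > Match type)
[cidr_lookup]
filename = cidr.csv
match_type = CIDR(cidr)

# search
| inputlookup IP_add.csv
| rename "IP Address" as ip
| lookup cidr_lookup cidr AS ip OUTPUT cidr AS matched_cidr
| eval matched_cidr=coalesce(matched_cidr, "NoMatch")
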
We have a table with large text values. Those values need to be truncated to a single line.
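If this is about trimming the field in SPL (rather than a visualization setting), a minimal sketch with big_text as a placeholder field name: keep everything before the first line break, then cap the length.

| eval one_line=replace(big_text, "(?s)[\r\n].*$", "")
| eval one_line=substr(one_line, 1, 200)
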
Can someone explain to me why, when I run my base search, it has exponentially more events in the same time frame compared to the summary index search (which is based on the base search)? My main concern is whether or not I have gaps in log events. The summary index report runs every two hours, looking back two hours.
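A summary index normally holds aggregated rows rather than raw events, so a lower count by itself is expected. A minimal sketch for checking gaps instead, assuming hypothetical index and report names, is to compare the two sources bucket by bucket (for summary events, source is usually the name of the saving report):

index=summary source="my summary report" earliest=-7d
| timechart span=2h count as summary_rows

index=my_base_index <base search filters> earliest=-7d
| timechart span=2h count as raw_events

A 2-hour window where summary_rows is zero but raw_events is not would point at a gap, e.g. a skipped scheduled run.
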
Hi,

Query 1: how do I wrap the text (column values) in a table in Splunk Dashboard Studio?
Query 2: how do I expand and collapse the row size in a table in Splunk Dashboard Studio?
Hey, I just heard about CVE-2024-5535 on the splunkforwarder agent 9.0.9 with OpenSSL 1.0.2zj. Is this a real one? Do we need to upgrade the agent now? Thanks in advance.
The table below is an example where both files have the same index and sourcetype but different sources. I need to search on a field from the first file, and the result should be a combination of fields from files 1 and 2.

File 1

T1_Fld 1   T1_Fld 2   Domain         T1_Fld 4   T1_Fld 5
AAA        xxx        google.com     yy1        bbb
AAB        xxx        Facebook.com   yy2        bbb
AAB        xxx        Gmail.com      yy3        bbb
AAD        xxx        Yahoo.com      yy4        bbb
AAE        xxx        xxx.com        yy5        bbb

File 2

Domain         IP
google.com     1.1.1.1
Facebook.com   2.2.2.2
Gmail.com      3.3.3.3
Yahoo.com      4.4.4.4
xxx.com        5.5.5.5

Consider that I am running a search where T1_Fld 1=AAB; then the result table should look like this:

Output

T1_Fld 1   Domain         IP        T1_Fld 4
AAB        Facebook.com   2.2.2.2   yy2
AAB        Gmail.com      3.3.3.3   yy3

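A minimal sketch of the usual join-free pattern: pull both sources, merge per Domain with stats, then filter. Index, sourcetype, and source names are placeholders, and the fields are renamed first only to avoid spaces in field names:

index=my_index sourcetype=my_sourcetype (source="file1" OR source="file2")
| rename "T1_Fld 1" as T1_Fld_1, "T1_Fld 4" as T1_Fld_4
| stats values(T1_Fld_1) as T1_Fld_1 values(IP) as IP values(T1_Fld_4) as T1_Fld_4 by Domain
| search T1_Fld_1="AAB"
| table T1_Fld_1 Domain IP T1_Fld_4
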
We have configured inputs.conf with a TCP input to receive the streamed logs and send them to the Splunk server via TCP output, but logs are not being forwarded to the Splunk server. Could someone please share a proper set of inputs.conf and outputs.conf for reading logs from TCP inputs?

inputs.conf

[tcp://1.2.3.4:7514]
connection_host=ip
queueSize=10MB
persistentQueueSize=50MB
index=test_data
sourcetype=testdata
_TCP_ROUTING=ib_group

outputs.conf

[tcpout:ib_group]
server=1.2.3.4:9997
useACK=false

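For comparison, a minimal sketch of the usual layout. Two details worth noting: in [tcp://<host>:<port>] the host part restricts which remote host is allowed to connect (to listen for any sender you normally use just [tcp://7514]), and outputs.conf typically also carries a [tcpout] stanza with defaultGroup. The addresses are placeholders:

# inputs.conf (on the instance receiving the stream)
[tcp://7514]
connection_host = ip
index = test_data
sourcetype = testdata
_TCP_ROUTING = ib_group

# outputs.conf (forwarding on to the indexer)
[tcpout]
defaultGroup = ib_group

[tcpout:ib_group]
server = 10.0.0.10:9997
useACK = false
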