All Topics

I haven't been able to find any documentation stating whether or not DB Connect is IPv6 compatible. My customer has a requirement to migrate to IPv6, and I need to know if this app will still be able to pull data from an MS-SQL DB once the protocol changes.
Please help me with the following. Requirements: once 3 events are met, the next event should be published immediately; if that event is not published within 5 minutes, I need an alert. Example: we have one customer number, and for that customer number I have to search whether logs for all 3 events are available in Splunk, e.g.: index=1 sourcetype="abc" "s1 event received" AND "s2 event received" AND "s3 event received". When I run the above search, I get results like: S1 received for customer 12345, S2 received for customer 12345, S3 received for customer 12345. If all 3 events are met for one customer, I next want to search whether a "created" message is available in Splunk for the same customer (12345). The "created" message has a different index and sourcetype. If the "created" message is not available for customer 12345 within 5 minutes of all 3 events being met, I need an alert. Please help with this query.
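To make the intended correlation concrete, here is a plain-Python sketch of the logic being described (alert when a customer has all three "sN event received" messages but no "created" message within 5 minutes of the last one). Everything in it — the function name, the data shapes, the sample times — is hypothetical, not Splunk code:

```python
from datetime import datetime, timedelta

# Hypothetical in-memory sketch: a customer who has received all three
# "sN event received" messages should produce a "created" message within
# 5 minutes of the last of the three; otherwise we alert.
def customers_to_alert(events, created, now, window=timedelta(minutes=5)):
    """events: {customer: {"s1": t, "s2": t, "s3": t}} (datetimes);
    created: set of customers for whom a "created" message exists."""
    alerts = []
    for customer, received in events.items():
        if {"s1", "s2", "s3"} <= set(received):
            third = max(received.values())  # time the last of the 3 arrived
            if customer not in created and now - third > window:
                alerts.append(customer)
    return alerts

t0 = datetime(2023, 11, 13, 12, 0, 0)
events = {"12345": {"s1": t0, "s2": t0 + timedelta(minutes=1),
                    "s3": t0 + timedelta(minutes=2)}}
# 10 minutes after the third event, no "created" seen -> alert
print(customers_to_alert(events, created=set(), now=t0 + timedelta(minutes=12)))
```

In SPL this would typically be built with two searches joined on the customer number and a scheduled alert over a sliding window, but the sketch above is the shape of the condition being evaluated.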
Hi, I use a Splunk alert with a 24-hour time window. What is strange is that this alert shows me an event older than 24 hours, so I have 2 questions: 1) How is it possible that an alert fires on an event outside the specified time window? 2) How do I customize the alert to be sure that it shows only new events, not events already shown? In other words, I need the alert to fire just once when an event is detected. Thanks
Hi, over the weekend we had an electrical problem in our secondary datacenter, which currently has no power. One indexer was in that datacenter; it was clustered with another indexer. Are there any tasks that I must do in the meantime? The estimated recovery time for the datacenter is 3 to 4 days. Should I maybe put the indexers in maintenance mode? I've read https://docs.splunk.com/Documentation/Splunk/9.1.1/Indexer/Whathappenswhenapeergoesdown but it only talks about what happens with bucket fixing. Regards.
Hello all, I have a lookup with a single column that lists source file names and paths. I want to search an index, look up the sources, then show the latest time of those sources. I also want to show if a file hasn't logged at all in a given timeframe. I set the lookup to use WILDCARD() in the lookup definition, but I am now struggling with the search. I basically want the search to look up each source file, then search the index and tell me the latest time of that source's logs, as well as show "No Logs Found" if the source doesn't exist. I was toying with this, but the wildcards aren't working, and I think it is because I am not using the definition. But even so, I can't wrap my head around the search.     | inputlookup pvs_source_list | join type=left source [| search index=pvs | stats latest(_time) as TimeAx by source]     Thank you!
Hi, we are testing the Splunk Add-on for Sysmon for Linux to ingest Sysmon data from Linux systems. Data ingestion and the majority of the extractions are working fine, except for the Data part:   <Data Name="FieldName">    It appears that Splunk completely skips over this. We have Sysmon for Windows working as well, and the same attribute gets extracted just fine. The data format between Sysmon on Linux vs. Windows is identical, and so are the transforms stanzas in the TAs. The only difference I could see is that the field name in Windows is enclosed in single quotes, whereas for Linux it is double quotes. Could this be causing the regex in the TA to not work for Data? Including some examples here.  Sample data from Linux Sysmon:   <Event><System><Provider Name="Linux-Sysmon" Guid="{ff032593-a8d3-4f13-b0d6-01fc615a0f97}"/><EventID>3</EventID><Version>5</Version><Level>4</Level><Task>3</Task><Opcode>0</Opcode><Keywords>0x8000000000000000</Keywords><TimeCreated SystemTime="2023-11-13T13:34:45.693615000Z"/><EventRecordID>140108</EventRecordID><Correlation/><Execution ProcessID="24493" ThreadID="24493"/><Channel>Linux-Sysmon/Operational</Channel><Computer>computername</Computer><Security UserId="0"/></System><EventData><Data Name="RuleName">-</Data><Data Name="UtcTime">2023-11-13 13:34:45.697</Data><Data Name="ProcessGuid">{ba131d2e-2a52-6550-285f-207366550000}</Data><Data Name="ProcessId">64284</Data><Data Name="Image">/opt/splunkforwarder/bin/splunkd</Data><Data Name="User">root</Data><Data Name="Protocol">tcp</Data><Data Name="Initiated">true</Data><Data Name="SourceIsIpv6">false</Data><Data Name="SourceIp">x.x.x.x</Data><Data Name="SourceHostname">-</Data><Data Name="SourcePort">60164</Data><Data Name="SourcePortName">-</Data><Data Name="DestinationIsIpv6">false</Data><Data Name="DestinationIp">x.x.x.x</Data><Data Name="DestinationHostname">-</Data><Data Name="DestinationPort">8089</Data><Data Name="DestinationPortName">-</Data></EventData></Event>   Sample data from Windows Sysmon:   <Event
xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Sysmon' Guid='{5770385f-c22a-43e0-bf4c-06f5698ffbd9}'/><EventID>3</EventID><Version>5</Version><Level>4</Level><Task>3</Task><Opcode>0</Opcode><Keywords>0x8000000000000000</Keywords><TimeCreated SystemTime='2023-11-13T13:26:31.064124600Z'/><EventRecordID>1571173614</EventRecordID><Correlation/><Execution ProcessID='2988' ThreadID='5720'/><Channel>Microsoft-Windows-Sysmon/Operational</Channel><Computer>computername</Computer><Security UserID='S-1-5-18'/></System><EventData><Data Name='RuleName'>-</Data><Data Name='UtcTime'>2023-11-13 13:26:13.591</Data><Data Name='ProcessGuid'>{f4558f15-1db6-654f-8400-000000007a00}</Data><Data Name='ProcessId'>4320</Data><Data Name='Image'>C:\..\..\image.exe</Data><Data Name='User'>NT AUTHORITY\SYSTEM</Data><Data Name='Protocol'>tcp</Data><Data Name='Initiated'>true</Data><Data Name='SourceIsIpv6'>false</Data><Data Name='SourceIp'>127.0.0.1</Data><Data Name='SourceHostname'>computername</Data><Data Name='SourcePort'>64049</Data><Data Name='SourcePortName'>-</Data><Data Name='DestinationIsIpv6'>false</Data><Data Name='DestinationIp'>127.0.0.1</Data><Data Name='DestinationHostname'>computername</Data><Data Name='DestinationPort'>4932</Data><Data Name='DestinationPortName'>-</Data></EventData></Event>   Transforms on both sides are also identical except the difference for single Vs double quotes.   Linux [sysmon-data] REGEX = <Data Name="(.*?)">(.*?)</Data> FORMAT = $1::$2 Windows [sysmon-data] REGEX = <Data Name='(.*?)'>(.*?)</Data> FORMAT = $1::$2    Any clues on what could be causing Splunk to not extract Data attribute for Linux? Transforms for other elements such as Computer, Keywords are working fine, it just skips this Data part completely. Thanks,
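As a quick sanity check outside Splunk, the double-quote pattern from the Linux transforms stanza does match the Linux sample above (checked here with Python's regex engine, which behaves the same as PCRE for this simple pattern). If it matches here, the regex itself is likely not the issue, and the problem is more plausibly in how the stanza is applied — for example, which sourcetype's props.conf REPORT- setting references [sysmon-data] on the Linux side:

```python
import re

# The Linux transforms pattern, checked against a fragment of the Linux
# sample event above. Each tuple corresponds to $1::$2 in the FORMAT line.
linux_fragment = ('<Data Name="ProcessId">64284</Data>'
                  '<Data Name="Image">/opt/splunkforwarder/bin/splunkd</Data>')
pairs = re.findall(r'<Data Name="(.*?)">(.*?)</Data>', linux_fragment)
print(pairs)
```

This prints the (name, value) pairs the transform would emit, suggesting the extraction failure lies in stanza routing rather than the pattern.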
Hi, we are seeing an issue regarding the absence of Windows Security command-line events: EventCode=4688 ParentProcessName="C:\\Windows\\System32\\cmd.exe". I have not blacklisted it in any of the apps. Thanks.
Hi, how can we list all the blacklist stanzas in the inputs.conf files of all apps on the deployment server (DS)? I'm seeing command-line events getting blocked in my environment. Thanks
In splunkdev2 we are trying to implement SAML authentication. We have configured it with a .pem certificate, but after applying the configuration the portal is not working. The team suggested upgrading the splunkdev server key store to trust the public signing authority's digital certificate. How can we do that?
Dear all, kindly support us in getting the Palo Alto app to work. The logs are coming into the environment, but the app is not working, as you can see in the pictures. Thank you in advance.
Hi, I was wondering if it is in any way possible to run scheduled browser tests on the Splunk Observability platform. I need tests to run from 2pm to 5pm; however, I can't seem to find a way to have them run only during those times.
Hello all, I have an SPL search which is scheduled to run each minute over a span of 1 hour. On each execution the search runs for 4 seconds with a result size of around 400KB. How do the scheduler and search head work in such a scenario at the backend? Does the scheduled SPL keep the scheduler and search head busy for the entire hour, or are they free to run other SPL searches during that hour? And can you share any negative implications for the Splunk infrastructure due to the above scheduled search? Any information would be very helpful. Thank you, Taruchit
Hi All, For the current version of Splunk Cloud, does it allow the integration with Google Authenticator for Multi-Factor Authentication?
Hi all, I want to replace the WARN message below with the ERROR message that follows it.   WARN message:   2023-10-25 10:56:46,709 WARN pool-1-thread-1 com.veeva.bpr.batchrecordprint.scheduledTasks - BOM Field Name: BOM_PPMDS_1, value is out of   ERROR message:   2023-11-06 15:30:48,941 ERROR pool-1-thread-1 com.veeva.brp.batchrecordprint.ScheduledTasks - Unknown error: {errorType=GENERAL,   How do I write the props.conf and transforms.conf configuration files for this? Please help me. Regards, Vijay K.
I have the following data:

02:00:00 Item=A Result=success
02:00:05 Item=B Result=success
02:05:00 Item=A Result=fail
02:05:05 Item=B Result=success
02:10:00 Item=A Result=fail
02:10:05 Item=B Result=success
02:15:00 Item=A Result=success
02:15:05 Item=B Result=fail
02:20:00 Item=A Result=success
02:20:05 Item=B Result=fail
02:25:00 Item=A Result=success
02:25:05 Item=B Result=success
02:30:00 Item=A Result=success
02:30:05 Item=B Result=success
02:35:00 Item=A Result=success
02:35:05 Item=B Result=success
02:40:00 Item=A Result=success
02:40:05 Item=B Result=fail
02:45:00 Item=A Result=success
02:45:05 Item=B Result=success
02:50:00 Item=A Result=success
02:50:05 Item=B Result=success
02:55:00 Item=A Result=success
02:55:05 Item=B Result=success

My desired results:

Item StartTime EndTime  Duration
A    02:05:00  02:15:00 00:10:00
B    02:15:05  02:25:05 00:10:00
B    02:40:05  02:45:05 00:05:00

I have tried transaction and streamstats but got incorrect results. Can anybody here help me solve this problem? Thank you.
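To pin down the logic being asked for, here is a plain-Python sketch (not SPL): for each Item, find every consecutive run of fail results, with EndTime taken as the first success after the run, matching the desired output above. The data is an abbreviated copy of the sample:

```python
# Abbreviated copy of the sample data: (time, item, result) tuples.
rows = [
    ("02:00:00", "A", "success"), ("02:00:05", "B", "success"),
    ("02:05:00", "A", "fail"),    ("02:05:05", "B", "success"),
    ("02:10:00", "A", "fail"),    ("02:10:05", "B", "success"),
    ("02:15:00", "A", "success"), ("02:15:05", "B", "fail"),
    ("02:20:00", "A", "success"), ("02:20:05", "B", "fail"),
    ("02:25:00", "A", "success"), ("02:25:05", "B", "success"),
    ("02:40:00", "A", "success"), ("02:40:05", "B", "fail"),
    ("02:45:00", "A", "success"), ("02:45:05", "B", "success"),
]

def fail_runs(rows):
    """Return (item, start_of_fail_run, first_success_after_run) tuples."""
    out = []
    for item in sorted({r[1] for r in rows}):
        series = [(t, res) for t, i, res in rows if i == item]
        i = 0
        while i < len(series):
            if series[i][1] == "fail":
                start = series[i][0]
                while i < len(series) and series[i][1] == "fail":
                    i += 1
                # EndTime = first success after the run, if any
                end = series[i][0] if i < len(series) else None
                out.append((item, start, end))
            else:
                i += 1
    return out

print(fail_runs(rows))
```

In SPL this shape of problem is usually solved with streamstats to detect result transitions before grouping, but the sketch above is the ground truth the query should reproduce.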
Hi! This is a very basic question; it is my first time working with the Splunk Enterprise platform. How do you actually go about switching on the feature to log network traffic coming into an internal network with a specific IP range? I essentially want Splunk Enterprise to act as a logger for all traffic that enters the internal network on a certain port, for example. How do I go about it? FYI: I do not want to use the forwarder or the upload-log-files function.
Hello, I am a beginner with Splunk. I am experimenting with a CSV dataset containing the daily average temperature for different cities across the world. As a first step, I would like to see, for a given city, a graph of the average temperature over time. However, by default, the X axis on the timechart shows the timestamp of the source file, as opposed to the time field contained in each event. As a result, all events show the same date, which is probably the date the dataset was created. How do I use the "Date" field contained in each event, instead of the timestamp of the dataset file? Thanks,
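One way to address this — a sketch under assumptions: the sourcetype name my_temperature_csv and the %Y-%m-%d format of the Date column are guesses about your data — is to tell Splunk at ingest time to take the event timestamp from the Date field, via props.conf on the instance that parses the file:

```
[my_temperature_csv]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = Date
TIME_FORMAT = %Y-%m-%d
```

For data that is already indexed, a search-time workaround is to override _time before charting, e.g. | eval _time=strptime(Date, "%Y-%m-%d") ahead of the timechart command.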
  Hello, I am forwarding data from an embedded system to an enterprise instance running on a Vm. The logs look like this: acces_monitoring (indexed on splunk, the first empty space means still online):      Access_IP                Access_time                    Logoff_time 1 192.168.200.55 1699814895.000000 2 192.168.200.55 1699814004.000000 1699814060.000000 3 192.168.200.55 1699811754.000000 1699812677.000000 4 192.168.200.55 1699808364.000000 1699809475.000000 5 192.168.200.55 1699806635.000000 1699806681.000000 6 192.168.200.55 1699791222.000000 1699806628.000000 7 192.168.200.55 1699791125.000000 1699791127.000000 8 192.168.200.55 1699724540.000000 1699724541.000000 9 192.168.200.55 1699724390.000000 1699724474.000000   command_monitoring:       Access_IP              exec_time                     executed_command 1 192.168.200.55 1699813121.000000 cd ~ 2 192.168.200.55 1699813116.000000 cd /opt 3 192.168.200.55 1699813110.000000 prova3 4 192.168.200.55 1699811813.000000 cat sshd_config 5 192.168.200.55 1699811807.000000 cd /etc/ssh 6 192.168.200.55 1699811801.000000 cd etc 7 192.168.200.55 1699811793.000000 cd 8 192.168.200.55 1699811788.000000 ls 9 192.168.200.55 1699811783.000000 e che riconosce le sessioni diverse 10 192.168.200.55 1699811776.000000 spero funziona 11 192.168.200.55 1699809221.000000 cat command_log.log 12 192.168.200.55 1699809210.000000 ./custom_shell.sh 13 192.168.200.55 1699808594.000000 CD /MEDIA 14 192.168.200.55 1699808587.000000 cd /medi 15 192.168.200.55 1699808584.000000 omar when i try to join the two by running:   index=main source="/media/ssd1/ip_command_log/command_log.log" | eval exec_time=strptime(exec_time, "%a %b %d %H:%M:%S %Y") | rename ip_execut as Access_IP | table Access_IP, exec_time, executed_command | join type=left Access_IP [ search index=main source="/media/ssd1/splunk_wtmp_output.txt" | dedup Access_time | eval Access_time=strptime(Access_time, "%a %b %d %H:%M:%S %Y") | eval Logoff_time=if(Logoff_time="still 
logged in", now(), strptime(Logoff_time, "%a %b %d %H:%M:%S %Y")) | table Access_IP, Access_time, Logoff_time ] | eval session_active = if(exec_time >= Access_time AND exec_time <= coalesce(Logoff_time, now()), "true", "false") | where session_active="true" | table Access_IP, Access_time, Logoff_time, exec_time, executed_command   It does not join over every session but only the last one (the one started at 1699814895.000000), and it will not identify any of the commands run on the embedded system in the correct session. What could be the catch? Thanks in advance!
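One likely factor, offered as a hedge rather than a definitive diagnosis: Splunk's join keeps only one matching row per key by default (max=1), so each Access_IP can only ever pick up a single session row; also, both strptime calls are applied to values that already look like epoch seconds, so they may yield null. The interval matching the search seems to want — each command attached to the session whose login/logoff window contains its exec_time — looks like this in plain Python, with hypothetical data drawn from the tables above:

```python
# Sessions as (access_time, logoff_time) epoch pairs; None = still logged in.
sessions = [
    (1699814895.0, None),
    (1699811754.0, 1699812677.0),
    (1699808364.0, 1699809475.0),
]

commands = [(1699811813.0, "cat sshd_config"),
            (1699809221.0, "cat command_log.log")]

NOW = 1699900000.0  # stand-in for now() when a session is still open

def match_sessions(commands, sessions):
    """Attach each command to the session whose window contains exec_time."""
    out = []
    for exec_time, cmd in commands:
        for login, logoff in sessions:
            if login <= exec_time <= (logoff if logoff is not None else NOW):
                out.append((cmd, login))
                break
    return out

print(match_sessions(commands, sessions))
```

Each command lands in a different session here, which is exactly what a max=1 join on Access_IP cannot express; in SPL, approaches that keep all session rows (e.g. appending both datasets and using streamstats, or join with max=0) fit this shape better.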
We are excited to introduce the enhanced Time Picker functionality, designed to empower developers and SREs with efficient observability workflows. With new capabilities that make investigating and working across different product areas easier than ever, the Time Picker helps you get to insights faster across Logs, Application Performance Monitoring (APM), Real User Monitoring (RUM), Infrastructure Monitoring (IM), and Synthetics.

Seamless Transition between Product Areas

Imagine starting your work in the Log Observer interface, pinpointing anomalies with precision. Now, when it's time to dive into APM, you can seamlessly carry your selected time range with you. The Time Picker's most recent selection persists across all product areas, ensuring that your carefully chosen timeframe follows you as you switch between products in the Observability suite and eliminating the need for redundant configuration.

Intuitive UI and Enhanced Functionality

The Time Picker's improved user interface steps up the game. Not only does it resolve past issues and bugs, it also introduces advanced features such as time-range pasting and type-ahead behavior. Consider this: you have just joined a coworker in investigating an alert, and they send you the time range in which an outage happened. With the enhanced time-range pasting feature, you can apply the desired time frame instantly, without manually typing the value, minimizing clicks and accelerating your workflow.

Enhanced Timestamp Format for Convenience and Efficiency

The Time Picker also accepts a wide range of formats that you can paste or type into the component, while standardizing the output format across product areas. In addition, the Time Picker takes your preferred time zone or browser time zone into account: if you paste a time with an offset or time-zone designation, it automatically converts the value to the time zone you have set. It is now easier than ever to work with colleagues across different time zones.
This not only elevates your experience but also significantly reduces the chance of error. Returning to the earlier scenario: you are working on an alert, and a colleague in the Eastern time zone shares the time range in which the outage happened, with an EST suffix. You work in PST, and your observability suite is set to PST, so when you paste the time range it is converted to your time zone and the data is displayed correctly. No need to adjust it manually! As Gwen Stefani once said: what are you waiting for? Try out the enhanced Time Picker today!
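The conversion described above behaves like a standard zone-aware timestamp conversion. A minimal Python illustration of the same idea (purely illustrative; this is not the product's actual implementation):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A colleague's Eastern-time timestamp, converted to Pacific time, the way
# a pasted "... EST" value would be shown to a user whose zone is PST.
eastern = datetime(2023, 11, 13, 9, 30, tzinfo=ZoneInfo("America/New_York"))
pacific = eastern.astimezone(ZoneInfo("America/Los_Angeles"))
print(pacific.strftime("%Y-%m-%d %H:%M %Z"))  # 9:30 EST shown as 6:30 PST
```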
I have 2 strings which need to be searched in Splunk; each string has a different index and a different sourcetype. One string is "published sourcing plan" and the other is "published transfer order". I need to get the "published transfer order" log from Splunk; if it is not available within 5 minutes of the "published sourcing plan" log appearing, I need to count it, or retrieve some details such as salesorderid from the "published sourcing plan" log. How do I prepare this search query in Splunk? In case no "published transfer order" log is available in Splunk at all, I need to capture that as well.