All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Good morning. Let me explain my situation: I have a Splunk tenant that belongs to a large company with branches in three zones. Each branch should only see the data of its own zone. The indexes are named in the form zone_technology, for example eu_meraki. Given this, I have created a series of alerts that are shared across all the zones and that search all the indexes. How can I make the warning email, when an alert is triggered, reach only the contacts of one zone? Thank you
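One possible sketch, assuming a single shared alert is kept: derive the zone from the index name with rex and map it to a recipient list via a lookup. The lookup file zone_contacts.csv and its columns (zone, email) are hypothetical names for illustration.

```
index=* <your alert conditions>
| rex field=index "^(?<zone>[^_]+)_"
| lookup zone_contacts.csv zone OUTPUT email AS recipients
| stats count by zone, recipients
```

The email alert action could then use $result.recipients$ in its To field with "Trigger: for each result", so each mail only reaches that zone's contacts. A simpler alternative is to clone the alert per zone (one searching index=eu_*, and so on) with a fixed recipient list each.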
I am using a single universal forwarder on my Windows machine to send a log file to my Splunk host machine deployed on Ubuntu. The problem is that the file initially contained 3 log events, and Splunk read those events and displayed them on the dashboard. But when I manually appended 10 more events to the same file, the dashboard shows 16 log events even though there are only 13 events in the log file: it is reading the first three events twice. How can I resolve this issue?
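One frequent cause of this symptom (an assumption to verify, not a diagnosis of this case) is that the editor used to append the events rewrote the file in place, so the forwarder's CRC check on the file's initial bytes no longer recognized it and re-read it from the beginning. If the first 256 bytes of the file are not unique, a hedged option is to raise initCrcLength in inputs.conf on the forwarder; the monitor path below is a placeholder:

```
[monitor://C:\logs\app.log]
initCrcLength = 1024
```

Appending with a command that only adds bytes (rather than saving from a GUI editor, which may rewrite the file) is another thing worth testing.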
I have the following messages in the "response" field:

{"errors": ["Message: Payment failed. Reason: Hi, we attempted to process the transaction but it seems there was an error. Please check your information and try again. If the problem persists please contact your bank."]}
{"errors": ["Unable to retrieve User Profile with sub '2415d' as it does not exist"]}
{"errors": ["Unable to retrieve User Profile with sub 'dfadf' as it does not exist"]}
{"errors": ["Unable to retrieve User Profile with sub 'fdsgad' as it does not exist"]}
{"errors": ["Unallocated LRW seat not found with product id fdafdsaddsfa and start datetime utc 2024-01-06T05:30:00+00:00 and test location id dfafdfa"]}
{"errors": ["Unallocated LRW seat not found with product id sfgdfa and start datetime utc 2024-01-06T05:30:00+00:00 and test location id dsfadfsa"]}

I want to display the results with a count, grouped as:

Message: Payment failed. Reason: Hi, we attempted to process the transaction but it seems there was an error. Please check your information and try again. If the problem persists please contact your bank.
Unable to retrieve User Profile with sub '***' as it does not exist
Unallocated LRW seat not found with product id *** and start datetime utc 2024-01-06T05:30:00+00:00 and test location id ***
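A hedged sketch of one way to group these: extract the message text, mask the variable parts with replace(), and count. The index name is a placeholder, and the exact rex depends on how the response field is stored:

```
index=your_index
| rex field=response "\{\"errors\": \[\"(?<msg>.+)\"\]\}"
| eval msg=replace(msg, "sub '[^']+'", "sub '***'")
| eval msg=replace(msg, "product id \S+", "product id ***")
| eval msg=replace(msg, "test location id \S+", "test location id ***")
| stats count by msg
```

Add further replace() calls for any other variable tokens (e.g. the datetime) that should be collapsed into one group.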
Hi All, I have a multivalue field that contains nested key-value pairs, with the key named "Key" and the value named "Value". Example snippet:

tags: [
  {
    Key: Contact
    Value: abc@gmail.com
  }
  {
    Key: Name
    Value: abc
  }
]

I want to extract only the Contact value from here, i.e. abc@gmail.com. I have tried multivalue functions and spath but am still stuck. Please help me. Regards, PNV
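A minimal sketch, assuming the event is valid JSON and tags is an array of {Key, Value} objects: expand each tag into its own row, parse it, and keep only the Contact row.

```
| spath path=tags{} output=tag
| mvexpand tag
| spath input=tag path=Key output=Key
| spath input=tag path=Value output=Value
| where Key="Contact"
| table Value
```

If the events are large, mvexpand can be memory-hungry; an mvfilter/mvindex approach is an alternative when the array is small and regular.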
I am new to Splunk queries and was trying to combine results from multiple queries without using subsearches, because of the subsearch limit of 50000 results; our dataset has more than 50000 records to consider. Below is the query I was trying:

(index="B" logType="REQUEST") OR (index="B" logType="TRACES" message="searchString1*") OR (index="B" logType="TRACES" message="searchString2*")
| stats latest(*) as * by Id

All of the above queries have the Id field in their results, which acts as a kind of correlation id between these logs. I would like the end result to show all the common fields that have the same values, but with the message field holding the consolidated message content from the individual queries on the same index B. The message field alone can have different values between the queries and needs to be consolidated in the result. Can someone help with how this can be done? @splunk
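A hedged sketch of one way to consolidate the message values while still collapsing the other fields with latest(*); the output field name all_messages is illustrative:

```
(index="B" logType="REQUEST") OR (index="B" logType="TRACES" (message="searchString1*" OR message="searchString2*"))
| stats latest(*) as *, values(message) as all_messages by Id
```

values() deduplicates and returns a multivalue field per Id; use list(message) instead if duplicate messages matter, or mvjoin() afterwards to render them as a single string.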
Some of my customers are using Splunk as their SIEM solution. I have a security platform that needs to integrate with their Splunk to send security events (probably syslog) into a certain index (which might be existing or brand new). I already made a PoC using HEC and successfully delivered my syslog events into an index in my test Splunk account (using Splunk Cloud Platform). The setup my customers will have to do for the HEC integration is to create a new data input, create a token, and deliver it to me (alongside their Splunk hostname). Now I'm wondering if this process can somehow be simplified using an app/add-on. I'm not sure exactly what functionality an add-on provides and whether I can leverage it to simplify the integration onboarding between my security product and my customers. Is there anything else I should consider? I would love to know; I'm completely new to Splunk. Also, in case it matters, most of my customers use Splunk Cloud Platform, but in the future there might be customers on Splunk Enterprise. Thanks
I want to use the reset password action in Splunk SOAR, but it doesn't work and gives this error message:

handle_action exception occurred. Error string: ''LDAPInvalidDnError' object has no attribute 'description''
Hi Support, can you please help me with a field extraction for id, referenceNumber, and formId?

{"id":"0fb56c6a-39a6-402b-8f07-8b889a46e3e8","referenceNumber":"UOB-SG-20240101-452137857","formId":"sg-pfs-save-savings-festival"}

Thanks, Hari
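If the JSON above is the whole _raw event, a minimal search-time sketch with spath (add input=<fieldname> to each spath if the JSON sits inside another field):

```
| spath path=id
| spath path=referenceNumber
| spath path=formId
| table id referenceNumber formId
```

With sourcetypes that already set KV_MODE = json, these fields are typically extracted automatically and no spath is needed.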
Name  perc  date
xxx   90    28-Dec-23
yyy   91    28-Dec-23
zzz   92    28-Dec-23
xxx   96    29-Dec-23
yyy   97    29-Dec-23
zzz   98    29-Dec-23

I want to calculate the difference between perc values based on date. For example, xxx has 90 in the perc column for 28-Dec-23 and 96 for 29-Dec-23; 96-90=6 will be the output. Can you please help me with a solution for this? An additional question: I want to subtract yesterday's perc value from the current date's perc value. Please assist me on this.
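A hedged sketch using streamstats to pair each row with the previous day's value per Name, assuming the date field parses with the format %d-%b-%y:

```
| eval day=strptime(date, "%d-%b-%y")
| sort 0 Name day
| streamstats current=f window=1 last(perc) as prev_perc by Name
| eval diff=perc - prev_perc
```

For the second question, restrict the search time range to yesterday and today before this pipeline, so diff on today's rows holds today's perc minus yesterday's.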
Hi all, one of my users, let's say maxwell, is getting locked out frequently. I want to check the logs for the last 7 days. I am using the query below but I am not getting any output. I have 4 domain controllers (dctr01, dctr02, dctr03, dctr04).

index=winevenlog sourcetype=wineventlog:security Account_Name=maxwell EventCode=4740 earliest=-h (host="dctr01*" OR host="dctr02*" OR host="dctr03*" OR host="dctr04*") | table _time Caller_Computer_Name Account_Name EventCode Source_Network_Address Workstation_Name
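Two things in the posted search are worth checking (assumptions, since the data isn't visible here): earliest=-h limits the search to the last hour rather than 7 days, and the index name winevenlog may be a typo for wineventlog. A hedged corrected sketch:

```
index=wineventlog sourcetype=wineventlog:security Account_Name=maxwell EventCode=4740 earliest=-7d
    (host="dctr01*" OR host="dctr02*" OR host="dctr03*" OR host="dctr04*")
| table _time Caller_Computer_Name Account_Name EventCode Source_Network_Address Workstation_Name
```

If there is still no output, dropping the Account_Name and EventCode filters one at a time can show whether the field names match your sourcetype's extractions.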
Hi all, I have a very specific regex extraction (search-time extraction).

_raw data example:

| union [| makeresults | eval _raw = "Dec-28-2023|12:05:46,836|10.150.6.118|148:|some branch|uswer_name|d168a8b9-5647-421b-97ba-f2aa3bceb69a|1:Creation page stack|Success|action_portfolio_forms_c_save.action|8970:PORTFOLIO ONBOARDING - FORMS CAPTURE||3065254228||||||| ~newType|||"]
  [| makeresults | eval _raw = "Dec-28-2023|12:05:46,836|10.150.6.118|148:|some branch|uswer_name|d168a8b9-5647-421b-97ba-f2aa3bceb69a|1:Creation page stack|Success|action_portfolio_forms_c_save.action|8970:PORTFOLIO ONBOARDING - FORMS CAPTURE||3065254228|||||||oldType~newType|||"]
  [| makeresults | eval _raw = "Dec-28-2023|12:05:46,836|10.150.6.118|148:|some branch|uswer_name|d168a8b9-5647-421b-97ba-f2aa3bceb69a|1:Creation page stack|Success|action_portfolio_forms_c_save.action|8970:PORTFOLIO ONBOARDING - FORMS CAPTURE||3065254228||||||||||"]

I want to extract 2 fields from the segment between the 19th and 20th pipes, which may or may not contain two values:
old: comes right after the 19th | and before ~
new: comes after ~ and before the 20th |

There are 3 ways the data may appear:
|<space>~newType|
|oldType~newType|
|<null><null>|

The problem I have is that when no data is present (3rd option), props.conf doesn't parse it. In the end I need 2 fields; based on the example above:

old      new
<space>  newType
oldType  newType
<space>  <space>

props.conf:

[user_activity]
REPORT-bb_extract = REPORT-bb_extract
EXTRACT-oldAccountType = ^(?:[^|]*\|){19}(?<old>[^\~|\|]*)
EXTRACT-newAccountType = (?:[^~]*\~){1}(?<new>[^|]*)

transforms.conf:

[REPORT-bb_extract]
KEEP_EMPTY_VALS = true
DELIMS = "|"
FIELDS = "DATE","TIME","ip","branch","appName","userName","actionID","actionType","actionStatus","actionName","action","srcPortfolioId","refID","currency","TotalAmount","secondPortfolioId","multiTransfer","field18","field19","id2","field21","new","old"

1. How can I extract a field that may or may not include a value?
2. How can I fix the second regex so it starts matching after the 19th |?
Thanks
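A hedged sketch of a single regex that covers all three cases by making the ~new part optional, so empty values still match (field names as in the question; shown as a rex for testing, but the same pattern should work in an EXTRACT):

```
| rex "^(?:[^|]*\|){19}(?<old>[^|~]*)(?:~(?<new>[^|]*))?\|"
```

Note that search-time extractions generally do not create a field for an empty capture, so the <null><null> case may yield no old/new fields at all; if a literal <space> versus <null> distinction matters, test with len(old) after extraction.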
If I use the command ./splunk add monitor /var/log, the file /splunk/etc/apps/search/local/inputs.conf is modified. However, if I use the command ./splunk add forward-server a.a.a.a:9997, /splunk/etc/system/local/outputs.conf is modified.

Why, when both are the same kind of CLI task, does one modify a file under the search app while the other modifies the system file? Even considering configuration file precedence, both are in the global context, so I would expect both to be placed under the system folder.

My question may be inappropriate or have some shortcomings. I would really appreciate your advice.
Hi all, I am trying to put together a search and stats table for users in our environment who have uploaded data to a domain where there has not been any other upload activity in the last 7 days. Operation="FileUploadedToCloud"; I'm working with fields such as user and targetdomain. Any help is appreciated! Thanks!
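A hedged sketch, interpreting "new" as a targetdomain whose first upload within the 7-day window falls in the last day (the thresholds are illustrative and the index/sourcetype filter is omitted):

```
Operation="FileUploadedToCloud" earliest=-7d
| stats earliest(_time) as first_seen, values(user) as users, count by targetdomain
| where first_seen >= relative_time(now(), "-1d@d")
| convert ctime(first_seen)
```

Adjust the where clause to whatever recency defines "no other upload activity" in your use case.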
Hi all, I've set up an SC4S instance just to forward nix:syslog events. In local/context/splunk_metadata.csv:

nix_syslog,index,the_index
nix_syslog,sourcetype,nix:syslog

I can't find the events in Splunk, and splunkd.log is filling with:

12-29-2023 09:52:50.993 +0000 ERROR HttpInputDataHandler [2140 HttpDedicatedIoThread-0] - Failed processing http input, token name=the_token, channel=n/a, source_IP=172.18.0.1, reply=7, events_processed=1, http_input_body_size=1091, parsing_err="Incorrect index, index='main'"

The HEC probes at SC4S boot are successful and land in the correct index. Any help would be really appreciated. Thank you, Daniel
Hi Team, I have developed a sample .NET MSMQ sender and receiver console application and have tried instrumenting it. I could load the profiler and was able to see the MQ details and transaction snapshots for the sender application, but I was unable to get MQ details for the receiver application in the AppDynamics controller, where we expect an MSMQ entry point for the .NET consumer application. I tried resolving the issue by adding POCO entry points as AppDynamics describes in the link below, but it didn't help: Message Queue Entry Points (appdynamics.com). Please look into this issue and help us resolve it. Thanks in advance.
Hi, how can I change the scheduled index time of a data source?
Is federated search able to search frozen buckets in S3, or only raw logs?
Hello, the line breaker in my props configuration for a JSON-formatted file is not working; it is not breaking the JSON events. My props and sample JSON events are given below. Any recommendation will be highly appreciated, thank you!

props:

[myprops]
CHARSET=UTF-8
KV_MODE-json
LINE_BREAKER=([\r\n]+)\"auditId\"\:
SHOULD_LINEMERGE=true
TIME_PREFIX="audittime": "
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
TRUNCATE=9999

Sample events:

{
  "items": [
    {
      "auditId": 15067,
      "secId": "mtt01",
      "audittime": "2016-07-31T12:24:37Z",
      "links": [
        { "name": "conanicaldba", "href": "https://it.for.dev.com/opa-api" },
        { "name": "describedbydba", "href": "https://it.for.dev.com/opa-api/meta-data" }
      ]
    },
    {
      "auditId": 16007,
      "secId": "mtt01",
      "audittime": "2016-07-31T12:23:47Z",
      "links": [
        { "name": "conanicaldba", "href": "https://it.for.dev.com/opa-api" },
        { "name": "describedbydba", "href": "https://it.for.dev.com/opa-api/meta-data" }
      ]
    },
    {
      "auditId": 15165,
      "secId": "mtt01",
      "audittime": "2016-07-31T12:22:51Z",
      "links": [
        { "name": "conanicaldba", "href": "https://it.for.dev.com/opa-api" },
        { "name": "describedbydba", "href": "https://it.for.dev.com/opa-api/meta-data" }
      ]
    }
  ]
}
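Two points in the posted props stand out as things to verify (assumptions, not a confirmed diagnosis): KV_MODE-json is missing the =, and SHOULD_LINEMERGE=true tends to re-merge what LINE_BREAKER splits. A sketch to test against sample data; the breaker here ends an event at the closing } and starts the next at the { before "auditId", discarding the comma between them:

```
[myprops]
CHARSET = UTF-8
KV_MODE = json
LINE_BREAKER = \}(,[\r\n\s]*)\{\s*"auditId"
SHOULD_LINEMERGE = false
TIME_PREFIX = "audittime": "
TIME_FORMAT = %Y-%m-%dT%H:%M:%S
TRUNCATE = 9999
```

Note that the surrounding { "items": [ wrapper will still be attached to the first event; getting perfectly clean per-object events from a wrapped JSON array usually needs preprocessing before ingestion.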
Lookup 1 contains fields such as AssetName, FQDN, and IP Address. Lookup 2 contains fields such as Host, Index, and sourcetype.

Expected output: compare the Host value from Lookup 2 with the FQDN and IP Address in Lookup 1, and output the details of the missing devices.
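A hedged sketch, assuming both lookups are CSV lookup files (the file names are illustrative) and that Host in Lookup 2 may hold either the FQDN or the IP address. Devices present in Lookup 1 but never seen in Lookup 2 come out as "missing":

```
| inputlookup lookup1.csv
| eval key=mvappend(lower(FQDN), lower('IP Address'))
| search NOT
    [| inputlookup lookup2.csv
     | eval key=lower(Host)
     | fields key ]
| table AssetName FQDN "IP Address"
```

The multivalue key lets a device count as "found" if Host matched either its FQDN or its IP; note the subsearch result limit applies if Lookup 2 is very large.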
I am able to see events in index=_internal but not in index=abc for a particular host. What could be the reason?