All Topics

index= name tag=name NOT "health-*" words="Authentication words" OR MESSAGE_TEXT="Authentication word" | stats count by host | table host,count
I am working to integrate Splunk with AWS to ingest CloudTrail logs. Looking at the documentation for the Splunk Add-on for AWS, under steps 3, 4, and 8 it says to create an IAM user, an access key, and then to input the key ID and secret ID into the Splunk Add-on: https://docs.splunk.com/Documentation/SplunkCloud/9.2.2406/Admin/AWSGDI#Step_3:_Create_a_Splunk_Access_user Can we instead leverage a cross-account IAM role with an external ID for this purpose? We try to limit IAM user creation in our environment and this also creates additional management overhead, such as needing to regularly rotate the IAM user access key credentials. Leveraging a cross-account IAM role that can be assumed by Splunk Cloud is a much simpler (and more secure) implementation. Thanks!
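For context, the cross-account pattern being proposed is just a role whose trust policy lets the collector's AWS account assume it with an external ID; the account ID and external ID below are placeholders, not values from Splunk's documentation:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111111111111:root" },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "sts:ExternalId": "example-external-id" }
      }
    }
  ]
}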
Hi, I have dropdowns for domain and entity; when I select a domain, an entity, and a date range, the dashboard fetches the results for Init Lambda, Init Duplicate, and Init Error. I want an extra submit button: only once I hit Submit should the Init Lambda / Init Duplicate / Init Error panels run; otherwise nothing should be fetched.

<row>
  <panel>
    <title>VIEW BY ENTITY</title>
    <input type="dropdown" token="tokEnvironment" searchWhenChanged="true">
      <label>Domain</label>
      <choice value="Costing">Costing</choice>
      <change>
        <set token="inputToken">""</set>
        <set token="outputToken">""</set>
        <set token="inputToken2">""</set>
        <set token="outputToken2">""</set>
        <unset token="tokSystem"></unset>
        <unset token="form.tokSystem"></unset>
      </change>
      <default>Cost</default>
      <initialValue>Cost</initialValue>
    </input>
    <input type="dropdown" token="tokSystem" searchWhenChanged="false">
      <label>Data Entity</label>
      <fieldForLabel>$tokEnvironment$</fieldForLabel>
      <fieldForValue>$tokEnvironment$</fieldForValue>
      <search>
        <!--<progress>-->
        <!-- match attribute for condition uses an eval-like expression (see the Splunk 'eval' command) -->
        <!-- logic: if resultCount is 0, show a static html element and hide the chart element -->
        <!-- <condition match="'job.resultCount' == 0">-->
        <!--   <set token="show_html">true</set>-->
        <!-- </condition>-->
        <!-- <condition>-->
        <!--   <unset token="show_html"/>-->
        <!-- </condition>-->
        <!--</progress>-->
        <query>| makeresults | fields - _time
| eval Costing="GetQuoteByCBD,bolHeader,bolLineItems,laborProcess,costSheetCalc,FOB"
| fields $tokEnvironment$
| makemv $tokEnvironment$ delim=","
| mvexpand $tokEnvironment$</query>
      </search>
      <change>
        <condition match="$label$==&quot;get&quot;">
          <set token="inputToken">get</set>
          <set token="outputToken">get</set>
          <set token="inputToken2">b</set>
          <set token="outputToken2">b</set>
          <set token="inputToken3">c</set>
          <set token="outputToken3">c</set>
          <set token="inputToken4">d</set>
          <set token="outputToken4">d</set>
          <set token="inputToken5">e</set>
          <set token="outputToken5">e</set>
          <set token="inputToken3">3</set>
          <set token="outputToken3">3</set>
          <set token="apiToken">d</set>
          <set token="entityToken">get</set>
        </condition>
        <condition match="$label$==&quot;batch&quot;">
          <set token="inputToken">batch</set>
          <set token="outputToken">batch</set>
          <set token="inputToken2">c</set>
          <set token="outputToken2">c</set>
          <set token="inputToken">b</set>
          <set token="outputToken4">b</set>
          <set token="inputToken3">d</set>
          <set token="outputToken3">d</set>
          <set token="apiToken">b</set>
          <set token="inputToken5">f</set>
          <set token="outputToken5">f</set>
          <set token="entityToken">batch</set>
        </condition>
        <condition match="$label$==&quot;Calc&quot;">
          <set token="inputToken">Calc</set>
          <set token="outputToken">Calc</set>
          <set token="inputToken2">init</set>
          <set token="outputToken2">init</set>
          <set token="inputToken">Calc</set>
          <set token="outputToken4">Calc</set>
          <set token="inputToken3">d</set>
          <set token="outputToken3">d</set>
          <set token="apiToken">Calc</set>
          <set token="entityToken">Calc</set>
        </condition>
      </change>
      <default>get</default>
    </input>
    <input type="time" token="time_picker" searchWhenChanged="true">
      <label>Time</label>
      <default>
        <earliest>-15m</earliest>
        <latest>now</latest>
      </default>
    </input>
    <html>
      <ul></ul>
    </html>
  </panel>
</row>
<row>
  <panel>
    <title>Init Lambda</title>
    <table>
      <search>
        <query>index="" source IN ("/aws/lambda/aa-$outputToken$-$stageToken$-$outputToken2$")
| spath msg
| search msg="gemini:streaming:info:*"
| stats count by msg</query>
        <earliest>$time_picker.earliest$</earliest>
        <latest>$time_picker.latest$</latest>
        <sampleRatio>1</sampleRatio>
      </search>
      <option name="dataOverlayMode">heatmap</option>
      <option name="drilldown">none</option>
      <option name="refresh.display">progressbar</option>
    </table>
  </panel>
  <panel>
    <title>Init Lambda - Duplicate</title>
    <table>
      <search>
        <query>index="" source IN ("/aws/lambda/aa-$outputToken$-$stageToken$-$outputToken2$")
| spath msg
| search msg="gemini:streaming:warning:*"
| stats count by msg</query>
        <earliest>$time_picker.earliest$</earliest>
        <latest>$time_picker.latest$</latest>
        <sampleRatio>1</sampleRatio>
      </search>
      <option name="dataOverlayMode">heatmap</option>
      <option name="drilldown">none</option>
      <option name="refresh.display">progressbar</option>
    </table>
  </panel>
  <panel>
    <title>Init Lambda - Error</title>
    <table>
      <search>
        <query>index="" source IN ("/aws/lambda/aa-$outputToken$-$stageToken$-$outputToken2$")
| spath msg
| search msg="gemini:streaming:error:*"
| stats count by msg</query>
        <earliest>$time_picker.earliest$</earliest>
        <latest>$time_picker.latest$</latest>
        <sampleRatio>1</sampleRatio>
      </search>
      <option name="dataOverlayMode">heatmap</option>
      <option name="drilldown">none</option>
      <option name="refresh.display">progressbar</option>
    </table>
  </panel>
</row>
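For what it's worth, the standard SimpleXML way to get this behavior is to move the inputs into a <fieldset> with a submit button, set searchWhenChanged="false" on each of them, and let the panel searches run only on Submit. A rough skeleton sketch, with your inputs and panels elided:

<form>
  <fieldset submitButton="true" autoRun="false">
    <input type="dropdown" token="tokEnvironment" searchWhenChanged="false">...</input>
    <input type="dropdown" token="tokSystem" searchWhenChanged="false">...</input>
    <input type="time" token="time_picker" searchWhenChanged="false">...</input>
  </fieldset>
  <row>...panels as before...</row>
</form>

With autoRun="false" nothing runs on page load, and with searchWhenChanged="false" token changes are only committed when Submit is pressed, so the three Init panels stay idle until then.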
Hi, I have an event hub that receives data from multiple applications, with different numbers and values of columns. The events typically look like this (as an example):

Environment ProductName UtcDate RequestId Clientid ClientIp #app1
Environment ProductName UtcDate Instance Region RequestId ClientIp DeviceId #app2
Environment ProductName UtcDate DeviceId ClientIp #app3
PROD Product1 2024-04-04T20:21:20 abcd-12345-dev bcde-ed-1234 10.12.13.14 #app1
PROD Product2 2024-04-04T20:23:20 gwa us 126d-a23d-1234-def1 10.23.45.67 abcAJHSSz12. #app2
TEST Product3 2024-04-04T20:25:20 Ghsdhg1245 12.34.57.78 #app3
Environment ProductName UtcDate Instance Region RequestId ClientIp DeviceId #app2

(The #app marker at the end of each line is not part of the log; it just annotates the different entries.)

How can Splunk automagically select which "format" to use with REPORT/EXTRACT in transforms? On the heavy forwarder I have:

transforms.conf

[header1]
DELIMS = "\t"
FIELDS = Environment,ProductName,UtcDate,RequestId,Clientid,ClientIp

[header2]
DELIMS = "\t"
FIELDS = Environment,ProductName,UtcDate,Instance,Region,RequestId,ClientIp,DeviceId

[header3]
DELIMS = "\t"
FIELDS = Environment,ProductName,UtcDate,DeviceId,ClientIp

props.conf

[eventhub:sourcewithmixedsources]
INDEXED_EXTRACTIONS = TSV
CHECK_FOR_HEADER = true
NO_BINARY_CHECK = 1
SHOULD_LINEMERGE = false
pulldown_type = 1
REPORT-headers = header1, header2, header3
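One pattern worth sketching for this (stanza and sourcetype names below are made up): instead of asking one sourcetype to guess among three field lists, rewrite the sourcetype at index time based on the column count, then attach one header per sourcetype. Note that REPORT- extractions are search-time, so those props belong on the search head, while the sourcetype rewrite runs on the heavy forwarder:

props.conf (heavy forwarder)

[eventhub:sourcewithmixedsources]
TRANSFORMS-route = route_app2, route_app3

transforms.conf (heavy forwarder)

[route_app2]
# eight tab-separated columns => app2 shape
REGEX = ^(?:[^\t]*\t){7}[^\t]*$
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::eventhub:app2

[route_app3]
# five tab-separated columns => app3 shape
REGEX = ^(?:[^\t]*\t){4}[^\t]*$
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::eventhub:app3

props.conf (search head)

[eventhub:sourcewithmixedsources]
REPORT-fields = header1

[eventhub:app2]
REPORT-fields = header2

[eventhub:app3]
REPORT-fields = header3

Also be aware that INDEXED_EXTRACTIONS = TSV and search-time REPORT- are two separate mechanisms; combining both on the same sourcetype usually double-extracts, so you would normally pick one.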
Is there an integration available to push to and pull from Palo Alto XSOAR? I'm looking for an integration to pull incidents and update their status.
Does Splunk support CrowdStrike OAuth API?
I have an enterprise deployment with multiple servers. All licensing is handled by a license manager. One of my indexers gives the warning "Your license is expired. Please login as an administrator to update the license." When I log in, licensing looks fine. It's pointed at the correct address for the license manager, the last successful contact was less than a minute ago, and under messages it says "No licensing alerts". Under "Show all configuration details" it lists recent successful contacts and the license keys in use. That's about as far as I can go, because 30 seconds in my session gets kicked back to the login prompt with a message that my session has expired. So I have one server out of a larger deployment that seems to think it doesn't have a license, while all indications are that it does, yet it still behaves like it doesn't.
I'm creating a Splunk multisite cluster. The configuration was done as the documentation shows, including the cluster manager node. All peers show up, report that they are up, and are happily replicating. But for whatever reason the search factor and replication factor are not met, and the notification about the unhealthy system tells me it's the cluster manager node. Why is that? How can I check what is wrong with it? If I look up the cluster status via the CLI, it all seems fine.
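For comparison, these are the manager-side CLI views I would check (run on the cluster manager; splunk here is $SPLUNK_HOME/bin/splunk):

splunk show cluster-status --verbose
splunk list cluster-peers

The verbose output states per index whether the search and replication factors are met, which usually narrows the problem down to specific indexes or buckets rather than the whole cluster.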
Where do I find my Splunk user account id? I must register through my school, Per Scholas, with Pearson VUE to take the Splunk certification exam.
I'm trying to configure indexes.conf so that data retention is exactly 180 days, after which the data does NOT get frozen but gets deleted.

I've tried setting it with frozenTimePeriodInSecs = 15552000, but now I get the following error:

Validation errors are present in the bundle. Errors=peer=XXX, stanza=someidx Required parameter=thawedPath not configured;

So I HAVE TO put a thawed path in it even though I don't want to freeze anything? How does that make sense?

Kind regards, and thanks for any clarification!
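For reference, a stanza along these lines (paths are placeholders) validates and deletes rather than archives: thawedPath is required for every index stanza but is only ever used if you manually thaw buckets, and as long as no coldToFrozenDir or coldToFrozenScript is set, buckets are simply deleted when they freeze:

[someidx]
homePath   = $SPLUNK_DB/someidx/db
coldPath   = $SPLUNK_DB/someidx/colddb
thawedPath = $SPLUNK_DB/someidx/thaweddb
# 180 days; with no coldToFrozenDir/coldToFrozenScript, expired buckets are deleted
frozenTimePeriodInSecs = 15552000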
Hello everyone, the Splunk query below works fine in a normal Splunk search and returns the expected results:

index="my_index" | stats count by kubernetes_cluster | table kubernetes_cluster | sort kubernetes_cluster

However, when I put the same query in a dashboard dropdown, it does not return that data. "Search on Change" is unchecked. The dropdown's source view looks like this:

<input type="dropdown" token="regions" searchWhenChanged="false">
  <label>region</label>
  <fieldForLabel>regions</fieldForLabel>
  <fieldForValue>regions</fieldForValue>
  <search>
    <query>index="my_index" | stats count by kubernetes_cluster | table kubernetes_cluster | sort kubernetes_cluster</query>
    <earliest>0</earliest>
    <latest></latest>
  </search>
</input>
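For comparison, a sketch of the same input with fieldForLabel/fieldForValue pointing at the field the search actually returns (kubernetes_cluster rather than regions), which is a common reason a dropdown stays empty, plus an explicit latest:

<input type="dropdown" token="regions" searchWhenChanged="false">
  <label>region</label>
  <fieldForLabel>kubernetes_cluster</fieldForLabel>
  <fieldForValue>kubernetes_cluster</fieldForValue>
  <search>
    <query>index="my_index" | stats count by kubernetes_cluster | sort kubernetes_cluster</query>
    <earliest>0</earliest>
    <latest>now</latest>
  </search>
</input>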
Hi, I am trying to create a transaction, but my starting and ending events do not always produce the correct overview. I expect the yellow-marked group of events as the result:

index=app sourcetype=prd_wcs host=EULMFCP1WVND121 "EquipmentStatusRequest\"=" D0022
| eval _raw = replace(_raw, "\\\\", "")
| eval _raw = replace(_raw, "\"", "")
| rex "Chute:DTT_S01.DA01.(?<Door>[^\,]+)"
| rex "EquipmentName:DTT_S01.DA01.(?<EquipmentName>[^\,]+)"
| rex "EquipmentType:(?<EquipmentType>[^\,]+)"
| rex "Status:(?<EquipmentStatus>[^\,]+)"
| rex "TypeOfMessage:(?<TypeOfMessage>[^\}]+)"
| eval Code = EquipmentStatus+"-"+TypeOfMessage+"-"+EquipmentType
| lookup Cortez_SS_Reasons.csv CODE as Code output STATE as ReasonCode
| where ReasonCode = "Ready" OR ReasonCode = "Full"
| transaction EquipmentName startswith=(ReasonCode="Full") endswith=(ReasonCode="Ready")
| eval latestTS = _time + duration
| eval counter=1
| accum counter as Row
| table _time latestTS Row ReasonCode
| eval latestTS=strftime(latestTS,"%Y-%m-%d %H:%M:%S.%3N")

The search above produces the following overview, and the marked line is not correct. I don't know how this happened, because I expect the transaction command to always take groups of events starting with "Full" and ending with "Ready". Thanks in advance.
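As an aside, the same pairing is often done with streamstats instead of transaction, which makes the boundary logic explicit; a rough sketch under the same field names, where every "Full" event opens a new session per piece of equipment:

... | sort 0 EquipmentName _time
| streamstats count(eval(ReasonCode="Full")) as session by EquipmentName
| stats earliest(_time) as start, latest(_time) as end, list(ReasonCode) as codes by EquipmentName, session

This avoids transaction's memory-based eviction limits, though it groups everything from one "Full" up to the next rather than strictly Full-to-Ready pairs.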
I have JSON data like this:

"suite":[{"hostname":"localhost","failures":0,"package":"ABC","tests":0,"name":"ABC_test","id":0,"time":0,"errors":0,"testcase":[{"classname":"xyz","name":"foo1","time":0,"status":"Passed"},{"classname":"pqr","name":"foo2)","time":0,"status":"Passed"},....

I want to create a table with suite, testcase name, and testcase status as columns. I have a solution using the mvexpand command, but when there is a lot of data the output gets truncated. This is the query I'm using:

....| spath output=suite path=suite{}.name
| spath output=Testcase path=suite{}.testcase{}.name
| spath output=Error path=suite{}.testcase{}.error
| spath output=Status path=suite{}.testcase{}.status
| search (suite="*")
| eval x=mvzip(Testcase,Status)
| mvexpand x
| eval y=split(x,",")
| eval Testcase=mvindex(y,0)
| search Testcase IN ("***")
| eval suite=mvdedup(suite)
| eval Status=mvindex(y,1)
| table "Suite" "TestCase" Status

Is there an alternative to mvexpand that would let me adapt the query above?
Hi team, can you please let me know how I can use the field extraction formula below directly with the rex command?

Field extraction formula:

^(?:[^,\n]*,){7}\s+"\w+_\w+_\w+_\w+_\w+":\s+"(?P<POH>[^"]+)
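For reference, an extraction regex like this drops straight into rex; the only change needed is escaping the double quotes inside the SPL string (the (?P<POH>...) group syntax is fine as-is). A sketch assuming the text is in _raw:

| rex "^(?:[^,\n]*,){7}\s+\"\w+_\w+_\w+_\w+_\w+\":\s+\"(?P<POH>[^\"]+)"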
Hi, I have a dashboard and I want to add a button so that when somebody solves a particular issue, they can click the button, the status changes to solved, and the issue is removed from the dashboard. For example: I have an issue on a device; once I solve it, I click the button and the issue is marked solved or removed from the dashboard.
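A common pattern for this, sketched with a hypothetical issue_status.csv lookup and made-up index/field names: track status in a lookup, filter the dashboard against it, and wire the "button" to a search that appends a solved row.

The dashboard panels filter out solved issues:

index=main sourcetype=device_issues
| lookup issue_status.csv issue_id OUTPUT status
| where isnull(status) OR status!="solved"

The click action runs a search like this with the clicked row's id:

| makeresults
| eval issue_id="$row.issue_id$", status="solved"
| fields issue_id status
| outputlookup append=true issue_status.csv

Simple XML has no native button element, so in practice the click is a table drilldown or a link that launches the outputlookup search.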
I am finally successful in connecting Splunk with Power BI. But while adding a new source and getting data in Power BI, the data models I see are different from those in Splunk's Datasets tab, and I also cannot find the table view I created in Splunk.
Hi everyone, my name is Emmanuel Katto. I'm currently working on a project where I need to analyze large datasets in Splunk, and I've noticed that search performance tends to degrade as the dataset size increases. I'm looking for best practices or tips on how to optimize search performance in Splunk.

- What are the recommended indexing strategies for managing large volumes of data efficiently?
- Are there particular search query optimizations I should consider to speed up execution time, especially with complex queries?
- How can I effectively utilize data models to improve performance in my searches (see the sketch after this post)?

I appreciate any insights or experiences you can share. Thank you in advance for your help! Best, Emmanuel Katto
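On the data-model point, the usual speed-up is to accelerate the model and query the summaries with tstats instead of searching raw events; a sketch assuming an accelerated CIM Authentication model (swap in your own model and fields):

| tstats summariesonly=true count from datamodel=Authentication where Authentication.action="failure" by Authentication.src, _time span=1h

Because tstats reads the pre-built acceleration summaries rather than raw events, aggregations over long time ranges typically return far faster than the equivalent raw search.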
Hi, can someone please help me extract multiple fields from a single slash-separated field using the rex command?

FIELD1 = ABCD/EFGH/IJ/KL/MN/OP/QRST

How can I create multiple fields from FIELD1, as below?

Field_1 = ABCD
Field_2 = EFGH
Field_3 = IJ
Field_4 = KL
Field_5 = MN
Field_6 = OP
Field_7 = QRST
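For reference, one way with rex, assuming FIELD1 always has exactly seven slash-separated segments:

| rex field=FIELD1 "^(?<Field_1>[^/]+)/(?<Field_2>[^/]+)/(?<Field_3>[^/]+)/(?<Field_4>[^/]+)/(?<Field_5>[^/]+)/(?<Field_6>[^/]+)/(?<Field_7>[^/]+)$"

If the number of segments varies, split() with mvindex() is more forgiving: | eval parts=split(FIELD1,"/") | eval Field_1=mvindex(parts,0), Field_2=mvindex(parts,1), and so on.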
Hi, can someone please let me know how I can use the expression below (generated via field extraction) directly with the rex command?

Regular expression generated via field extraction:

^(?:[^,\n]*,){7}\s+"\w+_\w+_\w+_\w+_\w+":\s+"(?P<POH>[^"]+)

I am using the rex command as below, but I am getting an error:

| rex field=Message mode=sed "(?:[^,\n]*,){7}\s+"\w+_\w+_\w+_\w+_\w+":\s+"(?P<POH1>[^"]+)"
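In case it helps anyone hitting the same error: mode=sed is only for sed-style replacement expressions (s/find/replace/), not for capturing extractions, and the unescaped inner double quotes terminate the SPL string early. A corrected sketch, assuming the text lives in the Message field:

| rex field=Message "^(?:[^,\n]*,){7}\s+\"\w+_\w+_\w+_\w+_\w+\":\s+\"(?<POH1>[^\"]+)"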
Greetings, does anyone know if it's possible to create a script that writes a Splunk search query based on an alert's results table? For example: "Multiple Failure Attempts" uses the "Authentication" data model to display results and shows only specific fields, such as username, total failure attempts, source IP, destination, etc. But when I want to investigate further and check the raw logs for more fields, I have to write a new search query by hand, specifying the fields and their values (index=* sourcetype=xxx user=xxx dest=xxx srcip=xxx), and then look for more fields in the displayed results. I would like to automate this process. Any suggestions for apps, scripts, or recommended programming languages?
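Before reaching for an external script, note that SPL's format command already turns a result table into a search string; a sketch with made-up thresholds and CIM field names:

| tstats count as failures from datamodel=Authentication where Authentication.action="failure" by Authentication.user, Authentication.dest, Authentication.src
| rename "Authentication.*" as *
| where failures > 5
| fields user dest src
| format

The output is a single search field containing a ready-made (( user="..." AND dest="..." AND src="..." ) OR ...) clause you can paste after index=* sourcetype=xxx. If you do want to script it end to end, the Splunk SDK for Python can pull an alert's results over the REST API and assemble the query string the same way.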