All Topics

Hello, I am new-ish to Splunk and have a question about using a lookup table: I want to include all values listed in the lookup in my search output, even when there are no related events.

To summarize, I have a lookup file that correlates a server name with an environment name:

host, EnvName
server1, EnvA
server2, EnvB
serverN, EnvN
...

I am trying to show the number of events per day per server. My issue is that when there are no events for a server during the timeframe, that server is not listed in the output.

index=index_name sourcetype=sourcetype_name | lookup lookup_name host OUTPUT EnvName | chart count by ....

With this search, only values with events are returned. Is there a way to include the lookup values that have no events during the timeframe? Thanks.
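One common pattern for this (a sketch; index, sourcetype, and lookup names are the placeholders from the post) is to append every row of the lookup with a zero count, then sum the counts so servers without events still appear:

```
index=index_name sourcetype=sourcetype_name
| lookup lookup_name host OUTPUT EnvName
| stats count by host EnvName
| append [| inputlookup lookup_name | eval count=0]
| stats sum(count) as count by host EnvName
```

Servers with no events in the timeframe come only from the appended inputlookup rows, so they show up with count=0.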
My app is not showing up in Splunk Cloud, yet I have uploaded it to Splunkbase and it passes all the AppInspect checks without any errors. This is the app in question: https://splunkbase.splunk.com/app/3980
This question was asked four years ago (https://community.splunk.com/t5/All-Apps-and-Add-ons/timeline-custom-visualization-Increase-the-width-of-labels-in/m-p/209544), and I'm still having the same issue. I tried adding a newline, which did not fix the problem. Is there a solution today that lets me expand the width of the left label panel, or wrap the labels onto two lines? Thanks, Paul
Hi community,

I am trying to get my head around the best way to import CloudTrail data from AWS into Splunk. What I don't understand right now is the difference between these two ways of getting CloudTrail data into Splunk with the Add-on for AWS:

First way: CloudTrail input
Second way: SQS-based S3 input with type CloudTrail

Both inputs appear to use the same mechanism: CloudTrail data is dumped to an S3 bucket, notifications are sent to SNS and forwarded to SQS, and Splunk reads from SQS and fetches the data from S3. Can somebody give me a hint how these two inputs differ, and which one is recommended? Thanks!
Hello,

I have this source path, for example:

AAN831AA_ELS_Log_20200102064858+0000-[PS-MELSPP2]/fsmLog_010220-064856_4C5KKM00_AAN831AA.txt

and I want to extract the part before the '-':

AAN831AA_ELS_Log_20200102064858+0000

What should I do? Thanks.
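Since the wanted prefix in the example contains no '-' itself, a minimal rex sketch (the field name log_prefix is an assumption) is to capture everything from the start up to the first '-':

```
... | rex field=source "^(?<log_prefix>[^-]+)"
```

For the sample path this yields log_prefix=AAN831AA_ELS_Log_20200102064858+0000. If a real prefix can ever contain a '-', anchor on the "-[" that precedes the bracketed part instead.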
I have recently created a version 8.0.6 indexer cluster with SmartStore (S2) enabled. When necessary, I would like to spin up a temporary Splunk server to read historic data from S2. A use case would be an incident where investigators need, say, two years of data out of S2 to forensicate. We don't want to pull that amount of historic data back into our production cache, as it would cause performance issues and shorten local cache availability.

Does anyone know how to attach a Splunk instance ad hoc to an existing S2 bucket? I presume the new instance will need all the existing indexes.conf information, but I am not sure what else is required. Please advise. Thank you!
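For reference, the core of what the temporary instance's indexes.conf would need is the same remote volume definition the cluster uses (a sketch; bucket path, endpoint, and volume name here are placeholders, and the index stanzas from the cluster must be copied over as well):

```
[volume:remote_store]
storageType = remote
path = s3://your-s2-bucket/
remote.s3.endpoint = https://s3.us-east-1.amazonaws.com

[default]
remotePath = volume:remote_store/$_index_name
```

The instance also needs credentials with read access to the bucket; how safely a second instance can read a bucket owned by a live cluster is exactly the kind of thing worth confirming with Splunk support first.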
Hello all,

Below is sample data for one cycle; there are several similar cycles. The data comes from sensor values. Each cycle starts with SPWord=0 and ends with SPWord=8. How do I group these as cycle 1, 2, 3, ... based on the SPWord field? The time difference is not the same for all cycles, and when I use the transaction command I am not able to name them cycle 1, cycle 2. Please could you help?

"_time",FrontFormedHeight,Pot,LVDT,LoadCell,SPWord,LineNumber,cycl
"2020-09-29T14:01:24.000+0000","2.703","67.64","0.198",0,0,1,CycleStart
"2020-09-29T14:01:26.000+0000","2.703","67.63","0.198",0,0,2,CycleStart
"2020-09-29T14:01:29.000+0000","2.703","67.64","0.198",0,0,3,CycleStart
"2020-09-29T14:01:31.000+0000","2.703","67.63","0.195",0,0,4,CycleStart
"2020-09-29T14:01:34.000+0000","2.703","67.64","0.198",0,0,5,CycleStart
"2020-09-29T14:01:36.000+0000",0,"67.64","0.198",0,1,6,
"2020-09-29T14:01:38.000+0000",0,"51.66","0.198",0,,7,
"2020-09-29T14:01:39.000+0000",0,"24.61","1.565",0,2,8,
"2020-09-29T14:01:41.000+0000",0,"24.59","1.59",0,2,9,
"2020-09-29T14:01:43.000+0000",0,"18.3","7.551","1160.9241004669334",,10,
"2020-09-29T14:01:44.000+0000","2.52","67.43","0.195",0,4,11,
"2020-09-29T14:01:46.000+0000","2.52","15.53","1.413",0,5,12,
"2020-09-29T14:01:48.000+0000","2.52","12.1","4.471","472.4265266884365",,13,
"2020-09-29T14:01:51.000+0000","2.52","67.53","0.195",0,7,14,
"2020-09-29T14:01:53.000+0000","2.52","67.59","0.195",0,,15,
"2020-09-29T14:01:56.000+0000","2.52","67.64","0.198",0,8,16,cycleEnd
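A common alternative to transaction here is streamstats: number the cycles by counting how many cycle-end markers have been seen so far (a sketch; it assumes events are sorted oldest-first and that exactly one SPWord=8 row closes each cycle):

```
... | sort 0 _time
| streamstats count(eval(SPWord=8)) as ends
| eval cycle = ends + if(SPWord=8, 0, 1)
| stats ... by cycle
```

Rows in the first cycle have ends=0, so cycle=1; the closing SPWord=8 row has ends=1 but is assigned back to its own cycle by the if(). The same idea works with count(eval(cycl="cycleEnd")) if that field is more reliable.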
I've installed the agent and can see it in the "Tiers & Nodes" view, but on the "Getting Started Wizard" page, "Connect the Agent to the AppDynamics Controller" still shows "Waiting for Connection...". The troubleshooting page does not help. I sent test requests to the application to generate some work. I didn't find a way to attach logs here.
Hi,

I'm facing a strange issue in Splunk. We are ingesting data into Splunk from a SQL Server view. The SQL Server view returns the correct values, but the Splunk sourcetype doesn't.

A particular field, reporting, has two values (Yes or No). In Splunk, Yes has a count of 215 and No 44, but the correct counts from the warehouse view are Yes 246 and No 48.

When I run the search index=x sourcetype=y | search reporting="No" | stats count by z, the count I get is 44, not 48. The interesting-fields panel also shows 215 and 44, and it reports the reporting field as present in only about 86% of events. Is this the reason? The events tab shows no null values for the reporting field, yet I still see this discrepancy with the SPL above. When I apply fillnull to reporting, I get 48, but those nulls do not exist in the warehouse view.

Can anyone help me find out why the counts differ between the warehouse view and the Splunk sourcetype, and why some values are moved to null?
Hi Team,

I have the extensible attributes below in the message field. I am able to extract Assignor ID and its value, but I'm facing an issue with BUILDING and its value. I guess it's because the value is wrapped in special characters, i.e. ["NOI_SEC60_GF"]].

network_view=default extensible_attributes=[[name="Assignor ID",value="51806153"],[name="BUILDING",value=["NOI_SEC60_GF"]]

I tried a few solutions but they are not working. @MuS

Thanks,
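A rex sketch that tolerates the optional bracket before the quoted value (the captured field name BUILDING is an assumption; adjust field= to wherever the raw text lives):

```
... | rex "name=\"BUILDING\",value=\[?\"(?<BUILDING>[^\"]+)\""
```

Against the sample, \[?\" matches the ([" sequence and the capture stops at the closing quote, yielding BUILDING=NOI_SEC60_GF.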
Hi all,

I have successfully made a search to populate a CSV file, thanks to @gcusello. This file lets me add usernames and timestamps to monitor their last successful logon. Now, after a certain time, I would like to delete some rows without overwriting the whole file. If possible, I want to check in AD whether the user list is the same as in the CSV file; if a user is no longer in AD, that user's row needs to be deleted from the CSV. Can anyone help me create a search to delete those rows, if this is possible? Otherwise I will have to do it manually or via another script.

This is an example of the generated CSV file:

Time | User
---------------------------------------------
1601341200 | User_Alpha
1601348400 | User_Beta
1601355600 | User_Charlie

Thank you very much, Sasquatchatmars
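One pattern for this (a sketch; mylookup.csv and the AD index/sourcetype are placeholder assumptions) is to re-read the lookup, keep only rows whose User still appears in AD, and write the result back over the same file:

```
| inputlookup mylookup.csv
| search [ search index=ad_index sourcetype=ActiveDirectory | stats count by User | fields User ]
| outputlookup mylookup.csv
```

The subsearch expands to (User="..." OR User="...") for every current AD user, so rows for deleted users are filtered out before outputlookup rewrites the file. It assumes the AD data uses the same User field name as the CSV.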
I need help joining two of my reports.

The first report fetches host names with event code 52 and, per the time picker, runs over the last 24 hours. The second report fetches host names with event code 52 and counts how often each host has repeated over the last 30 days.

I want to join both reports on host name: the 30-day report should be matched against the hosts seen in the last 24 hours, and also show for how many days each particular host has repeated with event code 52.
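This can often be done in one search instead of a join (a sketch; the index and EventCode field names are assumptions to adapt to your data): run over 30 days, count distinct days per host, then keep only hosts also seen in the last 24 hours.

```
index=wineventlog EventCode=52 earliest=-30d
| eval day=strftime(_time, "%Y-%m-%d")
| stats dc(day) as days_seen count as total_events max(_time) as last_seen by host
| where last_seen >= relative_time(now(), "-24h")
```

Hosts whose most recent event is older than 24 hours are dropped, and days_seen gives the number of distinct days on which each remaining host repeated.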
Hi, I want to create a report to display the time spent by each user in a console. Being a beginner, I don't know how to write the query. Any suggestions?

index="123" AND organizationId="0123000000000342" logRecordType=ailtn ("appName":"Collections_Platform" AND "appType":"Console")
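A rough sketch of one approach (the userId field is an assumption, and treating first-to-last event per user as "time in console" is a simplification; real session tracking may need transaction with a timeout):

```
index="123" organizationId="0123000000000342" logRecordType=ailtn
| stats earliest(_time) as session_start latest(_time) as session_end by userId
| eval minutes_in_console = round((session_end - session_start) / 60, 1)
```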
Hi Team,

I am trying to get a list of all jobs on a button click, using a public API, in a Splunk dashboard, but I am getting a CORS policy error. The same call from Postman returns the list of jobs, but it does not work from Splunk. Below is what is already implemented:

---------- server.conf ----------
[httpServer]
crossOriginSharingPolicy = *
crossOriginSharingHeaders = *
allowBasicAuth = true

[general]
listenOnIPv6 = yes

(1) ---------- JS code ----------
console.log('Check require !');
require([
    "jquery",
    "splunkjs/mvc/searchmanager",
    "splunkjs/mvc/simplexml/ready!"
], function($, SearchManager) {
    var auth_token = "****";
    $(".button1").on("click", function() {
        var settings = {
            "crossDomain": true,
            "url": "https://********************/api/2.0/jobs/runs/list",
            "contentType": "application/x-www-form-urlencoded;charset=utf-8",
            "method": "GET",
            "headers": {
                "Authorization": "Bearer ****",
                "Access-Control-Allow-Origin": "*",
                "Access-Control-Allow-Methods": "GET,POST,PUT,DELETE,OPTIONS",
                "Access-Control-Allow-Headers": "Content-Type, Access-Control-Allow-Headers, Authorization, X-Requested-With"
            }
        };
        $.ajax(settings).done(function(response) {
            console.log(response);
        });
    });
});

(2) ---------- Error ----------
Access to XMLHttpRequest at 'https://************/api/2.0/jobs/runs/list?_=2839475543271' from origin 'http://host.example.com:8000' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource.

I need urgent help with this issue. Please help @thambisetty, @ITWhisperer, @niketn, @gcusello, @isoutamo, @woodcock
I have a field, Vulnerability Ages, whose values are given in days, like 120days, 110days, 30days, 45days. I need to fetch the count where vulnerability age >= 30 and vulnerability age <= 120. How can I achieve this?

I have tried "Vulnerability Ages">=30 but got no results.
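The comparison likely fails because values like "30days" are strings, not numbers. A sketch (the field name 'Vulnerability Ages' is taken from the post) that strips the suffix and converts before filtering:

```
... | eval age_days = tonumber(replace('Vulnerability Ages', "days", ""))
| where age_days >= 30 AND age_days <= 120
| stats count
```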
I have a query that gives the different IPs hitting the top URIs:

source="some source" 404
| stats count values(Real_IP) as Real_IP by URI
| sort - count

Out of these IPs, I want to know how many times each IP occurred, in the same format and in the same query.
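One sketch that keeps the per-URI layout but annotates each IP with its own count (building the counts first, then rolling up per URI):

```
source="some source" 404
| stats count as ip_count by URI Real_IP
| eval ip_with_count = Real_IP . " (" . ip_count . ")"
| stats sum(ip_count) as count values(ip_with_count) as Real_IP by URI
| sort - count
```

The output still has one row per URI, with Real_IP now showing each address followed by its occurrence count in parentheses.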
I have several events that are structured like this:

2020-09-28T15:18:40Z duration=8.0 somevalue=42 otherfield=A
2020-09-28T15:18:45Z duration=2.0 somevalue=10 otherfield=B
2020-09-28T15:18:44Z duration=2.0 somevalue=10 otherfield=B

Here "duration" is in seconds. I would like to transform those events into a kind of timechart by spreading "somevalue" over the "duration", starting from the "_time" of the event, with a span of 1 second for example. Another condition is being able to aggregate on "otherfield". I'm expecting something like this:

_time                  sum(somevalue) otherfield=A   sum(somevalue) otherfield=B
2020-09-28T15:18:40Z   42   0
2020-09-28T15:18:41Z   42   0
2020-09-28T15:18:42Z   42   0
2020-09-28T15:18:43Z   42   0
2020-09-28T15:18:44Z   42   10
2020-09-28T15:18:45Z   42   20
2020-09-28T15:18:46Z   42   10
2020-09-28T15:18:47Z   42   0
2020-09-28T15:18:48Z   42   0
2020-09-28T15:18:49Z   0    0

I tried the "concurrency" command but was not able to get the values spread over several intervals (I only got a value at the _time of each event).
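A sketch for this shape of problem (it assumes durations are whole seconds): replicate each event once per second of its duration with mvrange/mvexpand, shift _time by the offset, then let timechart do the aggregation.

```
... | eval offsets = mvrange(0, duration)
| mvexpand offsets
| eval _time = _time + offsets
| timechart span=1s sum(somevalue) by otherfield
```

mvrange(0, 8) yields offsets 0 through 7, so an 8-second event contributes its somevalue to 8 consecutive 1-second buckets, and overlapping events with the same otherfield are summed.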
Hi, this is my AWS API query:

search index=aws userIdentity.type=Root eventName=ConsoleLogin earliest=-10d
| rex field=_raw "MFAUsed\D\D\s\D(?P<Mfa>\D?\S)"
| rex field=_raw "principalId\D:\s\D(?P<principalId>\d*)"
| stats count by principalId

It's working and I'm getting results. Now I need your help adding the field that I parse (Mfa) and filtering on Mfa="No", but then I get no results. I tried something like this:

search index="aws" userIdentity.type="Root" eventName="ConsoleLogin" Mfa="No*" earliest=-10d
| rex field=_raw "MFAUsed\D\D\s\D(?P<Mfa>\D?\S)"
| rex field=_raw "principalId\D:\s\D(?P<principalId>\d*)"

What am I missing? Thanks!
Hi Team,

We are currently extracting logs from Splunk via the Splunk SDK based on index time. We have been seeing issues with logs that are indexed at the current time but belong to last month, and we would like to filter them out of the results.

Splunk SDK JobExportArgs object:

JobExportArgs jobArgs = new JobExportArgs();
jobArgs.setIndexEarliest("2020-09-29T01:00:00");
jobArgs.setIndexLatest("2020-09-29T02:00:00");

The ask is to extract events based on index time, but also filter on event time (for example, only events whose event time is within the last 24 hours). Does the Splunk SDK support this feature? Thanks
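Even independently of SDK parameters, both constraints can be expressed in the search string handed to the export job, since SPL supports index-time and event-time modifiers side by side (a sketch; the index name and time values are placeholders):

```
search index=my_index _index_earliest="09/29/2020:01:00:00" _index_latest="09/29/2020:02:00:00" earliest=-24h
```

_index_earliest/_index_latest restrict by index time while earliest restricts by event time, so late-arriving events older than 24 hours are excluded.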
Hi everyone,

Below is my data (raw logs):

pod_name=node-fdzz message=2020-09-25 21:09:33.969 ERROR [node,00e,4deca,false]67 --- [r-84-548] c.r.Resolver

How do I extract the date and time, 2020-09-25 21:09:33.969, from the above log data? Can anyone guide me on this?
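A rex sketch for this (the captured field name log_time is an assumption), anchoring on the message= prefix that precedes the timestamp:

```
... | rex "message=(?<log_time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3})"
```

If the value is needed as epoch time, follow with | eval ts = strptime(log_time, "%Y-%m-%d %H:%M:%S.%3N").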