All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hi, the following query displays user logon events for the last 10 days. We need user logon events for the last 12 months. How can this be achieved?

index=main sourcetype=WinEventLog (EventCode=4624 OR EventCode=4634) user=pratapa.ln
| eval day=strftime(_time,"%d/%m/%Y")
| stats earliest(_time) AS earliest latest(_time) AS latest by user host day
| eval earliest=strftime(earliest,"%d/%m/%Y %H.%M.%S"), latest=strftime(latest,"%d/%m/%Y %H.%M.%S")
Hi, I use a dropdown list in every dashboard of my apps:

<input type="dropdown" token="tok_filtersite" searchWhenChanged="true">
  <label>Site</label>
  <fieldForLabel>SITE</fieldForLabel>
  <fieldForValue>SITE</fieldForValue>
  <search>
    <query>| inputlookup toto.csv | fields SITE | dedup SITE | table SITE | sort +SITE</query>
    <earliest>-30d@d</earliest>
    <latest>now</latest>
  </search>
  <default>*</default>
  <initialValue>*</initialValue>
</input>
</fieldset>

Every time I open a dashboard from my nav menu, the dropdown list starts updating again, so I waste time. I need a way to populate this dropdown list just once, even when I open a new dashboard. Can anybody help me, please?
My Splunk version: 8.0.2. I want to enable summary indexing in an alert, so I tried changing "action.summary_index" to true from the advanced edit screen, following the documentation below: https://docs.splunk.com/Documentation/Splunk/8.0.2/Alert/Updatealerts#Enable_summary_indexing But when I click the Save button, it reverts to "false", and there is no error message. Does anyone know about this?
Hello, I would like to add several tab characters in the HTML panel of my XML dashboard to make the section more readable. It looks as follows:

<h1>What are the components (panels) of this dashboard?</h1>
<ul>
<li>Currently Running Startup for host: &#xA0;&#xA0;&#xA0;&#xA0; Running startup for the chosen host, refreshed every 2 min</li>
<li>Last Startup for host: Runtimes of the startup phases for the last startup of the chosen host</li>
<li>Average Startup Duration for host: Average runtimes of the startup phases for the host</li>
<li>Currently Running Startup Logs for host: Progress of the startup logs, refreshed every 2 min</li>
</ul>

But repeating the &#xA0; four times does not work; I get the escaped space only a single time. How would I do this? Kind Regards, Kamil
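As a sanity check on the entity itself: a `&#xA0;` numeric character reference does decode to one non-breaking space per occurrence, provided each reference carries its trailing semicolon. A small Python illustration (using the standard library's HTML decoder, not Splunk's renderer, so treat it as a reference point only):

```python
import html

# Four numeric character references, each terminated with ';',
# decode to exactly four U+00A0 non-breaking spaces.
decoded = html.unescape("&#xA0;&#xA0;&#xA0;&#xA0;")
print(len(decoded), decoded == "\xa0" * 4)
```

Whether Splunk's HTML panel then renders all four visibly is a separate question, but a missing semicolon on any of the references is worth ruling out first.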
Hello Experts, I have been asked by the executives to work on a plan to socialise Splunk within the company. We only bought the license last year and there are not many people in the company who know Splunk, so I will have to start from level 0. I am looking for ideas on the best approaches people have used in their own companies.
I have JSON-format data with a field called uploadDate. This has values like /Date(1584037059228)/, /Date(1584033289090)/, etc. What stanza do I need to add at index time so that it will take uploadDate as the timestamp field and convert it to a human-readable format? The following strftime works when testing: strftime(epoch/1000, "%Y-%m-%d %H:%M:%S")

Sample event:

{"fileName":"TEST.yxmd","id":"0bb814","isChained":false,"metaInfo":{"author":"","copyright":"","description":"","name":"ATEST","noOutputFilesMessage":"","outputMessage":"","url":"","urlText":""},"packageType":1,"public":false,"runCount":1,"runDisabled":false,"subscriptionId":"5d395","uploadDate":"\/Date(1584037059228)\/","version":null,"workerTag":"","collections":[{"collectionId":"5e6a534","collectionName":"Test"}],"lastRunDate":"\/Date(1584037059000-0400)\/","publishedVersionId":"5e6a0031bb","publishedVersionNumber":4,"publishedVersionOwner":{"active":true,"email":"son.com","firstName":"a","id":"c398","lastName":"ngi","sId":null,"subscriptionId":"3c395"},"subscriptionName":"i"}

Thanks in advance
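The epoch-milliseconds conversion being tested can be sketched outside Splunk as well. This Python snippet (an illustration of the arithmetic, not the props.conf fix itself) strips the /Date(...)/ wrapper and divides by 1000 before formatting; note it ignores any trailing timezone offset like the -0400 seen in lastRunDate:

```python
import re
import time

def parse_dotnet_date(value):
    """Extract epoch milliseconds from a /Date(1584037059228)/ style
    string and return a human-readable UTC timestamp."""
    match = re.search(r"/Date\((\d+)", value)
    if match is None:
        return None
    # The captured digits are milliseconds since the epoch.
    epoch_seconds = int(match.group(1)) / 1000
    return time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(epoch_seconds))

print(parse_dotnet_date("/Date(1584037059228)/"))  # -> 2020-03-12 18:17:39
```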
Hi all, how do I get the difference after using the chart command? I ran this:

| eval year=strftime(X,"%y")
| eval month=strftime(X,"%m")
| transaction event
| chart count over year by month

It returns this table:

year  01  02  04  05  07  08  09  10  11  12  OTHER
19     2   3  37  33  14  25  30  10  32  21      8
20     9  24   3   2   1   1  22   5  55  23      7

I want to compute year-over-year change. For example, January of year 19 is 2 and January of year 20 is 9, so the year-over-year figure is 9/2*100 = 450% (as an example). But I don't think I can use the delta command. Is there any way to compute year-over-year? Thank you for helping.
Hi, I am looking for some help on the query below. I have a list of APIs with different parameters in the URL. I have extracted the values from the URL and stored them in a field using the replace command.

Questions:
1) How would I be able to combine them and store them in one regex expression?
2) If I had them stored in one expression, would it be possible to display the count based on the selected API?

Splunk query:

index=abcd appname=xyz
| rex field=message "(GET|POST).(?[^\?\s]+)"
| rex field=message "HTTP\/\S+.(?[^\ ]+)"
| search RespCode=50*
| eval api=replace(api, "(/api/abc/v2/user/Id/.*)","/api/abc/v2/user/Id/Unique_Value")
| eval api=replace(api, "(/api/abc/v2/Name/.*)","/api/abc/v2/user/Name/Unique_Value")
| eval api=replace(api, "(/api/abc/v2/user/.*)","/api/abc/v2/user/Unique_Value")
| eval api=replace(api, "(/api/abc/v2/name/.*/info)","/api/abc/v2/name/unique_value/info")
| eval api=replace(api, "(/api/abc/v2/info/.*/name)","/api/abc/v2/info/unique_value/name")
| rex field=message "user.Id.(?[^\ ]+)"
| stats count

Can someone help me work out whether this is possible within Splunk queries? I am still learning. I appreciate any assistance. Thank you.
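The chain of replace() calls above is equivalent to applying an ordered rule table: more specific patterns must be tried before broader ones, or /user/Id/... would be swallowed by the /user/... rule. A Python sketch of that idea (illustrative paths only, not the poster's production regexes):

```python
import re

# Ordered (pattern, replacement) pairs: most specific paths first, so the
# narrow /user/Id/ rule wins before the broad /user/ rule can match.
RULES = [
    (r"/api/abc/v2/user/Id/.*", "/api/abc/v2/user/Id/Unique_Value"),
    (r"/api/abc/v2/name/.*/info", "/api/abc/v2/name/unique_value/info"),
    (r"/api/abc/v2/user/.*", "/api/abc/v2/user/Unique_Value"),
]

def normalize(api):
    """Collapse a parameterized API path to its template form."""
    for pattern, replacement in RULES:
        if re.fullmatch(pattern, api):
            return replacement
    return api  # no rule applied: leave the path untouched

print(normalize("/api/abc/v2/user/Id/12345"))
```

Once every URL is normalized to a template like this, a simple `stats count by api` groups the counts per API, which addresses question 2.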
Hi, I really appreciate the value of the Splunk forum and the help from the community. I'm learning a lot. I have a question as I am trying to figure this out. I have data coming from different APIs, with a parameter in the header. I have used the replace command on the header and stored each value under a variable, as in the example below.

Example using replace:
api/v1/testuser1 -> api/v1/unique_value
api/v2/testinfo1 -> api/v2/unique_value

Replace query:
| eval api=replace(api, "(api/v1/.)","api/v1/unique_value")
| eval api=replace(api, "(api/v2/.)","api/v2/unique_value")

When I run this as a search query, I am able to fetch the results, but when I use the above in a dashboard drilldown, it doesn't work. Can someone please help with this? Thanks.
I have the Okta Identity Cloud Add-on for Splunk installed on a heavy forwarder. The maximum log batch size is configured at 500,000, and every other limit setting (under add-on settings) is configured at the maximum. For inputs, it is configured to bring in log metrics, since I am interested in authentication API requests. I ran into some issues where logs would be about an hour or two behind in the afternoon of each day, since that is when the most activity on our platform occurs. I ended up having to increase the typing queue and indexing queue on this heavy forwarder (in the server.conf file) to fix the queueing issues this box was running into. I still notice that in the afternoon it will fall behind 15 minutes to half an hour, and then by the time morning rolls around, it is caught up. I checked the system logs in the Okta admin portal, and I am not hitting any rate limits, or even warnings, when this occurs. I am wondering if I have hit the limit of either the API or the add-on itself. The box that runs this heavy forwarder has only about 25% of its memory and 25% of its CPU in use.
Hi there! I created a hacky Splunk query for some YoY analysis I'm doing. I was wondering if there is a way to stop data from the previous year loading past today's date. For example, today is 3/12; I'd like data from the previous year and this year to show up only through 3/12. The way my query (and time range selector) works now, it loads all data from the previous year (I've attached an image of what currently loads). The next day, the "end date" will update to 3/13, and I'd want my previous-year data to reach only that date ceiling. Here's the query I'm working with:

((index=wsi_tax_summary sourcetype=stash capability=109* tax_year=2019 ein=* earliest=1578096000 latest=now()) OR (index=summary_dac_tax partnerId!=*Test* tax_year=2018 capability=*109* tax_year=2018 earliest=1546560000 latest=1556668800)) (intuit_offeringid=Intuit.platform.turbotaxipad.turbotaxmac OR intuit_offeringid=Intuit.platform.turbotaxwindows OR intuit_offeringid=Intuit.tax.ctg.ice.109ximportwidget) error_msg_host=SUCCESS partnerId!=*test* partnerId=*
| eval Date=strftime(_time,"%m-%d")
| chart dc(intuit_tid) by Date tax_year
| rename "2018" as "TY18", "2019" as "TY19"
| sort by Date
| streamstats sum(TY18) as TY18 sum(TY19) as TY19
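The "date ceiling" being described is a month-day comparison that ignores the year: keep an event only when its month/day is on or before today's month/day. A minimal Python sketch of that predicate (dates are illustrative):

```python
from datetime import date

def within_ytd_ceiling(event_date, today=None):
    """True when event_date's month/day is on or before today's month/day,
    regardless of year -- the rolling 'date ceiling' described above."""
    if today is None:
        today = date.today()
    # Tuple comparison: (3, 13) <= (3, 12) is False, so a prior-year
    # 3/13 event is excluded when 'today' is 3/12.
    return (event_date.month, event_date.day) <= (today.month, today.day)

print(within_ytd_ceiling(date(2019, 3, 13), today=date(2020, 3, 12)))
```

Since the query already builds a "%m-%d" Date field, the same idea may work in SPL as a filter such as `| where strftime(_time,"%m-%d") <= strftime(now(),"%m-%d")`, because zero-padded "%m-%d" strings sort lexicographically in date order (worth verifying against the actual data).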
Hello, I'm trying to integrate the TA-symantec_atp with the EDR console. When I provide the EDR URL and the password (client_id:client_secret), I'm seeing the below error. [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:741)
Hi, I have JSON data being sent to Splunk as below:

{
  "timestamp": "2020-03-12T18:18:48+00:00",
  "siteid": "CPM-1600-2-EC-158",
  "location": "LABRACK1",
  "powerunit": "1",
  "outletmetering": "1",
  "ats": "0",
  "branchcount": "4",
  "plugcount": "16",
  "powerfactor": "100",
  "powereff": "100",
  "powerdatacount": "1",
  "powerdata": [{
    "timestamp": "2020-03-12T18:18:48+00:00",
    "plug1": [{ "plugname": "PaloAlto5220", "voltage": "125.00", "current": "6.00", "branch": "1" }],
    "plug2": [{ "plugname": "Cisco5220Meraki", "voltage": "125.00", "current": "6.00", "branch": "1" }],
    "plug3": [{ "plugname": "Outlet_A3", "voltage": "125.00", "current": "1.40", "branch": "2" }]
  }]
}

How do I extract this JSON using field extraction in Splunk? I want to group the data like this. Can someone please point me to the right way to do it?
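The grouping being asked for amounts to flattening the nested powerdata/plugN arrays into one row per plug. Outside Splunk, that transformation looks like this (Python, using a trimmed copy of the event above; in Splunk itself, spath plus mvexpand is the usual starting point, but the exact query depends on how the event is ingested):

```python
import json

raw = """{"powerdata": [{"timestamp": "2020-03-12T18:18:48+00:00",
  "plug1": [{"plugname": "PaloAlto5220", "voltage": "125.00",
             "current": "6.00", "branch": "1"}],
  "plug2": [{"plugname": "Cisco5220Meraki", "voltage": "125.00",
             "current": "6.00", "branch": "1"}]}]}"""

def rows_per_plug(event):
    """Flatten the nested powerdata/plugN arrays into one dict per plug."""
    rows = []
    for sample in event.get("powerdata", []):
        for key, value in sample.items():
            # Each plugN key holds a one-element list of plug readings.
            if key.startswith("plug") and isinstance(value, list):
                for plug in value:
                    rows.append({"plug": key, **plug})
    return rows

for row in rows_per_plug(json.loads(raw)):
    print(row["plug"], row["plugname"], row["voltage"], row["current"])
```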
Is there a visualization of the _time vs. _indextime alerting issue, such as the one described in "How to alert using _indextime for window instead of _time"? I need to show it visually to the managers here.
I have difficulty calculating statistics when different (multiple) values are present for a field in the same event. My query currently extracts the error field (for which I have written the regular expression), looks for the error, and calculates a count. How can I overcome this? Please help!
Hi All, I have enabled a threat feed in my Splunk Enterprise Security app, and the data was working fine until a few days ago, when we disabled acceleration on one of the datamodels. Since then, we are not seeing any threat activity for a few of the datamodels in the setup. Does the threat intel framework work only on accelerated datamodels, or is this a different issue? Thanks in advance!
I had a report that would run on the first Thursday of every month. It had been working for months. Recently, it started running more often than I expected based on how the cron was written, and the only thing I can attribute it to at the moment is that we upgraded to 7.3.4 from 6.1.1 (I think). The cron is: 0 4 1-8 * 4 What it is doing now is running every day from the 1st to the 8th of every month, and also running every Thursday at 4 am.
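The behavior described matches the standard crontab(5) convention: when both the day-of-month and day-of-week fields are restricted, the job fires when either one matches, not both. First-Thursday scheduling needs both conditions to hold, as this Python sketch of the two interpretations shows (dates are just illustrations):

```python
from datetime import date

THURSDAY = 3  # Monday is 0 in Python's weekday()

def cron_or_semantics(d):
    """How '0 4 1-8 * 4' fires under standard cron: the day-of-month
    range OR the day-of-week field may match."""
    return 1 <= d.day <= 8 or d.weekday() == THURSDAY

def first_thursday(d):
    """The intended behavior: both conditions must hold. The first
    Thursday of a month always falls within days 1-7."""
    return d.weekday() == THURSDAY and d.day <= 7

# 2020-03-02 is a Monday inside the 1-8 window: OR semantics fire anyway,
# even though it is not the first Thursday (which is 2020-03-05).
print(cron_or_semantics(date(2020, 3, 2)), first_thursday(date(2020, 3, 2)))
```

If the scheduler applies OR semantics, one common workaround is to schedule the job every Thursday (0 4 * * 4) and have the search itself discard runs where the day of the month is greater than 7.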
Does anyone know if Splunk for Cisco Identity Services is compatible with Splunk Enterprise version 8.0.1, or is there another app I could use for the same purpose? Thanks.
I have two searches. Search A produces a table output of "UserIP". Search B produces a table output of "FailedDestinationIP" and "FailedSourceIP" (the search will have both values reported). I want to see if "UserIP" matches either "FailedDestinationIP" or "FailedSourceIP" in search B. If it matches "FailedDestinationIP", I want the output of "FailedSourceIP"; if it matches "FailedSourceIP", I want the output of "FailedDestinationIP". It's OK if this is broken into two searches rather than performed in one operation, as I really want two lists, of either FailedSourceIP or FailedDestinationIP, where UserIP matches the opposite field. I have tried some join and eval commands, but my results are not correct.

Search A:
source = abc
| rex ".UserIP (?\S+)"
| rex "(role(s):\s|role\s)(?\S+)."
| rex "AuthGroup: (?\S+)"
| search roles=$RoleName$
| table UserIP
| dedup UserIP

Search B:
source = xyz
| rex "unnel (?\S+) failed to (?\S+)"
| table FailedDestinationIP FailedSourceIP

It seems simple, but I can't get it right. Thanks in advance!
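Stripped of the SPL mechanics, the matching logic is a set-membership check against the user IP list, reporting the opposite end of each failure pair. A Python sketch with hypothetical IPs (in SPL, a subsearch or lookup is often a better fit than join for this pattern):

```python
def match_failed_ips(user_ips, failed_pairs):
    """For each (source, destination) failure pair, report the opposite
    end whenever one end appears in the user IP list."""
    users = set(user_ips)
    matched_sources, matched_destinations = [], []
    for source, destination in failed_pairs:
        if destination in users:
            matched_sources.append(source)       # list 1: FailedSourceIP
        if source in users:
            matched_destinations.append(destination)  # list 2: FailedDestinationIP
    return matched_sources, matched_destinations

sources, destinations = match_failed_ips(
    ["10.0.0.5"],
    [("192.168.1.9", "10.0.0.5"), ("10.0.0.5", "172.16.0.2")],
)
print(sources, destinations)
```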
So we have a client system that has its own Splunk indexer. For certain reasons they do not want their Splunk universal forwarders sending logs to two separate indexers; they want to continue to have all their logs sent to their indexer, and then forward selected indexes from their indexer to ours. Most of the indexAndForward options seem to require a heavy forwarder to work. We are trying to interfere with their current setup as little as possible, and adding a heavy forwarder seems like it would be exactly that. Any thoughts would be greatly appreciated.
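For reference, a full indexer can forward selectively without an extra heavy forwarder via outputs.conf on the indexer itself. A sketch, where the index names and receiver address are placeholders, and the forwardedindex filter syntax should be verified against the outputs.conf documentation for their Splunk version:

```ini
# outputs.conf on the client's indexer (sketch; names are hypothetical)
[indexAndForward]
index = true                       # keep indexing locally as before

[tcpout]
defaultGroup = partner_forward
forwardedindex.filter.disable = false
forwardedindex.0.whitelist = shared_index_1
forwardedindex.1.whitelist = shared_index_2

[tcpout:partner_forward]
server = example-receiver:9997
```

The forwardedindex.N.whitelist entries restrict which indexes are forwarded, so only the agreed-upon data leaves their indexer while everything else stays local.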