All Posts

Hi, there are a couple of workarounds for this. Here is one which uses loadjob: https://community.splunk.com/t5/Other-Usage/Why-export-button-is-grayed-out-in-some-of-my-panels/m-p/647806/highlight/true#M437

Another option is to just use outputlookup + inputlookup.

I don't believe there will be any real fix for this, as Splunk has announced end of support for classic dashboards and is putting all its effort into Dashboard Studio. I expect the reason export is not working when you use a base search and then utilise it in another panel is that there is only this one SID, which contains only the base search, not the searches in the panels. For that reason there are no outputs which can be exported from those other panels. They have probably decided that, since you can just run that query/panel in another window and export from there, it's not worth the cost to fix.

r. Ismo
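For reference, the outputlookup + inputlookup workaround mentioned above could be sketched roughly like this (the lookup file name is purely illustrative):

```
<your base search> | outputlookup base_results.csv
```

and then in the panel whose results you want to export:

```
| inputlookup base_results.csv | <panel-specific processing>
```

Since each panel then runs its own search with its own SID, the export button should have results to work with.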
Hi @PickleRick,

Thank you for your reply. I set the date format at the time of uploading the CSV as %d/%m/%Y. By doing this, the _time values are no longer all the same and are actually the dates from the Posted Transactions Date field.

I figured it out in the end by realizing that a scatter chart is not suitable for date values on the X axis, so I changed it to a line chart and used this search query:

index=main source="Transaction File.csv" | fields "Posted Transactions Date" _time "Debit Amount" "Spending Type" | timechart sum("Debit Amount")

Regarding your comment on quotes, I have noticed that when comparing my searches to searches I have seen elsewhere. However, I seem to need to use double quotation marks on fields. For example:

source="Transaction File.csv" 'Debit Amount'="*" (returns no events)
source="Transaction File.csv" Debit Amount="*" (returns no events)

However:

source="Transaction File.csv" "Debit Amount"="*" (returns all events)
Thanks for the idea about the limits. I checked them and it doesn't look like the cause in this case, though we have run into that before where events with too much data didn't get indexed. Also, that looks like a very useful presentation.
I can't find any evidence that this is a simple data issue. We have many customers using queries like this, and it was only noticed a couple of weeks ago. One guess is that the behavior changed when we upgraded from Splunk 8.x to 9.x a couple of months ago. To further illustrate:

| tstats count where index="my_index" eventOrigin="api" (accountId="8674756857")
Result: 6618

| tstats count where index="my_index" eventOrigin="api" (accountId="8674756857" OR serviceType="unmanaged")
Result: 0

| tstats count where index="my_index" eventOrigin="api" (accountId="8674756857" OR serviceType="unmanaged" OR noSuchField="noSuchValue*")
Result: 6618

So adding a bogus OR term with an asterisk in the value returns the correct result, but without it the result is 0. I can't imagine this is correct behavior, and we have submitted a support request to Splunk.
For timechart to work you need to have a reasonable _time field. I suppose that you ingested the whole CSV file at once and didn't properly parse the time from your data, so your _time will be the same for all events, right? The way around it would be to overwrite the _time field with the contents of your Posted Transaction Date field:

index=<yourindex> ... | eval _time=strptime('Posted Transaction Date',"%proper%time%format") | timechart values('Debit Amount')

The key thing here is specifying the proper time format for the strptime function. Also be careful about using the proper kind of quotes.
That doesn't seem right. Those fields should not be empty, so something must be overwriting them at search time (or your indexes are damaged, but let's assume they aren't). Try:

| tstats count where index=dfini by source sourcetype host

That should show you what the indexed fields are. You'll have to go through your search-time definitions to see what overwrites those values.
Hi All,

HTTP Event Collector logs are coming in to Splunk, but the host, source, and sourcetype are not showing in Splunk. Please find the screenshot below. Please help me.
Hi All,

I am trying to create a scatter dashboard or similar in Dashboard Studio to show debit transaction amounts over time. A query like this works well in Search, but translates poorly to the dashboard:

source="Transaction File.csv" "Debit Amount"="*" | stats values("Debit Amount") BY "Posted Transactions Date"

I am aware I likely need to convert the date from string format to date format within my search, something to the effect of:

| eval Date = strptime("Posted Transactions Date","%d/%m/%y")

But I am struggling to get the final result. I have also played around with using the _time field instead of the Posted Transactions Date field and with timecharts, without success, which I think is likely also a formatting issue. E.g.:

source="Transaction File.csv" | timechart values("Debit Amount")

As there are multiple debit amount values per day in some cases, I would ideally like a 2nd similar dashboard that sums these debits per day instead of showing them as individual values, whilst also removing 1 outlier debit amount value of 7000. I'm struggling a bit with the required search(es) to get my desired dashboard results. Any help would be appreciated, thank you!
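One possible shape for the second dashboard's search described above could be something like this (a sketch only, using the field names from the post and assuming the outlier can simply be filtered out by its value):

```
source="Transaction File.csv" "Debit Amount"="*"
| where 'Debit Amount' != 7000
| timechart span=1d sum("Debit Amount") AS "Total Debits"
```

The single quotes around Debit Amount in the where clause are needed because the field name contains a space.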
Exchange is a relatively big solution, so depending on what you want to ingest the answer can vary. If you want just the message tracking logs, you can easily ingest them using a monitor input. I've never dealt with SMTP or OWA logs so I can't tell you how those work, but I suppose they should also be relatively easily readable. The problem might be in parsing the data, of course. QRadar is simply a different product, so don't bring it in for comparison.
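For example, a minimal monitor input for the message tracking logs on the Universal Forwarder might look something like this (the path, sourcetype, and index here are assumptions; adjust them to your Exchange version and environment):

```
# inputs.conf on the UF - path shown is the typical Exchange 2013+ default; verify yours
[monitor://C:\Program Files\Microsoft\Exchange Server\V15\TransportRoles\Logs\MessageTracking\*.log]
sourcetype = MSExchange:2013:MessageTracking
index = exchange
disabled = 0
```

SMTP protocol logs and OWA/IIS logs could be monitored the same way with their own stanzas pointing at their respective paths.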
You didn't say what you have tried so far. Maybe you have some small, easily fixable mistake in your configs, or maybe your approach is completely wrong. Show us what you've got.
It's not extracting all of the data.
I'm not sure if any already-defined sourcetypes come with the add-on. The description says it provides transforms and some example props.conf entries, but you have to look into it yourself. Anyway, if kv is to do its job, it must know what sourcetype it's dealing with. With makeresults you haven't provided any. This add-on seems strange, though. It says it extracts some fields at index time. Unless it's the timestamp, that seems like overkill.
Hi @rrovers,

are you using a post-process search? If yes, you can only open your panel in search.

Ciao. Giuseppe
Hi @pavithra 

add your information to the below inputs.conf:

[monitor://C:\Users\_svcAPACCommVault01\OneDrive - Lendlease\Desktop\csv\*.csv]
disabled = 0
sourcetype = backup
index = acn_lendlease_commvault
host = your_host

Ciao. Giuseppe
1. The thread is relatively old so you might not get many responses. It's usually better to post a new question (linking to old thread for reference) than to dig up a several-years-old thread. 2. The .xsl file is meant to be applied on the Vault side - installed to ENE, not used on Splunk's side. https://docs.splunk.com/Documentation/AddOns/released/CyberArk/Setup
Are you trying to set it up in Cloud or on-prem? (the section of Answers where you posted it suggests Cloud but it's better to be sure).
Dear All, I need your assistance in fetching Microsoft Exchange Server logs using the Splunk Universal Forwarder. I can provide the paths for the MSG Tracking, SMTP, and OWA log files. The goal is to configure the Universal Forwarder to collect these logs and forward them to a central Splunk server. Given that the Splunk documentation indicates that the MS Exchange App is end-of-life (EOL), is it necessary to use an add-on? The documentation suggests creating GPO policies and making other changes. However, in IBM QRadar, the process is simpler: you install the WinCollect agent, specify the paths for MSG Tracking, SMTP, and OWA logs, and the agent collects and forwards the logs to the QRadar Console. The Auto Discovery feature in QRadar then creates the log source automatically. Is there a simpler and more straightforward method to collect these logs using the Splunk Universal Forwarder? Thank you in advance for your assistance.
HF on its own does not have any "HA" or "clustering" feature. There are possibilities to manually craft some active-passive solutions but they require quite a lot of "external architecting" - doing a lot of stuff semi-manually on the OS level. I think there was a .conf presentation about it a few years ago but I can't find it.
1. This is not a Professional Support service. People do have their lives and respond when they have some spare time. 2. For typical apache httpd logs there are two built-in sourcetypes - access_combined and apache_error.
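As an illustration, monitor stanzas using those built-in sourcetypes might look like this (the log paths are assumptions and vary by distribution and Apache configuration):

```
[monitor:///var/log/httpd/access_log]
sourcetype = access_combined

[monitor:///var/log/httpd/error_log]
sourcetype = apache_error
```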
Dashboard Studio does not allow for custom JS and has no built-in capability to do this on its own. You could try to do something like this using a SimpleXML dashboard and some custom JS scripting on your side.