All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

That is not normal, but not unheard of.  It depends on how busy the staff is.  Your Splunk account team should be able to find out why the vetting process is taking so long.
I suspect you are right, but you probably should post a separate question about that.
What is your question?  What have you tried so far and how did those efforts not meet expectations? Have you looked at the JSON functions in the Search Reference Manual?
I have found a search in the Chargeback app that might fit for seeing the SVCs by index. That's how my company manages costs, by index. The search is good, but I'm still having issues getting just the SVCs and index as my return. I did modify it from one day to one month, but I only want it to bring back one month and thus have only one line of results. Any help would be appreciated.

index=summary source="splunk-ingestion"
| `sim_filter_stack(myimplementation)`
| dedup keepempty=t _time idx st
| stats sum(ingestion_gb) as ingestion_gb by _time idx
| eventstats sum(ingestion_gb) as total_gb by _time
| eval pct=ingestion_gb/total_gb
| bin _time span=1m
| join _time
    [ search index=summary source="splunk-svc-consumer" svc_consumer="data services" svc_usage=*
    | fillnull value="" svc_consumer process_type search_provenances search_type search_app search_label search_user unified_sid search_modes labels search_head_names usage_source
    | eval unified_sid=if(unified_sid="",usage_source,unified_sid)
    | stats max(svc_usage) as utilized_svc by _time svc_consumer search_type search_app search_label search_user search_head_names unified_sid process_type
    | timechart span=1m sum(utilized_svc) as svc_usage ]
| eval svc_usage=svc_usage*pct
| timechart useother=false span=1m sum(svc_usage) by idx limit=200
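One note on the span parameter, offered as a hedged sketch rather than a tested fix: in SPL, span=1m means one minute, not one month (a monthly bin would be span=1mon). The per-minute binning here is what makes the join line up, so one option is to leave the minute-level logic alone, restrict the search window to the month, and swap only the final timechart for a plain stats so each index collapses to a single row (monthly_svc is a placeholder field name):

```spl
index=summary source="splunk-ingestion" earliest=-1mon@mon latest=@mon
  ... (rest of the pipeline unchanged, through "| eval svc_usage=svc_usage*pct") ...
| stats sum(svc_usage) AS monthly_svc BY idx
```

With stats instead of timechart, the output is one row per index for the whole search window rather than one column per index over time.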
Hi, What's the best way to only do a lookup based on the results of the main search? I want to only run it when two fields don't match. Pseudocode would be: IF field1!=field2 THEN | lookup accounts department as field2 OUTPUT. So, like the if-then statement most programming languages allow. Thanks, Lee
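For what it's worth, SPL has no conditional lookup, but a common workaround is to run the lookup on every event and then only keep its output when the condition holds. A sketch, where department_name is a placeholder for whatever field your accounts lookup actually returns:

```spl
| lookup accounts department AS field2 OUTPUT department_name
| eval department_name=if(field1!=field2, department_name, null())
```

Events where field1 equals field2 end up with a null department_name, which is effectively the same as the lookup never having run for them.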
"where sha256a is from | eval sha256a=sha256({ "key1": "val1", "key2":"val2"})" What are you saying in the above statement? Do you want the events to be sha256-hashed? That's not what you put in your example, so that part is a bit confusing. The first event in your combined JSON starts with sha256a and the second with sha256b. Should the next be sha256c? Please post example events and an example of what you would like them transformed into.
@VatsalJagani Thanks for your message. I just checked the automatic lookups and I don't see one created for test.csv. Am I missing something? Do I need to check somewhere else? Please help me.
Hi, a quick summary of our deployment:
- Splunk standalone 9.0.6
- Palo Alto Add-on and App freshly installed, 8.1.0
- SC4S v3.4.4 sending logs to Splunk
- PA logs ingested into indexes and sourcetypes according to the SC4S official doc https://splunk.github.io/splunk-connect-for-syslog/main/sources/vendor/PaloaltoNetworks/panos/
- I see events in all indexes and with all sourcetypes. Indexes: netfw, netproxy, netauth, netops. Sourcetypes: pan:traffic, pan:threat, pan:userid, pan:system, pan:globalprotect, pan:config

What else do I need to do to make the official Palo Alto App work? I checked the documentation https://pan.dev/splunk/docs/installation/ and I enabled the data acceleration, and still no data is shown in any dashboard. I don't know what else is missing; any suggestions? Thanks a lot
Thanks @gcusello !
Hi @jwalrath1, I don't think this is a question for the Community: it requires a Splunk Professional Services specialist. Anyway, if you can define a hostname and/or an IP to use for configurations it should work, but I suggest asking Splunk PS. Ciao. Giuseppe
Thanks @richgalloway. I have a second part to this question. Can I use the manager node to do a deployment that replicates configurations (dashboards and reports) saved on site A to site B? Could this be done with the SHC deployer if I were to do a deployment on a weekly basis, for example?
Hi @richgalloway, thank you for the input. I want to go with modifying my alert query using a lookup file. I want to add the holiday dates to an Excel sheet and upload it to Splunk, but I don't understand how to frame the query with that. Below is my current query:

index=error-logs status=401
| stats count

Can you please help?
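A sketch of how the lookup could be wired in, assuming you save the Excel sheet as a CSV lookup named holidays.csv with one column, holiday_date, in YYYY-MM-DD format (all of these names are placeholders to adjust):

```spl
index=error-logs status=401
| eval event_date=strftime(now(), "%Y-%m-%d")
| lookup holidays.csv holiday_date AS event_date OUTPUT holiday_date AS matched_holiday
| where isnull(matched_holiday)
| stats count
```

When today's date appears in the lookup, the where clause drops every row, so the alert's trigger condition never fires on a listed holiday.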
If I have a multisite architecture with site A and site B, can they live on different cloud environments and still have index replication? For example, if I have site A components on Azure but site B is on AWS, can I still utilize index clustering across the two sites for replication?
My query returns many events, each in the form of a JSON object, i.e. { "key1": "val1", "key2": "val2" }. I would like to combine all events into one event that contains all the original events, using the sha256 of each original event as its key, so the new JSON will look like:

{
  sha256a: { "key1": "val1", "key2": "val2" },
  sha256b: { "key1": "val1a", "key2": "val2a" }
}

where sha256a is from | eval sha256a=sha256({ "key1": "val1", "key2":"val2"})
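Outside of SPL, the transformation being described can be sketched in Python; the script below is only an illustration of the desired output shape (hashing each event's canonical JSON serialization and using the hex digest as its key), not something Splunk itself runs:

```python
import hashlib
import json

def combine_events(events):
    """Combine a list of JSON-like event dicts into one dict keyed by
    the SHA-256 hex digest of each event's canonical JSON serialization."""
    combined = {}
    for event in events:
        # Serialize with sorted keys so the same event always hashes identically
        canonical = json.dumps(event, sort_keys=True, separators=(",", ":"))
        digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()
        combined[digest] = event
    return combined

events = [
    {"key1": "val1", "key2": "val2"},
    {"key1": "val1a", "key2": "val2a"},
]
print(json.dumps(combine_events(events), indent=2))
```

Sorting the keys before hashing makes each digest stable regardless of the key order in the original event.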
Splunk doesn't have a built-in feature to do that because Bad Things happen even on holidays. You can either modify the alert SPL to not trigger on certain days or disable them on those days.  The H... See more...
Splunk doesn't have a built-in feature to do that because Bad Things happen even on holidays. You can either modify the alert SPL to not trigger on certain days or disable them on those days.  The Holidays app (https://splunkbase.splunk.com/app/4853) may help with that.
@Roy_9 - This query (index="_internal" log_level=ERROR "test.csv") has nothing to do with the lookup itself. It seems you have an automatic lookup configured (in props.conf) that has a similar permission issue; that is what is causing the error.
Use cases do not need to be mapped from SSE to ES.  If there is an equivalent search in ES, then use that (after modifying it as necessary for your environment); otherwise, copy the SSE search into a new Correlation Search and modify it as necessary.
Hi @VatsalJagani, I am getting results when I run it inside the same app now. However, when I run the search index="_internal" log_level=ERROR "test.csv", I still see the error: The lookup table 'test.csv' requires a .csv or KV store lookup definition. Thanks
I have an alert, but I want to suppress it during holidays. How can I do that?
Hello everyone, I am trying to enable some basic detections that I found in the Splunk Security Essentials app. We do have ES; however, we are still in the process of getting all of our data CIM compliant. Do alerts from the Splunk Security Essentials app need to be mapped to ES using the "add mapping" option? Or do these basic alerts have an equivalent in the ES Content Management use cases tab?