@nopera I recommend installing the add-on on both the indexers and the search heads. Indexers are responsible for index-time operations such as parsing, data transformation, and routing, so any add-on containing props.conf or transforms.conf settings that apply at index time should be deployed to the indexers. Search heads handle search-time functions, including dashboards, lookups, macros, and CIM mappings. Installing the add-on on the search heads for search-time functionality is safe and won't interfere with index-time processing, provided those configurations are also present on the indexers. In general, best practice is to install the add-on across all relevant tiers (indexers, search heads, and forwarders) and enable only the components each system's role requires. https://docs.splunk.com/Documentation/AddOns/released/Overview/Wheretoinstall
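As a hedged illustration of the split (the stanza name, patterns, and transform name below are invented, not from a real add-on): index-time settings in props.conf only take effect where parsing happens, while search-time settings take effect where searches run:

```
# props.conf -- illustrative stanza only
[my:custom:sourcetype]
# Index-time: applied during parsing, so it must be on indexers (or heavy forwarders)
LINE_BREAKER = ([\r\n]+)
TRANSFORMS-route = route_to_other_index
# Search-time: applied when searching, so it must be on search heads
EXTRACT-user = user=(?<user>\S+)
```

This is why the same add-on is typically deployed to both tiers: each tier simply ignores the settings that do not apply to its role.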
@nopera If you are using indexers (or a standalone Splunk Enterprise instance), follow these steps:
1. Deploy the TA-Exchange-Mailbox add-on to the indexer at /opt/splunk/etc/apps/TA-Exchange-Mailbox.
2. Restart the Splunk service on the indexer to apply the changes.
3. On the Universal Forwarder, verify that inputs.conf is correctly configured with the appropriate sourcetype for message tracking logs.
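On the Universal Forwarder side, the message tracking input is typically a monitor stanza along these lines. This is a sketch only: the log path, sourcetype, and index below are assumptions, so check the inputs.conf shipped with your version of TA-Exchange-Mailbox for the exact values:

```
# inputs.conf on the Universal Forwarder -- illustrative values, verify against your add-on
[monitor://C:\Program Files\Microsoft\Exchange Server\V15\TransportRoles\Logs\MessageTracking\*.log]
sourcetype = MSExchange:2013:MessageTracking
index = exchange
disabled = 0
```

If the sourcetype here does not match the stanza the props.conf on the indexer expects, the parsing rules never fire, which produces exactly the "logs arrive but are not parsed" symptom.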
@kiran_panchavat I don't use a heavy forwarder. I installed a Universal Forwarder on the Exchange server (the server is in the Mailbox role) and placed the "TA-Exchange-Mailbox" add-on in "C:\Program Files\SplunkUniversalForwarder\etc\apps". I am now getting the logs, but the message tracking logs aren't parsed correctly. What should I do now? Example logs below are from the test environment.
Did you get an answer to this? Can you share the resolution you found?
Thanks, "AND" is uppercase in both examples, but the issue persists. I followed your suggestion and checked the search job properties; the eventSearch changes to:

index=my_index System="MySystem*" (Title=A OR Title=B OR Title=C OR Title=D OR Title=E OR (Title=F FROM=1) OR (Title=G FROM=2))

Still not working via REST, unfortunately.
Thank you Prewin, that has worked.
@Showkat_CT Splunk SOAR Cloud is a managed SaaS offering, so you need a Splunk subscription or trial with SOAR enabled. You'll receive a dedicated SOAR Cloud instance and login credentials from Splunk. If you haven't yet, reach out to your Splunk account team or open a support case to get your Cloud environment provisioned. https://help.splunk.com/en/splunk-soar/soar-cloud/administer-soar-cloud/introduction-to-splunk-soar-cloud/administer-splunk-soar-cloud You'll receive a URL (like https://<your-org>.soar.splunkcloud.com) plus a default admin email and temporary password. https://help.splunk.com/en/splunk-soar/soar-cloud/administer-soar-cloud/introduction-to-splunk-soar-cloud/take-a-tour-of-splunk-soar-cloud-and-perform-product-onboarding-when-you-log-in-for-the-first-time
Hello All, Can anyone tell me how I can access Splunk SOAR for Cloud?
Thanks, tried to filter downstream without success, unfortunately. I am using URL encoding.
CPU bottleneck.
Hi @verbal_666, I tried parallelPipelines=4 but went back to 2: indexing throughput was better with 4, but searches became slower. Ciao. Giuseppe
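For anyone following along, the setting being discussed lives in server.conf on each indexer; a minimal sketch (the value is workload-dependent, and each extra pipeline roughly multiplies the CPU and memory footprint of ingestion, which is consistent with the search slowdown described above):

```
# server.conf on the indexer -- illustrative fragment
[general]
parallelIngestionPipelines = 2
```

A Splunk restart is required for the change to take effect.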
@PrewinThomas @tej57 I have tried implementing the same, but on each redirection the dashboard loads only the default value defined on the target dashboard.

My redirecting dashboard link:
https://host:8000/en-US/app/app_name/dashboard_name?form.time.earliest=2025-03-01T00:00:00.000&form.time.latest=now

I also tried removing the default value:

{
  "type": "input.timerange",
  "options": {
    "token": "passed_time"
  },
  "title": "Global Time Range"
}

In this scenario the token is not passed and the panels show "waiting for input". I validated this by capturing the tokens in an "ADD TEXT" field: the token shows its value there, but the panels still show "waiting for input". I also tried a different default value for the input, with the same result:

"inputs": {
  "input_global_trp": {
    "type": "input.timerange",
    "options": {
      "token": "passed_time",
      "defaultValue": "$passed_time.latest$,$passed_time.earliest$"
    },
    "title": "Global Time Range"
  }
},

I also removed the whole input and only captured the token at:

"defaults": {
  "dataSources": {
    "ds.search": {
      "options": {
        "queryParameters": {
          "latest": "$passed_time.latest$",
          "earliest": "$passed_time.earliest$"
        }
      }
    }
  }
}

I'm puzzled why my token values are not passed to the data source but show fine in the text box. Kindly advise.
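One detail worth checking against a minimal sketch (the input id and default below are invented for illustration): Dashboard Studio URL parameters must use the input's token name, so with a token called passed_time the link needs form.passed_time.earliest/form.passed_time.latest rather than form.time.*, and defaultValue should be a literal "earliest,latest" pair rather than a reference to the token itself:

```json
{
  "inputs": {
    "input_global_trp": {
      "type": "input.timerange",
      "options": {
        "token": "passed_time",
        "defaultValue": "-24h@h,now"
      },
      "title": "Global Time Range"
    }
  },
  "defaults": {
    "dataSources": {
      "ds.search": {
        "options": {
          "queryParameters": {
            "earliest": "$passed_time.earliest$",
            "latest": "$passed_time.latest$"
          }
        }
      }
    }
  }
}
```

With a definition like this, a redirect of the form https://host:8000/en-US/app/app_name/dashboard_name?form.passed_time.earliest=2025-03-01T00:00:00.000&form.passed_time.latest=now should populate both the picker and the searches.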
Hi @livehybrid, Thanks for providing the information. Using rsync, I tested with an All-in-One instance and it is working. But in a situation where a cluster is involved for the indexer/search head tier and it sits in a separate network/location, will the same method work, or is there another method to follow?
Hi @isoutamo, Thanks for providing the articles. For All-in-One, I tested using rsync and everything went quite smoothly. But in a situation where a cluster is involved for the indexer/search head tier and it sits in a separate network/location, will the same method work, or is there another method to follow?
Trying a different token name on both dashboards doesn't work either.
Hi, The Splunk OTel package has been updated. During the update, the existing configuration file was renamed to *.newrpm and a new default configuration file was created in its place. I renamed the saved *.newrpm file back to *.yaml and restarted the service successfully. Thanks for your help. Olivier
First, I suspect that you meant the input looks like (one row, with key and values as multivalue fields):

key             values
AdditionalInfo  user has removed device with id "alpha_numeric_field" in area "alpha_numeric_field" for user "alpha_numeric_field".
DeviceID        alpha_numeric_field
DeviceType      mobile_device
OS              Windows

Second, I have a question about the origin of "key" and "values". Could they come from a structure such as JSON? Maybe there is a better opportunity earlier in the pipeline than at the end of processing.

Third, I suspect that you meant "the output would be" one row with four separate fields:

AdditionalInfo  user has removed device with id "alpha_numeric_field" in area "alpha_numeric_field" for user "alpha_numeric_field".
DeviceID        alpha_numeric_field
DeviceType      mobile_device
OS              Windows

Finally, if your Splunk is 8.1 or later, you can use JSON functions and the multivalue mode of foreach to do the job:

| eval idx = mvrange(0, mvcount(key))
| eval keyvalue = json_object()
| foreach idx mode=multivalue
    [eval keyvalue = json_set(keyvalue, mvindex(key, <<ITEM>>), mvindex(values, <<ITEM>>))]
| spath input=keyvalue
| fields - idx key values keyvalue

Here is an emulation for you to play with and compare with real data:

| makeresults format=csv data="key,values
AdditionalInfo,user has removed device with id \"alpha_numeric_field\" in area \"alpha_numeric_field\" for user \"alpha_numeric_field\".
DeviceID,alpha_numeric_field
DeviceType,mobile_device
OS,Windows"
| stats list(*) as *
``` data emulation above ```
@seetide Is this what you are trying to achieve?
- Ignore events where "NONE" appears only in allowed fields (e.g., ALLOWED1, ALLOWED2, ALLOWED3).
- Include events where "NONE" appears in any other field, even if it also appears in allowed fields.
It would be great if you could post some examples.
Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving Karma. Thanks!
@vishalduttauk In a regular search, RecipientAddress is extracted at search time, so you can use it directly in eval. In Ingest Actions, however, you're working with the raw event stream before field extractions happen. As a workaround, you can drop events that contain this email address with:

NOT match(_raw, "splunk\.test@test\.co\.uk")

Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving Karma. Thanks!
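A quick illustration of why the dots in that pattern are escaped (the sample raw events below are invented): an unescaped "." matches any single character, while the escaped form matches only the literal address, so the filter drops exactly the intended events.

```python
import re

# Same regex as in the Ingest Actions condition above
pattern = re.compile(r"splunk\.test@test\.co\.uk")

raw_hit = 'ts=2025-01-01 RecipientAddress=splunk.test@test.co.uk Subject=hello'
raw_lookalike = 'ts=2025-01-01 RecipientAddress=splunkXtest@testXcoXuk Subject=hello'

print(bool(pattern.search(raw_hit)))        # True  -> dropped by NOT match(...)
print(bool(pattern.search(raw_lookalike)))  # False -> kept
```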
@tomapatan Can you try the below?

search_query = '''
search index=my_index System="MySystem*" (Title=A OR Title=B OR Title=C OR Title=D OR Title=E OR Title=F OR Title=G)
| eval include=if((Title="F" AND FROM="1") OR (Title="G" AND FROM="2") OR match(Title, "^[ABCDE]$"), 1, 0)
| where include=1
'''

Note: since you are using Python, make sure you URL-encode the query. Without encoding, the API may misinterpret or strip characters.
Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving Karma. Thanks!
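A minimal sketch of the encoding step (the SPL mirrors the query discussed in this thread; the endpoint name follows the standard Splunk REST search API): the query has to be form-encoded so quotes, parentheses, and spaces survive the HTTP layer. Libraries such as requests do this automatically when you pass data={...}, so the issue usually appears only when building the request body by hand.

```python
from urllib.parse import urlencode

search_query = (
    'search index=my_index System="MySystem*" '
    '(Title=A OR Title=B OR (Title=F AND FROM=1) OR (Title=G AND FROM=2))'
)

# Quotes become %22, parentheses %28/%29, spaces '+', etc.
body = urlencode({"search": search_query, "output_mode": "json"})
print(body)  # body is now safe to POST to /services/search/jobs
```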