Thanks for the response. Maciek Stopa on the Splunk Slack workspace provided this code, which has worked well. It doesn't handle the metrics and sc4s events, but I handled those in splunk_metadata.csv.

# cp example_postfiler.conf /opt/sc4s/local/config/app_parsers/
# systemctl restart sc4s
block parser app-dest-rewrite-index() {
    channel {
        rewrite {
            r_set_splunk_dest_update_v2(
                index("${.splunk.index}_unique-suffix")
            );
        };
    };
};

application app-dest-rewrite-index[sc4s-postfilter] {
    parser { app-dest-rewrite-index(); };
};
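For reference, the metrics and sc4s internal events mentioned above are typically redirected via splunk_metadata.csv overrides rather than a postfilter. A sketch of what that file could contain (the key names and index values here are assumptions; check the keys your SC4S version actually uses):

```
# /opt/sc4s/local/context/splunk_metadata.csv
# format: <key>,<metadata field>,<value>
sc4s_events,index,main_unique-suffix
sc4s_metrics,index,em_metrics_unique-suffix
```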
1. This is not valid SPL. Please post your literal search in a code block or preformatted paragraph.
2. What do you mean by "unable to work"? What results are you getting?
I am trying to get the value of a field from a previously scheduled saved search into a new field using loadjob, but I am unable to get it to work. I am using something like:

index=my_pers_index sourcetype=ACCT
| eval userid = [| loadjob savedsearch="myuserid:my_app:my_saved_search" | return actor]

where:
myuserid - the owner id
my_app - the application name
my_saved_search - the name of the saved search, which is present in savedsearches.conf and is scheduled
actor - a field name in my_saved_search
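One common cause of this failing: a subsearch with `| return actor` emits the search term actor="value", while eval needs a quoted literal value. A commonly used workaround (reusing the names from the search above) wraps the value in quotes before returning it:

```
index=my_pers_index sourcetype=ACCT
| eval userid = [| loadjob savedsearch="myuserid:my_app:my_saved_search"
    | head 1
    | eval q = "\"" . actor . "\""
    | return $q]
```

Here `return $q` outputs just the value of q (without a field name), which the outer eval then treats as a string literal.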
Hi, I need help as I'm new to Splunk and was assigned this role by a resigned administrator. I need to find out the expiration date of our Splunk Cloud license. I can't access the menu: Settings > System > Permissions (I've checked the assigned user's capabilities but there are no permissions for license_edit, license_read, license_tab, or license_view_warnings). Is it possible that there's a license manager somewhere on-premise since we're using Heavy Forwarder and Deployment Server? Please advise on what I can do if I can't contact the previous administrator. What information should I gather beforehand to be ready to open a Support Ticket? I apologize if I've misunderstood anything.
Hi Anup.Thatte
I gathered some AppDynamics documentation that could help you with the uninstallation. See the following documentation for detailed uninstallation steps for each agent:
Uninstalling AppDynamics Machine Agent https://docs.appdynamics.com/appd/24.x/latest/en/infrastructure-visibility/machine-agent/administer-the-machine-agent/uninstall-the-machine-agent
Uninstalling AppDynamics Java Agent https://docs.appdynamics.com/appd/22.x/22.12/en/application-monitoring/install-app-server-agents/java-agent/administer-the-java-agent/uninstall-the-java-agent
Uninstalling AppDynamics ABAP App Agents and Datavard Transports/Now:SNP Crystal Bridge https://docs.appdynamics.com/sap/en/upgrade-or-uninstall-the-solution (see Uninstall the ABAP Agent and SNP CrystalBridge® Monitoring)
Hope this helps.
Regards,
Martina
Hi @pavi.p ,
I have answered a similar question in this post; kindly check my answer below, it might help you resolve your issue. :)
https://community.appdynamics.com/t5/NET-Agent-Installation/Couldn-t-get-MSMQ-entry-point-for-NET-Consumer-Application/m-p/54299/highlight/true#M1619
Hope this helps.
Regards,
Martina
Yes, copying data from one index to another, in the same or different indexer, will consume license (except for sourcetype=stash). That said, I would expect you to use 600GB of license so the extra 200GB came from somewhere else, probably unexpected data sources. Use the Monitoring Console to show license usage over time to see when the increase started. Then split the results by source or sourcetype to see which is responsible for the increase.
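Besides the Monitoring Console, the same breakdown can be done with a raw search over the license manager's internal logs. A sketch (run on or against the license manager; the st field is the sourcetype recorded in license_usage.log):

```
index=_internal source=*license_usage.log type=Usage
| eval GB = b/1024/1024/1024
| timechart span=1d sum(GB) AS GB_used BY st
```

Swap `BY st` for `BY idx` or `BY s` to split by index or source instead, and look for the date when a sourcetype's daily volume jumps.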
Hello @Jonathan.Wang,
To install the demo profile Controller, 50 GB (51200 MB) of disk space is required. Please make sure to allocate sufficient disk space. Alternatively, if you need to skip the disk space requirement, you can update the configuration file.
Steps:
Locate the Configuration File:
For the demo profile controller, the file is likely in the path: platform/platform-admin/archives/controller/<ver>/playbooks/controller-demo.groovy
Edit the Disk Space Property:
Update the property "controller_data_min_disk_space_in_mb" to a lower disk space number. For example: controller_data_min_disk_space_in_mb = 2000 * 1024
Note: This manual configuration is intended for test purposes only and may not be suitable for production environments.
Additional Information:
For more details on Controller system requirements and settings, please refer to the AppDynamics Documentation.
Hope this helps.
Regards, Martina
Using the TERM() directive in a search can dramatically improve speed, as it avoids scanning the raw data: TERM(preprod) means Splunk does not have to extract the fields and compare values from the events. Question: are you using JSON INDEXED_EXTRACTIONS?

The NOT search will be expensive. Depending on what proportion of events contain mon-tx- in the data, you may find it beneficial to do that filtering in a subsequent | where clause instead.

You could also do (http_status=5* AND (TERM(500) OR TERM(501) OR ... )), i.e. include all the 5xx codes you want, if you know what they can be.

But it may be that the rest of your search is where some of the performance problems lie - can you share a bit more of the search?
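Putting those two suggestions together, a sketch of the shape described above (index, the exact 5xx codes, and the field being filtered are assumptions for illustration):

```
index=my_index (TERM(500) OR TERM(502) OR TERM(503)) http_status=5*
| where NOT like(_raw, "%mon-tx-%")
```

The TERM() clauses prune events cheaply at the lexicon level, http_status=5* confirms the match after field extraction, and the expensive negative match is deferred to the where clause.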
Aside from the comments from @richgalloway that you need a header: if you are just looking to ingest those rows and do something with that info, then | inputlookup dummy.csv will read the rows into your pipeline. As the data has 3+ columns, and the first contains DayOfWeek,DD/MM/YYYY (because the entire entry is quoted), you will have to extract the date with the rex command if you need it. The second column will automatically take the field name defined by the header. Without a header, you will lose the first line.
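For illustration, the rex extraction could look like this (the field name column1 is a placeholder for whatever your header names the first column):

```
| inputlookup dummy.csv
| rex field=column1 "^(?<DayOfWeek>\w+),(?<Date>\d{2}/\d{2}/\d{4})"
```

This splits the quoted "DayOfWeek,DD/MM/YYYY" value into two separate fields.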
Ok, between your commentary and my rewrite of the documentation, I got this working. I will post my rewrite of the Splunk instructions and the confs ASAP. Wish I could attach docs.
As @marnall says, you are using the token differently in each part of the search. How have you defined the multiselect prefix/suffix settings? You are using the syntax $opc_t|s$ correctly, which will cause it to be quoted, so you don't need to surround it with extra quotes as in the other example. However, as you are able to define the token prefix/suffix and value prefix/suffix, you generally just need to use $opc_t$.

Let's assume your multiselect has this type of definition:

<prefix> IN (</prefix>
<suffix>)</suffix>
<valuePrefix>"</valuePrefix>
<valueSuffix>"</valueSuffix>
<delimiter>, </delimiter>

The token prefix is IN (, each value is quoted by valuePrefix/valueSuffix and delimited with a comma, and the final token is terminated with ), so your token would look like IN ("a","b","c","d"). You would then use it like this:

... opc=$opc_t$ ...
OR
... ocp=$opc_t$ ...

because you have not included the field name in the token value itself.
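Putting the settings above together, a minimal multiselect definition might look like this (the label and choice values are illustrative):

```xml
<input type="multiselect" token="opc_t">
  <label>OPC</label>
  <prefix> IN (</prefix>
  <suffix>)</suffix>
  <valuePrefix>"</valuePrefix>
  <valueSuffix>"</valueSuffix>
  <delimiter>, </delimiter>
  <choice value="a">a</choice>
  <choice value="b">b</choice>
</input>
```

With a and b selected, $opc_t$ expands to IN ("a", "b"), ready to drop after a field name in the search.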
@elend You can pass tokens to a report by running the report with the savedsearch command and passing the values to it - this assumes the report is set up to have replaceable parameters; see the comments about replacement on this page: https://docs.splunk.com/Documentation/SplunkCloud/9.1.2312/SearchReference/savedsearch

So you would run this in your dashboard to run the report:

| savedsearch report_name value=$token|s$

where $token$ is the token you are passing to the report and it is assigned to the replaceable parameter $value$ in your report.
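For illustration, the report side of that wiring might look like this in savedsearches.conf (stanza name, index, and field are assumptions):

```
[report_name]
search = index=my_index user=$value$
```

Running | savedsearch report_name value="jsmith" from the dashboard then substitutes jsmith for $value$ before the report's search executes.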
This is great, thank you. I have struggled with eventstats in the past, and this is the first time I can remember seeing a use case for it that made sense to me.
Sweet. I rewrote the "if 0 events" alert as "if there were events but now there are none". This way, I suppose the alert should be set to trigger on 1 event? If the initial search returns 0 events, then there will be 1 generated event with the field "description". Thanks.
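If it helps, the usual shape for generating exactly one placeholder event when the base search returns nothing is (base search and message are illustrative):

```
index=my_index sourcetype=my_st
| stats count
| where count=0
| eval description="No events found in the search window"
```

stats count always emits one row, so when the base search is empty you get a single result carrying the description field, and the alert can simply trigger on "number of results greater than 0".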
I'm not 100% sure it will help, but TA_nix is not meant for auditd logs. There are at least two different add-ons on Splunkbase specifically for auditd logs.
@PickleRick That was the issue. I was only pushing to the UF and not the indexers. Sometimes I forget that props.conf has parts that go to the indexers and parts that go to the search heads.
Little late on this, but yes, I just tried this with a kvstore and it fails under 9.1.4. If I export a subset of my kvstore data to a CSV, it's fine. No mv fields in my kvstore data, either.
This can have multiple solutions depending on your data parameters (especially cardinality). Your main problem (performance-wise) will be the second condition, because how can you find something that's not there? You have to list everything that is there and compare it with what you get from the first condition. With a small result set that's relatively quick, but with a big one - not so much.

Also:
1. Do you have any fields extracted from your events?
2. In the second type of events, is there a space between ipaddress: and the actual address, or not?
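For reference, one common shape for this kind of "seen in the first condition but not the second" comparison is a single search over both event types followed by stats (the event filters, the rex, and the field names here are all assumptions about your data):

```
index=my_index ("first condition text" OR "ipaddress:")
| eval src=if(searchmatch("ipaddress:"), "second", "first")
| rex "ipaddress:\s?(?<ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
| stats values(src) AS seen BY ip
| search NOT seen="second"
```

The \s? in the rex covers both the with-space and without-space variants from question 2 above; the final search keeps only IPs never seen in the second event type.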