"I need help with this XML for a dashboard; essentially, I need to call a token that modifies data within a report, having already created the token with the name 'data.' How can I do this?" <for...
See more...
"I need help with this XML for a dashboard; essentially, I need to call a token that modifies data within a report, having already created the token with the name 'data.' How can I do this?" <form version="1.1"> <label>Lista IP da bloccare</label> <fieldset submitButton="true" autoRun="false"> <input type="time" token="data"> <label></label> <default> <earliest>rt-24h</earliest> <latest>rt</latest> </default> </input> </fieldset> <row> <panel> <table> <search ref="checkpoint1"></search> <option name="drilldown">none</option> </table> </panel> </row> </form>
Hey, does anyone know what the updated code for the v2 endpoint is? I am trying to deploy apps with this config to Splunk Cloud, but the vetting fails because the new version of Splunk doesn't support the old endpoint. I tried adapting the line of code for the button using the documentation, but it doesn't download anything and redirects to a 404 page.
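For reference - an assumption on my part, since the post doesn't show the actual line - recent Splunk versions moved the search job endpoints under a v2 path, so a download/export URL typically changes along these lines:

/services/search/jobs/export        (assumed old path the button calls)
/services/search/v2/jobs/export     (v2 replacement introduced in Splunk 9.x)

Sid-based results URLs follow the same pattern (search/jobs/<sid>/results becomes search/v2/jobs/<sid>/results).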
1. Please post your data/code samples in a pre-formatted way (using either the "preformatted" style or the code sample control in the editor). It makes the sample easier to read.
2. It's not clear what you want to get from this data.
3. Unless you have a very good reason and a strong use case, you should not parse data into fields while indexing (in other words, create indexed fields). Most parsing in Splunk is done at search time.
4. Unless you have a very, very good reason (an even better one than for the indexed fields), you should not use SHOULD_LINEMERGE=true. It gives you a huge performance hit. See the sketch below for the usual alternative.
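As a hedged illustration (the sourcetype name and the extraction regex are assumptions, not taken from the post), a typical props.conf that avoids both pitfalls - single-line breaking without SHOULD_LINEMERGE, and search-time extractions instead of indexed fields - could look like:

[my_sourcetype]
# break events on newlines instead of merging lines back together
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TIME_PREFIX = ^
MAX_TIMESTAMP_LOOKAHEAD = 30
# search-time field extraction instead of indexed fields
EXTRACT-kvpairs = (?<name>\w+)=(?<value>\S+)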
We have exactly the same problem here. Tested today on Windows 2016/2019 - UF update from 9.1.1 to 9.1.3. But a new installation is out of the question for us, as you lose all checkpoints and a re-read of all data is the result.
If I understand you correctly, you want the settings you defined on your DS to propagate to forwarders across your environment (or at least to some designated UF(s)). You did the first step correctly - you created an app in etc/deployment-apps (I hope the "deploymentapps" in your description is just a typo). But now you have to define a server class tying app(s) to deployment client(s) and reload the deployment server. See the https://docs.splunk.com/Documentation/Splunk/latest/Updating/Aboutdeploymentserver document (read thoroughly the pages about creating server classes and deploying apps), and the sketch below.
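A minimal sketch of serverclass.conf on the DS (class, host pattern, and app name are hypothetical):

[serverClass:my_uf_class]
whitelist.0 = my-uf-host*

[serverClass:my_uf_class:app:my_app]
restartSplunkd = true

Then make the DS pick it up with: splunk reload deploy-server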
Answers to most of those questions can be found in the Splunk documentation. Some of them require a bit of experience with the system. Knowing the docs plus hands-on experience adds up to the product knowledge and abilities required to work with the product. Unless it's a very entry-level position where you're expected to learn everything from scratch (but for such work you shouldn't be getting such questions in an interview), you should know this before trying to administer Splunk environments. Otherwise you can do some very costly damage to your (potential) employer's installation.
It does sound like a very peculiar use case - maybe not even well suited to searching Splunk directly. You should definitely engage a Splunk Consultant to talk over your needs; maybe you need some form of middleware, or a completely different approach to data access.
Check the add-on ASN Lookup Generator: https://splunkbase.splunk.com/app/3531

First, build the lookup:

| asngen
| table ip asn autonomous_system
| outputlookup asn

And then:

source="yourdata"
| iplocation youriptable
| table youriptable, City, Country
| lookup asn ip AS youriptable
If you want to track changes per id, you can do something like

| streamstats window=1 current=f values(*) as previous_* by id

This will give you the values from the previous event with a "previous_" prefix. Then you can use foreach to compare them, as sketched below.
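A minimal sketch of the comparison (field names a, b, c are hypothetical; with hundreds of fields you'd list them, or remember that foreach * would also iterate over the previous_* copies):

... | streamstats window=1 current=f values(*) as previous_* by id
| foreach a b c
    [ eval changed=if('<<FIELD>>' != 'previous_<<FIELD>>',
        mvappend(changed, "<<FIELD>>"), changed) ]
| where isnotnull(changed)
| table _time id changed

Fields that are null in either event are skipped by the != test, so the first event per id reports nothing.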
Hello, I'm looking for your insights to pinpoint changes in fields over time. Events are structured with a timestamp, an ID, and various fields. I'm seeking advice on constructing a dynamic timeline to identify altered values and the corresponding fields. Example events below:

10:20:30 25/Jan/2024 id=1 a=1534 b=253 c=384 ...
10:20:56 25/Jan/2024 id=1 a=1534 b=253 c=385 ...
10:20:56 25/Jan/2024 id=2 a="something" b=253 c=385 ...
10:21:35 25/Jan/2024 id=2 a="something" b=253 c=385 ...
10:22:56 25/Jan/2024 id=2 a="xyz" b="-" c=385 ...

Desired result format:

10:20:56 25/Jan/2024 id=1 changed field "c"
10:22:56 25/Jan/2024 id=2 changed field "a", changed field "b"

My pseudo SPL to find changed events:

... | streamstats reset_on_change=true dc(*) AS * by id | foreach * [ ??? ]

With hundreds of fields per event, I'm seeking an efficient method - considering a combination of streamstats, foreach, transaction or stats. Insights appreciated.
Hi. If/when you have millions of customers, each with their own datasets which only that customer should see/use, you have quite an interesting challenge. I really propose that you ask for help from a Splunk Partner in your local area; they could ask Splunk for more help to figure out whether there is any reasonable way to do this. r. Ismo
Hi, this app is targeted at the manufacturing industry, where they have telemetry and similar data from machines. You could probably generate the needed data from Splunk too, but I suppose you will get it much more easily with Splunk's introspection data and e.g. the standard Splunk command predict. You will get enough information even without MLTK - just look at the current _introspection index and use predict, trend lines, etc. (see the sketch after this post). r. Ismo
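A minimal sketch of that approach (the component and field names are standard introspection resource-usage fields, but treat them as assumptions to verify against your own _introspection data):

index=_introspection sourcetype=splunk_resource_usage component=Hostwide
| timechart span=10m avg(data.cpu_system_pct) as system_cpu
| predict system_cpu future_timespan=36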
Hi. Here is Splunk's InfoSec app, which you can look at to decide which panels are important to you. Based on that, you can check which logs are needed to fulfill them. https://splunkbase.splunk.com/app/4240 It is very easy to set up and use, but you will still get a lot of information about what is happening. r. Ismo
Hi. Have you configured the Add-On or just tried to add new inputs? Usually you must first create a connection to your Azure and the correct organization. The Azure organization is your own Azure DevOps organization, whatever it is - not the tenant, e.g., in Entra ID. Then you must create your own security token under that DevOps organization/project, which has enough access to get that information.

Then it seems that those Python scripts have not been tested enough. As all Python libraries are under a separate lib directory (correctly), not in the bin directory, the lib directory must be added to the path in those scripts; otherwise the needed libraries cannot be found. I fixed it on my test env like this:

import os
import sys

# make the app's lib/ directory importable from scripts in bin/
APP_LIB_FOLDER = os.path.join(os.path.dirname(__file__), "..", "lib")
sys.path.insert(0, APP_LIB_FOLDER)

Just add those lines before the script tries to import e.g. solnlib. That fix needs to be applied to all the files which are used directly from the Splunk side. r. Ismo
Hi. Your best (and probably only) option is to contact your Splunk account manager and ask for PS support for this case. At least they could do an analysis of your environment and check whether it meets what is required. They could also create a plan for how this situation could be fixed asap. r. Ismo
Hi. If you are looking for a workplace where you need to know the answers to those questions, I strongly propose that you start your learning path toward Splunk Certified System Admin, to understand enough about what Splunk is and how it works. r. Ismo
Hi, just like @richgalloway said. One comment about your "restart ds": it's not needed to restart it, just reload its configuration for the deployment part with the command

splunk reload deploy-server

You can even add more granularity there if you have a lot of configurations and a restart, or even a full reload, takes too long (see the example below). r. Ismo
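A hedged example of that granularity (the server class name is hypothetical) - reload only one server class instead of everything:

splunk reload deploy-server -class my_serverclass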
Hi. I suppose that you have a UF which gets its configurations from the DS, and then you have a distributed SH + indexer(s), but no HFs. Is this a correct assumption? If so, you should deploy inputs.conf and outputs.conf to the UF from the DS, as you probably have done, since you are getting events into the indexer(s). As those transforms.conf and props.conf didn't work, I assume that you haven't installed them on the indexer(s)? Based on these assumptions, you should create a new app which contains those transforms.conf and props.conf and install it on the indexer(s). Then do a restart and check if it's working (a sketch of such an app follows below). Anyhow, you should do this kind of onboarding on a separate instance, like your workstation: there, just ensure that your configurations are working, and then install them into production. r. Ismo
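A minimal sketch of such an app (all names are hypothetical, and the stanza shown is just one common pattern - dropping DEBUG events - not the actual parsing rules from the post):

$SPLUNK_HOME/etc/apps/my_parsing_app/
    default/app.conf
    default/props.conf
    default/transforms.conf

default/props.conf:

[my_sourcetype]
TRANSFORMS-drop_debug = drop_debug

default/transforms.conf:

[drop_debug]
REGEX = \bDEBUG\b
DEST_KEY = queue
FORMAT = nullQueue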