All Posts


@ITWhisperer Thanks to you. I have an issue: I need to use the same regex on two different fields, but it throws an error when I run the below query:
| inputlookup remediation.csv
| stats count by knowbe4, solution
| rex field=knowbe4 mode=sed "s/<\/?\w+.*?\/?>//g"
rex field=solution mode=sed "s/<\/?\w+.*?\/?>//g"
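The error is most likely the missing pipe before the second rex: every command in an SPL pipeline needs its own | separator. A minimal corrected sketch (lookup and field names taken from the post above):

| inputlookup remediation.csv
| stats count by knowbe4, solution
| rex field=knowbe4 mode=sed "s/<\/?\w+.*?\/?>//g"
| rex field=solution mode=sed "s/<\/?\w+.*?\/?>//g"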
What are the various methods to integrate 3rd-party SaaS applications with Splunk?
Hi @m92,
with the table command you have many events with the same srcip and different _time. Do you want different lines if you have different _time? If yes, you can add _time to the BY clause:
(index="index1" Users=* IP=*) OR (index="index2" tag=1)
| regex Users!="^AAA-[0-9]{5}\$"
| eval IP=if(match(IP, "^::ffff:"), replace(IP, "^::ffff:(\d+\.\d+\.\d+\.\d+)$", "\1"), IP)
| eval ip=coalesce(IP,srcip)
| stats dc(index) AS index_count values(Users) AS Users BY ip _time
| where index_count>1
| table Users, ip, _time
Even so, you could have different _time values in the two indexes, so it will be difficult to group by _time.
Ciao.
Giuseppe
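If the timestamps from the two indexes are close but not identical, one way to make grouping by time workable (a sketch, assuming a 5-minute tolerance is acceptable) is to bucket _time before the stats:

| bin _time span=5m
| stats dc(index) AS index_count values(Users) AS Users BY ip _time

Events from both indexes that fall within the same 5-minute window then end up in the same group.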
@PickleRick Thank you very much for your reply. If I understand what you said correctly, as shown in the picture below, the summary range refers to the period for which summary data is preserved, and the backfill range refers to the period over which summarization is launched. Also, can I understand the fact that there is data physically outside the actual summary range as being because the summary data is stored in buckets and rolled in bucket units?
Hi PickleRick,
Thanks for your answer.
1. Is there such a thing as a support service? I just posted this hoping to get some kind of help, wherever it comes from. If there is online support, I'd appreciate it if someone could tell me how to contact them.
2. Yes, the environment is old. Strangely enough, the only problem I'm having is with the 9.2.1 forwarder. The old 6.x one is OK. And the indexers (7.3.3) are a bit old, but they do work. Yes, I have also verified that the configs are exactly the same. For that reason, I was wondering whether 9.2.1 required something new or different in the config.
Thanks
Hi @AtherAD,
We're deploying UBA and we had an installation on Red Hat 8.8 with some packages from 8.9, and it didn't run! Splunk Support confirmed that (with the present UBA release) all packages must be the ones in the certified release, e.g. Red Hat 8.8 without any update to later releases. And you must block all updates, otherwise the installation will stop running.
Ciao.
Giuseppe
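On RHEL 8, one common way to pin packages so routine updates cannot pull them forward (a sketch, assuming the dnf versionlock plugin is acceptable in your environment) is:

# install the versionlock plugin
dnf install python3-dnf-plugin-versionlock
# lock the currently installed version of every package reported by rpm
dnf versionlock add $(rpm -qa)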
Depends on what you want to "integrate". Do you want to collect events generated by your web app/web server? Do you want to collect metrics about your server? Do you want to embed reports from Splunk on your website? Do you want to be able to perform some action on your Splunk environment from your web app? Something else?
1. This is not the Splunk Support service. This is a volunteer-driven community.
2. Your environment is really old.
3. We don't know what forwarder you're using (UF/HF), or what configuration you have for your inputs and sourcetypes/sources/hosts on your components.
So the only answer you can get at the moment is "something's wrong". But seriously, unless it's a forwarder which is built into some (presumably also obsolete) solution, you should definitely update this 6.4 UF to something less ancient.
As a side note, how have you verified that the configs on those forwarders are "the same"?
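One reliable way to compare effective configuration (a sketch; run on each forwarder and diff the output) is btool, which prints the merged settings together with the file each one comes from:

$SPLUNK_HOME/bin/splunk btool inputs list --debug
$SPLUNK_HOME/bin/splunk btool outputs list --debug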
| rex field=_raw mode=sed "s/<\/?\w+.*?\/?>//g"
I have the same issue and no solution.
OK. Right from the start there are some things that can be improved:
| table host
| dedup host
While in this particular case it might not make such a difference, it's worth remembering that the table command moves processing to the search-head layer, so it's best avoided until you really need to transform your data into a table for presentation. I'd do
| stats values(host) as host
| mvexpand host
instead.
The annotate=t part isn't really needed either, as you only want to set one field.
I'm not sure what you're trying to do with this line:
| eval multiplehost=mvjoin(host, ", ")
I suppose you want it to work differently than it does. You can't "reach" into other result rows with the eval command. So you either need to combine your results into a multivalued field, or maybe transpose your results and do a foreach. But this one will not work.
Also, unless you have one of each host (not just one "dummy") in the appended part, you won't detect the failed ones.
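A minimal sketch of the multivalued approach (reusing the index and message filter from the original search): collapse all failing hosts onto a single row first, so mvjoin can see them all at once:

index=myindex message=" failed*"
| stats count, values(host) as host
| eval status=if(count=0, "UP", "DOWN")
| eval severity=if(status="DOWN", "Critical", "Normal")
| eval description=if(status="DOWN", mvjoin(host, ", ")." Host Have Failed", "Host are Successful")

Because stats without a BY clause always returns exactly one row, the count=0 test covers the "no failures" case without appending a dummy result.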
The rex command defaults to the _raw field. Other fields must be explicitly referenced. The following works in my sandbox.
| makeresults
| eval message="<UL> <LI> The first vulnerability occurs because Internet Explorer does not correctly determine an obr in a pop-up window.</LI> <LI> The t type that is returned from a Web server during XML data binding.</LI> </UL> <P> &quot;Location: URL:ms-its:C:WINDOWSHelpiexplore.::/itsrt.htm&quot; <P> :<P><A HREF='http://blogs.msdn.com/embres/archive/20/81.aspx' TARGET='_blank'>October Security Updates are (finally) available!</A><BR>"
| rex field=message mode=sed "s/\<[^\>]+>//g"
Would anyone know how to do it?
@richgalloway, this expression is not removing the tags from the raw data:
| makeresults
| eval message="<UL> <LI> The first vulnerability occurs because Internet Explorer does not correctly determine an obr in a pop-up window.</LI> <LI> The t type that is returned from a Web server during XML data binding.</LI> </UL> <P> &quot;Location: URL:ms-its:C:WINDOWSHelpiexplore.::/itsrt.htm&quot; <P> :<P><A HREF='http://blogs.msdn.com/embres/archive/20/81.aspx' TARGET='_blank'>October Security Updates are (finally) available!</A><BR>"
| rex mode=sed "s/\<[^\>]+>//g"
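That is expected given the earlier point: makeresults plus eval puts the sample text in the message field, while rex without a field argument operates on _raw, which contains no tags here. Pointing rex at the right field (as in the reply above) should work:

| rex field=message mode=sed "s/\<[^\>]+>//g"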
Hi Team,
I am trying to deploy the Splunk UBA node, but I got a bit confused because in the Splunk UBA operating system requirements I couldn't find whether Red Hat 8.10 or 9.2 is supported or not. I only found the information below. How can I determine whether Red Hat 8.10 or 9.2 is supported?
Operating System: Red Hat Enterprise Linux (RHEL) 8.8
Kernel Versions Tested: 4.18.0-477.10.1.el8_8.x86_64, 4.18.0-372.9.1.el8.x86_64
It's not working as I expect. I already knew how to do the description. To simplify: I'm creating a script that reports whether it is up or down. If there are no failed alerts, then it is up. I am creating an event for up or down. If they're down, I need to add the list of down hosts to the description. I can't use my real data, but this was enough to give a better understanding.
index=myindex message=" failed*"
| table host
| dedup host
| append [| makeresults annotate=true | eval host="Dummy" | table host]
| eventstats count
| eval status = if(count<2,"UP","DOWN")
| eval severity = if(status="DOWN","Critical","Normal")
| eval multiplehost=mvjoin(host, ", ")
| eval msg=if(severity="Critical","Host Have Failed", "Host are Successful")
| eval description=if(severity="Critical",multiplehost,"").msg
I have tried different commands to join it and placed it in various places. I can't seem to get it to add them together into (host1,host2,host3) in a description.
I need to capture everything except the HTML tags like </a> <a> </p> </b>. These tags may appear anywhere in the raw data. I was able to come up with a regex that matches them as a non-capturing group (?:<\/?\w>), but I am stuck on capturing everything else in the raw data.
Sample:
Explorer is a web-browser developed by Microsoft which is included in Microsoft Windows Operating Systems.<P> Microsoft has released Cumulative Security Updates for Internet Explorer which addresses various vulnerabilities found in Internet Explorer 8 (IE 8), Internet Explorer 9 (IE 9), Internet Explorer 10 (IE 10) and Internet Explorer 11 (IE 11). <P> KB Articles associated with the Update:<P> 1) 4908777<BR> 2) 879586<BR> 3) 9088783<BR> 4) 789792<BR> 5) 0973782<BR> 6) 098781<BR> 7) 1234788<BR> 8) 8907799<BR><BR> Please Note - CVE-2020-9090 required extra steps to be manually applied for being fully patched. Please refer to the FAQ seciton for <A HREF='https://portal.mtyb.windows.com/en-PK/WINDOWS-guidance/advisory/CVE-2020-9090 ' TARGET='_blank'>CVE-2020-9090 .</A><P> QID Detection Logic (Authenticated):<BR> Additionally the QID checks if the required Registry Keys are enabled to fully patch <A HREF='https://portal.msrc.windows.com/en-US/guidance/advisory/CVE-2014-82789' TARGET='_blank'>CVE-2014-2897.</A> (See FAQ Section) <BR> The keys to be patched are: <BR> &quot;whkl\SOFTWARE\Microsoft\Internet Explorer\Main\FEATURE_ENABLE_PASTE_INFO_DISCLOSURE_FIX&quot; value &quot;iexplore.exe&quot; set to &quot;1&quot;.<BR>
Hi @Siddharthnegi,
as I said, if you install the above app in your system, there's an example of how to implement the "Null Search Swapper", which is exactly the feature you need. In the example there's the code to use in the dashboard; you only need to customize it for your searches and panels.
Ciao.
Giuseppe
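For reference, the general pattern (a rough Simple XML sketch, not the app's exact code; the token name and the search are placeholders) watches a search's result count and sets a token that swaps which panel is shown:

<search>
  <query>index=main sourcetype=mydata | stats count</query>
  <done>
    <condition match="'job.resultCount' == 0">
      <set token="show_fallback">true</set>
    </condition>
    <condition>
      <unset token="show_fallback"></unset>
    </condition>
  </done>
</search>

Panels can then use depends="$show_fallback$" (or rejects="$show_fallback$") so only one of them renders.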
Hello,
Can someone please help me in extracting nested JSON fields without regex? I have tried the below:
1. Updating KV_MODE = json in the search head TA props.conf
2. Updating INDEXED_EXTRACTIONS = json in the search head TA props.conf
3. Updating limits.conf with the spath stanza for the HF TA:
[spath]
extraction_cutoff = 10000
4. Trying the mvexpand command.
Nothing worked. My raw logs look like this:
{"event": "{\"eventVersion\":\"1.08\",\"userIdentity\":{\"type\":\"AssumedRole\",\"principalId\":\"AROAXYKJUXCU7M4FXD7ZZ:redlock\",\"arn\":\"arn:aws:sts::533267265705:assumed-role/PrismaCloudRole-804603675133320192/redlock\",\"accountId\":\"533267265705\",\"accessKeyId\":\"ASIAXYKJUXCUSTP25SUE\",\"sessionContext\":{\"sessionIssuer\":{\"type\":\"Role\",\"principalId\":\"AROAXYKJUXCU7M4FXD7ZZ\",\"arn\":\"arn:aws:iam::533267265705:role/PrismaCloudRole-804603675133320192\",\"accountId\":\"533267265705\",\"userName\":\"PrismaCloudRole-804603675133320192\"},\"webIdFederationData\":{},\"attributes\":{\"creationDate\":\"2024-05-03T00:53:45Z\",\"mfaAuthenticated\":\"false\"}}},\"eventTime\":\"2024-05-03T04:09:07Z\",\"eventSource\":\"autoscaling.amazonaws.com\",\"eventName\":\"DescribeScalingPolicies\",\"awsRegion\":\"us-west-2\",\"sourceIPAddress\":\"13.52.105.217\",\"userAgent\":\"Vert.x-WebClient/4.4.6\",\"requestParameters\":{\"maxResults\":10,\"serviceNamespace\":\"cassandra\"},\"responseElements\":null,\"additionalEventData\":{\"service\":\"application-autoscaling\"},\"requestID\":\"ef12925d-0e9a-4913-8da5-1022cfd15964\",\"eventID\":\"a1799eeb-1323-46b6-a964-efd9b2c30a8a\",\"readOnly\":true,\"eventType\":\"AwsApiCall\",\"managementEvent\":true,\"recipientAccountId\":\"533267265705\",\"eventCategory\":\"Management\",\"tlsDetails\":{\"tlsVersion\":\"TLSv1.3\",\"cipherSuite\":\"TLS_AES_128_GCM_SHA256\",\"clientProvidedHostHeader\":\"application-autoscaling.us-west-2.amazonaws.com\"}}"}
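The CloudTrail record here is a JSON string escaped inside the outer event field, so automatic extraction typically stops at that boundary and a second parsing pass over the inner string is needed. A minimal search-time sketch:

| spath input=_raw path=event output=inner
| spath input=inner

The first spath pulls the escaped string out of the event key; the second parses it, yielding fields like eventName and userIdentity.principalId.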
Hi Support Team,
I have two Splunk indexers and two forwarders. Both forwarders have a configuration with index = test in inputs.conf, but there is configuration on the indexers to decide which index to put the data in, based on the data itself (one of the values in the JSON object).
Forwarder 1 has been running for a while with no problems (it runs version 6.4.1). Forwarder 2 is new (version 9.2.1) and requires exactly the same configuration as forwarder 1, which I have already applied. The only difference is the host (host1 and host2).
The data from forwarder 2 is being sent to the indexers, but the index is not changed based on the config on the indexers. The data goes to the test index as specified in the forwarder config. Both indexers are running 7.3.3.
What could I be missing to get the indexers to put the data from forwarder 2 in the correct index? Could this not be working due to the different versions of Splunk?
Thanks
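For context, index-time routing on an indexer usually looks like the sketch below (hypothetical stanza, regex, and index names). Note that such transforms only run where the event is parsed: if the new forwarder is a heavy forwarder, the data arrives already parsed and the indexers will not re-apply them, which is a common reason routing works for one forwarder but not another.

# props.conf on the indexers
[my_sourcetype]
TRANSFORMS-route = route_by_payload

# transforms.conf on the indexers
[route_by_payload]
REGEX = "type"\s*:\s*"audit"
DEST_KEY = _MetaData:Index
FORMAT = audit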