All Posts


Dear all, I have a list of latitude and longitude pairs from my observed events and am trying to get the corresponding street/city for each pair. I found this website https://nominatim.org/release-docs/develop/api/Reverse/ and the URL it provides to map latitude and longitude to a street/building:

https://nominatim.openstreetmap.org/reverse?format=xml&lat=<my_lat>&lon=<my_lon>&zoom=18

For example, https://nominatim.openstreetmap.org/reverse?format=xml&lat=39.982302&lon=116.484438&zoom=18 returns: 360building, xx street, xx road, xx village, xx district, Beijing, 100015, China

However, my raw data has more than 1000 records, so entering them into the URL manually is impossible:

log id   Lat         Lon
1        39.982302   116.484438
2        30.608684   114.418229

Does anyone know a way to pass Lat and Lon as input parameters to the URL programmatically? Thank you.
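One way to script this, sketched in Python under the assumption that the records have been exported from Splunk (the `records` list and `build_reverse_url` helper are hypothetical names): build the reverse-geocoding URL per row, then fetch each one with any HTTP client. Note that Nominatim's usage policy asks for rate-limited, attributed requests, so an actual fetch loop should sleep between calls.

```python
# Sketch: build the Nominatim reverse-geocoding URL for each lat/lon pair
# so the lookup can be scripted instead of pasted into a browser by hand.
from urllib.parse import urlencode

NOMINATIM = "https://nominatim.openstreetmap.org/reverse"

def build_reverse_url(lat, lon, zoom=18, fmt="xml"):
    """Return the reverse-geocoding URL for one coordinate pair."""
    query = urlencode({"format": fmt, "lat": lat, "lon": lon, "zoom": zoom})
    return f"{NOMINATIM}?{query}"

# Hypothetical export of the raw data shown above.
records = [
    {"log_id": 1, "lat": 39.982302, "lon": 116.484438},
    {"log_id": 2, "lat": 30.608684, "lon": 114.418229},
]

urls = [build_reverse_url(r["lat"], r["lon"]) for r in records]
```

Each URL can then be fetched (e.g. with `urllib.request.urlopen`) and the XML response parsed for the street/city fields.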
Hi all, I'm attempting to exclude specific undesired data from the security logs. Is there a way to reduce the number of items on the exclusion list, given that we can only add up to 10 items to the blacklist due to limitations? Is it possible to consolidate the blacklist, for example by joining blacklisted items with a "|" (pipe), thereby reducing the total number of entries while still achieving the desired exclusion?

blacklist1 = EventCode="4662" Message="Object Type:(?!\s*(groupPolicyContainer|computer|user))"
blacklist2 = EventCode="4648|5145|4799|5447|4634|5156|4663|4656|5152|5157|4658|4673|4661|4690|4932|4933|5158|4957|5136|4674|4660|4670|5058|5061|4985|4965"
blacklist3 = EventCode="4688" Message="(?:New Process Name:).+(?:SplunkUniversalForwarder\\bin\\splunk.exe)|.+(?:SplunkUniversalForwarder\\bin\\splunkd.exe)|.+(?:SplunkUniversalForwarder\\bin\\btool.exe)"
blacklist4 = EventCode="4688" Message="(?:New Process Name:).+(?:SplunkUniversalForwarder\\bin\\splunk-winprintmon.exe)|.+(?:SplunkUniversalForwarder\\bin\\splunk-powershell.exe)|.+(?:SplunkUniversalForwarder\\bin\\splunk-regmon.exe)|.+(?:SplunkUniversalForwarder\\bin\\splunk-netmon.exe)|.+(?:SplunkUniversalForwarder\\bin\\splunk-admon.exe)|.+(?:SplunkUniversalForwarder\\bin\\splunk-MonitorNoHandle.exe)|.+(?:SplunkUniversalForwarder\\bin\\splunk-winevtlog.exe)|.+(?:SplunkUniversalForwarder\\bin\\splunk-perfmon.exe)|.+(?:SplunkUniversalForwarder\\bin\\splunk-wmi.exe)|.+(?:SplunkUniversalForwarder\\bin\\splunk\-winhostinfo\.exe)"
blacklist5 = EventCode="4688" Message="(?:Process Command Line:).+(?:system32\\SearchFilterHost.exe)|.+(?:find/i)|.+(?:WINDOWS\\system32\\conhost.exe)"
renderXml=true
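Since the Message values in these blacklist entries are regular expressions, several per-binary patterns can indeed be folded into one entry with alternation inside a group. A Python sketch of the idea (the pattern below is illustrative, not the exact stanza syntax, and assumes the filters behave like ordinary backtracking regexes):

```python
import re

# One combined pattern covering several Splunk forwarder binaries that
# would otherwise each need their own blacklist entry.
combined = re.compile(
    r"New Process Name:.+SplunkUniversalForwarder\\bin\\"
    r"(splunkd|splunk|btool|splunk-powershell|splunk-winevtlog)\.exe"
)

match_msg = r"New Process Name: C:\Program Files\SplunkUniversalForwarder\bin\btool.exe"
other_msg = r"New Process Name: C:\Windows\notepad.exe"
```

Testing the combined pattern against sample Message strings before deploying it (for example with a regex tester or a small script like this) is a good way to confirm the consolidation still excludes exactly what the separate entries did.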
Hi, if anyone comes across this thread, they can refer to Option 2 in the thread "How to store results of searches in Dashboard?" to achieve something similar: build a report with loadjob from saved scheduled search(es). From the loadjob documentation: "Loads events or results of a previously completed search job. The artifacts to load are identified either by the search job id <sid> or a scheduled search name and the time range of the current search. If a saved search name is provided and multiple artifacts are found within that range, the latest artifacts are loaded." Happy Splunking!
I am trying to configure SAML SSO for my Splunk Heavy Forwarders. When I upload the SP metadata file, fill in the remaining details, and save, the message below is shown at the top, but the configuration is not saved in the backend and SSO does not work: "SAML has already been configured, Cannot add a new SAML configuration.saml" Your help is greatly appreciated.
I am also facing the same problem, but I need to configure it for my Splunk Heavy Forwarders. When I save the SAML settings, the message below is shown, even though a SAML configuration already exists: "SAML has already been configured, Cannot add a new SAML configuration.saml" Your help is greatly appreciated.
Hi @minhquannguyen7, it's easy: you have two solutions to this issue: buying an additional license, or reducing some data flows. In other words, analyze your data to identify the larger data flows and see if you can reduce some of them by disabling the related inputs or filtering events, discarding the less interesting ones. For more information, see https://docs.splunk.com/Documentation/SplunkCloud/9.0.2305/Forwarding/Routeandfilterdatad#Filter_event_data_and_send_to_queues. Ciao. Giuseppe
Hi All, just wondering if anyone has a search that shows which user deleted another user in Linux? Typically, when we check the Linux syslog messages for userdel, they only show the name of the user account that was deleted; there is no mention of which user performed the action. In Windows events, by contrast, we see both the source and target user for deletion Event IDs. How do we get this information? I know one can manually log in to the host and check ~/.bash_history, but how do you accomplish this from Splunk itself?
My Splunk index has a problem: Total License Usage is at 90% ("Near Daily Quota" warning). How can I solve it, please?
| eval _time=strptime("2023-03-01","%F")+(random()%(strptime("2023-06-01","%F")-strptime("2023-03-01","%F")))
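The SPL above assigns each event a uniformly random epoch between 2023-03-01 and 2023-06-01. The same idea in Python, for sanity-checking the range (the helper names are illustrative):

```python
import random
import time

def to_epoch(datestr):
    """Local-time epoch for a %Y-%m-%d date string, like strptime() in the eval."""
    return time.mktime(time.strptime(datestr, "%Y-%m-%d"))

start_epoch = to_epoch("2023-03-01")
end_epoch = to_epoch("2023-06-01")

def random_time():
    # SPL's random() is a non-negative integer and the modulo keeps the
    # offset inside the window; randrange does the same here.
    return start_epoch + random.randrange(int(end_epoch - start_epoch))

sample = random_time()
```

Every value produced falls inside the window, which is exactly what the `random() % (end - start)` term in the eval guarantees.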
I agree, but in some cases it makes sense to give a single user the ability to upload logs to a dedicated index (e.g. SIEM detection use cases). I had to grant admin rights on the server, and that was not the way I wanted to set the permissions. Kind regards, Patrick
Hi, I am working on a POC project. I need to add a timestamp in order to use the predict algorithm. My dataset, however, only has the fields below:

MonthYear
Year
Quarter
Month
Subscription ID
Subscription name

I was looking for a solution to randomly add timestamps over a period of one year. I have data from March 2023 till May 2023, for example:

MonthYear    Quarter   Month   Subscription ID                      Subscription name
March 2023   Qtr 1     March   020b3b0c-5b0a-41a1-8cd7-90cbd63e06   SUB-PRD-EU-EDLCONTACTS
March 2023   Qtr 1     March   020b3b0c-5b0a-41a1-8cd7-90cbd63e06   SUB-PRD-EU-EDLCONTACTS
This is a vague question with a multitude of possible answers; however, there are a couple of techniques ranging from the simplistic to the more complex. For a simplistic approach, you could determine the (historic) average of each of your metrics and compare your current values against that average. If you also determine the standard deviation of your metrics, your comparison can be based on the number of standard deviations your current values are away from the mean. You would then set a threshold for how far from the mean would be deemed an anomaly. A more sophisticated way of doing this is to use the Machine Learning Toolkit (MLTK): this involves fitting your (historic) data to a statistical model, and then applying that model to your current data to find anomalies. The MLTK can fit your data to a number of different distribution models, either one you specify if you know the type of distribution your data is expected to follow, or the most appropriate one the MLTK finds itself.
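The simplistic mean/standard-deviation approach above can be sketched in a few lines of Python (the function name and sample numbers are illustrative, not from any particular dataset):

```python
import statistics

def zscore_anomalies(history, current, threshold=3.0):
    """Return values in `current` that lie more than `threshold` sample
    standard deviations away from the mean of `history`."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return [v for v in current if abs(v - mean) > threshold * stdev]

# Historic metric values cluster around 10, so a current value of 25
# is far outside three standard deviations and gets flagged.
history = [10, 11, 9, 10, 12, 10, 11, 9, 10, 11]
flagged = zscore_anomalies(history, [10, 11, 25])
```

In SPL the same comparison is typically done with `stats avg(x) stdev(x)` plus an `eval` computing the deviation; the threshold (3 here) is the knob you tune for sensitivity.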
Hi, I am trying to look up data related to EventCode="4662", but it does not show up in Splunk. I also checked inputs.conf on the indexer and the stanza was not present, so I copied inputs.conf from default:

[WinEventLog://Security]
disabled = 0
start_from = oldest
current_only = 0
evt_resolve_ad_obj = 1
checkpointInterval = 5
blacklist1 = EventCode="4662" Message="Object Type:\s+(?!groupPolicyContainer)"
blacklist2 = EventCode="566" Message="Object Type:\s+(?!groupPolicyContainer)"
index = wineventlog
renderXml=false

I have checked in Windows Event Viewer on our Domain Controller that Event 4662 is present, but Splunk searches for EventCode=4662 produce no results. What I want to see is event code 4662 whose message contains "Object Type: user". Here is the Event Viewer log that I want Splunk to show:

An operation was performed on an object.
Subject:
  Security ID:    CIMBNIAGA\YT91504X
  Account Name:   YT91504X
  Account Domain: CIMBNIAGA
  Logon ID:       0xC2D9E1AC
Object:
  Object Server:  DS
  Object Type:    user
  Object Name:    CN=ADJOINADMIN,OU=Functional ID,OU=Special_OU,DC=cimbniaga,DC=co,DC=id
  Handle ID:      0x0
Operation:
  Operation Type: Object Access
  Accesses:       READ_CONTROL
  Access Mask:    0x20000
  Properties:     READ_CONTROL {bf967aba-0de6-11d0-a285-00aa003049e2}
Additional Information:
  Parameter 1:    -
  Parameter 2:

Please help me, I am really stuck. I have already tried deleting the blacklist filtering, but it still does not give me the log I want, shown above. @kheo_splunk
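A side note on that blacklist pattern, sketched in Python under the assumption that the Message filters behave like ordinary backtracking regexes: `blacklist1` matches (and therefore drops) every 4662 event whose object type is NOT groupPolicyContainer, which includes "Object Type: user". There is also a subtler backtracking gotcha worth knowing about:

```python
import re

# The pattern from blacklist1 above. The negative lookahead is meant to
# match only when the object type is NOT groupPolicyContainer, but \s+
# can give back characters, letting the lookahead be tested at a position
# that is still whitespace; the loose pattern then matches everything.
loose = re.compile(r"Object Type:\s+(?!groupPolicyContainer)")

# Anchoring the lookahead to the first non-space character avoids that.
strict = re.compile(r"Object Type:\s+(?!groupPolicyContainer)\S")

user_event = "Object Type:   user"
gpo_event = "Object Type:   groupPolicyContainer"
```

Either way, a 4662 event with "Object Type: user" matches the blacklist and is filtered out, so if these are the events you want to see, the blacklist needs to change; and after editing inputs.conf, the forwarder must be restarted for the stanza to take effect.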
It was awesome! I liked talking to the experts and discussing everything around ITSI!
Do you see anything in _internal regarding the execution of the script? What about other scripts: if you create a new action rule with a different script, does that work, or are all scripts failing?
You can open a glass table by sharing its URL; of course, you have to log in to see it.
Hi @Dalton2, let me understand: do you want to have in the same table the values of the six use cases you already have, or do you want a solution for each of the six use cases?

In the first case, you should create six searches that each output two columns: UseCase (containing the use case name) and value (containing the found values); then store the results of these six searches in a summary index using the collect command; then you can search the summary index and use the two stored columns to display results in a table.

In the second case, you want six different use cases; these depend on many factors (kind of data, fields, etc.), which is too long for one answer, so I suggest splitting it into more questions. Ciao. Giuseppe
Thanks @kakawun this worked. Happy to give users this temporarily. Hopefully Splunk will rectify this bug soon. If you can mark your own comment as an answer I would recommend that! Thanks again.
Hi @aditsss, if you don't want the last row with some empty fields, you have to remove empty lines. You can do that if you know the name of the first column (which I don't know) by adding a rule (here assuming the column is called "column1"):

index="abc" sourcetype=600000304_gg_abs_ipc2 source="/amex/app/gfp-settlement-raw/logs/gfp-settlement-raw.log" "ReadFileImpl - ebnc event balanced successfully"
| eval True=if(searchmatch("ebnc event balanced successfully"),"✔",""), EBNCStatus="ebnc event balanced successfully", Day=strftime(_time,"%Y-%m-%d")
| dedup EBNCStatus Day
| search column1=*
| table EBNCStatus True Day

Note that the asterisk in the search clause is outside the quotes. Ciao. Giuseppe
Hi @nithys, as I said, the easiest way is to create a lookup containing two columns (e.g. choice and value):

value   choice
0-a     Airbag Scheduling
1-b     Airbag Scheduling
2-b     Airbag Scheduling
3-      Airbag Scheduling
4-      Airbag Scheduling
5-      Airbag Scheduling
6-a     Airbag Scheduling
0-d     Material
1-e     Material
2-e     Material
3-      Material
4-d     Material
5-d     Material
6-d     Material
0-e     Cost Summary
1-f     Cost Summary
2-f     Cost Summary
3-e     Cost Summary
4-      Cost Summary
5-      Cost Summary
6-      Cost Summary
0-f     All
1-e     All
2-b     All
3-b     All
4-md    All
5-a     All

Then, for the first dropdown, you could run:

| inputlookup mylookup
| dedup choice
| sort choice
| table choice

and for the second dropdown:

| inputlookup mylookup WHERE choice=$FirstToken$
| dedup value
| sort value
| table value

so you'll have the values relative to the chosen first token. Ciao. Giuseppe