All Posts

Hello. I installed a trial version of Splunk Enterprise 7.3 in my home environment. Environment: Windows Server 2019 Essentials (192.168.0.x), running Active Directory. I installed it on a single machine only; no Forwarders are in use. When accessing Splunk Web, I can connect correctly with the following addresses: https://localhost:8000 https://127.0.0.1:8000 However, I cannot access it via the IP address or hostname that the server actually has; the connection times out: https://192.168.0.x:8000 https://foo:8000 Accessing the above addresses from other clients likewise does not display the Splunk screen. Why is this? I suspect some setting is missing. Any advice would be appreciated.
Hello! I'm using Splunk_SA_CIM with ESS and I'm currently studying most of the ESCU correlation searches for my own purposes. Problem: I discovered that most of my ESCU rules are creating a lot of notable events which, after investigation, were all false positives. All these rules are based on fields coming from the Endpoint data model (for example, Processes.process_path), and because most of the process_path values are "null", the search triggers and creates a notable event. I've already updated every app I use, and to gather Windows data I'm using the Splunk_TA_Windows add-on. Do you have any clue on how I can find where the problem is and solve it?
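One way to measure how widespread the missing field is (a generic diagnostic sketch, assuming the Endpoint data model is accelerated and your Windows data is mapped into it) is to count events per process_path value straight from the data model:

```
| tstats count from datamodel=Endpoint.Processes
    where Processes.process_path="null" OR Processes.process_path="unknown"
    by Processes.process_path, sourcetype
```

If the bulk of events land in the "null"/"unknown" bucket, the underlying sourcetype is not populating process_path at all, which usually points at the field extractions in the TA or at missing process-creation logging on the endpoints (e.g. command-line auditing for Event ID 4688, or Sysmon) rather than at the ESCU rules themselves.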
Hi @Strangertinz , a couple of basic questions: have you configured an input in DB Connect, or did you only test the connection? Have you checked the checkpoint based on the rising column? In other words, are you sure that there are rows where the rising column value is greater than the stored checkpoint? Are you sure that you're receiving logs on Splunk Cloud from the same server where DB Connect is located? Have you checked the index used in inputs.conf? Ciao. Giuseppe
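For reference, a rising-column input runs a query of roughly this shape (a sketch only; the table and column names are placeholders, and the `?` is substituted by DB Connect with the last stored checkpoint value):

```
SELECT * FROM my_table
WHERE id > ?
ORDER BY id ASC
```

If no rows have an `id` greater than the last checkpoint, the input executes successfully but returns nothing, which looks exactly like "connection works but no data arrives".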
Hi @Siddharthnegi , when testing, be sure that the time period is the same as the one used at the scheduled time. Then, do you know which app the saved search is located in? If so, you can look for it there; if you don't know the app, you could search for it over SSH in the savedsearches.conf files. Ciao. Giuseppe
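One more thing worth checking (a generic diagnostic; "My Search Name" is a placeholder for the actual saved search name) is whether the scheduler is dispatching the search at all:

```
index=_internal sourcetype=scheduler savedsearch_name="My Search Name"
| stats count by status
```

Statuses such as `skipped` or `deferred` usually point at concurrency or scheduling limits, while no results at all suggest the search is disabled, private, or not scheduled on this instance.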
Hi @Esky73 , some add-ons aren't free and require signing in to an external site. The strange thing is that if you follow the Details there's another link to download this add-on, and although the title is slightly different, it's downloadable. Ciao. Giuseppe
It's a report, it is shared at global level, and when I run the search it returns results.
Hi @Siddharthnegi , is this saved search an alert or a report? Is it shared at least at app level, or is it private? Are you sure that the saved search returns results? Please run a test: modify the saved search so that it is guaranteed to return at least one result, and see what happens. Ciao. Giuseppe
Hey @sbel If you are using Splunk v9.3, then your app should be compatible with Python v3.9 by default. As a temporary measure you can try the setting below, but in the long term you have to make your apps compatible with Python v3.9. In $SPLUNK_HOME/etc/system/local/server.conf:

[general]
python.version = python3.9
I have a saved search which is scheduled, but it is not showing up and not running at the scheduled time.
Hi @new2splunk21 , I see several different issues that may come down to the same root cause: are you sure that the indexers have the resources (storage) to receive all the logs? The message in the last screenshot seems to indicate that the issue is on the receiver, not on the Forwarder. Then, did you ever receive logs from all 5 forwarders? If not, maybe you used the same hostname on some forwarders. Run a search on _internal to see whether you have logs from all the forwarders: index=_internal Ciao. Giuseppe
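To make that check concrete (a sketch; widen the time range as needed), you can count internal events per forwarder host:

```
index=_internal
| stats count, latest(_time) AS last_seen by host
| convert ctime(last_seen)
```

Every universal forwarder ships its own _internal logs, so any forwarder missing from this list is either not connecting at all or is reporting under a duplicate hostname.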
Is this TA still being developed and supported? https://splunkbase.splunk.com/app/4950/ I followed the 'visit site' link on the Splunkbase page and couldn't see the Enterprise version advertised.
The attempted code shows several misunderstandings; once those are addressed, the regex can be fixed.

Most importantly, you need to realize that the table command does not perform evaluation. It can only tabulate fields that already have values.

Second, there are several obvious attempts to use asterisk (*) as a wildcard in regex. It is not a wildcard: in regex, * is a repetition token. What you meant is perhaps .* — so I made changes accordingly.

Besides these, the first line in the sample also cannot match \d{21}\d{2} because you used non-numeric characters immediately after "BP Tank: Bat from Surface = #07789*K00C0". To make the following meaningful, I replaced those characters with numerals in the emulation. What you should be using is perhaps something like

index=khisab_ustri sourcetype=sosnmega "*BP Tank: Bat from surface = *K00C0*"
| rex max_match=0 "(?ms)(?&lt;time_string&gt;\d{12})BP Tank: Bat from Surface .*K00C0\d{21}(?&lt;kmu_str&gt;\d{2})*"
| rex max_match=0 "(?&lt;PC_sTime&gt;\d{12})CSVSentinfo:L00Show your passport.*"
| rex max_match=0 "(?&lt;CP_sTime&gt;\d{12})CSVSentinfo Data:z800.*"
| rex max_match=0 "(?&lt;MTB_sTime&gt;\d{12})CSVSentinfoToCollege:.*"
| rex max_match=0 "(?&lt;MFB_sTime&gt;\d{12})CSVSentinfoFromCollege:.*"
| rex max_match=0 "(?&lt;PR_sTime&gt;\d{12})CSVSentinfo:G7006L.*"
| rex max_match=0 "(?&lt;JR_sTime&gt;\d{12})CSVSentinfo:A0T0.*"
| rex max_match=0 "(?&lt;MR_sTime&gt;\d{12})BP Tank: Bat to Surface .*L000passportAccepted.*"
| eval PC_minus_timestring = (PC_sTime - time_string),
    CP_minus_PC = mvmap(CP_sTime, (CP_sTime - PC_sTime)),
    MTB_minus_CP = (MTB_sTime - CP_sTime),
    MFB_minus_MTB = (MFB_sTime - MTB_sTime),
    PR_minus_MFB = (PR_sTime - MFB_sTime),
    JR_minus_PR = (JR_sTime - PR_sTime),
    MR_minus_JR = (MR_sTime - JR_sTime)
| table *_minus_*

The modified sample data will give (multivalue cells stacked)

CP_minus_PC  JR_minus_PR  MFB_minus_MTB  MR_minus_JR  PC_minus_timestring  PR_minus_MFB
3            7            8              2            4                    2
5                                        3

Some additional pointers:

- You should not use dedup on _time. If you need to do that, something is wrong with your event data. Fix that first.
- The rex command operates on _raw by default, so there is no need to specify a source field.
- Some fields can have multiple matches; I added max_match=0. Read the rex documentation about its options.
- Your sample data do not contain all the fields you are trying to extract.
- Your sample SPL does not use the kmu_str field that is extracted.

Here is an emulation of the modified sample data. Play with it and compare with your real data:

| makeresults
| eval _raw = "123456789102BP Tank: Bat from Surface = #07789*K00C012345678901234567890178
00003453534534534
123456789103UniverseToMachine\\0a&lt;Ladbrdige&gt;\\0a &lt;SurfaceTake&gt;GOP&lt;/Ocnce&gt;\\0a &lt;Final_Worl-ToDO&gt;Firewallset&lt;/KuluopToset&gt;\\0a&lt;/
123456789105SetSurFacetoMost&gt;7&lt;/DecideTomove&gt;\\0a &lt;TakeaKooch&gt;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&#32;&lt;/SurfaceBggien&gt;\\0a &lt;Closethe Work&gt;0&lt;/Csloethe Work&gt;\\0a
123456789107CSVSentinfo:L00Show your passport
123456789108BP Tank: Bat from Surface = close ticket
123456789109CSVSentinfo:Guide iunit
123456789110CSVSentinfo Data:z800
123456789111CSVGErt Infro\"8900
123456789112CSGFajsh:984
123456789113CSVSentinfoToCollege:
123456789114CSVSentinfo Data:z800
123456789115CSVSentinfo Data:z800
123456789116Sem startedfrom Surface\\0a&lt;Surafce have a data&gt;\\0a &lt;Surfacecame with Data&gt;Ladbrdige&lt;/Ocnce&gt;\\0a &lt;Ladbrdige&gt;Ocnce&lt;/Final_Worl&gt;\\0a &lt;KuluopToset&gt;15284&lt;/DecideTomove&gt;\\0a &lt;SurafceCall&gt;\\0a &lt;wait&gt;\\0a &lt;wating&gt;EventSent&lt;/SurafceCall&gt;\\0a &lt;/wait&gt;\\0a &lt;/sa&gt;\\0a&lt;/Surafce have a data&gt;\\0a\\0a
123456789117CSVSentinfoFromCollege:
123456789118CSVSentinfo:sadjhjhisd
123456789119CSVSentinfo:Loshy890
123456789120CSVSentinfo:G7006L
123456789121CSVSentinfo:8shhgbve
123456789122CSVSentinfo:A0T0
123456789123CSVSentinfo Data:accepted
123456789124BP Tank: Bat to Surface L000passportAccepted"
``` the above emulates index=khisab_ustri sourcetype=sosnmega "*BP Tank: Bat from surface = *K00C0*" ```
Yes, different events. I am at a very early stage with SPL, hence trying to figure it out. TIA
Yes, they are multiple events.
Hi Splunk Community, I am having issues with Splunk DB Connect 3.18.0 not sending data. I was able to connect the DB Connect app to the database and query properly, but no luck seeing the data in Splunk Cloud. I am able to send other logs and data to Splunk Cloud with no issues. Thanks!
and under Messages it says
They're not showing up when I go to Search and type index="host_audits"
Thanks for the clear info!
A Universal Forwarder is a lightweight component you typically install on remote machines to - as the name suggests - forward data to the "main part" of your Splunk installation. But if you already have a full Splunk instance installed, you don't need a UF (there are some border cases where such a setup can be useful, but it makes the whole environment overly complicated). So if you're just starting with Splunk, it's enough to add local Windows event log inputs on the Splunk server.
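For example, a minimal local event log input (a sketch; the stanza names follow the standard Windows event log input syntax, and the index name is a placeholder you would create first) would go into $SPLUNK_HOME\etc\system\local\inputs.conf:

```ini
[WinEventLog://Security]
disabled = 0
index = wineventlog

[WinEventLog://System]
disabled = 0
index = wineventlog
```

The same inputs can also be enabled from the UI under Settings > Data inputs > Local event log collection, which avoids editing .conf files by hand.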
The forwarders are not listed where? Forwarders may or may not be listed in several places, depending on which functionalities you use. They can also show up nowhere in the GUI and still be sending data and functioning perfectly well. So what is the actual problem?