Hello @gcusello regarding your question: There's only one not clear thing: why are you speaking of a single intermediate Forwarder? No, I have 2 forwarders, but as you know, since UDP is a stream, one forwarder will handle all traffic.
Hi @yuanliu Yeah I have it set up in the same way you have shown. I do still get results, but the first two fields, which should provide details of where the subnets belong, just come back as the "notfound" that I have added to the search for when the subnets are not part of the lookup file (I am using a dummy subnet that is 100% present in the lookup file).
Hi @spisiakmi , ok, is the solution ok for you? Let us know if we can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated by all the contributors
This sounds like a network issue - the connection from your SharePoint server to Splunk Cloud will be going through various network elements (e.g. proxies, firewalls) which may be changing the address being used (e.g. PAT and NAT), such that the server the request ends up at either isn't the server you think it is, or it is using a different port, hence the connection refused. Could this be the issue?
Hi gcusello, thank you for your reply. In fact there should be a saved search running on a daily basis. Example: for every row in the lookup table a query should run:

index=myindex att1=F1 AND earliest=strptime("12.09.2024", "%d.%m.%Y") | stats count as cnt
index=myindex att1=F2 AND earliest=strptime("23.04.2024", "%d.%m.%Y") | stats count as cnt
index=myindex att1=F3 AND earliest=strptime("15.06.2024", "%d.%m.%Y") | stats count as cnt
index=myindex att1=F4 AND earliest=strptime("16.03.2024", "%d.%m.%Y") | stats count as cnt

Result:

att1  cnt    att2  att3
F1    234    1100  12.09.2024
F2    4235   1100  23.04.2024
F3    3763   1100  15.06.2024
F4    42314  1100  16.03.2024
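The date handling these searches rely on is parsing a "DD.MM.YYYY" string into an epoch timestamp with the "%d.%m.%Y" format. A quick Python sketch of the same conversion (the `to_epoch` helper and the sample rows are illustrative, not part of the original post):

```python
from datetime import datetime, timezone

def to_epoch(date_str: str) -> float:
    """Parse a 'DD.MM.YYYY' string (as in the att3 column) into a Unix epoch."""
    dt = datetime.strptime(date_str, "%d.%m.%Y").replace(tzinfo=timezone.utc)
    return dt.timestamp()

# One entry per lookup row, mirroring the att1/att3 pairs above.
rows = [("F1", "12.09.2024"), ("F2", "23.04.2024")]
earliest = {att1: to_epoch(att3) for att1, att3 in rows}
```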
Variables in a macro are surrounded by dollar signs e.g. $var$. Tokens in a dashboard are also surrounded by dollar signs e.g. $token$. When a macro with variables is used in a dashboard, the dollar signs have to be doubled up, e.g. $$var$$, otherwise the dashboard will assume they are tokens and the search will probably sit waiting on user input to give the token ($var$) a value.
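As a minimal Simple XML sketch of the difference (the token name `site_tok` and the variable name `var` are made up for illustration):

```xml
<search>
  <!-- $site_tok$ is resolved by the dashboard as a token;
       $$var$$ is escaped, so the search receives the literal text $var$. -->
  <query>index=main site=$site_tok$ | eval marker="$$var$$"</query>
</search>
```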
I do have a Splunk Enterprise license and my Splunk version is 9.1.1. The problem I have is that anyone can access this URL https:...../en-US/config and it will show up whether the user is logged in or not, like so
It is not clear what the dedup is doing, nor what the search XXX is for, but let's assume it is for the product you are interested in. Next, it isn't clear what the single would show. Is it how many users have used the product multiple times?

| bin _time span=1mon
| stats count by _time user_id
| where count > 1
| timechart count span=1mon
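As a sanity check on the logic, the same two-stage aggregation (count events per user per month, keep counts above 1, then count per month) can be sketched in Python with synthetic (user, month) pairs standing in for the binned events:

```python
from collections import Counter

# Synthetic events: (user_id, month) pairs, standing in for binned _time values.
events = [("alice", "2024-01"), ("alice", "2024-01"),
          ("bob", "2024-01"), ("bob", "2024-02"), ("bob", "2024-02")]

# | bin _time span=1mon | stats count by _time user_id
per_user_month = Counter(events)

# | where count > 1, then | timechart count span=1mon
repeat_users_per_month = Counter(
    month for (user, month), n in per_user_month.items() if n > 1
)
```

Note this counts repeat users within each month; a user active once in several months is not counted.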
Hi @rsAU The above reply should work fine for your situation. If there are still any issues, please share with us 1) your full search query (remove any confidential info); 2) a screenshot may be even better.
Hi @spisiakmi , let me understand: do you want to put in an input both the att1 and att3 tokens, or do you want to pass all the att1 and att3 values of the lookup? In the first case, you first have to create a dropdown in a dashboard using a search like the following: | inputlookup lookup.csv
| eval token=att1.",".att3
| dedup token
| sort token
| table token

passing the token by value to the following search. Then run this search (in the same dashboard):

index=myindex [ | makeresults
| eval token="$token$"
| rex field=token "^(?<att1>[^,]+),(?<att3>.*)"
| eval earliest=strptime(att3, "%d.%m.%Y")
| fields att1 att3 ]
| ...

Ciao. Giuseppe
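The rex above splits the combined dropdown value back into its two parts at the first comma. The same extraction can be sketched in Python (the sample value "F1,12.09.2024" is illustrative):

```python
import re

# Mirror of the rex pattern applied to the combined "att1,att3" token value.
TOKEN_RE = re.compile(r"^(?P<att1>[^,]+),(?P<att3>.*)")

m = TOKEN_RE.match("F1,12.09.2024")
att1, att3 = m.group("att1"), m.group("att3")
```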
Hi @cdevoe57 , you have to create a new appLogo.png (160x40 pixels) and appLogo_2x.png (320x80 pixels) file, containing the image to show (both logo and name), and save it in $SPLUNK_HOME/etc/apps/<your_app>/static, replacing the existing files. In this way you'll have the app icon you like, with both logo and name. Ciao. Giuseppe
Hi @rsAU , let me understand: you want to count the users that accessed the system more than one time, is this correct? You can use a simple search: <your_search>
| stats count by user_id
| where count>1

Ciao. Giuseppe
Hi @Iris_Pi , supposing that the _time of your events is the Timestamp field, you have two solutions:

1) using stats (supposing a span of 1 hour):

<your_search>
| bin span=1h _time
| stats sum(rxbytes) AS rxbytes BY fwname interface

2) using timechart (supposing a span of 1 hour):

<your_search>
| eval col=fwname.", ".interface
| timechart span=1h sum(rxbytes) AS rxbytes BY col

I prefer the first one. Ciao. Giuseppe
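The first option (hourly binning, then summing rxbytes per firewall/interface pair) can be sketched in Python with synthetic events to make the bucketing explicit:

```python
from collections import defaultdict

# Synthetic events: (epoch_seconds, fwname, interface, rxbytes).
events = [(3600, "fw1", "eth0", 100), (3900, "fw1", "eth0", 50),
          (7300, "fw1", "eth0", 10), (3650, "fw2", "eth1", 7)]

# | bin span=1h _time  followed by  | stats sum(rxbytes) BY fwname interface
totals = defaultdict(int)
for ts, fwname, interface, rx in events:
    hour_bucket = ts - ts % 3600   # truncate the timestamp to the hour
    totals[(hour_bucket, fwname, interface)] += rx
```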
Trying to fix a corruption issue with a _metrics bucket, using the "./splunk rebuild <path>" command. Doing this, I receive the following WARN: "Fsck - Rebuilding entire bucket is not supported for "metric" bucket that has a "stubbed-out" rawdata journal. Only bloomfilter will be build". How would I rebuild the metrics bucket to fix the error?
Hi @tungpx , let me understand: you have a Splunk instance accessible without login (also by API)? Is it maybe a free Splunk instance? In this case the only solution is to buy a license. Could you describe your situation in more detail? Ciao. Giuseppe
Hi @Cleanhearty , I suppose that you have already ingested the csv file into a lookup or an index. If it's in a lookup, you can define what you mean by "gender that performed the most fraudulent activities and in what category"; I suppose that you mean most fraudulent by amount, so you could try something like this:

| inputlookup fraud_report.csv
| stats max(amount) AS amount BY gender category
| sort -amount
| head 10

In this way, you have the top 10 categories by gender that have the greatest amount. My hint is also to follow the Splunk Search Tutorial (https://docs.splunk.com/Documentation/SplunkCloud/latest/SearchTutorial/WelcometotheSearchTutorial) to learn how to run similar searches. Ciao. Giuseppe
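The pipeline above (max amount per gender/category pair, sorted descending, top 10) can be sketched in Python; the sample rows stand in for fraud_report.csv and are made up for illustration:

```python
# Synthetic rows standing in for fraud_report.csv: (gender, category, amount).
rows = [("F", "travel", 900), ("F", "travel", 1200),
        ("M", "retail", 700), ("M", "travel", 400)]

# | stats max(amount) AS amount BY gender category
maxes = {}
for gender, category, amount in rows:
    key = (gender, category)
    maxes[key] = max(maxes.get(key, 0), amount)

# | sort -amount | head 10
top = sorted(maxes.items(), key=lambda kv: kv[1], reverse=True)[:10]
```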