All Topics


Hello, I'm trying to investigate the configuration files in a new app I created, but every time I run ./splunk btool --app=my_new_app check I get this error: "Failed to run Splunk as SPLUNK_OS_USER. This command can only be run by bootstart user." Please help!
Hello, I have some issues with field extraction using props.conf and transforms.conf. Sample data (3 raw events) and my props.conf and transforms.conf files are provided below. Events are being indexed, but the fields are not being extracted. Any recommendations would be highly appreciated. Thank you!

3 sample raw events:

tP1380158753BMFPG68006701522000000000000000000000000000000 Y1021210324010157DFTJ450757015#26I040
tP1380158753BMFPG68016702522000000000000000000000000000000 Y1023210324010156DFTJ450757015#25I040
tP1380158753BMFPG68026703522000000000000000000000000000000 Y1023210324010155DFTJ450757015#26I040

props.conf:

[abcd:tests]
SHOULD_LINEMERGE = false
CHARSET = UTF-8
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
TIME_PREFIX = [^"]{62}
TIME_FORMAT = %Y%m%d%H%M%S
REPORT-abcdCod6 = abcdCod6

transforms.conf:

[abcdCod6]
REGEX = (^)(?i)(?<a>.{1})(?i)(?<b>.{1})(?i)(?<e>.{10})(?i)(?<f>.{5})(?i)(?<g>(?:6))(?i)(?<h>.{8})(?i)(?<i>.{1})(?i)(?<j>.{1})(?i)(?<k>.{30})(?i)(?<l>.{1})(?i)(?<m>.{1})(?i)(?<n>.{2})(?i)(?<o>.{14})(?i)(?<p>.{4})(?i)(?<q>.{9})(?i)(?<r>.{1})(?i)(?<s>.{2})(?i)(?<t>.{1})(?i)(?<u>.{2})(?i)(?<v>.{1})
DEST_KEY = _raw
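The fixed-width REGEX itself appears to line up with the sample events, so the failure is more likely configuration: DEST_KEY = _raw is an index-time transform setting, and (to my knowledge) a search-time REPORT transform needs only the REGEX with its named groups. As a sanity check outside Splunk, here is a hedged Python sketch of the same fixed-width pattern (the first sample event is rebuilt with an explicit 30-zero run so the widths are easy to verify, and the redundant `(?i)` flags and empty `(^)` group are dropped):

```python
import re

# First sample event from the post, with the zero padding written as
# an explicit multiplier (30 zeros pad the first block to 58 chars):
event = ("tP1380158753BMFPG" + "68006701522" + "0" * 30 +
         " Y1021210324010157DFTJ450757015#26I040")

# Same fixed-width layout as the transforms.conf REGEX:
pattern = re.compile(
    r"^(?P<a>.)(?P<b>.)(?P<e>.{10})(?P<f>.{5})(?P<g>6)(?P<h>.{8})"
    r"(?P<i>.)(?P<j>.)(?P<k>.{30})(?P<l>.)(?P<m>.)(?P<n>.{2})"
    r"(?P<o>.{14})(?P<p>.{4})(?P<q>.{9})(?P<r>.)(?P<s>.{2})(?P<t>.)"
    r"(?P<u>.{2})(?P<v>.)"
)

m = pattern.match(event)
print(m.group("f"), m.group("o"), m.group("q"))  # BMFPG 21210324010157 450757015
```

If the groups extract correctly here, removing DEST_KEY from the REPORT transform is worth trying first.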
Hi Team,

We have a Splunk dashboard panel with the following requirement: the panel title needs to display a time range, and that time range should be the same as the one used by the panel's search. Below are snippets that give an idea of the requirement: the highlighted date range should match the search time range used in the same panel.

Need help with the above requirement.
Hello everyone! I'm trying to make a props file which will trim all non-Cyrillic symbols from the field "account". My log example is:

18:10:24 Object="some object" Source="some source1323" Account="Аккаунтvfweцw"

I want to delete "vfwew" from the field Account, but note that the symbols can appear in any order, mixed in with Cyrillic symbols; I need to catch them all and delete them, from this one field only.

SEDCMD-notcyr - Account="....
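A SEDCMD in props.conf rewrites _raw at index time with sed-style substitutions, but the per-character filtering is easier to prototype first. A hedged Python sketch of the logic, using the sample event from the post (the `\u0400-\u04FF` class covers the basic Cyrillic block):

```python
import re

event = '18:10:24 Object="some object" Source="some source1323" Account="Аккаунтvfweцw"'

# Strip every character that is not Cyrillic from the Account value only.
def clean_account(raw: str) -> str:
    def repl(m: re.Match) -> str:
        cleaned = re.sub(r'[^\u0400-\u04FF]', '', m.group(1))
        return f'Account="{cleaned}"'
    return re.sub(r'Account="([^"]*)"', repl, raw)

print(clean_account(event))  # ... Account="Аккаунтц"
```

The same two-step idea (match the quoted field value, then filter its characters) is what any SEDCMD or rex-based solution would need to express.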
Splunk developers should have access to native source control that includes versioning, effect analysis, and rollbacks to previously stable versions.
How can we prevent duplicate notables from being created on the Enterprise Security Incident Review page for the same event ID? Do any parameters need to be changed? The searches run from earliest=-70m to latest=-10m, every 35 minutes on a cron schedule. All correlation searches are affected.
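One thing worth checking with this schedule: a window of roughly -70m to -10m spans 60 minutes, but the search fires every 35 minutes, so consecutive runs overlap by 25 minutes and any event in the overlap is evaluated twice. A quick illustration of the coverage (run cadence and window taken from the post; this is just arithmetic, not a Splunk API):

```python
# Each run covers [t-70, t-10) minutes relative to its start time t.
# Runs fire every 35 minutes; count how many runs see each minute.
from collections import Counter

runs = range(0, 24 * 60, 35)          # run start times over one day
coverage = Counter()
for t in runs:
    for minute in range(t - 70, t - 10):
        coverage[minute] += 1

# Minutes well inside the day are covered by two different runs,
# so the same event can generate two notables.
print(coverage[500])  # 2
```

Shrinking the window to match the cadence (or deduplicating on the event ID in the correlation search) removes the double coverage.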
Is there a way to exclude source_ip and destination_ip combinations? There are two columns in my lookup table: the source and destination IP addresses. Any combination of source IP and destination IP in the table should be filtered out when searching firewall traffic logs. In my current query, only source_ip is excluded via the lookup table. Could you please tell me how to exclude the combination of source_ip and destination_ip?
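The key idea is to match the pair as a unit rather than each column independently; in SPL this is typically done by passing both fields to `lookup` and filtering where the output field is null. The underlying pair logic, sketched in Python with hypothetical addresses:

```python
# Exclusion list: (source_ip, destination_ip) pairs, as in the lookup table.
exclusions = {
    ("10.0.0.1", "192.168.1.5"),
    ("10.0.0.2", "192.168.1.9"),
}

events = [
    {"source_ip": "10.0.0.1", "destination_ip": "192.168.1.5"},  # excluded pair
    {"source_ip": "10.0.0.1", "destination_ip": "192.168.1.9"},  # source matches, pair does not
]

# Filter on the tuple, so a source is only dropped with its paired destination.
kept = [e for e in events if (e["source_ip"], e["destination_ip"]) not in exclusions]
print(len(kept))  # 1
```

Note the second event survives even though its source_ip appears in the exclusion list, because the combination does not match.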
Hi, I am new to Splunk. When my search criteria match them, I notice my results include my own previous searches, stored in Splunk's audit trail. They have:

host = [one of the Splunk nodes]
source = audittrail
sourcetype = audittrail

Is there a way to exclude these? They generate a lot of noise while I am refining my searches. Thanks.
Hi, I have the following query:

| tstats count where index=dns earliest=-90d latest=now() groupby _time span=1d
| fields _time count
| rename _time as hour
| eval hour=strftime(hour,"%Y-%m-%d %H:%M:%S")
| fields hour count
| fields - _*
| eventstats avg(count) as avg_count
| eval k=(pow(avg_count,2))/(var(count)-avg_count)
| eval outlier=if(count>(avg_count+k*pow(avg_count,2)),1,0)
| eval predicted_outlier=if(outlier=1,"anomaly","normal")
| eval actual_outlier=if(day>relative_time(now(),"-7d"), "anomaly", "normal")
| eval true_positives=if(predicted_outlier="anomaly" AND actual_outlier="anomaly", 1, 0)
| eval false_positives=if(predicted_outlier="anomaly" AND actual_outlier="normal", 1, 0)
| eval false_negatives=if(predicted_outlier="normal" AND actual_outlier="anomaly", 1, 0)
| eval true_negatives=if(predicted_outlier="normal" AND actual_outlier="normal", 1, 0)
| stats sum(true_positives) as TP, sum(false_positives) as FP, sum(false_negatives) as FN, sum(true_negatives) as TN
| eval accuracy=(TP+TN)/(TP+FP+FN+TN)
| eval precision=TP/(TP+FP)
| eval recall=TP/(TP+FN)

However, the k eval is not working: var is an aggregation function that does not work inside eval, only inside stats/eventstats, and I cannot collapse everything into a single stats because I want to apply the k formula to each daily count. Can you please help? Many thanks,
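For what it's worth, the usual pattern is to compute the variance once alongside the mean with eventstats (e.g. `eventstats avg(count) as avg_count var(count) as var_count`), so every event carries the aggregates and the eval for k can use them per row. The per-row logic, sketched in Python with hypothetical daily counts (`statistics.variance` is the sample variance):

```python
import statistics

# Hypothetical daily counts; one day (250) is inflated.
counts = [100, 102, 98, 97, 103, 250, 99, 101]

# Computed once over all rows (the eventstats step), then attached
# to every row so eval-style per-row math can use it:
avg_count = statistics.fmean(counts)
var_count = statistics.variance(counts)   # sample variance
k = avg_count ** 2 / (var_count - avg_count)

rows = [{"count": c, "avg_count": avg_count, "k": k} for c in counts]
print(round(rows[0]["k"], 2))  # 5.23
```

Every row ends up carrying the same k, which is exactly what eventstats gives you in SPL without a standalone stats.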
Hello All, Is it possible to edit the default navigation menu for one app to show a dashboard from a different app? Or can the navigation menu only reference dashboards within its own app?

thanks, eholz1
Hi. I want to try Splunk on Windows Server 2019. I have a Windows server and a client machine; what do I need to do so that Splunk can read what is happening on the client PC? Is there a step-by-step tutorial? Thanks.
I'm using a Classic dashboard for this. I'm trying to use some of the results of a search beneath a line chart to build a custom URL, but I need values other than click.value and click.value2.

For example: when a user clicks on a line which is grouped by cluster (a field generated with rex), I'd like to take them to a separate page with that cluster pre-filled. Everything I've read says "just do $row.cluster$" or "$result.cluster$". I thought maybe it fails because the field is generated with rex, but even things like $row.host$, which I've seen recommended in many places, always come through as the literal string "$row.host$" instead of the value of the field. I even made a token that references $row.host$, and it translates to `%3Frow.host%3F` instead of filling in the actual value.

Here's a slightly trimmed-down example of the line chart that is not giving me the proper custom values on drilldown (click.value and click.value2 both work, for what it's worth, but *only* those two):

<search>
  <query>index="*" sourcetype="infra_memory"
| regex host=".*?bar.*?"
| rex <insert convoluted 'cluster' regex here>
| rex field=host "[^0-9]*(?&lt;stack_num&gt;[0-9]+)-"
| eval memory_usage = ((total-available)/total*100)
| bucket _time span=1hour
| chart p99(memory_usage) by _time, cluster</query>
</search>
<drilldown>
  <set token="TEST">$row.host$</set>
  <link target="_blank">/test:$TEST$-$row.TEST$,$tkn.TEST$/rowstuff:$row.stack_num$,$row.host$,$row.cluster$-clickstuff:$click.value$,$click.value2$,$click.host$-result:$result.sourcetype$,$result.host$</link>
</drilldown>

(As you can see, I've tried a bunch of different things in a desperate attempt to find anything that works.) Is what I'm trying to do possible?
Hello, I am trying to figure out how to edit props.conf so that it splits my events properly. The events are added to a log file, which looks like this:

******************************************************************************
Mon 01/02/2023 09:00 AM
******************************************************************************
The command completed successfully.
1 file(s) copied.
\\share\folder\folder\folder\file
1 file(s) copied.
1 file(s) copied.
******************************************************************************
Tue 01/03/2023 09:00 AM
******************************************************************************
The command completed successfully.
The system cannot find the file specified.
\\share\folder\folder\folder\file
0 file(s) copied.
The system cannot find the file specified.
******************************************************************************
Wed 01/04/2023 09:00 AM
******************************************************************************
The command completed successfully.
1 file(s) copied.
\\share\folder\folder\folder\file
1 file(s) copied.
1 file(s) copied.
******************************************************************************
Thu 01/05/2023 09:00 AM
******************************************************************************
The command completed successfully.
1 file(s) copied.
\\share\folder\folder\folder\file
1 file(s) copied.
1 file(s) copied.
******************************************************************************

I would like my events to look like this:

******************************************************************************
Mon 01/02/2023 09:00 AM
******************************************************************************
The command completed successfully.
1 file(s) copied.
\\share\folder\folder\folder\file
1 file(s) copied.
1 file(s) copied.

It seems like no matter what I try, I can't get Splunk to separate it properly.
The file updates daily, and I have been testing my settings by uploading a copy of the text file directly, then configuring Splunk to monitor the file for continuous updates.

Typically the preview for the uploaded file looks somewhat acceptable, like this:

Mon 01/02/2023 09:00 AM
******************************************************************************
The command completed successfully.
1 file(s) copied.
\\share\folder\folder\folder\file
1 file(s) copied.
1 file(s) copied.

This output would work; however, I did notice that it consistently cuts off the first line of text. The real problem comes in with the monitoring process. It tends to split the data in a way that seems almost random, and definitely isn't matching my regex settings. The date, the asterisks, and the text get placed into separate events for reasons I don't understand.

My props.conf settings are below:

[log_file_test]
BREAK_ONLY_BEFORE = \*{78}\s*[a-zA-z]{3}\s\d{2}\/\d{2}\/\d{2}\/\d{4}
NO_BINARY_CHECK = 1
SHOULD_LINEMERGE = 1
category = custom
pulldown_type = 1
disabled = false

Any clues as to what I might be doing wrong or neglecting?
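One concrete issue is visible in the BREAK_ONLY_BEFORE value itself: the date portion \d{2}\/\d{2}\/\d{2}\/\d{4} expects three slashes, while the header dates are MM/DD/YYYY (two slashes), and [a-zA-z] is a typo for [a-zA-Z]. Splunk evaluates these as PCRE, but the fix can be sanity-checked in Python against a header built like the sample:

```python
import re

# Header as it appears in the log: a 78-asterisk rule, then the date line.
sample = "*" * 78 + "\nMon 01/02/2023 09:00 AM"

broken    = r"\*{78}\s*[a-zA-z]{3}\s\d{2}\/\d{2}\/\d{2}\/\d{4}"   # from the post
corrected = r"\*{78}\s*[a-zA-Z]{3}\s\d{2}/\d{2}/\d{4}"            # MM/DD/YYYY

print(bool(re.search(broken, sample)))     # False: date pattern has an extra /\d{2}
print(bool(re.search(corrected, sample)))  # True
```

With the original pattern never matching, Splunk falls back to its default line merging, which would explain the seemingly random splits.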
Hi everyone, I have implemented Splunk for my architecture at work. To give you some context, the architecture is fairly big, consisting of 3 sites. I have 3 heavy forwarders in total (1 per site) and 2 indexers. The alert (forwarder ingestion latency) is showing on all the HFs and on the 2 indexers. I have reviewed all the recommendations given by the community as well as the Splunk documentation, with no luck. The issue is that some logs are now being lost. I have increased the queue size on all servers up to 100MB and turned ACK off. Still, I am facing issues. Does anyone have any recommendations that might be useful?
Hello everyone. I installed Splunk Enterprise free edition on an Ubuntu machine. I want to install the Splunk forwarder on a vulnerable machine and have it forward logs to the Ubuntu machine. I then want to attack the vulnerable machine from Kali Linux and examine the logs arriving in Splunk. Can you help me choose a vulnerable machine on which I can install the Splunk forwarder?
If there are events like these, I want to find the field names whose values contain "abc".

Event 1
File : abcdefg
URL : 1232323232.com
NUM : 1234567899
Name : James

Event 2
File : abcdefg
URL : 1232323232.abc
NUM : 1234567899
Name : James

File contains "abc" 2 times and URL contains "abc" 1 time. I want a result like this:

FieldName  Count
File       2
URL        1

How can I get this result with SPL?
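In SPL this is often tackled with `foreach *` to test every field, followed by a stats rollup. The underlying counting logic, sketched in Python with the two events from the post:

```python
from collections import Counter

events = [
    {"File": "abcdefg", "URL": "1232323232.com", "NUM": "1234567899", "Name": "James"},
    {"File": "abcdefg", "URL": "1232323232.abc", "NUM": "1234567899", "Name": "James"},
]

# For each event, bump the counter of every field whose value contains "abc".
counts = Counter()
for event in events:
    for field, value in event.items():
        if "abc" in value:
            counts[field] += 1

print(dict(counts))  # {'File': 2, 'URL': 1}
```

The per-field substring test is the part `foreach` expresses in SPL; the Counter is the stats step.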
Hello, In a cloud migration, can a Splunk Cloud search head be configured to search both its cloud data and legacy data on on-prem indexers?

Example: there is an on-prem index called 'index01' that contains historical data. The same index has been created in Splunk Cloud with 90 days of data. After switching the UFs to point to Splunk Cloud, is there a way to run a search in Splunk Cloud that searches the recent 90 days of data in the cloud plus the historical on-prem data?

In this scenario, it seems like it would be Splunk Cloud -> on-prem, but this blog does not list that option: https://www.splunk.com/en_us/blog/platform/introducing-splunk-federated-search.html
Hi, I'm trying to write an SPL query for a use case like alertname!="*pdm*" triggered by a user within roughly a 2-hour window. How could we achieve this using an eval expression?
We are having an issue with DBX when we try creating a new Output connection. In New Output, Table Schema Preview fails with "Table name can't be blank" even though it is not blank. Also, we can use SQL Explorer to view the same table, and that works fine. Any help getting past this step would be appreciated. We are using Splunk Enterprise version 8.2.6.1 and Splunk DB Connect version 3.12.1.
Hi, I'm trying to implement the "IP scanners scanning many IPs on a single port" use case in Splunk:

index=firewall sourcetype="firewall_cloud" dest_port="   "
| stats count by src_ip, dest_port
| where count > 3

I'm not sure which dest_port we need to use here, or whether we should use src_port instead. If needed, please edit the search. Thanks.
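As a side note, `stats count by src_ip, dest_port` counts events, whereas "many IPs on a single port" suggests counting distinct destinations, e.g. `stats dc(dest_ip) by src_ip, dest_port`. A hedged Python sketch of that distinct-count logic, with made-up traffic:

```python
from collections import defaultdict

# Hypothetical firewall events: one host sweeps five destinations on
# port 445, another host touches a single destination.
events = [
    {"src_ip": "10.0.0.9", "dest_ip": f"192.168.1.{i}", "dest_port": 445}
    for i in range(1, 6)
] + [
    {"src_ip": "10.0.0.5", "dest_ip": "192.168.1.1", "dest_port": 445},
]

# Count DISTINCT dest_ip per (src_ip, dest_port) -- dc(dest_ip) in SPL terms.
targets = defaultdict(set)
for e in events:
    targets[(e["src_ip"], e["dest_port"])].add(e["dest_ip"])

scanners = [key for key, dests in targets.items() if len(dests) > 3]
print(scanners)  # [('10.0.0.9', 445)]
```

Only the sweeping host crosses the threshold; a repeat connection to one destination would not, which is the behavior a scanner detection usually wants.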