Activity Feed
- Got Karma for Re: Case insensitive search in rex. Thursday
- Posted Re: Single Value Drilldown not working in Dashboard Studio on Splunk Enterprise. 02-25-2025 02:18 PM
- Posted Single Value Drilldown not working in Dashboard Studio on Splunk Enterprise. 02-25-2025 01:19 PM
- Got Karma for Re: Error in 'litsearch' command: Your Splunk license expired or you have exceeded your license limit too many times.. 12-09-2024 12:51 PM
- Posted Splunk DB connect indexing only 10k events per hour on Getting Data In. 08-28-2024 12:38 PM
- Got Karma for Re: Why are deployment clients not showing up in deployment server?. 07-29-2024 07:50 AM
- Got Karma for Re: Receiving the following error: "failed to start splunk.service: unit splunk.service not found". 07-18-2024 10:51 AM
- Got Karma for Re: How do I set a query to run overnight without it expiring before it completes?. 07-11-2024 03:42 PM
- Got Karma for Re: I am not recieving the logs of my linux machine. 06-04-2024 06:52 PM
- Got Karma for Re: I am not recieving the logs of my linux machine. 06-04-2024 05:11 PM
- Got Karma for Re: I am not recieving the logs of my linux machine. 06-04-2024 05:11 PM
- Got Karma for Re: I am not recieving the logs of my linux machine. 06-04-2024 05:11 PM
- Got Karma for Re: How can I convert column value to column?. 04-16-2024 06:21 AM
- Got Karma for Re: What does "notracking@example.com" mean in Splunk Add-on for Microsoft Cloud Services?. 04-03-2024 07:35 PM
- Got Karma for Re: JSON format - Duplicate value in field. 04-03-2024 02:05 AM
- Got Karma for Re: How can I identify real time searches?. 03-15-2024 12:52 AM
- Got Karma for Re: stats vs eventstats. 03-01-2024 09:58 PM
- Got Karma for Re: Datamodel Acceleration questions. 02-13-2024 08:27 AM
- Got Karma for Re: How to overwrite the host field value with dvc field value ?. 10-04-2023 06:52 AM
- Got Karma for Re: what actually dnslookup doing in my query? and what is it?. 09-26-2023 06:18 AM
Topics I've Started
02-25-2025
02:18 PM
Yes, this works. Do you know how to link this table dynamically to reports? I am trying to convert this to Dashboard Studio and I do not see an option to link reports dynamically. I know you can set a token and then a custom link, but the problem is that the values contain spaces, which are not recognized by default. $rn$ in the example below has spaces in it, and the URL does not recognize them in Dashboard Studio.

<table>
<title>test</title>
<search>
<query>index=_internal | stats count by savedsearch_name status
</query>
<earliest>$field1.earliest$</earliest>
<latest>$field1.latest$</latest>
</search>
<option name="drilldown">row</option>
<option name="link.inspectSearch.visible">0</option>
<option name="link.openSearch.visible">0</option>
<drilldown>
<set token="rn">$row.savedsearch_name$</set>
<link>
<![CDATA[/app/search/report?s=$rn$
]]>
</link>
</drilldown>
</table>
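On the spaces-in-token problem: in Simple XML at least, token filters can be applied at substitution time, and the `|u` filter URL-encodes the value (spaces become %20). A minimal sketch of the drilldown above using it — a direction to try in classic dashboards, not a confirmed Dashboard Studio answer:

```xml
<drilldown>
  <link>
    <!-- |u URL-encodes the token value when it is substituted into the link -->
    <![CDATA[/app/search/report?s=$row.savedsearch_name|u$]]>
  </link>
</drilldown>
```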
02-25-2025
01:19 PM
Hello, I am trying to drill down on a single value panel but it's not working. It looks simple, but I am not sure what I am doing wrong. Here is my source code:

{
"title": "Set Tokens on Click - Example",
"description": "",
"inputs": {
"input_global_trp": {
"options": {
"defaultValue": "-24h@h,now",
"token": "global_time"
},
"title": "Global Time Range",
"type": "input.timerange"
}
},
"defaults": {
"dataSources": {
"ds.search": {
"options": {
"queryParameters": {
"earliest": "$global_time.earliest$",
"latest": "$global_time.latest$"
}
}
}
}
},
"visualizations": {
"viz_column_chart": {
"containerOptions": {},
"context": {},
"dataSources": {
"primary": "ds_qBGlESX2"
},
"eventHandlers": [
{
"options": {
"tokens": [
{
"key": "row.method.value",
"token": "method"
}
]
},
"type": "drilldown.setToken"
}
],
"options": {},
"showLastUpdated": false,
"showProgressBar": false,
"title": "HTTP Request Method",
"type": "splunk.singlevalue"
},
"viz_pie_chart": {
"dataSources": {
"primary": "ds_c8AfQapt"
},
"title": "Response Codes for Method $method$",
"type": "splunk.pie"
}
},
"dataSources": {
"ds_c8AfQapt": {
"name": "Search_2",
"options": {
"query": "index=_internal method=$method$ | stats count by status"
},
"type": "ds.search"
},
"ds_qBGlESX2": {
"name": "Search_1",
"options": {
"enableSmartSources": true,
"query": "index=_internal method=GET | stats count by method"
},
"type": "ds.search"
}
},
"layout": {
"globalInputs": [
"input_global_trp"
],
"layoutDefinitions": {
"layout_1": {
"structure": [
{
"item": "viz_column_chart",
"position": {
"h": 400,
"w": 600,
"x": 0,
"y": 0
},
"type": "block"
},
{
"item": "viz_pie_chart",
"position": {
"h": 400,
"w": 600,
"x": 600,
"y": 0
},
"type": "block"
}
],
"type": "grid"
}
},
"tabs": {
"items": [
{
"label": "New tab",
"layoutId": "layout_1"
}
]
}
}
}
- Tags: Dashboard Studio
- Labels: using Splunk Enterprise
08-28-2024
12:38 PM
Hello, Splunk DB Connect is indexing only 10k events per hour at a time, no matter what setting I configure in inputs. The DB Connect version is 3.1.0, and db_inputs.conf is:

[ABC]
connection = ABC_PROD
disabled = 0
host = 1.1.1.1
index = test
index_time_mode = dbColumn
interval = 900
mode = rising
query = SELECT *\
FROM "mytable"\
WHERE "ID" > ?\
ORDER BY "ID" ASC
source = XYZ
sourcetype = XYZ:lis
input_timestamp_column_number = 28
query_timeout = 60
tail_rising_column_number = 1
max_rows = 10000000
fetch_size = 100000

When I run the query using dbxquery in Splunk, I do get more than 10k events. I also tried max_rows = 0, which should ingest everything, but it is not working. How can I ingest unlimited rows?
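One way to confirm the per-interval cap is to chart indexed events per hour for this input; a run-anywhere sketch using the index and sourcetype from the stanza above:

```
index=test sourcetype=XYZ:lis earliest=-24h
| timechart span=1h count
```

If every bucket tops out at exactly 10k, the limit is on the input side rather than in the query.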
- Tags: DB Connect
- Labels: heavy forwarder, inputs.conf, Linux, props.conf
05-04-2023
08:01 AM
Hello Splunkers,
I have an event like this:
blocked,Adware,ABCD,test.exe,\\program_files\c\Drivers\,,,Generic PUA JB,,Endpoint Protection
I am extracting fields using a comma delimiter, so my props.conf and transforms.conf are:
transforms.conf
[cs_srctype]
CLEAN_KEYS = 0
DELIMS = ,
FIELDS = action,category,dest,file_name,file_path,severity,severity_id,signature,signature_id,vendor_product
props.conf
[cs_srctype]
KV_MODE = none
REPORT-cs_srctype = cs_srctype
Now the output that I am getting is :
file_path = \\program_files\c\Drivers\,
severity=
severity_id= Generic PUA GB
signature=
signature_id= Endpoint Protection
vendor_product=
All the fields before file_path are extracted properly, and the fields after file_path are incorrect because a comma gets pulled into file_path and the later fields are not separated properly. How do I ignore the \, and extract the fields properly?
Thank you in advance
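Since the number of fields is fixed, one direction (if search-time extraction is acceptable) is a purely positional rex that splits on every comma regardless of backslashes; a run-anywhere sketch using the field names from the transforms stanza above (index name is a placeholder):

```
index=<your_index> sourcetype=cs_srctype
| rex "^(?<action>[^,]*),(?<category>[^,]*),(?<dest>[^,]*),(?<file_name>[^,]*),(?<file_path>[^,]*),(?<severity>[^,]*),(?<severity_id>[^,]*),(?<signature>[^,]*),(?<signature_id>[^,]*),(?<vendor_product>[^,]*)$"
```

Against the sample event this assigns file_path=\\program_files\c\Drivers\ and signature=Generic PUA JB, with the empty positions landing in severity, severity_id, and signature_id.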
- Labels: field extraction
06-09-2022
12:22 PM
1 Karma
Hello, what are you trying to achieve? Could you please give us the input and expected output in tabular format? Shouldn't it be:

| eval hours=if(day="Mon",hours+2,hours)
06-08-2022
11:47 AM
The SEDCMD script applies only to the _raw field at index time. With a regular-expression transform, you can apply changes to other fields: https://docs.splunk.com/Documentation/Splunk/latest/Data/Anonymizedata#Caveats_for_anonymizing_data You can try using evaluation functions as well:

| makeresults
| eval _raw="Process Create:true
Utc Time: 2022-04-28 22:08:22.025
Process Guid: {XYZ-bd56-5903-0000-0010e9d95e00}
Process Id: 6228
Image: chrome.exe
Command Line: test"
| rex field=_raw max_match=0 "(?<field>[a-zA-Z ]+):(?<value>.+)"
| mvexpand field
| eval field1=replace(field,"\s","_")

See if you can use calculated fields if it's not a multivalue field.
06-08-2022
10:37 AM
Hello, you can achieve this using SEDCMD scripts: https://docs.splunk.com/Documentation/Splunk/8.2.6/Data/Anonymizedata#Example_of_substitution_using_a_sed.2FSEDCMD_script Transforms.conf must be used for the extracted field, and SEDCMD for _raw. See the answer near the bottom of this thread for details: https://answers.splunk.com/answers/739964/need-sedcmd-help.html
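For reference, a SEDCMD substitution lives in props.conf and rewrites _raw at index time, before field extraction. A minimal sketch modeled on the linked anonymization example — the sourcetype and pattern here are hypothetical:

```
# props.conf -- sketch; sourcetype and pattern are hypothetical
[my_sourcetype]
# Rewrite anything matching ddd-dd-dddd in _raw to a masked value
SEDCMD-mask_ids = s/\d{3}-\d{2}-\d{4}/xxx-xx-xxxx/g
```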
04-12-2022
12:21 PM
Look at the Splunk documentation's "What's New" and fixed issues: https://docs.splunk.com/Documentation/Splunk/8.2.4/ReleaseNotes/MeetSplunk#What.27s_New_in_8.2.4 https://docs.splunk.com/Documentation/Splunk/8.2.6/ReleaseNotes/MeetSplunk#What.27s_New_in_8.2.6 There should not be a major difference.
04-12-2022
09:43 AM
You have to try something like this to make it work with lookups: https://community.splunk.com/t5/Splunk-Search/Using-CIDR-in-a-lookup-table/m-p/35787 Like/accept if it works for you!
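The gist of the linked thread is to give the lookup a CIDR match_type in transforms.conf, so that an exact IP in events matches a subnet in the table. A minimal sketch with a hypothetical lookup name, file, and columns:

```
# transforms.conf -- sketch; lookup name, file, and columns are hypothetical
[my_cidr_lookup]
filename = cidr_ranges.csv
# Match the cidr_range column as a subnet rather than an exact string
match_type = CIDR(cidr_range)
```

It would then be used like `| lookup my_cidr_lookup cidr_range AS src_ip OUTPUT zone`, assuming the CSV has cidr_range and zone columns.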
04-11-2022
09:52 PM
1 Karma
Put the below in props.conf:

[ssc_cloakware]
REPORT-extractions = field_extractions
EXTRACT-server = Server\s*:\s*(?<Server>[^\,]+)

This is a search-time field extraction, so make sure you write this on the SH. Or simply go to the search head: Settings » Fields » Field Extractions » Add new

Destination App: <Choose appropriate app>
Name: Server
Apply to: Sourcetype: <sourcetype_name>
Extraction/Transform: Server\s*:\s*(?<Server>[^\,]+)

Please upvote/accept to close this question if it works for you.
04-08-2022
07:58 AM
Hi, for the search to work you would have to write this:

<your base search>
| extract kvdelim=":" pairdelim=","
| rex "Server\s*:\s*(?<Server>[^\,]+)"

props.conf and transforms.conf are best put on the heavy forwarder if you have one, or on the indexing layer. The regex that I provided is for transforms only, and it works well for all the events that you have given: https://regex101.com/r/0rdToo/1 Use the below configuration on the HF or indexers.

props.conf
[ssc_cloakware]
REPORT-extractions = field_extractions

transforms.conf
[field_extractions]
REGEX = \s([^\:]+)\:\s+([^\,]+)
FORMAT = $1::$2

Restart the instance after editing the configuration. I am going to test this configuration on my lab instance; meanwhile, you can do the same.
04-08-2022
06:38 AM
Could you please give me the below details:
1) Please share more sample raw events.
2) Share the props and transforms that you have written.
3) Where did you write the props and transforms?
4) Have you restarted the Splunk instance after updating props and transforms?
04-07-2022
01:35 PM
Can you share your query with a few examples?
04-07-2022
01:32 PM
Well, that's what "type=left" does: it gives you results from the main search as well as the matching results from the subsearch. The above example is not matching because your ComputerName is different; for the subsearch it's PC44 and for the main search it's 4GV, which is why you see the date, src, and uri fields blank in the result. Look for the ones where ComputerName matches, and there you should see all the fields.
04-07-2022
01:28 PM
It is best to use "\s*" instead of "\s" if you are not sure whether there will be a space in future events.
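A run-anywhere sketch of the difference (the field name is illustrative): with \s* the extraction works whether or not a space follows the colon, while a literal \s would fail on the no-space case.

```
| makeresults
| eval _raw="Server:host1"
| rex "Server\s*:\s*(?<Server>\S+)"
| table Server
```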
04-07-2022
01:18 PM
Could you please share more sample events, as I do not see any error in your search. I have tried it in this run-anywhere search:

| makeresults
| eval _raw="store license for Store 123456
2022-04-07 19:17:44,360 ERROR path not found"
| rex field=_raw "Store\s123456\n\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\,\d{3}\s(?P<errortext>.*)path"
| stats count by errortext

From the regex I can see that you are searching for "Store 123456"; please add that to the main search instead of the regex:

index=* host="storelog*" "store license for Store 123456"

Also, is it a multiline event? That is, is the timestamp on a new line in the raw logs, or is it all one line?
04-07-2022
12:48 PM
Not sure if this is what you are looking for. The below search will give you results from the main search as well as matching results from the subsearch; if you are only interested in matching results, change to type=inner:

index="wineventlog" source="WinEventLog:Application"
| dedup ComputerName, ownerEmail, ownerFull, ownerName, ownerDept
| stats values(ownerEmail) as ownerEmail,values(ownerFull) as ownerFull, values(ownerName) ownerName, values(ownerDept) as ownerDept by ComputerName
| join type=left ComputerName
[ search index=traffic_log src="*" uri="*" site="friendly.org"
| eval date = date_month + "/" + date_mday + "/" + date_wday + "/" + date_year
| mvexpand date
| dedup src, uri
| lookup dnslookup clientip as src OUTPUT clienthost as ComputerName
| where like (ComputerName,"p%")
| dedup ComputerName
| stats values(src) as src, values(uri) as uri, values(date) as date by ComputerName]
04-07-2022
12:36 PM
Is this what you are looking for? Try this run-anywhere search:

| makeresults
| fields - _time
| eval Time="2022-04-07 07:00:11.028-EDT"
| eval time=strftime(strptime(Time, "%Y-%m-%d %H:%M:%S.%3Q-%Z"),"%Y-%m-%d %H:%M:%S.%3Q")
04-07-2022
07:57 AM
1 Karma
Hi, is it a multiline event? If yes, could you please post an example of an entire raw event.
04-06-2022
11:19 PM
1 Karma
Yes, you can, but it will hit the license meter twice, so you need to be cautious about that. Please see the below example:

[tcpout:uf1]
server = xxx.xxx.xxx.xxx:9997
disabled = false
[tcpout-server://xxx.xxx.xxx.xxx:9997]
[tcpout:uf2]
server=yyy.yyy.yyy.yyy:9997
disabled = false
[tcpout-server://yyy.yyy.yyy.yyy:9997]
04-06-2022
10:33 PM
1 Karma
It should be the same way you generally forward data to the indexing tier:

[tcpout]
defaultGroup = uf_tier

[tcpout:uf_tier]
server = uf1:9997,uf2:9997,...

Refer to: https://community.splunk.com/t5/Getting-Data-In/Sending-data-from-one-UF-to-other-UF/m-p/403838 https://docs.splunk.com/Documentation/SplunkCloud/latest/Forwarding/Configureforwarderswithoutputs.confd#Example
04-06-2022
10:21 PM
Hi, it's because you are using the wrong time format. Use this:

| eval time = strptime(Time,"%d/%m/%Y %H:%M:%S")

Accept/upvote if this helps!
04-06-2022
10:07 PM
I have updated the transform to accommodate the Server field. With the raw event that you gave, it should work now: https://regex101.com/r/dL6JPE/1
04-06-2022
09:42 PM
Server should be there as well; you can search for that field under all fields.

transforms.conf
[myplaintransform]
REGEX = \s([^\:]+)\:\s+([^\,]+)
FORMAT = $1::$2

props.conf
[sourcetype_name]
REPORT-a = myplaintransform

Accept/like if it works for you.