Activity Feed
- Posted Re: How to count events in a time frame based on a time elapsed field on Splunk Search. 06-08-2022 02:44 PM
- Karma Re: How to count events in a time frame based on a time elapsed field for yuanliu. 06-08-2022 02:44 PM
- Posted How to count events in a time frame based on a time elapsed field on Splunk Search. 06-08-2022 06:57 AM
- Posted Re: How to export the results of a Splunk search that contains transforming commands? on Splunk Search. 03-10-2022 06:00 AM
- Posted Re: How to export the results of a Splunk search that contains transforming commands? on Splunk Search. 03-10-2022 04:57 AM
- Posted How to export the results of a Splunk search that contains transforming commands? on Splunk Search. 03-09-2022 07:06 PM
- Tagged How to export the results of a Splunk search that contains transforming commands? on Splunk Search. 03-09-2022 07:06 PM
- Posted Re: Using mvappend within a cidrmatch macro on Splunk Search. 06-30-2021 02:14 PM
- Posted Using mvappend within a cidrmatch macro on Splunk Search. 06-30-2021 02:03 PM
- Karma Re: Simplify REGEX for gcusello. 11-03-2020 09:56 AM
- Posted [WinError 10061] Export data using the Splunk SDKs not working? on Splunk Dev. 11-03-2020 08:52 AM
- Posted Can you call a workflow action in Splunk (or call a WF action on multiple events) on Knowledge Management. 10-27-2020 08:32 AM
- Posted Re: Why does this chart work, but this table doesn't? on Splunk Search. 10-14-2020 08:30 AM
- Posted Re: Why does this chart work, but this table doesn't? on Splunk Search. 10-09-2020 07:32 AM
- Posted Why does this chart work, but this table doesn't? on Splunk Search. 10-07-2020 03:32 PM
- Posted Re: Can the precision of numerical results be changed in the search? on Splunk Search. 10-07-2020 09:23 AM
- Posted Can the precision of numerical results be changed in the search? on Splunk Search. 10-06-2020 02:43 PM
06-08-2022
02:44 PM
Thanks, I eventually came to something similar! I think this is the solution I am after, unless you can spot a hole in the logic.
```
| eval seconds_taken = time_taken/1000
| eval responded = _time, requested = _time - seconds_taken
| where requested <= responded AND seconds_taken >= 0
| timechart count span=1s
```
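If it helps to sanity-check that logic, Splunk also ships a `concurrency` command that counts how many events overlap each event's start time, given a duration field in seconds. A minimal sketch under the same assumptions as above (`time_taken` in milliseconds, `_time` marking completion), intended for comparison rather than as a drop-in replacement:
```
| eval seconds_taken = time_taken/1000
| eval requested = _time - seconds_taken
| concurrency duration=seconds_taken start=requested output=open_requests
| timechart span=1s max(open_requests) AS open_requests
```
If the two approaches disagree, the difference is a good place to look for the hole in the counting logic.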
06-08-2022
06:57 AM
What is the best approach to creating a field that shows the number of incomplete requests in a given period of time? For the machine in question, events are logged when it completes the Request-Response Loop. I have a field `time_taken` which shows, in milliseconds, how long the Request-Response Loop took. I have already done the following; now how do I evaluate the total number of `open_requests` for each second?
```
| eval responded = _time
| eval requested = _time - time_taken
| eval responded = strftime(responded ,"%Y/%m/%d %H:%M:%S")
| eval requested = strftime(requested ,"%Y/%m/%d %H:%M:%S")
| eval open_requests = ???
| table _time open_requests
| sort - _time
```
03-10-2022
06:00 AM
The instance of "Stream=True" that you are referring to is not related to any part of the request to Splunk's REST API. It is part of the Python requests module/library: if the response is available as a stream, it reads the response as a stream. Further into the script this allows the results to be written "live" as they are returned... instead of as one massive 2,924,899-line file. In essence, this is a band-aid for the issue I am actually asking about in my question.
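For anyone reading along, a minimal sketch of that band-aid, using the requests library's streaming mode to write rows as they arrive. The host, session key, and search string below are placeholders, not the values from the actual script:
```
import requests

baseurl = "https://splunk.example.com:8089"           # placeholder host
headers = {"Authorization": "Splunk <sessionKey>"}    # placeholder session key
parameters = {"exec_mode": "oneshot", "output_mode": "json", "count": 0}
search_query = "search index=_internal | stats count BY sourcetype"

with requests.post(
    baseurl + "/services/search/jobs/export",
    params=parameters,
    data={"search": search_query},
    headers=headers,
    verify=False,
    stream=True,   # read the HTTP response incrementally instead of buffering all of it
) as response:
    with open("results.json", "w", encoding="utf-8") as out:
        # iter_lines() yields each line of the streamed response as it arrives,
        # so the output file grows "live" rather than being written in one pass.
        for line in response.iter_lines(decode_unicode=True):
            if line:
                out.write(line + "\n")
```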
03-10-2022
04:57 AM
The trouble is not the timeout. When the Splunk search is complete, the results are successfully moved into an output file. I am interested in learning how to return only the final results of ~300,000 rows from Splunk. The current URI path and parameters are producing an output of more than 1,924,900 rows, which I take as an indication that Splunk is streaming "live" results when I only want the final ones.
03-09-2022
07:06 PM
I am looking to export the results of a Splunk search that contains transforming commands. When I run the same search in the web GUI, the live results "hang" at 50,000 statistics, but once the search is complete it shows more than 300,000 (screenshots provided below). Using the Splunk API, I want to export all results in .json format, and I only want to view the final results; I do not want to view the results as they are streamed. In essence, I want to avoid the API returning any row where `"preview":true`. What am I missing?

Screenshots: "While performing search" and "Finished results"

Using Python 3.9's requests, my script contains the following:
```
headers={'Authorization': 'Splunk %s' % sessionKey}
parameters={'exec_mode': "oneshot", 'output_mode':output_type, 'adhoc_search_level':'fast', 'count':0}
with post(url=baseurl + '/services/search/jobs/export', params=parameters, data=({'search': search_query}), timeout=60, headers=headers, verify=False, stream=True) as response:
```
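For comparison, a sketch of the job-then-results pattern, which only ever returns finished rows: create a normal search job, poll it until it reports done, then fetch /services/search/jobs/&lt;sid&gt;/results. The host, session key, and search string are placeholders, and this is not presented as a drop-in replacement for the script above:
```
import time
import requests

baseurl = "https://splunk.example.com:8089"           # placeholder host
headers = {"Authorization": "Splunk <sessionKey>"}    # placeholder session key
search_query = "search index=_internal | stats count BY sourcetype"

# 1. Create a normal (non-export) search job and capture its sid.
created = requests.post(
    baseurl + "/services/search/jobs",
    params={"output_mode": "json"},
    data={"search": search_query},
    headers=headers, verify=False,
)
sid = created.json()["sid"]

# 2. Poll the job until it reports isDone; only then do final results exist.
while True:
    content = requests.get(
        baseurl + "/services/search/jobs/" + sid,
        params={"output_mode": "json"},
        headers=headers, verify=False,
    ).json()["entry"][0]["content"]
    if content.get("isDone"):
        break
    time.sleep(2)

# 3. Fetch the finished result set; count=0 asks for all rows.
final = requests.get(
    baseurl + "/services/search/jobs/" + sid + "/results",
    params={"output_mode": "json", "count": 0},
    headers=headers, verify=False,
).json()
print(len(final["results"]))
```
Because the results are only requested after the job reports isDone, no preview rows can show up in the output.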
06-30-2021
02:14 PM
Never mind. Figuring it out just required taking a step back and reading the docs again.
```
| eval subnet = mvappend(case(cidrmatch("$ip1$/24",src_ip), "$output_name$", cidrmatch("$ip2$",src_ip), "$output_name$"),subnet)
```
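For anyone who finds this later, a quick sketch of how the appending version behaves when the macro is called twice; the index name and argument values are made up:
```
index=netflow
| `subnet(10.1.1.0, 10.1.2.0/24, site_a)`
| `subnet(10.2.1.0, 10.2.2.0/24, site_b)`
| stats count BY subnet
```
Because each call wraps its case() in mvappend(..., subnet), the second call appends to whatever the first call already stored in subnet instead of overwriting it.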
06-30-2021
02:03 PM
I already have the following macro `subnet(3)` defined as:
```
| eval subnet = case(cidrmatch("$ip1$/24",src_ip), "$output_name$", cidrmatch("$ip2$",src_ip), "$output_name$")
```
If I call the macro multiple times in the same search, the value of the field it creates (also called subnet) is overwritten by the latest values. I would like to edit the macro so that calling it multiple times appends a new value to subnet. How could I use mvappend, or another command, to accomplish this?
Labels: eval
11-03-2020
08:52 AM
Apologies if these are very basic questions, but I am new to the API and the SDK. I am running the script below following the guidelines provided in the documentation (https://docs.splunk.com/Documentation/Splunk/8.1.0/Search/ExportdatausingSDKs), but I am getting the following error. Can anyone point me in the correct direction?

ConnectionRefusedError: [WinError 10061] No connection could be made because the target machine actively refused it

```
import splunklib.client as client
import splunklib.results as results

HOST =
PORT = 8089
USERNAME =
PASSWORD =

service = client.connect(
    host=HOST,
    port=PORT,
    username=USERNAME,
    password=PASSWORD)

rr = results.ResultsReader(service.jobs.export("search index=_internal earliest=-1h | head 5"))
for result in rr:
    if isinstance(result, results.Message):
        # Diagnostic messages might be returned in the results
        data = (result.type, result.message)
        string_format = "%s:%s"
        print(string_format % data)
    elif isinstance(result, dict):
        # Normal events are returned as dicts
        print(result)
assert rr.is_preview == False
```
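Side note rather than an answer: [WinError 10061] means nothing accepted the TCP connection at HOST:PORT, so it is usually worth confirming that splunkd's management port (8089 by default, separate from Splunk Web) is reachable before debugging the SDK code itself. A minimal check, with a placeholder hostname:
```
import socket

HOST = "splunk.example.com"   # placeholder - the same value passed to client.connect()
PORT = 8089                   # splunkd management/REST port, not the Splunk Web port

try:
    # Attempt a plain TCP connection to the management port.
    with socket.create_connection((HOST, PORT), timeout=5):
        print("Management port is reachable.")
except OSError as err:
    print("Cannot reach %s:%s - %s" % (HOST, PORT, err))
```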
10-27-2020
08:32 AM
I created a workflow action to perform a reverse IP lookup using the link method GET. I would like to perform this action on multiple events. Is there a way to call this action on specific events within the search (not using the GUI)? Is there a way to call this action on multiple events (with or without the GUI)? So many thanks!
Labels: workflow action
10-14-2020
08:30 AM
I solved this with a workaround that may not be the most "splunkable" solution, but it gives the results I am looking for.
```
| eval splitfield=stocks+"_pct"
| stats sum(eval(flow*100)) AS pct BY day_hour splitfield
| table day_hour pct splitfield
| eval {splitfield}=pct
| fields - splitfield, pct
| stats values(*) AS * BY day_hour
| fillnull
| addtotals
```
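For reference, Splunk's xyseries command performs this row-to-column pivot directly, so the {splitfield} token trick can likely be collapsed. A sketch on the same field names, worth verifying against the workaround above:
```
| eval splitfield=stocks+"_pct"
| stats sum(eval(flow*100)) AS pct BY day_hour splitfield
| xyseries day_hour splitfield pct
| fillnull
| addtotals
```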
10-09-2020
07:32 AM
Thank you for your reply @renjith_nair but I am trying to create a table. The chart command I am using creates the intended format... but the table command does not. Do you know why this is happening?
10-07-2020
03:32 PM
I would like to apply a formula to each of the values in the field "stocks". I have been able to show this in a chart, but I need it as a table... what is going on here? The values in day_hour and stocks are strings. Flow is a numeric value. Pct should be a numeric value.
```
| chart sum(eval(flow*100)) AS pct BY day_hour stocks
```
The charting command produces the following. This is how I want my table to look.

day_hour | stock_name_A | stock_name_B | stock_name_C
---|---|---|---
2020-01-01 00:00 | | |
2020-01-01 01:00 | | |
2020-01-01 02:00 | | |

Instead, my table looks like this:

day_hour | stocks | pct
---|---|---
2020-01-01 00:00 | stock_name_A |
2020-01-01 00:00 | stock_name_B |
2020-01-01 00:00 | stock_name_C |
2020-01-01 01:00 | stock_name_A |
2020-01-01 01:00 | stock_name_B |
2020-01-01 01:00 | stock_name_C |
2020-01-01 02:00 | stock_name_A |
2020-01-01 02:00 | stock_name_B |
2020-01-01 02:00 | stock_name_C |
10-07-2020
09:23 AM
Thank you for trying to respond, but this doesn't answer the question. The closest answer appears to be the tostring function, but it also changes the datatype.
```
… | stats sum(eval(sc_bytes/1073741824)) AS Gigabytes BY date
| eval Gigabytes = tostring(Gigabytes, "commas")
```
10-06-2020
02:43 PM
Let's say you have the following search:
```
... | stats sum(eval(sc_bytes/1073741824)) AS Gigabytes BY date
```
The resulting values in the Gigabytes column may have many characters after the decimal point. In a results table or a dashboard one may format the values with commas or define precision in order to make the information easier to read at a glance. Is there a way to change how these values are displayed without changing the underlying information from the search? I know the following may be used to convert the values to a string, but is there a way to change the way these values are displayed without changing the number - perhaps you want to store it for later formulas?
```
... | stats sum(eval(sc_bytes/1073741824)) AS Gigabytes BY date
| eval Gigabytes=printf("%.4f",Gigabytes)
```
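One display-only pattern worth noting: fieldformat changes how a field is rendered in results while the stored value keeps its full precision for later calculations, unlike eval with printf or tostring. A sketch on the same search:
```
... | stats sum(eval(sc_bytes/1073741824)) AS Gigabytes BY date
| fieldformat Gigabytes = printf("%.4f", Gigabytes)
```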
09-30-2020
04:05 PM
@nick405060 This works for me, but my global variable is the sum of a very lengthy search. For as long as the search that creates the global variable is running, the search that uses it starts, cancels itself, and then starts again, over and over and over. All searches eventually run as expected, but it's not pretty to look at. Is there a way to hold the dependent search from starting until the global variable has been created?
09-28-2020
08:17 AM
Several months back I created a macro with the following regular expressions to "clean up" and concatenate several strings that I often use. Is there a website or tool that would help me understand regex so that I can figure out how to simplify the search string? My goal is to speed up the search. I think eliminating the redundant rex commands would help, but if there is an even better solution I want to know what it is. The macro currently contains the following:
```
| eval source_clean=source
| rex field=source_clean mode=sed "s/\\\u_\S+//g"
| rex field=source_clean mode=sed "s/^[^\\\]*\\\//"
| rex field=source_clean mode=sed "s/^[^\\\]*\\\//"
| rex field=source_clean mode=sed "s/^[^\\\]*\\\//"
| lookup Source-Lookup.csv source AS source_clean OUTPUT web_domain
| eval pages = web_domain+cs_uri_stem
```
I do not have access to the lookup table, which would otherwise allow me to add slashes to the `source` column as a way to eliminate the need for lines 3-5.
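As a starting point for the simplification, lines 3-5 each strip one leading backslash-delimited segment, so a repetition quantifier can potentially do all three in a single rex. Backslash escaping inside SPL sed expressions is easy to get wrong, so treat this as a sketch to test rather than a verified replacement:
```
| eval source_clean=source
| rex field=source_clean mode=sed "s/\\\u_\S+//g"
| rex field=source_clean mode=sed "s/^([^\\\]*\\\){3}//"
| lookup Source-Lookup.csv source AS source_clean OUTPUT web_domain
| eval pages = web_domain+cs_uri_stem
```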
09-02-2020
09:25 AM
Thank you for the reply, but it doesn't really help. I am looking to create a Gantt-style chart without incorporating additional plugins or apps.
09-02-2020
08:14 AM
I am looking to visualize the start and end time of events by IP within a very narrow time frame. The attached image shows what I imagine the visualization would look like. I guess this would use the horizontal bar chart? Can one create this type of visualization in Splunk without additional plugins?
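For context, one plugin-free approximation sometimes suggested is a stacked horizontal bar chart in which the first series is each IP's offset from the left edge of the chart (hidden via dashboard styling) and the second series is its duration. The search side of that might look roughly like the sketch below; src_ip and the "@d" left edge are assumptions:
```
| stats earliest(_time) AS start latest(_time) AS end BY src_ip
| eval offset = start - relative_time(now(), "@d")
| eval duration = end - start
| sort - start
| table src_ip offset duration
```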
Tags: gantt
Labels: timechart, trellis layout
09-01-2020
12:45 PM
This is great, but how are you the poster with the correct answer to every one of my questions!? 😂
09-01-2020
12:43 PM
Hello @kkrishnan_splun and @niketn This is amazingly helpful and seems like a great way to add one of the best features of Tableau to Splunk's dashboards. I have three questions that I will need to answer before I suggest this to my organization: Would this JS file need to be updated every time a new dashboard or panel is created? Could one, in theory, just write the JS file for 10 panels...
```
<panel id="tooltip_panel1">
<panel id="tooltip_panel2">
<panel id="tooltip_panel3">
... ...
```
```
//On mouseover() event set the show token for the Tooltip
$('#tooltip_panel1').on("mouseover",function(){
    var tokens = mvc.Components.get("submitted");
    tokens.set("tokToolTipShow1", "true");
});
//On mouseout() event unset the show token for the Tooltip to hide the same.
$('#tooltip_panel1').on("mouseout",function(){
    var tokens = mvc.Components.get("submitted");
    tokens.unset("tokToolTipShow1");
});
$('#tooltip_panel2').on("mouseover",function(){
    var tokens = mvc.Components.get("submitted");
    tokens.set("tokToolTipShow2", "true");
});
//On mouseout() event unset the show token for the Tooltip to hide the same.
$('#tooltip_panel2').on("mouseout",function(){
    var tokens = mvc.Components.get("submitted");
    tokens.unset("tokToolTipShow2");
});
... ...
```
...and the option to use the extra XML would then be available -- in any Dashboard -- for any panel named tooltip_panelx? If the naming convention was panelx (as in the example) and not tooltip_panelx (as in my last question), would this change have any effect on the usability or performance of other panels with the id="panel1" that are not embedding the additional XML for these tooltips? Thank you,
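A quick sketch related to the second question above: the per-panel handlers differ only by index, so the same mvc/jQuery calls could be registered in a loop for however many tooltip_panelN ids exist. This assumes the same require() context as the original extension and is untested:
```
// Register mouseover/mouseout token handlers for tooltip_panel1 through tooltip_panel10.
for (var i = 1; i <= 10; i++) {
    (function (n) {
        var tokens = mvc.Components.get("submitted");
        $("#tooltip_panel" + n).on("mouseover", function () {
            tokens.set("tokToolTipShow" + n, "true");
        });
        $("#tooltip_panel" + n).on("mouseout", function () {
            tokens.unset("tokToolTipShow" + n);
        });
    })(i);
}
```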
08-25-2020
08:58 AM
The following search works well enough, but I would like the color of the "bubbles" to be based on sc_status="200" or sc_status!="200". I still want to show a bubble for all of the cs_uri_stem values. In theory, if every cs_uri_stem has at least one event with status 200 and at least one event with something else, this could double the number of rows in the output table.
```
...base search...
| stats avg(eval(time_taken)) AS avg_tt, avg(eval(sc_bytes)) AS avg_bytes,
    count(eval(source)) AS NumTransactions BY cs_uri_stem
| table cs_uri_stem, avg_tt, avg_bytes, NumTransactions
| rename avg_bytes AS "Average Bytes Returned" avg_tt AS "Average Time in Milliseconds" NumTransactions AS "# of Transactions"
```
Can this be accomplished in the dashboard's XML? Can this also be accomplished with an eval statement in the search itself?
Labels: CSS, panel, simple XML, table
08-25-2020
08:34 AM
I would like to create a new field, FlagSC, based on the value of sc_status. The new field should have a value of "OK" when the status is 200, or a value of "Other" for all other statuses. I intend to use this in a bubble chart with colors based on FlagSC. In theory, if every cs_uri_stem has at least one event with status 200 and at least one event with something else, this could double the number of rows in the output table. I have tried variations of the code below:
```
...base search...
| stats values(eval(if(sc_status==200,"OK","Other"))) AS FlagSC,
    avg(eval(time_taken)) AS avg_tt,
    avg(eval(sc_bytes)) AS avg_bytes,
    count(eval(source)) AS NumTransactions
    BY cs_uri_stem
| table FlagSC, avg_tt, avg_bytes, NumTransactions
| rename avg_bytes AS "Average Bytes Returned" avg_tt AS "Average Time in Milliseconds" NumTransactions AS "# of Transactions"
```
Ultimately, the goal is to have something that might resemble the following and does NOT include any rows where FlagSC is "OKOther":

cs_uri_stem | FlagSC | avg_tt | avg_bytes | NumTransactions
---|---|---|---|---
foo/ | OK | ... | ... | ...
foo/ | Other | ... | ... | ...
bar/ | OK | ... | ... | ...
bar/ | Other | ... | ... | ...
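One way to get exactly one row per cs_uri_stem/FlagSC pair, rather than a multivalue "OKOther", is to eval the flag before the stats and then group by it. A sketch using the same field names as above:
```
...base search...
| eval FlagSC = if(sc_status==200, "OK", "Other")
| stats avg(time_taken) AS avg_tt,
    avg(sc_bytes) AS avg_bytes,
    count AS NumTransactions
    BY cs_uri_stem FlagSC
| table cs_uri_stem, FlagSC, avg_tt, avg_bytes, NumTransactions
| rename avg_bytes AS "Average Bytes Returned" avg_tt AS "Average Time in Milliseconds" NumTransactions AS "# of Transactions"
```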
08-19-2020
06:36 AM
The head command appears to work correctly, but the results do not match up. In the attached screenshot the values that have the greatest value in GB do not have the greatest value in Bytes.
08-18-2020
02:17 PM
I have four versions of a nearly identical search. The last one returns a completely different result. What is it about the interaction of the "sort" and "head" commands that changes the outcome?
```
...| stats sum(eval(sc_bytes/1073741824)) AS Gigabytes by cs_uri_stem | sort -sc_bytes
...| stats sum(eval(sc_bytes/1073741824)) AS Gigabytes by cs_uri_stem | sort -Gigabytes
...| stats sum(eval(sc_bytes/1073741824)) AS Gigabytes by cs_uri_stem | sort -Gigabytes | head 100
...| stats sum(eval(sc_bytes/1073741824)) as Gigabytes by cs_uri_stem | sort -sc_bytes | head 100
```
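For what it's worth, the likely reason the fourth variant behaves differently is that sc_bytes does not survive the stats command (only Gigabytes and cs_uri_stem do), so `sort -sc_bytes` orders by a missing field and `head 100` then keeps whatever the first 100 rows happen to be. A sketch that sorts on a field that still exists, and keeps the raw byte totals alongside the derived gigabytes so the two columns stay consistent:
```
... | stats sum(sc_bytes) AS Bytes BY cs_uri_stem
| eval Gigabytes = round(Bytes/1073741824, 2)
| sort - Bytes
| head 100
```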
07-31-2020
12:56 PM
That produces a result, but it doesn't help me very much. I need to perform this function on all the events in the dataset and use the percentage later in another function. Can you tell me how to find the percentage for all events and save the result?