All Topics



I created a custom regex to extract a numeric value called "window size", which varies from positive to negative, and I want to display hosts by IP. I'm trying to figure out the best command (chart, stats, etc.). I really want all hosts on a line graph with their unique window sizes. I'm not sure if I have to use trellis to accomplish this, but I was hoping to make each line a host IP address and possibly have the x-axis represent the window sizes available, with the up/down spikes in window size being shown. I already have my two fields; I just can't figure out how to display the data correctly in a visualization. NOTE: whenever I do "chart count", the count gets in my way because it takes over the value being plotted and I really don't know how to format it... I need hosts to "dip up and down" with their values. Thanks in advance!
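A sketch of one way to get a line per host without "chart count" getting in the way — assuming the extracted fields are named window_size and host_ip (substitute your actual field names), timechart with an aggregate of the numeric field puts time on the x-axis and draws one line per host:

```
index=my_index sourcetype=my_sourcetype
| timechart span=5m avg(window_size) BY host_ip
```

Each host_ip becomes its own series, and the line rises and falls with the window size; avg() can be swapped for max() or latest() depending on which value you want per time bucket.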
Sample event:

{ durationMs: 83 properties: { url: https://mywebsite/v1/organization/41547/buildings } correlationId: e581d476-fa5f-4023-a53e-53d6e06734ae }

I want to replace the ids so the URL becomes https://mywebsite/v1/organization/{id}/buildings. I tried:

{base search string} | eval endpoint=replace(properties.url, "\d+", "{id}") | stats by endpoint

This returns no results, but if I try the coorelationId field on the root level:

{base search string} | eval endpoint=replace(coorelationId, "\d+", "{id}") | stats by endpoint

this returns what I expected:

endpoint | (other fields)
adb{id}f{id}-{id}fd{id}-{id}-a{id}b-{id}c{id}f{id}d | (other fields)
aea{id}e{id}c-fcdc-{id}-a{id}-{id}a{id}bfe{id}ee{id} | (other fields)

Why doesn't replace work on the nested field?
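One thing worth checking (a sketch, not tested against this data): in eval expressions, a field name containing a dot has to be wrapped in single quotes; otherwise properties.url is not resolved as a field reference and replace() receives null, which would explain the empty result. For example:

```
{base search string}
| eval endpoint=replace('properties.url', "\d+", "{id}")
| stats count by endpoint
```

Single quotes in eval mean "field reference", double quotes mean "string literal"; plain top-level names like correlationId work unquoted only because they contain no special characters.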
I'm trying to set up Splunk on our network. We must use a proxy to access the internet. I've set (I've tried with and without sslVersions):

[sslConfig]
sslRootCAPath = /etc/pki/tls/cert.pem
sslVersions = tls1.2

[applicationsManagement]
sslVersions = tls1.2

[proxyConfig]
http_proxy = http://PROXY:8080
https_proxy = http://PROXY:3128
no_proxy = 127.0.0.0/8,::1,localhost,10.0.0.0/8,192.168.0.0/16,.nwra.com

splunkd reports:

11-16-2022 11:36:34.092 -0800 ERROR HttpClientRequest [50124 TcpChannelThread] - HTTP client error =error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol while accessing server=http://PROXY:3128 for request=https://cdn.splunkbase.splunk.com/media/private/signed_4240_20873_1668244830.tgz?response-content-disposition=attachment%3Bfilename%3D%22infosec-app-for-splunk_170.tgz%22&Expires=1668628893&Signature=Ks6QSvwm3FOjimXq42aW-xSdBeysPA1gYrQlQu0Urpf-R7XfnVyQnF8ChIlT4blEJ38jq-1Iy9vYopkI5MvZoccqJLsbv~fe8peAxgIDHABo0kGLacXoXgiYEE5MGxMmBlBcvA54dwr4xqdmo69zxl6FhfGxHBfi6KUAZ6zgrv0RlZNz7uQR95cmTpjPbtwlDDbw8IeUE4~NEDnNhRwAqD3mKiSHhfGYEgDF5kQMEHgkm2csRMyJ7i4qRMscF~dUeqjvrN0P1W~NfL8vykYTHWMXqoeY1OVFliRXzfhqjwcCw8GtQgCcTWT7WOrHLfhZNJR-nJ9kf786SLqgNVQUXA__&Key-Pair-Id=K3GLBBC7R7U34X.
I can download that URL fine from the machine directly:

https_proxy=http://PROXY:3128 curl 'https://cdn.splunkbase.splunk.com/media/private/signed_4240_20873_1668244830.tgz?response-content-disposition=attachment%3Bfilename%3D%22infosec-app-for-splunk_170.tgz%22&Expires=1668627891&Signature=aA-kU~xxaEcPSU~A3fY4tPEY2mzdfDNN-T4I~RF3bEFfqJB8u2-K7ia8IEMP~uqxqWQhGCKr2oBRC3qQqdsa2-vwz8yzvNgIPcwI5VFEjjFBs1yZu-0k91sOjFgbiCx3z2FetbSm2K05FOCCN2GCxrJacpjSCz9kPJdFrnsZRDgrdX9vHsC62Fn60OWt0IgRS3qoXKdHHWXct5-RFUciKoOFWX8Hdp4ZGXe~xx3UGhqkonqV-ZE~Nt34beC~J5SGdvTS8mZcr7bZKL9M4fefGRtHiVzdK8ffuqCe5Fsthoyyl8OHr4MJyTptHLcwZKJhthqee80hyrlPYyGVgiEeyQ__&Key-Pair-Id=K3GLBBC7R7U34X' -o /tmp/out

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   114  100   114    0     0    139      0 --:--:-- --:--:-- --:--:--   139

Both the Splunk server and the proxy are running EL 8.7.
How can I run a report against all the servers with the machine agent installed but not linked to an application?
I'm working on troubleshooting why the Splunk Add-On is not ingesting data into our Splunk Cloud environment (version 9.0), and I noticed we have received a warning stating that the add-on is not compatible with Python 3. We followed Splunk's documentation (Link: Splunk Documentation) and are still not having any success. As a side note, we were unable to locate the permission below; however, I'm not 100% sure that is the entire cause:

ReportingWebService.Read.All - Read Message Trace data - Microsoft Reporting WebService

Would the incompatibility with Python 3 cause issues with ingesting data?
Hi all, I'm attempting to develop a regex that will pick up on a value contained in [ ] brackets (see below):

Log value: year number time:time:time 00 AAA0 Blah Blah Blah Blah Blah: [X] to [Y] (4 possible variables X, Y, A, B)

I need to alert every time the "* to [bracketed value]" changes, so I'm trying to make a regex that picks out these bracketed values. Any help is appreciated!
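A sketch of a rex that captures both bracketed values into fields — the capture-group names from_value and to_value are placeholders, and the pattern assumes the literal word "to" sits between the brackets as in the sample line:

```
... | rex "\[(?<from_value>[^\]]+)\]\s+to\s+\[(?<to_value>[^\]]+)\]"
```

[^\]]+ matches everything up to the closing bracket, so it captures whichever of X, Y, A, or B appears; the extracted fields can then feed a search that alerts when to_value changes.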
Hello guys! This is my first post, so sorry if the title is not as specific as it should be. We have an order tracking report here. The first status is "label_created" at 10:02. Later, a new status "arrived_at_facility" is added, and even though that's the latest one, "label_created" is superimposed on top of it. This continues on and on: the tracking statuses arrive as normal, but "label_created" keeps being moved up as the latest one. So our tracking report always takes "label_created" as the latest status, instead of something like "in_transit". Any ideas of what could be wrong with our logs? Thanks in advance, guys. If there's any additional info you need, ask away.
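If the report picks the latest status by event time, one quick check is to compare timestamps per status — a sketch assuming the fields are named tracking_id and status (adjust to your actual names):

```
index=tracking sourcetype=order_events
| stats max(_time) AS last_seen BY tracking_id status
| sort tracking_id - last_seen
```

If label_created shows the newest last_seen even after other statuses arrive, the source is re-emitting (or re-timestamping) that event, which would explain why latest(status) keeps returning it.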
Hello Splunkers, we have run into several issues, primarily with getting data into Splunk over HTTP Event Collectors. It appears that we need to replace our self-signed certificate with one whose root CA has been applied to our Splunk instance. We are trying to determine what impact updating the cert across our entire environment could have. Adding a cert to Splunk Web does not push down to the HTTP collectors; they were still using the self-signed certificate, so it appears adding a new certificate to the cluster is required. This will be my first time updating the certificate across the entire environment, so feel free to provide any advice or doc pages that could assist. Documentation we are currently using: https://docs.splunk.com/Documentation/Splunk/9.0.2/Security/ConfigureandinstallcertificatesforLogObserver
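For reference, HEC does not use the Splunk Web certificate: the collector reads its own TLS settings from the [http] stanza in inputs.conf, which is why updating Splunk Web had no effect on the collectors. A sketch of the relevant stanza — the path and password are placeholders for your own values:

```
# inputs.conf on the instances running the HEC listener
[http]
enableSSL = 1
serverCert = /opt/splunk/etc/auth/mycerts/hec_server_cert.pem
sslPassword = <private key password>
```

In a distributed environment this typically has to be deployed to every indexer or heavy forwarder that terminates HEC traffic, not just the search head.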
I'm trying to get these UUID/GUIDs to extract from the message field. I'm hoping to create a rex to extract everything after 'fieldx: ' in the 8-4-4-4-12 character pattern, separated by commas after that. I've tried "extract new fields", but there are well over 120 of these things; Splunk doesn't like selecting all of that, and filtering keeps throwing errors. I'd rather not do this one by one. These are embedded in the message field, as stated earlier. I'd like to make a new field with the rex if possible and name it "fieldx". Any and all help is welcome.

"message: Filtered marking ids for DAC property 'fieldx': abc12345-b123-c456-d789-123abx789edc, de14fc5e-22av-87dd-65d9-7563a7pleqw3, " (<-- there are about 120 more of these in a row)

Thanks in advance!
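A sketch of a rex that pulls every 8-4-4-4-12 token into one multivalue field in a single pass — note the sample values contain non-hex letters (x, v, p...), so the character class below is [0-9a-zA-Z] rather than strict hex, and max_match=0 keeps all ~120 matches instead of just the first:

```
... | rex field=message max_match=0 "(?<fieldx>[0-9a-zA-Z]{8}-[0-9a-zA-Z]{4}-[0-9a-zA-Z]{4}-[0-9a-zA-Z]{4}-[0-9a-zA-Z]{12})"
```

fieldx then holds all the ids from the event as a multivalue field; append | mvexpand fieldx if you want one row per id.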
Hi, is it possible to add a task to a phase of a workbook in a particular container via an API call? Thanks for the help.
I am currently trying to set up the Splunk_SA_CIM application, but it displays "An error occurred fetching assets. Please try again." without any additional indications. Splunk_SA_CIM was installed with our Enterprise Security and is currently at version 4.18.0. I already checked the documentation, and my user has the accelerate_datamodel capability. I didn't find any resource online that could indicate what is wrong...
Hi, I am trying to improve my dashboard by moving all of the different searches to a single base search (or a couple of them) and then post-processing. I did what I saw in the forum and the documentation, but it didn't give any results.

Original code (which gives results):

<form>
  <label>Emulation run analysis</label>
  <fieldset submitButton="false" autoRun="true">
    <input type="time" token="TimeRangePkr" searchWhenChanged="true">
      <label>Time Range</label>
      <default>
        <earliest>-7d@h</earliest>
        <latest>now</latest>
      </default>
    </input>
    <input type="dropdown" token="steppingToken" searchWhenChanged="true">
      <label>Stepping</label>
      <choice value="*">All</choice>
      <default>*</default>
      <fieldForLabel>stepping</fieldForLabel>
      <fieldForValue>stepping</fieldForValue>
      <search>
        <query>index=validation_bigcore aa_data_source="core_emu_run_info" | stats count by stepping</query>
        <earliest>-7d@h</earliest>
        <latest>now</latest>
      </search>
    </input>
  </fieldset>
</form>

Code I am trying with a base search:

<form>
  <label>Emulation run analysis</label>
  <search id="baseSearch">
    <query>index=validation_bigcore aa_data_source="core_emu_run_info"</query>
    <earliest>$TimeRangePkr.earliest$</earliest>
    <latest>$TimeRangePkr.latest$</latest>
  </search>
  <fieldset submitButton="false" autoRun="true">
    <input type="time" token="TimeRangePkr" searchWhenChanged="true">
      <label>Time Range</label>
      <default>
        <earliest>-7d@h</earliest>
        <latest>now</latest>
      </default>
    </input>
    <input type="dropdown" token="steppingToken" searchWhenChanged="true">
      <label>Stepping</label>
      <choice value="*">All</choice>
      <default>*</default>
      <fieldForLabel>stepping</fieldForLabel>
      <fieldForValue>stepping</fieldForValue>
      <search base="baseSearch">
        <query>| stats count by stepping</query>
      </search>
    </input>
  </fieldset>
</form>

I tried playing with earliest/latest in the base search, in the post-processing, and in both; none gave results, so it's probably not that. Any ideas what I am doing wrong? Thanks, Noam
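One common cause worth ruling out (a sketch, untested against this dashboard): a non-transforming base search only passes a limited set of fields to its post-process searches, so the stepping field may never reach | stats count by stepping. Explicitly passing it through with fields in the base query looks like:

```
<search id="baseSearch">
  <query>index=validation_bigcore aa_data_source="core_emu_run_info" | fields stepping</query>
  <earliest>$TimeRangePkr.earliest$</earliest>
  <latest>$TimeRangePkr.latest$</latest>
</search>
```

The alternative is to make the base search itself transforming (e.g. end it with a stats) and have the post-process searches refine that result set.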
Hi all, I have data as below. My requirement is to append/merge both the columns, and then for each year split the column into multiple columns, placing the details for one year adjacent to the previous one. The merging part is taken care of; I need a solution for the column-splitting part. Can someone please help with how to achieve this?

SPL:

|rex field=_raw "(?<Date>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2})"
| rex field=_raw "\w+:\s(?<Year>(\d+))\sQ"
| rex field=_raw "\d+\s(?<Quarter>(Q\d)):"
| rex field=_raw "\s+(?<Count>(\d+M))"
| table Year,Quarter,Count
| strcat Quarter " " Count Task
| fields - Quarter Count
| strcat Year " " Task Ask
| fields - Year Task

Below is the expected result:

2021          2022
Q4 2m         Q4 5m
Q3 1m         Q3 7m
Q2 2m         Q2 8m
Q1 0m         Q1 5m
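A sketch of the splitting step with chart, reusing the extractions from the question (untested against the real data): chart values(Count) over Quarter by Year produces one row per quarter and one column per year, which matches the expected layout, with the sort adjusted so Q4 comes first.

```
| rex field=_raw "\w+:\s(?<Year>\d+)\sQ"
| rex field=_raw "\d+\s(?<Quarter>Q\d):"
| rex field=_raw "\s+(?<Count>\d+M)"
| chart values(Count) OVER Quarter BY Year
| sort - Quarter
```

This replaces the strcat/fields juggling entirely: chart does the "one column per year" split itself.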
I am having an issue with the "Status" values as below (and in the screenshot); please find the JSON and search query below. Please advise! I appreciate your help!

                   EventDate (21/10/2022)        EventDate (20/10/2022)
Expected "Status"  DOCUMENT_ERROR:2              DOCUMENT_REQUEST_RECEIVED:2
Actual "Status"    DOCUMENT_REQUEST_RECEIVED:2   DOCUMENT_ERROR:2

| eval Status = mvzip('eventData{}.eventStatusCount{}.status', 'eventData{}.eventStatusCount{}.count', ":")
| table "eventData{}.eventDate", "eventData{}.ReceivedCount", "eventData{}.ProcessedCount", "eventData{}.MismatchCount", "Status"
| rename eventData{}.eventDate as "EventDate", eventData{}.ReceivedCount as "Total Event Received Count", eventData{}.ProcessedCount as "Total Event Processed Count", eventData{}.MismatchCount as "Total Event Mismatch Count"

"eventData": [
  {
    "eventDate": "2022-10-20",
    "eventKey": "event.request",
    "ProcessedCount": 0,
    "eventStatusCount": [],
    "ReceivedCount": 100,
    "MismatchCount": 100
  },
  {
    "eventDate": "2022-10-21",
    "eventKey": "event.request",
    "ProcessedCount": 2,
    "eventStatusCount": [
      { "status": "DOCUMENT_ERROR", "count": 2 },
      { "status": "DOCUMENT_REQUEST_RECEIVED", "count": 2 }
    ],
    "ReceivedCount": 1000,
    "MismatchCount": 998
  }
]
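A sketch of one way to keep each date's statuses paired correctly — expanding each element of the eventData array into its own row before zipping, instead of zipping the multivalue fields flattened across the whole array (field names follow the JSON above; untested against the real events):

```
<base search>
| spath path=eventData{} output=event
| mvexpand event
| spath input=event
| eval Status = mvzip('eventStatusCount{}.status', 'eventStatusCount{}.count', ":")
| table eventDate ReceivedCount ProcessedCount MismatchCount Status
```

With one row per eventDate, each row's Status multivalue comes only from that date's eventStatusCount array, so ordering and attribution can no longer bleed between dates.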
Good day, I am a brand-new Splunk user who recently downloaded the free trial Splunk license. I was using it in conjunction with a class from Udemy. Yesterday I did something that triggered a license warning. I have no idea what I did or how to get rid of the searches that are causing the warnings. Any help would be greatly appreciated.
Hi, sometimes when a base search is not handled properly, the dashboard page keeps loading. How do you handle that? Regards, Suman P.
I have read all the posts about "merging fields" and none of the options work for me. I have events where the same value can come in fields with different names; for example, one has the action in a field called "act" while in another the field is "actResult". I tried:

| eval Action = coalesce("act","actResult")
| eval Action = mvappend("act","actResult")

But both options generate a field with the literal strings "act" and "actResult" as its value, dropping the actual values. I also tried:

| rename act as Action actResult as Action

but it doesn't work. Any ideas?
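For what it's worth, the quoting is the likely culprit here: in eval, double quotes create string literals, so coalesce("act","actResult") returns the literal text "act" rather than the field's value — exactly the symptom described. Referencing the fields unquoted (or in single quotes, which eval treats as field references) behaves differently:

```
| eval Action = coalesce(act, actResult)
```

The rename attempt fails for a different reason: renaming two fields to the same target name does not merge their values, the later rename simply overwrites the earlier one.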
Hi SMEs, I'm seeking advice on how I can create a rule/correlation search to detect some known RHEL vulnerabilities (CVEs).
In my new dashboard, I use the KMeans algorithm twice. The clustering is different in each case; is there a way to fix the random seed used within the algorithm? I want to fix the random nature of the algorithm so that I get repeatable clustering. Thank you.