All Posts


Hi @akgmail , this seems to be a different question, even if on a similar topic. Anyway, to do calculations between dates, you always have to transform them into epoch time (when they aren't already in this format); then you have numbers that you can use for all your operations. If you don't like the format of the duration, you can create your own function to display a duration in the format you like using mathematical operations, so if you want to have the duration in hours, you have to divide the diff value (which is in seconds) by 3600:

| eval diff_in_hours=round((now()-_time)/3600,2)

Then you don't need to rename now() and _time.

Ciao.
Giuseppe
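For example, a minimal sketch of such a custom display format, built from the epoch difference (an illustration only, assuming days/hours/minutes output is wanted; adapt the field names to your data):

| eval diff=now()-_time
| eval days=floor(diff/86400), hours=floor((diff%86400)/3600), mins=floor((diff%3600)/60)
| eval duration_pretty=days."d ".hours."h ".mins."m"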
My application is a backend web service. All events in a request contain the same value for a "req_id" field. I have a use-case where I want to look at all the events that occurred for requests, only when a particular log line is present. My first query would be this:

index="myapp" AND "some log line" | rex field=_raw "req_id=(?<req_id>[a-zA-Z0-9]*)"

And then my second query would be:

index="myapp" AND "$req_id" | transaction req_id

where the $req_id would be fed in by the first query. How do I join these two queries? Thanks in advance!
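One possible approach is a subsearch — a minimal sketch, assuming Splunk's automatic key=value extraction picks up req_id at search time (since the raw events contain req_id=<value>):

index="myapp"
    [ search index="myapp" "some log line"
      | rex field=_raw "req_id=(?<req_id>[a-zA-Z0-9]*)"
      | dedup req_id
      | fields req_id
      | format ]
| transaction req_id

The subsearch runs first and its req_id values are expanded into an OR'd filter for the outer search.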
@inventsekar We have onboarded the data coming from CSV files via inputs.conf as below, and the data is loaded every day as a new CSV is created with a date stamp.

[monitor:<path>/file_*.csv]
disabled = false
sourcetype = <sourcetype>
index = <index>

With this config, we are getting the data into Splunk and each row in the CSV is loaded as a separate event.

Query: index=<index> sourcetype=<sourcetype>

All we need is to see the data similar to the CSV, i.e. we need to have a single-line header and the corresponding data for those columns (just how we see it when we load the CSV from Add inputs via the Splunk GUI).

As of now, events look like the below, and this repeats every day for new CSV files:

6/17/24 3:07:26.000 AM   col1,col2,col3,col4,col5,col6   host = <host> source = <source> sourcetype = <sourcetype>
6/17/24 3:07:26.000 AM   data1,data2,data3,data4,data5,data6   host = <host> source = <source> sourcetype = <sourcetype>

We need the output in the below format when we run the query:

col1 col2 col3 col4 col5 col6
data1 data2 data3 data4 data5 data6

Regards, Sid
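A possible fix, sketched under the assumption that the files can be re-monitored: have Splunk parse the CSV header at ingest time with INDEXED_EXTRACTIONS in props.conf on the instance that monitors the files (the stanza name below reuses the <sourcetype> placeholder from the inputs.conf above; note that already-indexed events are not reparsed):

[<sourcetype>]
INDEXED_EXTRACTIONS = csv

Each row then becomes an event with fields named after the header columns, so a query like

index=<index> sourcetype=<sourcetype> | table col1 col2 col3 col4 col5 col6

renders the data in the desired tabular form.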
@gcusello Thanks for your response, this helps. I am getting diff in a string format, for example:

00:01:12 --> this says 1 hour and 12 mins
30+03:46:11 --> this says 30 days, 3 hours and 46 mins

I want to convert this diff to a number of hours and compare it with a threshold (a numeric value like 24). When I try this it is not giving me the correct value. I understand this is due to the fact that "diff" is in string format. Shall I first take the times in epoch, find the diff, and then convert it using the strftime function? Please assist me on the same.

The query I am trying:

| eval currentEventTime=strftime(_time,"%Y-%m-%d %H:%M:%S"),
    currentTimeintheServer=strftime(now(),"%Y-%m-%d %H:%M:%S"),
    test_now=now(),
    test_time=_time,
    diff_of_epochtime=(now()-_time),
    diff=strftime(diff_of_epochtime,"%Y-%m-%d %H:%M:%S"),
    difforg=tostring(round(diff), "duration")
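A minimal sketch of that approach — keep the difference in epoch seconds, convert it to hours, and compare against the numeric threshold (24 here is an assumed example value):

| eval diff_in_hours=round((now()-_time)/3600,2)
| eval threshold_breached=if(diff_in_hours>24,"yes","no")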
my SAML Response to Splunk.   <?xml version="1.0" encoding="UTF-8" standalone="no"?><samlp:Response xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol" Destination="http://RTNB336:8000/saml/acs" ID="_4c16f9be1c813c774f2f9111fd5602f6" InResponseTo="RTNB336.21.0882C4AC-681F-4648-AD0F-FDD9F4BE114B" IssueInstant="2024-06-20T01:56:14.199Z" Version="2.0"><saml2:Issuer xmlns:saml2="urn:oasis:names:tc:SAML:2.0:assertion">http://hive.dreamidaas.com</saml2:Issuer><ds:Signature xmlns:ds="http://www.w3.org/2000/09/xmldsig#"><ds:SignedInfo><ds:CanonicalizationMethod Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/><ds:SignatureMethod Algorithm="http://www.w3.org/2001/04/xmldsig-more#rsa-sha256"/><ds:Reference URI="#_4c16f9be1c813c774f2f9111fd5602f6"><ds:Transforms><ds:Transform Algorithm="http://www.w3.org/2000/09/xmldsig#enveloped-signature"/><ds:Transform Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/></ds:Transforms><ds:DigestMethod Algorithm="http://www.w3.org/2001/04/xmlenc#sha256"/><ds:DigestValue>Wjlp0IBLeluYep7QMphL/ZBkVsDqxbrFcgSDFiFxQBo=</ds:DigestValue></ds:Reference></ds:SignedInfo><ds:SignatureValue>Y0Lp7OR2BWIie+F60hJUhNdOLKhWlXnjLyD0Y7Ut1lPIYfL9uoClcQA98Ge961M7FjrC/uDA8yxGYKvApU4VOYzy7kLM0wbxFKUVXAuPAl5of0WWrMV8QMSWfCq8/ensPzlzsqg84tf86UgMZ2PodD6WOM9SIIW+izBPOP3emuv2c+UrvR2eyp1s+ItWn0AUB+0R0l+iqd+sNE/Gb+l9THlJYm68yLr2DY0nT66dOLKS3Q3jnMox6xrzsSnwaF6+H+dSnvd5YeBIMyjTC1bF6GjQpdudTNz8162TvtJjvAcTUOwhUmLyY4ytTvL+lHKOsDh57wZenvB4gVYzoF6T+A==</ds:SignatureValue><ds:KeyInfo><ds:X509Data><ds:X509Certificate>MIIDtDCCApygAwIBAgIKJxHdhEoMRRD/JjANBgkqhkiG9w0BAQsFADBCMQswCQYDVQQGEwJLUjEW MBQGA1UECgwNRHJlYW1TZWN1cml0eTEMMAoGA1UECwwDU1NPMQ0wCwYDVQQDDARST09UMB4XDTIy MDIyMzIzNTY1NFoXDTMyMDIyMzIzNTY1NFowTzELMAkGA1UEBhMCS1IxFjAUBgNVBAoMDURyZWFt U2VjdXJpdHkxDDAKBgNVBAsMA1NTTzEaMBgGA1UEAwwRTUFHSUNfU1NPX0lEUF9TaWcwggEiMA0G CSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCGEA5RIOlCH/xJX5qnAQRixJfuUhv2dBoGyCjO1qbJ GuJh6lCF7mwsJbS+PStFrFvXBrfFt8S2QU7hndK5aj3f83IJiiv6y+a26/4xNf19sp6AtafAmWr9 kkI5AH51/9l8ypzf67OAUfrJxFPW6ZKgWiGp5yjrensl1IKxwP0joxUQXISI+epu07XpdWF2SJQ7 rVRNPZUP6sA+lNQsFDznN7moWFcU+UyrTJHDkgj/2qw4QvucNBY7Hj/bC/6KX1d0XSKfvQCfI4gu Gd/4FL1ApnyTvZ/tnbcbl420NWbKgtn19Q4ZIqhj10ruTzVn1YOpwqBGP/NlKDVmKOCem7tvAgMB AAGjgZ4wgZswagYDVR0jBGMwYYAULffLTJtBlWrpR2I1Coc4OG3funyhRqREMEIxCzAJBgNVBAYT AktSMRYwFAYDVQQKDA1EcmVhbVNlY3VyaXR5MQwwCgYDVQQLDANTU08xDTALBgNVBAMMBFJPT1SC AQEwHQYDVR0OBBYEFPvKSaxuZMLnM8ZqaFFkw0xeDp8CMA4GA1UdDwEB/wQEAwIGwDANBgkqhkiG 9w0BAQsFAAOCAQEAflCL2e6ZHxGDtK6PSjrGrhrkJ4XYHvKGnEWWajhL0RqqDoQhEAOAzEyGMcpQ zWBF6etI+uDlnr7EfPCAojvwfcQCEGK4gCHskHHDkXNz5MAC2sSHqVEn/ChAO9nRnTRo4EZlFVgH SXIDJqeWZd2wJ86u9cqA6XTyB/KuVwnTD2U/1W87ERpKlXtDNnC5hB3cp1ONaW+0+Fnn4NdSgMQd SwteL/CtU+q/gcYt1izy1RGdcDRR11+nmfkZT6UYCyKj0ea0yc4SbRjGIEOgJExDJBL8eyc4X2D3 4k6B4rhPzx+vF1OB1esHB69T6Vlo+iUM+XtoLFUOhloNiDzXq+2Hgg==</ds:X509Certificate></ds:X509Data></ds:KeyInfo></ds:Signature><saml2p:Status xmlns:saml2p="urn:oasis:names:tc:SAML:2.0:protocol"><saml2p:StatusCode Value="urn:oasis:names:tc:SAML:2.0:status:Success"/></saml2p:Status><saml2:Assertion xmlns:saml2="urn:oasis:names:tc:SAML:2.0:assertion" ID="_93ae10442348482eb51b04051c58267a" IssueInstant="2024-06-20T01:56:14.199Z" Version="2.0"><saml2:Issuer>http://hive.dreamidaas.com</saml2:Issuer><saml2:Subject><saml2:NameID Format="urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress" NameQualifier="http://hive.dreamidaas.com" SPNameQualifier="RTNB336">rladnrud@devdreamsso.site</saml2:NameID><saml2:SubjectConfirmation 
Method="urn:oasis:names:tc:SAML:2.0:cm:bearer"><saml2:SubjectConfirmationData InResponseTo="RTNB336.21.0882C4AC-681F-4648-AD0F-FDD9F4BE114B" NotOnOrAfter="2024-06-20T02:01:14.199Z" Recipient="http://RTNB336:8000/saml/acs"/></saml2:SubjectConfirmation></saml2:Subject><saml2:Conditions NotBefore="2024-06-20T01:56:14.199Z" NotOnOrAfter="2024-06-20T02:01:14.199Z"><saml2:AudienceRestriction><saml2:Audience>RTNB336</saml2:Audience></saml2:AudienceRestriction></saml2:Conditions><saml2:AuthnStatement AuthnInstant="2024-06-20T01:55:52.000Z" SessionIndex="_8028c81d727dcc5a423afa58c645b8c5"><saml2:AuthnContext><saml2:AuthnContextClassRef>urn:oasis:names:tc:SAML:2.0:ac:classes:InternetProtocol</saml2:AuthnContextClassRef></saml2:AuthnContext></saml2:AuthnStatement></saml2:Assertion></samlp:Response>   There's no problem in my IDP. I don't know why Splunk can't verify signature properly
It says, "If you save your IdP certificate under $SPLUNK_HOME/etc/auth/idpCerts, please leave it blank." If you don't type idpCert.pem, it won't save it. I don't think that's the cause, but what more... See more...
It says, "If you save your IdP certificate under $SPLUNK_HOME/etc/auth/idpCerts, please leave it blank." If you don't type idpCert.pem, it won't save it. I don't think that's the cause, but what more information do I need to provide to get this error clearer? How should I solve this problem? I looked for the same error information and found that I need to register IdP certificate chains, but that's not a requirement, is it? Splunk-9.2.1 Windows11 Our company IDP
Hi @bowesmana  It's working now after changing "dashboard" to "form" and using "submittedTokenModel" in SplunkJS. Thank you for your response!
@tkopchak I cannot disable SSL in global settings because it's grayed out. Do you have anything else I can try?
I would generally recommend setting the token on the submitted token model as well as the default one, i.e.

var submittedTokenModel = mvc.Components.getInstance('submitted');

and

submittedTokenModel.set('clickedButtonValue', value);

I'm also not entirely sure how the <dashboard> or <form> structure of a dashboard changes how tokens are managed, because the token models affect how the tokens are used when clicking submit buttons in a dashboard, and in that case the dashboard will always be a <form> dashboard. So, first change the dashboard to <form> and then try the changed JS - hopefully one will make the difference.
What is the x-axis you need? You have 3 fields output in your search

| table StatisticalId Value Unit

and there is a lot of mvexpand logic going on... it seems like you are going to multiply your data significantly, as there's no correlation between each of the MV values you are expanding. That aside, the basic command to create the chart would be something like

| chart max(Value) over Unit by StatisticalId

which would put Unit on the x-axis. Swap Unit and StatisticalId to make StatisticalId the x-axis.
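A runnable sketch with generated sample data (the field names follow the thread; the values are made up) showing how chart arranges the axes:

| makeresults count=6
| streamstats count as n
| eval Unit=mvindex(split("A,B,C",","), (n-1)%3)
| eval StatisticalId="stat_".((n-1)%2+1)
| eval Value=n*10
| chart max(Value) over Unit by StatisticalId

Unit becomes the x-axis and each StatisticalId becomes a series.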
Thank you, @akouki_splunk! That's it. The on-prem Splunk instance uses SAML authentication so I get automatically assigned both "admin" and "user" roles from my group memberships. The "user" role was in the "blacklisted_roles" list, which caused the error. Thank you for the quick response!  
The CSV is not structured as a lookup table. The structure should be that, given a value for CPU1 (e.g. "process_a"), what are the (first matching) values for CPU2 ("process_b") and CPU3 ("process_c"). What you seem to be looking for is: given a value for some CPU (e.g. "process_a"), to what CPU category does it belong ("CPU1"). Are you able to restructure the test.csv to be more like:

Process     CPU Class
process_a   CPU1
process_b   CPU2
process_c   CPU3
process_d   CPU1
process_e   CPU2
process_f   CPU3
process_g   CPU1
process_h   CPU2
process_i   CPU3

If you can't restructure that file, something like this would work:

| makeresults
| eval CPU=mvappend("process_a","process_a","process_b","process_a","process_c","process_a","process_b","process_d","process_a","process_e","process_a","process_b","process_c","process_a","process_a","process_b","process_d","process_a","process_c","process_a","process_b","process_e","process_a")
| mvexpand CPU
``` The above is to generate sample data and can be ignored in your SPL ```
``` uncomment the line below and notice the change from CPU1 to CPU ```
```index=custom | eval SEP=split(_raw,"|"), CPU=trim(mvindex(SEP,1))```
``` These two lines create aliases to map in the CPU group for each class in turn ```
| eval myCPU1=CPU
| eval myCPU2=CPU
``` These next lines assume that a process will only appear once in the test.csv file. ```
``` If that is the case, then CPU2 and CPU3 will be non-null when CPU1 matches, ```
``` otherwise that process does not belong to CPU1 (and ditto for the CPU2 case.) ```
| lookup community CPU1 as myCPU1
| eval myCPU1=if(isnotnull(CPU2),CPU,null())
| lookup community CPU2 as myCPU2
| eval myCPU2=if(isnotnull(CPU1),CPU,null())
``` Now create your stats on the two CPU classes. ```
| bin _time span=1m
| stats count(myCPU1) as CPU1_COUNT count(myCPU2) as CPU2_COUNT by _time
Hi @ww9rivers , I'm @akouki_splunk , the developer of the Content Manager App. It seems you are having an issue with the blacklisted roles or users. Do you have access to the app configuration files? If so, please open the etc/apps/appcontentmanager/default/acms_settings.conf file and clear the blacklisted_roles and blacklisted_users attributes. The file content should look like this after the modification:

[settings]
blacklisted_apps = alert_logevent,alert_webhook,appsbrowser,introspection_generator_addon,launcher,learned,legacy,logd_input,python_upgrade_readiness_app,sample_app,splunk_assist,splunk_gdi,splunk_httpinput,splunk_ingest_actions,splunk_instrumentation,splunk_internal_metrics,splunk_metrics_workspace,splunk_monitoring_console,splunk_secure_gateway,SplunkForwarder,SplunkLightForwarder,splunk-dashboard-studio
blacklisted_conffiles = server,limits,app,passwords
blacklisted_stanzas =
blacklisted_roles =
blacklisted_users =
theme = light
is_configured = 0
default_owner = nobody
Did you want cids to contain that GUID? Try

| rex field=log ".*customers\s(?<cids>.*)"

Alternatively, if the GUID is always at the end, following a space, you can even drop the "customers" part:

| rex field=log "(?<cids>\S+$)"

Your example appears to be creating a capture group named "cids" that captures nothing (the first set of parentheses), and then a second, unnamed group that matches what you want (the second set of parentheses). This document might help explain in more detail: https://docs.splunk.com/Documentation/SCS/current/Search/AboutSplunkregularexpressions#Capture_groups_in_regular_expressions
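A self-contained check of the first rex against the sample line from the question (makeresults is only there to fabricate the event):

| makeresults
| eval log="Reminder Message processed, no linked customers aaf60d69-99a9-41f5-a081-032224284066"
| rex field=log ".*customers\s(?<cids>.*)"
| table cids

cids comes out as the GUID.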
I want to extract a GUID string from the log right after "customers". This regex expression works in https://regex101.com/ but not in Splunk. My field name is log:

2023-06-19 15:28:01.726 ERROR [communication-service,6e72370er2368b08,6e723709fd368b08] [,,,] 1 --- [container-0-C-1] c.w.r.acc.commservice.sink.ReminderSink : Reminder Message processed, no linked customers aaf60d69-99a9-41f5-a081-032224284066

| rex field=log "(?<cids>).*customers\s(.*)"
Before you do your eval statement, test that your extraction works. In your query, use a rex statement to test this:

... | rex field=<your_field> "\"path\"\:\"auth\/(abc|xyz)\/login\/(?<User>[\w\_]+)" ...

Then, once you confirm you are extracting your User field values, add the eval statement to the query. Once you confirm that works, you can go back to your sourcetype and modify your extract and eval lines.

---

If this reply helps you, Karma would be appreciated.
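For instance, a quick self-check with a fabricated event (the path value here is an assumption about the data shape):

| makeresults
| eval data="{\"path\":\"auth/abc/login/user_123\"}"
| rex field=data "\"path\"\:\"auth\/(abc|xyz)\/login\/(?<User>[\w\_]+)"
| table User

If User shows user_123, the extraction works and you can layer the eval on top.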
Hello, I am trying to change the email address of my Splunk community account. I went to My settings > Personal > Email and set the new email address. I got the verification email and verified the new email address. The new email address was then displayed under My settings. However, when I logged out and logged back in, the old email address was shown again. Is this a known issue?
I believe that your scenario could be accomplished with Ingest Actions: https://docs.splunk.com/Documentation/Splunk/9.2.1/Data/DataIngest This should support cloning data and applying different filtering rules and routing to the two streams.
Thank you! Just like that it works, and in only 1 line:
| eval fruit=mvappend(fruit1,if(fruit2!="NULL",fruit2,null())) | stats count by fruit
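A quick way to see it behave, with fabricated single-value fields (the field names match the thread; the values are made up):

| makeresults
| eval fruit1="apple", fruit2="NULL"
| eval fruit=mvappend(fruit1,if(fruit2!="NULL",fruit2,null()))
| stats count by fruit

mvappend skips the null() argument, so "NULL" placeholder values never make it into the combined field.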