All Posts


If you'll forgive the late reply... I ran into your problem this morning and found a workaround (and wanted to answer in case someone else runs across this thread in the future, like I did). Either leave the "Earliest Offset" value blank or at its default, and then hard-code the time range you need into your search. For example, I needed to look back 1 month, so I added the following to my first line: earliest=-1mon That solved the issue for me.
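To make that concrete, a minimal sketch of a search with the hard-coded time range (the index and sourcetype are placeholders, not from the original thread):

index=my_index sourcetype=my_sourcetype earliest=-1mon
| stats count by host

Because earliest= is specified inside the search string itself, it takes precedence over the time range picker, so the search always looks back one month.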
Hi @Kashinath.Kumbharkar, Thanks for asking your question on the community. At this point, I think it would be best if you reached out to AppD Support or contacted your AppD Rep to talk more about your specific goals and outcomes for this project. Your AppD Admin should be able to help with this.
Charts show numeric values (y-axis) against a base (x-axis) - what you are asking for can't be done with charts.
Hi sainag, thanks for the response. No, we are not using scripted authentication. The authentication.conf pasted above is the complete config. I am also not seeing the log entry:

Unknown role 'ldap_user'

What I figured out: I changed the default reply URL to https://<instance>.westeurope.cloudapp.azure.com/saml/acs instead of https://<instance>.westeurope.cloudapp.azure.com/en-GB/account/login, and now that error is gone (maybe that was responsible for the evaluation of the attributes?). BUT now I get a different error:

10-14-2024 15:31:01.405 +0000 ERROR XmlParser [4858 webui] - func=xmlSecOpenSSLX509StoreVerify:file=x509vfy.c:line=342:obj=x509-store:subj=unknown:error=71:certificate verification failed:X509_verify_cert: subject=/CN=SSO-Certificate; issuer=/CN=SSO-Certificate; err=20; msg=unable to get local issuer certificate
10-14-2024 15:31:01.405 +0000 ERROR XmlParser [4858 webui] - func=xmlSecOpenSSLX509StoreVerify:file=x509vfy.c:line=381:obj=x509-store:subj=unknown:error=71:certificate verification failed:subject=/CN=SSO-Certificate; issuer=/CN=SSO-Certificate; err=20; msg=unable to get local issuer certificate
10-14-2024 15:31:01.405 +0000 ERROR XmlParser [4858 webui] - func=xmlSecOpenSSLKeyDataX509VerifyAndExtractKey:file=x509.c:line=1505:obj=x509:subj=unknown:error=72:certificate is not found:details=NULL
10-14-2024 15:31:01.405 +0000 ERROR XmlParser [4858 webui] - func=xmlSecOpenSSLKeyDataX509XmlRead:file=x509.c:line=654:obj=x509:subj=xmlSecOpenSSLKeyDataX509VerifyAndExtractKey:error=1:xmlsec library function failed:
10-14-2024 15:31:01.405 +0000 ERROR XmlParser [4858 webui] - func=xmlSecKeyInfoNodeRead:file=keyinfo.c:line=114:obj=x509:subj=xmlSecKeyDataXmlRead:error=1:xmlsec library function failed:node=X509Data
10-14-2024 15:31:01.405 +0000 ERROR XmlParser [4858 webui] - func=xmlSecKeysMngrGetKey:file=keys.c:line=1227:obj=unknown:subj=xmlSecKeyInfoNodeRead:error=1:xmlsec library function failed:node=KeyInfo
10-14-2024 15:31:01.405 +0000 ERROR XmlParser [4858 webui] - func=xmlSecDSigCtxProcessKeyInfoNode:file=xmldsig.c:line=790:obj=unknown:subj=unknown:error=45:key is not found:details=NULL
10-14-2024 15:31:01.405 +0000 ERROR XmlParser [4858 webui] - func=xmlSecDSigCtxProcessSignatureNode:file=xmldsig.c:line=503:obj=unknown:subj=xmlSecDSigCtxProcessKeyInfoNode:error=1:xmlsec library function failed:
10-14-2024 15:31:01.405 +0000 ERROR XmlParser [4858 webui] - func=xmlSecDSigCtxVerify:file=xmldsig.c:line=341:obj=unknown:subj=xmlSecDSigCtxSignatureProcessNode:error=1:xmlsec library function failed:
10-14-2024 15:31:01.405 +0000 ERROR Saml [4858 webui] - Error: failed to verify signature with cert :/opt/splunk/etc/auth/idpCerts/idpCert.pem;
10-14-2024 15:31:01.405 +0000 ERROR Saml [4858 webui] - Unable to verify Saml document
10-14-2024 15:31:01.405 +0000 ERROR UiSAML [4858 webui] - Verification of SAML assertion using the IDP's certificate provided failed. Error: failed to verify signature with cert

Are these errors somehow related? Any ideas how to fix that? How can I turn on debug for SAML?
Hello, I would like to create a chart with multiple fields on the Y axis and time on the X axis.

Y axis - FIELD_01, FIELD_02, FIELD_03, FIELD_04, FIELD_05, FIELD_06 (the field values are a mix of strings and numbers)
X axis - _time

Let's say FIELD_01 contains the values Stopped, Started, Stopped, Stopped; on the Y axis it should show how the value changes, with some colours. Roughly:

FIELD_06     Field values
FIELD_05     Field values
FIELD_04     Field values
FIELD_03     Field values
FIELD_02     Field values
FIELD_01     Field values
Y axis / X axis                                         _time

Thanks in advance!
Can you explain more? Which file should be edited? Please send the path and file name.
My Splunk installation can't read files from a Windows host in one specific folder on the C: drive. Logs are collected from another folder without problems. There are no errors in the _internal index, the stanza in inputs.conf looks standard, and the monitor path for the folder is specified correctly. The permissions on the folder and files are the system ones, the same as on the other files that we can collect. What could be the problem?
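For comparison, a minimal monitor stanza for a Windows folder would look something like this (the path, index and sourcetype below are made up, not taken from the post):

[monitor://C:\MyApp\Logs]
disabled = 0
index = my_windows_index
sourcetype = my_app_logs

If a stanza like this works for one folder but not another with the same layout, the usual suspects are the permissions of the account the forwarder runs as, or a blacklist / ignoreOlderThan setting elsewhere in inputs.conf.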
Hello @BRFZ, when was the last reboot on this search head? It looks like it's hung up. I encourage you to reach out to Support if this does not get resolved.
Hello @msteffl. Are you using scripted authentication? Do you also see any warnings like this in splunkd.log?

WARN AuthorizationManager [34567 TcpChannelThread] - Unknown role 'ldap_user'

If you also see the "unknown role" message, it might be that the AD group to Splunk role mapping is failing because Splunk can't find a role definition for "ldap_user". Take a look at authorize.conf.

To troubleshoot this you will need to turn on debug for SAML on the SH and get the user to try to log in again. Once they have done that, you can run the following to see if any roles are being returned for the user:

index=_internal sourcetype=splunkd samlp:response

Docs:
https://docs.splunk.com/Documentation/Splunk/9.2.0/Security/ConfigureSSOinSplunkWeb
https://docs.splunk.com/Documentation/Splunk/9.3.1/Security/Mapgroupstoroles
https://docs.splunk.com/Documentation/SplunkCloud/latest/Security/ConfigureauthextensionsforSAMLtokens#Configure_authentication_extensions_for_Microsoft_Azure_using_Splunk_Web

Hope this helps.
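As a rough illustration of the mapping side, the stanzas involved usually look something like this (the group and role names here are made up, check your own config):

# authentication.conf - maps a Splunk role to the group name sent by the IdP
[roleMap_SAML]
ldap_user = My-IdP-Group-Name

# authorize.conf - the role itself has to exist, otherwise you get the "Unknown role" warning
[role_ldap_user]
importRoles = user

And, assuming CLI access to the search head, the SAML log channels can usually be switched to debug with something like:

splunk set log-level Saml -level DEBUG
splunk set log-level AuthenticationManagerSAML -level DEBUG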
Hi, I encountered an issue where my indexer disconnected from the search head (SH), and similarly, the SH and indexer1 disconnected from the deployment server and license master. I keep receiving the following error message:

Error [00000010] Instance name "A.A.A.A:PORT" Search head's authentication credentials rejected by peer. Try re-adding the peer. Last Connect Time: 2024-10-14T16:23:23.000+02:00; Failed 5 out of 5 times.

I've tried re-adding the peer but the issue persists. Does anyone have suggestions on how to resolve this? Thanks in advance!
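For anyone following along, re-adding a search peer from the search head's CLI usually looks something like this (host, port and credentials are placeholders):

splunk remove search-server -auth admin:changeme A.A.A.A:8089
splunk add search-server https://A.A.A.A:8089 -auth admin:changeme -remoteUsername admin -remotePassword remotepassword

The "authentication credentials rejected by peer" message often points at a trust problem between the search head and the peer (for example after one side was rebuilt), so if a plain re-add doesn't stick, it may be worth restarting both instances after the re-add.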
Edit the Splunk systemd service unit file and edit/add the following line under [Service]:

AmbientCapabilities=CAP_DAC_READ_SEARCH CAP_NET_ADMIN CAP_NET_RAW
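For reference, assuming the default unit name of Splunkd.service (yours may differ), the edited section would end up looking roughly like this, followed by a daemon reload and a restart:

[Service]
# existing ExecStart and other settings stay as they are
AmbientCapabilities=CAP_DAC_READ_SEARCH CAP_NET_ADMIN CAP_NET_RAW

systemctl daemon-reload
systemctl restart Splunkd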
Well, if your Splunk needs 13 seconds to scan just 14 thousand events... that looks weird. But if you have big events (like 100K-sized JSONs), considering the wildcard at the beginning of your search term, the initial search might indeed be slow. So that's the first and probably the most important optimization you can do - if you can drop the wildcard at the beginning of *1000383334*, it will save you a lot of time. Notice that Splunk had to scan over 14k events just to match two of them. That's because it can't use indexed terms; it has to scan every single raw event.

Since you're extracting Odernumber (and rely on it being non-empty by including it in the BY clause for stats) using [EXT] as an anchor for your regex, the [EXT] part must obviously be in your event. So if it's only in part of the events, you can use it as an additional search term (square brackets are major breakers, so you can just add EXT to your search terms).

The inputlookup and join @ITWhisperer already covered. Dedup should _not_ be using many resources - as you can see from your job inspector table, it gets just 10 results on input and returns 2, which is not a huge amount. The main problem here is the initial search. Also, if your events are big, you can drop _raw early on so you don't drag it along with you through the pipeline (you only use a few fields in your stats anyway).
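To make that concrete, a rough sketch of the reshaped initial search (the index, sourcetype and rex pattern are placeholders; only the 1000383334* term, the EXT anchor and the Odernumber field come from this thread):

index=my_index sourcetype=my_sourcetype 1000383334* EXT
| rex "\[EXT\]\s*(?<Odernumber>\S+)"
| fields - _raw
| stats count by Odernumber

Dropping the leading wildcard lets the indexed-term lookup kick in for the prefix, adding EXT narrows the candidate events before any regex runs, and removing _raw early keeps the rest of the pipeline light.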
We have no idea what data you uploaded or how. I assume you used the web UI and went through the "Add Data" dialog, but we have no idea what sourcetype(s) you used, whether you had proper timestamp recognition, and so on. We also have no knowledge of how you are searching for that data. So the only answer we can give you is "search for your data properly". But seriously - you're giving us the equivalent of "I bought a computer, I did something with it, and now it doesn't do what I want".
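If the goal is just to confirm that all five uploads actually made it into the index, a quick sanity check is something like this (the index name is a placeholder):

index=my_csv_index earliest=0
| stats count by source

That should return one row per uploaded CSV. If all five sources show up there, the "only the last file is visible" symptom usually comes down to the time range being searched and the timestamps assigned to the events at upload time, not to the data being missing.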
This worked for me. I had a Data Durability / Data Searchable alert after the upgrade to 9.3.0 on the cluster master. Thanks!
Hello @tchimento_splun @rjteh_splunk, it looks like this bug is still happening in 9.3.0.
I have created an index to store my data in Splunk. The data consists of 5 CSV files uploaded one by one into the index. Now, if I try to show the data inside the index, it only shows the latest data (the CSV file that was uploaded last). We can show the data of the other files by querying with their specific source names, but by default we cannot see all of the data; we only see the data of the last file. To overcome this we have used joins to combine all the tables and show them through the query in one report. I wanted to find out if there is a better way to do this. I have to show this data in Power BI, and for that I need a complete report of the data.
| timechart sum(count) as total span=1h
| timewrap 1w
| where strftime(_time,"%a") = strftime(now(),"%a")
| eval hour=strftime(_time,"%H")
| fields - _time
| table hour *
It sounds like it is working, just not with the results you expect? Search has an implied AND, so perhaps you need an explicit OR?

| search node="$form.tokenNode$" OR outcome="$form.tokenSwitch$"
Hi,

hour 0 for count1 is the total of all the counts for 00:00 to 00:59 for the current day (Monday) in the current week.
hour 0 for count2 is the total of all the counts for 00:00 to 00:59 for the current day (Monday) in the previous week.
hour 0 for count3 is the total of all the counts for 00:00 to 00:59 for the current day (Monday) in the current week - 2.

So, on the X axis we have hours 0-24 for the current day, and on the Y axis we have 3 lines:

count1: count for a particular hour of the day in the current week
count2: count for a particular hour of the day in the previous week
count3: count for a particular hour of the day in the current week - 2

The plan is to compare, when the current day is Monday:
the count of the 8th hour of Monday with last week's Monday and the Monday of the week before that,
the count of the 9th hour of Monday with last week's Monday and the Monday of the week before that,
the count of the 10th hour of Monday with last week's Monday and the Monday of the week before that,
and so on up to the 24th hour.

We have fields like Current_day (for example Monday, Tuesday) and Current_Week (for example 41 or 40) extracted in the query.
So, just to be clear, count1 is the sum of the hourly counts for the current week, e.g. hour 0 for count1 is the total of all the counts for 00:00 to 00:59 for all the days (so far) in the current week, hour 0 for count2 is the total of all the counts for 00:00 to 00:59 for all the days in the previous week, etc.?