All Topics

https://research.splunk.com/cloud/c783dd98-c703-4252-9e8a-f19d9f5c949e/ — when I run a search with Operation!="Disable Strong Authentication." I get the details of MFA-enabled users. But when the query below is executed, I get no output. Can someone help by sharing some docs?

`o365_management_activity` Operation="Disable Strong Authentication." | stats count earliest(_time) as firstTime latest(_time) as lastTime by UserType Operation UserId ResultStatus object | rename UserType AS user_type, Operation AS action, UserId AS src_user, object AS user, ResultStatus AS result | `security_content_ctime(firstTime)` | `security_content_ctime(lastTime)` | `o365_disable_mfa_filter`
Hi, I have created a tag called "a" for the field "counter". But when I run a search with tag=a, or with tag::counter="a", there are no results. What is the problem, please?
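For context, tags in Splunk are defined per field::value pair, not per field. A minimal sketch of what the definition looks like in tags.conf (the value "10" is purely illustrative — replace it with the counter value that was actually tagged):

```
# tags.conf — a tag is attached to a specific field::value pair
[counter=10]
a = enabled
```

Because the tag is bound to one specific value of counter, tag=a only matches events that contain that exact field/value pair, which can explain empty results when searching over other events.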
Could I create my own certificate for the SAML configuration if the IdP certificate setup isn't working as expected? If so, how can I do this?
We are rolling out a customer service chatbot. Has anyone needed to collect data such as input/output and logs between a chatbot and OpenAI to monitor it in Splunk? If so, what did you use for GDI? One other note: there is the possibility of customers sharing images or video with the chatbot; I'm wondering if anyone has tried to collect this type of data in Splunk.
I have a search that produces a trellis of the sum of various error codes from multiple machines. I would like to enhance the charts with a short text description. I could append the text to the code value, create a new field named "codetext", and split on that, but then I can't use the drilldown feature. Is there another way to add some text to the individual graphs?
Hi Team, our DB alerts for server sitpdb0033 are currently being assigned to the Windows support team; they need to be assigned to the SQL team instead. How do we change the assignment group from the Windows support team to the SQL team? In index=mssql there are 30+ hosts configured, and we want to change the group only for this one server, sitpdb0033. We are using this SPL query:

index=mssql sourcetype="mssql:database" OR sourcetype="mssql:databases" state_desc!="ONLINE" | eval assignment_group = case(like(source,"%mssql_mfg%"),"Windows_Support - Operations", 1=1, "Sql_Production Support")

Can you please help with this requirement? Thank you, Nandan
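One hedged way to carve out a single server is to test for it before the existing conditions, since case() returns the first match. This sketch assumes the server name arrives in the host field (adjust if it lives in a different field in your events):

```
index=mssql sourcetype="mssql:database" OR sourcetype="mssql:databases" state_desc!="ONLINE"
| eval assignment_group = case(
    host=="sitpdb0033", "Sql_Production Support",
    like(source, "%mssql_mfg%"), "Windows_Support - Operations",
    1=1, "Sql_Production Support")
```

Because case() short-circuits on the first true condition, sitpdb0033 is routed to the SQL group before the mssql_mfg rule can claim it, and all other hosts keep the original logic.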
I have a list of comma-separated names (lastname, firstname) that I need to reverse, so "Smith, Suzy" becomes "Suzy Smith". What's the easiest way to do this?
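A minimal SPL sketch using replace() with regex backreferences — the field name "name" here is illustrative, substitute your actual field:

```
| eval name=replace(name, "^\s*([^,]+),\s*(.+)$", "\2 \1")
```

The first capture group takes everything before the comma (the last name), the second takes everything after it (the first name), and the replacement swaps them, so "Smith, Suzy" becomes "Suzy Smith".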
Logs are ingesting intermittently, and we could not find the path referenced. Our universal forwarder is on a Windows server and the heavy forwarder is a *nix server. How do we generate diag files with debug enabled for the UF and the HF? Can you please provide a detailed explanation with commands?
I can run the command below in a search successfully:

| eval message=replace(Message, "^Installation Successful: Windows successfully installed the following update: ", "")

How can I convert this to work in a data model? Below is a sample result from my base search:

Message=Installation Successful: Windows successfully installed the following update: Security Intelligence Update for Microsoft Defender Antivirus - KB2267602 (Version 1.405.28.0) - Current Channel (Broad)

In my data model I would like to use an eval expression on the field Message to strip off "Installation Successful: Windows successfully installed the following update:". Desired result:

Message=Security Intelligence Update for Microsoft Defender Antivirus - KB2267602 (Version 1.405.28.0) - Current Channel (Broad)
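In the data model editor this kind of transformation is typically added as an "Eval Expression" calculated field rather than a full search pipeline: you supply only the right-hand side of the eval (no leading "| eval field=") and give the output field a name in the dialog. A sketch of the expression itself:

```
replace(Message, "^Installation Successful: Windows successfully installed the following update: ", "")
```

Assuming the calculated field is named something like message, it then becomes available to any search or pivot built on that data model, alongside the untouched original Message field.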
Hello, I have a working dashboard with various fields that can be defined (field1 and field2 in the example), and some events have a field that is an extracted JSON object. I have successfully accessed various elements within the JSON object, but what I am trying to do is create ONE column called "Additional Details" where only certain elements, IF THEY EXIST, will populate. The search below technically works, but as you can probably see, it will just add a NULL value if the specified element from field3 does not exist. Is there a way to check for other values in the JSON object and populate those values in that single column only if they exist? I.e., if field3 can contain "Attribute Name", "Resource Name", and "ID", but many events have only one of these fields, is it possible to have the value populate in the "Additional Details" column only when it exists?

index=test field1=* field2=* | spath input=field3 #(which is a json_object)# | fillnull value=NULL | eval type=if(isnotnull(element_from_field3), ElementName, NULL) | stats count values(type) as "Additional Details" by Other
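One hedged approach, assuming the fields extracted by spath are literally named "Attribute Name", "Resource Name", and "ID" (adjust to the real extracted names): coalesce() returns its first non-null argument, so only elements that actually exist contribute, and no fillnull is needed:

```
index=test field1=* field2=*
| spath input=field3
| eval "Additional Details"=coalesce('Attribute Name', 'Resource Name', 'ID')
| stats count values("Additional Details") as "Additional Details" by Other
```

Note the single quotes around the field names inside eval — that is how SPL references fields whose names contain spaces. If several of these elements can coexist in one event and all of them should appear, mvappend() over the same fields is worth exploring as an alternative to coalesce().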
Is there some configuration to expand the number of characters displayed for another query, so that the complete query can be seen?
Hello, is there any way to know which applications were accessed by a user, rather than just logon/logoff activity, from the Windows event logs? Can someone help me with the search? Thanks
Good morning, I come to you because, after looking for an answer to my problem, my last resort is to seek help on the Splunk forum. Here is the context: I have hundreds of messages with identical node parameters; only the parameter values change. Example:

"jobs": dev "position": 3 "city": NY "name": Leo .......
"jobs": HR "position": 4 "city": CA "name": Mike ........

The problem is that these hundreds of messages are sometimes truncated because the responses are too large, and I would like to find a solution to display them in full. I had thought about increasing the capacity in Splunk, but this is not possible for my project, and the truncated logs are less than 1% of the total, so a big change for few logs is not really a good move. My second idea was to write a regex that finds the truncated message split into several pieces — is this possible? I also tried some regex to find my message, like this, but it is not working:

index="" | eval key="<value i want>" | table _raw

If not, maybe you have another idea? Thank you for your help and time. Have a good evening.
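For context, if the truncation happens at index time it is usually governed by the TRUNCATE setting in props.conf on the parsing tier; a sketch, with "my_json" as a purely illustrative sourcetype name:

```
# props.conf on the indexer or heavy forwarder that parses this data
[my_json]
# maximum line length in bytes; 10000 is the default, 0 disables truncation
TRUNCATE = 50000
```

One caveat worth stating plainly: a search-time regex cannot reassemble text that was truncated at index time, because the missing portion was never stored — only re-ingesting with a larger TRUNCATE recovers it.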
My Splunk query results are getting truncated when creating a table. Is there any workaround to avoid this?

index=gbi_* (AppName=*) | table SQL
Hi, first question here! I'm new to Splunk and I have a basic question about btool. With this command line:

/splunk btool outputs list --debug

is the first element in the (long) list the one that is applied when there is no outputs.conf in a deployed app on the heavy forwarder? Am I right? Thanks, Nico
Hello, I am having great difficulty understanding where to begin with the CIM data model. Can anybody clearly summarize the different ways to apply a CIM data model in my own apps? Thanks in advance
Can someone help me understand the totalResultCount function? I have looked at the documentation and spent an hour or two fiddling with it, but I can't figure out what it is supposed to do.
I need to install Splunk Enterprise and want to run the search head, indexer, and universal forwarder on the same system. Please advise.
Hi! We recently decided to move from Splunk on-prem to Cloud. Is there any quick way for me to upload my savedsearches.conf file from the on-prem instance to the Cloud instance? I am looking for a way where I don't have to manually copy my saved searches. Thanks!
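One hedged way to extract the saved-search definitions from the on-prem instance without copying files by hand is the rest search command, whose output can be exported and then re-created in Splunk Cloud (via the UI, a packaged app upload, or Cloud support):

```
| rest /servicesNS/-/-/saved/searches splunk_server=local
| table title, search, cron_schedule, eai:acl.app
```

This lists every saved search across apps with its SPL, schedule, and owning app; export the table as CSV to have a complete inventory to rebuild from. This is a sketch of one migration path, not the only supported route — packaging the searches into an app and uploading it is another common option.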
I have the Solarwinds add-on installed on a Linux HF. I am seeing this error:

+0000 log_level=WARNING, pid=28286, tid=Thread-4, file=ext.py, func_name=time_str2str, code_line_no=321 | [stanza_name="SolarwindAlerts"] Unable to convert date_string "2024-02-15T13:44:46.6370000" from format "%Y-%m-%dT%H:%M:%S.%f" to "%Y-%m-%dT%H:%M:%S.%f", return the original date_string, cause=Traceback (most recent call last):
  File "/opt/splunk/etc/apps/Splunk_TA_SolarWinds/bin/splunk_ta_solarwinds/aob_py3/cloudconnectlib/core/ext.py", line 304, in time_str2str
    dt = datetime.strptime(date_string, from_format)
  File "/opt/splunk/lib/python3.7/_strptime.py", line 577, in _strptime_datetime
    tt, fraction, gmtoff_fraction = _strptime(data_string, format)
  File "/opt/splunk/lib/python3.7/_strptime.py", line 362, in _strptime
    data_string[found.end():])
ValueError: unconverted data remains: 0

Can someone help? I have no data from Solarwinds. I tried reinstalling the add-on and reconfiguring it. It was working with the 8.* version of the HF; now we have upgraded to 9.1.3, and it is shown as supported on Splunkbase.