All Posts

I have a table that contains Notable Source events with a drilldown that links to a dashboard of those events. In edit mode the text contained in the table is white. However, when I switch to view mode the text changes to the default blue hyperlink color. I was wondering if there was a way to change the default color to be white before a user interacts with the table and then change to blue once the user clicks on a field in the table. I saw a post where someone had a similar question and I read the links provided in the solution; however, I'm new to XML so I still didn't really understand after reading the articles. Here is the post I referenced: https://community.splunk.com/t5/Dashboards-Visualizations/is-it-possible-to-apply-a-hex-color-to-change-the-font-of-the/m-p/536262. Any help would be greatly appreciated!
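One common approach (a sketch, not a drop-in solution) is to give the table an id in the dashboard's Simple XML and override the link colour from a hidden HTML panel. The id "notable_table", the hex values, and the exact CSS selector below are assumptions you would need to adapt to your own dashboard:

<row depends="$alwaysHideCSS$">
  <panel>
    <html>
      <style>
        /* hypothetical id - add id="notable_table" to your <table> element */
        #notable_table table tbody td a {
          color: #ffffff !important;   /* drilldown links render white before they are followed */
        }
        #notable_table table tbody td a:visited {
          color: #006ca9 !important;   /* fall back to a blue tone once the browser marks the link visited */
        }
      </style>
    </html>
  </panel>
</row>

The depends="$alwaysHideCSS$" token is never set, so the panel stays hidden and only the CSS takes effect. Whether a cell turns blue after a click relies on the browser's :visited handling, so verify the behaviour in your environment.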
Hello @ITWhisperer, Thanks for the response. The provided solution is working; however, if I edit this file and add more data, will this search still work the same way it does now? This file keeps getting updated after some time.
This new question should be a new posting.
Have you checked the splunk_db_connect*.log files for the error? What is it?
Try it one line at a time and see where it fails.
There is nothing to forgive. Most of us are not native English speakers, so sometimes putting your thoughts into words can be tricky. So you have your data ingested "properly" and just want to render the events output by your search as JSON structures? That's actually quite simple: <yoursearch> | tojson
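For example, a minimal sketch with made-up field names (assuming a Splunk version recent enough to ship the tojson command):

| makeresults
| eval user="alice", action="login", src_ip="10.0.0.1"   ``` fake sample fields ```
| tojson
| table _raw

By default tojson writes the JSON representation of each event into _raw, so the last line just displays it.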
Hello @ITWhisperer, Thanks for your response. I tried to add the search you provided but failed to get the desired value. Can you please elaborate further on how to use this solution?
Shouldn't your eval be something like this? | eval Grade=case(GPA=1,"D", GPA>1 AND GPA<=1.3,"D+")
Hi @PickleRick, forgive me, I fear I explained myself badly. Windows logs are coming directly from the Domain Controllers. They are ingested using UFs and transit through a HF, so the final flow is: DCs with UF installed -> HF -> Splunk Cloud environment. In addition to this, TA_windows is installed on both the HF and Splunk Cloud. So, we don't want to ingest data from a third-party forwarder; we want to know whether, with this environment and the above add-on installed, we are able to see logs in JSON format when we perform searches on the SH, or whether we can only see the Legacy and XML formats because, with this environment and this add-on, no other formats are supported.
| makeresults
| eval _raw="LIST F.PROTOCOL @ID PROTOCOL.ID PROCESS.DATE TIME.MSECS K.USER APPLICATION LEVEL.FUNCTION ID REMARK PAGE 1 11:34:02 23 NOV 2023
@ID............ 202309260081340532.2
@ID............ 202309260081340532.21
PROTOCOL.ID.... 202309260081340532.21
PROCESS.DATE... 20230926
TIME.MSECS..... 11:15:32:934
K.USER......... INPUTTER
APPLICATION.... AC.INWARD.ENTRY
LEVEL.FUNCTION. 1
ID.............
REMARK......... ENQUIRY - AC.INTERFACE.REPORT
@ID............ 202309260081340523.16
@ID............ 202309260081340523.16
PROTOCOL.ID.... 202309260081340523.16
PROCESS.DATE... 20230926
TIME.MSECS..... 11:15:23:649
K.USER......... INPUTTER
APPLICATION.... AC.INWARD.ENTRY
LEVEL.FUNCTION. 1
ID.............
REMARK......... ENQUIRY - AC.INTERFACE.REPORT"
``` The lines above set up sample data in line with your example ```
| rex max_match=0 "(?ms)(?<event>^\@ID.*?REMARK.*?$)"
| mvexpand event
| rex max_match=0 field=event "(?m)(?<namevalue>.+\.+\s.*$)"
| streamstats count as row
| mvexpand namevalue
| rex field=namevalue "(?<name>[^\s]+(?<!\.))\.*?\s(?<value>.*$)"
| eval {name}=value
| fields - name value namevalue event
| stats values(*) as * by row
| fields - row
Setting TRUNCATE = 0 under the [stream:ip] stanza did not help. Any other suggestions?
I suppose, judging by the section of Answers you posted it into, that you want to ingest the JSON-formatted Windows events supplied by a third-party "forwarder" (whatever it is - NXLog, Kiwi, winlogbeat...). You can ingest the events in any way you want, but unless they are in one of the two formats supported by TA_windows, you're on your own with parsing and such. See, for example, the https://community.splunk.com/t5/Getting-Data-In/Connect-winlogbeat-log-format-to-Splunk-TA-Windows/m-p/669783#M112304 thread for a similar question.
Hi, I want to find the grade based on my case condition, but my query is not working as expected: | eval Grade=case(Cumulative=1,"D", Cumulative>1 AND Cumulative<=1.3,"D+")] Example: my Grade should be based on avg(GPA). If avg(GPA) is 1, the Grade at the bottom (Avg Grade) should be D; if it is between 1 and 1.3, then it should be D+.
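Since the grade has to come from avg(GPA), one way to test the logic is to aggregate first and then apply case() to the average. This is only a sketch with made-up sample values; the extra branches ("F" and the catch-all) are placeholders for the rest of your grading scale:

| makeresults count=4
| streamstats count as n
| eval GPA=case(n=1, 0.9, n=2, 1.0, n=3, 1.2, n=4, 1.3)   ``` fake GPA values ```
| stats avg(GPA) as avg_gpa
| eval avg_gpa=round(avg_gpa, 2)
| eval Grade=case(avg_gpa<1, "F", avg_gpa=1, "D", avg_gpa>1 AND avg_gpa<=1.3, "D+", true(), "above D+")

With these sample values avg_gpa comes out to 1.1, so Grade evaluates to D+.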
Hello All, I am testing the data inputs for the Splunk Add-on for ServiceNow and there is a requirement to include only certain fields in the data. I tried to set the filtering using the "Included Parameters" option in the input and added the desired fields as a comma-separated list. However, I am not able to see those fields; what I see is only the two default id and time fields. I have included the following fields: dv_active,dv_assignment_group,dv_assigned_to,dv_number,dv_u_resolution_category But in the output I see only the default fields. Is there anything that I am doing wrong? Regards, Himani.
Hello community, below is my sample log file. I want to extract each individual event (starting from @ID through REMARK) from the log file. I tried to achieve this with the following regex: (^@ID[\s\S]*?REMARK.*$) This regex is taking the whole log file as a single event (snapshot attached below). I also tried to alter props.conf using the same regex:

[t24]
SHOULD_LINEMERGE=False
LINE_BREAKER=(^@ID[\s\S]*?REMARK.*$)
NO_BINARY_CHECK=true
disabled=false
INDEXED_EXTRACTIONS = csv

Sample log file:

LIST F.PROTOCOL @ID PROTOCOL.ID PROCESS.DATE TIME.MSECS K.USER APPLICATION LEVEL.FUNCTION ID REMARK PAGE 1 11:34:02 23 NOV 2023
@ID............ 202309260081340532.21
@ID............ 202309260081340532.21
PROTOCOL.ID.... 202309260081340532.21
PROCESS.DATE... 20230926
TIME.MSECS..... 11:15:32:934
K.USER......... INPUTTER
APPLICATION.... AC.INWARD.ENTRY
LEVEL.FUNCTION. 1
ID.............
REMARK......... ENQUIRY - AC.INTERFACE.REPORT
@ID............ 202309260081340523.16
@ID............ 202309260081340523.16
PROTOCOL.ID.... 202309260081340523.16
PROCESS.DATE... 20230926
TIME.MSECS..... 11:15:23:649
K.USER......... INPUTTER
APPLICATION.... AC.INWARD.ENTRY
LEVEL.FUNCTION. 1
ID.............
REMARK......... ENQUIRY - AC.INTERFACE.REPORT

Attaching a screenshot of the data I am getting in Splunk with the regex mentioned above, and also a snapshot of the regex result which I checked earlier online. I want my data to be shown in table form; the following is an example snapshot of how I want my data to appear in Splunk.
Use autoregress, something like this: | makeresults | eval a=mvrange(1,102) | mvexpand a | autoregress a p=1-100
While @bowesmana's solution should work, be aware that working with structured data using just regexes can lead to unforeseen problems.
Probably many people have had various issues. That's the thing with computers - if you do something badly, you have problems. But seriously - in order to set up monitoring properly you need to know what to monitor, think about how you want to monitor it, and verify that your monitoring solution supports that. It's a completely different thing if you just want to monitor server performance parameters via SNMP than if you want to do functional checks with Selenium to verify that the whole setup is working properly.
You mean something like | streamstats <optionally current=f> window=10 list(myfield) <optionally BY another_field>
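A runnable illustration with a hypothetical field name (current=f keeps the current event out of its own window):

| makeresults count=15
| streamstats count as n
| eval myfield="value_".n   ``` fabricate some sample values ```
| streamstats current=f window=10 list(myfield) as previous_10
| table n myfield previous_10

Each row then carries the myfield values from up to the ten preceding rows.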
@inventsekar, they recommended upgrading or updating the web.conf file in an on-prem environment. How can we do this, given that it's not Cloud but Enterprise?