All Posts

Did you add the CSS and also the table ID? Also, yes, the Min/Mid/Max is what you set in this statement:

| eval v=mvappend(v, tostring(case(perc<=10, 1, perc<=20, 2, perc<=100, 3)))

You set your ranges with that case() statement, so add a 1 for min, a 2 for mid and a 3 for max, and then in the format expression decide the colours you want.
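For reference, the format block that consumes those 1/2/3 markers looks like this sketch (the colours are placeholders; it mirrors the fuller dashboard example further down this thread):

<format type="color" field="v">
  <colorPalette type="expression">case(mvindex(value, 1) == "1", "#00FF00", mvindex(value, 1) == "2", "#FFBB00", mvindex(value, 1) == "3", "#FF0000")</colorPalette>
</format>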
Check permissions, both in Splunk (the meta entries) and on the filesystem. Also check whether your script runs when you call it with splunk cmd <app_path>/bin/lenlookup.py
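A rough sketch of those checks, assuming the lookup lives in an app called my_lookup_app (the app name and paths are assumptions):

# metadata/local.meta (or default.meta) in that app: share the transform beyond the app if needed
[transforms/test_lenlookup]
export = system

# filesystem: the script must be readable and executable by the user running Splunk
chmod 755 $SPLUNK_HOME/etc/apps/my_lookup_app/bin/lenlookup.py

# run it through Splunk's own Python to confirm it starts without errors
$SPLUNK_HOME/bin/splunk cmd python3 $SPLUNK_HOME/etc/apps/my_lookup_app/bin/lenlookup.py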
You are really just repeating the same question over several days without showing your own effort. I have a fairly elaborate response, including sample dashboards, in your other question, How to filter events using text box values. Please delete the repeated posts and work on the post where volunteers have provided you with the most information.
Thanks @PickleRick for your response. I am looking to get the count of commonly used browsers, such as Chrome, Firefox, Safari, Edge and Opera.
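A minimal sketch of one way to count those, assuming the user agent string is in a field called http_user_agent (the field name is an assumption; note the order of the case() branches matters, because Edge and Opera user agents also contain "Chrome", and Chrome user agents contain "Safari"):

your base search here
| eval browser=case(
    match(http_user_agent, "(?i)edg"), "Edge",
    match(http_user_agent, "(?i)opr|opera"), "Opera",
    match(http_user_agent, "(?i)chrome"), "Chrome",
    match(http_user_agent, "(?i)firefox"), "Firefox",
    match(http_user_agent, "(?i)safari"), "Safari")
| stats count by browser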
@bowesmana, thanks for the reply. It is working perfectly fine. I missed the HTML part for hiding the multivalue. Thank you so much!
One way you can do it is using the hidden multivalue technique, where you store a number (or even the colour) as a second value alongside the visible value and hide it through CSS. Then use a format expression to colour the cell. See this example:

<panel>
  <title>Colour formatting for ranges of percentages</title>
  <html depends="$hidden$">
    <style>
      #coloured_cell4 table tbody td div.multivalue-subcell[data-mv-index="1"]{
        display: none;
      }
    </style>
  </html>
  <table id="coloured_cell4">
    <search>
      <query>| makeresults count=10
| fields - _time
| eval v=random() % 100
| eventstats sum(v) as totv
| eval perc=round(v/totv*100)
| eval v=v." (".perc."%)"
| eval v=mvappend(v, tostring(case(perc&lt;=10, 1, perc&lt;=20, 2, perc&lt;=100, 3)))</query>
      <earliest>$earliest$</earliest>
      <latest>$latest$</latest>
    </search>
    <option name="refresh.display">progressbar</option>
    <format type="color" field="v">
      <colorPalette type="expression">case(mvindex(value, 1) == "1", "#00FF00", mvindex(value, 1) == "2", "#FFBB00", mvindex(value, 1) == "3", "#FF0000")</colorPalette>
    </format>
  </table>
</panel>
Hi all, I have a few columns whose values are in the format "21 (31%)", i.e. the value and the percentage of the value. I want to use Min/Mid/Max for the coloring based on the percentage, but I am not able to use it directly since it is a customized value. Does anyone know a solution for coloring such columns?
Thanks for your detailed reply. I created the Python script, added the transforms.conf file, added the lookup definition, updated the meta file, restarted Splunk and then ran:

| makeresults
| eval data="இடும்பைக்கு"
| lookup test_lenlookup data

I still get the error:

Error in 'lookup' command: Could not construct lookup 'test_lenlookup, data'. See search.log for more details.
Could not find 'lenlookup.py'. It is required for lookup 'test_lenlookup'.
The search job has failed due to an error. You may be able to view the job in the Job Inspector.

It looks like the issue is 90% solved, but something basic is still wrong and I am not sure what. OK, let me revisit the whole issue tomorrow. Thanks a lot for your help.
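For reference, a minimal sketch of the kind of transforms.conf stanza an external lookup like this typically needs (the app name, output field name and python.version line are assumptions, not taken from your setup). "Could not find 'lenlookup.py'" usually means the script is not in the bin directory of the app that defines the lookup, or is not readable by the splunk user:

# $SPLUNK_HOME/etc/apps/my_lookup_app/local/transforms.conf   (app name is hypothetical)
[test_lenlookup]
external_cmd = lenlookup.py data length
fields_list = data, length
python.version = python3

# the script itself must live in the same app's bin directory:
# $SPLUNK_HOME/etc/apps/my_lookup_app/bin/lenlookup.py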
Assuming this is the output of a search, then make the search do this with that data (this assumes raw is a field containing that data):

| eval json=json_array_to_mv(raw)
| fields - raw _time
| mvexpand json
| spath input=json
| fields - json
You can do this in a number of ways.

1. Use a lookup definition based on a lookup file with the messages you want to match.

Create a CSV lookup with the matches you are interested in and prefix/suffix them with *, e.g.

| makeresults format=csv data="match
*error events found for key*
*Invalid requestTimestamp*
*Exception while calling some API ...java.util.concurrent.TimeoutException*"
| outputlookup matches.csv

Set up a lookup definition based on that CSV and, in Advanced options, define the match type as WILDCARD(match).

Then in your search do

your search...
| lookup matches match as message OUTPUT match
| where isnotnull(match)
| stats count by match

So, you can see this actually working like this

| makeresults format=csv data="message
error events found for key a1
Invalid requestTimestamp abc
error event found for key a2
Invalid requestTimestamp def
correlationID - 1234 Exception while calling some API ...java.util.concurrent.TimeoutException
correlationID - 2345 Exception while calling some API ...java.util.concurrent.TimeoutException"
| lookup matches match as message OUTPUT match
| where isnotnull(match)
| stats count by match

Note that this produces only a SINGLE match for the "error events found" pattern, because your second example was "event", not "events". There are other ways to do the same, but it depends on what you're trying to do. Note that your lookup can contain additional fields you could output, e.g. a description, which you could OUTPUT instead to report on. Also note that the wildcard is *, so put the wildcard where you want it and it will match anything in between.
It would help to know what you've tried already so we don't waste time on that. Consider these props settings:

[mysourcetype]
DATETIME_CONFIG = current
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\{
TRANSFORMS-parse_mysourcetype = parse_mysourcetype

with these transforms:

[parse_mysourcetype]
REGEX = "([^"]+)":"([^"]+)
FORMAT = $1::$2
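Once those settings are in place and new data has been indexed, one quick way to sanity-check which fields are coming out is a sketch like this (the index name is an assumption; adjust to your data):

index=your_index sourcetype=mysourcetype
| head 100
| fieldsummary
| table field count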
sure, thank you.
Thanks for the reply. I don't think punct will work for my requirement, as I am creating an alert so it's not a one-time thing, but thanks for looking into it.
The fact that you have Rag_Status and count in your legend would indicate you have done a stats count, not a chart count. See the difference in the tabled output between using this

| makeresults count=200
| eval KPI_Score=random() % 100, KPI_Threshold=80, SWC="SWC:".(random() % 5)
| eval RAG_Status = case(
    KPI_Score >= KPI_Threshold, "Green",
    KPI_Score >= (KPI_Threshold - 5), "Amber",
    KPI_Score < (KPI_Threshold - 5), "Red"
  )
| chart count BY SWC RAG_Status
| sort SWC

and then using stats rather than chart in the same search. In the chart case, you should end up with columns SWC and then Amber, Green and Red, but if you use stats, you will get SWC, RAG_Status and count. In the first case you can stack the data perfectly OK.
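For comparison, the stats variant referred to above is the same generated data with the chart line swapped out, i.e.

| stats count by SWC RAG_Status
| sort SWC

which gives the (SWC, RAG_Status, count) shape of output that matches the legend you are seeing.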
@JandrevdM as your search is doing the same search twice, just with a different user, you'd be better off doing a single search and splitting by user, e.g. similar to your existing search

index=db_assets sourcetype=assets_ad_users ($user1$ OR $user2$)
| dedup displayName sAMAccountName memberOf
| makemv delim="," memberOf
| mvexpand memberOf
| rex field=memberOf "CN=(?<Group>[^,]+)"
| where Group!=""
| stats values(Group) as Groups by user

which will give you a user column and then a multivalue field with the list of groups.

If you then want to automatically show the differences between the two users, you can follow that with

| transpose 0 header_field=user
| eval UniqueU1=mvmap(User1, if(User1!=User2,User1,null()))
| eval UniqueU2=mvmap(User2, if(User2!=User1,User2,null()))
| eval Common=mvmap(User1, if(User1=User2,User1,null()))

and it will give you a list of groups unique to user 1, unique to user 2, and the common groups.

However, your existing search could be done more efficiently with

index=db_assets sourcetype=assets_ad_users ($user1$ OR $user2$)
| fields displayName sAMAccountName memberOf
| stats latest(*) as * by user
| eval memberOf=split(memberOf,",")
| rex field=memberOf max_match=0 "CN=(?<Group>.+)"
| fields - memberOf

If you really want a row-by-row breakdown of groups, you can do the base search and then just do this

| chart count over Group by user
| foreach * [ eval <<FIELD>>=if("<<FIELD>>"="Group", <<FIELD>>, if('<<FIELD>>'=1, "Member", "Missing")) ]

which will tell you the membership status of each group per user.
Thanks for the response. This will not work, as I am not searching for any specific text; I just shared a sample, and it can be anything.
Hi smart folks. I have the output of a REST API call as seen below. I need to split each of the records, as delimited by the {}, into its own event with each of the key:values defined for each record.

[
  {
    "name": "ESSENTIAL",
    "status": "ENABLED",
    "compliance": "COMPLIANT",
    "consumptionCounter": 17,
    "daysOutOfCompliance": "-",
    "lastAuthorization": "Dec 11,2024 07:32:21 AM"
  },
  {
    "name": "ADVANTAGE",
    "status": "ENABLED",
    "compliance": "EVALUATION",
    "consumptionCounter": 0,
    "daysOutOfCompliance": "-",
    "lastAuthorization": "Jul 09,2024 22:49:25 PM"
  },
  {
    "name": "PREMIER",
    "status": "ENABLED",
    "compliance": "EVALUATION",
    "consumptionCounter": 0,
    "daysOutOfCompliance": "-",
    "lastAuthorization": "Aug 10,2024 21:10:44 PM"
  },
  {
    "name": "DEVICEADMIN",
    "status": "ENABLED",
    "compliance": "COMPLIANT",
    "consumptionCounter": 2,
    "daysOutOfCompliance": "-",
    "lastAuthorization": "Dec 11,2024 07:32:21 AM"
  },
  {
    "name": "VM",
    "status": "ENABLED",
    "compliance": "COMPLIANT",
    "consumptionCounter": 2,
    "daysOutOfCompliance": "-",
    "lastAuthorization": "Dec 11,2024 07:32:21 AM"
  }
]

Thanks in advance for any help you all might offer to get me down the right track.
You're right in that location-based analysis can often highlight interesting things in data. Postal codes are common in many countries. I have used Australian postcodes, along with postcode population density information, to build some COVID-related dashboards a few years ago. It's also possible to do geocoding, e.g. using Google's API https://developers.google.com/maps/documentation/geocoding/overview (there are others), to convert addresses to lat/long and then also get postcode information. I have used that in the past to do distance calculations between GPS coordinates using the haversine formula, so you can then include a distance element in your events where relevant, e.g. to answer the question "where's the nearest...?" What is the challenge you face - is it getting reliable postcode data from your event data? You can sometimes find good sources of postcode-to-GPS coordinates; I found some downloadable Australian CSV files containing Suburb/Postcode/GPS coordinate data that I used as a lookup dataset, which you can then use in your dashboard.
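A minimal SPL sketch of that haversine calculation, assuming each event already has lat1/lon1 and lat2/lon2 fields in decimal degrees (the field names are assumptions):

| eval rlat1=lat1*pi()/180, rlat2=lat2*pi()/180
| eval dlat=(lat2-lat1)*pi()/180, dlon=(lon2-lon1)*pi()/180
| eval a=pow(sin(dlat/2),2) + cos(rlat1)*cos(rlat2)*pow(sin(dlon/2),2)
| eval distance_km=round(2*6371*atan2(sqrt(a), sqrt(1-a)), 2)

6371 is the Earth's mean radius in kilometres; use 3959 instead for miles.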
@Aresndiz The data in Splunk is the data being sent by that machine. What tells you that the data in Splunk is not the same as the data on the server? Splunk will not change the data coming from your server. I note that the table and the event list do not appear to have the same information, e.g. CPU instance 13 has a reading of 9.32 in your table, yet that number does not match any of the event data you show. Is this what you mean? CPU measurements are sometimes difficult to compare - in your example, you show data from a 16-core CPU with individual cores ranging from 7 to 60% and a total of 15%. What is the sampling rate of the readings being sent to Splunk? Each reading represents the average value since the previous reading, so if you use a different sampling interval when looking at data on your server, you may well see different values. You need to be comparing like with like.
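If it helps, one way to compare like with like is to aggregate in Splunk at the same interval the server-side tool uses. A minimal sketch, assuming Windows Perfmon-style CPU data with counter, instance and Value fields (the index, sourcetype and field names are assumptions; adjust to your data):

index=perfmon sourcetype="Perfmon:CPU" counter="% Processor Time"
| timechart span=1m avg(Value) by instance

Set span to match the sampling interval used on the server so both sides average over the same window.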
The use of makeresults is to show examples of how to use a technique, so what you need is the eval statement that sets the field 'color' based on the values of State_after. Add it after your stats command:

| eval color=case(State_after="DOWN", "#FF0000", State_after="ACTIVE", "#00FF00", State_after="STANDBY", "#FFBF00")