All Posts

Hello everyone,

I am still relatively new to Splunk. I would like to add an additionalTooltipField to my maps visualization, so that when you hover over a marker point, more details about the marker appear. I have formulated the following query:

source="NeueIP.csv" host="IP" sourcetype="csv"
| rename Breitengrad as latitude, L__ngengrad as longitude, Stadt as Stadt, Kurzbeschreibung as Beschreibung
| eval CPU_Auslastung = replace(CPU_Auslastung, "%","")
| eval CPU_Auslastung = tonumber(CPU_Auslastung)
| eval CPU_Color = case(CPU_Auslastung > 80.0, "#de1d20", CPU_Auslastung > 50.0, "#54afda", true(), "#4ade1d")
| table Stadt, latitude, longitude, Kurzbeschreibung, Langbeschreibung, CPU_Auslastung, CPU_Color
| eval _time = now()

And I tried to adjust some things in the source code so that the additionalTooltipField appears. Most recently:

"visualizations": {
  "viz_map_1": {
    "type": "splunk.map",
    "options": {
      "center": [50.35, 17.36],
      "zoom": 4,
      "layers": [
        {
          "type": "marker",
          "latitude": "> primary | seriesByName('latitude')",
          "longitude": "> primary | seriesByName('longitude')",
          "dataColors": "> primary | seriesByName(\"CPU_Auslastung\") | rangeValue(config)",
          "additionalTooltipFields": "> primary | seriesByName(\"Stadt\")",
          "markerOptions": {
            "additionalTooltipFields": ["Stadt", "Kurzbeschreibung"]
          },
          "hoverMarkerPanel": {
            "enabled": true,
            "fields": ["Stadt", "Kurzbeschreibung"]
          }
        }
      ]
    },

My sample data is as follows:

Stadt, Breitengrad, Längengrad, Kurzbeschreibung, Langbeschreibung, CPU_Auslastung
Berlin, 52.52, 13.405, BE, Hauptstadt Deutschlands, 45%
London, 51.5074, -0.1278, LDN, Hauptstadt des Vereinigten Königreichs, 65%
Paris, 48.8566, 2.3522, PAR, Hauptstadt Frankreichs, 78%

Is my plan possible?

Thanks for your help in advance!!
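A side note on the SPL itself: the query renames Kurzbeschreibung to Beschreibung but then references Kurzbeschreibung in the table command, so that column will come back empty. A minimal sketch of a base search that keeps every field a tooltip could reference (field names taken from the sample data above; whether the layer options pick them up still depends on the Dashboard Studio version in use):

source="NeueIP.csv" host="IP" sourcetype="csv"
| rename Breitengrad as latitude, L__ngengrad as longitude
| eval CPU_Auslastung = tonumber(replace(CPU_Auslastung, "%", ""))
| eval CPU_Color = case(CPU_Auslastung > 80.0, "#de1d20", CPU_Auslastung > 50.0, "#54afda", true(), "#4ade1d")
| table Stadt, latitude, longitude, Kurzbeschreibung, Langbeschreibung, CPU_Auslastung, CPU_Color
| eval _time = now()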
I have the same problem. The webhook works for a couple of days and then fails. Did the cron job to restart the inputs work successfully as a workaround?
Thank you for your help. Can you help with how to create my own lookup from the indexed data? Thanks
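For reference, a lookup can be built straight from indexed events with outputlookup. A minimal sketch, assuming hypothetical index, sourcetype, and field names:

index=my_index sourcetype=my_sourcetype
| stats latest(_time) as last_seen by host, src_ip
| outputlookup my_custom_lookup.csv

The resulting file can then be read back with | inputlookup my_custom_lookup.csv, or matched against events with the lookup command.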
Hello. Thank you for the suggestion. I will look into it. I did not know I could embed reports.
(eventtype=axs_event_txn_visa_req_parsedbody "++EXT-ID[C0] FLD[Authentication Program..] FRMT[TLV] LL[1] LEN[2] DATA[01]")
| rex field=_raw "(?s)(.*?FLD\[Acquiring Institution.*?DATA\[(?<F19>[^\]]*).*)"
| rex field=_raw "(?s)(.*?FLD\[Authentication Program.*?DATA\[(?<FCO>[^\]]*).*)"
| rex field=_raw "(?s)(.*?FLD\[62-2 Transaction Ident.*?DATA\[(?<F62_2>[^\]]*).*)"
| stats values(F19) as F19, values(FCO) as FCO by F62_2
| where F19!=036 AND FCO=01
| append
    [ search eventtype=axs_event_txn_visa_rsp_formatting
      | rex field=_raw "(?s)(.*?FLD\[62-2 Transaction Ident.*?DATA\[(?<F62_2>[^\]]*).*)" ]
| stats values(F19) as F19, values(FCO) as FCO, values(txn_uid) as txn_uid, values(txn_timestamp) as txn_timestamp by F62_2
What I really want:

This is query 1:

(eventtype=axs_event_txn_visa_req_parsedbody "++EXT-ID[C0] FLD[Authentication Program..] FRMT[TLV] LL[1] LEN[2] DATA[01]")
| rex field=_raw "(?s)(.*?FLD\[Acquiring Institution.*?DATA\[(?<F19>[^\]]*).*)"
| rex field=_raw "(?s)(.*?FLD\[Authentication Program.*?DATA\[(?<FCO>[^\]]*).*)"
| rex field=_raw "(?s)(.*?FLD\[62-2 Transaction Ident.*?DATA\[(?<F62_2>[^\]]*).*)"
| stats values(F19) as F19, values(FCO) as FCO by F62_2
| where F19!=036 AND FCO=01

Output of query 1:

F62_2            F19  FCO
384011068172061  840  1
584011056069894  826  1

Query 2:

eventtype=axs_event_txn_visa_rsp_formatting
| rex field=_raw "(?s)(.*?FLD\[62-2 Transaction Ident.*?DATA\[(?<F62_2>[^\]]*).*)"
| stats values(txn_uid) as txn_uid, values(txn_timestamp) as txn_timestamp by F62_2

What I really want is to pass the output of query 1 as input to query 2; the common field between the two queries is F62_2. Run separately they produce different outputs, so the two queries should be combined: when the combined search runs, it should take the F62_2 values from query 1 and produce values(txn_uid) as txn_uid and values(txn_timestamp) as txn_timestamp for those transactions.
OK, thanks. Would you please share your thoughts on how to merge the two queries?
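One common way to chain them is a subsearch: run query 1 inside brackets, return only the F62_2 values, and let query 2 filter on them. Below is a sketch, assuming the F62_2 values appear verbatim in the raw events of the second search; renaming the field to "search" makes the subsearch hand back bare values that are matched against _raw. Note that subsearches are capped (by default 10,000 results and roughly 60 seconds), so this silently truncates if query 1 returns more than that.

eventtype=axs_event_txn_visa_rsp_formatting
    [ search (eventtype=axs_event_txn_visa_req_parsedbody "++EXT-ID[C0] FLD[Authentication Program..] FRMT[TLV] LL[1] LEN[2] DATA[01]")
      | rex field=_raw "(?s)(.*?FLD\[Acquiring Institution.*?DATA\[(?<F19>[^\]]*).*)"
      | rex field=_raw "(?s)(.*?FLD\[Authentication Program.*?DATA\[(?<FCO>[^\]]*).*)"
      | rex field=_raw "(?s)(.*?FLD\[62-2 Transaction Ident.*?DATA\[(?<F62_2>[^\]]*).*)"
      | stats values(F19) as F19, values(FCO) as FCO by F62_2
      | where F19!=036 AND FCO=01
      | fields F62_2
      | rename F62_2 as search ]
| rex field=_raw "(?s)(.*?FLD\[62-2 Transaction Ident.*?DATA\[(?<F62_2>[^\]]*).*)"
| stats values(txn_uid) as txn_uid, values(txn_timestamp) as txn_timestamp by F62_2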
Hi Team,

In a role we are providing the user read-only access, and have set up the capabilities, inheritance, resources, and restrictions. But that user is still able to delete queries and also delete reports. How do we hide the delete option for reports?

Please guide us through the process.
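As a first step when debugging this kind of issue, the capabilities a role actually ends up with (including inherited ones) can be inspected over REST. A sketch, assuming a hypothetical role name my_readonly_role:

| rest /services/authorization/roles splunk_server=local
| search title="my_readonly_role"
| fields title capabilities imported_capabilities srchIndexesAllowed

Also keep in mind that deleting a report is governed by write permission on that knowledge object, so the report's sharing and permission settings are worth checking alongside the role's capabilities.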
Hi,

Does anyone have experience in monitoring Azure Integration Services with AppDynamics? Suggestions on a setup would be appreciated. The services will be calling an on-premise .NET application; the ability to drill down into downstream calls is not a must but would be really nice to have.

br
Kjell Lönnqvist
2. Ensure 'nimalert' is stored under /opt/nimbus/bin/nimalarm; if not, you can change the path in TA-nimbusalerting/bin/nimbus.sh.

The add-on is written by a private individual (there is a gmail address on the Contact tab in Splunkbase) and is not very popular, so it's hard to tell what's going on without deeper debugging. I'd start by looking into _internal for any logs containing "nimbus".
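That triage can start with something as simple as the sketch below (nothing is assumed about the add-on beyond the search string itself); errors from the alert script typically land in the splunkd logs this pulls up:

index=_internal "nimbus"
| stats count by source, sourcetype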
While @bowesmana 's solution is correct, it might not be the fastest one. If your data hasn't already rolled past its retention date, you can check whether the licensing report is enough for you (but as far as I remember it only breaks usage down by host or by index). Unfortunately, if you want to measure the size of raw data (which is what you're asking about), you need to read all the raw data back from the time period you want to analyze, which is going to be painfully slow if your environment is of any decent size.
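For illustration, the brute-force measurement looks roughly like this sketch, assuming a hypothetical index and a 30-day window, and treating len(_raw) as an approximation of raw bytes:

index=my_index earliest=-30d@d latest=@d
| eval raw_bytes = len(_raw)
| stats sum(raw_bytes) as total_bytes
| eval total_gb = round(total_bytes / 1024 / 1024 / 1024, 2)

The licensing alternative reads the already-aggregated usage log instead, e.g. index=_internal source=*license_usage.log type=Usage, which is fast but only splits by fields such as idx (index) and h (host).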
Ugh. Docker.

But seriously, first things first. Check with normal openssl whether you can properly connect to the server. If not, the problem is on the server's side; if yes, it's on the client's.

openssl s_client -connect splunk.your.org.domain:8089 -CAfile path_to/your_rootCA.pem
How did you proceed further? We're also looking at integrating Backbase with Splunk for logging and monitoring purposes.
Hi @anooshac ,

good for you, see you next time!

Let me know if I can help you more, or, please, accept one answer for the other people of the Community.

Ciao and happy splunking

Giuseppe

P.S.: Karma Points are appreciated
Unfortunately, Splunk cannot automatically extract MSG_DATA correctly because the XML document contains double quotes. If MSG_DATA is always the last field in the event, you can use

| eval MSG_DATA = replace(_raw, ".+,\s*MSG_DATA=\"|\"$", "")
| spath input=MSG_DATA path=Message.additionalInfo.fileDetails.fileDetail.title
| table Message.additionalInfo.fileDetails.fileDetail.title

Your sample data (which includes an invalid fragment that I removed) results in

Message.additionalInfo.fileDetails.fileDetail.title
FABDC REDS Letter 11-222-333

Normally, I advise against treating structured data as text. But if you cannot be certain that MSG_DATA is the last field and cannot be certain of the exact terms that follow MSG_DATA, rex as @richgalloway suggested would be more stable.
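For completeness, one possible shape of that rex (the exact pattern suggested may differ); like the replace() version, this particular sketch still assumes the quoted MSG_DATA value runs to the end of the raw event:

| rex field=_raw "MSG_DATA=\"(?<MSG_DATA>.*)\"$"
| spath input=MSG_DATA path=Message.additionalInfo.fileDetails.fileDetail.title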
@jayeshrajvir wrote:

append and appendcols simply append the query, like a glue. Please correct me if I am wrong.

Not quite right - append adds events to the event pipeline, whereas appendcols adds fields to existing events, i.e. append is vertical "glue" whereas appendcols is horizontal "glue".

For completeness, appendpipe is also vertical "glue", but it uses the existing events pipeline as its base data rather than a new search.
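The difference is easy to see with synthetic data; a quick sketch using makeresults:

| makeresults count=2
| eval set="base"
| append [| makeresults count=2 | eval set="appended"]

returns four events stacked vertically, whereas

| makeresults count=2
| eval set="base"
| appendcols [| makeresults count=2 | eval extra="sideways"]

returns the original two events, each gaining an extra column from the second search.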
Hi All,

I have a dashboard which has a radio input with three options: Both, TypeA and TypeB. I also have a table. The requirement is that if I select Both or TypeA in the radio buttons, columnA and columnB in the table should be highlighted; if I select TypeB, only columnA should be highlighted. How can I achieve this? I have tried using a color palette expression like below, but with no luck. Does anyone have a solution for this?

<format type="color" field="columnA">
  <colorPalette type="list">["#00FFFF"]</colorPalette>
</format>
<format type="color" field="columnB">
  <colorPalette type="expression">if(match(Type,"TypeB")," ", "#00FFFF")</colorPalette>
</format>
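One workaround worth noting: a colorPalette expression can only evaluate fields present in the row, and tokens are often not substituted inside it, so a common trick is to copy the radio token into the results as a field. A sketch, assuming a hypothetical token name radio_token set by the radio input and a stand-in search:

| makeresults count=3
| eval columnA="a", columnB="b"
| eval Type="$radio_token$"
| table columnA, columnB, Type

With Type present in every row, an expression such as if(match(Type,"TypeB"), " ", "#00FFFF") on columnB should then behave as intended.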
Thank you @gcusello , I'll try using a lookup.
As you noted, "someLog" is just a text identifier to connect the two sets. I deduce that "consistencies" and "inconsistencies" are also mere text identifiers, not associated with a specific field. If this is correct, your problem can be clarified as: find values of someField that occur only in events containing the identifier term "inconsistencies" and never in events containing the term "consistencies". Stated this way, it is easy to translate into SPL:

sourcetype="my_source" someLog (consistencies OR inconsistencies)
| eval consistent_or_not = if(searchmatch("consistencies"), "consistent", "inconsistent")
| stats values(someField) as someField by consistent_or_not
| stats values(consistent_or_not) as consistent_or_not by someField
| where mvcount(consistent_or_not) < 2 AND consistent_or_not == "inconsistent"

Hope this helps.
No. I assume that there is simply no ?auto_extract_timestamp=true functionality for versions before 8.0. From 8.0 onward, it works OK.