As @bowesmana says, map is generally not suitable for what you are trying to do.  Instead of illustrating an imagined SPL snippet for volunteers to read your mind, it is better to ask yourself, and illustrate:

- What is a meaningful dataset to illustrate my problem?  Action: illustrate said dataset using text.  (A screenshot does not work.  Anonymize as needed.)
- What is the information I am trying to obtain?  Action: illustrate your desired output based on the dataset.
- What is the logic between my sample dataset and the desired output?  Use plain language, not SPL.  Make your intention clear in logical terms.  Use common mathematical/logical symbols if you like, but not SPL if you have any doubt about your code.
- If you illustrate some SPL that does not give you the desired output, also illustrate the actual results from the sample dataset.  Then, explain why the results differ from the desired output unless the reason is painfully obvious.

Before I try to read your mind, let me point out one critical point you need to clarify; I will use your "first search" to exemplify.  Are you trying to search for events with the terms "<ProcessName>" and "Exception occurred" only in source=user2, and then all events from source=user1?  Because that is what your first search does.  Your second search has the same logic; therefore, if that map command worked, events in source=user1 would always match.  Is this really your intention?

I have a strong suspicion that you want to search for events with the terms "<ProcessName>" and "Exception occurred" in either source=user1 or source=user2.  Is this correct?  I will assume so in the following.

This being said, based on the screenshot snippet you shared, you don't need to use regex or even spath to extract jobId because Splunk has clearly done that for you.  The field name is Properties.jobId.  All you need to do is match this field.
In other words, given these 8 simplified events (event number, source, _raw):

1 user1 {"Level": "Error", "MessageTemplate": "Exception occurred - something something", "Properties": { "jobId": "8ef3e2f8-35c4-4f0a-8553-cffd718640b", "message": "<ProcessNotName2> Exception occurred - Exception Source: System.Activities stuff, stuff" } }
2 user1 {"Level": "Error", "MessageTemplate": "Exception occurred - something more", "Properties": { "jobId": "8ef3e2f8-2903-4f0a-8553-cffd718640b", "message": "<ProcessName> Exception occurred - Exception Source: System.Activities stuff, stuff" } }
3 user1 {"Level": "Info", "MessageTemplate": "Exception did not occurr - something else", "Properties": { "jobId": "8ef3e2f8-1234-4f0a-8572-cffd718640b", "message": "Exception won't happen - blah" } }
4 user1 {"Level": "Info", "MessageTemplate": "Not exception - something else", "Properties": { "jobId": "8ef3e2f8-5678-4f0a-8553-cffd718640b", "message": "Nothing to see here - don't worry" } }
5 user2 {"Level": "Error", "MessageTemplate": "Exception occurred - something more", "Properties": { "jobId": "8ef3e2f8-35c4-4f0a-8553-cffd718640b", "message": "Exception occurred - Exception Source: System.Activities stuff, stuff" } }
6 user2 {"Level": "Error", "MessageTemplate": "Exception occurred - something something", "Properties": { "jobId": "8ef3e2f8-2903-4f0a-8553-cffd718640b", "message": "Exception occurred - Exception Source: System.Activities stuff, stuff" } }
7 user2 {"Level": "Info", "MessageTemplate": "Exception did not occurr - something else", "Properties": { "jobId": "8ef3e2f8-2903-4f0a-8572-cffd718640b", "message": "Exception won't happen - blah" } }
8 user2 {"Level": "Info", "MessageTemplate": "Not exception - something else", "Properties": { "jobId": "8ef3e2f8-2903-4f0a-8553-cffd718640b", "message": "Nothing to see here - don't worry" } }

you want to select 2, 6, and 8.
This is the search to use:   index="<indexname>" (source = "user1" OR source = "user2") [ search index="<indexname>" (source = "user1" OR source = "user2" ) "<ProcessName>" "Exception occurred" | stats values(Properties.jobId) AS Properties.jobId ]   This is the data emulation to generate the mock dataset posted above.  Play with it and compare with real data   | makeresults | eval data = mvappend( "{\"Level\": \"Error\", \"MessageTemplate\": \"Exception occurred - something something\", \"Properties\": { \"jobId\": \"8ef3e2f8-35c4-4f0a-8553-cffd718640b\", \"message\": \"<ProcessNotName2> Exception occurred - Exception Source: System.Activities stuff, stuff\" } }", "{\"Level\": \"Error\", \"MessageTemplate\": \"Exception occurred - something more\", \"Properties\": { \"jobId\": \"8ef3e2f8-2903-4f0a-8553-cffd718640b\", \"message\": \"<ProcessName> Exception occurred - Exception Source: System.Activities stuff, stuff\" } }", "{\"Level\": \"Info\", \"MessageTemplate\": \"Exception did not occurr - something else\", \"Properties\": { \"jobId\": \"8ef3e2f8-1234-4f0a-8572-cffd718640b\", \"message\": \"Exception won't happen - blah\" } }", "{\"Level\": \"Info\", \"MessageTemplate\": \"Not exception - something else\", \"Properties\": { \"jobId\": \"8ef3e2f8-5678-4f0a-8553-cffd718640b\", \"message\": \"Nothing to see here - don't worry\" } }" ) | mvexpand data | rename data AS _raw | spath | eval source = "user1" | append [| makeresults | eval data = mvappend( "{\"Level\": \"Error\", \"MessageTemplate\": \"Exception occurred - something more\", \"Properties\": { \"jobId\": \"8ef3e2f8-35c4-4f0a-8553-cffd718640b\", \"message\": \"Exception occurred - Exception Source: System.Activities stuff, stuff\" } }", "{\"Level\": \"Error\", \"MessageTemplate\": \"Exception occurred - something something\", \"Properties\": { \"jobId\": \"8ef3e2f8-2903-4f0a-8553-cffd718640b\", \"message\": \"Exception occurred - Exception Source: System.Activities stuff, stuff\" } }", "{\"Level\": 
\"Info\", \"MessageTemplate\": \"Exception did not occurr - something else\", \"Properties\": { \"jobId\": \"8ef3e2f8-2903-4f0a-8572-cffd718640b\", \"message\": \"Exception won't happen - blah\" } }", "{\"Level\": \"Info\", \"MessageTemplate\": \"Not exception - something else\", \"Properties\": { \"jobId\": \"8ef3e2f8-2903-4f0a-8553-cffd718640b\", \"message\": \"Nothing to see here - don't worry\" } }" ) | mvexpand data | rename data AS _raw | spath | eval source = "user2"] ``` the above emulates index="<indexname>" (source = "user1" OR source = "user2") ```   Using this emulation in both main search and subsearch, here is a full emulation:   | makeresults | eval data = mvappend( "{\"Level\": \"Error\", \"MessageTemplate\": \"Exception occurred - something something\", \"Properties\": { \"jobId\": \"8ef3e2f8-35c4-4f0a-8553-cffd718640b\", \"message\": \"<ProcessNotName2> Exception occurred - Exception Source: System.Activities stuff, stuff\" } }", "{\"Level\": \"Error\", \"MessageTemplate\": \"Exception occurred - something more\", \"Properties\": { \"jobId\": \"8ef3e2f8-2903-4f0a-8553-cffd718640b\", \"message\": \"<ProcessName> Exception occurred - Exception Source: System.Activities stuff, stuff\" } }", "{\"Level\": \"Info\", \"MessageTemplate\": \"Exception did not occurr - something else\", \"Properties\": { \"jobId\": \"8ef3e2f8-1234-4f0a-8572-cffd718640b\", \"message\": \"Exception won't happen - blah\" } }", "{\"Level\": \"Info\", \"MessageTemplate\": \"Not exception - something else\", \"Properties\": { \"jobId\": \"8ef3e2f8-5678-4f0a-8553-cffd718640b\", \"message\": \"Nothing to see here - don't worry\" } }" ) | mvexpand data | rename data AS _raw | spath | eval source = "user1" | append [| makeresults | eval data = mvappend( "{\"Level\": \"Error\", \"MessageTemplate\": \"Exception occurred - something more\", \"Properties\": { \"jobId\": \"8ef3e2f8-35c4-4f0a-8553-cffd718640b\", \"message\": \"Exception occurred - Exception Source: System.Activities stuff, 
stuff\" } }", "{\"Level\": \"Error\", \"MessageTemplate\": \"Exception occurred - something something\", \"Properties\": { \"jobId\": \"8ef3e2f8-2903-4f0a-8553-cffd718640b\", \"message\": \"Exception occurred - Exception Source: System.Activities stuff, stuff\" } }", "{\"Level\": \"Info\", \"MessageTemplate\": \"Exception did not occurr - something else\", \"Properties\": { \"jobId\": \"8ef3e2f8-2903-4f0a-8572-cffd718640b\", \"message\": \"Exception won't happen - blah\" } }", "{\"Level\": \"Info\", \"MessageTemplate\": \"Not exception - something else\", \"Properties\": { \"jobId\": \"8ef3e2f8-2903-4f0a-8553-cffd718640b\", \"message\": \"Nothing to see here - don't worry\" } }" ) | mvexpand data | rename data AS _raw | spath | eval source = "user2"] ``` the above emulates index="<indexname>" (source = "user1" OR source = "user2") ``` | search [makeresults | eval data = mvappend( "{\"Level\": \"Error\", \"MessageTemplate\": \"Exception occurred - something something\", \"Properties\": { \"jobId\": \"8ef3e2f8-35c4-4f0a-8553-cffd718640b\", \"message\": \"<ProcessNotName2> Exception occurred - Exception Source: System.Activities stuff, stuff\" } }", "{\"Level\": \"Error\", \"MessageTemplate\": \"Exception occurred - something more\", \"Properties\": { \"jobId\": \"8ef3e2f8-2903-4f0a-8553-cffd718640b\", \"message\": \"<ProcessName> Exception occurred - Exception Source: System.Activities stuff, stuff\" } }", "{\"Level\": \"Info\", \"MessageTemplate\": \"Exception did not occurr - something else\", \"Properties\": { \"jobId\": \"8ef3e2f8-1234-4f0a-8572-cffd718640b\", \"message\": \"Exception won't happen - blah\" } }", "{\"Level\": \"Info\", \"MessageTemplate\": \"Not exception - something else\", \"Properties\": { \"jobId\": \"8ef3e2f8-5678-4f0a-8553-cffd718640b\", \"message\": \"Nothing to see here - don't worry\" } }" ) | mvexpand data | rename data AS _raw | spath | eval index = "<indexname>", source = "user1" | append [| makeresults | eval data = mvappend( 
"{\"Level\": \"Error\", \"MessageTemplate\": \"Exception occurred - something more\", \"Properties\": { \"jobId\": \"8ef3e2f8-35c4-4f0a-8553-cffd718640b\", \"message\": \"Exception occurred - Exception Source: System.Activities stuff, stuff\" } }",
"{\"Level\": \"Error\", \"MessageTemplate\": \"Exception occurred - something something\", \"Properties\": { \"jobId\": \"8ef3e2f8-2903-4f0a-8553-cffd718640b\", \"message\": \"Exception occurred - Exception Source: System.Activities stuff, stuff\" } }",
"{\"Level\": \"Info\", \"MessageTemplate\": \"Exception did not occurr - something else\", \"Properties\": { \"jobId\": \"8ef3e2f8-2903-4f0a-8572-cffd718640b\", \"message\": \"Exception won't happen - blah\" } }",
"{\"Level\": \"Info\", \"MessageTemplate\": \"Not exception - something else\", \"Properties\": { \"jobId\": \"8ef3e2f8-2903-4f0a-8553-cffd718640b\", \"message\": \"Nothing to see here - don't worry\" } }" )
| mvexpand data
| rename data AS _raw
| spath
| eval source = "user2"]
| search "<ProcessName>" "Exception occurred"
``` the above emulates index="<indexname>" (source = "user1" OR source = "user2") "<ProcessName>" "Exception occurred" ```
| stats values(Properties.jobId) as Properties.jobId ]

The output is these three events:

source _raw
user1 {"Level": "Error", "MessageTemplate": "Exception occurred - something more", "Properties": { "jobId": "8ef3e2f8-2903-4f0a-8553-cffd718640b", "message": "<ProcessName> Exception occurred - Exception Source: System.Activities stuff, stuff" } }
user2 {"Level": "Error", "MessageTemplate": "Exception occurred - something something", "Properties": { "jobId": "8ef3e2f8-2903-4f0a-8553-cffd718640b", "message": "Exception occurred - Exception Source: System.Activities stuff, stuff" } }
user2 {"Level": "Info", "MessageTemplate": "Not exception - something else", "Properties": { "jobId": "8ef3e2f8-2903-4f0a-8553-cffd718640b", "message": "Nothing to see here - don't worry" } }
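To see the subsearch-then-filter logic in isolation, here is a minimal sketch in plain Python. This is not how Splunk executes the search; it only approximates term matching with substring checks, and uses just the jobId and message fields from the mock events above.

```python
# Events mirror the 8 simplified mock events: (number, source, jobId, message).
events = [
    (1, "user1", "8ef3e2f8-35c4-4f0a-8553-cffd718640b", "<ProcessNotName2> Exception occurred"),
    (2, "user1", "8ef3e2f8-2903-4f0a-8553-cffd718640b", "<ProcessName> Exception occurred"),
    (3, "user1", "8ef3e2f8-1234-4f0a-8572-cffd718640b", "Exception won't happen - blah"),
    (4, "user1", "8ef3e2f8-5678-4f0a-8553-cffd718640b", "Nothing to see here"),
    (5, "user2", "8ef3e2f8-35c4-4f0a-8553-cffd718640b", "Exception occurred"),
    (6, "user2", "8ef3e2f8-2903-4f0a-8553-cffd718640b", "Exception occurred"),
    (7, "user2", "8ef3e2f8-2903-4f0a-8572-cffd718640b", "Exception won't happen - blah"),
    (8, "user2", "8ef3e2f8-2903-4f0a-8553-cffd718640b", "Nothing to see here"),
]

# "Subsearch": collect jobIds from events matching both terms.
job_ids = {job for _, _, job, msg in events
           if "<ProcessName>" in msg and "Exception occurred" in msg}

# "Main search": keep every event whose jobId is in that set.
selected = [n for n, _, job, _ in events if job in job_ids]
print(selected)  # -> [2, 6, 8]
```

Note that events 6 and 8 are selected even though they never mention "<ProcessName>" themselves; they share a jobId with event 2, which is exactly what the subsearch achieves.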
Thanks, as per my first post I'd like to join 2 searches together on NB and retain all columns. I am able to retain all columns, but some rows are filled and some are not (even though NB definitely matches in both searches).

Search 1:

index=sky sourcetype=sky_trade_murex_timestamp
| rex field=_raw "trade_id=\"(?<trade_id>\d+)\""
| rex field=_raw "mx_status=\"(?<mx_status>[^\"]+)\""
| rex field=_raw "sky_id=\"(?<sky_id>\d+)\""
| rex field=_raw "event_id=\"(?<event_id>\d+)\""
| rex field=_raw "operation=\"(?<operation>[^\"]+)\""
| rex field=_raw "action=\"(?<action>[^\"]+)\""
| rex field=_raw "tradebooking_sgp=\"(?<tradebooking_sgp>[^\"]+)\""
| rex field=_raw "portfolio_name=\"(?<portfolio_name>[^\"]+)\""
| rex field=_raw "portfolio_entity=\"(?<portfolio_entity>[^\"]+)\""
| rex field=_raw "trade_type=\"(?<trade_type>[^\"]+)\""
| rename trade_id as NB
| table sky_id, NB, event_id, mx_status, operation, action, tradebooking_sgp, portfolio_name, portfolio_entity, trade_type

Search 2:

index=sky sourcetype=mx_to_sky
| rex field=_raw "(?<NB>\d+);(?<TRN_STATUS>[^;]+);(?<NOMINAL>[^;]+);(?<CURRENCY>[^;]+);(?<TRN_FMLY>[^;]+);(?<TRN_GRP>[^;]+);(?<TRN_TYPE>[^;]*);(?<BPFOLIO>[^;]*);(?<SPFOLIO>[^;]*)"
| eval NB = tostring(trim(NB))
| table TRN_STATUS, NB, NOMINAL, CURRENCY, TRN_FMLY, TRN_GRP, TRN_TYPE, BPFOLIO, SPFOLIO
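One common reason some joined rows come back empty is that NB is not extracted, or not byte-for-byte identical, on both sides. The second search's extraction can be checked outside Splunk with the same named-group regex; the sample line below is invented for illustration only.

```python
import re

# Same named-group pattern as the rex in the second search.
# Note [^;]* (zero or more) for TRN_TYPE/BPFOLIO/SPFOLIO allows empty fields.
pattern = re.compile(
    r"(?P<NB>\d+);(?P<TRN_STATUS>[^;]+);(?P<NOMINAL>[^;]+);(?P<CURRENCY>[^;]+);"
    r"(?P<TRN_FMLY>[^;]+);(?P<TRN_GRP>[^;]+);(?P<TRN_TYPE>[^;]*);"
    r"(?P<BPFOLIO>[^;]*);(?P<SPFOLIO>[^;]*)"
)

line = "12345;LIVE;1000000;USD;IRD;SWAP;;BOOK1;BOOK2"  # hypothetical event
m = pattern.search(line)
assert m is not None
fields = m.groupdict()

# Normalize the join key the same way on both sides (strip whitespace).
nb = fields["NB"].strip()
print(nb, repr(fields["TRN_TYPE"]))  # -> 12345 ''
```

If the first search's trade_id ever carries leading zeros or surrounding whitespace that the second search's NB does not (or vice versa), the join key will silently fail to match for those rows, which would produce exactly the "some rows filled, some not" symptom.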
There is no such option because reports are sent unconditionally. If you wish to send email only when there are results then consider changing the report to an alert.
Hi @Jado95, Is your question specific to the Splunk Add-on for Cisco ASA or to Cisco ASA itself? The message format is defined by Cisco ASA, and the add-on implementation should agree with the Cisco ASA documentation at https://www.cisco.com/c/en/us/td/docs/security/asa/syslog/b_syslog/syslog-messages-302003-to-342008.html:

302013 ... If inbound is specified, the original control connection was initiated from the outside. For example, for FTP, all data transfer channels are inbound if the original control channel is inbound. If outbound is specified, the original control connection was initiated from the inside. ...

302015 ... If inbound is specified, then the original control connection is initiated from the outside. For example, for UDP, all data transfer channels are inbound if the original control channel is inbound. If outbound is specified, then the original control connection is initiated from the inside.

The corresponding teardown events, 302014 and 302016, do not specify a direction, so without prior knowledge, the field extraction can't know which address is the initiator. If needed, you can correlate the events by the session_id field. This example is slow and ugly; it's only meant to demonstrate the correlation:

| eventstats values(direction) as direction by session_id
| eval src_ip_tmp=src_ip, dest_ip_tmp=dest_ip,
    src_ip=if(lower(vendor_action)=="teardown" && lower(direction)=="outbound", dest_ip_tmp, src_ip_tmp),
    dest_ip=if(lower(vendor_action)=="teardown" && lower(direction)=="outbound", src_ip_tmp, dest_ip_tmp)
| fields - src_ip_tmp dest_ip_tmp
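Conceptually, the correlation propagates the direction from build events to teardown events sharing the same session_id, then swaps addresses on outbound teardowns. Here is that idea as a plain-Python sketch (not the add-on's implementation; session IDs and RFC 5737 addresses are made up):

```python
# Build event carries a direction; the matching teardown does not.
events = [
    {"session_id": "1001", "vendor_action": "built", "direction": "outbound",
     "src_ip": "10.0.0.5", "dest_ip": "203.0.113.9"},
    {"session_id": "1001", "vendor_action": "teardown", "direction": None,
     "src_ip": "203.0.113.9", "dest_ip": "10.0.0.5"},
]

# Analogue of: eventstats values(direction) as direction by session_id
direction_by_session = {}
for e in events:
    if e["direction"]:
        direction_by_session[e["session_id"]] = e["direction"]

# Analogue of the eval: swap src/dest on outbound teardowns.
for e in events:
    d = direction_by_session.get(e["session_id"])
    if e["vendor_action"] == "teardown" and d == "outbound":
        e["src_ip"], e["dest_ip"] = e["dest_ip"], e["src_ip"]

print(events[1]["src_ip"])  # -> 10.0.0.5 (the original initiator)
```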
Unfortunately, we as community users cannot do anything about this. From time to time it can take even a day or two to receive this email.
I registered for the 14-day Free Trial of Splunk Cloud Platform. I registered my email address and verified it. I expected to receive an email entitled "Welcome to Splunk Cloud Platform" with the corresponding links from which to access and use a trial version of Splunk Cloud. Several hours later, that email still has not arrived, and there is no trace of it in my inbox's "Spam" or "Trash" folders either. Please look into this and advise. Thanks! -Rolland
Hello, I have a report scheduled every week whose results are exported to PDFs. Is there an option to NOT email if no results are found? Sometimes these PDFs have nothing in them. Thanks
I have the Splunk Add-on for Google Cloud Platform set up on an IDM server. I am currently on version 4.4 and have inputs set up from months ago. However, I am now trying to send more data to Splunk, and for some reason the inputs page no longer loads. The connection seems to be fine, as I am still receiving expected data from my previous inputs, but when I try to add another input I get the following error:

Failed to Load Inputs Page
This is normal on Splunk search heads as they do not require an Input page. Check your installation or return to the configuration page.
Details
AxiosError: Request failed with status code 500
I would use list() instead of values() to prevent removal of duplicates and wrap the product in exact() to prevent rounding errors:

| makeresults format=csv data="value_a
0.44
0.25
0.67
0.44"
| stats list(value_a) as value_a
| eval "product(value_a)"=1
| foreach value_a mode=multivalue
    [ eval "product(value_a)"=exact('product(value_a)' * <<ITEM>>) ]
| table "product(value_a)"

This produces:

product(value_a)
0.032428
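The same computation can be sanity-checked outside Splunk; the values are copied from the makeresults above, and the set-based variant shows why deduplication (what values() does) would change the answer:

```python
import math

values = [0.44, 0.25, 0.67, 0.44]

# Product over the full list, as list() preserves the duplicate 0.44.
product = math.prod(values)
print(round(product, 6))  # -> 0.032428

# A set drops the duplicate, the way values() would, giving a different result.
deduped = math.prod(set(values))
print(round(deduped, 6))  # -> 0.0737
```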
Hi @madhav_dholakia, In Simple XML, you can generate categorical choropleth maps with color-coded city boundaries: In Dashboard Studio, however, choropleth maps are limited to numerical distributions, e.g. OpenIssues by city, and the geospatial lookup geometry isn't always interpreted correctly: In both examples, I've used mapping data published by the Office for National Statistics at https://geoportal.statistics.gov.uk. Search for BDY_TCITY DEC_2015 to download the corresponding KML file. As a compromise, you can use a marker map to display color-coded markers at city centers by latitude and longitude: Here's the source: { "visualizations": { "viz_KxsdmDQb": { "type": "splunk.map", "options": { "center": [ 52.560559999999924, -1.4702799999984109 ], "zoom": 6, "layers": [ { "type": "marker", "dataColors": "> primary | seriesByName('Status') | matchValue(colorMatchConfig)", "choroplethOpacity": 0.75, "additionalTooltipFields": [ "Status", "StoreID", "City", "OpenIssues" ], "latitude": "> primary | seriesByName('lat')", "longitude": "> primary | seriesByName('lon')" } ] }, "context": { "colorMatchConfig": [ { "match": "Dormant/Green", "value": "#118832" }, { "match": "Warning/Amber", "value": "#cba700" }, { "match": "Critical/Red", "value": "#d41f1f" } ] }, "dataSources": { "primary": "ds_CtvaIPJ3" } } }, "dataSources": { "ds_CtvaIPJ3": { "type": "ds.search", "options": { "query": "| makeresults format=csv data=\"StoreID,City,OpenIssues,Status,lat,lon\r\nStore 1,London,3,Critical/Red,51.507222,-0.1275\r\nStore 2,York,2,Warning/Amber,53.96,-1.08\r\nStore 3,Bristol,0,Dormant/Green,51.453611,-2.5975\r\nStore 4,Liverpool,1,Warning/Amber,53.407222,-2.991667\" \r\n| table StoreID City OpenIssues Status lat lon", "queryParameters": { "earliest": "-24h@h", "latest": "now" } }, "name": "Choropleth map search" } }, "defaults": { "dataSources": { "ds.search": { "options": { "queryParameters": {} } } } }, "inputs": {}, "layout": { "type": "absolute", "options": { "width": 918, 
"height": 500, "display": "auto" }, "structure": [ { "item": "viz_KxsdmDQb", "type": "block", "position": { "x": 0, "y": 0, "w": 918, "h": 500 } } ], "globalInputs": [] }, "description": "", "title": "eaw_store_status_ds" } As a static workaround, the Choropleth SVG visualization allows you to upload a custom image, e.g. a stylized map of England and Wales, and define custom SVG boundaries and categorical colors. The Dashboard Studio documentation includes a basic tutorial at https://docs.splunk.com/Documentation/Splunk/latest/DashStudio/mapsChorSVG.
So the UF recognizes the time change when it writes this message into the log, but its scheduler doesn't account for it correctly when scheduling the next run. Definitely time to create a Splunk support case.
Thanks, I found this solution insightful and helpful for a similar scenario I am working on.
Hi all, are there any steps to follow to ingest transactional data from a TIBCO database into Splunk without any add-ons?
How quickly does that software correct the node's time after hibernation? Basically, after that the UF's cron schedule should work as expected. If not, then I propose that you create a support case with Splunk.
@isoutamo, no, I have not tried spath. Could you please guide me with that?

I tried the search below; it's showing events, but I am not getting the transaction-level information:

index="<indexname>" source = "user1" OR source = "user2" "<ProcessName>" "Exception occurred"
| spath
| table _time JobId TransactionId _raw
| search JobId=*
| append
    [ search index="<indexname>" source = "user1" OR source = "user2"
      | spath
      | search JobId=*
      | table _time JobId TransactionId _raw ]
| stats dc(TransactionId) as UniqueTransactionCount values(TransactionId) as UniqueTransactions by JobId
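For reference, the final stats line computes a distinct count and the set of distinct TransactionId values per JobId. Conceptually (with made-up rows, not the actual data):

```python
from collections import defaultdict

# Hypothetical rows after spath: one dict per event.
rows = [
    {"JobId": "job-1", "TransactionId": "t1"},
    {"JobId": "job-1", "TransactionId": "t2"},
    {"JobId": "job-1", "TransactionId": "t2"},  # duplicate transaction
    {"JobId": "job-2", "TransactionId": "t3"},
]

# Analogue of: stats dc(TransactionId) ... values(TransactionId) ... by JobId
transactions = defaultdict(set)
for r in rows:
    transactions[r["JobId"]].add(r["TransactionId"])

counts = {job: len(txns) for job, txns in transactions.items()}
print(counts)  # -> {'job-1': 2, 'job-2': 1}
```

If TransactionId is never extracted (e.g. the field is nested under a different JSON path), dc() returns 0 and values() is empty, which matches the "events but no transaction-level information" symptom.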
This is from the documentation:

Best practices for creating chain searches
Use these best practices to make sure that chain searches work as expected.

Use a transforming base search
A base search should be a transforming search that returns results formatted as a statistics table. For example, searches using the following commands are transforming searches: stats, chart, timechart, and geostats, among others. For more information on transforming commands, see About transforming commands in the Search Manual.

https://docs.splunk.com/Documentation/SplunkCloud/latest/DashStudio/dsChain#Best_practices_for_creating_chain_searches
Have you tried the spath command, since you have JSON data?