
All Posts

OK. There is a free track of entry-level trainings on the Splunk educational platform - the links from the new STEP page are a bit strange, so just search for "Splunk Search Beginner Learning Path". It's a relatively quick way to learn the basics. If you have any experience with unix shell scripting, it should be quite intuitive - you just pass your set of results through a sequence of commands connected with pipe characters. As @bowesmana showed, you can see what your search does right from the start and add more and more commands to see how they affect the results.

About the asset numbers - I assume you don't have this information in the events themselves, right? That makes it a slightly more advanced topic, because you need to store that information somewhere inside Splunk (most probably as a lookup) and correlate it with your events to use it for further filtering. There is probably more than one way to go about it, but the proper approach depends on the particular use case; it's not possible to give a general answer without knowing the details.
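To make the pipe idea concrete, here is a minimal sketch of how a search grows command by command and how a lookup could bring in asset numbers; the index, sourcetype, lookup file and field names (assets.csv, asset_number) are hypothetical, only the shape of the search matters:

index=firewall sourcetype=traffic ```pick the raw events```
| stats count BY host ```summarise them per host```
| lookup assets.csv host OUTPUT asset_number ```enrich each host with its asset number from the lookup```
| where asset_number="A-12345" ```keep only the asset you care about```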
Hi @indeed_2000, about the indexes: as I said, you could use one index or divide your data into a few manageable indexes based e.g. on the type of data flows. To have performant searches on a large amount of events, using only some predefined fields (not searching the raw logs), you could use a custom (or CIM-based) accelerated data model; I'm using one of them for a customer that needs to search quickly on 8 fields across billions of events. Then I configured a drilldown on the indexes to display the raw events, filtered with the results of the data model search. Ciao. Giuseppe
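For reference, searches against an accelerated data model of that kind are usually written with tstats; this is only a generic sketch assuming a CIM Network_Traffic data model, not the actual search Giuseppe describes:

| tstats summariesonly=true count FROM datamodel=Network_Traffic WHERE Network_Traffic.src_ip="10.1.2.3" BY Network_Traffic.src_ip Network_Traffic.dest_ip Network_Traffic.dest_port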
Hi @dcfrench3, good for you, see you next time! Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated by all the contributors.
Hi @RSS_STT, as I said, you can use my search and afterwards apply some conditions to filter the results. Ciao. Giuseppe
Hi @bowesmana, to avoid using the wrong name, I usually use the same name for the lookup file and its definition, so even when I use the csv name, I'm effectively using the definition. Ciao. Giuseppe
Hi @Veerendra, the requirement isn't so easy to solve because Splunk isn't a transactional database. You have three solutions:
1) use the Splunk Lookup Editor to manually modify the value without any control (easy);
2) create a JavaScript that updates the lookup and a dashboard that uses the JS (complicated, also to describe);
3) create some panels in the dashboard to update the lookup.
I'll describe the third one. In a few words, you should: visualize the lookup values in a dashboard panel with an in-dashboard drilldown active; click on a value; then, in another panel, display all the values with the updated row, adding at the end of the search the outputlookup command, which overwrites the entire lookup with the results of the second panel. It's difficult to give more details. Ciao. Giuseppe
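To make option 3 a bit more concrete, here is a minimal sketch of the kind of search the second panel could run; the lookup name (approvals.csv), the field names and the $click_name$ drilldown token are all hypothetical, only the inputlookup / eval / outputlookup pattern matters:

| inputlookup approvals.csv
| eval Approval=if(Name="$click_name$", "Approved", Approval) ```update only the clicked row```
| outputlookup approvals.csv ```write the whole lookup back, including the change```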
Is it possible to write partial data into a lookup file based on some condition? Something like: if dv_u_parent_class = ci_appld then outputlookup append=true abc.csv | where dv_u_parent_class != ci_appld, and the run-time query should show the remaining two events.

child  child_Name   dv_u_parent_class  fqdn_name    direction  name  parent
55555  xyz PROD     ci_appld           xyz.srv.com  R to Y     xyz   111111
55555  abc PROD     ci_appld           xyz.srv.com  R to Y     xyz   222222
55555  zzzz-FSE2    ci_netcom          xyz.srv.com  Y to R     xyz   333333
55555  abc.srv.com  ci_esx_app         xyz.srv.com  Y to R     xyz   444444
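If the fields above are already extracted at search time, one way to read this is as two searches; index=cmdb is only a placeholder for wherever these events live. First, write the matching rows to the lookup:

index=cmdb dv_u_parent_class=ci_appld
| table child child_Name dv_u_parent_class fqdn_name direction name parent
| outputlookup append=true abc.csv

Then the run-time query simply excludes that class, so only the remaining two events are shown:

index=cmdb dv_u_parent_class!=ci_appld
| table child child_Name dv_u_parent_class fqdn_name direction name parent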
Hi Team, I have the table in my dashboard as below:

Age  Approval  Name
61   Approve   Sujata
29   Approve   Linus
33   Approve   Karina
56   Approve   Rama

The requirement is to update Approve to Approved once the user clicks on a particular row, and the output should look like below:

Age  Approval  Name
61   Approved  Sujata
29   Approve   Linus
33   Approve   Karina
56   Approve   Rama
Hi everyone, I created a lookup table:

Department,Vendor,Type,url_domain,user,src_ip,Whitelisted
BigData,Material,Google Remote Desktop,Alpha.com,Alice,172.16.28.12,TRUE

Then I created a lookup definition with this match type:

WILDCARD(url_domain), WILDCARD(user), WILDCARD(src_ip)

Then I tested it with the following search, but it didn't work:

index=fortigate src_ip=172.16.28.12 url_domain=Alpha.com
| lookup Whitelist url_domain user src_ip
| where isnull(Whitelisted)
| table _time, severity, user, url_domain, src_ip, dest_ip, dest_domain, transport, dest_port, vendor_action, app, vendor_eventtype, subtype, devname

It shows all results, including traffic from 172.16.28.12 by Alice to the mentioned url. Does anyone have an idea what the issue is?
Hi @anooshac, you were probably really close to the solution for this - but it looks like using tokens to set the colour on tables only works before the table has loaded. The only workaround I found was to re-run the search each time you select a new radio button. Here's a working version for you to build from.

The radio buttons set the type:
Both and Type A => Highlight Columns A and B
Type B => Highlight Column B

The table has a format for ColumnA and ColumnB that looks for the right Type in the token:

<format type="color" field="ColumnB">
  <colorPalette type="expression">if(match($field_to_highlight|s$,".*TypeB.*"),"#FF0000", "")</colorPalette>
</format>

To make the dashboard redraw with the right columns highlighted, I added this to the query:

``` $field_to_highlight$ ```

That won't do anything to the search results, but will force the search to be run again. Here is a working dashboard:

<form version="1.1" theme="dark">
  <label>Answers for anooshac</label>
  <fieldset submitButton="false">
    <input type="radio" token="field_to_highlight" searchWhenChanged="true">
      <label>Option</label>
      <choice value="TypeA__TypeB">Both</choice>
      <choice value="TypeA_TypeB">Type A</choice>
      <choice value="TypeB">Type B</choice>
      <default>TypeA__TypeB</default>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <title>Content</title>
        <search>
          <query>| gentimes start=-20
| eval type=if(random()%2==0,"Type A","Type B")
| appendcols [| windbag]
| fields starttime, endtime, sample, lang, type
| rename starttime as "ColumnA", endtime as "ColumnB"
| head 20
``` $field_to_highlight$ ```</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
        </search>
        <option name="drilldown">none</option>
        <format type="color" field="ColumnA">
          <colorPalette type="expression">if(match($field_to_highlight|s$,".*TypeA.*"),"#FF0000", "")</colorPalette>
        </format>
        <format type="color" field="ColumnB">
          <colorPalette type="expression">if(match($field_to_highlight|s$,".*TypeB.*"),"#FF0000", "")</colorPalette>
        </format>
      </table>
    </panel>
  </row>
</form>

Hopefully that's close to what you were after.
Now you can use the eval command's bitwise operators.  https://docs.splunk.com/Documentation/SplunkCloud/9.1.2308/SearchReference/BitFunctions Your query will look like:  * | head 1 | eval x=2 | eval y=2 | eval z = bit_and(x, y) | table x y z  
Hi @vishalrohila,

You can create dynamic dropdowns by creating searches that populate based on the value of tokens. For example, you can create a dropdown that lets users pick either A or W, and that sets a token, $A_or_W$. Then, in the second dropdown, you can populate the list from a lookup, a search, or faked results, and filter it based on what should be shown for A or W.

Faking the search to make the options:

| makeresults
| eval parent="a;w"
| makemv parent delim=";"
| mvexpand parent
| eval options=case(parent="a", "a;b;c", parent="w", "x;y;z")
| makemv options delim=";"
| mvexpand options
| search parent="$A_or_W$"
| table options

With that configured, you get a dropdown for A or W selection, which then populates the second dropdown with ABC or XYZ. Your example was fairly generic, so your actual use case might be a better fit for a lookup with Parent / Label / Value fields. Either way, the list can be filtered by the first dropdown. Here's the Dashboard code for you to try out:

{"visualizations":{},
"dataSources":{"ds_gKyzOqv5":{"type":"ds.search","options":{"query":"| makeresults\n| eval parent=\"a;w\"\n| makemv parent delim=\";\"\n| mvexpand parent\n| eval options=case(parent=\"a\", \"a;b;c\",parent=\"w\", \"x;y;z\")\n| makemv options delim=\";\"\n| mvexpand options\n| search parent=\"$A_or_W$\"\n| table options"},"name":"dropdown"}},
"defaults":{"dataSources":{"ds.search":{"options":{"queryParameters":{"latest":"$global_time.latest$","earliest":"$global_time.earliest$"}}}}},
"inputs":{"input_global_trp":{"type":"input.timerange","options":{"token":"global_time","defaultValue":"-24h@h,now"},"title":"Global Time Range"},"input_gkLCoAVl":{"options":{"items":[],"token":"dd_7gHCFird","selectFirstSearchResult":true},"title":"Dropdown Input Title","type":"input.dropdown","dataSources":{"primary":"ds_gKyzOqv5"}},"input_VpWiOave":{"options":{"items":[{"label":"All","value":"*"},{"label":"A","value":"a"},{"label":"W","value":"w"}],"token":"A_or_W","defaultValue":"*"},"title":"Choose A or W","type":"input.dropdown"}},
"layout":{"type":"grid","options":{"width":1440,"height":960},"structure":[],"globalInputs":["input_global_trp","input_VpWiOave","input_gkLCoAVl"]},
"description":"",
"title":"test"}

Hopefully that helps.
Hi @MikeWilliams, please check the process here: https://docs.splunk.com/Documentation/Splunk/9.1.2/Indexer/Addclusterpeer Let us know if there are any issues or queries. Thanks.
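For reference, the documented process boils down to pointing the indexer at the cluster manager and restarting it; this is only a rough sketch, assuming Splunk 9.x CLI syntax and placeholder host name, port and secret, so please verify the exact flags against the linked page for your version:

splunk edit cluster-config -mode peer -manager_uri https://<manager-node>:8089 -replication_port 9887 -secret <cluster_secret>
splunk restart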
Hello everyone. There is one indexer cluster with one search head, one manager node, and three peers. The configuration is RF=3 and SF=3. There is also a non-clustered indexer, and many universal forwarders send data to it. I want this non-clustered indexer to join the cluster and have the cluster take over its incoming data, so that if this indexer fails, other peers in the cluster can store its data. What should I do?
Hello Mzorzi, did you manage to get the integration working successfully? I have a similar requirement.
Hello everyone, in the Investigation view, in the Workbench section, I want to add a different artifact type from the ones that appear (asset, identity, file, url); I would like an artifact type Process and another type Index. Where can I add custom artifact types to use in the Workbench?
Hi @senthild, more details are needed from your side. Is the data going from AWS Cloud to Splunk Cloud or to Splunk Enterprise? Any recent changes to the HEC inputs? Get details from the user about exactly which timeframe or logs are missing, and please check those logs yourself (many times the developers simply "think" something is missing). Also, please check these troubleshooting steps: https://docs.splunk.com/Documentation/SplunkCloud/9.1.2308/Data/TroubleshootHTTPEventCollector
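One quick check, assuming you can search _internal on the receiving side, is to look for HEC ingestion errors; HttpInputDataHandler is the component such errors are usually logged under, so adjust if your version differs:

index=_internal sourcetype=splunkd component=HttpInputDataHandler log_level=ERROR
| stats count BY host, log_level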
Hi @splunkN00b, please copy and paste your props.conf - something got messed up there. Please also provide some more details: what logs are these, and did you upload them manually or are they read through a UF?
Hi @Dallastek1, I assume this is about Enterprise Security content management; I'm not very sure of this task. I checked the documentation and found this: https://docs.splunk.com/Documentation/ES/7.3.0/Admin/Export Hope this is helpful. Thanks.
Hi @Qyusuff, could you share the UF version and Ubuntu version, please? Was it working previously and has this error only appeared recently? Any recent changes, upgrades, etc.? Was the UF installed manually or through tools like chef/puppet/salt? May I know if you have had a look at the Splunk log files on that UF?
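If it helps, the UF's own log is usually the quickest way to see what is failing; a rough sketch assuming the default Linux install path (/opt/splunkforwarder), adjust if you installed elsewhere:

sudo tail -n 100 /opt/splunkforwarder/var/log/splunk/splunkd.log
sudo grep -iE "error|warn" /opt/splunkforwarder/var/log/splunk/splunkd.log | tail -n 50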