All Posts



Thank you for your response. I added the $ sign to the return field ($actor); however, I am still getting the error below: Error in 'EvalCommand': Failed to parse the provided arguments. Usage: eval dest_key = expression.
Hi all, I am trying to fetch incident details from ServiceNow, but it's showing duplicate values:

index=acn_lendlease_certificate_tier3_idx tower=Entrust_Certificate
| join type=left source_host max=0
    [search index=acn_ac_snow_ticket_idx code_message=create uid="*Saml : Days to expire*" OR uid="*Self_Signed : Days to expire*" OR uid="*CA : Days to expire*" OR uid="*Entrust : Days to expire*"
    | rex field=_raw "\"(?<INC>INC\d+),"
    | rex field=uid "(?i)^(?P<source_host>.+?)__"
    | table INC uid log_description source_host
    | dedup INC uid log_description source_host
    | rename INC as "Ticket_Number"]
| fillnull value="NA" Ticket_Number
| stats latest(tower) as Tower, latest(source_host) as source_host, latest(metric_value) as "Days To Expire", latest(alert_value) as alert_value, latest(add_info) as "Additional Info" by instance, Ticket_Number
| eval alert_value=case(alert_value==100,"Active", alert_value==300,"About to Expire", alert_value==500,"Expired")
| search Tower="*" alert_value="*" alert_value="About to Expire"
| sort "Days To Expire"
| dedup instance
| rename instance as "Serial Number / Server ID", Tower as "Certificate Type", source_host as Certificate, alert_value as "Certificate Status"
Hi @trevor7, the Interesting Fields list only shows fields that are present in at least 20% of the displayed events; to see the others, go to "Other fields". You can force Splunk to display a field by adding (only for testing) the condition S42DSN_0010=* to your search, marking this field as Selected, and then removing the extra condition. This way you should see the field in your list. As for the fields not displayed by the table command, that is a separate issue I cannot check without access to your data: do you see them in Interesting Fields? The only check I can suggest is on the field name, which is case sensitive. Then, after selecting the field, you should see its value for each event, so you can confirm there is a value for it; maybe only a few rows have this field and you are seeing only the ones without it. Ciao. Giuseppe
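As a minimal sketch of the test described above (the index and sourcetype here are placeholders — substitute your own), the temporary condition would look like:

index=your_index sourcetype=your_sourcetype S42DSN_0010=*

Once the field appears in the sidebar and is marked as Selected, remove the S42DSN_0010=* filter again, so that events without the field are no longer excluded from the results.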
Nope. See the tables here for a comparison between Simple XML dashboards and Dashboard Studio ones: https://docs.splunk.com/Documentation/Splunk/latest/DashStudio/IntroFrame
Hi @Tajuddin, in addition to the questions from @PickleRick, I need to know: are the two conditions you listed combined with OR or AND? Also, what is the name of the field (in the main search) to compare with confroom_ipaddress from the checkin_rooms.csv lookup? I used IP_Address, but you can change it. In any case, adjust my main search accordingly:

index=fow_checkin
    [| inputlookup checkin_rooms.csv
    | rename confroom_ipaddress AS IP_Address
    | fields IP_Address ]
("IpAddress(from request body)" OR NOT "display button:panel-*")
| ...

Ciao. Giuseppe
Do you mean we can't use JS to customize dashboards created in Splunk dashboard studio?
There is quite a lot of flexibility in Simple XML based dashboards (you can insert custom CSS, custom JS, and so on), but Dashboard Studio, while nice and pretty, doesn't allow for as much customization.
Hi @JoshuaJJ, if the /var/log/syslog/192.168.1.1 folder contains only those three files, add * at the end of the monitor stanza:

[monitor:///var/log/syslog/192.168.1.1/*]
disabled = false
recursive = true
index = insight

Otherwise, use the whitelist option I suggested:

[monitor:///var/log/syslog/192.168.1.1/*/*/]
disabled = false
host_segment = 4
index = insight
whitelist = secure|cron|message

Ciao. Giuseppe
As to whether to use the latest version... well, that's a bit more complicated. As a rule of thumb - yes. The latest versions should contain fixes, vulnerability patches, and possibly new functionality. But in specific cases you might have unusual needs regarding compatibility, or particular bugs being (or not being) present in some versions. So again - it's not as straightforward as it would seem. In a newly installed environment I'd probably go for the latest version (possibly with the exception of the x.y.0 versions, and maybe x.y.1 as well), but later... I would definitely _not_ rush to upgrade the whole environment every time version x.y.z+1 comes out.
Hi aind, your suggestion sounded good, and I created a field alias from host to original_host for the source veeam as follows. Unfortunately, this didn't help. Do you have any other ideas, or did I do something wrong?
Hi @lostcauz3, good for you, see you next time! Yes, I always use the latest version of Splunk. Just one additional piece of information: as @PickleRick also said, check your volume requirements, because 5 GB/day and 1000 users are strange numbers. 5 GB/day is a very small installation, which usually doesn't require a distributed architecture, but 1000 users is a figure for a very large and complex infrastructure. Closing: see and follow the Splunk Architect certification path (if you have time), or engage a certified Splunk Partner for your design and implementation. Ciao and happy splunking, Giuseppe P.S.: Karma Points are appreciated by all the contributors
I'm building a dashboard in Splunk Dashboard Studio which consists of a lot of tiles containing images. On clicking any of the tiles, you get routed to a new dashboard in a new tab. I want to add a hover function to both the blue and orange tiles, so that as soon as the user hovers over any of the tiles, they see a brief description of the dashboard (which opens in a new tab). Is there a way to do that?
OK. While one of the obvious culprits is probably the exclusion (NOT "mon-tx-"), it would be very useful to see the job report because, depending on your data, you might also have problems elsewhere. For example, the http_status=5* condition, while seemingly harmless, can be very "heavy" if you have many different fields containing strings beginning with 5. So this might be a case for either some form of acceleration (summary indexing?) or one of the border cases where it's actually a good idea to use an indexed field. BTW, this is not JSON.
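As a hedged illustration of the indexed-field idea mentioned above (this assumes http_status were actually extracted at index time, which it is not by default, and the index name is a placeholder), the filter could then use the field::value syntax, which matches directly against indexed tokens instead of evaluating a search-time wildcard:

index=your_web_index http_status::5*

Because :: matching happens against the index itself, Splunk does not need to scan every 5-prefixed string in the raw events looking for a possible search-time extraction, which is where the "heavy" cost of http_status=5* comes from.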
This is something a Splunk Partner typically does for you - depending on your needs, an architect from the partner's side should design a proper environment for your particular situation. As a side note - 5 GB/day (unless you have a very unusual use case which requires a lot of processing) is a relatively small installation and can often be done using a single server (but that is not what I would recommend without seeing the whole picture, so don't quote me on that ;-)). Also, as @gcusello mentioned - that 1000 users value is either completely off if we're talking about Splunk users, or completely irrelevant if we're talking about users in your overall environment. Long story short - contact your local friendly Splunk Partner for assistance.
OK. If you do

[| loadjob savedsearch="myuserid:my_app:my_saved_search" | return actor]

Splunk will run the subsearch - load the saved search and return a string containing actor=something. Which means your main search will effectively be

index=my_pers_index sourcetype=ACCT | eval userid = actor=something

This is not valid SPL. Eval - as your error says - needs an assignment of field=value. You need to return just the value from your subsearch, and for that there is a special syntax:

index=my_pers_index sourcetype=ACCT | eval userid = [| loadjob savedsearch="myuserid:my_app:my_saved_search" | return $actor]
Apologies, I am new to SPL. My requirement is to get the values of a previously run saved search into a new field in the current search. And I have only changed the names in my original search; this is what I was trying to use:

index=my_pers_index sourcetype=ACCT | eval userid = [| loadjob savedsearch="myuserid:my_app:my_saved_search" | return actor]

I was getting: Error in 'EvalCommand': Failed to parse the provided arguments. Usage: eval dest_key = expression. Which possibly means the block of commands in [] is not returning a value as expected by eval. Any help on how I can get all the field values into a field in my current search would be appreciated.
Thanks for the response. Maciek Stopa on the Splunk Slack workspace provided this novel code, which has worked well. It doesn't handle the metrics and sc4s events, but I handled those in splunk_metadata.csv.

# cp example_postfilter.conf /opt/sc4s/local/config/app_parsers/
# systemctl restart sc4s

block parser app-dest-rewrite-index() {
    channel {
        rewrite {
            r_set_splunk_dest_update_v2(
                index("${.splunk.index}_unique-suffix")
            );
        };
    };
};

application app-dest-rewrite-index[sc4s-postfilter] {
    parser {
        app-dest-rewrite-index();
    };
};
I am running  Red Hat Enterprise Linux release 8.10 (Ootpa)  
1. This is not a valid SPL. Please post your literal search in a code block or preformatted paragraph. 2. What do you mean "unable to work"? What results are you getting?
I am trying to get the value of a field from a previously scheduled saved search into a new field using loadjob; however, I am unable to get it to work. I am using something like:

index=my_pers_index sourcetype=ACCT | eval userid = [| loadjob savedsearch="myuserid:my_app:my_saved_search" | return actor]

wherein:
myuserid - the owner id
my_app - the application name
my_saved_search - the name of the saved search, which is present in savedsearches.conf and is scheduled
actor - a field name in my_saved_search