Monitoring Splunk

Global users having slowness in Splunk

robertlynch2020
Motivator

Hi

I have Splunk installed in Paris and we are noticing a large amount of slowness in Splunk when users are logged in over the WAN.
We expect some slowness since we are a long distance away; however, it is nearly unusable.

I have a lot of logic in my dashboards, so perhaps that is what is causing it. However, is there anything else I can do?

For example, if a user logs in from NY they would almost need to use Citrix to get a good connection. As Splunk is web based and most of the heavy lifting is done on the server side, I am surprised.
Below is an example of one of the searches that one of my dashboards could be running. I could have this run 10 times in parallel in one dashboard, so on the LAN it is fast, but over a long distance it becomes very difficult to use.

Would I need to install part of Splunk on servers locally to speed this up? If so, what parts, and how do I do this?
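
If distributing the deployment is the answer, I am assuming it would mean something like standing up a second search head close to the NY users and pointing it at the existing Paris indexer as a search peer. A rough sketch, with hypothetical host names and credentials, not a tested procedure:

    # on the new search head near the users, peer it to the existing Paris indexer
    splunk add search-server https://paris-indexer.example.com:8089 -auth admin:changeme -remoteUsername admin -remotePassword changeme
    splunk restart

The searches would still run against the Paris indexer over the WAN, but the Splunk Web session itself would be served locally.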

 <panel depends="$MXTIMING_ON_OFF$,$MXTIMING_ON_ALL_PANELLS$">
      <title>Filters: Command=$MXTIMING_Command_token$ Context=$MXTIMING_Context_token$ PATH=$source_path_search_token$ User=$MXTIMING_UserName_token$ NPID=$MXTIMING_NPID_token$ TYPE=$MXTIMING_TYPE_TOKEN$ Services=$NICKNAME_TOKEN$ Tags=$TAG_TOKEN$</title>

      <single>
        <title>TAGs Over $MAX_TIME$ and Threshold: Services=$NICKNAME_TOKEN$</title>
        <search>
          <query>| tstats summariesonly=$summariesonly_token$ avg(MXTIMING.Elapsed) AS average FROM datamodel=MXTIMING_V2 WHERE
              host=$host_token$
              AND MXTIMING.Elapsed &gt; $MAX_TIME$
              AND MXTIMING.source_path IN ($source_path_search_token$)
              AND MXTIMING.UserName2 IN ($MXTIMING_UserName_token$)
              AND MXTIMING.NPID IN ($MXTIMING_NPID_token$)
              AND MXTIMING.MXTIMING_TYPE_DM IN ($MXTIMING_TYPE_TOKEN$)
              AND MXTIMING.Context+Command IN ($MXTIMING_Context_token$)
              AND MXTIMING.Context+Command IN ($MXTIMING_Command_token$)
              AND MXTIMING.Time = *
              GROUPBY MXTIMING.Context+Command MXTIMING.NPID MXTIMING.Time
            | rename MXTIMING.Context+Command AS Context+Command
            | rename MXTIMING.NPID AS NPID
            | join NPID
              [| tstats summariesonly=$summariesonly_token$ count(SERVICE.NPID) AS count2 FROM datamodel=SERVICE WHERE (host=$host_token$ earliest=@w1)
                 AND SERVICE.NICKNAME IN ($NICKNAME_TOKEN$)
                 GROUPBY SERVICE.NICKNAME SERVICE.NPID
               | rename SERVICE.NPID AS NPID
               | rename SERVICE.NICKNAME AS NICKNAME ]
            | lookup MXTIMING_lookup_test Context_Command AS "Context+Command" OUTPUT Tags CC_Description Threshold Alert
            | where average &gt; Threshold OR isnull('Threshold')
            | fillnull Tags
            | eval Tags=if(Tags=0,"NO_TAG",Tags)
            | eval Tags=split(Tags,",")
            | stats count(average) AS count by Tags
            | sort Tags
            | append
              [| inputlookup stars.csv
               | table Column1 Column2
               | rename Column1 AS Tags
               | rename Column2 AS count]
            | sort Tags</query>
          <earliest>$time_token.earliest$</earliest>
          <latest>$time_token.latest$</latest>
        </search>
      </single>
 </panel>
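
One thing I am considering to cut down on the number of parallel searches is sharing a single base search between panels and doing the per-panel filtering as post-process searches. A minimal sketch of the idea (the id "base_mxtiming" and the trimmed-down query are placeholders, not my real dashboard; the tokens are the same ones used above):

    <search id="base_mxtiming">
      <query>| tstats summariesonly=$summariesonly_token$ avg(MXTIMING.Elapsed) AS average FROM datamodel=MXTIMING_V2 WHERE host=$host_token$ GROUPBY MXTIMING.Context+Command MXTIMING.NPID MXTIMING.Time | rename MXTIMING.NPID AS NPID</query>
      <earliest>$time_token.earliest$</earliest>
      <latest>$time_token.latest$</latest>
    </search>

    <panel>
      <single>
        <search base="base_mxtiming">
          <query>| where average &gt; $MAX_TIME$ | stats count</query>
        </search>
      </single>
    </panel>

That way the heavy tstats work would run once per dashboard load instead of ten times, which should at least reduce the amount of work and the number of result sets involved.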

Cheers
Robert Lynch

0 Karma
1 Solution

robertlynch2020
Motivator

To be honest, this issue is not happening anymore after an upgrade to 7.0.3. Perhaps something changed.


0 Karma

woodcock
Esteemed Legend

Splunk searches are only as fast as the slowest indexer. If your Search Head has recently been peered to even just a single Indexer that is performing poorly, it is going to slow down all searches for everyone. The Search Head will get fast results from all the other Indexers but will wait and wait and wait for that one Indexer to send its results back. As far as users being far away from the Search Head, this should not make any difference at all.
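
One quick way to sanity-check for a slow or unresponsive peer (a rough sketch, not an exhaustive health check) is to compare how each indexer answers a trivial search over its own internal logs:

    | tstats count WHERE index=_internal earliest=-15m BY splunk_server

A peer that returns far fewer events than its siblings, or that keeps the search from finalizing, is the likely bottleneck. The Job Inspector on any slow search will also show where the time is being spent.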

0 Karma

robertlynch2020
Motivator

At the moment I have one indexer and one search head.
To be honest, this issue is not happening anymore after an upgrade to 7.0.3. Perhaps something changed.

0 Karma