Dashboards & Visualizations

Why is our custom dashboard reporting search errors, but only in the dashboard view in our search head cluster?

cumbers
Explorer

We are experiencing some weird issues with a custom-developed dashboard application, and after a couple of days of trying to debug the issue, I feel it is time to reach out to the community. If anyone can help, even with extra debug steps, that would be great!

Background
I developed a dashboard application with two views on a standalone search head for testing, and then deployed the custom app to the search head cluster in the standard way.

Issue
For certain users (we cannot yet determine what they have in common) the dashboard application does not work. We get a red triangle with the following text:

Search process did not exit cleanly, exit_code=-1, description="exited with code -1". Please look in search.log for this peer in the Job Inspector for more info.

Clicking the spyglass and using the integrated search within the app initially shows the errors (and lets us open the Job Inspector); however, if we retry the search with the same search parameters and time range, we get results as expected.
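For anyone trying to reproduce this, a search along these lines should pull out the search IDs the dashboard dispatched for a given user, so the right search.log can be found via the Job Inspector (a sketch only; <affected_user> is a placeholder, and it assumes the _audit index is searchable):

index=_audit action=search user=<affected_user>
| table _time user search_id info search
| sort - _time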

Within the failed search, the search.log has some differences from the successful search: we see WARN messages from AuthorizationManager about an unknown role ('') that do not appear in the successful search. I have confirmed that our authentication.conf and authorize.conf are the same on all members of the search head cluster, and that there are no etc/system/local versions of these files either.
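Since the warning complains about an unknown (empty) role, one thing worth checking is whether any user is assigned a role that is not actually defined on a given member. Roughly this, run on each search head member in turn, should surface that (a sketch built on the standard authentication and authorization REST endpoints):

| rest /services/authentication/users splunk_server=local
| table title roles
| mvexpand roles
| search NOT [| rest /services/authorization/roles splunk_server=local | fields title | rename title AS roles]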

Lastly, the permissions on this application are pretty wide, as defined in the default.meta for the app:

[]
access = read : [ * ], write : [ admin, power ]
export = system
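How those permissions actually resolve on the dashboard objects can be checked on a member with something like the following (a sketch; <your_app> is a placeholder for the app's directory name):

| rest /servicesNS/-/-/data/ui/views splunk_server=local
| search eai:acl.app=<your_app>
| table title eai:acl.sharing eai:acl.owner eai:acl.perms.read eai:acl.perms.write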

Environment
- Splunk 6.4.1 on Windows Server 2012
- A working search head cluster that connects to 2 clustered indexers
- LDAP for authentication, with authorize.conf and authentication.conf deployed as part of the search head cluster bundle (a quick per-member check is sketched below)
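The per-member check is roughly this, run on each member (a sketch using the configs REST endpoint; the columns shown are illustrative rather than exhaustive):

| rest /services/configs/conf-authorize splunk_server=local
| table title importRoles srchIndexesAllowed srchIndexesDefault
| sort title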

I am really at a loss as to what to do or look at next; any help is very much appreciated.

Update: We created two users with the same roles - one works, the other does not
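Comparing the two accounts side by side should be possible with roughly this (a sketch; userA and userB are placeholders for the working and failing accounts):

| rest /services/authentication/users splunk_server=local
| search title="userA" OR title="userB"
| table title roles defaultApp realname type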


maciep
Champion

So you have one app with two dashboards? If so, do both dashboards have the same problem? Anything consistent? Is it always the same users that don't work? Is it the same data on the same peer? Does the problem persist on each search head in the cluster?

If you were to create a dashboard from Splunk Web in your app and just copy the source over, does the issue surface for it as well?

No real ideas here, just some troubleshooting steps to try to narrow down a pattern.
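One more thing that might help with the same-users/same-peer questions: list the failed jobs still sitting in the dispatch directory on each member, roughly like this (a sketch; it only sees jobs whose artifacts have not yet expired):

| rest /services/search/jobs splunk_server=local
| search isFailed=1
| table sid eai:acl.owner eai:acl.app dispatchState isFailed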


cumbers
Explorer

Hi Maciep,

Thanks for the response, it's good to have some external input to help solve this issue!

Yes, we have one app with two dashboards. Both dashboards exhibit the same behaviour and the same errors; if a user has a problem with one dashboard, they have a problem with both. The problem is consistent across all search head members.

If I copy the app dashboard XML and create a new dashboard in the Search & Reporting app, everything is fine: all users can view the data without a problem.
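The app-level read permission can also be compared against what the dashboards inherit with roughly this (again a sketch; <your_app> is a placeholder for the app directory name):

| rest /services/apps/local splunk_server=local
| search title=<your_app>
| table title label disabled visible eai:acl.sharing eai:acl.perms.read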

What is weird is the following behaviour, which has me thinking it is some permissions issue (somewhere):

  1. User sees error within App on dashboard
  2. User clicks on the spyglass, this opens a search view within the app. From here the user can view the job inspector - the errors are real (but odd)
  3. User simply refreshes the search; no error is received and the results are returned

I am wondering if there is a bug that needs to be looked at, but I can't figure out what it might be!
