Dashboards & Visualizations

Why is there no data being displayed in Trusted Advisor Dashboards?

shwetas
Explorer

Hi Team,

We have installed the AWS Trusted Advisor app in Splunk to surface its optimization capability through Splunk.

We configured the input and there are no errors in splunkd.log, but the dashboard is not displaying any results. Has anyone else faced the same problem?

0 Karma

ptursun
Loves-to-Learn Lots

As @platformoperati stated in the comment below, there must be something wrong in the code. getchecks did not return any results.

I replaced getchecks.py with @platformoperati's code, ran it on the server, and it generated results.

Now my issue is how to display those results in the dashboard.

0 Karma

livehybrid
Builder

@ptursun I'm very sorry, I didn't get a notification of this.

I'm picking this up again as I'm very keen to get to the bottom of it.

@maxr7866 If you're still up for a screenshare then that would be great. Ping me an email to splunk at livehybrid . com and we'll get something set up.

Thanks

Will

0 Karma

maxr7866
Observer

Hi Will,

I figured this out by modifying the getchecks.py file.

0 Karma

livehybrid
Builder

@shwetas Did you get to the bottom of this? Please feel free to reach out if you still have any issues.
Thanks
Will

0 Karma

ptursun
Loves-to-Learn Lots

Are you available to help with this issue?

0 Karma

maxr7866
Observer

@livehybrid,

Hi Will,

I have the same issue as the OP: the dashboards are not populating. I have installed the latest version of the app and can confirm that I receive Trusted Advisor data. When running the | getchecks command, I get the error below. Any idea?

External search command 'getchecks' returned error code 1.

splunkd log shows the following: 

<Expiration>2020-06-26T07:55:11Z</Expiration> </Credentials> </AssumeRoleResult> <ResponseMetadata> <RequestId>3618e0ca-f465-4988-9ca7-ff03d7e74294</RequestId>

0 Karma

livehybrid
Builder

Hi @maxr7866 

Sorry about the delay getting back to you on this.

If you inspect the search job and click the search.log link, does it give any indication of errors?
Does your user have the list_storage_passwords capability?

If you're available for a quick screen share to walk through this then please let me know.

Thanks

Will

0 Karma

maxr7866
Observer

Hi @livehybrid ,

Sorry for the late reply. Please let me know if you have time today to jump on a quick call.

0 Karma

brettcave
Builder

Are you running in a distributed environment (e.g. collecting Trusted Advisor results on a heavy forwarder or something else that isn't the search head)? And does the IAM identity being used have the right permissions?

The "getchecks" command makes an AWS API call - support:DescribeTrustedAdvisorChecks (in us-east-1) - and then populates a lookup. The lookup needs to be populated on the search layer (not the collection layer). If you are collecting data on a Splunk instance that isn't the search head, and the search head isn't authorized to invoke the AWS action, then the lookup won't populate. To resolve this, look at authorizing the search layer to make this call.
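If permissions are the blocker, an IAM policy along these lines, attached to the role or user the search head uses, would allow the call. This is a minimal sketch; the action name follows the AWS Support API's IAM naming, so verify it against your own setup:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "support:DescribeTrustedAdvisorChecks",
      "Resource": "*"
    }
  ]
}
```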

You can check whether getchecks is actually working by running the search | getchecks - if you get results, then auth is set up properly, and the fix may be as simple as making sure the search that populates the lookup is enabled and scheduled.

One other thing to check is that the search that runs getchecks and populates the lookup is actually scheduled: on your search layer, go to Settings --> Searches, reports, and alerts. Find the search called "Trusted Advisor Checks Lookup Populator" in the Trusted Advisor app, then click Edit --> Edit Schedule. In our deployment the "Schedule Report" option was disabled, so ensure that it is enabled. The default schedule runs every week on Monday at 00:00 over a time range of the last minute.
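If you don't want to wait for the schedule, a populating search along these lines can be run by hand on the search head to fill the lookup immediately (the exact SPL in the app's saved search may differ slightly; the outputlookup target is the lookup file name mentioned elsewhere in this thread):

```
| getchecks | outputlookup trusted_advisor_checks.csv
```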

0 Karma

platformoperati
New Member

I ran into this same issue: the custom search command "getchecks" doesn't return a result, so the lookup trusted_advisor_checks.csv remains empty. I didn't get it to work from our Splunk Cloud instance, but I queried the AWS API myself and imported the generated lookup file using the Lookup Editor app. The code below is extracted from the custom search command and prints the id, name, and category in CSV fashion. I left the "description" field out because it contains commas, so you'll need to edit some searches in the dashboards of the AWS Trusted Advisor Aggregator app. Hope this helps!

Cheers,

Christiaan

import boto3
from botocore.exceptions import EndpointConnectionError
from botocore.exceptions import ClientError


def get_checks(results):
    # Build one row per check and print it as a CSV line (id,name,category).
    events = []
    for check in results:
        row = {
            'id': check['id'],
            'name': check['name'],
            'category': check['category'],
            'description': check['description'],
        }
        events.append(row)
        print(row['id'] + "," + row['name'] + "," + row['category'])
    return events


if __name__ == "__main__":
    # The AWS Support API is only available in us-east-1.
    region = 'us-east-1'
    try:
        client = boto3.client(
            'support',
            region_name=region
        )
        checks = client.describe_trusted_advisor_checks(language='en')['checks']
        splunk_results = get_checks(checks)
    except EndpointConnectionError as e:
        print('{}'.format(e))
    except ClientError as e:
        print('{}'.format(e))
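As a side note on the dropped "description" field: Python's standard csv module quotes fields that contain commas, so a helper along these lines (a hypothetical sketch, not part of the app) could keep the description column intact instead of editing the dashboard searches:

```python
import csv
import io


def checks_to_csv(checks):
    """Serialize Trusted Advisor check dicts to CSV text, quoting any
    field that contains commas (such as 'description')."""
    buf = io.StringIO()
    writer = csv.writer(buf)  # QUOTE_MINIMAL: quotes only when needed
    writer.writerow(["id", "name", "category", "description"])
    for check in checks:
        writer.writerow([check["id"], check["name"],
                         check["category"], check["description"]])
    return buf.getvalue()
```

The resulting text can be saved as trusted_advisor_checks.csv and imported with the Lookup Editor app just like the comma-free version.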
0 Karma

ptursun
Loves-to-Learn Lots

I tried your code and was able to generate the results on the server, but I'm having trouble displaying them in the dashboard.

Is there anything else that needs to be done?

I tried to reach out to William, but have had no luck so far.

0 Karma