I have a simple table in a dashboard built like this:
<table>
<title>Test</title>
<searchString>source="wineventlog:security" EventCode="528" Logon_Type="10" startdaysago=14 | table _time User_Name Source_Network_Address Workstation_Name | rename _time as Time User_Name as "User Name" Source_Network_Address as "Source IP Address" Workstation_Name as "RDP Server" | convert ctime(Time)</searchString>
<option name="count">25</option>
<option name="displayRowNumbers">true</option>
<option name="showPager">true</option>
</table>
Now, the table renders without any problems; however, when I click on any element in the table I get the following error:
Encountered an unexpected error while parsing intentions. PARSER: Applying intentions failed Unable to drilldown because of post-reporting 'convert' command
Any idea why this is happening? Even if I remove the convert command and cut down the search, I still get the same error.
The drilldown expects a timechart, chart, or stats command to generate the data. Without one, it is unable to determine the original data to drill back to. This is probably a bug or an enhancement request for Splunk, to get the UI to accept table as a reporting command.

You can solve this either by creating a custom drilldown search (which requires going to Advanced XML), or by rewriting your table command using stats:
... | stats count by _time User_Name Source_Network_Address Workstation_Name
| fields - count
| ...
This will eliminate duplicate rows. If you have duplicates and want to keep them, you could do:
... | stats count by _time User_Name Source_Network_Address Workstation_Name _serial
| fields - count _serial
| ...
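Putting this together, the panel from the question could be rewritten roughly as follows. This is only a sketch: the field names and options are taken from the original XML, and the trailing convert may still need to be removed if the "post-reporting 'convert' command" error persists on drilldown.
<table>
<title>Test</title>
<searchString>source="wineventlog:security" EventCode="528" Logon_Type="10" startdaysago=14 | stats count by _time User_Name Source_Network_Address Workstation_Name | fields - count | rename _time as Time User_Name as "User Name" Source_Network_Address as "Source IP Address" Workstation_Name as "RDP Server" | convert ctime(Time)</searchString>
<option name="count">25</option>
<option name="displayRowNumbers">true</option>
<option name="showPager">true</option>
</table>
The only change is replacing the table command with stats count by ... | fields - count, so the drilldown parser sees a recognized reporting command while the rendered columns stay the same.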