Why does my curl command for CSV output of the top 20 error messages using regex return the error "-bash: syntax error near unexpected token `('"?

shantu
Explorer

I'm trying to use the REST API to export an aggregation of the top 20 error messages in my log4j-formatted logs. I want to do this with a search that uses a regex. Here is my curl command:

curl --get -s -u admin:pwd -k https://localhost:8088/servicesNS/admin/search/search/jobs/export -d output_mode=csv --data-urlencode search="search index=* sourcetype="log4j" | rex field=_raw ".*ERROR\\s+(?<ErrorMessage>.*)\\n" | top limit=20 ErrorMessage" -o aggregatedErrors.csv

This returns the error: -bash: syntax error near unexpected token `('

The search itself works fine in the Splunk search app, but curl seems to have an issue with the search string. Any idea why? Do I need to escape characters in the regex to use it with curl? The reason I'm not just using a pre-saved field extraction is that the extraction looks fine in the field extractor but captures the entire stack trace, so when I aggregate the errors I end up with 100+ unique values instead of 10 or 12. The regex search piped into "top limit=20..." works best.

Please help!

1 Solution

Ayn
Legend

The reason your command doesn't work is indeed that you're not escaping your query string, so it gets interpreted by the shell. The solution is simple: enclose your search string in single quotation marks.
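
Here's a cut-down illustration (a sketch, not your full search) of what the shell is doing with the double-quoted version:

# The double quotes inside the value toggle quoting off and on, so the ( of the
# rex capture group ends up unquoted and bash stops with:
#   -bash: syntax error near unexpected token `('
echo "search sourcetype="log4j" | rex ".*(?<ErrorMessage>.*)""

# Inside single quotes every character is literal, so the parentheses, pipes
# and inner double quotes never reach the shell parser:
echo 'search sourcetype="log4j" | rex ".*(?<ErrorMessage>.*)"'

Wrapped in single quotes, the full command becomes: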

curl --get -s -u admin:pwd -k https://localhost:8088/servicesNS/admin/search/search/jobs/export -d output_mode=csv --data-urlencode search='search index=* sourcetype="log4j" | rex field=_raw ".*ERROR\\s+(?<ErrorMessage>.*)\\n" | top limit=20 ErrorMessage' -o aggregatedErrors.csv
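
As a sketch of an alternative (the file name top_errors.spl is just an example), curl's --data-urlencode also accepts a name@filename form that url-encodes a file's contents, which sidesteps shell quoting entirely:

# Keep the SPL in a file, backslashes exactly as in the command above.
cat > top_errors.spl <<'EOF'
search index=* sourcetype="log4j" | rex field=_raw ".*ERROR\\s+(?<ErrorMessage>.*)\\n" | top limit=20 ErrorMessage
EOF

# curl reads the file, url-encodes its contents, and sends it as the "search" parameter.
curl --get -s -u admin:pwd -k https://localhost:8088/servicesNS/admin/search/search/jobs/export -d output_mode=csv --data-urlencode search@top_errors.spl -o aggregatedErrors.csv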

shantu
Explorer

Awesome, that worked perfectly! Thanks!
