I am trying to export data from Splunk. It is important that this data is not changed or manipulated by the export or by Splunk itself.
The characters " and \ are a problem here.
Example:
| makeresults | eval SkriptBlockText="\"s0m3\\Code\""
| table _time SkriptBlockText
The value is "s0m3\Code". When exporting, the characters are escaped to \"s0m3\\Code\". This affects the result of the subsequent analysis from a security point of view.
My exportscript.ps1:
`
# Same search as above: \\ in eval yields the literal backslash in s0m3\Code
$search = '| makeresults | eval SkriptBlockText="\"s0m3\\Code\"" | table _time SkriptBlockText'
$url = 'https://YOUR_URL:8089/services/search/jobs/export'
$credential = Get-Credential
$outfile = '.\output-' + $(Get-Date -f yyyy-MM-dd-hh-mm) + '.json'

# Accept the self-signed certificate on the management port and use TLS
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls

$Body = @{
    search      = $search
    output_mode = 'json'
    exec_mode   = 'oneshot'
    count       = '0'
}

Invoke-RestMethod -Uri $url -Credential $credential -Method Post -OutFile $outfile -Body $Body
`
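For reference, the added backslashes in the export file are ordinary JSON string escaping. A minimal sketch of reading the export back, assuming the endpoint writes one JSON object per result line with the fields under a "result" element (this may differ by Splunk version), shows what the value decodes to:
`
# Read the export produced by the script above and print the decoded field value.
Get-Content $outfile |
    Where-Object { $_ -match '\S' } |
    ForEach-Object {
        $obj = $_ | ConvertFrom-Json
        if ($obj.result) { $obj.result.SkriptBlockText }
    }
# Prints: s0m3\Code - the extra backslashes exist only inside the JSON file.
`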
Perhaps this method is not the right one. Does anyone have an idea or a solution to the problem?
It may be easier to run the Splunk search from the search head inside of your PowerShell script. Did you know that you can run a Splunk search from your PowerShell script? Try this from cmd.exe and then migrate it into your script:
"C:\Program Files\Splunk\bin\splunk.exe" search "Your Splunk search here"
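A rough sketch of what that could look like once it is moved into the script; the install path, the example query, and the output file name are assumptions, and splunk.exe may prompt for credentials:
`
# Run the search through the Splunk CLI and keep the raw output.
# Path, query and output file are placeholders - adjust to your environment.
$splunk = 'C:\Program Files\Splunk\bin\splunk.exe'
$query  = 'search index=main | table _time SkriptBlockText'
& $splunk search $query -maxout 0 | Out-File '.\output.txt' -Encoding utf8
`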
Check out the JSON Tools app: https://github.com/doksu/TA-jsontools/wiki
Multi-value field preservation when exporting to csv
Fields with multiple values can be easily preserved when exporting to csv. For example, we can convert a single field to JSON:
... | mkjson outputfield=src src | outputlookup mylookup
This can then be reconstituted with spath:
... | lookup mylookup ... OUTPUT src | spath input=src
This is useful for migrating KV Store Collection records from one host to another (or to an SHC), but be mindful of the fact that spath removes any preceding underscore from field names. For example, to export Enterprise Security's Notable Event lookup to a CSV containing JSON, the following could be used:
| inputlookup es_notable_events
| mkjson includehidden=true outputfield=json
| table json
| outputlookup migration_es_notable_events
If the flat-file CSV lookup is then migrated, it can be used to re-populate the contents of the KV Store without loss of fidelity (multi-value fields, etc.), but the fields beginning with an underscore must be renamed:
| inputlookup migration_es_notable_events
| spath input=json
| rename key as _key
| rename time as _time
| outputlookup es_notable_events
That's how I assemble JSON at search time....
| foreach *
[ eval jsonmv_ = mvappend(jsonmv_,"\"<<MATCHSTR>>\":\"" + <<FIELD>> + "\"")]
| eval json_raw = "{" + mvjoin(jsonmv_,",") + "}"
| fields - jsonmv_
That's not the problem. The problem is that the characters " and \ are escaped when exporting the data.
Sorry if I have expressed myself in a somewhat unclear or complicated way here; my English is not so good...
I need a method to export the data without the backslashes being added. Only the data, raw and unchanged.
The add-on is more practical for other things, but in this case it does exactly what I don't want it to do: it escapes the data (adds backslashes) already at search time.
No, that is not how you assemble the JSON. You are doing it manually and incorrectly. Doing it my way will create correct, coherent, and properly-escaped/encapsulated JSON which will be easy to export/import. Just try it.
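As a quick illustration of the round trip, here is a minimal PowerShell check using the value from the original example; the escaping only exists in the serialized JSON and is undone on import:
`
# Round-trip check: JSON escaping on export is reversed on import.
$original = 's0m3\Code'                                          # raw value from the question
$json     = @{ SkriptBlockText = $original } | ConvertTo-Json -Compress
# $json now contains {"SkriptBlockText":"s0m3\\Code"} - escaped on disk/wire only
$decoded  = ($json | ConvertFrom-Json).SkriptBlockText
$decoded -ceq $original                                          # True: nothing was lost
`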
properly-escaped/encapsulated JSON
Thank you for your willingness to help me, but... I need the data without it being "properly-escaped/encapsulated". If the field value is "s0m3\Code", I need exactly this in my export.
I tested the add-on; it does exactly what I don't need.
The exported data is then meant to be analyzed for "PowerShell ScriptBlockLogging" obfuscation, and any added character will falsify the results.