Splunk Search

Rename a field with a numeric date (e.g. 2024-06-10) to "today"

CMEOGNAD
Engager

Hi Community,

Currently I have a cron job that fetches values for today and tomorrow every day.
How can I extract the value for "today" or for "tomorrow"?

This SPL doesn't work; it does not rename my field to give me a fixed field name...

| eval today=strftime(_time,"%Y-%m-%d")
| rename "result."+'today' AS "result_today"
| stats list(result_today)



Here is my raw data (screenshot attachment):
CMEOGNAD_0-1718013750277.png

 


CMEOGNAD
Engager

The solution from yuanliu works, but not for the full JSON file from https://forecast.solar/.
The best approach so far was the regex field extractor, but...

...the next step, building timecharts from this format, won't work with regex:

{
    "result": {
        "watts": {
            "2019-06-22 05:15:00": 17,
            "2019-06-22 05:30:00": 22,
            "2019-06-22 05:45:00": 27,
            ...
            "2019-06-29 20:15:00": 14,
            "2019-06-29 20:30:00": 11,
            "2019-06-29 20:45:00": 7
        },
        "watt_hours": {
            "2019-06-22 05:15:00": 0,
            "2019-06-22 05:30:00": 6,
            "2019-06-22 05:45:00": 12,
            ...
            "2019-06-29 20:15:00": 2545,
            "2019-06-29 20:30:00": 2548,
            "2019-06-29 20:45:00": 2550
        },
        "watt_hours_day": {
            "2019-06-22": 2626,
            "2019-06-23": 2918,
            "2019-06-24": 2526,
            "2019-06-25": 2866,
            "2019-06-26": 2892,
            "2019-06-27": 1900,
            "2019-06-28": 2199,
            "2019-06-29": 2550
        }
    },
    "message": {
        "type": "success",
        "code": 0,
        "text": ""
    }
}
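For the timechart step, regex is again not the right tool. As a rough sketch (untested, and assuming the foreach wildcard technique discussed in this thread works on your Splunk version), the watts object can be expanded into one row per timestamp and then charted. The field names series, w_time, and watts below are hypothetical helpers, not anything from the forecast.solar API:

| spath result.watts
``` collect each "timestamp=value" pair from the watts object into a multivalue field ```
| foreach result.watts.* [ eval series = mvappend(series, "<<MATCHSTR>>=" . '<<FIELD>>') ]
``` fan the pairs out into separate rows, then split timestamp and value apart ```
| mvexpand series
| rex field=series "^(?<w_time>[^=]+)=(?<watts>\d+)$"
| eval _time = strptime(w_time, "%Y-%m-%d %H:%M:%S")
| timechart span=15m max(watts) AS watts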

 


yuanliu
SplunkTrust

As we always say in this forum, an illustration of the raw input (in text format) is critical for a question to be answerable.  Thank you for finally providing the data.  My previous answer was based on KendallW's emulation.  This latest illustration is not only different from that emulation, but also different from your initial screenshot.  One fundamental difference is that this data includes multiple days, potentially in the future.  It seems that the input is a prediction of sorts.

This said, I also realized that the JSON keys themselves can be used to simplify the solution if you are using Splunk 8.1 or later.  Again, regex is NOT the correct tool for structured data.

Here is the code you can try:

 

| eval today = strftime(now(), "%F"), tomorrow = strftime(relative_time(now(), "+1d"), "%F")
| eval today = json_extract(_raw, "result.watt_hours_day." . today)
| eval tomorrow = json_extract(_raw, "result.watt_hours_day." . tomorrow)

 

Here is an emulation for you to play with and compare with real data.  Because your illustrated data is well in the past, I arbitrarily picked 2019-06-26 as the search time and established a "fake_now" field instead of using the now() function. (As a result, "tomorrow" corresponds to 2019-06-27.)

 

| makeresults
| eval _raw="{
    \"result\": {
        \"watts\": {
            \"2019-06-22 05:15:00\": 17,
            \"2019-06-22 05:30:00\": 22,
            \"2019-06-22 05:45:00\": 27,
            \"2019-06-29 20:15:00\": 14,
            \"2019-06-29 20:30:00\": 11,
            \"2019-06-29 20:45:00\": 7
        },
        \"watt_hours\": {
            \"2019-06-22 05:15:00\": 0,
            \"2019-06-22 05:30:00\": 6,
            \"2019-06-22 05:45:00\": 12,
            \"2019-06-29 20:15:00\": 2545,
            \"2019-06-29 20:30:00\": 2548,
            \"2019-06-29 20:45:00\": 2550
        },
        \"watt_hours_day\": {
            \"2019-06-22\": 2626,
            \"2019-06-23\": 2918,
            \"2019-06-24\": 2526,
            \"2019-06-25\": 2866,
            \"2019-06-26\": 2892,
            \"2019-06-27\": 1900,
            \"2019-06-28\": 2199,
            \"2019-06-29\": 2550
        }
    },
    \"message\": {
        \"type\": \"success\",
        \"code\": 0,
        \"text\": \"\"
    }
}"
| spath
| eval fake_now = strptime("2019-06-26 18:15:06", "%F %T")
| eval today = strftime(fake_now, "%F"), tomorrow = strftime(relative_time(fake_now, "+1d"), "%F")
| eval today = json_extract(_raw, "result.watt_hours_day." . today)
| eval tomorrow = json_extract(_raw, "result.watt_hours_day." . tomorrow)
| fields result.watt_hours_day.2019-06-26 result.watt_hours_day.2019-06-27 today tomorrow

 

Output is:

today | tomorrow | result.watt_hours_day.2019-06-26 | result.watt_hours_day.2019-06-27 | _raw
2892 | 1900 | 2892 | 1900 | { "result": { "watts": { "2019-06-22 05:15:00": 17, "2019-06-22 05:30:00": 22, "2019-06-22 05:45:00": 27, "2019-06-29 20:15:00": 14, "2019-06-29 20:30:00": 11, "2019-06-29 20:45:00": 7 }, "watt_hours": { "2019-06-22 05:15:00": 0, "2019-06-22 05:30:00": 6, "2019-06-22 05:45:00": 12, "2019-06-29 20:15:00": 2545, "2019-06-29 20:30:00": 2548, "2019-06-29 20:45:00": 2550 }, "watt_hours_day": { "2019-06-22": 2626, "2019-06-23": 2918, "2019-06-24": 2526, "2019-06-25": 2866, "2019-06-26": 2892, "2019-06-27": 1900, "2019-06-28": 2199, "2019-06-29": 2550 } }, "message": { "type": "success", "code": 0, "text": "" } }

bowesmana
SplunkTrust

You won't really be able to rename the fields unless you transpose the data, which is probably not the right approach for your use case.

Here are a couple of other examples to give you ways to manipulate the data.

This one gets the result object and sorts the dates to make sure they are in date order:

| eval dates=mvsort(json_array_to_mv(json_extract(_raw, "result")))
| eval result_today=replace(mvindex(dates, 0), "[^:]*:\s*(\d+)\}", "\1")
| eval result_tomorrow=replace(mvindex(dates, 1), "[^:]*:\s*(\d+)\}", "\1")
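For completeness, here is a rough sketch (untested) of the transpose route mentioned above, run against the full forecast.solar JSON shown earlier and assuming a single event. The field names date, value, and which are hypothetical helpers; note that transpose puts the values of the first event into a column literally named "row 1":

| spath
| transpose 0 column_name=field
``` keep only the daily totals, then turn each field name back into a date ```
| where like(field, "result.watt_hours_day.%")
| eval date = replace(field, "^result\.watt_hours_day\.", ""), value = 'row 1'
| eval which = case(date == strftime(now(), "%F"), "today", date == strftime(relative_time(now(), "+1d"), "%F"), "tomorrow")
| where isnotnull(which)

This turns field names into row values so the dates can be compared directly, at the cost of losing the original event shape.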

This one extracts the fields and then uses the wildcarding technique with foreach to make the field assignments.

| spath
| foreach result{}.* [ eval result_today=if("<<MATCHSTR>>"=strftime(_time, "%F"), '<<FIELD>>', result_today),
                            result_tomorrow=if("<<MATCHSTR>>"=strftime(_time, "%F"), result_tomorrow, '<<FIELD>>')]

 


yuanliu
SplunkTrust

The iterator command foreach is your friend. (I do not recommend treating structured data as text.)  One strategy is to iterate over the field names:

 

| foreach result.*
    [eval today = mvappend(today, if(strftime(now(), "%F") == "<<MATCHSTR>>", '<<FIELD>>', null())),
    tomorrow = mvappend(tomorrow, if(strftime(now(), "%F") != "<<MATCHSTR>>", '<<FIELD>>', null()))]

 

BTW, please illustrate raw JSON as text; do not use screenshots and do not use Splunk's contracted format.  Using @KendallW's makeresults emulation, you get:

_time | result.2024-06-10 | result.2024-06-11 | today | tomorrow
2024-06-10 23:59:39 | 1338 | 1715 | 1338 | 1715

KendallW
Contributor

Hi @CMEOGNAD 

Assuming that today's date will always be the first element of result and tomorrow the second, you can do this:

```adding sample data```
| makeresults 
| eval _raw="{
     \"result\" : [
         {\"2024-06-10\" : 1338},
         {\"2024-06-11\" : 1715}
      ]
}"

```using spath to extract values```
| spath output=today path=result{0}.
| rex field=today "\{\"(?<todayDate>[^\"]+)\"\s\:\s(?<todayResult>\d+)"
| spath output=tomorrow path=result{1}.
| rex field=tomorrow "\{\"(?<tomorrowDate>[^\"]+)\"\s\:\s(?<tomorrowResult>\d+)"

This gets you the todayResult and tomorrowResult values extracted with regex.

Ideally, you could extract the values directly with spath, but it seems it's not possible to use a variable for the path in spath, e.g. 

| eval today=tostring(strftime(_time,"%Y-%m-%d"))
| spath output=today path=result{0}.
| spath input=today output=today path='today'

Hope this helps.
