| eval start_time=if(processing_stage="Obtained data",invocation_timestamp,null()) | eval end_time=if(processing_stage="Successfully obtained genesys response",invocation_timestamp,null()) | stats values(start_time) as start_time values(end_time) as end_time by correlation_id | eval difference=strptime(end_time,"%FT%TZ")-strptime(start_time,"%FT%TZ")
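You can verify the strptime arithmetic in isolation with a small makeresults sketch (the timestamp values here are taken from the sample events below; `difference` is in seconds):

```
| makeresults
| eval start_time="2023-11-01T11:33:41Z", end_time="2023-11-01T11:34:04Z"
| eval difference=strptime(end_time,"%FT%TZ")-strptime(start_time,"%FT%TZ")
```

This should give difference=23 for the pair above, since strptime converts each string to epoch seconds before the subtraction.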
Hello @gabbydm , You can use the following parameters to change the sparkline colors from the editor: sparklineColors and sparklineAreaColors. Note that you should set the cellType to SparklineCell for such a visualization. Reference document - https://docs.splunk.com/Documentation/SplunkCloud/9.0.2305/DashStudio/objOptRef#columnFormat_.28object_type.29   Thanks, Tejas.   --- If the above solution helps you, an upvote is appreciated.
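As a rough sketch of where those parameters sit in the table's source JSON (the column name "trend" and the hex colours are placeholders, and the exact nesting may differ - check the columnFormat section of the reference linked above for the authoritative schema):

```
"options": {
    "columnFormat": {
        "trend": {
            "cellType": "SparklineCell",
            "sparklineColors": "#118832",
            "sparklineAreaColors": "#CBEFCE"
        }
    }
}
```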
Hi @ITWhisperer  "invocation_timestamp": "2023-11-01T11:33:41Z" "processing_stage": "Obtained data">>> Start time "processing_stage": "Successfully obtained incontact response" >>> End time
From these events, where exactly do the timestamps come from?
Hi @ITWhisperer

Correlation ID: 0cd56112-6346-4ea3-8a2f-2b59b9eb68ba
Event start time: 11-01-2023 17:03:41:321
Event end time: 11-01-2023 17:04:04:300
Difference: 22.979

{"message_type": "INFO", "processing_stage": "Obtained data", "message": "Successfully received data from API/SQS", "correlation_id": "0cd56112-6346-4ea3-8a2f-2b59b9eb68ba", "error": "", "invoked_component": "prd-start-step-function-from-lambda-v1", "request_payload": "", "response_details": "{'executionArn': 'arn:aws:states:eu-central-1:981503094308:execution:contact-centre-dialer-service:8a1acb14-b170-4f95-99bc-7a89ff814207', 'startDate': datetime.datetime(2023, 11, 1, 11, 33, 41, 354000, tzinfo=tzlocal()), 'ResponseMetadata': {'RequestId': '60427a29-6dd4-4cdf-b5c0-fc6cb45b08b2', 'HTTPStatusCode': 200, 'HTTPHeaders': {'x-amzn-requestid': '60427a29-6dd4-4cdf-b5c0-fc6cb45b08b2', 'date': 'Wed, 01 Nov 2023 11:33:41 GMT', 'content-type': 'application/x-amz-json-1.0', 'content-length': '165', 'connection': 'keep-alive'}, 'RetryAttempts': 0}}", "invocation_timestamp": "2023-11-01T11:33:41Z", "response_timestamp": "2023-11-01T11:33:41Z", "custom_attributes": {"entity-internal-id": "", "root-entity-id": "", "student_id": "64690945", "lead-id": "37079165", "country": "Nepal"}}

{"message_type": "INFO", "processing_stage": "Successfully obtained genesys response", "message": "Successfully obtained genesys response", "correlation_id": "0cd56112-6346-4ea3-8a2f-2b59b9eb68ba", "error": "", "invoker_agent": "arn:aws:sqs:eu-central-1:981503094308:prd-ccm-genesys-ingestor-queue-v1", "invoked_component": "prd-ccm-genesys-ingestor-v1", "request_payload": "", "response_details": "", "invocation_timestamp": "2023-11-01T11:34:04Z", "response_timestamp": "2023-11-01T11:34:04Z", "original_source_app": "YMKT", "target_idp_application": "", "retry_attempt": "1", "custom_attributes": {"entity-internal-id": "", "root-entity-id": "", "campaign-id": "4e749ade-ac9c-45e0-94fe-9ae21e1398d8", "campaign-name": "", "marketing-area": "IDP_NPL", "lead-id": "37079165", "record_count": "", "country": "Nepal"}}
| eval stime=strftime(strptime(stime,"%FT%TZ"),"%F %T") | eval etime=strftime(strptime(etime,"%FT%TZ"),"%F %T") | eval orgstime=strftime(strptime(orgstime,"%FT%TZ"),"%F %T") | eval orgetime=strftime(strptime(orgetime,"%FT%TZ"),"%F %T")
@phanTom  you are a genius!  thank you very much
I had to look through the search job logs where I noticed there were some errors regarding a lookup that didn't exist in that SH but was being used by the SH running the DM acceleration. I added said lookup and fields to all SHs where I was sharing DMA summaries and the error went away. I'd start by reviewing search job logs and then going over your affected DM(s) to see if there are any lookups being used to populate any fields.
These are the final stats results I am getting now. The query you shared reformats one specific time field, but I would like to reformat the timestamp in all of the columns mentioned below.
Hi @Mafokognel, Thanks for your answer. I know this, but my question is: after LDAP integration, I see groups containing users, but I don't see groups without users. Do you think that's normal, or could there be an issue? Ciao. Giuseppe
| makeresults | eval time="2023-11-01T15:54:00Z" | eval reformatted=strftime(strptime(time,"%FT%TZ"),"%F %T")
I am trying to remove the T and Z from the output timestamp results, e.g. 2023-11-01T15:54:00Z. Can you please help me with a query that replaces the T with a space and drops the Z?
@PickleRick  It works great, thanks. I have another key-value pair called T[001], meaning "Type", on each line. I need to add it to the last line so it shows in the result. I tried: 1. adding it to the last stats, but it returns nothing for T (because it was removed in the first stats); 2. adding it after the "by" clause in the first stats, which didn't work; 3. using eventstats, but it counts all lines that contain the module, while it should return 1. <your_search> | stats list(module) as modules by transactionID | eval modules=mvjoin(modules," ") | stats count by modules Any ideas?
Hello, To my knowledge, you have to create a role, then assign the relevant permissions to that role. After that you can map the group and authenticate again. Then go to Users and check that the username is assigned to the group. Thanks
You could try using a token to define the series colours. You could set the token in the done handler of the search used for the pie chart such that the right colours are used based on the values present in the results.
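A minimal Simple XML sketch of that pattern (the base search, field names, colour values, and the condition in the done handler are all placeholders for illustration):

```
<chart>
  <search>
    <query>index=_internal | stats count by log_level</query>
    <done>
      <!-- Assumed logic: choose a colour list based on how many series came back -->
      <eval token="series_colors">if($job.resultCount$=2, "[0x53A051,0xDC4E41]", "[0x53A051,0xF8BE34,0xDC4E41]")</eval>
    </done>
  </search>
  <option name="charting.chart">pie</option>
  <option name="charting.seriesColors">$series_colors$</option>
</chart>
```

The done handler fires after the search completes, so the token is set before the chart renders with the matching charting.seriesColors list.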
Same here. Two environments upgraded from 9.05 to 9.1.0.1, 9.1.0.2, and 9.1.1, and both had the same issue. Has anyone found a solution yet?
The typical approach to such a case is to use streamstats to find the last occurrence of a state different from the current one (using reset_on_change=t or reset_before/reset_after). That's probably your only reasonable approach, since you need to "carry over" information from some events into other ones, and streamstats (along with autoregress) is the command to do so.
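A rough sketch of the carry-over pattern (the field names host and state are assumptions, and reset_on_change=t requires the events to be sorted on the by-clause fields first):

```
<your_search>
| sort 0 host _time
| streamstats reset_on_change=t count as events_in_state by host state
| streamstats current=f window=1 last(state) as previous_state last(_time) as last_change_time by host
```

Here the first streamstats restarts its count each time state changes for a host, while the second carries the previous event's state and timestamp forward into the current event for comparison.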
Hi @Roy_9 , good for you, see next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated
Can you provide details on how you did this please? I'm having the same issue, but I'm unsure of what your solution was.
@isoutamo @inventsekar  We'll be upgrading soon but until then I'm stuck.