All Posts


You can do inline extraction with rex, e.g. | rex "lda\((?<to>[^\)]*)\)", which will extract a new field called to from the portion between the parentheses. You can also set this up as a field extraction: see Fields -> Field Extractions and create a new field extraction there using the regex above. Then, if lda(xxx) exists in your data, you will get a field called to.
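As a hedged aside, the rex pattern above can be sanity-checked outside Splunk; the sketch below uses Python's re module on a made-up Dovecot-style log line. Note that Python requires the (?P<name>...) named-group syntax, while Splunk's rex also accepts (?<name>...).

```python
import re

# Same capture logic as | rex "lda\((?<to>[^\)]*)\)", in Python syntax.
pattern = re.compile(r"lda\((?P<to>[^)]*)\)")

# Hypothetical sample line; only the lda(...) portion matters here.
sample = "Jul 13 10:20:30 mail dovecot: lda(alice@example.com): sieve: stored mail"
match = pattern.search(sample)
print(match.group("to"))  # alice@example.com
```

If this prints the expected address, the same regex should yield a `to` field in Splunk.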
I am also facing same issue, have you got any solution for this?
Hi @atr, check also the other hardware requirements, to avoid further issues. Let us know if we can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking. Giuseppe. P.S.: Karma Points are appreciated by all the contributors.
Hi marnall, thanks for the answer. So there is no way to put cold data on NFS network storage without implementing SmartStore?
Hi @marka3721, You are right! Sorry, I confused F5 with Fortinet! Anyway, take the transformations you find in the add-on's transforms.conf and try them out. The transformations to search for and verify in transforms.conf are: f5_bigip-icontrol-locallb, f5_bigip-icontrol-globallb, f5_bigip-icontrol-networking, f5_bigip-icontrol-management, f5_bigip-icontrol-system-systeminfo, f5_bigip-icontrol-system-statistics, f5_bigip-icontrol-system-disk, f5_bigip-icontrol-management-device, f5_bigip-icontrol-networking-interfaces, f5_bigip-icontrol-networking-adminip, f5_bigip-icontrol-locallb-pool, f5_bigip-icontrol-management-usermanagement. Check whether those regexes match your data or you need to modify them to adapt to your logs. If you have to modify them, remember to copy the transforms.conf file into the local folder before modifying it. Ciao. Giuseppe
Your code looks fine
Hi @DoubleAka, your message seems to be in JSON, so if you delete part of the message (for example the first part) you lose the formatting and can no longer use field extraction tools such as INDEXED_EXTRACTIONS or spath; furthermore, you save very little by deleting just one word. In any case, the SEDCMD setting uses a substitution regex, and the one you used is wrong because the quotes must be escaped and you missed the global parameter: SEDCMD-strip_event = s/^\"event\":\{\s*//g Ciao. Giuseppe
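To see what that SEDCMD would do to an event, here is a minimal replay of the same substitution in Python; the event string below is a hypothetical sample, not taken from the thread.

```python
import re

# Splunk's SEDCMD-strip_event = s/^\"event\":\{\s*//g, expressed as re.sub.
# The anchor ^ means only a leading "event":{ (plus trailing whitespace) is removed.
event = '"event":{ "level": "INFO", "msg": "started" }'
stripped = re.sub(r'^"event":\{\s*', '', event)
print(stripped)  # "level": "INFO", "msg": "started" }
```

Note that stripping only the opening fragment leaves the closing brace behind, which is one reason partial deletion breaks JSON-aware extraction.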
Your requirement isn't really clear. Not to point out the obvious difference between last (set in the first panel) and $latest$ (used in the second panel), but are you sure you can even add an additional field in the first panel and still maintain your original timechart? (Hint: it will ruin it; at least it will distort the chart.) Another important question is: what is that $latest$ supposed to be? It seems that you want it to be the interactive token, because you set it according to _time, which varies by row. I already mentioned that setting a new field after timechart will ruin your chart. But in addition, Dashboard Studio has its own regimen for managing tokens. You cannot set a variable in one search, call that variable with $...$, and expect it to be a passable token. This is the document about setting interactive tokens from search results: Setting tokens from search results or search job metadata. Then, to add 1 week to the clicked value, run that result through another search. (Just like you would do in Simple XML.) Lastly, use the result from that search to drive the second panel.
Here is an example:

{
  "visualizations": {
    "viz_7yE1ZwsT": {
      "type": "splunk.line",
      "dataSources": { "primary": "ds_DmIKSSCN" },
      "title": "First panel",
      "eventHandlers": [
        {
          "type": "drilldown.setToken",
          "options": {
            "tokens": [ { "token": "latest_tok", "key": "row._time.value" } ]
          }
        }
      ],
      "options": { "legendDisplay": "top" }
    },
    "viz_OIqDnl0b": {
      "type": "splunk.line",
      "options": { "legendDisplay": "bottom" },
      "dataSources": { "primary": "ds_79fdaiuf" },
      "showProgressBar": false,
      "showLastUpdated": false
    }
  },
  "dataSources": {
    "ds_DmIKSSCN": {
      "type": "ds.search",
      "options": {
        "query": "| tstats count where index=_internal by _time span=1d sourcetype\n| timechart span=1d sum(count) by sourcetype\n| eval _last = relative_time(_time, \"+1w\")"
      },
      "name": "first panel"
    },
    "ds_79fdaiuf": {
      "type": "ds.search",
      "options": {
        "query": "index=_introspection latest=$make token:result.week_after$\n| timechart span=1d count by sourcetype"
      },
      "name": "dependent panel"
    },
    "ds_EHm1QhZI": {
      "type": "ds.search",
      "options": {
        "query": "| makeresults\n| eval week_after = relative_time($latest_tok$, \"+1w\")",
        "enableSmartSources": true
      },
      "name": "make token"
    }
  },
  "defaults": {
    "dataSources": {
      "ds.search": {
        "options": {
          "queryParameters": {
            "latest": "$global_time.latest$",
            "earliest": "$global_time.earliest$"
          }
        }
      }
    }
  },
  "inputs": {
    "input_global_trp": {
      "type": "input.timerange",
      "options": { "token": "global_time", "defaultValue": "-3w@w,now" },
      "title": "Global Time Range"
    }
  },
  "layout": {
    "type": "grid",
    "options": { "width": 1440, "height": 960 },
    "structure": [
      { "item": "viz_7yE1ZwsT", "type": "block", "position": { "x": 0, "y": 0, "w": 1440, "h": 400 } },
      { "item": "viz_OIqDnl0b", "type": "block", "position": { "x": 0, "y": 400, "w": 1440, "h": 400 } }
    ],
    "globalInputs": [ "input_global_trp" ]
  },
  "description": "https://community.splunk.com/t5/Splunk-Search/Dashboard-Studio-earliest-latest-tokens/m-p/691740",
  "title": "Pass time token"
}

In this dashboard, when you click a point on July 13 in the first panel, the second panel will end on July 20. Is this something you are looking for?
As mentioned, we dropped one of the "s/" prefixes and also the "g" at the end:

SEDCMD-CLean_powershell_800 = s/\n\s+Context Information\:.*([\r\n]+.*){0,500}//
SEDCMD-CLean_powershell_4103 = s/\s+Context\:.*([\r\n]+.*){0,500}//
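For anyone wanting to verify what the first SEDCMD strips, here is the same substitution replayed in Python; the PowerShell event text is a made-up sample, not real log data.

```python
import re

# Equivalent of SEDCMD-CLean_powershell_800: remove the "Context Information:"
# line and up to 500 following lines from a PowerShell event.
raw = ("Engine state is changed from None to Available.\n"
       "    Context Information:\n"
       "        HostName=ConsoleHost\n"
       "        UserName=DOMAIN\\bob")
cleaned = re.sub(r"\n\s+Context Information:.*([\r\n]+.*){0,500}", "", raw)
print(cleaned)  # Engine state is changed from None to Available.
```

Since the pattern consumes everything to the end of the block, the trailing /g is indeed redundant here.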
@PickleRick When I use this time preference, there is no difference showing. So is it good to set up this setting? Is there anything else you would suggest to fix this?
Hello team, I am working with Dovecot logs (mail logs). I managed to integrate them with Splunk through syslog, and it gives me the logs in this format (attached screenshot). Now, I want to create a new field to hold the value of to/receiver. From the screenshot, the value of to/receiver is inside lda(value). NOTE: in the screenshot below I don't have to/receiver values; I just have from/sender and subject. Help me please!
No, I am not using that attribute in props.conf.
At first glance it looks relatively ok. Are you using indexed extractions?
@PickleRick Props.conf settings:

KV_MODE = xml
NO_BINARY_CHECK = true
CHARSET = UTF-8
LINE_BREAKER = <\/eqtext:EquipmentEvent>()
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
MAX_TIMESTAMP_LOOKAHEAD = 650
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3QZ
TIME_PREFIX = ((?<!ReceiverFmInstanceName>))<eqtext:EventTime>

User time preference setting
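As a rough sanity check of the TIME_FORMAT above outside Splunk: Splunk's %3Q (milliseconds) has no direct strptime equivalent, but Python's %f (1-6 fractional digits) is close. The timestamp string below is a hypothetical sample matching that format.

```python
from datetime import datetime, timezone

# Parse a timestamp shaped like TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3QZ,
# using %f in place of Splunk's %3Q and treating the trailing Z as UTC.
raw = "2024-07-13T10:20:30.123Z"  # hypothetical event time
ts = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)
print(ts.isoformat())  # 2024-07-13T10:20:30.123000+00:00
```

If events shaped like this parse cleanly here but show the wrong time in Splunk, the issue is more likely the TZ handling (the literal Z vs. the user's time zone preference) than the format string itself.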
And your current settings are...?  
Hello Splunkers!! Please help me to fix this time zone issue. Thanks in advance!!
Hi @gcusello I tried to clone it, but there is no option to clone a Dashboard Studio dashboard to a classic dashboard; Dashboard Studio dashboards can only be cloned to Dashboard Studio.
Hello Anees Ur.Rahman, Thanks for posting your question to the community. You could create either a service unit file or an init script, depending on your Linux OS. Currently, we don't provide a direct way to run the EUM as a service on Linux by default. If you need help creating either a service unit file or an init script, please reach out to your AppDynamics account manager and engage someone from the professional services team, who are responsible for implementing customers' environments, to help you with your requirements. Best regards, Xiangning
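For reference, a systemd service unit of the kind mentioned above might look like the following sketch. Everything here is an assumption — the install path, user account, and start/stop commands vary per EUM installation and must be adjusted; this is not an official unit file.

```ini
# /etc/systemd/system/appd-eum.service -- hypothetical sketch, adjust paths.
[Unit]
Description=AppDynamics EUM Server (example unit, paths are assumptions)
After=network.target

[Service]
Type=forking
User=appdynamics
# Replace with the actual start/stop scripts of your EUM installation.
ExecStart=/opt/appdynamics/EUM/eum-processor/bin/eum.sh start
ExecStop=/opt/appdynamics/EUM/eum-processor/bin/eum.sh stop
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

After placing the file, the usual `systemctl daemon-reload` and `systemctl enable --now appd-eum` would apply; the professional services team can confirm the correct commands for your version.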
The upper one (| eval ~) works! But when I refresh the page, start_time and bubble_size come out wrong. For example, this is the original data. But when I refresh the page, it shows like this. The code is this:

| eval start_time = starttime_data/1000
| eval duration = floor(duration_data/1000)
| eval start_time_bucket = 5 * floor(start_time/5)
| stats count by start_time_bucket, duration
| eval bubble_size = count
| table start_time_bucket, duration, bubble_size
| rename start_time_bucket as "Start time" duration as "Duration"

Is this a server problem, or a problem with my code?
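The bucketing arithmetic in that search can be replayed outside Splunk to confirm the eval logic itself is deterministic; the millisecond value below is a hypothetical sample.

```python
import math

# Same math as:
#   | eval start_time = starttime_data/1000
#   | eval start_time_bucket = 5 * floor(start_time/5)
starttime_data = 17231              # milliseconds (made-up sample)
start_time = starttime_data / 1000  # 17.231 seconds
start_time_bucket = 5 * math.floor(start_time / 5)
print(start_time_bucket)  # 15
```

Since the same input always yields the same bucket, values changing on refresh would point to the underlying data (or the search time range) changing between runs, not to this eval chain.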
Hi @Anonymous, I recreated the issue you described and managed to have both the Sender and Receiver Application processes shown on my AppDynamics dashboard. I didn't add the Message Queue entry points mentioned above, but I did the steps below to resolve the issue of the Receiver info not being shown.

Setting MSMQ Backends Monitoring for .NET: to enable downstream correlation for MSMQ, you must configure the agent. Documentation: MSMQ Backends for .NET.

What to do:
1. Go to "Tiers and Nodes" in the Controller UI: choose your Receiver Application node. At the top right, select "Action" > "Configure App Server Agent".
2. Configure the App Server Agent: in the "App Server Agent Configuration" modal window, click on "Apps" in the tree on the right. Select ".NET Agent Configuration" on the right panel, then click on "+".
3. Define the MSMQ correlation field: Name: msmq-correlation-field; Description: your description here; Type: String; Value: Label.

By following these steps, I was able to see the MQ details and transaction snapshots for both the sender and receiver applications in the AppDynamics controller. Hope this helps. Regards, Martina