All Posts
Sorry for the late reply. I tried your command and I'm getting an error.
Tried this and I'm not getting any results from the subsearch:

| appendcols
    [ search index="dynatrace"
        [| makeresults
         | eval earliest=relative_time(now(),"$tr_14AGuxUA.earliest$"), latest=relative_time(now(),"$tr_14AGuxUA.latest$")
         | table earliest latest]
      | spath output=user_actions path="userActions{}"
      | stats count by user_actions]

(Note: my original paste was missing the closing bracket on the appendcols subsearch.)
Are you sure? The stanza is still different... or must we list every monitored log in detail? Thanks!
Try something like this:

search index="dynatrace"
    [| makeresults
     | eval earliest=relative_time(now(),"$tr_14AGuxUA.earliest$"), latest=relative_time(now(),"$tr_14AGuxUA.latest$")
     | table earliest latest]
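Putting the pieces together, the full comparison search might look like the sketch below. The index, token names, and userActions path come from this thread; the field names count_tr1 and count_tr2 are placeholders I made up. Also note that appendcols pairs rows by position, not by value, so if the two time ranges can return different sets of user actions, a join on user_actions may be safer:

```
index="dynatrace"
| spath output=user_actions path="userActions{}"
| stats count as count_tr1 by user_actions
| appendcols
    [ search index="dynatrace"
        [| makeresults
         | eval earliest=relative_time(now(),"$tr_14AGuxUA.earliest$"), latest=relative_time(now(),"$tr_14AGuxUA.latest$")
         | table earliest latest]
      | spath output=user_actions path="userActions{}"
      | stats count as count_tr2 by user_actions]
```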
@nw - Yes, I have seen the issue. "Program Files" has a space in the path; that's the problem. I would switch to Linux or Mac instead, as Windows somehow doesn't handle it well. Alternatively, on Windows you can install Splunk in any folder, so install it in a folder whose path has no spaces.   I hope this helps!!!
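For example, the Windows MSI installer lets you pick a space-free target directory from the command line (a sketch; the MSI filename here is a placeholder for whatever version you downloaded):

```
msiexec.exe /i splunk-enterprise.msi INSTALLDIR="C:\Splunk" AGREETOLICENSE=Yes /quiet
```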
@Awanish1212 - Try below:

index="production" host="abc.com-*" source="Log-*"
| eval ID=substr(host,9,7)
| eval Sub_ID = mvindex(split(source,"-"),2)
| stats dc(RP_Remote_User) as events by ID, Sub_ID
| stats list(Sub_ID) as Sub_ID, list(events) as events by ID

I hope this helps!!! Kindly upvote if it does!!!
@arifsaha - Here is one scenario: you have LDAP set up on Splunk, which means your Splunk password is the same as your AD password, so you don't want to expose the AD password anywhere; you would rather share a token. This is just one example, but the basic idea is that you grant access without sharing the password. Plus, you can time-bound the token.   I hope this helps!!!
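For instance, a Splunk authentication token can be used against the REST API with a Bearer header (a sketch; the hostname, port, and token are placeholders):

```
curl -k -H "Authorization: Bearer <your-token>" \
     https://splunk.example.com:8089/services/authentication/current-context
```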
If I use this in the subsearch - earliest=$tr_14AGuxUA.earliest$ latest=$tr_14AGuxUA.latest$ - then I get this error:    Invalid value "2023-10-16T14:00:00.000Z" for time term 'earliest'
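That error suggests the time picker token is emitting ISO-8601 strings, which the earliest/latest time terms don't accept directly. One possible workaround (a sketch, assuming the token value really is in that ISO format; the strptime format string may need adjusting for your exact values) is to convert it to epoch time in a subsearch:

```
index="dynatrace"
    [| makeresults
     | eval earliest=strptime("$tr_14AGuxUA.earliest$", "%Y-%m-%dT%H:%M:%S.%3N%Z"),
            latest=strptime("$tr_14AGuxUA.latest$", "%Y-%m-%dT%H:%M:%S.%3N%Z")
     | table earliest latest]
```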
Try something like this (note that the rollover colour changes are disabled by this):

<dashboard version="1.1" theme="light">
  <label>Trellis</label>
  <row>
    <panel depends="$alwayshide$">
      <html>
        <style>
          #trellis div.facets-container div.viz-panel:nth-child(1) g.highcharts-series path { fill: red !important; }
          #trellis div.facets-container div.viz-panel:nth-child(2) g.highcharts-series path { fill: green !important; }
          #trellis div.facets-container div.viz-panel:nth-child(3) g.highcharts-series path { fill: blue !important; }
          #trellis div.facets-container div.viz-panel:nth-child(4) g.highcharts-series path { fill: yellow !important; }
        </style>
      </html>
    </panel>
    <panel>
      <chart id="trellis">
        <search>
          <query>| makeresults count=100
| eval _time=relative_time(_time,"@h")-(random()%(5*60*60))
| eval Category="Category ".mvindex(split("ABCD",""),random()%4)
| eval Value=random()%100
| timechart span=1h avg(Value) as AvgValue_Secs by Category</query>
          <earliest>-5h@h</earliest>
          <latest>@h</latest>
        </search>
        <option name="charting.axisTitleX.visibility">collapsed</option>
        <option name="charting.axisTitleY.visibility">collapsed</option>
        <option name="charting.axisTitleY2.visibility">collapsed</option>
        <option name="charting.chart">column</option>
        <option name="charting.drilldown">none</option>
        <option name="charting.legend.placement">none</option>
        <option name="refresh.display">progressbar</option>
        <option name="trellis.enabled">1</option>
      </chart>
    </panel>
  </row>
</dashboard>
Hi, To get your logs visible in Log Observer, you would configure your K8S OTel deployment to send your logs to a Splunk platform instance (Enterprise or Cloud) via HEC (HTTP Event Collector). Then, in Splunk Observability Cloud, you would configure the Log Observer Connect integration to read logs from your Splunk (Cloud or Enterprise) and display them in Log Observer. The important part to understand about this approach is that the logs are not ingested or stored in Splunk Observability Cloud; they are just displayed there after being read from your Splunk platform instance (Enterprise or Cloud). Once they're visible to Log Observer, it can do good things like correlate the logs to metrics and traces.
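Once the OTel collector is sending to HEC, a quick way to confirm the logs are landing on the Splunk platform side is a simple search (a sketch; the index name and sourcetype pattern here are assumptions - use whatever index your HEC token routes to):

```
index="k8s_logs" sourcetype="kube:container:*"
| head 10
```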
I have a query to retrieve user-experience metrics from the Dynatrace index. I want to compare the response times for 2 different time frames. My query has a subsearch as well. In the dashboard, I have 2 time range pickers. The main query picks the time range from time range picker 1, and the subsearch uses the token from time range picker 2.

<<main search>> | appendcols [ search index="dynatrace" $tr_14AGuxUA.earliest$ - $tr_14AGuxUA.latest$ | spath output=user_actions path="userActions{}" | stats count by user_actions]

This is not retrieving any data from the subsearch. How do I fix this? If I pass the hard-coded values - earliest=10/23/2023:10:00:00 latest=10/23/2023:11:00:00 - then it works fine.
I can't access the support portal - I get redirected to https://www.splunk.com/404?ErrorCode=23&ErrorDescription=Invalid+contact   Does anyone have the same issue?
I have opened port 8088 in Windows Defender but the result is the same. Does anybody have an idea?
A couple of things:
- What user are you running this command as, and what user is Splunk installed as?
- Are you in bash? If you don't quote your credentials correctly then they won't get expanded correctly: Solved: Getting error "Could not look up HOME variable. Au... - Splunk Community
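To illustrate the quoting point (a generic shell sketch, not tied to any particular Splunk command): inside double quotes the shell expands `$$` to the current process ID, silently corrupting a password that contains it, while single quotes pass it through literally:

```shell
# Double quotes: the shell expands $$ to its PID, so the
# credential below is NOT what you typed:
echo "admin:p@$$w0rd"

# Single quotes: passed through literally, as intended:
echo 'admin:p@$$w0rd'
```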
I don't think you can monitor the same "base path" twice. An ugly hack to work around that is to use (hard/soft) links.
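For example (a sketch; the paths and index names are invented for illustration), you could symlink the directory and then monitor each path with different settings:

```
# Create a second path to the same directory:
ln -s /var/log/myapp /var/log/myapp_link

# inputs.conf - two monitor stanzas over what is really one directory:
[monitor:///var/log/myapp]
index = main

[monitor:///var/log/myapp_link]
index = archive
```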
Hi @rphillips_splk - can bar be an environment variable? Thanks
You should be able to, although it isn't called out in the docs for serverclass.conf directly. There are a couple of other configuration parameters you can set to get a bit of logic into the matching, too, if that is helpful:

whitelist.where_field
whitelist.where_equals
blacklist.where_field
blacklist.where_equals

If you think the docs are unclear and should include a multiple-wildcard example, then I suggest submitting feedback via the form at the bottom of every Splunk docs page. That team has always been responsive about improving the documentation.
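As a hedged sketch of how those parameters fit together (the server class name and values are invented, and the field name is an assumption - check serverclass.conf.spec for the fields available in your version):

```
[serverClass:linux64_hosts]
whitelist.0 = *
whitelist.where_field = utsname
whitelist.where_equals = linux-x86_64
```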
@meshorer I have led you to the water; now you need to learn how to drink it. I don't have the knowledge, nor the time, to work out straight away how to do it the way you need, but you have the logs; now you just need to do the engineering piece and make it work in your SIEM. There is also a REST capability, so you could have a script or some other REST client in your SIEM grab the data via REST. Maybe consider changing the logs to JSON: Before you begin, configure Splunk SOAR (On-premises) with the JSON log format by issuing the following command from the Splunk SOAR console: $phenv set_preference --logging-format json Happy SOARing!
Hi Giuseppe I already have a lookup table created. My question is if it is possible to import that into the Splunk Lookup file editor and not create a new one from there.
The message says it all - your curl sent SYN packets but never got any reply. Which means that even if your port is open, it's probably filtered by your local firewall (since you're connecting to loopback device it can't be anything on external network). Check your iptables/firewalld config and open that port so that you can connect. Whether the port is open by Splunk is another question and you'll see as soon as you "poke a hole" in your firewall.
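For example (assuming firewalld or plain iptables, and the HEC port 8088 mentioned earlier in the thread - adjust the port to whatever you are actually binding):

```
# firewalld
firewall-cmd --permanent --add-port=8088/tcp
firewall-cmd --reload

# or plain iptables
iptables -I INPUT -p tcp --dport 8088 -m state --state NEW -j ACCEPT
```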