All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I am trying to log into Splunk Enterprise with my friend's credentials, but I am unable to. Also, I am using ODBC to connect Splunk with Power BI: it works locally, but when I try to connect remotely it fails. I am having issues with the server URL and port number. Any help solving these queries would be appreciated. TIA.
Essentially, you need to extract the part that you want from the url field. For example, is it always the first two parts, or fewer, or does this only apply to particular urls? Please describe your requirement in more detail.
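For instance, if the answer turns out to be "always the first two path segments", a sketch with rex could look like this (the field name url is taken from your query; the pattern itself is an assumption about your data):

```
| rex field=url "^(?<url_prefix>/[^/]+/[^/]+)"
| stats count by url_prefix
```

Adjust the regular expression once you know which part of the url you actually need.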
Please can you share the source of (the relevant parts of) your dashboard so we can see which settings you have used?
Hi splunkers! I have a question about memory. In my Splunk monitoring console, I see approx 90% of memory used by Splunk processes; the amount of memory is 48 GB. In my vCenter, I can see that only half of the assigned memory is used (approx 24 GB of the 48 GB available). Which one is telling me the truth: Splunk monitoring or vCenter? And overall, is there something to configure in Splunk so it can use the entire available memory? Splunk 9.2.2 / Red Hat 7.8. Thank you. Olivier.
Hey hgarnica, I have the same issue: I was not able to run the search from Power BI. What kind of modifications or permissions do I need to provide, and what should the sample URL for connecting to Splunk look like? I am using https://hostname:8089. Do we need to specify anything else, such as a particular app name? Thanks in advance; awaiting your response.
I have created a stacked bar chart based on a data source (query), and everything works with one exception: when the query runs, I have to select each data value to display under Data Configuration - Y. All of my desired values show up there, but they are not selected by default, so the chart is blank until I select them. Is there a way to have them selected by default?
Has there been any further information regarding this error? I am still unable to install the app in Splunk.
My query is:

index=stuff
| search "kubernetes.labels.app"="some_stuff" "log.msg"="Response" "log.level"=30 "log.response.statusCode"=200
| spath "log.request.path"
| rename "log.request.path" as url
| convert timeformat="%Y/%m/%d" ctime(_time) as date
| stats min("log.context.duration") as RT_fastest max("log.context.duration") as RT_slowest p95("log.context.duration") as RT_p95 p99("log.context.duration") as RT_p99 avg("log.context.duration") as RT_avg count(url) as Total_Req by url

and I am getting the response in the attached screenshot. I want to group all the similar APIs, e.g. all the /getFile/* paths, as one API and get the average time.
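One possible approach, sketched under the assumption that the variable part always follows /getFile/, is to normalize the url before the stats so all those requests collapse into one row:

```
| eval url=replace(url, "^/getFile/.*", "/getFile/*")
| stats avg("log.context.duration") as RT_avg count as Total_Req by url
```

You would add a similar replace() for each family of paths you want grouped together.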
It did not work. It is still giving all the events other than the expected one.
Hi, I have events containing multiple countries. I want to count the country field over different time ranges, sorted from the highest count to the lowest. For example:

Country    Last 24h    Last 30 days    Last 90 days
US         10          50              100
Aus        8           35              80

I need a query; kindly assist me.
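A sketch of one way to do this (the field name Country and index name are assumptions; the search is run over the last 90 days so the widest column covers everything):

```
index=your_index earliest=-90d
| eval last24h=if(_time >= relative_time(now(), "-24h"), 1, 0)
| eval last30d=if(_time >= relative_time(now(), "-30d"), 1, 0)
| stats sum(last24h) as "Last 24h" sum(last30d) as "Last 30 days" count as "Last 90 days" by Country
| sort - "Last 90 days"
```

Each eval flags whether an event falls inside the narrower window, and the single stats pass sums those flags per country.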
I have ingested data from InfluxDB into Splunk Enterprise using the InfluxDB add-on from Splunk DB Connect. When I perform a search in the SQL Explorer of the created InfluxDB connection, I get empty values for the value column. Query:

from(bucket: "buckerName")
|> range(start: -6h)
|> filter(fn: (r) => r._measurement == "NameOfMeasurement")
|> filter(fn: (r) => r._field == "value")
|> yield(name: "count")

Splunk DBX Add-on for InfluxDB JDBC
Up
Apologies for the basic question. As a PoC, we have been provided a Splunk Enterprise Trial License. We first want to ingest Palo Alto logs and fire alerts (by email, etc.), but we do not know how to go about it. (We were able to ingest past logs manually, but I assume alerts cannot be triggered on past logs. Would they fire if we set the dates to the present? We also do not yet know how to set up alerting.) Our environment is a single virtual server on FJCloud running Splunk; we have not installed Forwarders on any other servers. If anyone knows, we would greatly appreciate your guidance. Thank you in advance.
Hi @Poojitha, you can add multiple tokens on the same configuration page! Please refer to the image I am attaching. Is this what you are looking for?
Hello @somesoni2, what a great idea to name them the same way and use upper/lower case to distinguish between eventtype & EventType...
Hi @gcusello, thanks for the feedback. I wanted to understand: if we change the OS on the first physical node after restoring the backup, that node will be running Red Hat while the other 3 are still running CentOS. Will this node still be part of the cluster? Can servers with different OSes be part of the same cluster? Thanks
Try something like this:

| streamstats count by ReasonCode EquipmentName reset_on_change=t global=f
| where count=1
I migrated to v9.1.5 and have the TA-XLS app installed, which had been working on v7.3.6. Running 'outputxls' generates a 'cannot concat str to bytes' error for the following line of the outputxls.py file in the app:

try: csv_to_xls(os.environ['SPLUNK_HOME'] + "/etc/apps/app_name/appserver/static/fileXLS/" + output)

Tried encoding by appending .encode('utf-8') to the string: not working.
Tried importing the SIX and FUTURIZE/MODERNIZE libraries and running them to "upgrade" the script: it just added `from __future__ import absolute_import` and changed a line: not working.
Tried defining each variable separately, and some other variations: not working.

splunk_home = os.environ['SPLUNK_HOME']
static_path = '/etc/apps/app_name/appserver/static/fileXLS/'
output_bytes = output
csv_to_xls((splunk_home + static_path.encode(encoding='utf-8') + output))

I sort of rely on this app to work; any kind of help is appreciated! Thanks!
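For what it's worth, the error message suggests that under Python 3 the `output` value arrives as bytes, so the fix is likely to decode the bytes value rather than encode the str parts. A minimal sketch, with the fallback path and filename as assumed placeholder values:

```python
import os

# assumed placeholder fallback; in the app this comes from the Splunk environment
splunk_home = os.environ.get("SPLUNK_HOME", "/opt/splunk")
static_path = "/etc/apps/app_name/appserver/static/fileXLS/"
output = b"report.xls"  # hypothetical filename; arrives as bytes under Python 3

# decode the bytes value so every operand is str; this avoids
# the "cannot concat str to bytes" TypeError
full_path = splunk_home + static_path + output.decode("utf-8")
print(full_path)
```

The same one-argument change (output.decode('utf-8') in place of output) would apply inside the csv_to_xls call in outputxls.py.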
@mrilvan there is only a Splunk app for that at the moment and nothing on the SOAR side. However, if the API is available, there is nothing stopping you from building a custom app in the platform, as I am sure XSOAR is just another REST API.
You appear to be missing part of the answer: hot and warm buckets are normally stored on expensive, fast storage, whereas (in order to reduce costs) cold buckets are stored on cheaper, slower storage. Using these distinctions, Splunk gives organisations the flexibility to manage the cost of their storage infrastructure.
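As a sketch, this hot/warm vs cold split is typically expressed per index in indexes.conf (the index name and paths below are assumptions, not values from your environment):

```
[my_index]
# hot and warm buckets on fast, expensive storage
homePath   = /fast_ssd/splunk/my_index/db
# cold buckets rolled to cheaper, slower storage
coldPath   = /cheap_nas/splunk/my_index/colddb
thawedPath = /cheap_nas/splunk/my_index/thaweddb
```

Buckets roll from homePath to coldPath automatically as they age, which is how the cost tiering is realised in practice.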