All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello Splunk experts, I'm working on a dashboard where I would like to have a statistical table with a line chart in the last column, as shown in the picture below. Could you please propose a solution? Thank you in advance.
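A possible starting point, assuming the table is built with stats and a sparkline can serve as the line-chart column (the index, sourcetype, and grouping field here are placeholders):

index=main sourcetype=your_sourcetype
| stats sparkline(count) as Trend, count as Total by host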
Hi, I would like to ask how to ingest BitWarden event logs into Splunk Cloud. I could not find any apps for this purpose in Splunk Cloud. Thanks.
How can I modify the transforms.conf file so that, when I ingest the data, it throws away all events that have the status FAILED after the first IP address?
Hello, I am not able to access the Support -> Support portal page in Splunk Web. It shows the error: "Sorry, we couldn't find that page. The page you're looking for may have been moved or deleted."
Hi, Has anyone tried this kind of integration, where an on-prem Splunk instance is integrated with a cloud WAF? The cloud WAF has an integration package purpose-built for Splunk (in .spl format). Any insights? TIA, A
Hi, I need a regular expression to extract a "timed out" field, using the log below: "Description":"Job-2069950 Error in [InfrastructureServices/Dispatcher/Interface/MQ_InterfaceDispatcher.process/JMS Queue Requestor]\nActivity timed out\n\tat com.tibco.pe.core. ... Please help me write the regular expression using the rex command.
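A possible rex sketch, assuming the raw event literally contains the text "Activity timed out" (the index/sourcetype and the field name error_status are placeholders):

index=main sourcetype=tibco_bw
| rex field=_raw "Activity\s+(?<error_status>timed out)"
| table _time error_status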
Hi Splunkers, I have results in the following form:

Country   Count
Japan     50
USA       70
China     60
India     50
Tokyo     5

Goal: I want to add the count of Tokyo to Japan (55) and remove Tokyo from the results. Could anyone suggest a way to do this? TIA
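A possible approach, assuming the fields are literally named Country and Count: remap Tokyo to Japan with eval, then re-aggregate:

| eval Country=if(Country="Tokyo", "Japan", Country)
| stats sum(Count) as Count by Country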
Does anyone know the formatting for doing a POST using the splunkjs Service.Endpoint class against an inputs.conf? I can do a GET like this:

let splunk_service = splunkjs.mvc.createService({owner: 'nobody'})
let endpoint = new splunkjs.Service.Endpoint(splunk_service, "data/inputs/mymod/test")
endpoint.get()

But a POST isn't working:

endpoint.post({'myfield': '123abc'})

Sorry for the formatting; I'm on mobile and see no way to add code blocks.
Hi, I added a URL to the Website Monitoring app, but the URL requires a certificate. I did not see an option to add the certificate to the app, and I am not getting any response code when I try to hit the endpoint. Please let me know how I can add the certificate to the app.
Hey Splunk People, I'm running a search against a CSV file:

| inputlookup "GSOCdata_230717.csv"
| fields source_address, destination_address, protocol_id, destination_port, psrsvd_gc
| stats sum(psrsvd_gc) as count by source_address, destination_address, protocol_id, destination_port

This builds a table with the specified fields from the CSV file. Can I filter the data down to a smaller output table? I'd like to exclude certain IP addresses from the output of this command. I've tried using CIDR notation for my address space, but it just chokes. I've also tried piping to eval source_address=172.16.50.0/24, but it doesn't seem to like that either. Do you have a suggestion for doing this? I worked around it by building another CSV file with the data already filtered down to what I want, but it seems like this should be solvable in a more elegant way. Thanks, Paul Diggins
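A possible sketch for excluding a subnet, assuming the 172.16.50.0/24 range mentioned above; cidrmatch handles the CIDR comparison that a plain eval or search term chokes on:

| inputlookup "GSOCdata_230717.csv"
| where NOT cidrmatch("172.16.50.0/24", source_address)
| stats sum(psrsvd_gc) as count by source_address, destination_address, protocol_id, destination_port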
Hi, I have a table where I obtained the values after sorting PCT_FREE in ascending order. Now I want to plot a timechart of the average of USED_SPACE only for the first two entities, "B" and "D". However, when I plot the timechart it takes all the entities. Any suggestions on how I can exclude the rest of the entities and plot the timechart?

time              PCT_FREE  USED_SPACE  Entity
17/07/2023 16:15  10.4      245         B
16/07/2023 16:15  10.5      233         B
15/07/2023 16:15  10.3      235         B
14/07/2023 16:15  10.6      232         B
17/07/2023 16:15  11        245         D
16/07/2023 16:15  11        233         D
15/07/2023 16:15  11        235         D
14/07/2023 16:15  11        232         D
17/07/2023 16:15  12        245         A
16/07/2023 16:15  12        233         A
15/07/2023 16:15  12        235         A
14/07/2023 16:15  12        232         A
17/07/2023 16:15  14        245         C
16/07/2023 16:15  14        233         C
15/07/2023 16:15  14        235         C
14/07/2023 16:15  14        232         C
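A possible sketch, assuming the goal is simply to keep entities B and D before charting (field names taken from the table above; append this to your existing base search):

| search Entity IN ("B", "D")
| timechart span=1d avg(USED_SPACE) as avg_used_space by Entity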
Hello, I would like to create a role which allows adding roles to users within a limited perimeter. The goal is to delegate part of the user/role mapping to super-power-users according to their perimeter. For example: user1 can add only role1 (and no other role). I have already tried the following authorize.conf configurations:

[role_superpoweruser]
edit_user = enabled
edit_roles_grantable = enabled
grantableRoles = role1;
=> allows adding ALL roles (including roles other than role1)

[role_superpoweruser]
edit_user = enabled
grantableRoles = role1;
=> the user can add role1, but it removes ALL other roles

Any help would be greatly appreciated.
Hello Everyone, I am trying to integrate Google logs into Splunk. I downloaded version 2.5.1 of the Splunk add-on. I am facing a few problems:
1. In the Google app for Splunk, while setting up the account, there is a field called "Service account certificate". What do I enter there? Should I copy the entire certificate and paste it in?
2. I get errors like this:

2023-07-18 13:36:14,769 ERROR pid=4110446 tid=MainThread file=gws_gmail_logs.py:stream_events:159 | Exception raised while ingesting data for gmail: 404 Not found: Dataset splunk-387815:gmail_logs_dataset was not found in location EU Location: EU Job ID: c504d67b-c171-49c9-9085-70831fad4353 . Traceback:
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/Splunk_TA_Google_Workspace/bin/gws_gmail_logs.py", line 126, in stream_events
    results = query_job.result(page_size=BIGQUERY_RESULT_PAGE_SIZE)
  File "/opt/splunk/etc/apps/Splunk_TA_Google_Workspace/lib/google/cloud/bigquery/job/query.py", line 1499, in result
    do_get_result()
  File "/opt/splunk/etc/apps/Splunk_TA_Google_Workspace/lib/google/api_core/retry.py", line 354, in retry_wrapped_func
    on_error=on_error,
  File "/opt/splunk/etc/apps/Splunk_TA_Google_Workspace/lib/google/api_core/retry.py", line 191, in retry_target
    return target()
  File

I am guessing it is related to authentication and authorization. It would be great to get inputs/suggestions from the community, especially from users who have already done this integration. Best, PR
Hello! I am getting an error when trying to add a "hello world" event to my Splunk Cloud trial environment, and I'm looking for help to figure out what the issue could be. It looks like an SSL error. I added certs to my Ubuntu VM, but that didn't seem to help. Maybe it's a network firewall error of some kind? Is the source IP of my environment somehow blacklisted on your end? I'm not certain, but I think that could be the case; maybe you can help me.
From a VM in my virtual network, I can connect fine. Here's a simple connection test:

sudo telnet prd-p-ki5a7.splunkcloud.com 8088
Trying 44.206.98.245...
Connected to prd-p-ki5a7.splunkcloud.com.

When I try this curl command...

curl -k https://prd-p-ki5a7.splunkcloud.com:8088/services/collector/event -H "Authorization: Splunk <token>" -d '{"event": "hello world"}'

...I get this:

curl: (35) OpenSSL SSL_connect: Connection reset by peer in connection to prd-p-ki5a7.splunkcloud.com:8088

However, if I try it from a different network, it works fine. Any way you can help me troubleshoot this one?
Hello, I have this search for a chart that counts values weekly and splits them by day of the week. Is there any option to show this graph a little more compactly, grouping the daily results so that it shows just the week, like in the example screenshots?
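A possible sketch, assuming a one-week timechart span gives the per-week grouping described (the base search here is a placeholder for your existing one):

index=main sourcetype=your_sourcetype
| timechart span=1w count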
We are trying to configure a UF and a Splunk indexer to use SSL certs to create secure comms between the two, and for some reason it does not create the secure connection. Any assistance you can provide would be great.

Below is the indexer inputs.conf:

[default]
host = xxxspl01x

[splunktcp-ssl://9998]
compressed = true
disabled = 0

[SSL]
serverCert = $SPLUNK_HOME/etc/auth/certs/splunkweb/xxxpl01x.pem
sslPassword =
requireClientCert = true
sslVersions = tls1.2
cipherSuite = ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDH-ECDSA-AES256-GCM-SHA384:ECDH-ECDSA-AES128-GCM-SHA256:ECDH-ECDSA-AES128-SHA256:AES256-GCM-SHA384:AES128-GCM-SHA256:AES128-SHA256
sslCommonNameToCheck = xxxx01x.xxx.yyyy.com
sslAltNameToCheck = xxxx01x

Below is the UF outputs.conf:

# DoS Universal Forwarder outputs - new
[tcpout]
defaultGroup = xxxIndexer-group

[tcpout:dosIndexer-group]
server = z.z.z:9998
disabled = 0
clientCert = $SPLUNK_HOME\etc\auth\DOS\xxx01x.pem
sslPassword = removed
useClientSSLCompression = true
sslVerifyServerCert = true
sslVerifyServerName = true
sslCommonNameToCheck = yyy01y.xxx.yyy.com
sslAltNameToCheck = yyy01y
cipherSuite = ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDH-ECDSA-AES256-GCM-SHA384:ECDH-ECDSA-AES128-GCM-SHA256:ECDH-ECDSA-AES128-SHA256:AES256-GCM-SHA384:AES128-GCM-SHA256:AES128-SHA256

[tcpout-server://x.x.x.xx:9998]

I then run the search below:

index=_internal source=*metrics.log* group=tcpin_connections
| dedup hostname
| table _time hostname version sourceIp destPort ssl

The results are:
_time = system time
hostname = host name
version = 9.0.5
sourceIp = ip address
destPort = 9997
ssl = false
I have the events below being generated; they list file counts for different directories by date, producing output with the headers "Directory", "Date" and "FileCount". I need assistance with a rex to organize this data into a table so that I can set up a dashboard for it.

"Directory","Date","FileCount"
"E:\test\IEX\app1\Incoming","7/18/2023","12"
"E:\test\IEX\Processed\Success","7/14/2023","11922"
"E:\test\IEX\Processed\Success","7/15/2023","319"
"E:\test\IEX\Processed\Success","7/16/2023","449"
"E:\test\IEX\Processed\Success","7/17/2023","14264"
"E:\test\IEX\Processed\Success","7/18/2023","414"
"E:\test\IEX\Error","7/13/2023","170"
"E:\test\IEX\Error","7/14/2023","176"
"E:\test\IEX\Error","7/15/2023","1"
"E:\test\IEX\Error","7/17/2023","146"
"E:\test\IEX\Error","7/18/2023","3"
"E:\test\IEX\Error","7/10/2023","244"
"E:\test\IEX\Error","7/11/2023","194"
"E:\test\IEX\Error","7/12/2023","189"
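A possible rex sketch, assuming each event contains one quoted CSV row as shown above (the index and sourcetype are placeholders):

index=main sourcetype=file_counts
| rex field=_raw "\"(?<Directory>[^\"]+)\",\"(?<Date>[^\"]+)\",\"(?<FileCount>\d+)\""
| table Directory Date FileCount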
Hello everyone, we are trying to monitor a database, but every time we run the query we get the error "an error occurred while fetching data: the agent did not respond after 70 seconds". Has anyone had this issue before, and how did you solve it?
The Splunk Python readiness app is not being pushed from the deployer to the SH cluster. The deployer server is also running as MC, LM, indexer master node, and deployment server. I tried to push the app with the merge_to_default push mode on the deployer. Any suggestions for troubleshooting this issue?
Hi, just reaching out as I'm stumped; very grateful for any assistance. This query returns the following in the statistics tab:

index="ds" (tags_rule="Jason" OR tags_rule="Bill" OR tags_rule="Smithy")
| timechart span=1d dc(Device_Name) as Number_of_Devices by tags_rule

The next step I'd like to take is to count up all the values in the columns and group them by the respective month, so it would look like the below. I'm just not having any luck figuring out the right query. Thanks in advance!
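A possible sketch, assuming that summing the daily distinct-device counts into monthly totals is acceptable (a device active on several days would be counted once per day):

index="ds" (tags_rule="Jason" OR tags_rule="Bill" OR tags_rule="Smithy")
| timechart span=1d dc(Device_Name) as Number_of_Devices by tags_rule
| bin _time span=1mon
| stats sum(*) as * by _time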