All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello all. Basically, I can't use the Splunk Cloud Trial; it constantly throws "An internal error was detected when creating the stack."
Hi Team, We installed these apps on our License Master as part of the IT Essentials Work app: SA-ITSI-Licensechecker and SA-UserAccess (https://docs.splunk.com/Documentation/ITEWork/4.10.2/Install/Install#Install_IT_Essentials_Work_in_a_distributed_environment). Then we saw this license appear: "IT Service Intelligence Internals *DO NOT COPY*". Can this license be used for production data ingestion? Can you confirm that this license is included in the IT Essentials Work app itself?
See title; I'm looking for an old version of Splunk to test something. I know it will not be supported, I am just curious. Are any 6.x UFs still available for download? I could not find anything before 7.x on the site.
Hello Everyone, I am looking to find the Splunk product published date from the internal logs. Does anyone know if this information is already being logged somewhere? I know the README.txt file contains the product release month and year, but I'm looking to get this info from the internal logs. Please guide me if anyone has noticed this information logged somewhere. Thanks for your help in advance! Regards, BK
Hi, In our dashboard every panel and also every input contains a field "id". We want to hide or show them using tokens. How can we hide/show them referring to the id? Is this possible? No CSS please.
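In Simple XML there is no built-in show/hide keyed off the id attribute itself; the standard token-based mechanism is the depends (and rejects) attribute, which can sit on the same element that carries the id. A sketch with hypothetical token and id names:

```xml
<fieldset>
  <!-- shown only while $show_extra$ is set -->
  <input id="input_region" type="dropdown" token="region" depends="$show_extra$">
    ...
  </input>
</fieldset>
<row>
  <!-- hidden whenever $show_extra$ is set -->
  <panel id="panel_summary" rejects="$show_extra$">
    ...
  </panel>
</row>
```

A `<change>` block on another input can then `<set>` or `<unset>` the controlling token, so no CSS or JavaScript is needed.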
Our app has a functionality where users can create alerts for specific events. Unfortunately the users do not have the rights to create saved searches (we are on a multi-tenant platform, so we cannot change user rights). The code for this is:

var service = mvc.createService();
var mySavedSearches = service.savedSearches();
mySavedSearches.init(admin_service, {app: "APP", sharing: "app"});

// Create a saved search/report as an alert.
// service.savedSearches().create(alertOptions, function (err, alert) {
mySavedSearches.create(alertOptions, function (err, alert) {
    console.log("ALERT");
    // Error checking.
    if (err && err.status === 409) {
        console.error("ERROR: A saved alert with the name '" + alertOptions.name + "' already exists");
        error(alertOptions.name);
        return;
    } else if (err) {
        console.error("There was an error creating the alert:", err);
        return;
    }
    // Confirmation message.
    console.log("Created alert: " + alert.name);
});

When logged in as an admin user, the saved searches are created. However, when logged in as a normal user, the following error appears:

User 'user' with roles { db_connect_user, user } cannot write: /nobody/APP/savedsearches/test_saved_search { read : [ admin, user ], write : [ admin ] }, export: app, removable: no, modtime: 1559130962.504602000

Would it be possible to create these saved searches as admin, for instance by creating a service with the admin user? How could I do this? I have tried:

var service = mvc.createService({ owner: "admin" });

but this did not work.
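A note on why `owner: "admin"` has no effect: `mvc.createService()` always authenticates as the currently logged-in user; the `owner` option only changes the namespace being addressed, not the identity. One sketch of the alternative, using the Splunk JavaScript SDK's `Service` with explicit admin credentials — host, user, and password here are placeholders, and embedding admin credentials in browser code is insecure, so in practice this would run server-side (or behind a proxy) on the users' behalf:

```javascript
// Sketch only: create the alert with an admin-authenticated Service so
// end users never need write permission themselves. Assumes the
// splunk-sdk Node.js package; load real credentials from a secrets store.
var splunkjs = require("splunk-sdk");

var adminService = new splunkjs.Service({
    scheme: "https",
    host: "splunk.example.com",   // hypothetical host
    port: 8089,
    username: "admin",
    password: "changeme"
});

adminService.login(function (err, success) {
    if (err || !success) {
        console.error("Login failed:", err);
        return;
    }
    // Create the saved search in the app's shared namespace.
    adminService.savedSearches({ app: "APP", sharing: "app" })
        .create(alertOptions, function (err, alert) {
            if (err) {
                console.error("There was an error creating the alert:", err);
                return;
            }
            console.log("Created alert: " + alert.name);
        });
});
```

The design trade-off: a small backend endpoint that performs this call keeps the admin token off the client while still letting tenant users create alerts.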
Hi, I would like to request your assistance in either changing the username of this account or deleting it. Regards,
The Database Agent doesn't start and gives the below error message. The port is already opened to both the Controller and the Events Service.

[main] 03 Nov 2021 14:31:07,366 INFO Agent - Agent Install Directory [/appdynamic/db-agent-21.9.0.2521]
[main] 03 Nov 2021 14:31:07,366 INFO Agent - Using Agent Version [Database Agent v21.9.0.0 GA compatible with 4.5.2.0 Build Date 2021-09-22]
[main] 03 Nov 2021 14:31:07,367 INFO Agent - JVM Runtime: java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.181-7.b13.el7.x86_64/jre java.vm.vendor=Oracle Corporation java.vm.name=OpenJDK 64-Bit Server VM java.version=1.8.0_181 java.specification.version=1.8 java.runtime.version=1.8.0_181-b13 java.io.tmpdir=/tmp user.language=en user.country=US user.variant= Default locale=en_US
[main] 03 Nov 2021 14:31:07,367 INFO Agent - OS Runtime: os.name=Linux os.arch=amd64 os.version=3.10.0-957.el7.x86_64 user.name=aurahman user.home=/home/aurahman user.dir=/appdynamic/db-agent-21.9.0.2521
[main] 03 Nov 2021 14:31:07,367 INFO Agent - JVM Args : -XX:+HeapDumpOnOutOfMemoryError | -XX:OnOutOfMemoryError=kill -9 %p | -Xms1024m | -Xmx11264m |
[main] 03 Nov 2021 14:31:07,368 INFO Agent - JVM Runtime Name: 5493@apdyn-pr-da1.moh.gov.sa
[main] 03 Nov 2021 14:31:07,368 INFO Agent - JVM PID: 5493
[main] 03 Nov 2021 14:31:07,368 INFO Agent - Default Database Agent is resolving bootstrap info....
[main] 03 Nov 2021 14:31:07,418 INFO AgentUtil - Default Host Identifier Resolver using host name for unique host identifier [apdyn-pr-da1.moh.gov.sa]
[main] 03 Nov 2021 14:31:07,421 INFO AgentUtil - Default IP Address Resolver found IP addresses [[192.168.122.1, fe80:0:0:0:215:5dff:fef6:695b%eth0, 10.0.195.152]]
[main] 03 Nov 2021 14:31:07,421 INFO AgentUtil - Full Agent Registration Info Resolver found system property [appdynamics.agent.applicationName] for application name [Database Monitoring]
[main] 03 Nov 2021 14:31:07,422 INFO AgentUtil - Full Agent Registration Info Resolver found system property [appdynamics.agent.tierName] for tier name [Database Monitoring]
[main] 03 Nov 2021 14:31:07,422 INFO AgentUtil - Full Agent Registration Info Resolver found system property [appdynamics.agent.nodeName] for node name [Database Monitoring]
[main] 03 Nov 2021 14:31:07,453 INFO AgentUtil - Full Agent Registration Info Resolver using selfService [false]
[main] 03 Nov 2021 14:31:07,453 INFO AgentUtil - Full Agent Registration Info Resolver using application name [Database Monitoring]
[main] 03 Nov 2021 14:31:07,454 INFO AgentUtil - Full Agent Registration Info Resolver using tier name [Database Monitoring]
[main] 03 Nov 2021 14:31:07,454 INFO AgentUtil - Full Agent Registration Info Resolver using node name [Database Monitoring]
[main] 03 Nov 2021 14:31:07,476 INFO AgentUtil - XML Controller Info Resolver found controller host [appmon.moh.gov.sa]
[main] 03 Nov 2021 14:31:07,476 INFO AgentUtil - XML Controller Info Resolver found controller port [443]
[main] 03 Nov 2021 14:31:07,498 INFO AgentUtil - XML Agent Account Info Resolver using account name [customer1]
[main] 03 Nov 2021 14:31:07,499 INFO AgentUtil - XML Agent Account Info Resolver using account access key [****]
[main] 03 Nov 2021 14:31:07,513 INFO AgentUtil - Keystore file /appdynamic/db-agent-21.9.0.2521/conf/cacerts.jks was not found
[main] 03 Nov 2021 14:31:10,004 INFO Agent - Default Database Agent resolved bootstrap info!
[main] 03 Nov 2021 14:31:10,151 INFO Agent - Started [Default Database Agent] Schedulers
[main] 03 Nov 2021 14:31:10,151 INFO Agent - Scheduling Default Database Agent Registration ....
[DBAgent-1] 03 Nov 2021 14:31:10,245 INFO RegistrationChannel - Controller host [appmon.moh.gov.sa]; Controller port [443]
[DBAgent-1] 03 Nov 2021 14:31:10,278 INFO RegistrationChannel - setting agent hostname [apdyn-pr-da1.moh.gov.sa]
[DBAgent-1] 03 Nov 2021 14:31:10,278 INFO RegistrationChannel - setting agent version [Database Agent v21.9.0.0 GA compatible with 4.5.2.0 Build Date 2021-09-22]
[DBAgent-1] 03 Nov 2021 14:31:10,278 INFO RegistrationChannel - setting agent properties [{dbagent-name=Default Database Agent, dbagent-launch-id=e726f4a3-517c-48ea-be67-77cdc4db9688}]
[DBAgent-1] 03 Nov 2021 14:31:10,278 INFO RegistrationChannel - setting agent install dir [/appdynamic/db-agent-21.9.0.2521]
[DBAgent-1] 03 Nov 2021 14:31:10,278 INFO RegistrationChannel - setting agent type [DB_AGENT]
[DBAgent-1] 03 Nov 2021 14:31:10,279 INFO RegistrationChannel - setting agent application [Database Monitoring]
[DBAgent-1] 03 Nov 2021 14:31:10,279 INFO RegistrationChannel - setting agent tier name [Database Monitoring]
[DBAgent-1] 03 Nov 2021 14:31:10,279 INFO RegistrationChannel - setting agent node name [Database Monitoring]
[DBAgent-1] 03 Nov 2021 14:31:10,279 INFO RegistrationChannel - Sending Registration request
[DBAgent-1] 03 Nov 2021 14:31:44,386 ERROR ControllerHttpRequestResponse - Fatal transport error while connecting to URL [/controller/instance/UNKNOWN_MACHINE_ID/systemagentregistration]: javax.net.ssl.SSLHandshakeException: Remote host closed connection during handshake
[DBAgent-1] 03 Nov 2021 14:31:44,387 WARN RegistrationChannel - Could not connect to the controller/invalid response from controller, cannot get registration information
[DBAgent-1] 03 Nov 2021 14:32:04,391 INFO RegistrationChannel - Controller host [appmon.moh.gov.sa]; Controller port [443]
[DBAgent-1] 03 Nov 2021 14:32:04,391 INFO RegistrationChannel - setting agent hostname [apdyn-pr-da1.moh.gov.sa]
[DBAgent-1] 03 Nov 2021 14:32:04,391 INFO RegistrationChannel - setting agent version [Database Agent v21.9.0.0 GA compatible with 4.5.2.0 Build Date 2021-09-22]
[DBAgent-1] 03 Nov 2021 14:32:04,391 INFO RegistrationChannel - setting agent properties [{dbagent-name=Default Database Agent, dbagent-launch-id=e726f4a3-517c-48ea-be67-77cdc4db9688}]
[DBAgent-1] 03 Nov 2021 14:32:04,391 INFO RegistrationChannel - setting agent install dir [/appdynamic/db-agent-21.9.0.2521]
[DBAgent-1] 03 Nov 2021 14:32:04,392 INFO RegistrationChannel - setting agent type [DB_AGENT]
[DBAgent-1] 03 Nov 2021 14:32:04,392 INFO RegistrationChannel - setting agent application [Database Monitoring]
[DBAgent-1] 03 Nov 2021 14:32:04,392 INFO RegistrationChannel - setting agent tier name [Database Monitoring]
[DBAgent-1] 03 Nov 2021 14:32:04,392 INFO RegistrationChannel - setting agent node name [Database Monitoring]
[DBAgent-1] 03 Nov 2021 14:32:04,392 INFO RegistrationChannel - Sending Registration request
[DBAgent-1] 03 Nov 2021 14:32:40,544 INFO Agent - Full certificate chain validation performed using default certificate file
[DBAgent-1] 03 Nov 2021 14:32:40,755 ERROR ControllerHttpRequestResponse - Fatal transport error while connecting to URL [/controller/instance/UNKNOWN_MACHINE_ID/systemagentregistration]: javax.net.ssl.SSLHandshakeException: Remote host closed connection during handshake
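An SSLHandshakeException against controller port 443 commonly means the agent is attempting a plain-HTTP (or incompatible-TLS) connection to an SSL-only port. One thing worth checking, as a sketch: confirm that SSL is enabled in the agent's conf/controller-info.xml. Host, account, and key values below are taken from the log above; the element names are the usual AppDynamics ones, but verify them against your agent version's shipped template:

```xml
<controller-info>
  <controller-host>appmon.moh.gov.sa</controller-host>
  <controller-port>443</controller-port>
  <!-- must be true when the controller port is SSL-only -->
  <controller-ssl-enabled>true</controller-ssl-enabled>
  <account-name>customer1</account-name>
  <account-access-key>****</account-access-key>
</controller-info>
```

If SSL is already enabled, the older JVM here (1.8.0_181) may also lack the cipher suites or TLS version the controller requires, so updating the JRE is another avenue.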
I am doing eval response = if("msg.RESPONSE"="200", "Success", "Fail"), and all of my msg.RESPONSE values are 200, but I still get Fail in the output. Per the Splunk docs, the value after the condition should be returned if the condition is true, but it's the reverse in my case. The logs are in JSON format, like: msg.RESPONSE: 200
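The likely cause: in eval, double quotes create a string literal, so "msg.RESPONSE" is compared as the literal text msg.RESPONSE, not as a field, and the condition is always false. Field names containing dots must be wrapped in single quotes:

```
| eval response = if('msg.RESPONSE' == "200", "Success", "Fail")
```

If msg.RESPONSE was extracted as a number rather than a string, compare against 200 without the quotes.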
Hi all, in classic Splunk XML dashboards it was very easy to create conditional dashboards that, for example, hide a panel when a token has a specific value. Is there any option to do this in Dashboard Studio?
Hi, Here's my query:

| mstats max(_value) avg(_value) min(_value) prestats=true WHERE metric_name="cpu.system" AND "index"="osnixperf" AND [| inputlookup Unix.csv] BY host span=1h
| stats Avg(_value) AS Avg1 BY host
| join
    [| mstats max(_value) avg(_value) min(_value) prestats=true WHERE metric_name="cpu.user" AND "index"="osnixperf" AND [| inputlookup Unix.csv] BY host span=1h
    | stats Avg(_value) AS Avg2 BY host]
| eval totalavg=Avg1+Avg2, totalavg=round(totalavg,2)

I need a timechart that shows the totalavg value, like the image below.
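One complication with the query above is that `stats ... BY host` discards _time, so there is nothing left to timechart. A possible rewrite, as a sketch: pull both metrics in a single mstats (avoiding the join entirely) and keep the time axis, then sum the two per-metric averages in timechart. All names come from the original query; the span is illustrative:

```
| mstats avg(_value) AS avg_cpu WHERE "index"="osnixperf"
    AND (metric_name="cpu.system" OR metric_name="cpu.user")
    AND [| inputlookup Unix.csv]
    BY host, metric_name span=1h
| timechart span=1h sum(avg_cpu) AS totalavg BY host
```

Since each hourly bucket holds one cpu.system and one cpu.user average per host, `sum(avg_cpu)` yields the same Avg1+Avg2 total, but per time bucket; rounding can be applied per series afterwards if needed.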
Hi, I am trying to ingest long JSON files into my Splunk index, where a record could contain more than 10000 characters. To prevent long records from getting truncated, I added a "TRUNCATE=0" into my props.conf, and the entire record was ingested into the index. All events are forwarded and stored in the index, but I'm having problems with fields that appear towards the end of the JSON records.  I'm currently testing with 2 files: File A has 382 records, of which 166 are long records.  File B has 252 records, of which all are long records.  All 634 events are returned with a simple search of the index, and I can see all fields in each event, regardless of how long the event is. However, not all fields are extracted and directly searchable. For example, one of the fields is called "name", and it appears towards the end of each JSON record. On the "Interesting fields" pane, under "name", it shows only a count of 216 events from File A, and none of the remaining 166 + 252 long events in Files A and B. This is the same for other fields that appear towards the end of each JSON record, but fields towards the beginning of the record show all 634 events. If I negate the 216 events, then these fields do not appear on the Fields pane at all. Also, while I'm not able to directly search for "name=<name in File B>", I can still select the field from the event and "add to search", and all 252 events would be returned. I'm not sure why these fields are not properly extracted even though they did not appear to be truncated. How can I extract them properly? Thank you.
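Since the full events are visibly intact in the index, a plausible explanation is not truncation at ingest but the search-time extraction limits: automatic key/value and JSON extraction stop looking after a fixed number of characters into each event, so fields near the end of very long records never get extracted. The relevant settings (with their usual defaults; verify against your version's limits.conf spec) can be raised on the search head:

```
# limits.conf (search tier)
[kv]
# how far into an event automatic key/value extraction looks
maxchars = 10240

[spath]
# how many characters search-time JSON/XML extraction processes
extraction_cutoff = 5000
```

Raising these values (e.g. above your longest record length) and restarting should make fields like "name" searchable directly; note that higher limits cost some search performance.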
Has anyone recently used the Splunk add-on for Tripwire? Their website does not have the correct installation file.
Hi All, I am not getting any user feedback data in the Teams add-on. I have checked the Microsoft Graph API, and it has an endpoint specifically for that, but the add-on still populates the data as null. Thoughts or ideas would be helpful. @Jason Cogner @Skyler Taylor @Robert Sisson
I want to add the in_usage and out_usage values from the below table. For example, I want to add in_usage to out_usage and get the result as a total; likewise for the other values. Can someone give me ideas for this?

_time            source                  status     Avg           metric_name
11/3/2021 5:02   Interface_Summary_Out   out_usage  16.01833333   GigabitEthernet0/1
11/3/2021 5:00   Interface_Summary_In    in_usage   5.555         GigabitEthernet0/1
11/3/2021 4:02   Interface_Summary_Out   out_usage  17.085        GigabitEthernet0/1
11/3/2021 4:00   Interface_Summary_In    in_usage   5.270833333   GigabitEthernet0/1
11/3/2021 3:02   Interface_Summary_Out   out_usage  17.425        GigabitEthernet0/1
11/3/2021 3:00   Interface_Summary_In    in_usage   5.48          GigabitEthernet0/1

Please refer to the attached screenshot for reference.
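Since the in_usage and out_usage rows for each hour carry slightly different timestamps (x:00 vs x:02), one sketch is to bin the events to the hour first, then pivot the two statuses into columns and add them. Field names are from the table above; the span is an assumption based on the visible cadence:

```
| bin _time span=1h
| stats sum(eval(if(status=="in_usage", Avg, 0)))  AS in_usage
        sum(eval(if(status=="out_usage", Avg, 0))) AS out_usage
        BY _time, metric_name
| eval total = round(in_usage + out_usage, 2)
```

Grouping by metric_name keeps per-interface totals separate if more interfaces than GigabitEthernet0/1 appear in the data.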
Hi all. I have a report set up in Splunk producing a visualisation we're embedding on our website. A member of the public has asked if they can instead get the raw data in JSON format. I don't want to create them a user in the system, and I'd really rather link them through our Azure API portal, where we have our other API endpoints for retrieving customer data. So, what I'm really wanting to do is work out how I can get the scheduled report data out of Splunk and into the Azure API. I'm aware this is not purely a Splunk question, but TBH I thought people here would be most likely to have the relevant experience, especially as I seem to need to two-step the REST queries to find the search IDs and then get the results. It all ended up being quite a lot more complicated than I expected, and I couldn't find any relevant how-to guides online either, so I thought I'd ask here.
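The two-step REST flow can be sketched with curl as below. Hostname, credentials, app, and report name are placeholders; in a production integration this would typically run server-side (e.g. in an Azure Function behind the API portal) using a Splunk authentication token rather than basic auth:

```shell
# 1) Dispatch the saved report and capture the search job ID (sid)
#    from the XML response
curl -k -u apiuser:password \
  https://splunk.example.com:8089/servicesNS/nobody/myapp/saved/searches/My%20Report/dispatch \
  -d trigger_actions=false
# response contains <sid>...</sid>

# 2) Once the job is done, fetch its results as JSON
curl -k -u apiuser:password \
  "https://splunk.example.com:8089/services/search/jobs/<sid>/results?output_mode=json"
```

The Azure side can cache that JSON payload and serve it from your existing API portal, so the public consumer never touches Splunk directly.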
I am running a query that gives me various percentile metrics in different rows, and I would like to format them into an easily readable table. For example, here is the current outcome after I run the query below:

index=my_indexer
| stats p50(startuptime) as "startuptime_p50", p90(startuptime) as "startuptime_p90", p99(startuptime) as "startuptime_p99", p50(render_time) as "render_time_p50", p90(render_time) as "render_time_p90", p99(render_time) as "render_time_p99", p50(foobar_time) as "foobar_time_p50", p90(foobar_time) as "foobar_time_p90", p99(foobar_time) as "foobar_time_p99"
| transpose

column                  row1
startuptime_p50         50
startuptime_p70         70
startuptime_p90         90
render_time_p50         51
render_time_p70         72
render_time_p90         93
foobar_time_p50         53
foobar_time_p70         74
foobar_time_p90         95

I would like to format the final table as follows (the column header is optional):

Marker      P50    P70    P90
startup     50     70     90
render      51     72     93
foobar      53     74     95

Thank you very much for your help.
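One common pattern for this reshaping, as a sketch: split the transposed field name on its trailing _pNN suffix with rex, then pivot with xyseries. It assumes the name always ends in an underscore plus percentile tag, and that the transposed value column is named "row 1" (as transpose emits it in recent versions; adjust if yours differs):

```
index=my_indexer
| stats p50(startuptime) as startuptime_p50 p90(startuptime) as startuptime_p90 ...
| transpose
| rex field=column "^(?<Marker>.+)_(?<ptile>p\d+)$"
| xyseries Marker ptile "row 1"
```

Cosmetic renaming (e.g. startuptime to startup, or upper-casing the percentile headers) can be layered on with eval/rename afterwards.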
Hello, I'm getting several APPCRASH errors with Event ID 1001. Do we have a solution? Below is the entire error message:

Fault bucket , type 0
Event Name: APPCRASH
Response: Not available
Cab Id: 0

Problem signature:
P1: splunk-perfmon.exe
P2: 2048.1280.24325.31539
P3: 5f057cc6
P4: splunk-perfmon.exe
P5: 2048.1280.24325.31539
P6: 5f057cc6
P7: c0000005
P8: 00000000009b5003
P9:
P10:

Attached files:
\\?\C:\ProgramData\Microsoft\Windows\WER\Temp\WERBA61.tmp.WERInternalMetadata.xml

These files may be available here:
\\?\C:\ProgramData\Microsoft\Windows\WER\ReportQueue\AppCrash_splunk-perfmon.e_b6ce8fa2681eaf73929792a742cf0cc962c88_d85046b8_bc677b10

Analysis symbol:
Rechecking for solution: 0
Report Id: 0e443e45-f399-4d76-a355-2c805ed3a192
Report Status: 131172
Hashed bucket:
Cab Guid: 0
Hello All, This may seem easy, but it's been quite tedious. How can I create one field that holds the value common to two separate strings? Example:

Field1 = 123_yyy
Field2 = 777_x_123_0
Desired result: NewField = 123

I have tried the below, but it only gives me "False" --- I know the whole strings don't match; I just want the part that does match. Any suggestions?

| eval matched=if(like(Field1,"%".Field2."%"),"True","False")
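The like() attempt fails because it asks whether Field1 contains all of Field2, not whether they share a token. A sketch under the assumption that the common value is always an underscore-delimited token: split Field1 and keep each token that also appears as a whole token in Field2. mvmap requires Splunk 8.0+, and building the match() regex from the token assumes the tokens are plain alphanumerics:

```
| eval parts = split(Field1, "_")
| eval NewField = mvmap(parts,
        if(match(Field2, "(^|_)" . parts . "(_|$)"), parts, null()))
```

Tokens that return null() are dropped from the multivalue result, so NewField ends up holding just the shared token(s), e.g. 123 in the example above.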
Sample JSON

{
  message: {
    application: hello
    deploy: {
      X: {
        A: {
          QPY: 14814
        }
      }
      Y: {
        A: {
          BWQ: 10967
          MQP: 1106
        }
      }
    }
    ABC: 4020
    DEF: 1532
  }
  severity: info
}
Sample JSON     { message: { application: hello deploy: { X: { A: { QPY: 14814 } } Y: { A: { BWQ: 10967 MQP: 1106 } } } ABC: 4020 DEF: 1532 } severity: info }     I'm trying to extract key names and values under message.deploy.Y.A (key names are not static) Goal is to put them in a line chart and track values over time. tried foreach but don't know how to use eval. Can someone help please     | foreach message.deploy.Y.A.*