All Topics

I want to extract all words starting with "tt_" from the Message field and display them as in the table below. In the Message field, "..." indicates some random text in between.

Input:

| Message                                 | ID |
| ... tt_1 ... tt_2 ... tt_9 ... tt_3 ... | 1  |
| ... tt_6 ... tt_4 ... tt_5 ...          | 2  |

Output:

| Message                                 | ID | TT                  |
| ... tt_1 ... tt_2 ... tt_9 ... tt_3 ... | 1  | tt_1 tt_2 tt_9 tt_3 |
| ... tt_6 ... tt_4 ... tt_5 ...          | 2  | tt_6 tt_4 tt_5      |

Can anyone help me with the Splunk query for this?
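A minimal sketch of one way to do this, assuming the text lives in a field called Message: `rex` with `max_match=0` collects every match into a multivalue field, and `mvjoin` flattens it into a space-separated string.

```
... your base search ...
| rex field=Message max_match=0 "(?<TT>\btt_\w+)"
| eval TT=mvjoin(TT, " ")
| table Message ID TT
```

If Message is not yet an extracted field, run the `rex` against `field=_raw` instead.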
Hi, as AppDynamics is part of Cisco, what is the procedure for Cisco employees to request a fully functional AppDynamics account for lab and testing purposes?
Hi all, I am trying to build a query that only shows the NEW results compared to yesterday. I would like to get an alert and data ONLY if the message/key is new today compared to the results yesterday. For example:

{query} | stats count by key

Yesterday, the query returned "key1" and "key2":

| key  | count |
| key1 | 10    |
| key2 | 5     |

Today, the results include "key1" and "key3". I would like to get the count of "key3" only, as it is new today and didn't show up yesterday:

| key  | count |
| key3 | 15    |

Thanks in advance!
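One possible sketch, assuming the base search can cover both days in a single run: bucket events into "yesterday" and "today", count each per key, and keep only keys that appeared today but not yesterday.

```
{query} earliest=-1d@d latest=now
| eval day=if(_time < relative_time(now(), "@d"), "yesterday", "today")
| stats count(eval(day="today")) as today_count count(eval(day="yesterday")) as yesterday_count by key
| where today_count > 0 AND yesterday_count = 0
| rename today_count as count
| table key count
```

For heavier searches, a common alternative is to write yesterday's keys to a lookup on a schedule and filter today's results against it.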
Hi, I'm currently facing a problem where Splunk can detect all the files in my directory, but when searching, Splunk cannot find all of them. Any ideas?
Hello, what is the recommended way to create apps from the Splunk CLI? Do you think > $SPLUNK_HOME/etc/apps/splunkdj createapp MyAppName should work? Your recommendation will be highly appreciated, thank you.
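For what it's worth, `splunkdj` was part of the old Splunk Web Framework (Django bindings) and is not the usual route. A sketch of the standard CLI approach (the `barebones` template name may vary by Splunk version):

```
# run from the Splunk instance
cd $SPLUNK_HOME/bin
./splunk create app MyAppName -template barebones
```

This scaffolds the app under $SPLUNK_HOME/etc/apps/MyAppName; a restart or debug/refresh may be needed before it appears in Splunk Web.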
Hi, has anyone tried using the Node.js agent to see if it will work with detecting the Next.js framework? Next.js is an open-source web development framework built on top of Node.js, so I don't know if it will at least partially work.
In the iOS mobile app, the time range picker for all the dashboards defaults to 15 minutes, instead of 'Today' as in the web version. How do I update the time range picker's default value to match the web?
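For reference, in a Simple XML dashboard the default time range is pinned on the time input itself; whether the iOS app honors this default is a separate question, but a sketch of the dashboard side looks like (token name is illustrative):

```
<input type="time" token="global_time" searchWhenChanged="true">
  <label>Time Range</label>
  <default>
    <earliest>@d</earliest>
    <latest>now</latest>
  </default>
</input>
```

`@d`/`now` corresponds to the 'Today' preset; panels must reference `$global_time.earliest$` and `$global_time.latest$` for the default to apply.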
How do I configure my AWS application (which is mostly Lambda functions called by state machines) to properly propagate trace context? Right now I see traces that represent portions of my state machines, but not the whole state machine. Can Splunk ingest the X-Ray trace ID generated by AWS Step Functions (if I turn on X-Ray tracing for the step function)? I am assuming that without that trace ID being propagated, Splunk APM won't be able to track Lambda functions across a state machine.
I have three servers total in a Windows deployment: a Splunk search server, a Splunk index server, and a Splunk deployment server. I would like to upgrade the KV store, but I'm not clear whether I need to use the cluster implementation instructions or the single KV store instructions. I am currently running 8.2.6 and upgraded a few months ago from 8.0.x.

When I run the command "start-shcluster-migration kvstore -storageEngine wiredTiger -isDryRun true", I get "Admin handler 'shclustercaptainkvstoremigrate' not found". This is Step 1 in a clustered KV store setup.

If I try the REST API option for Step 1, the command errors out because -d is listed twice in the command:

curl -k -u admin:changeme https://localhost:8089/services/shcluster/captain/kvmigrate/start -d storageEngine=wiredTiger -d isDryRun=true

Also, in the single deployment instructions, you edit the server.conf file:

[kvstore]
storageEngineMigration=true

I don't see this in the cluster implementation instructions, unless I missed something.
Hello everyone, I am trying to design a solution for a use case where a team wants to review FIX messages and use that data for application observability. After reading the article (https://www.splunk.com/en_us/blog/customers/splunk-in-financial-services.html?_gl=1*z1b7wa*_ga*MTkzMzA1MzUxOC4xNjA3MzY5MDc3*_gid*Nzc1NDIxOTMzLjE2NTQ1NDE4Njc.&_ga=2.37788645.775421933.1654541867-1933053518.1607369077), it looks like there is an app, translatefix, that could be used. However, I'm not able to find the app on Splunkbase. Can someone please advise whether the add-on exists, or if there is a more feasible solution available?
Dears, I am new to Splunk. I just installed the trial version through wget. Splunkd is running, but I am unable to connect on port 8000 even after adding inbound rules in AWS. What needs to be done? Can anyone help me with this? Thanks in advance.
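A quick sketch of how to narrow this down, run on the instance itself (paths assume a default install; `ss` may be `netstat` on older distros):

```
# is anything actually listening on 8000?
sudo ss -tlnp | grep 8000

# what port does Splunk think the web UI is on?
$SPLUNK_HOME/bin/splunk show web-port

# is the web server disabled in local config?
grep -A3 '\[settings\]' $SPLUNK_HOME/etc/system/local/web.conf
```

If Splunk Web is listening locally but unreachable remotely, the usual culprits are the OS firewall (firewalld/iptables) or a security group attached to a different interface, rather than Splunk itself.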
We are excited to announce the preview of Splunk ITSI custom threshold windows (CTW). ITSI CTW allows you to adjust your severity levels when expected abnormal behavior may arise, e.g. public holidays, the peak day of the year or month, or a large retail moment like Black Friday.

Need access to the ITSI CTW preview? Complete this brief application, and we will contact you if there is space and availability to participate!

Already have access to the preview?

Want to access product docs? The ITSI custom threshold windows Docs offer detailed guidance on how to use the feature.

Want to request more features? Add your ideas and vote on other ideas at the ITSI custom threshold windows Ideas Portal.

Please reply to the thread below with any questions or to get support from the Splunk team. Our product and engineering teams are subscribed to this post and will be checking for feedback and questions!
Hi, I have a table as the main search using dbxquery below:

| dbxquery connection=my_connection query="SELECT id, start_date, end_date FROM my_table"

Sample records:

id, start_date, end_date
1, 2020-01-01, 2020-01-04
2, 2020-01-03, 2020-01-05
......

And I have another lookup CSV with only two columns:

date, amount
2020-01-01, 10
2020-01-02, 20
2020-01-03, 10
2020-01-04, 10
2020-01-05, 20
......

The output I want is:

id, start_date, end_date, total
1, 2020-01-01, 2020-01-04, 50   (total of 2020-01-01 to 2020-01-04: 10+20+10+10)
2, 2020-01-03, 2020-01-05, 40   (total of 2020-01-03 to 2020-01-05: 10+10+20)

What could be the best way to get this done? Thanks in advance!
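One possible sketch: expand each row into one event per day of its range with `mvrange`/`mvexpand`, pull the amount from the lookup, and sum back up. The lookup file name `my_amounts.csv` is a placeholder; substitute your actual lookup.

```
| dbxquery connection=my_connection query="SELECT id, start_date, end_date FROM my_table"
| eval days=mvrange(strptime(start_date, "%Y-%m-%d"), strptime(end_date, "%Y-%m-%d") + 86400, 86400)
| mvexpand days
| eval date=strftime(days, "%Y-%m-%d")
| lookup my_amounts.csv date OUTPUT amount
| stats sum(amount) as total by id start_date end_date
```

Note that `mvexpand` multiplies the row count by the range length, so very long date ranges or large result sets may hit memory limits.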
With the Cisco ACI App for Splunk, when I enable this collection, it creates a huge load on the APIC:

[script://$SPLUNK_HOME/etc/apps/TA_cisco-ACI/bin/collect.py -classInfo aaaModLR faultRecord eventRecord]

I have attempted to widen the interval, but that just reduces how often the load happens; the APIC is almost unusable while the collection is running. I removed the classes one at a time, and it appears that polling eventRecord is what causes the drag on the APIC. I thought Splunk would only pull in the new information since the last poll, but that does not appear to be what is actually happening. Is this expected? Is there a way to remedy this issue?
Do the resulting files from a "dump" command have a TTL? I think they must, since the files I created on Friday no longer exist. Here is the search I am using to create the files:

index="myIndexName" sourcetype="mySourcetype" myFilterField IN(123ABC, 456DEF, 789GHI)
| dump basefilename=ABCCorp_06-06-22_0800_01330_ rollsize=1000 compress=5 format=raw
| table *

Thank you.
Hi there, I have this type of event coming into Splunk:

```
[redacted:54407 24943076666] Processing MessageDispatcher.deliver_batch([#<Message::Queued:0x0000aaab14d8f418 @id=10440927, @created_at=Fri, 03 Jun 2022 14:21:43.890133282 UTC +00:00>, #<Message::Queued:0x0000aaab14dbc8c8 @id=10440928, @created_at=Fri, 03 Jun 2022 14:21:43.896693884 UTC +00:00>]{"tag":"something","strand":null,"singleton":null,"priority":25,"attempts":0,"created_at":"2022-06-03T14:21:43Z","max_attempts":15,"source":"hostname:redacted,pid:29920"}
```

I would like to extract all of the JSON fields dynamically without individually pulling them out with multiple rex extractions. I have tried the following, but I am not seeing the JSON fields being parsed. `myjson` is successfully extracted, but spath does not pull out individual fields from the JSON:

```
index="myindex" source="mysource"
| rex field=_raw "(?<myjson>\{.+\})"
| spath myjson
```
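A likely fix, sketched below: `spath` interprets a bare first argument as a *path* within the event, not as the field to parse, so `spath myjson` looks for a path named myjson inside _raw. Passing the field via `input=` tells spath to parse the extracted JSON instead.

```
index="myindex" source="mysource"
| rex field=_raw "(?<myjson>\{.+\})"
| spath input=myjson
```

Since the greedy `\{.+\}` could over-match if an event ever contains multiple braces, a non-greedy or anchored pattern may be safer, but the `input=` change is the key difference.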
Hi, I am trying to create a Splunk app that mimics as much of the Search and Reporting functionality as possible, with some additional customizations. At the moment I am trying to import the field_extractor view into my application; however, I do not see documentation to support incorporating views that do not exist in the splunkjs/mvc directory. Is this possible?
Hello, I have a number of unique searches for various infrastructure resources. I would like to create a dashboard that builds a chart based on the chosen entry from a dropdown. Unfortunately, there's no easy way to create a base search and use tokens. In other words, I would like each input value to reference a specific search in dataSources. Example dropdown:

input selected value 1 = "dataSources": "search_1"
input selected value 2 = "dataSources": "search_2"
input selected value 3 = "dataSources": "search_3"

I could not find any documentation with examples of something similar. Thanks in advance.
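One workaround worth sketching, assuming this is Dashboard Studio (the "dataSources" key suggests so) and assuming each unique search is saved as a report: instead of swapping dataSources per input value, use a single datasource whose query substitutes the dropdown token, with the dropdown values set to the saved-report names. The token and report names below are illustrative.

```
"ds_dynamic": {
  "type": "ds.search",
  "options": {
    "query": "| savedsearch $resource_dropdown$"
  }
}
```

The chart then always points at `ds_dynamic`, and changing the dropdown reruns the datasource with the selected report. Whether this fits depends on the searches sharing a compatible output shape for the chart.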
Hello, does anyone know a fix for this error with the Tenable Add-On for Splunk? After upgrading Splunk from 8.0.5 to 8.2.6, I have this blinking red on my indexer:

"Unable to initialize modular input "tenable_sc" defined in the app "Splunk_TA_nessus": Introspecting scheme=tenable_sc: script running failed (PID 753455 exited with code 1)"

Thanks a lot for any advice! Have a nice day!!