All Topics

Hi everyone, I am using Splunk UI (the Splunk Design System) to develop a Splunk app with ReactJS. I want to send email from my app using the Splunk SMTP settings. The app lets users select the sender, the recipients (to, cc, bcc), and the body. So my questions are: Can we get the SMTP settings from Splunk? How can I send email from the app with ReactJS, and does the Splunk JS SDK support it? I know I could create a custom REST endpoint with a Python script to build a send-mail backend API, but that is very complex, so I want to find another way. Thanks!
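One avenue worth sketching (an assumption, not a confirmed answer): the SMTP settings live in the `email` stanza of alert_actions, which is readable over REST at `/services/configs/conf-alert_actions/email` if the role has read access. And instead of a custom endpoint, the app could run a search through the JS SDK that uses Splunk's built-in `sendemail` command, which sends through the server's configured SMTP settings by default. The addresses below are placeholders:

```
| makeresults
| sendemail to="user@example.com" cc="other@example.com"
    subject="Test from my app" message="Hello from my app" sendresults=false
```

The app would submit this as a search job via the SDK and let the search layer do the mailing, so no Python endpoint is needed.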
Hi, I have a problem in the Stream app: in some flows the wrong source and destination IPs are observed. For instance, I checked the original flow in Wireshark, and the original source IP and port were 192.168.1.1:56271 and the destination IP and port were 192.168.1.2:80, but in Stream the source and destination are swapped! Any suggestion on this weird issue?
Hello, everyone! I have a few questions about cleaning indexers: How is it performed in a clustered architecture? Is it really needed? Do I understand correctly that frozen buckets are deleted automatically?
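For context, retention is normally driven by indexes.conf rather than manual cleaning: buckets that age past the frozen threshold are deleted automatically unless an archive destination is configured. A minimal sketch (the index name is a placeholder; in a cluster this would be pushed from the manager node's configuration bundle so all peers share the same policy):

```
# indexes.conf sketch
[my_index]
# events older than ~90 days roll to frozen and are deleted,
# unless coldToFrozenDir or coldToFrozenScript is set
frozenTimePeriodInSecs = 7776000
```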
Does anyone know what naming convention is needed to onboard data from Corelight to Splunk? Do we need a naming convention like conn_<date>_<time>.log, or are conn.log and dns.log fine?
Hello, I need to install the Aruba TA; do you have any recommendations on how to proceed? Your recommendations will be highly appreciated. Thank you!
I've created a table of test results using stats list(), where the application name is only listed once against the group of tests it relates to:

Application   TestName       Outcome
Website       Search One     Passed
              Contact Page   Passed
              Order Form     Passed
Internal      Query Form     Passed
              Look Up        Passed

I would like to amend the table so that the Application is shown in the TestName column above the group of tests it relates to, like this:

TestName       Outcome
Website
  Search One   Passed
  Contact Page Passed
  Order Form   Passed
Internal
  Query Form   Passed
  Look Up      Passed

I know this breaks the normal table data layout, but for the purposes of my dashboard I think it will make it more readable.
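One way to sketch this on top of the existing stats list() output (an assumption about the field names in use): mvappend() can prepend the Application name as the first multivalue entry of TestName, with a blank entry padding Outcome so the rows stay aligned.

```
| stats list(TestName) as TestName, list(Outcome) as Outcome by Application
| eval TestName=mvappend(Application, TestName)
| eval Outcome=mvappend(" ", Outcome)
| fields - Application
```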
1. I have the logs below (this pattern repeats throughout the file):

server6z: INFO could not find the logs under this path(apimanager call)
server6z: INFO could not find the logs under this path(apimanager call)
server6z: INFO could not find the logs under this path(apimanager call), unable to find the logs from this server.

I have set in my props.conf:

SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)

but I am seeing an error like "failed to parse timestamp, defaulting to file modtime". How do I resolve this issue?

2. I am getting the same issue as above for this type of logs as well. Sample logs (again repeating):

/path/svgt/app/loadscript/file.com: coloumn12: /path/svgt/app/loadscript/file.com: not able to view file
/applicatins/dir/wrd-start/loadscript/filedata.com: line24: /applicatins/dir/wrd start/loadscript/filedata.com: not able to read the files
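Since neither sample contains a timestamp, the warning is expected: Splunk searches each event for a time and falls back to the file's modification time when it finds none. A sketch of silencing it by stamping events with the current index time (the sourcetype name is a placeholder):

```
# props.conf sketch
[my_sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
# the events carry no timestamp, so stop Splunk searching for one
DATETIME_CONFIG = CURRENT
```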
Hello, when I run a query I get the results I need in a table in Splunk, but when I download the .csv file, the timestamp field changes to an incorrect date and year. Does anyone know how I can fix this?
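A common workaround (assuming the issue is a spreadsheet reinterpreting the raw epoch or locale-formatted time): render the timestamp as an explicit string before exporting, so the CSV carries unambiguous text. The field names in the table line are placeholders:

```
| eval export_time=strftime(_time, "%Y-%m-%d %H:%M:%S")
| table export_time host sourcetype
```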
Does anyone know a command to monitor the web page-load response time of a Splunk page/server? For example, when you navigate from one page to another, or from one search to another.
Hi Experts, we have created a new role with the same capabilities as the user role, but we want to add another capability to this role to authorize its members to enable or disable alerts as required. Thanks heaps!
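A sketch of the shape such a role might take in authorize.conf (the role name is a placeholder, and the exact capability is an assumption to verify against the authorize.conf spec; toggling an alert also requires write permission on the saved search object itself):

```
# authorize.conf sketch
[role_alert_manager]
importRoles = user
# needed to own/modify scheduled searches (alerts)
schedule_search = enabled
```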
I have a search:

```
index=xyz data.id=1
```

which gives me a list of unique ids [1,2,3,4,5]. I'm not sure how to store that result so it can be used in another query, e.g. | stats count by uniqueId. Now I want to use the list above in a second query and look up the values. Query 2 will return 1 -> Good, 2 -> Bad, 3 -> Neutral, etc. (from index2). I want to use the result [1,2,3,4] in the next query, which will give me some extra information based on the ID only. E.g., query 2 has index=xyz data.msg.id=1, data.xyz.val=good. How can we do that? I am trying something like this:

```
index="test" actionSubCateg IN (xyz) landingPageURL="xyz/?search=game_gupta" data.msg.queryName="query FindBtf"
| table data.msg.id
[ search index="test" actionSubCateg="game" | rename data.DATA.id as id | fields id, scope | table id, scope]
```
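The usual pattern for this is a subsearch: the inner search returns the id values, which are substituted as filters into the outer search. A sketch (the index and field names are taken from the post and may need adjusting; the rename makes the subsearch output match the second query's field name):

```
index=index2
    [ search index=xyz data.id=*
      | dedup data.id
      | rename data.id as data.msg.id
      | fields data.msg.id ]
| table data.msg.id data.xyz.val
```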
Hi, I am trying to capture all event="DcSyncs" events from my index. This index also contains event="DcID". The "DcSyncs" event can occur at any time (pretty often, though), but "DcID" occurs once every 8 hours. I am trying to get all "DcSyncs" events, take the HostName field of those results, and check whether that HostName also has a result for event="DcID"; if it does, filter it out of the results. To summarize: I am trying to collect all HostNames that have a "DcSyncs" event but no "DcID" event. This is set up to run on an 8-hour interval, so I don't think I need time logic in the search. I keep trying different variations, but I think I am way off. Any help is appreciated.

index=MyIndex event="DcSyncs" | join HostName [search NOT index=MyIndex event="DcID"] | table _time HostName event
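A sketch of the standard idiom for "has A but not B": instead of join, exclude via a NOT subsearch that returns only the HostName values seen with "DcID":

```
index=MyIndex event="DcSyncs"
    NOT [ search index=MyIndex event="DcID" | dedup HostName | fields HostName ]
| table _time HostName event
```

The subsearch emits HostName=value pairs, so the outer search drops every "DcSyncs" event whose HostName also produced a "DcID" event in the same time range.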
I have 3 columns that I'm using: URL, website, count. The URL is too long, and I would like to reduce just its displayed size while keeping all the words. I tried the following in the dashboard code, but it didn't work:

}, "options": { "columnFormat": { "URL": { "width": 100
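If the column-width option can't be made to work, one SPL-side alternative (a sketch, assuming the URLs share a common scheme/host prefix that is redundant in the display) is to trim the prefix in the search itself so the column is naturally narrower while the path words survive:

```
| eval URL_display=replace(URL, "^https?://", "")
| table URL_display website count
```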
I guess my real question is: how do I move Splunk from one company to another, including some but not all of the data, and the indexes for the selected data? I see I can copy config and indexes from $SPLUNK_HOME, but indexes are (I guess) just metadata referencing other data. So a search will read the index, then use that to get the data to return and display. I am going to guess Splunk makes a copy of the indexed data, because data sources can disappear for various reasons, and that would not be ideal for later searches.
I am trying to add fields from a lookup table. However, the matching field in the lookup is a multivalue field. I need to expand the matching field, but I don't know how to combine the lookup command with a multivalue command.

Lookup file assets.csv:

ip, host
10.10.1.1|10.100.1.1|10.10.200.1, srv1
10.10.1.2|10.100.1.2|10.10.200.2, srv2

Original search that returns an IP value:

| [lookup assets.csv ip OUTPUT host | makemv delim="|" ip]

does not work.
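One workable pattern (a sketch): the lookup command can't split its own key field, so expand the lookup into one row per IP first, write that out, and then do an ordinary lookup against the expanded file. The expanded filename is a placeholder:

```
| inputlookup assets.csv
| makemv delim="|" ip
| mvexpand ip
| outputlookup assets_expanded.csv
```

After that one-off (or scheduled) expansion, the event search becomes a plain `| lookup assets_expanded.csv ip OUTPUT host`.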
I have a list of IPs and want to check whether they are sending data to Splunk, using a single query. The devices in this list need troubleshooting. Is there some query I could run referencing this list to get a stats output or something similar? Any guidance, please?
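A sketch of one single-query approach, assuming the list is loaded as a lookup (the filename and its `host` column are placeholders, and this assumes the devices report their IP in the indexed `host` field): tstats summarizes what each host last sent, and the subsearch restricts it to the devices on the list.

```
| tstats count latest(_time) as last_seen where index=* by host
    [ | inputlookup device_ips.csv | fields host ]
| eval last_seen=strftime(last_seen, "%Y-%m-%d %H:%M:%S")
```

Hosts on the list that are missing from the output are the ones not sending data.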
I have Splunk logs as given below. However, I want to display the fields between the square brackets "[ ]" in a table as given below. Please advise.

Expected query result in a table:

sqsMsgId                             | snsMsgId                             | requestId
dec6c564-9e1c-4d0f-8e5e-ac9dc7bdf14a | 7d81b4cf-43c0-5bb4-8370-ef064a78da16 | d487108c-863f-5ab2-96df-4b458f97c74e

My Splunk logs:

{"level":"info","message":"[sqsMsgId=dec6c564-9e1c-4d0f-8e5e-ac9dc7bdf14a | snsMsgId=7d81b4cf-43c0-5bb4-8370-ef064a78da16 | workItemKey=CAMP:MI4:ORG_ID:103857:7fbf0f46-4131-404d-9a13-57cdff7c473a | requestId=d487108c-863f-5ab2-96df-4b458f97c74e | status=SUCCESS | ags=CAMP | component=MI4 | duration=383]","requestId":"d487108c-863f-5ab2-96df-4b458f97c74e"}
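A sketch of pulling those keys out with rex (this assumes the JSON `message` field is already extracted; if not, run the same rex against `_raw` instead):

```
| rex field=message "sqsMsgId=(?<sqsMsgId>[^ |\]]+)"
| rex field=message "snsMsgId=(?<snsMsgId>[^ |\]]+)"
| rex field=message "requestId=(?<requestId>[^ |\]]+)"
| table sqsMsgId snsMsgId requestId
```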
@jkat54 I ran into an error with a long data parameter: command="curl", field larger than field limit (10485760). I see this value in a bunch of internals.py files under csv.field_size_limit, and I changed every possible one to a smaller value to see if the error message changes, but it is still the same. Do you know which config controls this issue?
Do we have a Terraform provider for replicating Splunk alerts across multiple environments? We have search queries and alerts created in one environment; can we promote the same alerts to other environments? Is there a way to automate this, or a Terraform provider for replicating alerts across environments?
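There is a Splunk-published Terraform provider (`splunk/splunk`) with a saved-searches resource that covers alerts. A sketch of the shape it might take (the attribute names and values here are assumptions to check against the provider docs; one definition can then be applied to each environment via separate provider configurations or workspaces):

```
terraform {
  required_providers {
    splunk = {
      source = "splunk/splunk"
    }
  }
}

# sketch: a scheduled search promoted to every environment
resource "splunk_saved_searches" "my_alert" {
  name          = "my_alert"
  search        = "index=main error | stats count"
  cron_schedule = "*/15 * * * *"
  is_scheduled  = true
}
```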
Hi all, I want to see which users are using Splunk the most. I am using the query below:

|rest /services/authentication/users splunk_server=local

Here I am getting all users, but I need the list of users who are using Splunk the most.
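The REST users endpoint only lists accounts; activity lives in the audit index. A sketch of ranking users by search activity instead:

```
index=_audit action=search info=granted
| stats count as searches by user
| sort - searches
```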