All Topics


I have this output in a field, with a lot of blank spaces. What would be the best way to convert this data into a table? Or maybe a regex to parse it out better?

Start TREATMENTING ROUTES. TREATMENTS IS: GNCT   1 T12023   2 LKOUTWDF   POSITIONS ROUTES. POSITIONS IS: TOPS   1 CGHBRAB21053TBX   N S3T55NS   End.
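One approach, assuming the sections always appear in this order, is to capture the text between the labels and split on runs of whitespace; the same patterns can be dropped into SPL's rex command. A sketch in Python for checking the regexes against the sample:

```python
import re

raw = ("Start TREATMENTING ROUTES. TREATMENTS IS: GNCT   1 T12023   2 LKOUTWDF   "
       "POSITIONS ROUTES. POSITIONS IS: TOPS   1 CGHBRAB21053TBX   N S3T55NS   End.")

# Capture the text between the section labels, then split on runs of whitespace.
treatments = re.search(r"TREATMENTS IS:\s*(.*?)\s*POSITIONS ROUTES", raw).group(1)
positions  = re.search(r"POSITIONS IS:\s*(.*?)\s*End\.", raw).group(1)

treatments_list = re.split(r"\s+", treatments)
positions_list  = re.split(r"\s+", positions)
print(treatments_list)  # ['GNCT', '1', 'T12023', '2', 'LKOUTWDF']
print(positions_list)   # ['TOPS', '1', 'CGHBRAB21053TBX', 'N', 'S3T55NS']
```

In Splunk the equivalent would be a rex per section followed by makemv to split on whitespace, then a table over the multivalue fields.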
Hello, I'm having a problem with mvexpand in Splunk. I'm getting the following error: command.mvexpand: output will be truncated at 1103400 results due to excessive memory usage. Memory threshold of 500MB as configured in limits.conf / [mvexpand] / max_mem_usage_mb has been reached.

Doing some searching here on Answers I came across this previous answer: https://answers.splunk.com/answers/98620/mvexpand-gives-mvexpand-output-will-be-truncated-due-to-exc... Although that solution seemed to help a lot of people, it did not help me, and I don't see a fix anywhere else. If anyone has some advice it would be most helpful. Thanks! On top of that, is it possible to improve this range? Here is my search:

index=_raw UserName=* timeformat="%d-%m-%YT%H:%M:%S" earliest="01-12-2021T00:00:00" latest="02-12-2021T23:59:00" | stats values(_time) as Time by UserName | eval i = mvrange(0,20) | mvexpand i | eval reconnection=if(UserName==UserName, tonumber(mvindex(Time,i+1))-tonumber(mvindex(Time,i)), "falha") | where reconnection>0 AND reconnection<1200 | eval reconnection=tostring(reconnection, "duration") | chart count by reconnection
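For this particular gap-between-events use case, the mvexpand cross-product can often be avoided entirely: streamstats can carry the previous _time per user across the raw events, so memory use stays flat regardless of how many events each user has. A sketch under that assumption (same index and time window as the original search; untested against this data):

```
index=_raw UserName=* timeformat="%d-%m-%YT%H:%M:%S" earliest="01-12-2021T00:00:00" latest="02-12-2021T23:59:00"
| sort 0 UserName _time
| streamstats current=f window=1 last(_time) as prev_time by UserName
| eval reconnection = _time - prev_time
| where reconnection > 0 AND reconnection < 1200
| eval reconnection = tostring(reconnection, "duration")
| chart count by reconnection
```

If mvexpand really is required, the direct fix for the error is raising max_mem_usage_mb under the [mvexpand] stanza in limits.conf, but a streamstats rewrite usually scales better.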
Need help trimming the month name in a field. Example: the input November 29, 2021 2:02:33 PM should become Nov 29, 2021 2:02:33 PM, for all months in the field. Thanks.
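In SPL this is a strptime/strftime round trip in an eval: %B is the full month name and %b is the abbreviation. The format strings can be sanity-checked in Python, which uses the same codes:

```python
from datetime import datetime

s = "November 29, 2021 2:02:33 PM"
# %B = full month name, %b = abbreviated; %I/%p handle the 12-hour clock
dt = datetime.strptime(s, "%B %d, %Y %I:%M:%S %p")
out = dt.strftime("%b %d, %Y %I:%M:%S %p")
print(out)  # Nov 29, 2021 02:02:33 PM  (note %I zero-pads the hour)
```

The Splunk equivalent would be along the lines of | eval newfield=strftime(strptime(field, "%B %d, %Y %I:%M:%S %p"), "%b %d, %Y %I:%M:%S %p"), with field as a placeholder for your field name.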
We have a couple of critical batch jobs running every night and we need a way to monitor them. The jobs are doing life cycle management, so it's very important that we are sure the jobs have actually been started, but of course also that they ended without issues. Does anyone have experience with setting up monitoring and alerting for this in AppDynamics?
Hi, we have a situation: when trying to POST a request to an external API from JavaScript, we get a timeout error, while hitting the same URL through curl gives the SSL certificate error below.

"curl: (60) SSL certificate problem: unable to get local issuer certificate More details here: https://curl.haxx.se/docs/sslcerts.html curl failed to verify the legitimacy of the server and therefore could not establish a secure connection to it. To learn more about this situation and how to fix it, please visit the web page mentioned above."

I have attached the error snip and the related JS here. JS snippet:

const userAction = async (pid) => {
  const url = 'https://xyz.com:443/PID?ppid=' + pid;
  await fetch(url, { method: 'POST', mode: 'cors' })
    .then(response => response.text())
    .then(xmlString => console.log($.parseXML(xmlString)))
    .catch(error => console.log(error));
};

Has anyone come across this issue? Any help is appreciated. Thanks.
Hi all, in our environment servers are put under maintenance (servers are shut down) at a particular time of day, so we need to disable the correlation searches during this period to avoid incidents being created. How can we disable/enable correlation searches during a particular time? Please let me know if you have any suggestions. Thanks and regards.
1. I have installed the universal forwarder and have a Splunk Cloud account. 2. On the laptop, in the universal forwarder, I downloaded the credentials file and executed the command: /opt/splunkforwarder/bin/splunk install app /tmp/splunkclouduf.spl. 3. I restarted the Splunk process.

No data went in; may I know why?

Note: I am trying to forward the Windows event log from the same host where I installed the Splunk universal forwarder.
Two concerns come up when moving on-prem data to the cloud:

1. Data sensitivity: what if confidential data is lost (in transit or at rest)?
2. Authentication: logging into the cloud there is no 2FA or anything, just a username and password, and the user can simply log in like this.

I would like to ask cloud users: how do you overcome these two concerns when shifting your data to Splunk Cloud?
After creating a dashboard with 6 panels, all the jobs are getting queued, and the search lag health status is yellow.

Search Lag Root Cause(s): The percentage of non high priority searches lagged (50%) over the last 24 hours is very high and exceeded the yellow thresholds (40%) on this Splunk instance. Total Searches that were part of this percentage=2. Total lagged Searches=1

Please help me resolve these issues.
I have created a search that is working fine; it sends an email when the alert condition is met. My question is: is there any way I can add/update the email address in my alert using a curl command? And can I also update my alert's search query using curl? Thanks, regards.
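Yes on both counts: alerts are saved searches, and saved searches can be edited over Splunk's REST API, so a curl POST can change the recipient list and the search string. A sketch, assuming an alert named MyAlert living in the search app with admin as owner (names, host, and credentials are placeholders to adjust):

```
# update the e-mail recipients of the alert
curl -k -u admin:changeme \
  https://localhost:8089/servicesNS/admin/search/saved/searches/MyAlert \
  -d action.email.to="first@example.com,second@example.com"

# update the alert's search query (the SPL goes in the "search" parameter)
curl -k -u admin:changeme \
  https://localhost:8089/servicesNS/admin/search/saved/searches/MyAlert \
  --data-urlencode search="index=main error | stats count"
```

These commands need a reachable splunkd management port (8089) and a user with write permission on the saved search.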
I would like to get some values from Splunk into a shell script. I am creating an alert for those values and using a webhook to invoke the shell script; I am using the webhook link to trigger the script, but I don't know how to get the Splunk search results into the shell script. Can someone suggest which command/code should be used to capture the values from Splunk?
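A webhook alert action doesn't pass arguments to a script; it POSTs a JSON body to the webhook URL, and whatever listens at that URL has to parse the body. The payload generally carries the first result row under "result" plus metadata such as "search_name" and "sid". A minimal Python sketch of pulling values out (the payload below is a hypothetical example, not your data):

```python
import json

# hypothetical payload, shaped like what a Splunk webhook alert POSTs
payload = json.loads("""
{
  "search_name": "my_alert",
  "sid": "scheduler__admin__search__RMD5_at_1638400000_100",
  "result": {"host": "web01", "count": "42"}
}
""")

host  = payload["result"]["host"]
count = payload["result"]["count"]
print(payload["search_name"], host, count)
```

To hand the values to a shell script, a small endpoint like this can invoke the script with the parsed values as arguments, or write them to a file the script reads. An alternative worth considering is a custom alert action script instead of a webhook, which Splunk runs locally with the results available to it.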
I am getting this data when pulling events from a sourcetype: Name=Microsoft Hyper-V Network Adapter _2. Now I want to show this in a table, but when I use --> table Name it shows only Microsoft, i.e. only the first word. How can I show the whole value of the Name field? Please help.
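Splunk's automatic key=value extraction stops at the first space, which is why only "Microsoft" survives. An inline extraction such as | rex "Name=(?<Name>.+)" captures the rest of the line (assuming the value runs to the end of the line; adjust the terminator if other fields follow). The pattern can be checked in Python:

```python
import re

event = "Name=Microsoft Hyper-V Network Adapter _2"
# capture everything after "Name=" to the end of the line
name = re.search(r"Name=(.+)$", event).group(1)
print(name)  # Microsoft Hyper-V Network Adapter _2
```

For a permanent fix, the same regex can go into a search-time field extraction (props.conf/transforms.conf) so table Name works without an inline rex.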
I have event data from the search result in the format shown in the image. I want to extract the following fields with their corresponding values, excluding the remaining fields in the event data/string:

id = b0ad6627-a6e1-4f5e-92f4-9c2deaa1ff2a_1cd4b06f83caac09
start_date_time = 1638433382 (value always required)
end_date_time = null or 1638433491 (if value not present)
current = <value> (only if the field exists) (6 in the example)
total = <value> (6 in the example)
status_type = COMPLETED
bot_uri = repository:///Automation%20Anywhere/Bots/Test%20A2019/AALogTestBot

I tried using <search query> | rex field=_raw "(?msi)(?<ev_field>\{.+\}$)" | spath input=ev_field to extract all the fields in the event data, but it did not change the search results. Any suggestion or help is highly appreciated; I am a newbie to Splunk. TIA

12/2/21 7:24:52.106 PM   2021-Dec-02 Thu 19:24:52.106 INFO [pool-12-thread-1] - com.automationanywhere.nodemanager.service.impl.NodeMessagingServiceImpl - {} - writeSuccess(NodeMessagingServiceImpl.java:395) - Message eventData { id: "b0ad6627-a6e1-4f5e-92f4-9c2deaa1ff2a_1cd4b06f83caac09" bot_execution { start_date_time { seconds: 1638433382 nanos: 210329300 } end_date_time { seconds: 1638433491 nanos: 993822800 } progress { current: 6 total: 6 percentage: 100 } status_type: COMPLETED bot_uri: "repository:///Automation%20Anywhere/Bots/Test%20A2019/AALogTestBot?fileId=1098948&workspace=PRIVATE" }} sent to CR successfully.
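spath is likely failing because the eventData block is protobuf-style text, not JSON: the keys are unquoted and the pairs are not comma-separated, so there is nothing for a JSON parser to latch onto. Per-field rex extractions work instead; the patterns below, checked in Python against the sample event, translate directly into rex named groups (e.g. | rex "status_type:\s*(?<status_type>\w+)"):

```python
import re

raw = '''Message eventData { id: "b0ad6627-a6e1-4f5e-92f4-9c2deaa1ff2a_1cd4b06f83caac09" bot_execution { start_date_time { seconds: 1638433382 nanos: 210329300 } end_date_time { seconds: 1638433491 nanos: 993822800 } progress { current: 6 total: 6 percentage: 100 } status_type: COMPLETED bot_uri: "repository:///Automation%20Anywhere/Bots/Test%20A2019/AALogTestBot?fileId=1098948&workspace=PRIVATE" }} sent to CR successfully.'''

def grab(pattern, text):
    m = re.search(pattern, text)
    return m.group(1) if m else None   # None when the field is absent

fields = {
    "id":              grab(r'id:\s*"([^"]+)"', raw),
    "start_date_time": grab(r'start_date_time\s*\{\s*seconds:\s*(\d+)', raw),
    "end_date_time":   grab(r'end_date_time\s*\{\s*seconds:\s*(\d+)', raw),
    "current":         grab(r'current:\s*(\d+)', raw),
    "total":           grab(r'total:\s*(\d+)', raw),
    "status_type":     grab(r'status_type:\s*(\w+)', raw),
    "bot_uri":         grab(r'bot_uri:\s*"([^"?]+)', raw),   # stop at the ? to drop the query string
}
print(fields)
```

Because rex leaves a field unset when the pattern does not match, the optional fields (end_date_time, current, total) behave the same way in SPL as the None case here.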
Hi team, I created my account on the support portal with my official email address and all details, but I don't remember the username and password I set at the time of user creation. Now I am unable to create a new account since one already exists with that email ID. It seems the account associated with that email ID has to be deleted. How can we delete the account associated with my official email ID so that I can create a new one?
Hi all. I am ingesting a CSV file from a UF. The CSV is updated daily by the app team at a particular time, but I see data being ingested into Splunk on an hourly basis, which should not be the case; it should be ingested only once per day. I need suggestions on how to eliminate this and make sure the data is ingested only once per day.
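A monitor input re-reads a file whenever Splunk thinks its content changed, so if the app team's job touches or rewrites the CSV during the day (even with identical content at the head of the file), it can be picked up again. It is worth confirming first how often the file really changes on disk; if rewrites are unavoidable, checksumming the entire file instead of just its head can stop re-indexing of unchanged content. A hedged sketch (paths, index, and sourcetype are placeholders):

```
# inputs.conf on the UF
[monitor:///opt/app/export/daily_report.csv]
sourcetype = app_daily_csv
index = app_csv

# props.conf -- checksum the whole file rather than the default head-of-file
# check, so a rewrite with identical content is not re-indexed
[source::/opt/app/export/daily_report.csv]
CHECK_METHOD = entire_md5
```

If the file genuinely changes hourly but only the daily state matters, another option is asking the app team to drop the export into a fresh dated file once per day and monitoring that path pattern instead.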
Hello all, I am trying to extract a field from the below event, and the extraction is missing the last part of the field. Please help me get this extracted.

Event: 117691777,00004105,00000000,5064,"20211202100006","20211202100006",4,-1,-1,"SYSTEM","","IPSC002",94882466,"MS932","Server-I ジョブ(Server:/IZ_SSYS_DB/DAILY/MP7/MP_D41/物流ルートテーブルデータ送信:@20H7984)を開始します(host: Host, JOBID: 229589)","Information","tdi01","/HITACHI/JP1/AJS2","JOB","AJSROOT1:/IZ_SSYS_DB/DAILY/MP7/MP_D41/物流ルートテーブルデータ送信","JOBNET","Server:/IZ_SSYS_DB/DAILY/MP7","Server:/IZ_SSYS_DB/DAILY/MP7/MP_D41/物流ルートテーブルデータ送信","START","20211202100006","","",16,"A0","Server:/IZ_SSYS_DB/DAILY","A1","MP7","A2","MP_D41/物流ルートテーブルデータ送信","A3","@20H7984","ACTION_VERSION","0600","B0","n","B1","2","B2","tdi01","B3","IPSC002","C0","IPSC202","C1","","C6","r","H2","188677","H3","pj","H4","q","PLATFORM","NT",

Extraction used: (?:[^,]+,){14}(?<alert_description>[^,]+),

However, the same extraction works as expected on the below event: 117727680,00004103,00000000,5064,"20211202172828","20211202172828",4,-1,-1,"SYSTEM","","IPSC002",94918000,"MS932","Server-I ジョブネット(Server:/HTHACHU/IJH03/IJH03:@20I8438)が正常終了しました","Information","tdi01","/HITACHI/JP1/AJS2","JOBNET","AJSROOT1:/HTHACHU/IJH03/IJH03","JOBNET","AJSROOT1:/HTHACHU/IJH03/IJH03","AJSROOT1:/HTHACHU/IJH03/IJH03","END","20211202172827","20211202172828","",10,"A0","AJSROOT1:/HTHACHU/IJH03","A1","IJH03","A3","@20I8438","ACTION_VERSION","0600","B0","n","B1","0","B3","IPSC002","H2","853876","H3","n","PLATFORM","NT",

Please help extract the highlighted field.
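The difference between the two events is that in the failing one, field 15 contains a comma inside the quotes ("...(host: Host, JOBID: 229589)..."), so [^,]+ stops at that inner comma. A CSV-aware pattern such as (?:"[^"]*"|[^,]*) treats a quoted field as a single unit. The idea, verified in Python on a small quoted sample (the real pattern would use {14} to skip the first fourteen fields):

```python
import csv, re

good = '1,2,"plain field","field, with comma",5'

# CSV-aware regex: each field is either a quoted run or an unquoted run;
# here {3} skips three fields, the capture grabs the fourth
pattern = r'^(?:(?:"[^"]*"|[^,]*),){3}("[^"]*"|[^,]*),'
fourth = re.match(pattern, good).group(1)
print(fourth)   # "field, with comma"  (quotes included)

# or let a CSV parser do the splitting (field 15 -> index 14)
row = next(csv.reader([good]))
print(row[3])   # field, with comma
```

The SPL version would be along the lines of | rex "^(?:(?:\"[^\"]*\"|[^,]*),){14}(?<alert_description>\"[^\"]*\"|[^,]*),", optionally with a follow-up eval trim to strip the surrounding quotes.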
Hello everyone, this is a general question that I haven't found an answer to yet. I am aware of how a license violation is determined (https://docs.splunk.com/Documentation/Splunk/8.2.3/Admin/Aboutlicenseviolations#:~:text=What%20is%20a%20license%20warning,clock%20on%20the%20license%20master.). In a training course it was mentioned that license warnings are reported to Splunk; is that correct? What I would like to know is the following: Does Splunk receive anything? Does Splunk receive messages on license warnings or violations? Does Splunk receive messages when a license pool goes into warning or violation? What does Splunk receive? Data volume? License ID? Thank you all for your help.
Hi, I populate the fields of a dropdown list from a CSV file. It works, but the problem is that randomly the "filling" message appears and persists, and as a consequence I am unable to update the dashboard panels, because every panel references this dropdown list. There are 1300 lines in my CSV file. What is the problem, please?

<input type="dropdown" token="site" searchWhenChanged="true"> <label>Site</label> <fieldForLabel>site</fieldForLabel> <fieldForValue>site</fieldForValue> <search> <query>| inputlookup site.csv</query> </search> <choice value="*">*</choice> <default>*</default> <initialValue>*</initialValue> </input>
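That message appears while the dropdown's populating search is still running. A 1300-row lookup should normally populate quickly, so it is worth making sure the search returns only what the dropdown needs: a single deduplicated, sorted column rather than every row and column of the CSV. A sketch of the populating search element (same file name as in the XML above):

```
<search>
  <query>| inputlookup site.csv | fields site | dedup site | sort site</query>
</search>
```

If the delay persists even with the trimmed query, the bottleneck is more likely search-slot contention on the instance (many panels dispatching at once) than the lookup itself.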
Hello, I have some issues extracting fields from the following raw event. I should be getting the following fields from this event. Any help will be highly appreciated. Thank you!

Field names: TIMESTAMP, USERTYPE, USERID, SYSTEM, EVENTTYPE, EVENTID, SRCADDR, SESSIONID, TAXPERIOD, RETURNCODE, TAXFILERTIN, VARDATA

Sample event: {"log":"\u001b[0m\u001b[0m05:14:09,516 INFO  [stdout] (default task-4193) 2021-12-02 05:14:09,516 INFO  [tltest.logging.TltestEventWriter] \u003cMODTRANSAUDTRL\u003e\u003cEVENTID\u003e1210VIEW\u003c/EVENTID\u003e\u003cEVENTTYPE\u003eDATA_INTERACTION\u003c/EVENTTYPE\u003e\u003cSRCADDR\u003e192.131.8.1\u003c/SRCADDR\u003e\u003cRETURNCODE\u003e00\u003c/RETURNCODE\u003e\u003cSESSIONID\u003etfYU4-AEPnEzZg\u003c/SESSIONID\u003e\u003cSYSTEM\u003eTLCATS\u003c/SYSTEM\u003e\u003cTIMESTAMP\u003e20211202051409\u003c/TIMESTAMP\u003e\u003cUSERID\u003eAX3BLNB\u003c/USERID\u003e\u003cUSERTYPE\u003eAdmin\u003c/USERTYPE\u003e\u003cVARDATA\u003eCASE NUMBER, CASE NAME;052014011348000,BANTAM LLC\u003c/VARDATA\u003e\u003c/MODTRANSAUDTRL\u003e\n","stream":"stdout","time":"2021-12-02T05:14:09.517228451Z"}
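The \u003c and \u003e sequences are just < and > escaped inside the JSON string, so one route is: parse the JSON (which decodes the escapes), then pull the tag/value pairs out of the embedded XML. In SPL that would be spath to get the log field followed by a rex with max_match=0 over it. The pair-extraction idea, sketched in Python on an abbreviated sample (not your full event):

```python
import json, re

# abbreviated sample -- the real event carries more tags plus ANSI color codes
line = ('{"log":"\\u003cMODTRANSAUDTRL\\u003e'
        '\\u003cEVENTID\\u003e1210VIEW\\u003c/EVENTID\\u003e'
        '\\u003cUSERID\\u003eAX3BLNB\\u003c/USERID\\u003e'
        '\\u003c/MODTRANSAUDTRL\\u003e\\n",'
        '"stream":"stdout","time":"2021-12-02T05:14:09.517228451Z"}')

xml = json.loads(line)["log"]   # json.loads turns \u003c / \u003e back into < >
# grab every <TAG>value</TAG> pair; assumes values contain no nested tags,
# which holds for these fields (VARDATA has commas/semicolons but no < >)
fields = dict(re.findall(r"<(\w+)>([^<]*)</\1>", xml))
print(fields)
```

If the event is already indexed as JSON, Splunk may extract the log field for you, leaving only the rex over the decoded XML to do in the search.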
Hi everyone, I have two applications and I have created dashboards for the apps: index=epaas_epaas2_idx ns=blazegateway app_name=blazecrsgateway* I need to get the below info: Total YTD Volume for PSF Push API, and Total Volume to GRS YTD. Can someone guide me on how to get these two figures using the index, ns and app name?
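Assuming each event represents one call/transaction, a year-to-date count is the base search with earliest=@y, which snaps the start of the window to the beginning of the current year. Which events belong to the PSF Push API versus GRS depends on a field in the data; api_name below is a placeholder for whatever field distinguishes them:

```
index=epaas_epaas2_idx ns=blazegateway app_name=blazecrsgateway* earliest=@y latest=now
| stats count as ytd_volume by api_name
```

If the two flows live under different app_name values instead, splitting by app_name in the stats clause gives both totals in one search.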