All Posts


I have a lookup like this:

Name Status ExamID
John Pass 123
Bob Pass 345
John Fail 234
Bob Pass 235
Smith Fail 231

My events have Name alone as the unique identifier. I wrote my query like this:

index=userdata [ inputlookup userinfo.csv | fields Name ]
| lookup userinfo.csv Name as Name OUTPUT Status as Status ExamID as Identifier

Via the first subsearch I extracted only the events belonging to names present in the table, and then I tried to output the Status and ExamID for those names. From the combination of these 3 fields in the event I need to evaluate a fourth result. For example, for John - Pass - 123: if the ExamID falls between 120 and 125, I need to print the value "GOOD" for the fourth field. However, when printing the output from the lookup I got multivalues like "John Pass Fail 123 234". I then tried mvappend, but that did not work correctly. How do I do this correctly?
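Outside Splunk, the per-row logic being asked for can be sketched in Python: each Name/Status/ExamID combination is evaluated as its own row (rather than as one multivalued merge per Name), which is what makes the range check meaningful. The "OTHER" label for non-matching rows is an assumption, as is the function name:

```python
# Rows as they appear in the lookup table from the question.
lookup_rows = [
    {"Name": "John", "Status": "Pass", "ExamID": 123},
    {"Name": "Bob", "Status": "Pass", "ExamID": 345},
    {"Name": "John", "Status": "Fail", "ExamID": 234},
    {"Name": "Bob", "Status": "Pass", "ExamID": 235},
    {"Name": "Smith", "Status": "Fail", "ExamID": 231},
]

def evaluate(row):
    """Derive the fourth field for one Name/Status/ExamID combination:
    "GOOD" when the ExamID falls in the 120-125 range (inclusive)."""
    return "GOOD" if 120 <= row["ExamID"] <= 125 else "OTHER"

# One result per row, never a multivalued mix of two exams for one Name.
results = [{**row, "Result": evaluate(row)} for row in lookup_rows]
```

The key point the sketch illustrates: the evaluation has to happen per lookup row, so in SPL the multivalued Status/ExamID pair per Name would need to be expanded back into separate rows before the range test.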
Hi Team, good day! We have extracted the set of job names from the events using the rex query below:

index=app_events_dwh2_de_uat _raw=*jobname*
| rex max_match=0 "\\\\\\\\\\\\\"jobname\\\\\\\\\\\\\":\s*\\\\\\\\\\\\\"(?<Name>[^\\\]+).*?\\\\\\\\\\\\\"status\\\\\\\\\\\\\":\s*\\\\\\\\\\\\\"(?<State>ENDED OK).*?Timestamp\\\\\\\\\\\\\": \\\\\\\\\\\\\"(?<TIME>\d+\s*\d+\:\d+\:\d+).*?execution_time_in_seconds\\\\\\\\\\\\\": \\\\\\\\\\\\\"(?<EXECUTION_TIME>[\d\.\-]+)"
| table "TIME", "Name", "State", "EXECUTION_TIME"
| mvexpand TIME
| dedup TIME

After running the above query we obtained results in a table like this:

20240417 21:13:23 CONTROL_M_REPORT ENDED OK 73.14
DWHEAP_FW_BHW ENDED OK 80.66
DWHEAP_FW_TALANX ENDED OK 80.18
DWHEAP_TALANX_LSP_FW_NODATA ENDED OK 3.25
SALES_EVENT_TRANSACTION_RDV ENDED OK 141.41

Is it possible to extract, from the above set of job names, only the jobs whose names contain the string NODATA? Below is a sample event:

Dataframe row : {"_c0":{"0":"{","1":" \"0\": {","2":" \"jobname\": \"CONTROL_M_REPORT\"","3":" \"status\": \"ENDED OK\"","4":" \"execution_time_in_seconds\": \"46.39\"","5":" \"Timestamp\": \"20240418 12:13:23\"","6":" }","7":" \"1\": {","8":" \"jobname\": \"DWHEAP_FW_AIMA_001\"","9":" \"status\": \"ENDED OK\"","10":" \"execution_time_in_seconds\": \"73.14\"","11":" \"Timestamp\": \"20240418 12:13:23\"","12":" }","13":" \"2\": {","14":" \"jobname\": \"DWHEAP_FW_BHW\"","15":" \"status\": \"ENDED OK\"","16":" \"execution_time_in_seconds\": \"71.19\"","17":" \"Timestamp\": \"20240418 12:13:23\"","18":" }","19":" \"3\": {","20":" \"jobname\": \"DWHEAP_FW_NODATA\"","21":" \"status\": \"ENDED OK\"","22":" \"execution_time_in_seconds\": \"80.63\"","23":" \"Timestamp\": \"20240418 12:13:23\"","24":" }","25":" \"4\": {","26":" \"jobname\": \"DWHEAP_FW_TALANX\"","27":" \"status\": \"ENDED OK\"","28":" \"execution_time_in_seconds\": \"80.20\"","29":" \"Timestamp\": \"20240418 12:13:23\"","30":" }","31":" \"5\": {","32":" \"jobname\": \"DWHEAP_FW_UC4_001\"","33":" \"status\": \"ENDED OK\"","34":" \"execution_time_in_seconds\": \"80.13\"","35":" \"Timestamp\": \"20240418 12:13:23\"","36":" }","37":" \"6\": {","38":" \"jobname\": \"DWHEAP_TALANX_LSP_FW_NODATA\"","39":" \"status\": \"ENDED NOTOK\"","40":" \"execution_time_in_seconds\": \"120.12\"","41":" \"Timestamp\": \"20240418 12:13:23\"","42":" }","43":" \"7\": {","44":" \"jobname\": \"RDV_INFRASTRUCTURE_DETAILS\"","45":" \"status\": \"ENDED OK\"","46":" \"execution_time_in_seconds\": \"81.16\"","47":" \"Timestamp\": \"20240418 12:13:23\"","48":" }","49":" \"8\": {","50":" \"jobname\": \"VIPASNEU_STG\"","51":" \"status\": \"ENDED OK\"","52":" \"execution_time_in_seconds\": \"45.04\"","53":" \"Timestamp\": \"20240418 12:13:23\"","54":" }","55":"}"}}

Please look into this and kindly help us extract the jobs whose names contain the string NODATA from the set of job names above.
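The filtering step itself is simple once the names are extracted. In SPL, appending something like `| search Name="*NODATA*"` (or `| regex Name="NODATA"`) after the rex would keep only the matching rows. The same logic, sketched in Python against the result table from the question:

```python
import re

# Rows taken from the extracted table in the question: (Name, State, seconds).
rows = [
    ("CONTROL_M_REPORT", "ENDED OK", 73.14),
    ("DWHEAP_FW_BHW", "ENDED OK", 80.66),
    ("DWHEAP_FW_TALANX", "ENDED OK", 80.18),
    ("DWHEAP_TALANX_LSP_FW_NODATA", "ENDED OK", 3.25),
    ("SALES_EVENT_TRANSACTION_RDV", "ENDED OK", 141.41),
]

# Keep only job names containing the substring "NODATA".
nodata_jobs = [name for (name, state, secs) in rows if re.search(r"NODATA", name)]
```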
Hi All, I have a JSON event which contains test cases, their statuses, and the Jenkins build number. There are many test cases in my events. I want to find whether any test case has been failing continuously across 5 Jenkins builds; if any test case is failing continuously in 5 builds, I want to list those test cases. I have tried streamstats but was not able to implement it fully. Does anyone have a better approach? Please guide me on this.
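The core of the check, which a streamstats-over-builds approach would perform, is a per-test-case running streak counter that resets on a pass. A minimal sketch in Python with illustrative data (build numbers, test names, and statuses are all assumptions):

```python
from collections import defaultdict

# Illustrative (build_number, test_case, status) tuples.
results = [
    (1, "test_login", "FAIL"), (2, "test_login", "FAIL"),
    (3, "test_login", "FAIL"), (4, "test_login", "FAIL"),
    (5, "test_login", "FAIL"),
    (1, "test_search", "FAIL"), (2, "test_search", "PASS"),
    (3, "test_search", "FAIL"), (4, "test_search", "FAIL"),
    (5, "test_search", "FAIL"),
]

streak = defaultdict(int)   # current consecutive-failure count per test case
flagged = set()             # test cases that reached 5 consecutive failures

# Walk the results in build order; a PASS resets the streak to zero.
for build, test, status in sorted(results):
    streak[test] = streak[test] + 1 if status == "FAIL" else 0
    if streak[test] >= 5:
        flagged.add(test)
```

In SPL the equivalent reset-on-pass counter is the part streamstats has to express (e.g. a running count by test case that is zeroed when the status changes), which is why ordering the events by build number first matters.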
Hi Community, I have a question about regex and extraction. My _raw data has a key row and a value row (2 rows/lines), and I have to extract fields as key and value, e.g.:

row 1: Test1 Test2 Test3 Test4 Test5 Test6 Test7 Test8 Test9 Test10
row 2: 101 102 103 104 105 106 107 108 109 110

I have to extract only Test7 from the above log and print its value in a table. Please help me.

Regards,
Moin
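The pairing logic is positional: the Nth key in row 1 corresponds to the Nth value in row 2. A minimal sketch in Python, assuming plain whitespace-separated tokens (the stray periods in the sample look like formatting artifacts):

```python
# Header row and value row as two separate lines of _raw.
row1 = "Test1 Test2 Test3 Test4 Test5 Test6 Test7 Test8 Test9 Test10"
row2 = "101 102 103 104 105 106 107 108 109 110"

# Zip keys with values by position, then pick out the one field wanted.
fields = dict(zip(row1.split(), row2.split()))
test7 = fields["Test7"]
```

The takeaway for the SPL side is the same: the extraction has to capture the two rows and align tokens by position, rather than look for a literal "Test7=107" pair in the text.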
Hi @ITWhisperer, thank you so much for the info. I referred to the example mentioned above and was able to get the answer.
@kiran_panchavat  Thank you for your response! I've already reached out to PowerConnect via email, but if anyone has access to a guide or documentation that could help me plan my solution more effectively, I would greatly appreciate it.
@chanathip Refer to the link below: How to integrate Splunk with SAP HANA? - Splunk Community
This still has not been solved in 2024. I do miss a Heavy Forwarder group/tag as well. The Indexer group should only contain indexers, but we now have the HFs in that group too.
Hi Hardik,

The query that I sent you works across all collectors (I have only 1 collector, so it shows only 1).

If you have more than 1 collector, you need to add a WHERE clause on the "server-id" property in order to filter for your exact collector. You can also find which server-id corresponds to which collector name via the browser's developer tools in Chrome or Firefox (I sent you a reference screenshot). First, open your database collector via the AppD controller UI; at the same time you can find the server-id detail in the developer tools, under the Network tab's Response view.

After finding the exact server-id property, you can use a SELECT query filtered on it. In my example, I'm only working with the collector whose server-id = 4838. If you compare the result against the default dashboard widget, you will see it is the same. By the way, you can also find the wait-stat-id explanation detail the same way (via the browser developer tools).

If you want more detail, please feel free to ask.

Thanks,
Cansel
Hi All, does anyone have a document or configuration guide for integrating Splunk with HANA using PowerConnect?
Assuming this is supposed to be good JSON (which it isn't) and that you had missed a field name on the last object in the collection, you could try this.

| spath
``` Fix up message to make a valid JSON field ```
| eval message="{\"message\":".message."}"
``` Get the collection from message ```
| spath input=message message{} output=collection
``` Expand the collection into separate events ```
| mvexpand collection
``` Extract the fields ```
| spath input=collection
``` Assume you want the totals by ARUNAME ```
| stats sum(TOTAL) as Total, sum(PROCESSED) as Processed, sum(REMAINING) as Remaining, sum(ERROR) as Error, sum(SKIPPED) as Skipped by ARUNAME

For the first view, you would remove the by clause from the stats command.
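The wrap-then-parse trick in the eval step above is easy to see outside SPL: a bare JSON array fragment becomes a valid JSON object once it is wrapped under a named key, after which it can be expanded and aggregated. A Python sketch with illustrative data (the field values are assumptions, only ARUNAME and TOTAL from the answer are used):

```python
import json

# A bare JSON array, analogous to the raw message field before the eval.
raw = '[{"ARUNAME": "A", "TOTAL": 10}, {"ARUNAME": "A", "TOTAL": 5}, {"ARUNAME": "B", "TOTAL": 7}]'

# Wrap under a key so it parses as one valid JSON object
# (the counterpart of eval message="{\"message\":".message."}").
wrapped = json.loads('{"message":' + raw + "}")

# Expand the collection and sum per ARUNAME, like mvexpand + stats.
totals = {}
for obj in wrapped["message"]:
    totals[obj["ARUNAME"]] = totals.get(obj["ARUNAME"], 0) + obj["TOTAL"]
```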
The suggestion is "don't use map". map is an expensive, resource-intensive, and slow command. Another way to achieve this might be:

index=hello sourcetype=welcome
| eventstats max(DATETIME) as LatestTime
| where DATETIME=LatestTime
| stats sum(HOUSE_TRADE_COUNT) as HOUSE_Trade_Count
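The eventstats/where/stats pattern reduces to three steps: find the latest timestamp across all rows, keep only the rows that share it, then sum. A minimal sketch in Python with illustrative data:

```python
# Illustrative rows; DATETIME strings sort lexicographically in time order.
rows = [
    {"DATETIME": "2024-04-18 10:00", "HOUSE_TRADE_COUNT": 3},
    {"DATETIME": "2024-04-18 11:00", "HOUSE_TRADE_COUNT": 5},
    {"DATETIME": "2024-04-18 11:00", "HOUSE_TRADE_COUNT": 2},
]

latest = max(r["DATETIME"] for r in rows)             # eventstats max(DATETIME)
kept = [r for r in rows if r["DATETIME"] == latest]   # where DATETIME=LatestTime
house_trade_count = sum(r["HOUSE_TRADE_COUNT"] for r in kept)  # stats sum(...)
```

Unlike map, this makes a single pass over the events, which is why it scales so much better.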
Hi @slider8p2023, good for you, see you next time! I still don't use Dashboard Studio because it still doesn't have all the features I use in Classic Dashboards! Ciao and happy splunking, Giuseppe. P.S.: Karma Points are appreciated.
Thanks @gcusello, that seemed to work. I cloned the original dashboard panel by panel, saved it as a non-Dashboard Studio dashboard, and then scheduled it to export as PDF. I was unaware that scheduled PDF export is not available in Dashboard Studio.
Also, I have the following error, which is generated for only one previous alert. If you could please take a look and suggest what other steps I can take, that would help.

2024-04-18 05:18:47,938 +0000 ERROR sendemail:187 - Sending email. subject="Splunk Alert: ITSEC_Backup_Change_Alert", encoded_subject="Splunk Alert: ITSEC_Backup_Change_Alert", results_link="*****", recipients="['it-security@durr.com']", server="********"

@marnall
Thanks @yuanliu for your quick reply. Yes, I need the % sign included. In the email body I need to color the data of the percentage column as shown below:
Hi @masakazu, in the deploymentclient.conf file of the Cluster Master, you have to add:

repositoryLocation = $SPLUNK_HOME/etc/managed-apps

In this way the Deployment Server, only on the Cluster Master, doesn't deploy the apps into the apps folder but into the managed-apps folder. Then you have to push them via the GUI (I'm not sure it's possible to automate this from the DS). Note that this deploymentclient.conf must be different than the one on the other clients. Ciao. Giuseppe
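For reference, a minimal deploymentclient.conf sketch of the setup described above, with stanza names taken from standard Splunk deployment-client configuration; the targetUri value is a placeholder you would replace with your own Deployment Server:

```
[deployment-client]
# Deployed apps land in managed-apps instead of etc/apps, so the
# Cluster Master can push them to the peers from there.
repositoryLocation = $SPLUNK_HOME/etc/managed-apps

[target-broker:deploymentServer]
targetUri = <deployment_server>:8089
```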
Hi @slider8p2023, you could try cloning it by going to https://<your_host>/en-US/app/SplunkEnterpriseSecuritySuite/dashboards and cloning the dashboard, but I'm not sure it's possible to schedule it. Otherwise, you could create a custom clone of the Security Posture dashboard using the searches that you can extract from the original dashboard, and then schedule it to be sent by email as a PDF. Ciao. Giuseppe
Yes, absolutely. The new alerts and reports that I am creating are unable to send email notifications. If you have any suggestions, kindly help.
Thank you, @gcusello!
1. Create an app with indexes.conf on the DS.
2. After deploying to the manager, it is received in manager-apps.
3. The peers receive it in peer-apps.
Also, is it standard specification for the manager server to receive data using manager-apps?
Best regards
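For reference, a minimal sketch of what such an app's indexes.conf might contain; the index name and paths below are placeholders, and volume-based settings are omitted:

```
[my_index]
# Standard bucket locations under the peer's SPLUNK_DB.
homePath   = $SPLUNK_DB/my_index/db
coldPath   = $SPLUNK_DB/my_index/colddb
thawedPath = $SPLUNK_DB/my_index/thaweddb
```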