Hi, I'm trying to get sign-ins for Azure. It seems that the add-on is only fetching interactive sign-ins, not non-interactive ones. Is there a possibility to fetch these also? They are showing in the Azure console as "User sign-ins (non-interactive)".
Hello

Until now, I was using this search:

[| inputlookup host.csv | table host] `fiability` | fields host Logfile SourceName ProductName | eval host=upper(host) | lookup fo_all HOSTNAME as host output SITE | search Logfile=Application AND (SourceName="Application Hang" OR SourceName="Application Error") | search (ProductName=*) | stats count(eval(SourceName="Application Error")) as "Number of Errors", count(eval(SourceName="Application Hang")) as "Number of Hang", count as "Number of crashes" by ProductName | sort -"Number of crashes" | head 10

The host.csv lookup was updated manually in order to add new hostnames. The lookup host.csv is now replaced by a KV store called "cmdb_fo_all":

| inputlookup cmdb_fo_all where TYPE="Ind"

With this KV store, it's possible to automatically filter the type of hostname I need. Please also note that the field host is called "HOSTNAME" in this KV store. Now I need to replace the lookup with the KV store in my search. Could you help me please?
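One possible way to swap the CSV subsearch for the KV store, assuming the collection "cmdb_fo_all" has HOSTNAME and TYPE fields as described in the post (untested sketch, not a confirmed answer):

```
[| inputlookup cmdb_fo_all where TYPE="Ind"
 | rename HOSTNAME as host
 | table host] `fiability`
| fields host Logfile SourceName ProductName
| eval host=upper(host)
| lookup fo_all HOSTNAME as host output SITE
| search Logfile=Application AND (SourceName="Application Hang" OR SourceName="Application Error")
| stats count(eval(SourceName="Application Error")) as "Number of Errors",
        count(eval(SourceName="Application Hang")) as "Number of Hang",
        count as "Number of crashes" by ProductName
| sort -"Number of crashes"
| head 10
```

The only change from the original search is the subsearch: the rename makes the KV store's HOSTNAME field match the host field the rest of the pipeline expects.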
How do I read logs from VMware vRealize Log Insight? Is there an app?
I have Splunk logs with data that is roughly like this:

Timestamp  adapterName  responseCode
xx         A            1
xx         A            2
xx         B            1
xx         B            2

For each combination of (adapterName, responseCode), I want to:
1. determine its average count for the last 24 hours (excluding the most recent 1 hour)
2. determine its count for the most recent hour
3. create alerts if its last-hour count exceeds 1 standard deviation from the average counts of the last 24 hours

There are other similar questions, and I have referenced them (e.g. https://community.splunk.com/t5/Alerting/Alert-when-sample-is-2-standard-deviations-from-moving-average/m-p/132829). However, the difference is that I have multiple (adapterName, responseCode) combinations, so the query is not as straightforward and I cannot use timechart etc. easily. Here's my own attempt so far:

index=itms "searchTerm" earliest=-25h@h latest=@h | rex "\"adapterName[\":]*(?<adapter>[A-Z]+)\"" | rex "\"name[\":]*(?<responseCode>[\w]+)\"" | eval StartTime=relative_time(now(), "-1h@h") | eval series=if(_time>=StartTime, "lastHour", "last24Hours") | bin _time span=1h | stats count AS hourlyCount BY adapter, responseCode, series, _time | stats avg(hourlyCount) AS averageCount stdev(hourlyCount) as standardDev by adapter, responseCode, series

And I am very stuck with a table like this:

adapterName  responseCode  series       averageCount  standardDev
A            1             lastHour     ...           0
A            1             last24Hours  ...           ...
A            2             lastHour     ...           0
A            2             last24Hours  ...           ...
B            1             lastHour     ...           0
B            1             last24Hours  ...           ...
B            2             lastHour     ...           0
B            2             last24Hours  ...           ...

And I have no idea how to proceed. I think the query also made the table more complicated, but I am not sure. Can someone give some advice here?
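One possible way to avoid the two-row-per-combination problem is to compute the baseline with eventstats instead of a second stats, so the last-hour row and its baseline end up on the same result row. This is an untested sketch built on the poster's own rex extractions:

```
index=itms "searchTerm" earliest=-25h@h latest=@h
| rex "\"adapterName[\":]*(?<adapter>[A-Z]+)\""
| rex "\"name[\":]*(?<responseCode>[\w]+)\""
| bin _time span=1h
| stats count AS hourlyCount BY adapter responseCode _time
| eval isLastHour=if(_time >= relative_time(now(), "-1h@h"), 1, 0)
| eventstats avg(eval(if(isLastHour==0, hourlyCount, null()))) AS averageCount,
             stdev(eval(if(isLastHour==0, hourlyCount, null()))) AS standardDev
             BY adapter responseCode
| where isLastHour=1 AND hourlyCount > averageCount + standardDev
| table adapter responseCode hourlyCount averageCount standardDev
```

The eval-inside-eventstats excludes the most recent hour from the baseline, and the final where keeps only combinations whose last-hour count exceeds one standard deviation above the 24-hour average, which is the alert condition.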
I use an inputlookup to fill a multiselect/dropdown input:

| inputlookup Errornumber

Errornumber
12
44
68

If I now use a multiselect with token "input_error_number", with "Field for value" set to "Errornumber", then the multiselect properly fills the token, so for example

$input_error_number$ = 12

If I replace the multiselect with a dropdown, then it fills the token with

$input_error_number$ = 1 OR 2

This is not desired. Does somebody know how to fix it? Thx
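A guess at the cause: the "1 OR 2" value looks like <prefix>/<suffix>/<valuePrefix>/<delimiter> tags left over from copying the multiselect definition. A plain dropdown, using the lookup and field names from the post, would look like this in Simple XML (sketch, assuming the lookup is named Errornumber):

```
<input type="dropdown" token="input_error_number">
  <label>Errornumber</label>
  <fieldForLabel>Errornumber</fieldForLabel>
  <fieldForValue>Errornumber</fieldForValue>
  <search>
    <query>| inputlookup Errornumber</query>
  </search>
</input>
```

If the existing dropdown contains any <prefix>, <suffix>, <valuePrefix>, <valueSuffix>, or <delimiter> tags carried over from the multiselect, removing them should stop the token from being wrapped in "OR" syntax.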
Hi team, with the below query, I can't get the expected result: the bins are not split every 2 hours as I specified by "| bin span=2h TIME".

index=*bizx_application AND sourcetype=perf_log_bizx AND AutoSaveForm OR SaveFormV2 OR SaveForm | eval TIME=strftime(_time,"%Y-%m-%d %H:%M:%S") | bin span=2h TIME | stats count by TIME SFDC

The result I got from the above query is the table below. As you see, the TIME column is not split by 2 hours. What's wrong here?
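The likely issue: strftime turns TIME into a string before bin runs, and bin can only bucket numeric values, so it leaves the string untouched. Binning _time first and formatting afterwards should work (sketch; the parentheses around the three event types are also added, since bare AND/OR mixes parse unexpectedly):

```
index=*bizx_application sourcetype=perf_log_bizx (AutoSaveForm OR SaveFormV2 OR SaveForm)
| bin _time span=2h
| eval TIME=strftime(_time, "%Y-%m-%d %H:%M:%S")
| stats count by TIME SFDC
```

Each TIME value will then be the start of its 2-hour bucket rather than the raw event timestamp.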
Hello All, I was looking for the download link for Splunk Enterprise 6.6.1, but it appears it is not listed here: https://www.splunk.com/en_us/download/previous-releases.html#tabs/linux . Is there any other download link available?
When a valid sourcetype is not showing up in "Data Summary" under "sourcetypes", what does it mean, and how do I get it to show up? The sourcetype in question can be searched; there's nothing about that particular sourcetype in "health checks". One other place where the sourcetype does not show up: Settings - Add Data. Whether I am uploading a file or setting up a file watcher, the sourcetype isn't there among the choices to assign the new data to. Splunk Enterprise 8.1, clustered indexers, single SH. A Deployment Server is used to distribute SUF configurations. P.S. This is related to my other recent question, "troubleshooting props.conf".
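The Data Summary dialog is driven by index-time metadata, so one hedged check is whether the sourcetype exists at index time at all (as opposed to being produced by a search-time rename or alias, which would explain it being searchable but absent from the summary). Replace the placeholder name with the real sourcetype:

```
| metadata type=sourcetypes index=*
| search sourcetype="your_sourcetype"
```

If this returns nothing while a normal event search does find the sourcetype, the value is being assigned at search time, and the fix belongs in the props.conf that sets the index-time sourcetype (which also explains why it is missing from the Add Data sourcetype picker).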
Hello Team, this is for Splunk Cloud version 7.2. I have a dashboard and by default it runs for a time range of the last 24 hours. A time picker is also available on the dashboard (for the visualizations and reports). I need to generate a PDF and schedule its delivery to a list of users every 3 months. The issue is that I want the time range for this dashboard to be a period of 3 months, instead of the last 24 hours, before publishing the PDF. Is it possible to achieve this, or do I need to create a separate dashboard for a period of 3 months? I tried looking for answers in the community but was unable to find one which answered my query. Any help/suggestion would be greatly appreciated. Thanks!
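As far as I know, a scheduled PDF delivery renders the dashboard with its default token values, so one approach is to change the time input's default to 3 months (sketch; the token name time_tok is hypothetical and the panels must reference it):

```
<input type="time" token="time_tok">
  <label>Time range</label>
  <default>
    <earliest>-3mon@mon</earliest>
    <latest>now</latest>
  </default>
</input>
```

Note this also changes the default interactive users see, so if the last-24-hours default must stay, a scheduled clone of the dashboard with the 3-month default may still be the cleaner option.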
Hi All, we are trying to organise some monitoring/alerting for users' search disk usage. I know the SplunkAdmins app has some stuff, but we need something a little different. What I need at the moment is a way to determine a user's effective settings: most users have at least 2 or more roles, and I haven't found any clear way to determine what a given user's allowance is, so that I can configure an alert against it. Not sure if I have just missed something simple? Hoping someone out there might have some suggestions. Thanks in advance!
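One possible starting point is to join each user's roles against the role quotas via the REST endpoints. As far as I know, Splunk applies the highest value across a user's roles for these quotas, so max() approximates the effective setting (untested sketch, and worth verifying that assumption for your version):

```
| rest /services/authentication/users splunk_server=local
| fields title roles
| mvexpand roles
| join type=left roles
    [| rest /services/authorization/roles splunk_server=local
     | rename title as roles
     | fields roles srchDiskQuota srchJobsQuota]
| stats values(roles) as roles,
        max(srchDiskQuota) as effectiveDiskQuota,
        max(srchJobsQuota) as effectiveJobsQuota
        by title
```

This does not account for quotas inherited from imported roles, so treat the output as an approximation to sanity-check against Settings > Roles.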
hi, I am trying to improve the performance of the search below. "fo_all" is a KV store with 454000 lines. This search takes approximately 14 seconds; when I have a look at the job inspector, the appendcols command takes 11 seconds. Is there a solution to improve the performance of this search? Thanks

| inputlookup tablet_host.csv | lookup lookup_pana "name0" as host OUTPUT CycleCount0 | where CycleCount0 > 300 | lookup fo_all HOSTNAME as host output SITE | search SITE=$tok_filtersite|s$ | stats count as NbHostCycleSup300 | appendcols [| inputlookup host.csv | lookup fo_all HOSTNAME as host output SITE | search SITE=$tok_filtersite|s$ | stats count as NbIndHost] | eval NbHostCycleInf300 = (NbIndHost - NbHostCycleSup300) | table NbHostCycleSup300 NbHostCycleInf300 SITE | rename NbHostCycleSup300 as "> 300", NbHostCycleInf300 as "< 300" | transpose
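If (and only if) the hosts in tablet_host.csv are a subset of host.csv, both counts can be computed in a single pass, which removes the appendcols and the second fo_all lookup entirely. This is a sketch under that assumption, which the poster would need to confirm:

```
| inputlookup host.csv
| lookup lookup_pana "name0" as host OUTPUT CycleCount0
| lookup fo_all HOSTNAME as host output SITE
| search SITE=$tok_filtersite|s$
| eval isSup300=if(CycleCount0 > 300, 1, 0)
| stats sum(isSup300) as NbHostCycleSup300, count as NbIndHost by SITE
| eval NbHostCycleInf300=NbIndHost - NbHostCycleSup300
| table NbHostCycleSup300 NbHostCycleInf300 SITE
| rename NbHostCycleSup300 as "> 300", NbHostCycleInf300 as "< 300"
```

The conditional sum replaces the "where CycleCount0 > 300 | stats count" branch, so the expensive 454000-row KV store is only scanned once.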
Hi, I am trying to ingest Huawei USG6650 device logs, but it seems that no app is available on Splunkbase for this purpose. Is there any other way/guide for this?
I have the below query, which gets results from other panels; the corresponding results get stored here. I have used global tokens to get the results from the other panels. | makeresults | eval AUGCB="$AUGCB-PROD$", AUCFS="$AUCFS-PROD$", AUVMA="$AUVMA-PROD$" | stats values(AUGCB) as AUGCB, values(AUCFS) as AUCFS, values(AUVMA) as AUVMA Currently my output shows as below after applying trellis. But I am not able to do a trellis drilldown, as my query doesn't have any split-by field. So the requirement is: how do I create a split-by field in my query for the trellis drilldown? I tried to use both $trellis.value$ and $trellis.name$, but no luck.
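One way to get a split-by field from this shape of data is to transpose the single row so each token becomes its own row, giving trellis a real field to split on (sketch, built on the poster's tokens):

```
| makeresults
| eval AUGCB="$AUGCB-PROD$", AUCFS="$AUCFS-PROD$", AUVMA="$AUVMA-PROD$"
| fields - _time
| transpose column_name=environment
| rename "row 1" as value
```

After the transpose there is an "environment" column (AUGCB, AUCFS, AUVMA) and a "value" column, so trellis can split by environment and the drilldown tokens $trellis.name$ / $trellis.value$ have something to carry.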
Hello, I use a scheduled search in a table panel of my dashboard:

| loadjob savedsearch="admin:TOTO_sh:Event - BSOD"

From this table panel, I use a drilldown in order to display more details. But in this drilldown, I put the search corresponding to the scheduled dashboard search. So when I run my drilldown, I get new events compared to the scheduled search. I know it's normal, but is there a solution to have the same events as in my scheduled search, even if I don't also use a scheduled search in my dashboard? Thanks
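One option is to have the drilldown read from the same cached job instead of re-running the search, filtering the loaded results by the clicked value. A sketch, assuming the drilldown should narrow on the clicked cell ($click.value$ is a standard drilldown token; the field being filtered is a placeholder):

```
| loadjob savedsearch="admin:TOTO_sh:Event - BSOD"
| search host="$click.value$"
```

Because loadjob replays the artifact of the last scheduled run, the drilldown then shows exactly the events the panel was built from, not whatever a fresh search would return.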
We have a scenario where we run an indexer cluster with 10+ indexers, and the Universal Forwarders send data to all these indexers. Since all these indexers have different Common Names (CN), they have different SSL certificates, and we are left with the option of using a server-specific SSL config stanza for each indexer in the outputs.conf of each forwarder. This solution has challenges when managing larger indexer clusters, where we keep adding/removing indexers from the cluster. Hence, we would like to pursue the alternate options available. Does Splunk support SSL certificate Subject Alternative Names (SAN), i.e. an SSL certificate that can cover multiple host names (domains)? Any alternate options are also much appreciated...
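As far as I can tell, yes: outputs.conf has an sslAltNameToCheck setting, so a single SAN certificate shared by all indexers can be verified against one alternate name, with one tcpout group for the whole cluster. A sketch with hypothetical hostnames and paths, worth checking against the outputs.conf spec for your forwarder version:

```
[tcpout:primary_indexers]
server = idx01.example.com:9997, idx02.example.com:9997
clientCert = $SPLUNK_HOME/etc/auth/forwarder.pem
sslRootCAPath = $SPLUNK_HOME/etc/auth/ca.pem
sslVerifyServerCert = true
sslAltNameToCheck = indexers.example.com
```

With every indexer presenting a certificate whose SAN list includes indexers.example.com, adding or removing indexers only changes the server list, not the SSL configuration.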
Hello Everyone, I'm in a bit of a brain pickle right now and hoping the community can help. I have a Linux box with a UF on it. Currently it is set up to send to a HF with SSL configured on the port. I'm now in a situation where I need to allow that same UF to send to a different HF with a different SSL cert. I thought this wouldn't be an issue: I know how to go into outputs.conf and specify two different output groups, and I even know that in my inputs.conf file I can specify to monitor the same file to 2 different indexers with different indexes. What I don't know is how this all works in the server.conf file. On both HF/indexers I have a server.conf file set up; how do I get this to work on the UF? Is there a way for me to specify 2 different HF/indexer SSL configs in server.conf like you can with outputs.conf? Any help would be appreciated!
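As far as I know, the UF's client-side SSL settings for forwarding live in outputs.conf, not server.conf, and they can be set per tcpout group, so two certs can coexist. A sketch with hypothetical hostnames, paths, and group names (the exact setting names vary by forwarder version, so check the outputs.conf spec for yours):

```
[tcpout]
defaultGroup = hf_one, hf_two

[tcpout:hf_one]
server = hf1.example.com:9997
clientCert = $SPLUNK_HOME/etc/auth/hf1/client.pem
sslRootCAPath = $SPLUNK_HOME/etc/auth/hf1/ca.pem
sslVerifyServerCert = true

[tcpout:hf_two]
server = hf2.example.com:9997
clientCert = $SPLUNK_HOME/etc/auth/hf2/client.pem
sslRootCAPath = $SPLUNK_HOME/etc/auth/hf2/ca.pem
sslVerifyServerCert = true
```

server.conf on the UF only governs its own management port, so it should not need duplicating; each tcpout stanza carries its own certificate.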
Hi, can anyone help me understand why the below search is not working?

index=aws sourcetype=aws:cloudtrail eventName=Create* OR eventName=Run* OR eventName=Attach* | stats count by src eventName | iplocation src

Thanks
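Two likely issues, offered as a guess: implicit AND binds tighter than OR, so without parentheses the search matches any Run*/Attach* event in any index; and iplocation runs after stats, so the location fields it adds never appear in the stats table. A hedged rewrite:

```
index=aws sourcetype=aws:cloudtrail (eventName=Create* OR eventName=Run* OR eventName=Attach*)
| iplocation src
| stats count by src eventName Country
```

Running iplocation before stats lets the geo fields (Country here; City, Region, etc. are also available) be part of the group-by.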
After testing with Siteimprove (an accessibility checker), I am facing this issue: Splunk is not allowing aria-label in the label field. We are using a Splunk Cloud instance. Below is the suggestion I get from Siteimprove:

"Text areas should always have a description that is explicitly associated with the area to make sure that users of assistive technologies will also know what the area is for. If the text area has a visible description indicating the purpose of the area, this description should be explicitly associated to the text area either as an HTML label or using the WAI-ARIA attribute aria-labelledby. If it is not possible to add a visible description, either add a mouseover text ('title' attribute) to the text area or create an invisible label using the WAI-ARIA attribute 'aria-label'."

Here is one way I tried, which gives the warning "Unknown attribute "for" for node "label"":

<input type="dropdown" token="id" id="search"> <label for="search">Search with ID</label>

and also this way, which gives the warning "Unknown attribute "aria-label" for node "input"":

<input type="dropdown" token="id" aria-label="search">

I get warnings for both. Can anyone suggest a fix for this issue?
Hi Splunkers, I need your help. I have a dbxquery like this:

| dbxquery connection=monsplunk query="select CREATED_BY, BILLER_CUST_ID, BILLER_NAME, TRX_AMT as AMOUNT from "APPDB"."TBL_APP_PYMT_PURCHASE""

and I want to filter the data where CREATED_BY is not in:

index=mobile_purchase_idx | fields CREATED_BY

I have tried using join type=inner, but the result still shows all data matching the index, not the negation. This is my query:

| dbxquery connection=monsplunk query="select CREATED_BY, BILLER_CUST_ID, BILLER_NAME, TRX_AMT as AMOUNT from "APPDB"."TBL_APP_PYMT_PURCHASE"" | join type=inner CREATED_BY [| search index=mobile_purchase_idx latest=now | fields CREATED_BY] | table CREATED_BY, BILLER_CUST_ID, BILLER_NAME
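An inner join keeps the matches, which is the opposite of "not in". One possible anti-join pattern is a left join plus a null check on a marker field (untested sketch against the poster's own dbxquery):

```
| dbxquery connection=monsplunk query="select CREATED_BY, BILLER_CUST_ID, BILLER_NAME, TRX_AMT as AMOUNT from \"APPDB\".\"TBL_APP_PYMT_PURCHASE\""
| join type=left CREATED_BY
    [ search index=mobile_purchase_idx
    | stats count by CREATED_BY
    | eval in_index=1
    | fields CREATED_BY in_index]
| where isnull(in_index)
| table CREATED_BY, BILLER_CUST_ID, BILLER_NAME
```

Rows whose CREATED_BY appears in the index get in_index=1 from the join, so isnull(in_index) keeps only the database rows with no match. An alternative that avoids join's subsearch limits is `| search NOT [search index=mobile_purchase_idx | dedup CREATED_BY | fields CREATED_BY]`.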
Hi All,   I have a query to extract the from the xml log data -------------- 2020-10-22 11:29:23,712 INFO (default-workqueue-73) [com.xyz.uyt.eds.building.documents.btw.house] ID-apprdhgr001-co-dmz-33780-17w876834676-5-w87434 building-documents-vic:pull-order-orchestration-v1 pull Order Response : <v1:SubmitOrderResponse xmlns:v1="http://www.xyz.com/services/building/documents/btw/v1" xmlns:xs="http://www.w3.org/2001/XMLSchema"> <v1:context> <v1:chnnlcd>06</v1:chnnlcd> <v1:user> <v1:memCode>5436</v1:memCode> <v1:brCode>7654</v1:brCode> <v1:usrCode>7634B</v1:usrCode> </v1:user> <v1:clientReferences> <v1:clientReference type="type0">1234 - amit</v1:clientReference> </v1:clientReferences> <v1:transaction> <v1:transactionDateTime>2020-10-22T11:29:00Z</v1:transactionDateTime> </v1:transaction> <v1:conversationId>2</v1:conversationId> </v1:context> <v1:documents> <v1:document sequenceNo="1"> <v1:identifer>67</v1:identifer> <v1:name>Reg Src Statement (Title)</v1:name> <v1:description/> <v1:organisations> <v1:authority> <v1:identifier>320</v1:identifier> <v1:name/> </v1:authority> </v1:organisations> <v1:fee>5.9</v1:fee> <v1:gst>0</v1:gst> <v1:deliveryFee>11</v1:deliveryFee> <v1:deliveryGst>1</v1:deliveryGst> <v1:recoupmentFee>5.9</v1:recoupmentFee> <v1:totalFeeExclGst>15.9</v1:totalFeeExclGst> <v1:totalGst>1</v1:totalGst> </v1:document> <v1:document sequenceNo="8"> <v1:identifer>75</v1:identifer> <v1:name>Index Search</v1:name> <v1:description/> <v1:organisations> <v1:authority> <v1:identifier>320</v1:identifier> <v1:name>Index Registry</v1:name> </v1:authority> </v1:organisations> <v1:fee>6.91</v1:fee> <v1:gst>0.63</v1:gst> <v1:deliveryFee>11</v1:deliveryFee> <v1:deliveryGst>1</v1:deliveryGst> <v1:recoupmentFee>0</v1:recoupmentFee> <v1:totalFeeExclGst>16.28</v1:totalFeeExclGst> <v1:totalGst>1.63</v1:totalGst> </v1:document> <v1:document sequenceNo="9"> <v1:identifer>238</v1:identifer> <v1:name>Plan Copy</v1:name> <v1:description/> <v1:organisations> <v1:authority> 
<v1:identifier>320</v1:identifier> <v1:name>Plan 8T6543</v1:name> </v1:authority> </v1:organisations> <v1:fee>5.84</v1:fee> <v1:gst>0</v1:gst> <v1:deliveryFee>11</v1:deliveryFee> <v1:deliveryGst>1</v1:deliveryGst> <v1:recoupmentFee>5.84</v1:recoupmentFee> <v1:totalFeeExclGst>15.84</v1:totalFeeExclGst> <v1:totalGst>1</v1:totalGst> </v1:document> <v1:document sequenceNo="10"> <v1:identifer>1</v1:identifer> <v1:name>=Tax Certificate</v1:name> <v1:description/> <v1:organisations> <v1:authority> <v1:identifier>34</v1:identifier> <v1:name>Revenue Cert</v1:name> </v1:authority> </v1:organisations> <v1:fee>20.27</v1:fee> <v1:gst>1.84</v1:gst> <v1:deliveryFee>11</v1:deliveryFee> <v1:deliveryGst>1</v1:deliveryGst> <v1:recoupmentFee>0</v1:recoupmentFee> <v1:totalFeeExclGst>28.43</v1:totalFeeExclGst> <v1:totalGst>2.84</v1:totalGst> <v1:serviceLevel> <v1:period>10</v1:period> </v1:serviceLevel> </v1:document> <v1:document sequenceNo="89"> <v1:identifer>80</v1:identifer> <v1:name>Information Certificate</v1:name> <v1:description/> <v1:organisations> <v1:authority> <v1:identifier>249</v1:identifier> <v1:name>Check</v1:name> </v1:authority> </v1:organisations> <v1:fee>35.690002</v1:fee> <v1:gst>3.24</v1:gst> <v1:deliveryFee>11</v1:deliveryFee> <v1:deliveryGst>1</v1:deliveryGst> <v1:recoupmentFee>0</v1:recoupmentFee> <v1:totalFeeExclGst>42.45</v1:totalFeeExclGst> <v1:totalGst>4.24</v1:totalGst> <v1:serviceLevel> <v1:period>5</v1:period> </v1:serviceLevel> </v1:document> <v1:document sequenceNo="26"> <v1:identifer>23</v1:identifer> <v1:name>Other Information</v1:name> <v1:description/> <v1:organisations> <v1:authority> <v1:identifier>698</v1:identifier> <v1:name>RIVER WATER</v1:name> </v1:authority> </v1:organisations> <v1:fee>87.04</v1:fee> <v1:gst>7.91</v1:gst> <v1:deliveryFee>11</v1:deliveryFee> <v1:deliveryGst>1</v1:deliveryGst> <v1:recoupmentFee>0</v1:recoupmentFee> <v1:totalFeeExclGst>89.13</v1:totalFeeExclGst> <v1:totalGst>8.91</v1:totalGst> <v1:serviceLevel> 
<v1:period>7</v1:period> </v1:serviceLevel> </v1:document> </v1:documents> <v1:details> </v1:details> </v1:SubmitOrderResponse>
-----------------
Similar lines are there in the log files, but the entries vary per record. Hence, from the output I need to extract:

<v1:fee>97.99</v1:fee> <v1:gst>8.9</v1:gst> <v1:deliveryFee>11</v1:deliveryFee> <v1:deliveryGst>1</v1:deliveryGst> <v1:recoupmentFee>0</v1:recoupmentFee> <v1:totalFeeExclGst>99.09</v1:totalFeeExclGst> <v1:totalGst>9.9</v1:totalGst>

and show them in tabular format for each memCode, brCode and usrCode. I tried using spath, xpath and xmlkv, but none of them work as expected. With xmlkv, it shows only the last value, or one value. xpath and spath are not extracting any data, as shown in my queries:

<main search> | xpath outfield=memCode "/v1:SubmitOrderResponse/v1:context/v1:user/v1:memCode" | table outfield, _time
<main search> | spath outfield=v1:memCode path=v1:SubmitOrderResponse.v1:context.v1:user.v1:memCode | xmlkv | table outfield, _time

Can you please advise.

Thanks, Amit
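One possible approach (untested sketch) is to first rex out just the XML payload, since the log line has a plain-text prefix that can confuse spath, then use spath with input= and mvexpand to get one row per document element:

```
<main search>
| rex "(?s)pull Order Response : (?<xml><v1:SubmitOrderResponse.*</v1:SubmitOrderResponse>)"
| spath input=xml path=v1:SubmitOrderResponse.v1:context.v1:user.v1:memCode output=memCode
| spath input=xml path=v1:SubmitOrderResponse.v1:context.v1:user.v1:brCode output=brCode
| spath input=xml path=v1:SubmitOrderResponse.v1:context.v1:user.v1:usrCode output=usrCode
| spath input=xml path=v1:SubmitOrderResponse.v1:documents.v1:document output=document
| mvexpand document
| spath input=document path=v1:fee output=fee
| spath input=document path=v1:gst output=gst
| spath input=document path=v1:deliveryFee output=deliveryFee
| spath input=document path=v1:deliveryGst output=deliveryGst
| spath input=document path=v1:recoupmentFee output=recoupmentFee
| spath input=document path=v1:totalFeeExclGst output=totalFeeExclGst
| spath input=document path=v1:totalGst output=totalGst
| table _time memCode brCode usrCode fee gst deliveryFee deliveryGst recoupmentFee totalFeeExclGst totalGst
```

The idea is that extracting the repeated v1:document subtree into a multivalue field and mvexpand-ing it gives one result row per document, which is what xmlkv could not do (it flattened everything into one event, keeping only one value per field name). The rex prefix "pull Order Response : " is taken from the sample log line and may need adjusting for other message formats.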