All Topics



Hello, I'd like to use some other shapefile or KML/KMZ geo files. Another team in my organisation is tasked with maintaining them and publishing them on WFS or WMS endpoints. I couldn't find any documentation to help me connect to geo files outside of the Splunk environment. Updating the lookup file is OK for testing, but not in a production environment. Could you give me some pointers? Is a WFS connection possible? Is there a doc I missed? Thanks, Eglantine
Could someone please explain the scenarios where having a data model would be important, rather than using reports? Until now I have been using scheduled reports to prepare data for use in dashboard visuals, but I came across data models and am not able to understand their point, since a reporting mechanism is already available.
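For context, the usual argument for data models over scheduled reports is that an accelerated data model can be queried with tstats by many different dashboards and ad hoc searches at once, whereas a scheduled report materialises one fixed result set. A minimal sketch, assuming a hypothetical accelerated data model named Web_Traffic with a root object web:

```
| tstats count FROM datamodel=Web_Traffic WHERE nodename=web BY web.status, _time span=1h
| rename web.status AS status
```

Any number of panels can run variations of this against the same acceleration summary, which is where a data model tends to beat a pile of scheduled reports.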
Hi all, I have 2 dashboards. The first dashboard has a table with 6 columns. Currently I am passing a value from the column "name" as a drilldown token to another dashboard. There is also a column "URL" associated with the "name". I want to pass the value of the URL of the selected name as a token to be used in the second dashboard, but I don't know how. I tried setting it like this, but it's not working:

<drilldown>
  <link target="_blank">/app/abcd/dashboard1?name=$row.PName$</link>
  <set token="tok_url">$row.URL$</set>
</drilldown>

Can anyone help me with this?
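One common pattern for this situation, sketched against the dashboard path from the question (the form.* token names on the target dashboard are an assumption), is to pass both values as URL parameters rather than using <set>, since tokens set inside a <drilldown> only exist on the current dashboard:

```xml
<drilldown>
  <!-- |u URL-encodes each value; the target dashboard picks the
       parameters up through inputs whose tokens are named
       "name" and "tok_url" -->
  <link target="_blank">/app/abcd/dashboard1?form.name=$row.PName|u$&amp;form.tok_url=$row.URL|u$</link>
</drilldown>
```

Note the target dashboard needs (possibly hidden) inputs with those token names for the form.* parameters to populate them.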
Hi, I want to extract a bytes field (using the bytes values) from this:

Sep 23 14:11:52 XXX.XXX.X.XX date=2021-09-23 time=14:11:52.004 device_id=FE-3KET123 log_id=6716781232 type=event subtype=smtp pri=information user=mail ui=mail action=NONE status=N/A session_id="47K0CjSc111111-47K0CjSc111111" msg="to=<XXXXXXXX@hotmail.com>, delay=00:00:04, xdelay=00:00:04, mailer=esmtp, pri=61772, relay=hotmail-com.olc.protection.outlook.com. [XXX.XX.XX.XXX], dsn=2.0.0, stat=Sent (<d97263bhagstbhbhet7c01f54636vfd37@GGP0HSDVVHHA9.XXX.XXX.XXX> [InternalId=32836723661134, Hostname=XXXXXXXXXX.namXXXX.prod.outlook.com] 71422 bytes in 0.303, 229.746 KB/sec Queued mail for delivery -> 250 2.1.5)"

I've already found the regex (?<bxmt>\d+) bytes but it doesn't seem to work correctly. Can anyone help?
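One way to make the capture more specific, assuming the target is the "71422 bytes in 0.303" fragment inside msg, is to anchor the regex on the literal text around the number:

```
... | rex "(?<bxmt>\d+) bytes in"
```

If the extraction is defined in props.conf rather than inline, the same pattern can go in an EXTRACT setting, e.g. EXTRACT-bxmt = (?<bxmt>\d+) bytes in (a sketch; the attribute suffix is arbitrary).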
Hi Community team, I have an issue whenever I enable this add-on on my Search Head, with the error below:

Problem replicating config (bundle) to search peer 'X.X.X.X:8089', Upload bundle="E:\Splunk\var\run\SPL-SH2-1630562214.bundle" to peer name=SPL-Ind3 uri=https://X.X.X.X:8089 failed; http_status=400 http_description="Failed to untar the bundle="E:\Splunk\var\run\searchpeers\SPL-SH2-1630562214.bundle". This could be due Search Head attempting to upload the same bundle again after a timeout. Check for sendRcvTimeout message in splund.log, consider increasing it.".

Health Check: One or more apps ("TA-microsoft-graph-security-add-on-for-splunk") that had previously been imported are not exporting configurations globally to system. Configuration objects not exported to system will be unavailable in Enterprise Security.

Note: we had already increased sendRcvTimeout in distsearch.conf on both SHs to 900, as per our requirements. We are using Splunk Enterprise 8.0.5 on premise with 2 SH (1 with ES), 3 IDX, 1 Deployment/MC, 1 LM, 1 HF. Has anyone experienced this issue, or successfully installed and used the add-on in your environment? Appreciate the feedback, thanks.
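Since the message itself suggests the bundle may be re-uploaded after a timeout, one thing worth double-checking alongside the timeout already raised is the bundle size limit; a sketch of the relevant distsearch.conf stanza on the search head (the values here are illustrative, not recommendations):

```
# distsearch.conf (search head)
[replicationSettings]
sendRcvTimeout = 900
# maximum bundle size, in MB, that will be replicated to peers
maxBundleSize = 2048
```

Large app content such as big lookups can also be excluded from the bundle with a replication blacklist/denylist stanza instead of shipping it to every peer.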
Hi, I have the log entry below; can you help with the regex to extract the line in red? The regex I have is not working properly in props.conf.

2021-09-23 19:03:40.802 INFO 1 --- [sdgfsdgsdfgsdfg] asdfasdfasdfasfasfgfdhdfhdf : Response --> { "claimId" : asfdasdfadf, "claimFilerId" : "sadfasdf", "vendorName" : "asfasfadfadf. ", "vendorId" : "aefadf", "vendorAddressId" : "asfafsd", "vendorAddress" : "sdfgsdgsfg", "preparedDate" : "09-22-2021", "receivedDate" : "09-22-2021", "enteredDate" : "09-22-2021", "assignedTo" : { "employeeId" : "sdfasdf ", "firstName" : "asfasf", "lastName" : "zsdfdf", "adUserIdentifier" : "zsdfvzdv" }, "correspondence" : { "type" : { "code" : 5947, "shortName" : "EOB", "longName" : "EOB" }, "dispatchCode" : { "code" : 5947, "shortName" : "NtRqd", "longName" : "Not Required" }, "emailAddress" : "abcd@g.com,       dgfh@a.in" }
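Rather than pulling fields out of that JSON one regex at a time, one approach (a sketch, assuming the payload always follows the literal Response --> marker) is to capture the whole JSON body and hand it to spath:

```
... | rex "(?s)Response --> (?<response_json>\{.*)"
    | spath input=response_json
```

The (?s) flag lets . match across the newlines of a multiline event; spath then creates dotted fields such as correspondence.emailAddress automatically.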
I want to count my AWS resources by region and show the result like a heatmap. How do I build a table where my account IDs go down the left, the AWS regions go across the top, and each cell is the count of events for that specific combination? There are some heatmap apps, but they require time to be on the x-axis, which is not what I want here.
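One way to shape the data, sketched with hypothetical index, sourcetype, and field names, is a stats count followed by xyseries to pivot the regions into columns:

```
index=aws sourcetype=aws:description
| stats count BY account_id, region
| xyseries account_id region count
| fillnull value=0
```

In the table formatting options you can then enable color scales on the numeric columns to get the heatmap effect without a separate app.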
Hi all, I'm trying hard to add data into Splunk from a .csv file instead of .json. I managed to convert it from .json to .csv, and now, when I try to alter the timestamp format using strptime(), Splunk shows the ingestion time, not the time from the time field inside the .csv, which is an epoch Unix timestamp. I have read this resource, https://docs.splunk.com/Documentation/SplunkCloud/8.2.2107/Data/Configuretimestamprecognition, but to no avail... Please advise.
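Assuming the epoch value lives in a CSV column named time, one sketch of the props.conf stanza (the sourcetype name is a placeholder) would be:

```
# props.conf
[my_csv_sourcetype]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = time
TIME_FORMAT = %s
```

%s is the strptime directive for epoch seconds; this needs to be in place before the file is (re)ingested, since timestamps are assigned at index time.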
I got the following error when setting up a data input in DB Connect:

java.lang.NullPointerException
    at java.net.URLDecoder.decode(Unknown Source)
    at com.splunk.dbx.utils.PercentEncodingQueryDecoder.decode(PercentEncodingQueryDecoder.java:20)
    at com.splunk.dbx.command.DbxQueryCommand.getParams(DbxQueryCommand.java:274)
    at com.splunk.dbx.command.DbxQueryCommand.generate(DbxQueryCommand.java:350)
    at com.splunk.search.command.GeneratingCommand.process(GeneratingCommand.java:183)
    at com.splunk.search.command.ChunkedCommandDriver.execute(ChunkedCommandDriver.java:109)
    at com.splunk.search.command.AbstractSearchCommand.run(AbstractSearchCommand.java:50)
    at com.splunk.search.command.GeneratingCommand.run(GeneratingCommand.java:15)
    at com.splunk.dbx.command.DbxQueryCommand.runCommand(DbxQueryCommand.java:256)
    at com.splunk.dbx.command.DbxQueryServer.lambda$handleQuery$1(DbxQueryServer.java:144)
    at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
    at java.util.concurrent.FutureTask.run(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.lang.Thread.run(Unknown Source)

Has anybody seen it before?
I have a log as below:

cod:5678,status:600
cod:9012,status:600
cod:1234,status:600
cod: 1234,status:900
cod:4987,status:600
cod:4987,status:900
cod:3655,status:600
cod:3655,status:900

I need a query that gives me this result:

cod    status1  status2
1234   600      900
5678   600
9012   600
4987   600      900
3655   600      900

How can I write a query for this? Thanks
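A sketch of one approach (the field names cod and status come from the sample; the extraction regex assumes the events really look like those lines):

```
... | rex "cod:\s*(?<cod>\d+),status:(?<status>\d+)"
    | stats values(status) AS status BY cod
    | eval status1=mvindex(status,0), status2=mvindex(status,1)
    | fields cod status1 status2
```

values() deduplicates and sorts, so with the sample data status1 comes out as 600 and status2 as 900 wherever both exist.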
Issue I'm facing: my use case is to detect a successful SSH login from an external IP address. I have my Linux logs in index=linux_logs. These logs have a field called "hostname"; "hostname" is sometimes an FQDN and sometimes an IP address. I have an asset list (lookup file), assets.csv. Not all of the FQDNs from the Linux logs are in this list. Here is my initial query:

index=linux_logs sourcetype=syslog exe="/usr/sbin/sshd" res=success NOT hostname=?
| stats count, min(_time) as first_time, max(_time) as last_time, values(dest) as dest, values(hostname) as src by acct
| lookup assets.csv dns AS src OUTPUT ip
| fillnull value=no_ip ip

A sample of the results:

acct    count  first_time         last_time          dest                 src                 ip
user1   50     epoch_time_format  epoch_time_format  host1.mycompany.com  src1.mycompany.com  10.36.25.14
user2   40     epoch_time_format  epoch_time_format  host3.mycompany.com  src3.mycompany.com  no_ip

I want to eliminate the RFC 1918 addresses and keep the "no_ip" values and the IPs outside of the RFC 1918 ranges. I do have a lookup for the RFC 1918 ranges, but I'm struggling with how to write the SPL to check the "ip" field for what I need. Any help is greatly appreciated.
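One way to do the filtering without the extra lookup, using SPL's built-in cidrmatch() over the three RFC 1918 blocks (a sketch appended to the query above):

```
... | where ip="no_ip"
        OR (NOT cidrmatch("10.0.0.0/8", ip)
            AND NOT cidrmatch("172.16.0.0/12", ip)
            AND NOT cidrmatch("192.168.0.0/16", ip))
```

cidrmatch(cidr, ip) returns true when the IP falls inside the block, so the NOTs keep only external addresses while the first clause preserves the no_ip rows.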
I have SAI 2.0.2 installed on my SHC and the SAI add-on on the indexer cluster. I went ahead and installed the full IT Essentials on the SHC and installed SA_IndexCreation on the indexer cluster per the instructions. So far I only see 2 entities in IT Essentials. The ironic thing is that the SAI app is showing many more. Is there anything I missed? Perhaps the SAI and IT Essentials for Work apps are in conflict with one another?
Hello, we are using inputs.conf and props.conf to ingest a flat CSV file. The issue we are having is that a -2 is being appended to the sourcetype name, even though the name is unique. Example: searching sourcetype=sourcetypename returns results under sourcetypename-2.

#inputs.conf
[monitor://C:\Import\sample.csv]
index = test
sourcetype = sourcetypename

#props.conf
[sourcetypename]
FIELD_DELIMITER = ,
CHECK_FOR_HEADER = true
HEADER_MODE = firstline

Any help would be appreciated!
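For what it's worth, a trailing -2 on a sourcetype is usually the signature of Splunk's automatic header-based sourcetype learning, which CHECK_FOR_HEADER = true enables. A sketch of a props.conf stanza that turns that off and uses structured-data extraction instead (same stanza name as above):

```
# props.conf
[sourcetypename]
INDEXED_EXTRACTIONS = csv
FIELD_DELIMITER = ,
CHECK_FOR_HEADER = false
```

With INDEXED_EXTRACTIONS = csv the first line is treated as the header by default, so the learned per-file sourcetypes should stop appearing.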
Hi there, I need to download a dashboard as a PNG file using the CLI; is there any way to do this? Splunk version: 8.2. Dashboard type: Dashboard Studio. Thank you in advance for your help.
Hi, I am asking whether it's possible to ingest log files where one log line contains a full date and time, and the following lines contain only a time, until the next entry with a full date and time. If we ignore the date entirely by using a custom timestamp format consisting only of %H:%M:%S, Splunk uses the creation date of the file plus the time pulled from the individual event. While that works without issues for files covering less than 24 hours, it fails for files containing more than 24 hours of data:

### Job STARTED at 2021/09/21 00:30:00
[INFO ] 00:30:01 This is a test message
[WARN ] 01:15:01 This is a warning message
### Job STARTED at 2021/09/22 06:10:00
[INFO ] 06:10:01 This is a test message
[WARN ] 07:11:00 This is a warning message

Regards
Here is a log example:

http://host/manager/resource_identifier/ids/getOrCreate/bulk?dscid=LuSxrA-1c42bb5b-f862-4861-892f-69320e1a59e7:200 Created:78

I need to extract the string after ids/ until the first ? or :, so the output would be getOrCreate/bulk. I am trying this:

rex field=log ":(?<url>ids\/[^?: ]*)"

What am I missing?
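For comparison, the posted pattern requires a literal : immediately before ids, but in the sample URL ids is preceded by a /, so the regex never matches. Dropping that colon and excluding ?, :, and whitespace from the capture (a sketch against the same log field):

```
... | rex field=log "ids/(?<url>[^?:\s]+)"
```

Against the example event this captures getOrCreate/bulk, stopping at the ?.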
Please help with SPL to find a list of my Splunk server instances, forwarders, and indexers. I need the Splunk version, machine names, and IPs. Thanks a million in advance. Also, what is the best order to upgrade them all to Splunk 8.2.2?
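As a starting point, the rest command can pull version and host details from every instance the search head can reach as a search peer (forwarders generally won't show up this way; if they phone home to a deployment server, that server's client list is the usual place to look for them):

```
| rest /services/server/info
| table splunk_server host_fqdn version server_roles
```

This is a sketch; exact field availability varies a little by Splunk version.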
Hi Splunkers, we have a multisite architecture in our organization which is running on Splunk version 7.3.6. This version is going to reach end of support after 22 October 2021. We are upgrading to version 8.1, but after the upgrade we are not able to log in to Splunk through the web UI. In the internal logs we are seeing this error message: "ERROR [614c8c71667f891f364ad0] config:140 - [HTTP 401] Client is not authenticated Traceback (most recent call last):". On the test environment, however, we did not get this issue. For logging in to Splunk we have both LDAP and IDM. All IDM roles are mapped and working fine with the existing version, and they also work on the test environment with version 8.1.
Our MPS team provisions IAM users for non-personalized access. All IAM access keys need to be rotated every 90 days, and MPS has created a rotation service which helps fulfil this requirement. This impacts Splunk, as the Key ID within Configuration - Account will need to be updated once a new key has been created. Is there a solution where Splunk can automatically update the Key ID value once a new access key has been created?
Hi, I have some saved searches that return results, but show no results when included in a dashboard as a panel:

<panel>
  <title>PAN - GlobalProtect session duration AVG</title>
  <table>
    <search ref="PAN - GLobalProtect session duration AVG last 7d"></search>
    <option name="drilldown">none</option>
  </table>
</panel>

But when I click on the magnifying glass in the dashboard, it runs the query in a search box and shows the correct results. What am I missing here? I checked the permissions on the saved search and everyone has read permission. Cheers