Activity Feed
- Got Karma for Why am I unable to index a file if the size is too small? Is there a minimum file size setting?. 12-27-2020 01:33 AM
- Got Karma for Column to row conversion. 06-05-2020 12:49 AM
- Got Karma for Why am I getting error "File will not be read, is too small to match seekptr checksum" trying to parse and index XML data?. 06-05-2020 12:48 AM
- Got Karma for Why am I getting error "File will not be read, is too small to match seekptr checksum" trying to parse and index XML data?. 06-05-2020 12:48 AM
- Got Karma for Is it possible to edit props.conf from Splunk Web?. 06-05-2020 12:47 AM
- Got Karma for Is it possible to edit props.conf from Splunk Web?. 06-05-2020 12:47 AM
- Got Karma for Is it possible to edit props.conf from Splunk Web?. 06-05-2020 12:47 AM
- Got Karma for SPLUNK Tableau no data. 06-05-2020 12:47 AM
- Posted Re: Compare two fields tables on Splunk Search. 10-06-2017 04:43 AM
- Posted Compare two fields tables on Splunk Search. 10-05-2017 08:05 PM
- Tagged Compare two fields tables on Splunk Search. 10-05-2017 08:05 PM
- Tagged Compare two fields tables on Splunk Search. 10-05-2017 08:05 PM
- Posted Column to row conversion on Splunk Search. 09-05-2017 08:20 PM
- Posted Invoking external Rest request using SPLUNK REST API on Deployment Architecture. 04-05-2017 05:38 PM
- Posted Re: SPLUNK Tableau no data on Getting Data In. 02-26-2016 08:41 AM
- Posted Why am I getting error "File will not be read, is too small to match seekptr checksum" trying to parse and index XML data? on Dashboards & Visualizations. 02-24-2016 02:24 PM
- Tagged Why am I getting error "File will not be read, is too small to match seekptr checksum" trying to parse and index XML data? on Dashboards & Visualizations. 02-24-2016 02:24 PM
- Tagged Why am I getting error "File will not be read, is too small to match seekptr checksum" trying to parse and index XML data? on Dashboards & Visualizations. 02-24-2016 02:24 PM
- Tagged Why am I getting error "File will not be read, is too small to match seekptr checksum" trying to parse and index XML data? on Dashboards & Visualizations. 02-24-2016 02:24 PM
- Posted Re: Is there a way to validate the URI for HTTP Event Collector? on Getting Data In. 02-23-2016 09:02 AM
10-06-2017
04:43 AM
Thanks, Sekar!
The first part of my command is a savedsearch which returns a table (a set of fields), and JobName is one of those fields. I tried table and fields to expose only the JobName field, something like this:
| savedsearch "XYZ" NOT [| inputlookup JobnamesAll.csv | fields jobnames]
but no luck so far.
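For reference, a minimal sketch of the pattern I was going for, still untested on my side, assuming the lookup field jobnames has to be renamed inside the subsearch to match the saved search field JobName:
| savedsearch "XYZ" | fields JobName | search NOT [ | inputlookup JobnamesAll.csv | fields jobnames | rename jobnames AS JobName ]
As written this would keep only successful jobs that are missing from the lookup; the opposite direction (lookup jobs with no successful run) is sketched under the original question below.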
10-05-2017
08:05 PM
I have a saved search which returns a list of successful job runs, e.g.:
jobname
A
B
C
D
I also have a lookup table with a list of all the jobs:
jobnames
1
A
2
B
8
C
X
5
I am looking for a way to identify which jobs were not successful. Can we achieve this in Splunk?
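For what it's worth, a minimal, untested sketch of one approach I am considering; the saved search name "XYZ" is a placeholder, and the lookup field jobnames is renamed to match the saved search field jobname:
| inputlookup JobnamesAll.csv
| rename jobnames AS jobname
| join type=left jobname [ | savedsearch "XYZ" | fields jobname | eval successful=1 ]
| where isnull(successful)
| table jobname
The idea is that rows with no match from the saved search (successful is null) would be the jobs that did not run successfully.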
09-05-2017
08:20 PM
1 Karma
Hi friends,
I am facing an issue where I have to consolidate and convert data from columns to rows. The sample data looks like this:
Datetime result duration Dash1 Dash2 Dash3
05-09-2017 20:45:11 SUCCESS 11.07 3.91 3.33 3.34
05-09-2017 20:44:01 SUCCESS 20.57 7.94 6.62 5.32
05-09-2017 20:43:06 SUCCESS 19.98 7.89 6.40 5.14
05-09-2017 20:42:05 SUCCESS 21.29 8.13 6.90 5.82
05-09-2017 20:41:18 SUCCESS 16.38 5.98 5.69 4.18
I need to change the data to this format:
Datetime result Name ResponseTime
05-09-2017 20:45:11 SUCCESS duration 11.07
05-09-2017 20:45:11 SUCCESS Dash1 3.91
05-09-2017 20:45:11 SUCCESS Dash2 3.33
05-09-2017 20:45:11 SUCCESS Dash3 3.34
05-09-2017 20:44:01 SUCCESS duration 20.57
05-09-2017 20:44:01 SUCCESS Dash1 7.94
05-09-2017 20:44:01 SUCCESS Dash2 6.62
05-09-2017 20:44:01 SUCCESS Dash3 5.32
Is there a way we can achieve this in Splunk?
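One direction I am wondering about is the untable command; a minimal, untested sketch, where the temporary field row is introduced only to carry Datetime and result through the transpose:
| table Datetime result duration Dash1 Dash2 Dash3
| eval row=Datetime."#".result
| fields row duration Dash1 Dash2 Dash3
| untable row Name ResponseTime
| eval Datetime=mvindex(split(row,"#"),0), result=mvindex(split(row,"#"),1)
| table Datetime result Name ResponseTime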
04-05-2017
05:38 PM
Hi Friends,
I have a requirement to collect some data from a RESTful service hosted in one of our cloud environments. This service offers monitoring data which I am planning to index in Splunk so that it can be used for monitoring, availability calculation, and capacity analysis.
I know Splunk offers a free REST API app, but the documentation refers to using the cURL command to leverage it. My question is:
Can we use the REST API to call services external to Splunk? If yes, what would be the best way to call the service with JSON requests?
- Tags:
- splunk-cloud
02-26-2016
08:41 AM
Splunk authentication from Tableau works fine. I've created a couple of saved searches in the global Search application and in custom applications; Tableau does not display either of them.
02-24-2016
02:24 PM
2 Karma
Hi friends,
I am trying to index some XML data (size ~2-3 MB) using Splunk. I've set up a data input to continuously monitor the file location. However, Splunk fails to index/parse any of the XML files. The following error is reported in splunkd.log:
ERROR TailReader - File will not be read, is too small to match seekptr checksum (file=\\share\Integration\partners\gateway\SD-882_834_990ABDX_20160219_085743.xml). Last time we saw this initcrc, filename was different. You may wish to use larger initCrcLen for this sourcetype, or a CRC salt on this source.
I am not sure about the cause. Has anyone faced this issue?
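For context, the error itself points at two settings in inputs.conf. A minimal, untested sketch of what I understand those to be, using the monitor path from the error message; the values are illustrative, not a confirmed fix:
[monitor://\\share\Integration\partners\gateway]
# Mix the full source path into the CRC so files with similar headers are not treated as already seen
crcSalt = <SOURCE>
# Use more of the file head when computing the initial CRC (the default is 256 bytes)
initCrcLen = 1024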
02-23-2016
09:02 AM
Great question! I've added the Token as a session variable in my Invoke-WebRequest command, and it is working now!
Here is the command for viewers' reference:
$Result = Invoke-WebRequest -URI $URL -Headers $Headers -Body $Body -Method POST -ContentType "Application/JSON" -SessionVariable "$Token"
Thank you Rich!
02-23-2016
08:27 AM
I am trying to use PowerShell to POST the event as JSON, but Invoke-WebRequest does not seem to work. Is there a way to validate the URI for the HTTP Event Collector to rule out the possibility of a wrong URI?
$Token = "B50D09B4-HG24-8N52-JN38-8h8ASD789998"
$URL = "https://localhost:8088/services/collector"
$Headers = @{"Authorization"=("Splunk " + $Token)}
$Body = '{"event" : "Hello !"}' | ConvertTo-Json
$Result = Invoke-WebRequest -URI $URL -Headers $Headers -Body $Body -Method POST -ContentType "Application/JSON"
02-21-2016
06:30 PM
1 Karma
Hi Friends,
I am using Tableau to connect to Splunk saved searches, using the ODBC driver with Tableau 9.1 for this connection. The Tableau user is able to authenticate against Splunk, but no data is displayed.
Initially I thought it was an account or permission issue, but I tried an admin account with no luck. Has anyone else faced a similar issue with Tableau?
Any suggestion is highly appreciated!
- Tags:
- connection
- tableau
02-10-2016
07:36 AM
3 Karma
Hi Friends,
I've added a custom application in Splunk which utilizes the LINE_BREAKER and SHOULD_LINEMERGE features of props.conf. The implementation works great on my development instance of Splunk.
I now have to create this application and add the line-merge logic on a Splunk Cloud instance. I need your help to understand:
Does Splunk offer a way to update an application-specific props.conf file from Splunk Web so that I can apply the LINE_BREAKER and SHOULD_LINEMERGE logic? If yes, please help me with the procedure, or let me know how to achieve this if creating and editing a file in the filesystem is the only option.
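For context, a minimal sketch of the kind of props.conf stanza I mean; the sourcetype name and regex below are placeholders, not my actual configuration:
[my_custom_sourcetype]
# Break events wherever a new line starts with a date, and skip the line-merge pass (illustrative regex)
LINE_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2}
SHOULD_LINEMERGE = false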
02-04-2016
08:40 AM
Hi Team,
I am designing a couple of dashboards which will be accessed by a large audience. The dashboards will be composed of related and individual searches, and there is a requirement for auto-refresh. I am currently using post-process searches, where a parent search can be reused multiple times.
To accommodate more charts in my dashboard, I am looking for an option (preferably a saved search) that runs on the scheduler so that the dashboard can use the pre-populated search results, provided this is possible in Splunk. This would point all user hits to a single search result.
Please let me know if this is possible and how to use the saved search results.
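For reference, a minimal sketch of the pattern I have in mind, assuming a scheduled saved search named "My Scheduled Search" owned by admin in the search app (all three names are placeholders); each panel would reuse its most recent scheduled result via loadjob instead of re-running the search:
| loadjob savedsearch="admin:search:My Scheduled Search"
The owner:app:name identifier would need to match wherever the saved search actually lives.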
02-03-2016
07:55 AM
1 Karma
Hi Friends,
I am facing an issue where Splunk does not index a file if its size is too small. The file sits in a UNC location (NAS), so I created a data input to monitor it. Splunk started indexing the file once its size increased (I believe to more than 800 KB).
I believe Splunk has a setting that determines the minimum file size to be indexed. Please help me find that setting and override it.
01-04-2016
05:15 PM
Dear Splunk experts,
I am working on parsing multiline custom application logs, where each log entry spans multiple lines with an invariable pattern including the case type, processing start time, SQL statements (multiple), the response time of each SQL statement, and the processing end time.
Please suggest the best practice for parsing this kind of log for field extraction.
Thanks
08-12-2015
10:45 AM
All,
This issue was due to a permission problem. The ID with which the DB monitor input was created did not have permission to write to the index. The issue was resolved by adding the correct permissions.
03-11-2015
06:17 AM
Hi All,
I am facing an issue with Splunk DB Connect. I was able to create a database connection (MSSQL) and add a database input. The database connection works, but the database input (DB Mon Tail) is not able to fetch data from SQL. Here is my inputs.conf file:
[script://.\bin\jbridge_server.py]
disabled = 0
[batch://$SPLUNK_HOME\var\spool\dbmon*.dbmonevt]
crcSalt =
disabled = 0
move_policy = sinkhole
sourcetype = dbmon:spool
[dbmon-tail://DB_MyUAT/Table_LOG]
host = localhost
index = mssql
interval = 1h
output.format = mkv
output.timestamp = 1
output.timestamp.column = List_CREATE_DATE
output.timestamp.format = "yyyy-MM-dd HH:mm:ss.SSS"
query = SELECT * FROM [DB_MyUAT].[Sch_UAT].[Table_LOG] {{WHERE $rising_column$ > ?}}
sourcetype = Table_LOG
tail.rising.column = List_CREATE_DATE
table = Table_LOG
I did not find anything wrong on the database side; I am able to run the same query in the database using the same account. I am not sure if I missed anything. Please advise.
02-25-2015
04:44 AM
Thanks! This is a very good explanation; I appreciate your help.
Is there any way to use the dboutput command to append to an existing table?
02-12-2015
04:31 AM
Hi,
We are looking to set up Hunk in our IT environment for log collection and trend analysis. However (as expected), the data volume and velocity are high. We are looking for a way to run Hunk interactive search queries on the raw data and store the processed results in a NoSQL database.
Is it possible for Hunk to store search results in a database (preferably NoSQL)? Please suggest the best way to store processed search results.
Gaurav