Getting Data In

What is the path where the logs are stored?

rosho
Communicator

Hi

I have set up a virtual machine because I do not want to mess with production servers.
Now, I want to use SFTP to send logs to this virtual machine.
Is there a path where the logs are stored? If so, what is it?
For example, this is the path where the .csv files are stored:

/opt/splunk/etc/users/<USER>/Splunk_ML_Toolkit/lookups/<FILE.csv>

Thank you.


Richfez
SplunkTrust

The path where the logs are stored can be anything you want it to be. It has only one rather obvious (if you think about it) requirement:

  1. The Splunk process has to be able to access that file, both path-wise and permissions-wise.

And that's it for the hard requirements.

BUT there's some complexity that is purely because of the example you gave. That file path is a lookup for the Splunk_ML_Toolkit. Why are you trying to SFTP logs to that path and file, or was it just an example? If you are trying to replace that file with something you are dropping onto the server with SFTP ... well, OK, but I just wanted to ask to make sure that's actually what you want to do.

If on the other hand it was just some random "I found this example somewhere" type thing, then no worries. Set your SFTP location to somewhere sane like /opt/data/some_subfolder or something (you'll have to make that directory structure), then have the input you build in the UI (or via conf files or whatever) point to that folder.
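
Something like this, for instance (the directory, index, and sourcetype names below are just examples):

mkdir -p /opt/data/sftp_drop
chown splunk:splunk /opt/data/sftp_drop

# $SPLUNK_HOME/etc/apps/<your_app>/local/inputs.conf
[monitor:///opt/data/sftp_drop]
index = main
sourcetype = my_sftp_logs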

If you were creating a lookup, but NOT that particular lookup, then reply back with what it is you are trying to do and maybe we can be more specific. In that case it may not be terrible to write the SFTPed file to somewhere like /opt/splunk/etc/apps/Splunk_ML_Toolkit/lookups/. (You probably wouldn't want to write it only to your own user folder's version; in fact, I'm not sure that would even work.)

Happy Splunking,
Rich


rosho
Communicator

Hi @rich7177

The file path I gave (/opt/splunk/etc/users/<USER>/Splunk_ML_Toolkit/lookups/) is only an example. This is what I am doing:

PRODUCTION SERVER
I use the command: | outputlookup FILE.csv
Then I go to the path /opt/splunk/etc/users/<USER>/Splunk_ML_Toolkit/lookups/ and use SFTP to send FILE.csv to my VM (to exactly the same path). This way I can read the file using | inputlookup FILE.csv when using the VM.
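
Roughly, the transfer step looks like this (the host name my-vm is a placeholder; an interactive sftp "put" does the same thing):

scp /opt/splunk/etc/users/<USER>/Splunk_ML_Toolkit/lookups/FILE.csv \
    splunk@my-vm:/opt/splunk/etc/users/<USER>/Splunk_ML_Toolkit/lookups/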

The problem is that this only sends the results of an SPL search (some fields of the log), not the whole log.

So, that is why I would like to know where the logs are stored on my production server. Then I will be able to SFTP some of them (not everything) to my VM.


Richfez
SplunkTrust

I think that makes it a bit clearer in many ways.

So you | outputlookup file.csv from your production server, and you want to copy this to a VM - which I assume is a test or development server? (Doesn't actually matter for this purpose, I just think that's a common scenario so I'm adding it to the answer).

The location of the file you write with | outputlookup file.csv is probably going to be the local app's lookup folder.

For instance, if I'm in the search and reporting app, then that process above would end up with a file in
/opt/splunk/etc/apps/search/lookups/file.csv

If I'm in an app called "mud_slinger", then it'll end up in /opt/splunk/etc/apps/mud_slinger/lookups/file.csv.
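
For instance, something like this (makeresults just fabricates a dummy row), run from the Search & Reporting app, should leave file.csv in /opt/splunk/etc/apps/search/lookups/:

| makeresults
| eval foo="bar"
| outputlookup file.csv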

If you aren't sure what app you are in ... you really should be - it's very important! But the shortcut at this time is to look at your URL. In the case above, the beginning of my URL was
http:///en-US/app/search/search?q=search ind...
The app name is the path segment right after /app/ - here, "search".

BUT. It is possible for it to go elsewhere. Now that I've written it with outputlookup, if I go to Settings, Lookups, then Lookup table files, I can find where that CSV ended up.

Probably the biggest non-obvious thing to note: if a lookup with that same filename already exists in another app and is shared globally, then outputlookup to that filename will instead overwrite that other file, in its other location, with your data. Even in that case, Settings, Lookups, Lookup table files will still tell you where it is, though you may have to change a few options at the top to see it.
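
If you'd rather check from SPL than from the UI, a search along these lines should list your lookup files and the paths they live at (a sketch using the lookup-table-files REST endpoint):

| rest /servicesNS/-/-/data/lookup-table-files
| table title eai:acl.app eai:data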

Hope this helps!


rosho
Communicator

@rich7177

Yes, it helps. In my case, I am using the MLTK; that is why I gave that path as an example.
But my question is: is there a PATH for the logs? Not the lookups - I already know where those are. I want the logs, so that when using the VM I will be able to use SPL to explore the data.


Richfez
SplunkTrust

Sure...

But which logs?

I mean, every log is, by definition, in a different place. The MLTK "logs" aren't usually logs; if I recall correctly, the MLTK usually uses | inputlookup filename.csv to get its data, so the "log" is just that - the lookup itself. There essentially is no "log" except those lookups.

BUT now that I think I know what you are trying to do -

If you wanted to do something in another Splunk instance to play around with this data, you have a couple of options.

a) Make a copy of those lookups into a lookup folder in an app on the new instance and use | inputlookup filename.csv on them, just like the MLTK does. Of course what you do AFTER that inputlookup can be entirely different.

b) OR make a copy of that .csv file into some other, non-lookup folder, then set up an input to read it from disk and load it into an index (a sketch follows below). I mean, they're just CSV files. From Splunk's point of view there's nothing magical about them. (Though because they're in the lookups folder of MLTK you can just do a | inputlookup filename.csv to read them, so I guess that's at least a little special.)
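
A minimal sketch of option (b), with example paths and names:

# $SPLUNK_HOME/etc/apps/<your_app>/local/inputs.conf
[monitor:///opt/data/csv_drop/filename.csv]
index = main
sourcetype = my_csv

# $SPLUNK_HOME/etc/apps/<your_app>/local/props.conf
# INDEXED_EXTRACTIONS turns the CSV header row into field names
[my_csv]
INDEXED_EXTRACTIONS = csv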

Make sense?


rosho
Communicator

When I use | outputlookup filename.csv, what I get is just a table built by my SPL search, not the log.

By "table" I mean this - I see only 2 values:

timestamp    devname
xxxx         xxxx

By "log" I mean this - I see everything:

timestamp=xxxxx devname=xxxxx devid=xxxx date=xxxx dstip=xx.xx.xx.xx srcport=55486 .......


Richfez
SplunkTrust

Please try

| inputlookup filename.csv

The command outputlookup takes "whatever search results you have" and writes them TO a lookup file. This is not what you need here.

Inputlookup reads that same file off disk and displays it, as if it's data. This is what you need.

At this point, if you've been doing outputlookup on filename.csv, it may have been overwritten. If so, it isn't a big deal - you can likely just reinstall the app, and it will rewrite those lookup files.

If you mean you use outputlookup in combination with some other commands and things, then please provide that entire SPL/command so we can see what it is you are doing.


Richfez
SplunkTrust

Let me try another way and sort of start over:

Option 1
You are trying to take the log file - by which you mean the data that feeds and drives some part of the MLTK - and copy that to another system.

Option 1, solution 1
I probably should have mentioned this before, but the EASY way to do this is to install the MLTK on the new VM server! Poof, there's that data.

But if that's just an example location and you want to know how to do this generally - then OK no worries, read on.

Option 1, solution 2
The log files that drive MLTK are mostly found as lookups. MLTK uses them like this: | inputlookup somefilename.csv | things | stuff | magic. It looks just like "real data" from a real log file - and in a way it IS real data from a real log file, just used in a different way. But the point is, it's just a CSV file. Copy it from its location on one system to another system, and it's still just a CSV file.

If you save that CSV file in a lookups folder in an app, it can be used like | inputlookup somefilename.csv | stuff | things ... - on that same system in another app, or on a different system in any app you want to put it in.
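
For example (the destination app is just an example; any app's lookups folder works):

cp /opt/splunk/etc/apps/Splunk_ML_Toolkit/lookups/somefilename.csv \
   /opt/splunk/etc/apps/search/lookups/
chown splunk:splunk /opt/splunk/etc/apps/search/lookups/somefilename.csv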

Option 1, solution 3
A CSV file saved somewhere OTHER THAN a lookup folder (e.g. not even inside splunk/etc or anything like that) can be read as an input and turned into events in an index. This is done by creating a regular file input, as mentioned earlier, and has nothing to do with either inputlookup or outputlookup. (With the exception of things like copying one lookup to another filename by doing something like | inputlookup filename1.csv | make changes like evals if desired | outputlookup filename2.csv.)
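
That copy-with-changes pattern, concretely (the eval here is just an example of a change):

| inputlookup filename1.csv
| eval copied_at=now()
| outputlookup filename2.csv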

Option 2 - a whole different thing
There is one other option - your examples were very specific, and you used certain very specific words, so I hadn't thought of this until now. If you mean "I have data in an index, I can read it from my index with, for instance, index=A sourcetype=B ..." and you want to have that data over on another server...

Option 2, solution 1
You can easily move (or copy) indexes between systems. It might take a while if they're big, but it's not hard. There are instructions for migrating indexes between locations on one server, but you'll have to adapt them to your needs. You don't even have to change $SPLUNK_HOME or $SPLUNK_DB or anything - you'll just want to copy the index files (after stopping Splunk) to your new server into the right locations, which ought to mostly match the source locations. Then start Splunk on each server. (Turns out you probably only need about two of the half-dozen steps - plus a new step to do the actual file copy to the new system.)
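
A rough sketch of the copy, assuming default paths and an example index name of my_index:

/opt/splunk/bin/splunk stop    # on both systems
rsync -a /opt/splunk/var/lib/splunk/my_index/ \
    splunk@my-vm:/opt/splunk/var/lib/splunk/my_index/
# make sure the VM's indexes.conf has a matching [my_index] stanza, then
/opt/splunk/bin/splunk start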

Option 3
Lastly, if you have log files that you've set up inputs for on your production system (like for instance some web logs, or some application's log files or something)...

Option 3, solution 1
well, just copy those log files to the new system and create a new input for them.
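
On the new system, the quickest way is the CLI (index and sourcetype are examples):

/opt/splunk/bin/splunk add monitor /opt/data/copied_logs -index main -sourcetype iis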

I hope that's clear - and be careful with words like "log file" and "index" or "lookup" and things like that. They all have very specific meanings, and maybe the only real problem we've had is a minor miscommunication about what is what...

-Rich


rosho
Communicator

@rich7177

Thank you for giving me a detailed answer.
It was all I needed. I have moved .csv, .json, and raw files, plus the CSV created using | outputlookup, using both Splunk Web and the CLI. With .json and raw, the results are different from using the original logs (e.g. IIS logs downloaded from the Windows server and uploaded through Splunk Web).
Do you know what "raw" is useful for?

I will also try moving indexes. And moving the files to Elasticsearch.
