Hi all,
I would like to create a simple Python script that forwards a syslog file from an Ubuntu VM to Splunk.
My syslog files are stored in a folder under /home on Ubuntu, and when the script runs it should take the file and forward it to Splunk, which is installed on the same Ubuntu VM.
My question is: which library and which function can I use to do this?
My big problem is that I can't install the Splunk SDK for Python. I have tried thousands of times, but I always hit the same installation problems, so I would like to know whether I can do this another way.
Thank you!
import splunklib.client as client

# Connect to the local Splunk management port (default 8089).
service = client.connect(host='localhost', port=8089,
                         username='admin', password='somepass')

# Attach a streaming socket to the "main" index.
myindex = service.indexes["main"]
mysocket = myindex.attach(sourcetype='myfile', host='myhost')

# Read the file, replace blank lines with a single space,
# and terminate every line with CRLF before sending.
file_data = ''
with open("foo.txt", "r") as lines:
    for line in lines:
        if line.isspace():
            line = ' '
        file_data += line
        file_data += '\r\n'
    # The attached socket expects bytes, so encode on Python 3.
    mysocket.send(file_data.encode('utf-8'))
mysocket.close()
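If the SDK installation really won't cooperate, you can also skip it entirely: Splunk can receive data on a TCP data input, and the standard library is enough to push a file at it. A minimal sketch, assuming you have already created a TCP input in Splunk (the port 1514 and the file path are just placeholders):
import socket

SPLUNK_HOST = "localhost"
SPLUNK_TCP_PORT = 1514   # placeholder: the port of your TCP data input

# Open the file and stream its raw bytes to the TCP input.
with open("/home/user/syslog.log", "rb") as f, \
        socket.create_connection((SPLUNK_HOST, SPLUNK_TCP_PORT)) as s:
    s.sendall(f.read())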
OK, both of them send data to Splunk, but the data is not what I want.
I send an XML file, but Splunk doesn't extract all the fields and doesn't read all the rows... For example, this is my syslog stored as XML:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="Microsoft-Windows-Security-Auditing" Guid="{54849625-5478-4994-A5BA-3E3B0328C30D}" />
    <EventID>4673</EventID>
    <Version>0</Version>
    <Level>0</Level>
    <Task>13056</Task>
    <Opcode>0</Opcode>
    <Keywords>0x8010000000000000</Keywords>
    <EventRecordID>232992271</EventRecordID>
    <Channel>Security</Channel>
    <Computer>FROSSI-LT.integrity.local</Computer>
  </System>
  <EventData>
    <Data Name="SubjectUserSid">S-1-5-21-1549169020-2314017464-2061785924-3556</Data>
    <Data Name="SubjectUserName">FRossi</Data>
    <Data Name="SubjectDomainName">INTEGDOM</Data>
    <Data Name="SubjectLogonId">0x2235d</Data>
    <Data Name="ObjectServer">Security</Data>
  </EventData>
</Event>
This is my event in Splunk:

1/29/15 2:14:20.000 PM
-  -   4673 0 0 13056 0 0x8010000000000000

and these are the fields Splunk extracted:
host = 127.0.0.1
source = http-stream
sourcetype = http-stream-too_small
Guid = {54849625-5478-4994-A5BA-3E3B0328C30D}
Name = Microsoft-Windows-Security-Auditing
index = main
linecount = 9
splunk_server = ubuntu
timestamp = none
xmlns = http://schemas.microsoft.com/win/2004/08/events/event
time = 2015-01-29T14:14:20.000+00:00
punct = -<="://../////">-<><="---"="{----}"_/>_<>
Instead I would like to get events that match the syslog file.
The code I provided above works perfectly for me.
And what kind of file do you send to Splunk?
Any text file. Have you actually tried the above code I posted yet?
Yeah, sure.
But when I send the file, Splunk doesn't extract all the fields, only a few, maybe 5 or 6.
OK, so are you talking about data uploading or field extraction now? I'm having a very hard time following you.
If you are talking about field extraction, then the docs are a useful start:
http://docs.splunk.com/Documentation/Splunk/latest/Knowledge/Managesearch-timefieldextractions
I'm very sorry!
Yes, my file is in Splunk, but it looks like one long string; it doesn't have all the fields it normally has when I use the Splunk forwarder to send the same files.
Are you using the EXACT code that I have posted above? Because I just tried it with multiple different text files, worked fine every time.
Yeah, the exact same code. I think it depends on the files... I think.
Email me your actual file.
Please email the actual file. You emailed me a copy/paste of the file contents.
Email attachments are your friend.
Why don't you use a Splunk Universal Forwarder to monitor the files and forward them to your Splunk indexer? It's a much more robust approach.
I have installed it, but I need to generate specific logs and send them to my Splunk.
Like "generate 5 529 logs and then 3 4020 logs and so on..."
In the end, I found out how to install the Python SDK, and I'm trying to send the data following "To add data directly to an index", but with this code:
myindex = service.indexes["test_index"]
uploadme = "/Applications/Splunk/README-splunk.txt"
myindex.upload(uploadme)
but it gives me this error:
    raise KeyError(key)
KeyError: UrlEncoded('test_index')
I have not found much documentation about this library...
It means that you need to create "test_index" inside Splunk before you send to it.
You cannot upload remote files to Splunk.
The upload method takes the path of a file that is already local to the Splunk instance.
You will need to write python code to read in the contents of the files.
Then use the submit or attach methods to send the data to Splunk.
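For example, here is a minimal sketch of that pattern, assuming the SDK is installed; the file path and sourcetype are placeholders, and it creates the index first so the KeyError above doesn't happen:
import splunklib.client as client

service = client.connect(host='localhost', port=8089,
                         username='admin', password='somepass')

# Create the index if it does not exist yet; looking up a missing
# index is what raises KeyError('test_index').
try:
    myindex = service.indexes["test_index"]
except KeyError:
    myindex = service.indexes.create("test_index")

# Read the local file and submit its contents as event data.
with open("/home/user/mylog.xml", "r") as f:
    myindex.submit(f.read(), sourcetype='my_xml', host='myhost')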
Still better to use a Universal Forwarder (UF):
1) You have some code, presumably a scheduled process, that periodically generates log files.
2) You configure the UF to monitor the directory where you write these log files.
Simple!
Thank you! OK, I can use submit or attach, I got it, but how can I write Python code to read in the contents of the files? I have an XML file saved in my home directory.
(a syslog file written as XML)
Something like this?
with open(filename) as f:
    s = f.read()
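Yes, that's the idea. Here is a minimal sketch putting the pieces together with attach(); the path, index name, and sourcetype are placeholders, and it assumes the index already exists:
import splunklib.client as client

service = client.connect(host='localhost', port=8089,
                         username='admin', password='somepass')
myindex = service.indexes["test_index"]

# Stream the XML syslog file line by line over the attached socket,
# which avoids holding the whole file in memory.
filename = "/home/user/mysyslog.xml"   # placeholder path
mysocket = myindex.attach(sourcetype='my_xml_syslog', host='myhost')
with open(filename, "rb") as f:
    for line in f:
        mysocket.send(line)
mysocket.close()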