Is there a way I can call a URL (https://who.is/whois-ip/ip-address/)
and pass it a parameter (54.174.106.18) so the URL will read: https://who.is/whois-ip/ip-address/54.174.106.18?
Then, from the resulting web page:
NetRange: 54.160.0.0 - 54.175.255.255
CIDR: 54.160.0.0/12
NetName: AMAZON-2011L
NetHandle: NET-54-160-0-0-1
Parent: NET54 (NET-54-0-0-0-0)
NetType: Direct Allocation
OriginAS:
Organization: Amazon Technologies Inc. (AT-88-Z)
RegDate: 2014-06-20
Updated: 2014-06-20
Ref: https://whois.arin.net/rest/net/NET-54-160-0-0-1
Get the Organization field and use it in a chart?
Hello @dbcase, welcome to Splunk Answers
Can you give us a bit more information on what you're trying to accomplish?
You technically could do this, assuming your trigger will be something that appears in the logs and sets off a script. For example: you install a Splunk forwarder on a remote machine that logs data, then you create a Splunk alert which monitors that server; if a keyword appears, the alert fires and executes a script. You can write a quick Python script to append the IP to the base URL and do a wget.
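A minimal sketch of such a script, assuming the alert passes the IP address as the first command-line argument (the argument-passing convention and file names here are illustrative, not Splunk-specific):

```python
#!/usr/bin/env python
# Sketch: append an IP to the who.is base URL and fetch the page,
# saving the response body to output.txt (wget-style).
import sys
import urllib.request

BASE_URL = "https://who.is/whois-ip/ip-address/"

def build_url(ip):
    """Append the IP address to the base whois URL."""
    return BASE_URL + ip

if __name__ == "__main__":
    ip = sys.argv[1]  # assumes the alert hands the IP in as the first argument
    url = build_url(ip)
    with urllib.request.urlopen(url) as resp:
        with open("output.txt", "wb") as f:
            f.write(resp.read())
```
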
Hi Skoelpin,
Sure! I have a query that pulls 404's and IP out of an access log. I would like to take the IP, do a whois, then get the Org so the users don't have to go look it up.
I just looked to see if who.is has an API, and they do, but the bad news is that you only get 500 free lookups before they start charging you. Here's what I would do:
1) Create a field extraction for that IP address
2) Create an alert on the 404's
3) Create an action on the alert which will execute a script
4) Save the field (IP Address) as a variable and pass that to your script
If you're using bash, your script will look something like this:
#!/bin/bash
# Base whois URL; the IP is passed in as the first script argument
whois="https://who.is/whois-ip/ip-address/"
ip="$1"
combine="${whois}${ip}"
wget "$combine" &> output.txt
You will need to put the script in splunk/etc/apps/myapp/bin/
and edit your app's commands.conf
with the stanza below
[MYSCRIPT]
type = shell
filename = ./myscript.sh
Wow, that is pretty slick! One thing I'm not clear on is how to extract the Organization field. Would that be in the output.txt that would need to be indexed into Splunk?
If this has helped you, can you please accept/upvote the answer?
Yeah, Splunk is great. I've been showing guys in different departments how powerful Splunk is and how much easier it is than just tailing the logs in a terminal window.
But anyways, you would want to do a field extraction. This will require you to write a regular expression to match the pattern and capture the value. That output.txt
in the script is just a Linux redirection that puts the whois response into a text file rather than showing it in the terminal window.
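As a sketch, a regular expression for pulling the Organization value out of whois output like the sample earlier in this thread could look like this (the pattern is one reasonable option, not the only one):

```python
import re

# Sample whois output, as shown earlier in the thread
sample = """NetRange: 54.160.0.0 - 54.175.255.255
Organization: Amazon Technologies Inc. (AT-88-Z)
RegDate: 2014-06-20"""

# Capture everything after "Organization:" up to the end of the line
match = re.search(r"Organization:\s*(.+)", sample)
org = match.group(1).strip() if match else None
```

The same pattern can be used in a Splunk field extraction by wrapping the capture group as a named group, e.g. `Organization:\s*(?<org>.+)`.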
Here's a write-up I did yesterday explaining how field extractions work:
So when extracting a permanent field, you could either use the built-in field extractor, which is kind of crappy, or you can write your own regular expression. It sounds like you've tried using the built-in field extractor. The reason I say it is crappy is because it builds a sloppy regular expression which does not work across the board. The point of a regular expression is to match patterns even though the value will vary.
If you had the following text and wanted to capture the value between the StatusCode tags, you would need to write a regular expression which captures the values between the tags. Also notice how the values vary (200, Yes, This is a Status Code):
<StatusCode>200</StatusCode>
<StatusCode>Yes</StatusCode>
<StatusCode> This is a Status Code</StatusCode>
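As a sketch, one regular expression that captures all three varying values (illustrative; other patterns would also work):

```python
import re

lines = """<StatusCode>200</StatusCode>
<StatusCode>Yes</StatusCode>
<StatusCode> This is a Status Code</StatusCode>"""

# Non-greedy capture of whatever sits between the tags,
# with \s* trimming any leading whitespace
values = re.findall(r"<StatusCode>\s*(.*?)</StatusCode>", lines)
```

Note how the same pattern matches all three events even though the captured values differ, which is exactly what the built-in extractor's sloppy regex tends to miss.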
If you used the Splunk built-in field extractor then it may only capture the first value but miss all the other ones. So in my opinion, it's better to write your own regular expression so you can capture 100% of the values. The way you can pick up regex is by going to www.regex101.com
and practicing. It took me about a month to get to a very skilled level.
So back to your question: after clicking Extract New Fields, you will be asked which sourcetype you want to use (if you have multiple sourcetypes; with only one sourcetype, it will skip this step). If you need to use a field over multiple sourcetypes, then you will need to extract a field for each sourcetype. After this step, there will be an option that says I'd prefer to write this regular expression myself.
Click this, enter in the regular expression, then hit preview. This will let you see what values were extracted. I like to click non-matches to see what didn't match (usually this part is blank since everything matched), then I click matches and scroll through a dozen events to make sure the right value was extracted. Then you hit save and go take a look at your new field:
https://answers.splunk.com/answers/439145/field-extraction-problem-1.html#comment-438343