Activity Feed
- Got Karma for Compare a date field with current date. 06-05-2020 12:46 AM
- Got Karma for Splunk for Active Directory - No incoming data from powershell source. 06-05-2020 12:46 AM
- Got Karma for Color in a table based on values. 06-05-2020 12:46 AM
- Got Karma for Color in a table based on values. 06-05-2020 12:46 AM
- Got Karma for Cisco IPS app not working. 06-05-2020 12:46 AM
- Got Karma for Cisco IPS app not working. 06-05-2020 12:46 AM
- Got Karma for Compare a date field with current date. 06-05-2020 12:46 AM
- Got Karma for Compare a date field with current date. 06-05-2020 12:46 AM
- Got Karma for Compare a date field with current date. 06-05-2020 12:46 AM
- Got Karma for Simple Bubble Chart. 06-05-2020 12:46 AM
- Got Karma for Compare a date field with current date. 06-05-2020 12:46 AM
- Got Karma for match an IP with a CIDR mask into a CSV file. 06-05-2020 12:46 AM
- Got Karma for match an IP with a CIDR mask into a CSV file. 06-05-2020 12:46 AM
- Got Karma for match an IP with a CIDR mask into a CSV file. 06-05-2020 12:46 AM
- Got Karma for match an IP with a CIDR mask into a CSV file. 06-05-2020 12:46 AM
- Got Karma for match an IP with a CIDR mask into a CSV file. 06-05-2020 12:46 AM
- Got Karma for Simple Bubble Chart. 06-05-2020 12:46 AM
- Got Karma for Simple Bubble Chart. 06-05-2020 12:46 AM
- Posted Splunk for Active Directory - No incoming data from powershell source on All Apps and Add-ons. 07-20-2013 06:29 AM
- Tagged Splunk for Active Directory - No incoming data from powershell source on All Apps and Add-ons. 07-20-2013 06:29 AM
07-20-2013
06:29 AM
1 Karma
Hi
I'm using the Splunk App for Active Directory; I've installed and configured it to get it running.
I receive data for CPU/RAM monitoring, general info, etc. in the three indexes msad, perform & winevents.
Unfortunately, I don't receive any information regarding the DC status/health.
I can see it's due to the search "index=msad source=powershell": I have never indexed any data with source=powershell in the msad index (only index=msad source=ActiveDirectory).
How could I check where the problem comes from? Is the script failing? Not being executed? Something else?
The GPO that runs the PowerShell script on my DCs is enabled.
I use one Splunk server with two Windows 2012 DCs.
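As a first sanity check, I assume I can list what actually lands in the msad index with a search like this (so far it only returns source=ActiveDirectory for me):
index=msad | stats count by host, source, sourcetype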
Some help would be appreciated 🙂
Thanks!
07-19-2013
04:03 AM
I'm facing the same issue. Any news on this?
07-18-2013
09:00 AM
Thanks for the link, knewter, I will have a look at it.
07-18-2013
08:46 AM
Well, I resolved my issue by copying the Windows TA files from the "default" folder to the "local" folder, especially inputs.conf.
The three indexes are now receiving data; I saw the number of indexed events grow from 0 to n.
However, there's still nothing in the Splunk App for Active Directory interface ...
I now get the message "no matching fields exist".
This is really frustrating.
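For the record, what I did on each DC was roughly this (same path as on my forwarders), followed by a restart of the forwarder service, and the same again for the other .conf files I needed:
copy "C:\Program Files\SplunkUniversalForwarder\etc\apps\Splunk_TA_windows\default\inputs.conf" "C:\Program Files\SplunkUniversalForwarder\etc\apps\Splunk_TA_windows\local\inputs.conf"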
07-18-2013
06:25 AM
Hello
I'm having an issue with the Splunk App for Active Directory.
All the data is being indexed into the main index, which makes the app unusable since it searches the indexes msad, perform and winevents.
I've installed the Windows TA on the monitored Windows servers and on the Splunk instance side.
I've used the latest version, downloaded here:
http://splunk-base.splunk.com/apps/28933/splunk-for-windows-technology-add-on
On the monitored Windows servers:
C:\Program Files\SplunkUniversalForwarder\etc\apps\Splunk_TA_windows
On the server side:
/opt/splunk/etc/apps/Splunk_for_ActiveDirectory/appserver/addons/TA-DomainController-NT6/local
With the inputs.conf file like this:
[WinEventLog:DFS Replication]
disabled=0
sourcetype="WinEventLog:DFS Replication"
index=winevents
queue=parsingQueue
#
# Application and Services Logs - Directory Service
#
[WinEventLog:Directory Service]
disabled=0
sourcetype="WinEventLog:Directory Service"
index=winevents
queue=parsingQueue
#
# Application and Services Logs - File Replication Service
#
[WinEventLog:File Replication Service]
disabled=0
sourcetype="WinEventLog:File Replication Service"
index=winevents
queue=parsingQueue
#
# Application and Services Logs - Key Management Service
#
[WinEventLog:Key Management Service]
disabled=0
sourcetype="WinEventLog:Key Management Service"
index=winevents
queue=parsingQueue
#
# Collect Replication Information
#
[script://.\bin\runpowershell.cmd ad-repl-stat.ps1]
source=Powershell
sourcetype=MSAD:NT6:Replication
interval=300
index=msad
disabled=false
#
# Collect Health and Topology Information
#
[script://.\bin\runpowershell.cmd ad-health.ps1]
source=Powershell
sourcetype=MSAD:NT6:Health
interval=300
index=msad
disabled=false
#
# Collect Site, Site Link and Subnet Information
#
[script://.\bin\runpowershell.cmd siteinfo.ps1]
source=Powershell
sourcetype=MSAD:NT6:SiteInfo
interval=3600
index=msad
disabled=false
#
# Perfmon Collection
#
[perfmon://Processor]
object = Processor
counters = *
instances = *
interval = 10
disabled = 0
index=perfmon
[perfmon://Memory]
object = Memory
counters = *
interval = 10
disabled = 0
index=perfmon
[perfmon://Network_Interface]
object = Network Interface
counters = *
instances = *
interval = 10
disabled = 0
index=perfmon
[perfmon://DFS_Replicated_Folders]
object = DFS Replicated Folders
counters = *
instances = *
interval = 30
disabled = 0
index=perfmon
[perfmon://NTDS]
object = NTDS
counters = *
interval = 10
disabled = 0
index=perfmon
#
# ADMon Collection
#
[script://$SPLUNK_HOME\bin\scripts\splunk-admon.path]
interval=3600
disabled=false
index=msad
#
# Subnet Affinity Log
#
[monitor://C:\Windows\debug\netlogon.log]
sourcetype=MSAD:NT6:Netlogon
disabled=false
index=msad
I do get data from the script execution, as I find these sourcetypes in the main index:
- WinEventLog:Security
- WinEventLog:System
- fs_notification
- WinEventLog:Application
- ActiveDirectory
- WinEventLog:Setup
I believe I've followed all the steps to install and configure the app from this tutorial, but it seems I've done something wrong ...
http://docs.splunk.com/Documentation/ActiveDirectory/latest/DeployAD/Deploymentprocess
I've already looked for my mistake, but without success.
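In case it's useful, I assume the effective inputs on the forwarder can be dumped with btool, something like this from the forwarder's bin directory:
splunk btool inputs list --debug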
Could someone help me troubleshoot this?
Thanks.
04-10-2013
04:52 AM
2 Karma
Hello
I'm trying to modify the text color in a table based on a field value.
Here's the table I display:
ScanName ScanStatus ScanDate
Scan1 Up to date Apr 01, 2013
Scan2 Up to date Apr 01, 2013
Scan3 Up to date Apr 01, 2013
Scan4 Not up to date Mar 01, 2013
I want to color the rows green where the ScanStatus field is "Up to date" and red when it is "Not up to date".
I applied the recommendations from the following topic but got no result:
http://splunk-base.splunk.com/answers/42994/advanced-xml-highlight-certain-values-in-a-table-not-numerical
The table is still white everywhere; maybe I'm doing something wrong, or it doesn't work on 5.0?
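In the meantime, the only search-side workaround I can think of is tagging each row with a colour field (just a sketch, not the CSS/JS approach from the link above):
... | eval StatusColor=case(ScanStatus=="Up to date","green", ScanStatus=="Not up to date","red") | table ScanName ScanStatus StatusColor ScanDate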
Could someone tell me what is currently the best way to implement this feature?
Still using .js/.css files, using Sideview Utils, or another solution?
Please let me know.
Thanks.
03-22-2013
08:05 AM
Well, the fillnull function worked; the transforms.conf modification did not.
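Roughly, the search now looks something like this (simplified, assuming the event field is also named N_vendor):
... | lookup vendor_bis N_vendor OUTPUT vendor | fillnull value="Others" vendor | chart count by vendor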
Thanks for the help, yannK!
03-22-2013
07:13 AM
Hi guys,
I'm using a lookup file matching a field that has dozens of possible values.
My goal is to make a chart with 5 columns: 4 for the main values and 1 aggregating all the others together.
Here's the lookup file
N_vendor,vendor
java,java
adobe,adobe
microsoft,microsoft
mozilla,mozilla
*,Others
Here's the transforms.conf
[vendor_bis]
filename = vendor_bis.csv
min_matches = 1
default_match = Others
case_sensitive_match = false
match_type = WILDCARD(N_vendor)
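To illustrate, a minimal way I exercise the lookup is something like this (base search simplified, assuming an explicit lookup call rather than an automatic one, and that the event field is also called N_vendor):
... | lookup vendor_bis N_vendor OUTPUT vendor | chart count by vendor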
The first 4 values from my lookup file appear on the chart, but not the last one, which uses a wildcard to match everything else as "Others".
Am I missing something?
Thanks.
12-14-2012
07:05 AM
Hello,
I would like to know if it's possible to configure, on a single Splunk forwarder, 2 distinct inputs and outputs.
Concretely, I would like to receive input logs coming from:
UDP:10001 and send them to TCP:10001 in output
UDP:10002 and send them to TCP:10002 in output
The only configuration I can see is centralizing the logs from the inputs into a single point and sending them to the outputs.
I want 2 distinct flows; is that possible?
It should look roughly like the sketch below.
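Here is what I have in mind, assuming a per-input _TCP_ROUTING setting in inputs.conf plus matching tcpout groups in outputs.conf (the receiving host name is just a placeholder):
inputs.conf on the forwarder:
[udp://10001]
_TCP_ROUTING = flow_10001
[udp://10002]
_TCP_ROUTING = flow_10002
outputs.conf on the forwarder:
[tcpout:flow_10001]
server = receiver.example.com:10001
[tcpout:flow_10002]
server = receiver.example.com:10002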
Thanks!
12-10-2012
01:15 AM
Great, thanks for the answer. I hadn't seen this feature in the documentation; that's exactly what I was looking for.
12-07-2012
04:49 AM
Hello,
Currently we're running about 30 scheduled saved searches on our Splunk server.
Processing these searches takes a lot of CPU and memory resources.
So I was wondering if there's a way to offload the search processing to another server, to make our Splunk server less loaded.
The goal would be for the Splunk server to only handle displaying the searches on the dashboards.
Thanks
12-04-2012
05:21 AM
Thanks for the feedback, Ayn.
Well, I tried the REGEX you suggested but it doesn't seem to work; I still have events with risk_rating lower than 75 being indexed.
The field looks like this in my events:
risk_rating="00"
I've only changed the REGEX value in the transforms.conf mentioned above.
That should be enough, right?
12-03-2012
06:47 AM
Ok thanks for the feedback.
Well, the possible values I want to exclude for this field go from 0 to 74.
If I add a regex covering these 75 possibilities, is there a performance impact on indexing the incoming data?
12-03-2012
06:20 AM
Hello,
I'm reading the documentation on filtering events before they get indexed, but I'm having trouble understanding how to do it.
I have events coming from one IP, and I don't want to index the events where the "risk_rating" field is lower than 75.
I really don't know how to do that with props.conf and transforms.conf.
props.conf:
[host::10.6.75.16]
TRANSFORMS-null= setnull
transforms.conf:
[setnull]
REGEX=????          # do I have to use the risk_rating field here?
DEST_KEY=queue
FORMAT=nullQueue
How can I say, in the transforms file, that all values under 75 for the "risk_rating" field should not be indexed?
It seems quite simple, but I really don't get it ...
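The only idea I have so far is a regex that enumerates the values from 0 to 74, something like this (just a guess, not tested):
[setnull]
REGEX = risk_rating="(\d|[0-6]\d|7[0-4])"
DEST_KEY = queue
FORMAT = nullQueue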
Thanks.
11-23-2012
01:05 AM
Hello
I'm trying to figure out how I could move specific data from an index called index1 to another one called index2.
Let's say I have all these kinds of events in my index1:
Events kind 1
Events kind 2
Events kind 3
Events kind 4
I want to move all the events of kinds 3 & 4 to index2 and delete them from index1.
So the final result would be:
Index1 :
Events kind 1
Events kind 2
Index2 :
Events kind 3
Events kind 4
Is there a way to do that?
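In pseudo-SPL, what I'm after would be something like this (hypothetical eventtype names, and I have no idea whether collect plus delete is the proper way):
index=index1 (eventtype=kind3 OR eventtype=kind4) | collect index=index2
index=index1 (eventtype=kind3 OR eventtype=kind4) | delete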
Thanks.
10-10-2012
06:20 AM
2 Karma
Hello
I'm having trouble getting the Cisco IPS app to work under Splunk.
I got it working the first time, correctly indexing the IPS logs.
I registered a lot of script entries under the setup menu of the Cisco IPS app.
I tried to delete the wrong ones, but I was unable to because I got an error message every time.
So I decided to uninstall the app by removing the Splunk_CiscoIPS folder under $SPLUNK/etc/apps/ and restarting Splunk, to make a fresh install.
I also deleted the CiscoIPS folder I found under $SPLUNK/etc/users/%user%/.
I did a fresh install, and now I'm unable to get the IPS events after going through the setup.
Here's the log I have in $SPLUNK/var/log/splunk/sdee_get.log:
Wed Oct 10 15:00:22 2012 - INFO - No exsisting SubscriptionID for host: 1.2.3.4
Wed Oct 10 15:00:22 2012 - INFO - Attempting to connect to sensor: 1.2.3.4
Wed Oct 10 15:00:22 2012 - INFO - Successfully connected to: 1.2.3.4
Wed Oct 10 15:00:22 2012 - INFO - Checking for exsisting SubscriptionID on host: 1.2.3.4
Wed Oct 10 15:00:22 2012 - INFO - No exsisting SubscriptionID for host: 1.2.3.4
Wed Oct 10 15:00:22 2012 - INFO - Attempting to connect to sensor: 1.2.3.4
Wed Oct 10 15:00:22 2012 - INFO - Successfully connected to: 1.2.3.4
Wed Oct 10 15:00:23 2012 - ERROR - Connecting to sensor - 1.2.3.4: HTTPError: HTTP Error 401: Unauthorized
Wed Oct 10 15:00:24 2012 - ERROR - Connecting to sensor - 1.2.3.4: HTTPError: HTTP Error 400: Bad Request
Wed Oct 10 15:05:23 2012 - INFO - Checking for exsisting SubscriptionID on host: 1.2.3.4
Wed Oct 10 15:05:23 2012 - INFO - No exsisting SubscriptionID for host: 1.2.3.4
Wed Oct 10 15:05:23 2012 - INFO - Attempting to connect to sensor: 1.2.3.4
Wed Oct 10 15:05:23 2012 - INFO - Successfully connected to: 1.2.3.4
Wed Oct 10 15:05:24 2012 - ERROR - Connecting to sensor - 1.2.3.4: HTTPError: HTTP Error 401: Unauthorized
Wed Oct 10 15:05:25 2012 - INFO - Checking for exsisting SubscriptionID on host: 1.2.3.4
Wed Oct 10 15:05:25 2012 - INFO - No exsisting SubscriptionID for host: 1.2.3.4
Wed Oct 10 15:05:25 2012 - INFO - Attempting to connect to sensor: 1.2.3.4
Wed Oct 10 15:05:25 2012 - INFO - Successfully connected to: 1.2.3.4
Wed Oct 10 15:05:26 2012 - ERROR - Connecting to sensor - 1.2.3.4: HTTPError: HTTP Error 400: Bad Request
It seems like my credentials are incorrect, but I have already tried creating another account, without success.
Do you have any idea?
Thanks.
- Tags:
- cisco
09-20-2012
03:58 AM
I tried the following in my search and it works now:
eval epochevent=strptime(N_patch, "%Y/%m/%d") | eval epoch30daysago=relative_time(now(), "-30d@d" ) | where epoch30daysago>=epochevent
Thanks for your help!
09-19-2012
05:55 AM
5 Karma
Hello,
I have some events in Splunk whose dates I would like to compare with the current date minus 30 days.
I want to extract all the events that are older than 30 days.
The date field in the events has this form: Date="2012-09-24", i.e. %Y-%m-%d.
How could I get the current date in my search and compare it with the date field?
I suppose using epoch values, as proposed here, could be a solution once the current date is obtained:
http://splunk-base.splunk.com/answers/37272/compare-two-date
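To make it concrete, I imagine something along these lines with the Date field above (untested):
... | eval event_epoch=strptime(Date, "%Y-%m-%d") | eval cutoff=relative_time(now(), "-30d@d") | where event_epoch < cutoff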
Thanks.
09-04-2012
02:26 AM
Well, I hadn't seen this page, thanks.
But I want to do this on a Splunk Universal Forwarder, not a heavy forwarder, which I guess is a physical Splunk appliance, correct?
09-03-2012
07:43 AM
Hello,
Here's the situation.
I have a device sending 2 kinds of events via UDP syslog to a Splunk forwarder, which then sends them to a Splunk server over TCP.
I would like to filter events on the Splunk forwarder, with the outputs.conf or inputs.conf files, to keep only 1 kind of log.
I've seen this is possible directly on the Splunk server, but I want to minimize the impact on bandwidth and avoid sending useless logs for nothing.
Is there a way to do that via a regex or a specific character in the event?
Thanks.
07-26-2012
01:55 AM
Hello,
I have a question regarding the time field indexed by Splunk when an event is received on the Splunk server.
I would like to index and use the timestamp present in the logs I get from multiple sources.
All those logs are stored in the default DB.
There are 3 kinds of timestamps present in the 3 different log sources, which look like this:
2012-07-25T08:07:30
1343250669001 => This is epoch time
Jul 23 12:09:43
An eventtype has been created for each of the 3.
Splunk is currently indexing these logs with the time they were received on the Splunk server.
The goal is to search these events in Splunk using the time present in the log files.
I tried to follow the instructions on this page, but it doesn't seem to work; I'm pretty sure I'm doing something wrong:
http://docs.splunk.com/Documentation/Splunk/4.3.3/Data/Configuretimestamprecognition
Here's the first entry I made in the props.conf file:
[EVENT_Spyware]
TIME_PREFIX = (?i) .*?="(?P<Spyware>\d+\-\d+\-\d+\w+:\d+:\d+)\w+"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S
TZ = Europe/Paris
TRANSFORMS-Virus = Spyware
Could someone help, please?
Thanks.
07-13-2012
04:58 AM
Thanks for the answer, Lamar.
I tried with the following XML code, but when I load the view I get a 404 error. (The XML tags were stripped when I posted; the remaining values are the view label "Advanced view", the saved search name "Search_Name", and useHistory set to False.)
I'm almost sure this is a syntax issue, because when I replace the module with a sample from another module, it works fine.
07-12-2012
06:07 AM
Hello,
Actually, I'm wondering how to do that too, so I'm allowing myself to post on this topic.
I tried adding the lines for the HiddenSavedSearch module you previously mentioned, Lamar, but nothing happens.
I'm not sure how and where I should put these lines; I did it in my dashboard XML file like this:
<dashboard>
<label>My_dashboard_label</label>
<module name="HiddenSavedSearch">
<param name="savedSearch">My_Saved_Search_Name</param>
<param name="useHistory">False</param>
</module>
Whatever value I set for the useHistory parameter among those mentioned in the Splunk doc (False, True, None, ...), nothing happens.
Could someone confirm how to do it?
Thanks.