Activity Feed
- Karma Re: How to use a different field other than _time to group events based on a desired time interval (e.g. 1 week) for DalJeanis. 06-05-2020 12:49 AM
- Got Karma for Why are the EventType Reference Missing for wineventlog-dns in Splunk App for Windows Infrastructure version 1.4.2 and 1.4.3?. 06-05-2020 12:49 AM
- Karma Re: How to adjust Radial Gauge numeric range within a dashboard for mporath_splunk. 06-05-2020 12:47 AM
- Karma Re: iframes and views broken after Splunk 6 upgrade for hexx. 06-05-2020 12:46 AM
- Posted Issue with Setup page being grayed out for each field on All Apps and Add-ons. 04-04-2020 04:23 PM
- Tagged Issue with Setup page being grayed out for each field on All Apps and Add-ons. 04-04-2020 04:23 PM
- Posted Re: In the Splunk Add-on for Unix and Linux, can you help me with a Splunk_TA_nix Issue with transform? on All Apps and Add-ons. 01-29-2019 05:30 PM
- Posted Which is the best way to get a count of events indexed for the entire environment? on Splunk Search. 01-29-2019 01:03 PM
- Tagged Which is the best way to get a count of events indexed for the entire environment? on Splunk Search. 01-29-2019 01:03 PM
- Posted In the Splunk Add-on for Unix and Linux, can you help me with a Splunk_TA_nix Issue with transform? on All Apps and Add-ons. 10-09-2018 01:07 PM
- Posted Re: Best way to publish a live dashboard that does not allow the logged in user to search other data, and has a persistent login. on Splunk Search. 09-17-2018 01:00 PM
- Posted Best way to publish a live dashboard that does not allow the logged in user to search other data, and has a persistent login. on Splunk Search. 08-27-2018 09:51 AM
- Tagged Best way to publish a live dashboard that does not allow the logged in user to search other data, and has a persistent login. on Splunk Search. 08-27-2018 09:51 AM
- Posted Re: Trying to run searches in Search and Reporting on PAN logs but getting issues on All Apps and Add-ons. 06-04-2018 10:46 AM
- Posted Trying to run searches in Search and Reporting on PAN logs but getting issues on All Apps and Add-ons. 05-31-2018 02:42 PM
- Tagged Trying to run searches in Search and Reporting on PAN logs but getting issues on All Apps and Add-ons. 05-31-2018 02:42 PM
- Posted Re: How to get license usage data from monitoring console to searchhead? on All Apps and Add-ons. 05-30-2018 03:11 PM
Topics I've Started
04-04-2020
04:23 PM
Unable to initialize modular input "web_input" defined in the app "website_input": Introspecting scheme=web_input: script running failed (exited with code 1)..
01-29-2019
05:30 PM
I deleted that part of the config file as a resolution.
01-29-2019
01:03 PM
1. | eventcount summarize=false | stats sum(count)
OR
2. (per https://docs.splunk.com/Documentation/Splunk/7.2.3/Troubleshooting/Aboutmetricslog)
index=_internal source=*metrics.log group=thruput | stats sum(ev)
My use case is currently the last 30 days, and I am getting very different results. The first search appears to be identical to the dashboard shown in the "What to Search" section on the Search & Reporting home page.
I'm inclined to believe the metrics.log search is more accurate, given that I can't apply a time modifier to the first search command, | eventcount.
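A third approach, sketched below, sidesteps that limitation: tstats counts indexed events directly and honors an explicit time range. This isn't from the original thread; index=* and the 30-day window are assumptions to adjust for your environment.

| tstats count WHERE index=* earliest=-30d@d latest=now BY index
| addcoltotals labelfield=index label=TOTAL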
- Tags:
- splunk-enterprise
10-09-2018
01:07 PM
The TA is overriding the host_segment I set on my monitor input. Is there a purpose for this?
Splunk_TA_nix/default/transforms.conf:FORMAT = host::ACME-001
Splunk_TA_nix/default/transforms.conf:FORMAT = host::ACME-002
Splunk_TA_nix/default/transforms.conf:FORMAT = host::ACME-003
Splunk_TA_nix/default/transforms.conf:FORMAT = host::ACME-004
props.conf-###### Global ######
props.conf-[source::...(linux.*|sample.*.linux)]
props.conf:TRANSFORMS-force_host_for_linux_eventgen = force_host_for_linux_eventgen
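An earlier entry in this feed notes the eventual fix was deleting that part of the default config. A less intrusive option, sketched below, would be blanking the transform class in a local override so upgrades don't restore it (this is the generic local-override pattern, not the add-on's documented procedure):

# Splunk_TA_nix/local/props.conf
[source::...(linux.*|sample.*.linux)]
# an empty value disables the host-forcing transform defined in default/props.conf
TRANSFORMS-force_host_for_linux_eventgen =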
09-17-2018
01:00 PM
Cusello, which step ensures that the user stays logged in persistently? Or is there no need to log in if we run through these steps?
08-27-2018
09:51 AM
I'm trying to put a dashboard on a TV in a high-traffic hallway, where passersby aren't allowed to search the other information in the index.
1) How can I publish a live dashboard that doesn't give someone the ability to click the magnifying glass and search the rest of the data, or manipulate the search or dashboard panels in any way?
2) How can I set up a persistent login for one service account?
Thanks
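For the first question, one common approach is a dedicated, locked-down role for the TV's service account plus a view-only dashboard app. The authorize.conf sketch below is an assumption about how that could look, not a tested recipe; the role name, index, and search filter are placeholders:

# authorize.conf on the search head
[role_tv_kiosk]
importRoles = user
# limit the account to only the data the dashboard actually needs
srchIndexesAllowed = tv_dashboard_index
srchIndexesDefault = tv_dashboard_index
srchFilter = sourcetype=tv_dashboard_data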
06-04-2018
10:46 AM
I've checked the TA and it's up to date. I've checked the /local directory and there isn't anything in it.
[splunk@server Splunk_TA_paloalto]$ find ./* -type d -name local
./local
[splunk@server Splunk_TA_paloalto]$ for h in `find ./* -type d -name local`; do ls -larth $h; done
total 4.0K
-rw------- 1 splunk splunk 21 May 10 11:00 app.conf
drwxr-xr-x 11 splunk splunk 237 May 10 11:00 ..
drwx------ 2 splunk splunk 22 May 10 11:00 .
[splunk@server Splunk_TA_paloalto]$
05-31-2018
02:42 PM
Error in 'SearchParser': The search specifies a macro 'session' that cannot be found. Reasons include: the macro name is misspelled, you do not have "read" permission for the macro, or the macro has not been shared with this application. Click Settings, Advanced search, Search Macros to view macro information.
| tstats summariesonly=t prestats=t latest(_time), values(log.log_subtype), values(log.severity), values(log.app), values(log.user), values(log.threat_name), values(log.file_name), values(log.file_hash), values(log.url), values(log.dest_name), count FROM datamodel="pan_firewall" WHERE (nodename="log.threat" OR nodename="log.wildfire.malicious") log.action="" GROUPBY sourcetype `session` log.direction log.action
| tstats summariesonly=t prestats=t append=t latest(_time), values(log.log_subtype), values(log.severity), values(log.threat_name), values(log.user), count FROM datamodel="pan_firewall" WHERE nodename="log.correlation" log.action="" GROUPBY sourcetype log.serial_number log.log_subtype log.client_ip log.action
| tstats summariesonly=t prestats=t append=t latest(_time), values(log.log_subtype), values(log.severity), values(log.file_name), values(log.file_hash), values(log.user), values(log.threat_name), count FROM datamodel="pan_endpoint" WHERE nodename="log.attacks" log.action="" GROUPBY sourcetype log.log_subtype log.client_ip log.action
| tstats summariesonly=t prestats=t append=t latest(_time), latest(log.incident_id), values(log.log_subtype), values(log.app), values(log.user), values(log.threat_name), values(log.client_ip), count FROM datamodel="pan_aperture" WHERE nodename="log.incident" GROUPBY sourcetype log.threat_name log.file_name
| fillnull value="" log.client_ip log.server_ip log.serial_number log.session_id log.direction log.action log.file_name log.threat_name
| stats latest(_time) AS _time, latest(log.incident_id) AS log.incident_id, values(log.log_subtype) AS log.log_subtype, values(log.severity) AS log.severity, values(log.app) AS log.app, values(log.user) AS log.user, values(log.threat_name) AS log.threat_name_values, values(log.file_name) AS log.file_name_values, values(log.client_ip) AS log.client_ip_values, values(log.file_hash) AS log.file_hash, values(log.url) AS log.url, values(log.dest_name) AS log.dest_name, count BY sourcetype `session` log.direction log.action log.file_name log.threat_name
| rename log.* AS *
| fillnull value="high" severity
| eval action=if(action=="", "allowed", action)
| eval severity=case(severity=="critical","critical", severity=="high","high", severity=="medium","medium", severity=="low","low", severity=="informational","informational", sourcetype=="pan:aperture","high")
| eval victim_ip=if(direction=="" OR direction=="client-to-server", if(server_ip!="",server_ip,client_ip), client_ip)
| eval file_name=if(file_name=="", file_name_values, file_name)
| eval threat_name=if(threat_name=="", threat_name_values, threat_name)
| eval client_ip=if(client_ip=="", client_ip_values, client_ip)
| lookup minemeldfeeds_lookup indicator AS client_ip OUTPUT value.autofocus_tags AS client_autofocus_tags
| lookup minemeldfeeds_lookup indicator AS server_ip OUTPUT value.autofocus_tags AS server_autofocus_tags
| lookup minemeldfeeds_lookup indicator AS file_hash OUTPUT value.autofocus_tags AS file_autofocus_tags
| lookup minemeldfeeds_lookup indicator AS url OUTPUT value.autofocus_tags AS url_autofocus_tags
| lookup minemeldfeeds_lookup indicator AS dest_name OUTPUT value.autofocus_tags AS domain_autofocus_tags
| eval autofocus_tags=mvappend(client_autofocus_tags,server_autofocus_tags,file_autofocus_tags,url_autofocus_tags,domain_autofocus_tags)
| eval time_in_seconds=_time
| eval drilldown_token=case(sourcetype=="pan:endpoint","endpoint_event", sourcetype=="pan:aperture","aperture_event", true(),"network_event")
| search severity=critical action=allowed latest=-5d
| table _time log_subtype threat_name severity action app client_ip server_ip user file_name session_id serial_number drilldown_token victim_ip time_in_seconds autofocus_tags incident_id sourcetype
| eval autofocus_tags=mvdedup(autofocus_tags)
| sort -_time
I've tried adding
[]
export = system
in both the Splunk TA and the Splunk app to no avail. I also acknowledge that $app$/metadata/default.meta says not to export to system.
I need these searches to work in Search and Reporting because I'm building a dashboard with an array of searches from different applications.
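Since the error is specifically about macro permissions, one thing worth trying (a sketch of the standard metadata override, not something confirmed in this thread) is sharing just the 'session' macro globally via local.meta instead of exporting the whole app:

# Splunk_TA_paloalto/metadata/local.meta
[macros/session]
export = system
access = read : [ * ]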
05-30-2018
03:11 PM
I came up with my own solution, but I'm open to new ideas. I added the deployment server (which is also my license master and monitoring console) as a search peer. Then I used this search:
| rest splunk_server=deploymentserver /services/licenser/pools
| search [rest splunk_server=deploymentserver /services/licenser/groups | search is_active=1 | eval stack_id=stack_ids | fields stack_id]
| join type=outer stack_id [rest splunk_server=local /services/licenser/stacks | eval stack_id=title | eval stack_quota=quota | fields stack_id stack_quota]
| stats sum(used_bytes) as used max(stack_quota) as total
| eval usedGB=round(used/1024/1024/1024,3)
| eval totalGB=round(total/1024/1024/1024,3)
| eval gauge_base=0
| eval gauge_danger=totalGB*0.8
| eval gauge_top=totalGB+0.001
| gauge usedGB gauge_base gauge_danger totalGB gauge_top | eval
If you don't know the server name, you can replace deploymentserver with * and it will query all search peers; then you can look at the splunk_server field values to see which servers are available.
05-30-2018
02:39 PM
I have a distributed Splunk environment and my deployment server is where my monitoring console for the environment resides.
I have a search head cluster where I'm putting up a dashboard with a variety of searches/reports/etc. I want to show the license information from the monitoring console on a dashboard on my search heads, but the macros from the monitoring console app, along with the data, all live on my deployment server.
What's the best way to get the license information I get from this search on the deployment server over to my search head cluster?
`dmc_licensing_base_summary(deployment.company,"")` | `dmc_licensing_summery_no_split(deployment.company, dmc_licensing_stack_size_srch, deployment.company, "", "")`
I've tried using this search but the results are off by like 10-15%:
| savedsearch instrumentation.licenseUsage | spath date
05-10-2018
02:24 PM
I was upgrading from 6.6.2 to 7.0.3.
There was nothing in splunkd.log or the web logs.
05-10-2018
02:22 PM
I had the same issue. I had to look at the crash log and found (in hex) that there was a duplicate HEC (HTTP Event Collector) key in an app, so in summary, an app was the culprit. You can back up all your apps and then either remove them all and add them back one at a time, restarting Splunk each time, or leave them all in place and delete them one by one, trying to start Splunk after each.
This is the process I went through, and it's also the approach Splunk recommends: verify that all apps work on a Splunk dev server before upgrading prod.
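A rough sketch of that bisection process on Linux (paths assume a default /opt/splunk install; "suspect_app" is a placeholder for whichever app you're testing):

cp -r /opt/splunk/etc/apps /opt/splunk/etc/apps.bak    # back everything up first
mkdir -p /tmp/removed_apps
mv /opt/splunk/etc/apps/suspect_app /tmp/removed_apps/
/opt/splunk/bin/splunk restart
# repeat one app at a time until Splunk starts cleanly (or the crash returns)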
05-10-2018
02:15 PM
Hey Maweyandt, I've run into an array of issues with the Splunk web server not coming up myself. Some tips I'd recommend:
1) Always run Splunk as the splunk user.
2) Make sure the splunk user owns all files in Splunk home (usually /opt/splunk).
3) If you're running an upgrade and it won't come up, make a copy of all of your apps, remove them, and try starting it. If it starts, you know an app is the culprit, and you'll have to add/remove them one by one until you find it.
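A quick sketch of tips 1 and 2, assuming Splunk home is /opt/splunk and the service account is named splunk:

sudo chown -R splunk:splunk /opt/splunk
sudo -u splunk /opt/splunk/bin/splunk start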
05-08-2018
02:46 PM
Solved the problem:
Install https://splunkbase.splunk.com/app/1477/ on the universal forwarders for DCs, print servers, etc.
Install https://splunkbase.splunk.com/app/3208/ on the UFs above, as well as on the indexers and search heads.
05-07-2018
11:56 AM
Is there a way to generate 1 alert for the first time a user logs into something?
I've been thinking through this all morning and came up with a potential way to go about it: I sort my search by _time so that the first event is at the top, and remove all other "duplicate" events for the email (username) field. If someone knows how to generate one, and only one, alert per unique event specific to the raw data, that could potentially work too.
index=myindex "actor.email"=* "events{}.name"=login_success | bucket _time span=1d | iplocation ipAddress | stats count by _time,"actor.email","events{}.name",ipAddress,City,Region,Country | where count=1 | sort _time | dedup actor.email
I recognize that the bucket _time is useless in this case ;).
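One alternative sketch, not from this thread, is the usual "first seen" pattern: keep the earliest login per user and alert only when that first login falls within the last day. It assumes the search window reaches back far enough to cover each user's prior history.

index=myindex "actor.email"=* "events{}.name"=login_success
| stats earliest(_time) AS first_seen BY "actor.email"
| where first_seen >= relative_time(now(), "-1d@d")
| convert ctime(first_seen)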
Thanks,
BG
- Tags:
- alerts
- automation
04-18-2018
04:18 AM
I'm currently not using indexer clustering. I'm on all-flash storage and I'm looking into increasing the speed of searches over some of my larger indexes, which are about 4-5 TB and growing. When I run a one-day search it is uber slow, while CPU and memory utilization sit at 10-50%. Would indexer clustering, with attention to the search factor, help me? What should I aim to do to increase search speed? I know there are customers out there with petabytes of ingestion daily, and I'm betting they don't wait a year for a 15-minute search to complete.
Thanks
- Tags:
- splunk-enterprise
03-28-2018
01:19 PM
1 Karma
Error Running in Search and Reporting App:
[indexer1.domain.com] Eventtype 'wineventlog-dns' does not exist or is disabled.
[indexer2.domain.com] Eventtype 'wineventlog-dns' does not exist or is disabled.
Extra info: I'm in a distributed Splunk environment. However, I'm not using indexer clustering.
Grepped eventtypes:
./apps/splunk_app_windows_infrastructure/default/eventtypes.conf
[splunk@splunk-dev-1 etc]# vi apps/splunk_app_windows_infrastructure/default/eventtypes.conf
[msad-dns-events]
search = ((eventtype=wineventlog_application OR eventtype=wineventlog_system OR eventtype=wineventlog_security) (Type=Warning OR Type=Error) DNS) OR (eventtype=wineventlog-dns (Type=Warning OR Type=Error))
Full EventTypes File:
--- AD and DNS eventtypes
[msad-account-lockout]
search = eventtype=wineventlog_security EventCode=4740
[msad-account-unlock]
search = eventtype=wineventlog_security EventCode=4767
[msad-ad-access]
search = eventtype=wineventlog_security EventCode=4662
[msad-admin-audit]
search = (eventtype=msad-group-changes OR eventtype=msad-groupmembership-changes OR eventtype=msad-computer-changes OR eventtype=msad-user-changes OR eventtype=msad-account-lockout OR eventtype=msad-account-unlock) user!="*$" src_user!="*$"
[msad-anomalous-events]
search = eventtype=msad-security-anomalous-events OR eventtype=msad-dirsvcs-anomalous-events
[msad-computer-changes]
search = eventtype=wineventlog_security (EventCode=4741 OR EventCode=4742 OR EventCode=4743)
[msad-dirsvcs-anomalous-events]
search = eventtype=wineventlog-ds (Type=Error OR Type=Warning OR EventCode=1458)
[msad-disabled-logons]
search = eventtype=wineventlog_security EventCode=4625 (Status=0xC000006E OR Status=0xC0000072 OR Status=0xC0000193)
[msad-dns-events]
search = ((eventtype=wineventlog_application OR eventtype=wineventlog_system OR eventtype=wineventlog_security) (Type=Warning OR Type=Error) DNS) OR (eventtype=wineventlog-dns (Type=Warning OR Type=Error))
[msad-failed-computer-logons]
search = eventtype=wineventlog_security EventCode=4625 user="*$"
[msad-failed-user-logons]
search = eventtype=wineventlog_security (EventCode=4625 OR ((EventCode=4768 OR EventCode=4771 OR EventCode=4776) Keywords="Audit Failure")) user!="*$"
[msad-group-changes]
search = eventtype=wineventlog_security (EventCode=4727 OR EventCode=4730 OR EventCode=4731 OR EventCode=4734 OR EventCode=4735 OR EventCode=4737 OR EventCode=4744 OR EventCode=4745 OR EventCode=4748 OR EventCode=4749 OR EventCode=4750 OR EventCode=4753 OR EventCode=4754 OR EventCode=4755 OR EventCode=4758 OR EventCode=4759 OR EventCode=4760 OR EventCode=4763 OR EventCode=4764)
[msad-groupmembership-changes]
search = eventtype=wineventlog_security (EventCode=4728 OR EventCode=4729 OR EventCode=4732 OR EventCode=4733 OR EventCode=4746 OR EventCode=4747 OR EventCode=4751 OR EventCode=4752 OR EventCode=4756 OR EventCode=4757 OR EventCode=4761 OR EventCode=4762)
[msad-password-changes]
search = eventtype=wineventlog_security (EventCode=4723 OR EventCode=4724)
[msad-rep-errors]
search = eventtype=wineventlog-ds (EventCode=1014 OR EventCode=1083 OR EventCode=1084 OR EventCode=1203 OR EventCode=1307 OR EventCode=1308 OR EventCode=1311 OR EventCode=1566 OR EventCode=1699 OR EventCode=1800 OR EventCode=1801 OR EventCode=1865 OR EventCode=1925 OR EventCode=1926 OR EventCode=1988 OR EventCode=2087 OR EventCode=2088)
[msad-security-anomalous-events]
search = eventtype=wineventlog_security (Type=Error OR Type=Warning OR EventCode=512 OR EventCode=513 OR EventCode=516 OR EventCode=517 OR EventCode=1100 OR EventCode=1101 OR EventCode=1102 OR EventCode=1104 OR EventCode=4609 OR EventCode=4612 OR EventCode=1621)
[msad-successful-computer-logons]
search = eventtype=wineventlog_security EventCode=4624 user="*$"
[msad-successful-user-logons]
search = eventtype=wineventlog_security EventCode=4624 user!="*$"
[msad-user-changes]
search = eventtype=wineventlog_security (EventCode=4720 OR EventCode=4722 OR EventCode=4724 OR EventCode=4725 OR EventCode=4726 OR EventCode=4738 OR EventCode=4767 OR EventCode=4781 OR EventCode=4912) user!="*$"
Windows eventtypes
[eventlog_Update_Successful]
search = sourcetype="*:System" "Installation Successful"
[eventlog_Update_Failed]
search = sourcetype="*:System" "Installation Failure"
[updatelog_Update_Successful]
search = sourcetype="WindowsUpdateLog" "Content Install" "Installation Successful"
[updatelog_Update_Failed]
search = sourcetype="WindowsUpdateLog" Failure "Content Install" "Installation Failure"
[Update_Successful]
search = eventtype=eventlog_Update_Successful OR eventtype=updatelog_Update_Successful
[Update_Failed]
search = eventtype=eventlog_Update_Failed OR eventtype=updatelog_Update_Failed
[Key_Events_On_Hosts]
search = \
( \
sourcetype="WinEventLog*" OR sourcetype="XmlWinEventLog*" \
(Type="*Error*" OR Type="*Fail*" OR EventCode=1074 OR EventCode=19 OR \
EventCode=20 OR EventCode=21 OR Eventcode=1001) \
) OR \
( \
sourcetype="WindowsUpdateLog" (status="installed" OR \
status="failure" OR status="restart required") \
)
description = This event type identifies key events on a host machine running Windows. \
Key events are defined as most commonly occurring scenarios on a Windows host that \
may require immediate attention or may explain issues seen on a host. hence the \
events collected by this event type include the following: \
1. Windows event log events from the System, Application or Security event log \
that indicate errors. \
2. Windows system event log event with event code 1074 for system reboots. Event codes \
600*s although useful in detecting the reboots of the event log itself are useful, \
they have been known to be noisy in reality and so excluded from this event type. \
3. Windows update event log events with event codes 19, 20 or 21 for installs, \
failures and required reboot scenarios. \
4. Windows update log actions indicating install, failure or install requiring reboots. \
Note: since event types dont support append command, we OR the source types.
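Since the errors say the eventtype isn't defined anywhere, one workaround is to define wineventlog-dns yourself in a local override. This is purely a sketch; the sourcetype shown is an assumption, so point it at wherever your Windows DNS server logs actually land:

# splunk_app_windows_infrastructure/local/eventtypes.conf
[wineventlog-dns]
search = sourcetype="WinEventLog:DNS Server"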
"
... View more
03-08-2018
03:14 PM
I'm using a REST API broker platform and it's running Splunk searches. All of the searches I've run come out exactly the same way they do in the Splunk UI, except when I try to run an ldapsearch.
Command being sent:
| ldapsearch search="(&(objectclass=user)(objectcategory=user)
(proxyAddresses=smtp:person@domain.com))"
| search userPrincipalName
| table fieldICareAbout | rex mode=sed field=fieldICareAbout "s/(.*\/)//"
I recognize that this command has to be the first command, so how do I send a command like this via the API if "search" is prepended to every search?
I get this FATAL error:
messages: [
{
type: FATAL,
message: Error in 'ldapsearch' command: This command must be the first command of a search.
}
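For what it's worth, when submitting the search to splunkd's REST endpoint directly, keeping the leading pipe is what marks it as a generating command so nothing needs to be prepended. A sketch with curl (host and credentials are placeholders; whether your broker platform lets you control the raw search string is a separate question):

curl -k -u admin:changeme https://splunkserver:8089/services/search/jobs \
  --data-urlencode search='| ldapsearch search="(&(objectclass=user)(objectcategory=user)(proxyAddresses=smtp:person@domain.com))" | table userPrincipalName'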
03-06-2018
02:43 PM
I appreciate you sharing your experience, livehybrid. I've shared your response with my team.
We have some very strong network engineers and they're hoping to get more details from your experience. What kind of failures were you seeing when you tried using a load balancer?
Looking at F5, they've said there's a few different ways to configure the load balancers, from sticky connections to round robin, and details around the state of the connections.
They're alluding to plenty of room for error on the load balancer configuration side, but without knowing the details, they're likely going to go down the "trust but verify" route.
03-06-2018
12:38 PM
Thanks for the response, Starcher. This will be for my endpoints on the internet to send their data internally. I'm intending to put Universal Forwarders in the DMZ to relay the data to our internal network.
With that said, my load balancer is my front end. Are you suggesting giving external IP addresses to each UF/HF I put in the DMZ?
03-05-2018
01:05 PM
iframes are disabled by default after a certain version of Splunk (I believe 6.x). This link explains it and provides the solution:
https://answers.splunk.com/answers/104277/iframes-and-views-broken-after-splunk-6-upgrade.html
03-05-2018
01:01 PM
Thanks, I'll submit the P4 enhancement request. Fortunately, I realized that the split between numbering schemes was operating-system based (OSX vs. Windows clients). I was able to sort correctly by running two searches, one per OS, and sorting each separately.
I can confirm it was lexicographical ordering vs semantic versioning.
03-05-2018
12:00 PM
This is the sheet they sent me from F5:
Type | Measurement
---|---
Traffic Processing | L7 requests per second
Traffic Processing | L4 connections per second
Traffic Processing | L4 HTTP requests per second
Traffic Processing | Maximum L4 concurrent connections
Traffic Processing | Throughput (Gbps) (I know that I'll need to calculate this)
Hardware Offload | SSL/TLS: TPS (transactions per second)
03-05-2018
11:52 AM
Hello,
My networking team is curious about Transactions Per Second (TPS), and they have explained that by that they mean that they want to know how many connections will be made per second.
I haven't seen any information on whether the Universal Forwarder (UF) establishes a persistent connection, or whether it only connects when it needs to send data and/or check in.
Is there any information on whether it is a persistent connection, or on how many connections per second it makes?
We're currently on Splunk 6.6.2.
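Not an authoritative answer, but my understanding is that a forwarder holds one TCP connection open to an indexer and rotates to a new target periodically rather than reconnecting per event. The outputs.conf sketch below just shows the knobs that govern that behavior (server names are placeholders; verify the defaults against your version's docs):

# outputs.conf on the forwarder
[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997
# seconds between the forwarder closing its connection and switching to another indexer
autoLBFrequency = 30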
Thanks
BG
02-27-2018
09:38 PM
Hi,
I'm dealing with dotted version numbers, trying to determine the latest version of some software, but the sort botches the digits after the first decimal point.
SoftwareVersion
5.0.1450.509
5.0.1450.8 <----Example
4.2.1330.31
I've tried stats list(field) and it still didn't provide the sorting I was looking for. Thoughts? Please don't suggest an eval field separation per decimal 😉