All Topics

We have to forward some data from a Splunk Heavy Forwarder to a third-party syslog server. This is possible as indicated here: https://docs.splunk.com/Documentation/Splunk/8.0.6/Forwarding/Forwarddatatothird-partysystemsd

The challenge is to select only some files from a particular host and forward only the logs that contain a particular string. Here is what we were able to achieve (basically 2 rules out of 3, i.e. some files that contain a particular string); I don't know if it is possible to also add a reference to the host somehow. Do you know if it is feasible?

outputs.conf
[syslog:syslog_target]
type = udp
server = 111.222.333.444:514

props.conf
[source::/path/of/myfile/*filename.log]
TRANSFORMS-syslog_forward = syslog_forward_rule

transforms.conf
[syslog_forward_rule]
REGEX = www\.mywebsite\.com
DEST_KEY = _SYSLOG_ROUTING
FORMAT = syslog_target

Thanks a lot,
Edoardo
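A hedged sketch of one way to bring the host into the selection: a props.conf stanza can be keyed on host:: instead of source:: (a single props stanza matches on one or the other, not both at once), while the string match stays in the transform. The host name below is a placeholder:

```ini
# props.conf -- stanza keyed on the sending host (placeholder name)
[host::myhost]
TRANSFORMS-syslog_forward = syslog_forward_rule

# transforms.conf -- unchanged: REGEX runs against _raw by default
[syslog_forward_rule]
REGEX = www\.mywebsite\.com
DEST_KEY = _SYSLOG_ROUTING
FORMAT = syslog_target
```

If the source-path restriction must be kept as well, it may be easier to narrow it at the input stage (e.g. whitelist in inputs.conf) rather than in the same props stanza.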
Hi, I am relatively new to Splunk and looking for a solution to get a time difference from a sample log like this:

"attribute1::d1 a2::d2, expectedUpdateDate::2020-09-30 23:30:00, ActualUpdateDate::10/1/20 5:44 PM, CreatedDate::9/30/20 10:14 AM"

I need to print both ActualUpdateDate - CreatedDate and expectedUpdateDate - CreatedDate in seconds. Experts, could you please help me here?
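A sketch of one way to do this in SPL, assuming the three timestamps always appear in this order in the event (the rex pattern and strptime formats are assumptions to adapt to the real data):

```spl
| makeresults
| eval _raw="attribute1::d1 a2::d2, expectedUpdateDate::2020-09-30 23:30:00, ActualUpdateDate::10/1/20 5:44 PM, CreatedDate::9/30/20 10:14 AM"
| rex "expectedUpdateDate::(?<expected>[^,]+),\s*ActualUpdateDate::(?<actual>[^,]+),\s*CreatedDate::(?<created>.+)"
| eval expected_e = strptime(expected, "%Y-%m-%d %H:%M:%S")
| eval actual_e = strptime(actual, "%m/%d/%y %I:%M %p")
| eval created_e = strptime(created, "%m/%d/%y %I:%M %p")
| eval actual_minus_created_secs = actual_e - created_e
| eval expected_minus_created_secs = expected_e - created_e
| table actual_minus_created_secs expected_minus_created_secs
```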
Hey all, just need a sanity check: I would like to migrate a summary index located on a standalone search head to a clustered index on my indexers. This came up after setting up the monitoring console in distributed mode and running a health check. How would I do this? I have a feeling that an scp of the locally indexed data to an indexer wouldn't replicate the data evenly (unless Splunk figures this out and does some magic). One idea was to push a new index via the CM and change the reports to use the newly pushed index, although that would require some dashboard modifications, since this summary index is used in our email dashboard; the old data would just be sitting there, and I'd like to have as few indexes as possible to follow best practices. I had a couple of steps written down, but I'd like to get confirmation before I give it a go:
1. Create a new index (with a new name) on the CM and push it to the indexers
2. Stop the local summary index on the SH
...
Thanks for your help!
Hi, does the entire "Splunk Add-on for Microsoft Windows" need to be pushed to forwarders in order to enable forwarding of WinEventLogs? On Linux, I'm sure that the "Splunk Add-on for Unix and Linux" is needed, because all inputs are scripted and the scripts come with the TA, but I am not sure about Windows. Would it be possible to just push an app that houses a simplistic inputs.conf that enables forwarding of selected/whitelisted EventCodes? Or does the inputs.conf need to come with the entire Splunk Windows TA?
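As a hedged sketch: the WinEventLog input type is built into the Windows Universal Forwarder itself rather than shipped as a script with the TA, so a minimal self-made app could in principle carry just an inputs.conf like the following (app name and event codes are illustrative assumptions):

```ini
# my_wineventlog_inputs/local/inputs.conf
[WinEventLog://Security]
disabled = 0
# forward only selected event codes (illustrative list)
whitelist = 4624,4625,4688
```

Note the full TA is still commonly deployed anyway for its search-time field extractions.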
Any idea how to write a query to capture the first packet of the reconnaissance from the vulnerability scanner that was recorded?
We install the Splunk forwarder through Ansible Tower; while executing enable boot-start we are facing the following error.

OS: CentOS Linux release 7.8.2003 (Core)
UF: Linux version 7.2.3
Command used: /app/splunk/splunkforwarder/bin/./splunk enable boot-start -user awsadmin

Error:
execve: No such file or directory while running command /sbin/chkconfig

I tried as the root user as well, but still face the error. I tried all the answers from the Splunk community, but nothing worked for me. Please help: why is this issue coming up at all, and how can I resolve it?
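For what it's worth, a hedged sketch of two possible directions: the error says /sbin/chkconfig itself is missing, which can happen on a minimal CentOS 7 (systemd-based) install, so either installing chkconfig or asking Splunk to generate a systemd unit instead may help:

```shell
# Option 1: provide /sbin/chkconfig (CentOS 7 package)
sudo yum install -y chkconfig

# Option 2: let Splunk manage boot-start via systemd instead of chkconfig
# (-systemd-managed is available in UF 7.2.2+)
/app/splunk/splunkforwarder/bin/splunk enable boot-start -systemd-managed 1 -user awsadmin
```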
On a heavy forwarder, I added a new sourcetype in /opt/splunk/etc/apps/<my_app>/local/props.conf:

[sensor_data]
DATETIME_CONFIG =
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Custom
disabled = false
pulldown_type = true
BREAK_ONLY_BEFORE_DATE =

and also added this to /opt/splunk/etc/apps/<my_app>/local/inputs.conf:

[monitor:///home/slog/sensor_logs/sensor_*.dat]
disabled = false
index = cse_scada
sourcetype = sensor_data

Here <my_app> is defined and established as working for the existing sourcetypes and indexes. The index cse_scada has also been defined and is working. I thought this is what it takes to introduce a new sourcetype and input on a heavy forwarder, but I don't see the newly defined sourcetype in the sourcetype list, nor do I find any expected data when querying:

sourcetype=sensor_data

I didn't see any errors in <SPLUNK>/var/log/splunkd.log. I wonder what else I need to do to get the definitions working:
- Do I have to restart the forwarder?
- Is there any other way to make it take effect?
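A hedged sketch of the usual verification steps (paths as in the post; btool shows which config file wins, and new monitor inputs generally do require a restart to take effect):

```shell
# confirm Splunk actually sees the new stanzas and from which file
/opt/splunk/bin/splunk btool props list sensor_data --debug
/opt/splunk/bin/splunk btool inputs list --debug | grep -A3 sensor_logs

# restart so the new monitor input is picked up
/opt/splunk/bin/splunk restart
```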
The volume of data from collecting log files from 50 Linux servers and 200 Windows servers will require multiple indexers. Following best practices, which types of Splunk component instances are needed?
1. Indexers, search head, universal forwarders, license master
2. Indexers, search head, deployment server, universal forwarders
3. Indexers, search head, deployment server, license master, universal forwarder
4. Indexers, search head, deployment server, license master, universal forwarder, heavy forwarder
Hello guys, I'm trying to extract a URL field from my log in a Data Model (it is not extracted from the _raw log and is not seen via the index). I found some variants in similar topics and added a new field (with a regular expression) to the Data Model. It does not cover 100% of my events, but it works. However, I still don't see this field when I run the command | from datamodel Network_Traffic. Two questions:
1) Can anyone tell me why the field is still not seen when writing the search | from datamodel Network_Traffic, given that the "Preview" tab shows the results and the URLs are extracted?
2) Maybe you know how I can extract the URL field directly from the _raw event? I'm confused by all the answers I saw about this topic before.
Thanks in advance
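On question 2, a hedged sketch of an inline extraction straight from _raw (the index/sourcetype names and the URL pattern are assumptions to adapt):

```spl
index=my_network_index sourcetype=my_sourcetype
| rex field=_raw "(?<url>https?://[^\s\"']+)"
| table _time url
```

Once the rex works, the same pattern can be saved as a search-time field extraction (EXTRACT- in props.conf) so the field exists outside this one search.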
For fetching any events related to a Salesforce App Console we generally use:

logRecordType=ailtn ("appName":"Collections_Platform" AND "appType":"Console")

I have a requirement to display the time spent by users on one of the tabs inside the App Console, for example the time spent by users on the Accounts tab of the App Console. How would I meet this requirement, and how should I start the search query? Any suggestions?
Hi, I would like to know whether Splunk Enterprise is agentless and whether it supports SNMP. Any idea about its pricing after the free trial?
I am very new to Splunk administration. Would anyone help me with a simple search to check whether a particular device is reporting to Splunk, given its IP address and/or its hostname?
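A hedged sketch using the metadata command, which reports when each host last sent events (the host name is a placeholder; the device's IP can be used the same way if that is what the host field contains):

```spl
| metadata type=hosts index=*
| search host="myhost*"
| eval last_seen = strftime(lastTime, "%Y-%m-%d %H:%M:%S")
| table host totalCount last_seen
```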
Hi, I have a dashboard for user activity and I would like to add usernames with a dropdown input.

<fieldset submitButton="true" autoRun="false">
<input type="dropdown" token="user">
<label>Username / Identity</label>
<search>
<query>`umbrella` | stats count by identities | sort + identities | fields - count</query>
<earliest>-24h@h</earliest>
<latest>now</latest>
</search>
<fieldForLabel>user</fieldForLabel>
<fieldForValue>user</fieldForValue>
</input>

The search returns the user list. The rest of the panels use `umbrella` identities="$user$" ... searches (if I add a text input and type the username manually, everything works fine). But the dropdown input still appears inactive: greyed out, and mousing over shows a red crossed-out circle. What did I miss? (In another app the same thing works fine, and naturally I'm an admin in this app, so I doubt it's permissions.)
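One thing worth checking, as a hedged sketch: in Simple XML, fieldForLabel/fieldForValue must name a field that the populating search actually returns. Here the search outputs a field called identities, not user, so the input might need to look like this:

```xml
<input type="dropdown" token="user">
  <label>Username / Identity</label>
  <search>
    <query>`umbrella` | stats count by identities | sort + identities | fields - count</query>
    <earliest>-24h@h</earliest>
    <latest>now</latest>
  </search>
  <fieldForLabel>identities</fieldForLabel>
  <fieldForValue>identities</fieldForValue>
</input>
```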
For example, my IP is 202.101.53.4 and I want to identify which domains sent me the most packets (most frequently); what would the query look like? I also want to know which domain sent me the most traffic overall, and which vulnerability scanner the malicious domain is using for reconnaissance; which field in Splunk can I get that info from?
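A hedged sketch of the first question, assuming network events with dest_ip and a domain field such as src_domain (both field names and the index are assumptions that depend on the sourcetype or data model in use):

```spl
index=my_network_index dest_ip="202.101.53.4"
| stats count AS packets BY src_domain
| sort - packets
| head 10
```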
Hi team, I need to send a statistical chart from Splunk to Microsoft Teams. Can anyone suggest a way to complete this task? Thanks in advance.
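One commonly used route, as a hedged sketch rather than an official integration: a Teams incoming webhook, where an alert action or script POSTs a small JSON body to the channel's webhook URL. A webhook carries text and links, so the chart itself is usually shared as a link back to the Splunk dashboard. The function below only builds the request body; the dashboard URL is a placeholder:

```python
import json

def build_teams_payload(message: str) -> bytes:
    """Build the minimal JSON body a Teams incoming webhook accepts."""
    return json.dumps({"text": message}).encode("utf-8")

# POST this body to the channel's incoming-webhook URL with
# Content-Type: application/json (e.g. via urllib.request or curl);
# the URL in the message is a placeholder, not a real endpoint.
body = build_teams_payload(
    "Splunk alert: see the chart at https://splunk.example.com/my_dashboard"
)
```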
Hi, I've noticed that my modular input configuration for "myapp" behaves strangely. When I am in Splunk using "myapp" and go to Data Inputs > MyAppInput, the inputs.conf file gets written properly (<myapp>/local/inputs.conf). But when someone is in an app other than "myapp" (e.g. search) and goes to Data Inputs > MyAppInput, the data gets written to /search/local/inputs.conf instead of the "myapp" app; it is written to the folder of whichever application you clicked "Data Inputs" from. How can I make sure that wherever the user clicks on Data Inputs > MyAppInput, the data only gets written to <myapp>/local/inputs.conf? Thanks!
Hi, I'm using the Splunk Universal Forwarder to send UiPath Robot logs to the Splunk server. I noticed that some of our logs are being truncated at the end. I searched on the internet, and my understanding is that I have to change the truncate value in props.conf for the Universal Forwarder, but I could not figure out where the props.conf file should be changed. In the beginning I changed

C:\Program Files\SplunkUniversalForwarder\etc\apps\SplunkUniversalForwarder\default\props.conf

but then I learned that you're not supposed to make changes in default configs, so I removed the line from there. Then I added a props.conf file myself at C:\Program Files\SplunkUniversalForwarder\etc\apps\SplunkUniversalForwarder\local\props.conf, because there wasn't one, so I thought maybe we have to add it ourselves. I used the following lines in props.conf:

[default]
Truncate = 50000

Still, I could not see any change in the Splunk logs. Then I read somewhere that you need to restart the forwarder for changes to take place, so I used the following command:

C:\Program Files\SplunkUniversalForwarder\bin>splunk.exe restart

and got the following error:

Invalid key in stanza [default] in C:\Program Files\SplunkUniversalForwarder\etc\apps\SplunkUniversalForwarder\local\props.conf, line 2: Truncate (value: 50). Your indexes and inputs configurations are not internally consistent. For more information, run 'splunk btool check --debug'
Done
Checking default conf files for edits...
Validating installed files against hashes from 'C:\Program Files\SplunkUniversalForwarder\splunkforwarder-8.0.5-a1a6394cc5ae-windows-64-manifest'
File 'C:\Program Files\SplunkUniversalForwarder\etc/apps/SplunkUniversalForwarder/default/props.conf' changed.
Problems were found, please review your files and move customizations to local
All preliminary checks passed.
Starting splunk server daemon (splunkd)...
SplunkForwarder: Unable to start the service: Access is denied.
I am very new to Splunk, so I don't have any idea about these things; I assume maybe I'm doing it wrong. Can someone please answer the following questions?
1. Where do I need to add props.conf?
2. What should I add in props.conf, and what should the syntax be?
3. After doing the above, how do I restart Splunk?
Any help will be much appreciated. Thanks!
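For what it's worth, a hedged sketch based on the error text above: the setting name is case-sensitive (TRUNCATE, not Truncate), it is normally scoped to a sourcetype stanza rather than [default], and since TRUNCATE is applied where events are parsed, in a classic Universal Forwarder setup it generally belongs in props.conf on the indexer or heavy forwarder rather than on the UF. The sourcetype name below is an assumption:

```ini
# props.conf on the parsing tier (indexer/heavy forwarder); stanza name assumed
[uipath:robot]
TRUNCATE = 50000
```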
Hi, I have an existing set of prod servers sending logs to Splunk, which has a 10GB license capacity. Is it possible to exclude upcoming staging servers, which will also be sending logs, from counting against the existing license? In the future, when we upgrade the license, we are planning to send some security logs for indexing and search for audit purposes.
Hi, I'm trying this search and it seems to be working, as I'm not getting anything outside the range. The issue is that I've created an event that should get picked up by the search below, so I'm obviously doing something wrong here.

| search "fooResponse.bets{}.legs{}.propositionId">=150000 AND "fooResponse.bets{}.legs{}.propositionId"<=180000

Any help would be greatly appreciated. Cheers, Steve
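A hedged variant worth trying: in where/eval expressions, field names containing special characters such as dots and {} are quoted with single quotes (double quotes denote string literals), so the range test can be written as:

```spl
... base search ...
| where 'fooResponse.bets{}.legs{}.propositionId' >= 150000
    AND 'fooResponse.bets{}.legs{}.propositionId' <= 180000
```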
I have created a web.conf file with [settings] max_upload_size = 1024, but I'm getting an error that says "The entity sent with the request exceeds the maximum allowed bytes". I tried changing the max size to 2048, but I still get the same error. Please help!
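A hedged sketch of the stanza as it is usually written; note that max_upload_size is measured in megabytes (verify against the web.conf spec for your version), and splunkweb generally needs a restart before the change applies:

```ini
# web.conf, e.g. $SPLUNK_HOME/etc/system/local/web.conf
[settings]
# maximum upload size, in MB
max_upload_size = 2048
```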