All Topics

A timeout error occurs randomly when I try to add a new website monitoring input in Splunk. I encountered the following error while trying to save: Splunkd daemon is not responding: ("Error connecting to /servicesNS/.../launcher/data/inputs/web_ping: ('The read operation timed out',)",) What could be the reason behind this?
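One thing that sometimes helps with this particular message, assuming the cause is simply Splunk Web timing out while waiting on splunkd, is raising the UI-to-splunkd timeout in web.conf (the value and file location below are illustrative, not a confirmed fix for this case):

# $SPLUNK_HOME/etc/system/local/web.conf (illustrative location)
[settings]
# seconds Splunk Web waits for splunkd before reporting "not responding"
splunkdConnectionTimeout = 300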
Hi All, input logs are forwarded from a syslog server. We extracted the server name and user ID from the logs. Our requirement is to find the count of distinct users logged in to a particular server per hour. We used the query below, but the result varies on every execution. Could you please help with this issue?
| table _time, server, userdetails | timechart span=1h dc(userdetails) by server
Thanks in advance.
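One way to check whether the variation comes from the shifting relative time window rather than the search itself is to pin the time range to whole hours and compute the distinct count explicitly. A minimal sketch, assuming a placeholder index name and the already-extracted server and userdetails fields:

index=your_syslog_index earliest=-24h@h latest=@h
| bin _time span=1h
| stats dc(userdetails) AS distinct_users BY _time server

Snapping earliest/latest to @h means every run covers the same complete hours, so partial buckets at the edges no longer change the counts between executions.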
I am working with MS-Exchange data. I take the recipient email value and match it against a user lookup for other details. The same email has multiple matching rows in the lookup table, and I want each matching record on its own row instead of the values being collapsed together. Example: I have the email xyz@abc.com in a log, and three matching records in the user lookup:

email          first  last   id    type
xyz@abc.com    Ram    Singh  1001  T
xyz@abc.com    Ram    Singh  1042  C
xyz@abc.com    Ram    Singh  1063  T

I am using the lines below to match the recipient value and get the other details from the lookup:

| stats values(recipient) as recipient count by _time sender | mvexpand recipient | eval recipient=lower(recipient) | lookup users email AS recipient OUTPUT id type first last

I am getting output like this:

sender        recipient     id    type  first  last
abc@xyz.com   xyz@abc.com   1001  T     Ram    Singh
                            1042  C
                            1063  T

But I am expecting a result like this, so that I can perform some conditional action:

sender        recipient     id    type  first  last
abc@xyz.com   xyz@abc.com   1001  T     Ram    Singh
abc@xyz.com   xyz@abc.com   1042  C     Ram    Singh
abc@xyz.com   xyz@abc.com   1063  T     Ram    Singh

If I use the mvexpand command, it produces the wrong output rows.
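One commonly suggested way to turn the multivalue lookup output into one row per id/type pair is to zip the parallel multivalue fields together before expanding. A sketch under the assumption that id and type come back from the lookup in the same order (first and last are identical for this email, so they can stay as single values):

| stats values(recipient) as recipient count by _time sender
| mvexpand recipient
| eval recipient=lower(recipient)
| lookup users email AS recipient OUTPUT id type first last
| eval id_type=mvzip(id, type, "|")
| mvexpand id_type
| eval id=mvindex(split(id_type, "|"), 0), type=mvindex(split(id_type, "|"), 1)
| fields - id_type

Expanding the zipped pair instead of a single multivalue field keeps each id next to its own type, which is where a plain mvexpand on one field tends to go wrong.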
My event has a field Transaction:=InpatUPMC_050_Close_WorklistLoad and I am looking to strip the InpatUPMC_050_ part. I tried rex field=source mode=sed "s/Transaction:=InpatUPMC_050_Close_//g" but it does not work.
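For reference, source normally holds the file path rather than the event text, so a sed-mode rex against it will not touch the Transaction value. A sketch of the two usual alternatives, assuming the text lives in _raw or in an already-extracted Transaction field:

``` if the value is still in the raw event ```
| rex field=_raw mode=sed "s/Transaction:=InpatUPMC_050_/Transaction:=/g"
``` if Transaction is already an extracted field ```
| rex field=Transaction mode=sed "s/^InpatUPMC_050_//"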
Hello, I'm currently a Certified Splunk Core User and would like to climb higher in the certification ranks (Power User, Admin, etc.). My question concerns maintaining your current certification(s) after the two years. What do I have to do to keep my certification? Are there ways to earn Continuing Education Points to make sure that whichever Splunk certification you have stays active? Please tell me there are ways of maintaining your certs without having to re-take the test. Thanks in advance.
Hi All, I've had a look around but didn't find an answer to my question. If it has been asked before, please direct me to the answer. I'm trying to embed images and videos in a Splunk dashboard, to make Splunk the single source of truth and make the user experience slightly easier by avoiding the need to visit multiple pages. The way this has been done is by putting the files in "../<>/appserver/static". However, from what I can see, the content inside this folder can be accessed by anyone, even if they are not logged into Splunk. I want to serve content on my Splunk dashboard but also ensure that only people who are logged in can view it; if someone uses the developer tools to retrieve the URL of the video file, they should still need to log in to access it. Hope this all makes sense. Please let me know if further clarification is needed. Thanks everyone.
Hello, I am trying to convert the current PST time to UTC. I have written the code below, but when I compare the result with the current time in UTC there is a difference between the two; they are not the same. Please let me know if I am missing anything.

STRT_TIME = 01-APR-2020 20:30:21 (current PST time, from a Google search)

| eval START_TIME= "01-APR-2020 20:30:21" | eval myUTCtimeEpoch=round(strptime(START_TIME." PST","%d-%B-%Y %H:%M:%S %Z")) | eval myUTCtime=strftime(myUTCtimeEpoch,"%d-%B-%Y %H:%M:%S %Z") | eval current_date_utc = strftime(round(now()),"%m/%d/%Y %H:%M:%S %Z") | table START_TIME current_date myUTCtime

Result:
STRT_TIME = 01-APR-2020 20:30:21
myUTCtime = 02-April-2020 04:30:21 UTC
current_date_utc = 04/02/2020 03:30:21 UTC

I was expecting current_date_utc and myUTCtime to be the same, but there is a 1-hour difference between them.
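A possible explanation for the one-hour gap: on 1 April, US Pacific time is actually PDT (UTC-7), not PST (UTC-8), so labelling the wall-clock time "PST" shifts the conversion by an hour. A minimal sketch that side-steps the abbreviation by using an explicit numeric offset (the -0700 offset is the assumption here, and makeresults is added only to make it runnable on its own):

| makeresults
| eval START_TIME="01-APR-2020 20:30:21"
| eval myUTCtimeEpoch=round(strptime(START_TIME." -0700","%d-%B-%Y %H:%M:%S %z"))
| eval myUTCtime=strftime(myUTCtimeEpoch,"%d-%B-%Y %H:%M:%S %Z")
| eval current_date_utc=strftime(round(now()),"%m/%d/%Y %H:%M:%S %Z")
| table START_TIME myUTCtime current_date_utc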
Hello All, when I use the Splunk API I get different fields compared to the Splunk UI. How can we get the same results (fields) from the API as we get from the Splunk UI? I have also tried the "rf" attribute, but no luck. This is the call:

curl -k -u u:p https://splunk:8089/servicesNS/admin/search/search/jobs/export --data-urlencode search="search index=node message="j-report" appId=\"static--logger\" items.data.f_id != \"\" OR items.inst_id!= \"\" earliest=03/30/2020:0:0:0 latest=03/31/2020:0:0:0" -d rf=* -d output_mode=csv -o test.csv
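One thing that is often suggested for this symptom, on the assumption that the export endpoint only returns the fields the search strictly requires, is to force all extracted fields into the output by appending | fields * inside the search string itself, keeping the rest of the curl call the same:

--data-urlencode search="search index=node message=\"j-report\" appId=\"static--logger\" items.data.f_id != \"\" OR items.inst_id != \"\" earliest=03/30/2020:0:0:0 latest=03/31/2020:0:0:0 | fields *"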
Hello everyone, I have the file below, which is generated every night by my client's internal system, and I need to index the information to collect metrics.

Job 'NICE Kaizen Job' : Step 1, 'Run script' : Began Executing 2020-03-31 22:00:00

AgentName                   Canal  Ramal      ID do Logger  Dia         Chamadas
--------------------------- -----  ---------  ------------  ----------  --------
agent1                      333    22222222   00000000      31/03/2020  17
agent1                      334    22222222   00000000      31/03/2020  2
Sala de Reuniao - Sala 123  161    333333333  11111111      31/03/2020  1
Sala de Reuniao - Sala 545  157    44444444   11111111      31/03/2020  1
agent2                      106    66666666   11111111      31/03/2020  11
agent2                      106    88888888   11111111      31/03/2020  11
TI Count FL 545454 (2)      134    999999999  11111111      31/03/2020  6
TI Count FL 545454 (2)      134    999999999  11111111      31/03/2020  6
TI Count FL 545454 (2)      134    999999999  11111111      31/03/2020  6

(9 rows(s) affected)

However, the system generates a completely unstructured file, and due to the company's business rules I cannot use Python or another language on the server where the system is hosted. I need to extract the information for Agent_Name, Canal, Ramal, ID do Logger, and Dia; an example is below. Keep in mind that this file is variable: some days it generates many lines and other days few.

AgentName  Canal  Ramal     ID do Logger  Dia         Chamadas
agent1     33     22222222  00000000      31/03/2020  17
agent1     334    22222222  00000000      31/03/2020  2

Is there a way to do this with Splunk?
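A possible search-time approach, assuming each data row is space-separated and the Dia column is always dd/mm/yyyy (the index and source below are placeholders), is to let a rex anchored on the date column pull the fields out and then discard the header and separator lines that do not match:

index=your_index source="*kaizen*"
| rex "^(?<AgentName>.+?)\s+(?<Canal>\d+)\s+(?<Ramal>\d+)\s+(?<IdDoLogger>\d+)\s+(?<Dia>\d{2}/\d{2}/\d{4})\s+(?<Chamadas>\d+)\s*$"
| where isnotnull(Chamadas)
| table AgentName Canal Ramal IdDoLogger Dia Chamadas

Because the lazy AgentName capture can only stop where four numeric columns plus a dd/mm/yyyy date still follow, agent names that themselves contain digits (such as "Sala de Reuniao - Sala 123") should still land in the right column.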
All, I am relatively new to Splunk and trying to understand some basics along the way. https://docs.splunk.com/Documentation/Splunk/8.0.2/Data/Setadefaulthostforaninput From the Splunk online documentation, I see the "Example of static host value assignment" ... This example covers any events coming in from /var/log/httpd:

[monitor:///var/log/httpd]
host = webhead-1

Why are there 3 slashes? I understand that the first slash in /var needs to be escaped, but why the 2nd slash? Sorry for the simple question, but it keeps bugging me.
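For comparison, monitor:// is a URI-style scheme (like http://) and everything after it is the ordinary absolute path, so the third slash is simply the leading slash of /var/log/httpd rather than an escape. A small illustration (the Windows path and host values are made up):

# Unix path: monitor:// + /var/log/httpd -> three slashes in a row
[monitor:///var/log/httpd]
host = webhead-1

# Windows path: monitor:// + C:\inetpub\logs -> only the two scheme slashes
[monitor://C:\inetpub\logs]
host = webhead-2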
I have a couple of search queries to execute based on certain conditions, but a search query in my dashboard is getting executed before the submit button is clicked, even though I have used fieldset submitButton="true" autoRun="false". The respective search query should be executed based on the radio button selection and only on click of the submit button: I want to run the input query once the Input radio button is selected and the submit button is clicked. Screenshots attached. Dashboard: when I open the dashboard, it has two radio buttons (Input and Output) with the submit button. When I click the Input radio button, the Input Panel is displayed with 6 text fields and the submit button (which invokes the input search query). When I click the Output radio button, the Output Panel is displayed with 6 text fields and the submit button (which invokes the output search query).

<form>
  <label>DemoDashBoard1</label>
  <fieldset submitButton="true" autoRun="false">
    <input type="radio" token="searchBy" searchWhenChanged="false">
      <label>SearchBy</label>
      <choice value="1">Input</choice>
      <choice value="2">Output</choice>
      <change>
        <condition value="1">
          <set token="tkninput">true</set>
          <unset token="tknoutput"></unset>
        </condition>
        <condition value="2">
          <set token="tknoutput">true</set>
          <unset token="tkninput"></unset>
        </condition>
      </change>
    </input>
    <input type="text" token="EventType" depends="$tkninput$">
      <label>EventType</label>
      <default></default>
      <change>
        <condition value="">
          <set token="EventType">*</set>
        </condition>
      </change>
    </input>
    <input type="text" token="businessEventTrigger" depends="$tknoutput$">
      <label>businessEventTrigger</label>
      <default></default>
      <change>
        <condition value="">
          <set token="businessEventTrigger">*</set>
        </condition>
      </change>
    </input>
  </fieldset>
  <search>
    <query>host= "tnt_log_mar" | xmlkv maxinputs=10000 | rename "nspJ:TOR010Id" as TORID "nspMMM:EventType" as EventType | search ns0:ProcessId (EventType=$EventType$ OR businesseventtrigger) | table ns0:ProcessId EventType TORID nspM:SEC010Id nsSec:BUL010OrigId nsSec:BUL010DestinationId nspM:SequencingNr businessEventTrigger rocsTourId rocsMovementId rocsOriginId rocsDestinationId tripLegSeqNbr publishCd routeNm firstLegSchedDprtTmstp firstLegOrigin tripLegSeqNbr origin destination schedDprtTmstp estDprtTmstp | selfjoin ns0:ProcessId | dedup ns0:ProcessId</query>
    <done>
      <condition match="$job.doneProgress$=1">
        <set token="inputQuery">$result.search$</set>
      </condition>
    </done>
  </search>
  <row>
    <panel depends="$tkninput$">
      <title>Input Panel</title>
      <table>
        <search>
          <query>$inputQuery$</query>
        </search>
        <option name="count">20</option>
        <option name="dataOverlayMode">none</option>
        <option name="drilldown">none</option>
        <option name="percentagesRow">false</option>
        <option name="rowNumbers">false</option>
        <option name="totalsRow">false</option>
        <option name="wrap">true</option>
      </table>
    </panel>
    <panel depends="$tknoutput$">
      <title>Output Panel</title>
      <table>
        <search>
          <query>host= "tnt_log_mar" | xmlkv maxinputs=10000 | rename "nspJ:TOR010Id" as TORID "nspMMM:EventType" as EventType | search ns0:ProcessId (EventType OR businessEventTrigger=$businessEventTrigger$) | table ns0:ProcessId EventType TORID nspM:SEC010Id nsSec:BUL010OrigId nsSec:BUL010DestinationId nspM:SequencingNr businessEventTrigger rocsTourId rocsMovementId rocsOriginId rocsDestinationId tripLegSeqNbr publishCd routeNm firstLegSchedDprtTmstp firstLegOrigin tripLegSeqNbr origin destination schedDprtTmstp estDprtTmstp | selfjoin ns0:ProcessId| dedup ns0:ProcessId</query>
        </search>
        <option name="count">20</option>
        <option name="dataOverlayMode">none</option>
        <option name="drilldown">none</option>
        <option name="percentagesRow">false</option>
        <option name="rowNumbers">false</option>
        <option name="totalsRow">false</option>
        <option name="wrap">true</option>
      </table>
    </panel>
  </row>
</form>
Free Splunk Enterprise download for 64-bit: I got the license agreement page and checked the box at the end, but the download button stayed grayed out. Why?
I have a .csv file that is being appended to every few minutes using Python. However, monitor reindexes everything each time it is written to, not just the new data. The filename and first 256 bytes are the same, so crcSalt shouldn't be an issue here. How can I fix this?
Consider this log event:

2020-04-01 10:20:30 firstabc secondxyz

props.conf:
[test]
REPORT-a = report_a, report_b

transforms.conf:
[report_a]
REGEX=first(?<a>\w+)
[report_b]
REGEX=second(?<a>\w+)

Question 1: what is the value of the field "a"?
Question 2: will the results be the same with this props.conf?
[test]
REPORT-a = report_a
REPORT-b = report_b
Challenge: try guessing before testing. I'll spare you a search; here is a link to a previous discussion with two different opinions: https://answers.splunk.com/answers/320868/what-is-the-order-of-execution-precedence-of-multi.html
Question 3: do you get the expected results? This post is not a 1 April joke.
Edit 02.04.2020: it is actually the second statement, "fields are not overridden, so once an earlier-executed transform has given a field a value, later-executed ones will not update/overwrite that original value", that is confirmed by this test case; otherwise the "a" field would have the value "xyz". I was previously ready to bet that later-executed transforms could update/overwrite the original value, but as you can see that is not the case. The purpose of this post is to ask the community to help clarify. Maybe somebody has a link to where this behaviour is documented.
All, I've got this code:

#########################################################################
# Imports for Splunk
#########################################################################
import splunk.entity as entity
try:
    from splunk.clilib.bundle_paths import make_splunkhome_path
except ImportError:
    from splunkappserver.mrsparkle.lib.util import make_splunkhome_path

########################################
# get our config data, returns a dict
########################################
def getConfig(sessionKey, nameSpace):
    try:
        resp = entity.getEntities(['properties', 'verodin', 'verodin'], namespace=nameSpace,
                                  owner='nobody', sessionKey=sessionKey)
    except Exception, e:
        raise Exception(" Could not get %s credentials from splunk. Error %s" % (nameSpace, str(e)))
    config = {}
    for k, v in resp.items():
        config[str(k)] = str(v)
    return config

Another member of the team got it working. I'm trying to create new code, but I don't understand what the getEntities first argument is beyond a list. I've seen examples with [admin, password] but I haven't found a good example or documentation explaining how this works. Any idea? TIA, Joe
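Not an authoritative answer, but as far as I can tell the list is an entity path whose segments are joined into the REST endpoint URI, so the call above would read the [verodin] stanza of verodin.conf in the given app. A rough Python 2 sketch of that assumption (the app name is hypothetical):

# Assumption: the path list maps onto URI segments under /servicesNS/<owner>/<namespace>/...
#   ['properties', 'verodin', 'verodin'] -> .../properties/verodin/verodin  (stanza [verodin] of verodin.conf)
#   ['admin', 'passwords']               -> .../admin/passwords             (stored credentials)
creds = entity.getEntities(['admin', 'passwords'],
                           namespace='my_app',   # hypothetical app name
                           owner='nobody',
                           sessionKey=sessionKey)
for name in creds:
    print name   # Python 2 print statement, matching the code above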
We have one search head and one search head with Enterprise Security. We have one index, named index=fireeye, and the logs arrive in CEF format. On the regular search head all the logs are parsed properly, but on the ES search head the logs are not being parsed.
Sorry for the complete noob question, but I have had this Splunk project dropped on me and I need to spin up fast. I have added a monitor on "myhost" like so:

[root@myhost bin]# pwd
/apps/splunkforwarder/bin
[root@myhost bin]# ./splunk add monitor /var/log/foo/
Your session is invalid. Please login.
Splunk username: admin
Password:
Added monitor of '/var/log/foo'.

That was yesterday. I executed a script that writes data to a log file in the /var/log/foo directory on myhost, but when I run the search host=myhost I get zero results.
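A quick sanity check that is often suggested, on the assumption that the data may have landed under an index or host value you are not searching by default, is to widen the search and see what was actually indexed recently (index=* can be expensive, so keep the time range short):

index=* host=myhost earliest=-24h
| stats count BY index sourcetype source

If that still comes back empty, the next place to look is usually the forwarder side (./splunk list monitor and splunkd.log), but the search above at least rules out a default-index mismatch.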
Here is the search (redacted a little):

| inputlookup xxxxxxxxx_xxxxxx.csv
| search Area_designation="$status$"
| rename "Site Designation" as Site_designation
| stats count by Country
| geom geo_countries allFeatures=True featureIdField=Country

The $status$ token has five possible values, and I want to assign a color to each of the values. I have seen a response from 2015 that indicates this is possible, but I have not been able to make it work.
Splunk Add-on for Juniper 1.3.0: the version 1.3 release notes indicate that the sourcetype juniper:sslvpn is deprecated, yet version 1.3.0 still supports this sourcetype. Will events associated with this sourcetype be re-mapped to another sourcetype in the next version after 1.3.0? Do I need to take any action when upgrading from version 1.2.0 to 1.3.0? Any assistance is appreciated.
Hello, I am using Splunk 7.2 and recently noticed a problem that I'm trying to figure out. I am using the Splunk universal forwarder to collect firewall logs from a local Windows machine. The logs have been collecting fine for quite some time. I also configured Splunk to extract the destination IP (as DestIP) from the firewall logs, and this has been parsed out correctly for weeks. Nothing has changed in my Splunk or Windows configuration, but today I noticed that DestIP was no longer being parsed out. I also noticed that the sourcetype has changed: it used to be pffirewall, but today it started as pffirewall-too-small. I did some searching and saw that this can happen when there are too few logs, so I ran a ping to generate more than 100 log entries; the sourcetype then changed to pffirewall-3. The filename being collected by the universal forwarder has not changed, nor has the directory of the log location it is picked up from. 1) Why is the sourcetype changing? 2) How do I prevent this from happening again? Thank you.
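If it helps, the -too-small and -N suffixes are what Splunk's automatic sourcetype learning produces when no sourcetype is set explicitly on the input, so one common way to stop the renaming is to pin the sourcetype in inputs.conf on the forwarder. A sketch, assuming the monitor path below stands in for your real one:

# inputs.conf on the universal forwarder (path is illustrative)
[monitor://C:\Windows\System32\LogFiles\Firewall\pfirewall.log]
sourcetype = pffirewall
disabled = 0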