All Topics


I am trying to profile a .NET Core application in a Linux environment. I have installed and configured the .NET Core agent on my CentOS host and set the environment variables in the service file as follows:

    Environment=CORECLR_PROFILER={57e1aa68-2229-41aa-9931-a6e93bbc64d8} \
        CORECLR_ENABLE_PROFILING=1 \
        CORECLR_PROFILER_PATH=/opt/appdynamics/dotnet/libappdprofiler.so

Then I restarted the app service and the Apache server. To check the AppDynamics profiler installation, I ran the following command:

    lsof -p 2268 | grep -i appd
    dotnet 2268 root mem REG 253,0 6443304 69595618 /opt/appdynamics/dotnet/libappdprofiler_glibc.so
    dotnet 2268 root mem REG 253,0    6776 69595628 /opt/appdynamics/dotnet/libappdprofiler.so

2268 is my dotnet process ID, and this confirms the profiler loaded successfully. My question is: we set only one profiler path, libappdprofiler.so, so how is the file libappdprofiler_glibc.so being loaded? What is this file used for? Thanks in advance.
Hi, I have a requirement to extract a card value that is present inside the message body of the log. This value can appear in the logs under two different names. Can you please tell me how I can fetch this value and display it in a table? Thanks
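A minimal sketch of one way to approach this, assuming hypothetical labels CardNumber and CardNo for the two names the value can appear under (adjust the index, sourcetype, and regexes to the real message format):

    index=your_index sourcetype=your_sourcetype
    | rex field=_raw "CardNumber[:=]\s*(?<card_a>\S+)"
    | rex field=_raw "CardNo[:=]\s*(?<card_b>\S+)"
    | eval card_value=coalesce(card_a, card_b)
    | table _time card_value

coalesce() picks whichever of the two extractions matched, so events using either label end up in the same column.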
Hi, I have a requirement where we have a CSV file containing functionid and functiondesc values; this file has also been added as a lookup. I get an eventid value from the logs, which is the same as functionid. Now we need to fetch the functiondesc for the corresponding eventid and display it in a table.
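A minimal sketch, assuming the lookup file was uploaded under the hypothetical name functions.csv (replace the index, sourcetype, and lookup name with your own):

    index=your_index sourcetype=your_sourcetype
    | lookup functions.csv functionid AS eventid OUTPUT functiondesc
    | table eventid functiondesc

The AS clause matches the log field eventid against the lookup's functionid column and returns the functiondesc for each event.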
Hello Splunk Community, I have created a stats table and I want to change the time field ("%Y-%m-%d %H:%M:%S") to show 'x' minutes ago. Can anyone help with this? Many thanks, Zoe
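A minimal sketch of one approach, assuming a hypothetical field name event_time holding the formatted timestamp:

    ... | eval age_mins=round((now() - strptime(event_time, "%Y-%m-%d %H:%M:%S")) / 60)
        | eval event_time=age_mins . " minutes ago"

strptime() converts the formatted string back to epoch seconds, so the difference from now() can be computed and rendered as minutes.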
Hi Experts, we have two AWS platforms from which we are collecting CloudWatch VPC flow logs. One of them works perfectly; we created the inputs to collect the CloudWatch VPC flow logs directly (not using Kinesis). However, the other one does not collect any VPC flow logs. When we checked the _internal logs, it just keeps throwing:

    Start to describe streams. region=ap-southeast-1, log_group=XXX
    No data input has been configured.
    Previous job of the same task still running. Exit current job.   (after 10 mins)

We checked CloudWatch in AWS: the corresponding log groups do have logs, and the permissions are set correctly. Thanks
I have what should be a simple problem, but I don't have an answer without burning some brain cells. A simple query example:

    index=some_index sourcetype=some_sourcetype

returns 140k events, and the output contains the field 'tag' with 7 values, across 30k+ events. But if I use the query:

    index=some_index sourcetype=some_sourcetype tag="*"

I get zero results.
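A hedged workaround sketch: tags are attached to events at search time, so a wildcard-only tag filter in the base search may not behave like a wildcard on a normal indexed field. Moving the filter after the first pipe, where the tag field has already been resolved, is worth trying:

    index=some_index sourcetype=some_sourcetype
    | search tag=*

If that still returns nothing, comparing against | where isnotnull(tag) can help confirm whether the tag field actually exists on the events at all.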
Hi all, I am looking for an automated way to export reports on a recurring schedule, and to a location other than the default where the outputcsv command saves them. I found a few older questions and answers, but I wanted to see if there is an updated answer to this. Thanks, Splunkers!
I found that we can create alerts in Splunk and send the alert output to specific email recipients. Is there a way to send alerts or dashboard values to third-party apps, rather than just email?
Hello, I am new to Splunk, and my first task is to pair the "GitHub App for Splunk" with the "GitHub Audit Log Monitoring Add-on for Splunk" to get visualizations for the logs. Can anyone help or guide me on what should be done once the GitHub App for Splunk is installed? The "GitHub Audit Log Monitoring Add-on for Splunk" is capturing the logs, but I need some guidance on how the GitHub App for Splunk can be paired with it for visualization. Thanks in advance.
I simply want to get new exceptions that occurred within the last 30 minutes but did not happen at any time last week on the same day. I have this query to get exceptions for the same day last week:

    earliest=-7d@d latest=-6d@d index=production "java.lang.NullPointerException*"
    | stats count by field6

which gives me this result:

    abcd.handler.CreateBankHandler 26
    abcd.cr.RequestProcessor 34
    abcd.cr.SessionInfo 1
    abcd.cr.SSOServlet 2
    abcd.impl.ExportManagerImpl 1
    abcd.impl.ImportFileProcessor 1

The second query:

    earliest=-1d@d latest=now index=production "java.lang.NullPointerException*"
    | stats count by field6

gives me this result:

    abcd.handler.CreateBankHandler 27
    abcd.cr.RequestProcessor 7
    abcd.cr.SessionInfo 1
    abcd.cr.BaseServlet 6
    abcd.cr.SSOServlet

So the result should be the entries that are new in the second query, namely:

    abcd.cr.BaseServlet
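A minimal sketch of one way to do the comparison in a single search, reusing the index and exception string from the post; the 30-minute recent window and the same-day-last-week baseline are expressed with relative_time():

    earliest=-7d@d latest=now index=production "java.lang.NullPointerException*"
    | eval window=case(_time >= relative_time(now(), "-30m"), "recent",
                       _time < relative_time(now(), "-6d@d"), "baseline")
    | where isnotnull(window)
    | stats count(eval(window="recent")) AS recent_count
            count(eval(window="baseline")) AS baseline_count by field6
    | where recent_count > 0 AND baseline_count = 0

Events between the baseline window and the last 30 minutes get a null window and are dropped, so only field6 values seen recently but absent from the baseline survive the final filter.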
I have an index that ingests scan files and assigns a sourcetype based on the folder location. There are several scans for each host, which caused some issues when I tried to join or append. I'm trying to build a search that takes the host, scan, and version in one folder, compares them against the host, scan, and version in the other, and creates a table with the host, the Evaluate scan and version, and the Authoritative scan and version, telling me whether we have the most up-to-date scans in the Authoritative folder. What's the best method to create this search?

Example data:

    HOST_FQDN: Host1
    sourcetype: Evaluate
    SCAN: IE11
    Version: 2

    HOST_FQDN: Host1
    sourcetype: Authoritative
    SCAN: IE11
    Version: 3
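A minimal sketch, assuming a hypothetical index name scan_index and that Version is numeric (the field and sourcetype names are taken from the example data):

    index=scan_index sourcetype=Evaluate OR sourcetype=Authoritative
    | stats max(eval(if(sourcetype=="Evaluate", Version, null()))) AS eval_version
            max(eval(if(sourcetype=="Authoritative", Version, null()))) AS auth_version
            by HOST_FQDN SCAN
    | eval current=if(auth_version >= eval_version, "up to date", "stale")
    | table HOST_FQDN SCAN eval_version auth_version current

Because stats groups by host and scan across both sourcetypes, this avoids join and append entirely, which also sidesteps their subsearch limits.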
There is a particular section of the Splunk configuration/installation manual, titled "Change Administrators group membership on each host", that I am having an issue with. It states:

"Confirm that all accounts that need access to the Administrators group on each host have been added to the Restricted Groups policy setting. Failure to do so can result in losing administrative access to the hosts on which you apply this GPO!"

However, implementing this removes all local admin privileges for local accounts on our client machines. Therefore, I have the following questions:

1) Am I misunderstanding the quoted text? It seems the intention of this section is to prevent exactly what is happening to us, namely the removal of local administration privileges.

2) Either way: for our purposes, it is vital that we have at least one local account with admin privileges on each computer. Is it required that the Splunk Access GPO removes administration privileges from local accounts, or is this just a step that would normally improve security on a network in general?
Hi, I would like to integrate our Splunk on-prem environment with our ServiceNow ITOM, so that Splunk events can be sent from Splunk to the Event Management add-on for ServiceNow ITOM.

Do I need to use a ServiceNow MID Server? I would like my Splunk environment to initiate communication with the MID Server, but I am having a hard time finding documentation on how to configure this. Thanks
I am searching a source whose events have FieldA and FieldB. I need to find the events that have specific FieldA values (x or y) AND matching FieldB values (nonspecific). My current search is:

    index=source FieldA IN ("x", "y")

I'm not sure how to filter the results to only show the events that have matching FieldB values.
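A minimal sketch of one interpretation, keeping only events whose FieldB value is shared by both FieldA=x and FieldA=y events (assumes "matching" means the same FieldB appears under both FieldA values):

    index=source FieldA IN ("x", "y")
    | eventstats dc(FieldA) AS fielda_variety by FieldB
    | where fielda_variety > 1

eventstats counts the distinct FieldA values seen for each FieldB without collapsing the events, so the where clause keeps only events whose FieldB occurs under both x and y.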
How can I erex a line like:

    TRUE, FALSE, TRUE,, FALSE, FALSE, FALSE, , FALSE, FALSE

where source="an imported CSV"? The multiple TRUE and FALSE values on the line have different column names. I am trying to create a label for each TRUE and FALSE following a reference sheet.
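A minimal sketch of one positional approach, using hypothetical column names label_a, label_b, label_c standing in for whatever the reference sheet assigns (erex needs sample values to learn a pattern, so an explicit split by position may be more predictable here):

    source="an imported CSV"
    | eval flags=split(_raw, ",")
    | eval label_a=trim(mvindex(flags, 0)), label_b=trim(mvindex(flags, 1)), label_c=trim(mvindex(flags, 2))
    | table label_a label_b label_c

split() turns the comma-separated line into a multivalue field, and mvindex() pulls each position out under the column name the reference sheet maps to it.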
Hi all, hoping someone can help me. I am trying to get the Palo Alto App working. We are a Splunk Cloud customer and have this app on our search head.

When I search for eventtype=pan I see the logs, but they are NOT reclassified.

Our setup: our Palo Alto firewalls push to a syslog server on standard port 514, and this data is currently being ingested as one syslog stream via a universal forwarder, with sourcetype=syslog and index=syslog.

In inputs.conf in /opt/splunk/etc/system/local I have configured the below:

    [monitor:///data/rsyslog/10.0.0.1/10.0.0.1.log]
    index = pan_logs
    sourcetype = pan:log
    host_segment = 3

The guide states to configure your TCP outputs in /opt/splunkforwarder/etc/system/local/outputs.conf. In this file we have:

    [tcpout]
    indexAndForward = 1

As a Cloud customer we have our company app in /opt/splunk/etc/apps/OUR_COMPANY_APP/default. Its outputs.conf contains the following (there is no inputs file):

    inputs1.name.splunkcloud.com:9997,
    inputs2.name.splunkcloud.com:9997,
    inputs3.name.splunkcloud.com:9997,
    inputs4.name.splunkcloud.com:9997,
    inputs5.name.splunkcloud.com:9997,
    inputs6.name.splunkcloud.com:9997,

The inputs file actually in use is under /opt/splunk/etc/apps/search/local.

The Palo Alto app states to create or modify /opt/splunkforwarder/etc/system/local/outputs.conf, add your indexers, and add a tcpout stanza.

Could I copy over the outputs from /opt/splunk/etc/apps/OUR_COMPANY_APP/default to /opt/splunkforwarder/etc/system/local/outputs.conf?
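For reference, a hedged sketch of what that tcpout stanza might look like if the Cloud indexers from the company app were reused (the group name pan_cloud_out is hypothetical; the server list is copied from the company app's outputs.conf above and truncated):

    [tcpout]
    defaultGroup = pan_cloud_out

    [tcpout:pan_cloud_out]
    server = inputs1.name.splunkcloud.com:9997, inputs2.name.splunkcloud.com:9997, inputs3.name.splunkcloud.com:9997

Splunk Cloud forwarder apps usually also carry SSL settings alongside the server list, so copying the full stanza from the company app rather than hand-writing one is likely the safer route.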
OK, I'm trying to improve performance by replacing some join queries with stats, but I'm struggling with a filter.

I have the below query, with two sourcetypes where the common field between events is 'Correlator'. In source_one I have the fields 'Correlator', 'sysplex', and 'servername'. In source_detail I have 'Correlator', 'sysplex', and multiple other fields; the one for this data is SAMPLE_NAME.

'servername' in source_one can have multiple values and I want to filter on a match, so I search servername=xyz*. I've tried a number of ways, and I can't seem to limit results with a filter on 'servername' without losing everything else; 'sysplex', which is in both sourcetypes, filters just fine. Any thoughts would be appreciated.

    index=my_index sourcetype=source_one OR sourcetype=source_detail sysplex=ABC*
    | stats values(SAMPLE_NAME) AS SampleName values(SAMPLE_TIME) AS SampleTime by Correlator,SampleTime
    | eval _time=strptime(SampleTime,"%Y-%m-%d %H:%M:%S.%N")
    | timechart span=1m count by SampleName
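A minimal sketch of the usual pattern for this, under the assumption that servername only exists on the source_one events: carry it through the stats as a value, then filter after the aggregation (field names are from the post):

    index=my_index (sourcetype=source_one OR sourcetype=source_detail) sysplex=ABC*
    | stats values(servername) AS servername values(SAMPLE_NAME) AS SampleName
            values(SAMPLE_TIME) AS SampleTime by Correlator
    | search servername=xyz*

Filtering servername=xyz* in the base search would discard every source_detail event (they have no servername field), which is why the rest of the data disappears; filtering after stats keeps the correlated detail fields attached to each Correlator.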
Hello, I would like to know what software I can use to generate traffic for my lab. Thanks.
Hello, we designed a new model for NLP. We ran the model in the Jupyter Notebook and confirmed that it loads correctly. The problem appears when we fit the model from Splunk: an error says the model was not found. We are working with the spacy.load() command after uploading the model to the container (created in an external notebook using nlp.to_disk(output_dir)). Any suggestions? Thomas
Hi! I tried removing an app from a Search Head cluster by deleting it from the deployer's shcluster/apps directory and pushing the other apps, but this doesn't work properly and the app stays on my Search Heads. Is there another way to remove it? Thanks, Mauro