All Posts


Hi @Jeewan, if you download the Universal Forwarder app from your Splunk Cloud instance, it contains an outputs.conf file in which you should find the Splunk Cloud addresses for your instance. Ciao. Giuseppe
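For reference, a minimal sketch of what that outputs.conf typically contains, assuming a hypothetical stack name mystack (the hostnames below are placeholders, not values from this thread):

    [tcpout]
    defaultGroup = splunkcloud

    [tcpout:splunkcloud]
    server = inputs1.mystack.splunkcloud.com:9997, inputs2.mystack.splunkcloud.com:9997

Resolving the hostnames listed under server gives the addresses to allow through a firewall.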
Hello, I have been trying to migrate ELK data to Splunk. We have ELK data dating back 2 years, and I have attempted to use the Elastic Integrator app from Splunkbase. I was able to set it up with SSL, and it is bringing in logs from the past 30 days. The issue I have is that if I try to change the timeframe in inputs.conf it does not work, and if I try to use a wildcard for the index it does not work either. Has anyone found a way around this? I am also open to hearing any other suggestions for getting old ELK data into Splunk, thank you.  #https://splunkbase.splunk.com/app/4175
Hi @Sec-Bolognese, I don't know how AWS CloudWatch runs, but it's possible to send logs from a forwarder to Splunk Cloud and to a third party, following the instructions at https://docs.splunk.com/Documentation/Splunk/9.4.0/Forwarding/Routeandfilterdatad#Replicate_a_subset_of_data_to_a_third-party_system and https://docs.splunk.com/Documentation/Splunk/latest/Forwarding/Forwarddatatothird-partysystemsd Ciao. Giuseppe
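As a sketch of the pattern those docs describe, assuming a heavy forwarder (per-event routing needs parsing, which a universal forwarder doesn't do) and hypothetical group names and hosts:

props.conf:

    [my_sourcetype]
    TRANSFORMS-routing = route_to_cloud, route_to_thirdparty

transforms.conf:

    [route_to_cloud]
    REGEX = .
    DEST_KEY = _TCP_ROUTING
    FORMAT = splunkcloud

    [route_to_thirdparty]
    REGEX = .
    DEST_KEY = _SYSLOG_ROUTING
    FORMAT = thirdparty

outputs.conf:

    [tcpout:splunkcloud]
    server = inputs1.mystack.splunkcloud.com:9997

    [syslog:thirdparty]
    server = thirdparty.example.com:514

REGEX = . matches every event, so each event is routed to both destination groups.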
Hi @secure, in a dashboard it's possible to define multiple base searches, but each panel can use only one base search, not more. Ciao. Giuseppe
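A minimal Simple XML sketch of that pattern (the queries here are hypothetical):

    <dashboard>
      <search id="base1">
        <query>index=serverdata | stats count by host</query>
        <earliest>-24h</earliest>
        <latest>now</latest>
      </search>
      <search id="base2">
        <query>index=abc | stats count by host</query>
        <earliest>-24h</earliest>
        <latest>now</latest>
      </search>
      <row>
        <panel>
          <table>
            <search base="base1">
              <query>| where count > 10</query>
            </search>
          </table>
        </panel>
      </row>
    </dashboard>

The base attribute on a panel's search accepts a single id, which is why one panel cannot consume both base1 and base2.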
If we have to allow or whitelist the Splunk Cloud IPs somewhere, how do we get the Splunk Cloud IPs for whitelisting? Are these IPs static? Is there a fixed range of IPs Splunk uses for Splunk Cloud so we can use those for whitelisting?
Hi, I have a complex base search where I am comparing data from two indexes using a left join and getting the results in a table. The query is working fine but it's very slow, so I have now decided to split it into two base searches and then combine them in the panel.

First base search:

    index=serverdata
    | rex "host_name=\"(?<server_host_name>[^\"]*)"
    | lookup servers_businessgroup_appcode.csv appcode output Business_Group as New_Business_Group
    | chart dc(host_name) over appcode by host_environment
    | eval TOTAL_servers=DEV+PAT+PROD
    | table appcode DEV PAT PROD TOTAL_servers

Second base search:

    index=abc
    | rex field=data "\|(?<server_name>[^\.|]+)?\|(?<appcode>[^\|]+)?\|"
    | lookup servers_businessgroup_appcode.csv appcode output Business_Group as New_Business_Group

I want to use these in a third panel: combine both searches using a left join and get the list of server details present in both indexes. Question: how can I use two base searches in a single search?
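Since a panel's search can reference only one base, one workaround (a sketch under the assumption that appcode is the shared key) is to keep a single panel search and replace the slow join with append plus stats:

    index=serverdata
    | rex "host_name=\"(?<server_host_name>[^\"]*)"
    | chart dc(host_name) over appcode by host_environment
    | eval TOTAL_servers=DEV+PAT+PROD
    | append
        [ search index=abc
          | rex field=data "\|(?<server_name>[^\.|]+)?\|(?<appcode>[^\|]+)?\|"
          | stats values(server_name) as server_names by appcode ]
    | stats first(DEV) as DEV first(PAT) as PAT first(PROD) as PROD first(TOTAL_servers) as TOTAL_servers values(server_names) as server_names by appcode

stats keyed on appcode typically performs far better than join, which may also remove the need to split the search at all.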
Hi - I need to be able to send copies of logs to both Splunk Cloud and an AWS CloudWatch log group. Is it possible to configure the Universal Forwarder to send logs from the same source to both locations? If not, has anybody used a UF and the CloudWatch agent to monitor the same log file? I'm worried about two products watching the same file.
I'm searching for a general-purpose method that I can apply as needed. Currently, for testing, I'm using a simple tstats search counting events by IP in spans of 4 hours. I need a way to adjust the starting point of the spans, but as shown above, it's not actually shifting where it searches the data; it's just changing the time labels in the table.
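One possibility, as a sketch (assuming the goal is, say, 4-hour buckets starting at 02:00 rather than midnight, and with index and field names as placeholders): bucket at a finer granularity in tstats, then re-bin with the aligntime option, which moves the bucket boundaries themselves rather than just the labels:

    | tstats count where index=main by _time, src_ip span=1h
    | bin _time span=4h aligntime=@d+2h
    | stats sum(count) as count by _time, src_ip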
Hi everyone, I have a scheduled search which runs every day, but sometimes it goes into a failed state. Is there any way or setting to re-run that scheduled search as soon as it goes into a failed state?
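As a sketch of one way to at least detect problem runs (the saved search name here is a placeholder), the scheduler's own logs record the status of every run and can drive an alert:

    index=_internal sourcetype=scheduler savedsearch_name="my_daily_search" status!=success
    | table _time savedsearch_name status reason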
I have an installation where I am trying to leverage an intermediate forwarder (IF) to send logs to my indexers. I have approximately 3000 Universal Forwarders (UFs) that I want to send through the IF, but something is limiting the IF to around 1000 connections. The IF is a Windows Server 2019. I am monitoring the connections with this PowerShell command:

    netstat -an | findstr 9997 | measure | select count

I never see more than ~1000 connections, even though I have several thousand UFs configured to connect to this IF. I have already tried increasing the max user ports, but there was no change:

    HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\MaxUserPort
    HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\TcpTimedWaitDelay

I have validated the network by creating a simple client and server to test the maximum connections. It reached the expected maximum of 16,000 connections from the client network to the IF. I can also configure a server to listen on port 9997 and see several thousand clients trying to connect to the port.

I believe there must be something wrong with the Splunk IF configuration, but I am at a loss as to what it could be. There are no limits.conf configurations, and the setup is generally very basic. My official Splunk support is advising me to build more IFs and limit the clients to less than 1000, which I consider a suboptimal solution. Everything I've read indicates that an IF should be capable of handling several thousand UFs. Any help would be greatly appreciated.
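As a diagnostic sketch, breaking the port-9997 connections down by TCP state may show whether the missing connections are stuck somewhere short of Established:

    # Count connections on the splunktcp port, grouped by state
    Get-NetTCPConnection -LocalPort 9997 |
        Group-Object -Property State |
        Select-Object Name, Count

Get-NetTCPConnection is a standard Windows cmdlet; 9997 is the listening port from the post.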
Hi @PickleRick, I tried but I am unable to create the SPL query. Can you please help me with the correct query?
Hi Team, is it possible to use inputlookup on a CSV file with 7 columns and fill in the details of those columns using a search command that fetches the data from Splunk?

Example: my CSV looks like this:

    Column1 , Column2
    Value A1 , Value B1
    Value A2 , Value B2
    Value A3 , Value B3
    Value A4 , Value B4

I need output like below:

    Column1 , Column2 , Column3 , Column4
    Value A1 , Value B1 , Value C1 , Value D1
    Value A2 , Value B2 , Value C2 , Value D2
    Value A3 , Value B3 , Value C3 , Value D3
    Value A4 , Value B4 , Value C4 , Value D4

The values of Column3 and Column4 are fetched from Splunk using a search command, keyed on the value of Column1. I've tried the search below, but it is not working:

    | inputlookup File.csv
    | join Column1 type=left
        [ | tstats latest(Column3) as START_TIME, latest(Column4) as END_TIME where index=main source=xyz ]
    | table Column1, Column2, START_TIME, END_TIME
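A sketch of a likely fix: the tstats subsearch returns a single row with no Column1 in it, so the join has nothing to match on. Grouping the subsearch by Column1 should help; and if Column3/Column4 are search-time fields rather than indexed fields, tstats cannot see them at all, so a plain search with stats is the safer form:

    | inputlookup File.csv
    | join type=left Column1
        [ search index=main source=xyz
          | stats latest(Column3) as START_TIME latest(Column4) as END_TIME by Column1 ]
    | table Column1 Column2 START_TIME END_TIME

This assumes the events in index=main carry a field named Column1 that matches the lookup's key values.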
Hi livehybrid, thank you for answering my question. The PSC is already installed. Path:

    /opt/splunk/etc/apps/Splunk_SA_Scientific_Python_linux_x86_64/bin/linux_x86_64/4_2_2/lib/python3.9/site-packages/pandas

What I'm curious about is: why does the pandas error occur when I run ARIMA.py in the /opt/splunk/etc/apps/Splunk_ML_Toolkit/bin/algos path, as below?

    [root@master algos]# python3 ARIMA.py
    Traceback (most recent call last):
      File "ARIMA.py", line 5, in <module>
        import pandas as pd
    ModuleNotFoundError: No module named 'pandas'
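A likely explanation, offered as a sketch rather than a confirmed diagnosis: python3 invoked at the shell is the operating system's interpreter, which has no knowledge of the PSC site-packages; MLTK code is meant to run under Splunk's bundled Python. Running the script through Splunk's wrapper uses the bundled interpreter and its module paths:

    /opt/splunk/bin/splunk cmd python3 ARIMA.py

Even then, ARIMA.py may not be designed to run standalone outside the MLTK framework, so a successful import is all this really tests.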
index=cim_modactions source=/opt/splunk/var/log/splunk/incident_ticket_creation_modalert.log host=sh* search_name=* source=* sourcetype=modular_alerts:incident_ticket_creation user=* action_mode=* action_status=* search_name=kafka*
    [| rest /servicesNS/-/-/saved/searches
     | search title=kafka*
     | stats count by actions
     | table actions]
| table user search_name action_status date_month date_year _time
It is still not clear what data you are dealing with. For example, does each job run at most once for each app in each country each day? Which day do you want to use: the day from _time, the day from RUNSTARTTIMESTAMP, or the day from RUNENDTIMESTAMP? Your original table doesn't show App; is this not required? Please provide a mock-up of your expected results using events like the one you have shared. Also, please explain how the data in the events relates to the expected results.
Fresh installation of 9.4.0: errors showing for the KV store provider, HTTP connection error.
Thank you so much for the response and yes it worked.
Try using eval rather than set:

    <eval token="mySource">replace($numSuffix$,"_","")</eval>
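For context, a sketch of how that might sit inside an input's change handler (the input, choice, and token names here are hypothetical):

    <input type="dropdown" token="numSuffix">
      <label>Suffix</label>
      <choice value="alpha_1">alpha 1</choice>
      <change>
        <eval token="mySource">replace($numSuffix$,"_","")</eval>
      </change>
    </input>

Unlike <set>, <eval> evaluates the expression, so mySource receives the result of replace() rather than the literal text.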
If you're not explicitly limiting allowed clients to a predefined list, CNs and SANs in the certs don't matter (as long as the certs are not self-signed, which would mean that the CN of the CA is the same as the CN of the issued cert). If you do verify the server name (the sslVerifyServerName setting), there is an additional restriction: the name in the cert presented by the host must match the hostname you're trying to connect to. But at this point you're not using this.

So the first thing to enable is verification of the server's cert. For this you need to have a CA defined on your UF (preferably by setting sslRootCAPath in your server.conf) containing a PEM-encoded certificate of the CA which either issued the indexer's cert directly or is the root CA from which the indexer's cert is descended. Then you enable sslVerifyServerCert. If at this point the UF cannot connect to the indexer, there's something wrong with the trust relationship between the indexer and the UF. Check the logs. Sometimes it helps to do a tcpdump and see where exactly the connection gets terminated and with what alert.

If you manage to get server verification working, it's time to enable client authentication. You have

    sslRootCAPath = /opt/splunkforwarder/etc/auth/mycerts/myCertAuthCertificate.pem

in your inputs.conf (actually this setting is deprecated and you should use the setting from server.conf; if you don't have a separate, different setting there, we can leave it for the moment; if you do, I have no idea how Splunk reacts). That means that you need the client (UF) to present a valid certificate on the connection attempt.

    clientCert = /path/to/your/crypto_material.pem

should be enough on the UF end as long as the key is not encrypted. If it is, you need to set sslPassword. The PEM file must contain the client certificate, the client private key, and (optionally) the certification chain, all concatenated into a single file.

Then on the indexer's end you simply enable requireClientCert, and you're good to go.

Again - don't do too many things at once. One step at a time. And remember to have valid certificates (properly issued, not self-signed, not expired, and so on).
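Pulling those steps together, a sketch of the resulting configuration (the file paths are the ones quoted above; the stanza name, hostname, and indexer-side paths are placeholders):

On the UF, server.conf:

    [sslConfig]
    sslRootCAPath = /opt/splunkforwarder/etc/auth/mycerts/myCertAuthCertificate.pem

On the UF, outputs.conf:

    [tcpout:my_indexers]
    server = indexer.example.com:9997
    sslVerifyServerCert = true
    clientCert = /path/to/your/crypto_material.pem

On the indexer, inputs.conf:

    [splunktcp-ssl:9997]

    [SSL]
    serverCert = /path/to/indexer_cert.pem
    requireClientCert = true

Add sslPassword next to clientCert only if the private key inside the PEM is encrypted.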
Please help me adapt my current request.