All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hi, we have a client looking to ingest logs from one of their applications (CareMonitor) from an S3 bucket using webmethod. We have never used webmethod to fetch logs before. Could you please let me know the process for this method and any best practices? We are currently using Splunk Cloud Platform, and CareMonitor appears to be a cloud-based application.
How do I start the Elasticsearch process? It's not mentioned in the AppDynamics documentation's Events Service installation section.
I have a simple question: how can I check which apps a particular index is used in?
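One hedged starting point (a sketch only; the endpoint and fields are from Splunk's standard REST interface, and "your_index" is a placeholder) is to list saved searches via `| rest` and look for references to the index, reading the owning app from `eai:acl.app`:

```
| rest /servicesNS/-/-/saved/searches splunk_server=local
| search qualifiedSearch="*index=your_index*"
| table title, eai:acl.app
```

This only finds saved searches; dashboards and inline searches would need a similar pass over `data/ui/views` or a search of the _audit index.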
How do I ingest the following file based on trigger time and elapsed time?

"File name","AUTO_231126_012051_0329.CSV","V2.10"
"Title comment","T1"
"Trigger Time","'23-11-26 01:20:51.500"
"CH","U1-2","Event"
"Mode","Voltage"
"Range","200mV"
"UnitID",""
"Comment",""
"Scaling","ON"
"Ratio","+1.00000E+02"
"Offset","+0.00000E+00"
"Time","U1-2[]","Event"
+0.000000000E+00,+2.90500E+00,0
+1.000000000E-01,+1.45180E+01,0
+2.000000000E-01,+7.93600E+00,0
+3.000000000E-01,+3.60100E+00,0
+4.000000000E-01,+3.19100E+00,0
+5.000000000E-01,+3.17300E+00,0
+6.000000000E-01,+3.17300E+00,0
+7.000000000E-01,+3.18400E+00,0
+8.000000000E-01,+3.19400E+00,0
+9.000000000E-01,+3.16500E+00,0
+1.000000000E+00,+3.16000E+00,0
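One way to see what the timestamping has to do is to pre-process the file outside Splunk: combine the "Trigger Time" header with each data row's elapsed-time offset to get an absolute timestamp per sample. A minimal Python sketch (the two-row sample is abbreviated from the file above; the date format with its leading apostrophe and 2-digit year is taken from the "Trigger Time" value):

```python
import csv
import io
from datetime import datetime, timedelta

# Abbreviated sample: one header line we care about, the column header, two data rows.
raw = '''"Trigger Time","'23-11-26 01:20:51.500"
"Time","U1-2[]","Event"
+0.000000000E+00,+2.90500E+00,0
+1.000000000E-01,+1.45180E+01,0
'''

trigger = None
rows = []
for rec in csv.reader(io.StringIO(raw)):
    if rec and rec[0] == "Trigger Time":
        # Value looks like '23-11-26 01:20:51.500 — strip the apostrophe, 2-digit year.
        trigger = datetime.strptime(rec[1].lstrip("'"), "%y-%m-%d %H:%M:%S.%f")
    elif rec and rec[0].startswith("+"):
        elapsed, value, event = float(rec[0]), float(rec[1]), int(rec[2])
        # Absolute timestamp = trigger time + elapsed seconds.
        rows.append((trigger + timedelta(seconds=elapsed), value, event))

for ts, value, event in rows:
    print(ts.isoformat(sep=" "), value, event)
```

The same arithmetic could be done at search time with strptime() on the header row plus the elapsed offset, but the index-time route requires the events to carry both pieces of information.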
I want to combine these two events. Can anyone help? I have tried the join and append commands without success. I want to analyze data in the 'Endpoint' model to capture information related to 'mmc', and then compare the 'ID' to retrieve IP and port information from the 'Network' model.
Hi, I have 2 indexers running in a cluster, and on the cluster master I am getting the error "Missing suitable candidates to create replicated copy in order to meet replication policy." I tried roll and resync, but the same error keeps coming back. If the buckets can't be rolled and resynced, my only choice is to delete them, but how do I mass-delete instead of clicking to delete them one by one?
location that is AM. Where AM location should be the combination of (AB,AC,AD,AF) Like this i need:

First, "the combination of (AB,AC,AD,AF)" is NOT "AM" in your illustration.  The illustration is the opposite of what you described. (Also, please use a text table instead of a screenshot.)

Second, for your initial question, I notice that you filter for "HU1","IA2","IB0".  Of course you will only get whatever definition you give for these three.  I think what you wanted is

```|search location IN ("HU1","IA2","IB0")``` ``` ^^^ no filtering ```
|eval row=if(location IN ("HU1","IA2","IB0"),location,"AM")
|stats c by row

But back to your new example.  Here is an emulation:

| makeresults format=csv data="Location
AB
AC
AD
AE
AF
AG
AH"
``` data emulation above ```
| eval "New Location" = if(Location IN ("AB","AC","AD","AE","AF"),Location,"AM")

Location  New Location
AB        AB
AC        AC
AD        AD
AE        AE
AF        AF
AG        AM
AH        AM

Is this what you illustrated?
Hi everyone, I am using the Splunk forwarder and I have the following requirement. We have log files under the path /opt/airflow/logs/*/*/*/*.log, for example:

/opt/airflow/logs/getServerInfo/some_run_id/get_uptime/1.log
/opt/airflow/logs/build_upgrade/some_run_id/ami_snapshot_task/5.log

Now I want to extract the field some_run_id from the log file path and add it to each log line while sending the logs to Splunk. Below is my normal log format:

[2024-01-17, 03:17:02 UTC] {subprocess.py:89} INFO - PLAY [Gather host information]
[2024-01-17, 03:17:01 UTC] {taskinstance.py:1262} INFO - Executing <Task(BashOperator): get_os_info> on 2024-01-17 03:16:37+00:00
[2024-01-17, 03:17:01 UTC] {standard_task_runner.py:52} INFO - Started process 1081826 to run task

And this is the format of logs I want in Splunk (in Splunk only, not in the actual log files):

some_run_id [2024-01-17, 03:17:02 UTC] {subprocess.py:89} INFO - PLAY [Gather host information]
some_run_id [2024-01-17, 03:17:01 UTC] {taskinstance.py:1262} INFO - Executing <Task(BashOperator): get_os_info> on 2024-01-17 03:16:37+00:00
some_run_id [2024-01-17, 03:17:01 UTC] {standard_task_runner.py:52} INFO - Started process 1081826 to run task

Any help is much appreciated!
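The usual Splunk-idiomatic alternative to rewriting the raw event text is an index-time field extracted from the source path. A minimal sketch, with the stanza names and the regex as assumptions to adapt (props/transforms go on the indexers or a heavy forwarder, not the universal forwarder):

props.conf:

[source::/opt/airflow/logs/...]
TRANSFORMS-run_id = airflow_run_id

transforms.conf:

[airflow_run_id]
SOURCE_KEY = MetaData:Source
REGEX = /opt/airflow/logs/[^/]+/([^/]+)/
FORMAT = run_id::$1
WRITE_META = true

fields.conf:

[run_id]
INDEXED = true

This gives every event a searchable run_id field rather than literally prefixing _raw; if the raw text itself must change, that would need an ingest-time _raw rewrite instead.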
Great, this works. I went ahead and added an eval to it:

| table client,transaction | eval user_transaction = client . "-" . transaction

Now the second query returns the result below:

2024-01-16 19:08:13.3284 [43] INFO [.ServiceClassTraCack] 0LO19-1901631 Report Finished successfully at 7:08:13 PM on 1/16/2024

My first query returns 0LO19-1901631; I want to match these results against the query above, along with the "Report Finished" snippet.
Hello - I'd like to start by thanking the community for reviewing and helping!

Problem statement: I have appointment data from multiple clinical locations in Splunk with different types of statuses. I am trying to create a dashboard that shows trends in appointment requests, so we can see whether we're gaining or losing patients, and which days are the busiest and slowest.

Query:

index="index" cluster_id="*" dump_info:98
| spath output=log path=log
| rex field=log ".*\{\'name\'\:\s\'(?<name>.*)\'\,\s\'service_type\'\:\s\'(?<service_type>.*)\'\,\s\'status\'\:\s\'(?<status>.*)\'\,\s\'start\'\:\s\'(?<start>.*)\'\,\s\'lastUpdated\'\:\s\'(?<lastUpdated>.*)\'\,\s\'date\'\:\s\'(?<date>.*)\'\}"
| search name="*" AND status="*" AND start="*"
| dedup name service_type status start lastUpdated date
| eval startdate=strftime(strptime(start,"%Y-%m-%dT%H:%M:%SZ"),"%Y-%m-%d"), today=strftime(now(),"%Y-%m-%d")
| where startdate=today
| table name, status
| stats count(status) as status_count, values(*) as * by name, status
Changing the permissions on the splunk.key file to read only/400 did the trick for me. Thanks!!!
Hi, I had the same output on a CentOS 7 machine. I added the -v option to get more verbosity and saw that the installer could not generate the certificate:

Creating HTTPS cert...
Aborting https cert create. File already exists
Shell command: openssl x509 -in /opt/phantom/etc/ssl/certs/httpd_cert.crt -pubkey -noout
Initialization function create_https_cert failed!
Traceback (most recent call last):
  File "/opt/phantom/bin/initialize.py", line 965, in initialize
    func()
  File "/opt/phantom/bin/initialize.py", line 334, in create_https_cert
    cert_tools.create_https_cert(group=group, force=force)
  File "pycommon3/phantom_common/cert_tools.py/cert_tools.py", line 123, in create_https_cert
  File "pycommon3/phantom_common/phproc.py/phproc.py", line 269, in run
  File "pycommon3/phantom_common/phproc.py/phproc.py", line 379, in __init__
  File "/opt/phantom/usr/python39/lib/python3.9/subprocess.py", line 951, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/opt/phantom/usr/python39/lib/python3.9/subprocess.py", line 1821, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'openssl'
Done.

I installed openssl and was then able to complete the installation.
Try something like this:

| rex "client\s'(?<client>[^']*)'"
| rex "transaction\s'(?<transaction>[^']*)'"
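As a quick sanity check, the same patterns can be tried against the sample line from the question in plain Python (note Python's re uses (?P<name>...) where Splunk's rex accepts (?<name>...); plain groups are used here to sidestep that difference):

```python
import re

# Sample log line from the question.
line = ("2024-01-16 15:04:22.7117 [135] INFO [javalang] Starting Report "
        "for client '0SD45' user 'user1' for transaction '123456'")

# Capture whatever sits between the single quotes after each keyword.
client = re.search(r"client\s'([^']*)'", line).group(1)
transaction = re.search(r"transaction\s'([^']*)'", line).group(1)
print(client, transaction)  # 0SD45 123456
```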
Hi, I was recently given a task to create an app for a specific department in my org which will have only 2-3 selected indexes. Please guide me through the high-level steps.
This is a known issue in versions before 6.2, where it is fixed. Until you're able to upgrade, you have a couple of options:

- Press Ctrl+A on the output so it is all highlighted
- Change to the light theme: click your username in the upper right > Account Settings > change Theme to Light Theme
I got the same error when sslPassword was wrong for a particular outputs.conf entry. I determined that even if you disable cert validation, sslCertPath, sslPassword, and sslRootCAPath must all point at valid files; they aren't inherited from a higher level, so they must be set if the remote host uses SSL. You can check whether they are set elsewhere (so you can copy and paste them into your config) by running:

$SPLUNK_HOME/bin/splunk btool outputs list --debug
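For reference, a minimal outputs.conf sketch with all three SSL attributes set explicitly (the group name, hostname, paths, and password are placeholders to adapt, not values from this thread):

[tcpout:primary_group]
server = idx1.example.com:9997
useSSL = true
sslCertPath = $SPLUNK_HOME/etc/auth/server.pem
sslPassword = your-cert-password
sslRootCAPath = $SPLUNK_HOME/etc/auth/cacert.pem
sslVerifyServerCert = false

Even with sslVerifyServerCert = false, the three paths above must resolve to readable files on the forwarder.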
Sorry, I am a noob at regex, and Splunk regex especially. I need a regex to extract everything between the two single quotes; there will never be a single quote in the name. For example, extract the client code after the word "client", and the same for "transaction":

2024-01-16 15:04:22.7117 [135] INFO [javalang] Starting Report for client '0SD45' user 'user1' for transaction '123456'

@fieldextraction  @Anonymous
@yuanliu Let's assume, for example, that my data has location sites AB, AC, AD, AF. I want a new location, AM, where the AM location should be the combination of (AB, AC, AD, AF), like in my illustration. I tried the query you mentioned earlier, but it doesn't work.
How do I create a trend graph history for Tenable vulnerability data? I would like it to show a daily timeline.
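A minimal search sketch, assuming the data lands in an index named "tenable" with the Tenable Add-on's sourcetype (both names are assumptions; adjust to your environment):

index=tenable sourcetype="tenable:sc:vuln"
| timechart span=1d count by severity

timechart with span=1d gives the daily buckets, and splitting by severity turns the panel into a stacked trend of vulnerability counts over time.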