All Topics

We are on version 6.3.0 of the AWS add-on, but we are still seeing the errors below in the splunk_ta_aws_aws_sqs_based_s3_ess-p-sys-awsconfig.log file:

2023-03-16 02:55:41,201 level=ERROR pid=72620 tid=Thread-8 logger=splunk_ta_aws.modinputs.sqs_based_s3.handler pos=handler.py:_ingest_file:514 | datainput="ess-p-sys-awsconfig" start_time=1678849700, message_id="39693d02-55gg-4e62-9895-9796fa51ed2f" created=1678902941.093989 ttl=300 job_id=95c972ab-5316-4bda-bfaf-a337ad5effbe | message="Failed to ingest file." uri="s3://essaws-p-system/aws/config/AWSLogs/385473250182/Config/ap-northeast-1/2023/3/15/OversizedChangeNotification/AWS::EC2::SecurityGroup/sg-03e7b8de81918c141/385473250182_Config_ap-northeast-1_ChangeNotification_AWS::EC2::SecurityGroup_sg-03e7b8de81918c141_20230315T171015Z_1678900215945.json.gz"

2023-03-16 02:55:41,201 level=CRITICAL pid=72620 tid=Thread-8 logger=splunk_ta_aws.modinputs.sqs_based_s3.handler pos=handler.py:_process:442 | datainput="ess-p-sys-awsconfig" start_time=1678849700, message_id="39693d02-55gg-4e62-9895-9796fa51ed2f" created=1678902941.093989 ttl=300 job_id=95c972ab-5316-4bda-fgff-ad5effbe | message="An error occurred while processing the message."
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws/modinputs/sqs_based_s3/handler.py", line 431, in _process
    self._parse_csv_with_delimiter,
  File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws/modinputs/sqs_based_s3/handler.py", line 478, in _ingest_file
    for records, metadata in self._decode(fileobj, source):
  File "/opt/splunk/etc/apps/Splunk_TA_aws/bin/splunk_ta_aws/common/decoder.py", line 95, in __call__
    records = document["configurationItems"]
KeyError: 'configurationItems'
Hi, I seem to be having a mental block which maybe someone can help with. I have a dropdown input that runs a query to populate its values. Depending on the value selected, I want to set two more tokens. The condition match should be on the first 6 characters of the selected value: if aaaaaa*, set one pair of tokens; if aaaaab*, set a different pair. All the tokens are then used in panels on the dashboard and should refresh each time the dropdown value changes. Sample XML below that will hopefully give the idea:

<input type="dropdown" token="iptoken" searchWhenChanged="true">
  <label>Apply Engine</label>
  <fieldForLabel>value</fieldForLabel>
  <fieldForValue>value</fieldForValue>
  <search>
    <query>index=ti-u_* sourcetype=$iptoken$ value=aaaaa** | table value | dedup value | sort value</query>
    <earliest>@y</earliest>
    <latest>now</latest>
  </search>
  <change>
    <condition match="$iptoken$==&quot;aaaaaa*&quot;">
      <set token="newtoken1">newvalue1</set>
      <set token="newtoken2">newvalue2</set>
    </condition>
    <condition match="$iptoken$==&quot;aaaaab*&quot;">
      <set token="newtoken1">newvalue1</set>
      <set token="newtoken2">newvalue2</set>
    </condition>
  </change>
</input>

Thanks in advance.
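For reference, the match attribute of a Simple XML <condition> takes an eval expression, so prefix matching is usually written with the match() function rather than a wildcard inside ==. A sketch under that assumption (token values newvalueA1 etc. are placeholders):

```xml
<change>
  <!-- match() takes a regex; ^aaaaaa matches values starting with "aaaaaa" -->
  <condition match="match($iptoken$, &quot;^aaaaaa&quot;)">
    <set token="newtoken1">newvalueA1</set>
    <set token="newtoken2">newvalueA2</set>
  </condition>
  <condition match="match($iptoken$, &quot;^aaaaab&quot;)">
    <set token="newtoken1">newvalueB1</set>
    <set token="newtoken2">newvalueB2</set>
  </condition>
</change>
```

Because the two prefixes differ only in the sixth character, ordering the more specific condition first is not required here, but conditions are evaluated top to bottom, so the first match wins.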
Is there a way I can reduce cost on Splunk by using the AWS Security Lake add-on?
Is the software compatible with large Citrix VAD environments, MCS-provisioned Windows Servers, one image (frontend servers)? For example, 9000 concurrent users and 9000 Java processes?

How do you use environment variables in the config files (agent.properties of the Java agent, analytics-agent.properties of the analytics agent)? Setting environment variables such as APPDYNAMICS_AGENT_UNIQUE_HOST_ID via GPO is not possible, since the hostname changes for each machine. Machines can be dynamically removed and new ones added. If I add the COMPUTERNAME system variable to APPDYNAMICS_AGENT_UNIQUE_HOST_ID, the service won't be able to see it after a reboot, because it is not visible to the process, due to Windows changing the variable while the service is running. It can be a timing issue. Restarting the service is not a useful workaround. I need the configs to use environment variables. No configuration is done per server; everything needs to be done globally.

- Setting APPDYNAMICS_AGENT_NODE_NAME, APPDYNAMICS_AGENT_UNIQUE_HOST_ID, and ad.agent.name needs to be dynamic.
- Setting APPDYNAMICS_AGENT_BASE_DIR and ad.dw.log.path for log paths needs to be dynamic.
- Setting those globally, but with dynamic values.

Using scripts to modify startups is not a useful approach. Java Web Start is used for the application, so JAVA_TOOL_OPTIONS is set, since this is the only working way to start the Java agent with JNLP: Instrumenting Java Web Start Applications - AppDynamics Community

Note: javaws doesn't know the -J-javaagent parameter (tested with Oracle Java and OpenWebStart).
Note: a service can only access UNC paths: Services and Redirected Drives - Win32 apps | Microsoft Learn
Hi all, we have installed SmartStore in our environment and need your help validating SmartStore features. We have followed the links below and were able to verify the connectivity and the remote store; basic testing is fine.

[SmartStore] How to verify splunk indexer connecti... - Splunk Community
Troubleshoot SmartStore - Splunk Documentation

BUT we need to know if there are any SPL queries or steps that show performance has improved after installing SmartStore. We are keen on steps or data that will serve as evidence justifying the SmartStore features.

Thanks and regards
Hi, I'm trying to change the sourcetype of all data from a specific source.

In props.conf:

[source::/var/log/messages]
TRANSFORMS-change_sourcetype = syslog_sourcetype_change

In transform.conf:

[syslog_sourcetype_change]
SOURCE_KEY = MetaData:Sourcetype
REGEX = .*
FORMAT = sourcetype::syslog:nix
DEST_KEY = MetaData:Sourcetype

I checked the running config via btool and the stanzas are correctly configured on my heavy forwarder, but it doesn't work; the logs keep the syslog sourcetype. Thanks in advance.
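For comparison, the standard sourcetype-rewrite pattern is usually written as below (a sketch, not a verified fix for this case). Two things worth noting: the transforms file is named transforms.conf, with an s, and index-time TRANSFORMS only take effect on the first Splunk instance that parses the data, so they will not apply if the events arrive at the heavy forwarder already cooked by another parsing instance.

```ini
# props.conf
[source::/var/log/messages]
TRANSFORMS-change_sourcetype = syslog_sourcetype_change

# transforms.conf (note the plural filename)
[syslog_sourcetype_change]
REGEX = .
FORMAT = sourcetype::syslog:nix
DEST_KEY = MetaData:Sourcetype
```

With DEST_KEY = MetaData:Sourcetype, SOURCE_KEY can stay at its default of _raw; the REGEX only has to match something in the event for the rewrite to fire.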
Hi Splunk experts, I have a set of users whom I want to allow only to run ad-hoc searches. I don't want them creating dashboards, reports, or alerts. How can this be achieved? Any pointers to documentation would be helpful. Thanks in advance, Santosh
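One common approach is a custom role that lacks the scheduling capability, sketched below in authorize.conf. The role name is made up, and capability names and the disabled syntax should be verified against your Splunk version under Settings > Roles:

```ini
# authorize.conf -- hypothetical ad-hoc-only role
[role_adhoc_only]
importRoles = user
# alerts and scheduled reports require schedule_search;
# removing it blocks saving them (syntax supported in recent versions)
schedule_search = disabled
srchIndexesAllowed = main
srchIndexesDefault = main
```

Blocking dashboard creation is typically handled separately, by removing the role's write permission on the relevant apps' knowledge objects rather than through a capability.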
From the events below, I want to extract fields as per my requirements. Please note the events are not listed in the same order each time.

Event 1
gcse1 DB-OK-lpdecpdb0001089-deusw1pgecpsd000083
nemoe1 DB-OK-lpdecpdb0000922-hvidlnssdb01
gcodse1 DB-OK-lpdecpdb0002495-deusw1pgecpsd000198
edmse1 DB-OK-lpdecpdb0002521
vPaymente1 DB-OK-lpdecpdb0001121-deusw1pgecpsd000094
cadence1 DB-OK-lpdecpdb0001269-deusw1pgecpsd000111

Event 2
nemoe1 DB-OK-lpdecpdb0000922-hvidlnssdb01
gcodse1 DB-OK-lpdecpdb0002495-deusw1pgecpsd000198
gcse1 DB-OK-lpdecpdb0001089-deusw1pgecpsd000083
cadence1 DB-OK-lpdecpdb0001269-deusw1pgecpsd000111
edmse1 DB-OK-lpdecpdb0002521
vPaymente1 DB-OK-lpdecpdb0001121-deusw1pgecpsd000094

Event 3
nemoe1 DB-OK-lpdecpdb0000922-hvidlnssdb01
gcodse1 DB-OK-lpdecpdb0002495-deusw1pgecpsd000198
gcse1 DB-OK-lpdecpdb0001089-deusw1pgecpsd000083
cadence1 DB-OK-lpdecpdb0001269-deusw1pgecpsd000111
edmse1 DB-OK-lpdecpdb0002521
vPaymente1 DB-OK-lpdecpdb0001121-deusw1pgecpsd000094

The first column of each line must be extracted as "ClusterName": gcse1, nemoe1, gcodse1, edmse1, vPaymente1, cadence1

The second column must be extracted as "DB_Status": DB-OK

The third column must be extracted as "PostgresDB_VIPName": lpdecpdb0001089, lpdecpdb0000922, lpdecpdb0002495, lpdecpdb0002521, lpdecpdb0001121, lpdecpdb0001269
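A rex along these lines might do it, assuming the VIP name is always the token immediately after the DB status and stops at the next hyphen. Since each event contains several such lines, max_match=0 makes the extracted fields multivalue:

```
... your base search ...
| rex max_match=0 "(?<ClusterName>\S+)\s+(?<DB_Status>DB-\w+)-(?<PostgresDB_VIPName>[^-\s]+)"
```

Lines like "edmse1 DB-OK-lpdecpdb0002521", which have no trailing host suffix, still match, because [^-\s]+ simply runs to the end of the token.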
I'm having a problem creating a props configuration: I'm seeing "could not use strptime to parse timestamp". The logs below come from Docker:

{"log":"[20:52:02] [/home/a153509/.local/share/code-server/extensions/ms-toolsai.jupyter-2022.9.1303220346]: Extension is not compatible with Code 1.66.2. Extension requires: 1.72.0.\n","stream":"stderr","time":"2023-03-06T20:52:02.219440215Z"}
{"log":"[20:52:02] [/home/a153509/.local/share/code-server/extensions/ms-python.vscode-pylance-2023.1.10]: Extension is not compatible with Code 1.66.2. Extension requires: 1.67.0.\n","stream":"stderr","time":"2023-03-06T20:52:02.219891147Z"}
{"log":"[20:52:02] [\u003cunknown\u003e][80d9f7e6][ExtensionHostConnection] New connection established.\n","stream":"stdout","time":"2023-03-06T20:52:02.604222684Z"}
{"log":"[20:52:02] [\u003cunknown\u003e][80d9f7e6][ExtensionHostConnection] \u003c1453\u003e Launched Extension Host Process.\n","stream":"stdout","time":"2023-03-06T20:52:02.617643295Z"}
{"log":"[IPC Library: Pty Host] INFO Persistent process \"1\": Replaying 505 chars and 1 size events\n","stream":"stdout","time":"2023-03-06T20:52:06.927032062Z"}
{"log":"[IPC Library: Pty Host] WARN Shell integration cannot be enabled for executable \"/bin/bash\" and args undefined\n","stream":"stdout","time":"2023-03-06T20:52:56.754368802Z"}
{"log":"[20:57:00] [\u003cunknown\u003e][1af3f49a][ExtensionHostConnection] \u003c766\u003e Extension Host Process exited with code: 0, signal: null.\n","stream":"stdout","time":"2023-03-06T20:57:00.839578031Z"}
{"log":"[02:12:50] [\u003cunknown\u003e][adf26d01][ManagementConnection] The client has disconnected, will wait for reconnection 3h before disposing...\n","stream":"stdout","time":"2023-03-07T02:12:50.789255518Z"}
{"log":"[05:12:59] [\u003cunknown\u003e][adf26d01][ManagementConnection] The reconnection grace time of 3h has expired, so the connection will be disposed.\n","stream":"stdout","time":"2023-03-07T05:12:59.567198587Z"}
{"log":"[13:16:53] [\u003cunknown\u003e][adf26d01][ManagementConnection] Unknown reconnection token (seen before)\n","stream":"stderr","time":"2023-03-07T13:16:53.295162729Z"}
{"log":"[13:16:53] [\u003cunknown\u003e][80d9f7e6][ExtensionHostConnection] The client has reconnected.\n","stream":"stdout","time":"2023-03-07T13:16:53.453120386Z"}

Here is my props.conf (auto-learned):

SHOULD_LINEMERGE = false
LINE_BREAKER = ([\n\r]+)\s*(\{"log":")
NO_BINARY_CHECK = true
TIME_PREFIX = "time"
MAX_TIMESTAMP_LOOKAHEAD = 48
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.9N%z
TRUNCATE = 999999
CHARSET = UTF-8
KV_MODE = json
ANNOTATE_PUNCT = false
Hello Community, I am having issues connecting my universal forwarder to a heavy forwarder. I have the following setup: UF --> HF --> IDX. I can see the logs from HF to IDX, but I'm not sure why I cannot see logs from UF to HF. The HF --> IDX connection is [splunktcp-ssl], whereas the UF --> HF connection is [tcpout]. My question is how to troubleshoot the broken connection? I read the UF logs but still cannot find the issue. Any help much appreciated. Thank you all!
Hi, I have a lookup table whose column names are weekdays (monday, tuesday, wednesday, ...) with possible values of 1 and 0 only. What I want to achieve:

...some query
| eval day=strftime(now(),"%A")
| where 'day'=1

but this doesn't seem to be working. Any idea how to search dynamic fields? Thanks
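The where clause above compares the string stored in day to 1; it does not look up the field whose name is stored in day. One workaround is foreach, which can template over the weekday columns. A sketch, assuming the lookup's columns are lowercase day names already present in the results:

```
...some query
| eval day=lower(strftime(now(),"%A")), matched=0
| foreach monday tuesday wednesday thursday friday saturday sunday
    [ eval matched=if(day=="<<FIELD>>", '<<FIELD>>', matched) ]
| where matched=1
```

Inside the foreach subsearch, "<<FIELD>>" is replaced by the literal field name and '<<FIELD>>' by that field's value, so matched picks up the value of whichever column matches today's name.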
Hi team, I want to set up email and Slack alerts for when error code 405 occurs in the NGINX access logs. Splunk should trigger the alert whenever a 405 error code appears.
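As a sketch, the alert's base search can be very simple; the index, sourcetype, and field name below are assumptions (NGINX access logs commonly expose the HTTP response code as a status field once extracted):

```
index=<your_index> sourcetype=<your_nginx_sourcetype> status=405
```

Saved as an alert (e.g. trigger when number of results > 0), it can carry both an email action and a Slack action, the latter via a Slack notification add-on configured with a webhook.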
Hi all, we have a newly installed ES cluster where we cannot see any actions populating in Adaptive Response. We tried installing ES on a standalone server and it works fine. Below is the error we are getting. The Splunk version is 8.2 and ES is 7.0.2. Thanks in advance.
Hi guys, we have a Windows Controller which we recently upgraded, and we have end-to-end SSL from the BigIP F5 to the Controller working. When we log into the admin console (i.e. admin.jsp) and back into the normal Controller UI, the "App Server" tier doesn't show the node. Does anyone know the path where the Java agent is located on the Windows Controller?
Hi everyone, I'm trying to view the events from Azure AD MFA in Splunk Cloud.
Use the sign-ins report to review Azure AD Multi-Factor Authentication events - Azure Documentation
https://learn.microsoft.com/en-us/azure/active-directory/authentication/howto-mfa-reporting
We already have the Microsoft Office 365 App and Add-on in Splunk, but in the authentication logs we are not seeing anything related to MFA data. Is it supposed to be in Splunk already? Am I missing something? Is there another way to collect that report into Splunk? Thanks!
In Incident Review, one can create a filter and save it as a default.  Where does it store that configuration so I can push it across multiple ES instances?
I am working on a custom alert app to replace our old custom alert script action. It was working fine, but all of a sudden I am no longer getting the --execute argument passed, and my script doesn't work any more. Here is the code:

if __name__ == "__main__":
    # clear logs
    now = datetime.now()
    dt_string = now.strftime("%d/%m/%Y %H:%M:%S")
    log(dt_string + ": Start Version 1.2", "w")
    log("Checking to see if we have any arguments...")
    log("Number of arguments: " + str(len(sys.argv)))
    if len(sys.argv) > 1 and sys.argv[1] == "--execute":
        log("We have arguments.")
        try:
            payload = json.loads(sys.stdin.read())
            result_file = payload['results_file']
            # Pass the payload to main for processing....
            main(payload)
            # End
            now = datetime.now()
            dt_string = now.strftime("%d/%m/%Y %H:%M:%S")
            log(dt_string + ": Processing complete.")
        except:
            log("We have an error on settings, exiting")
            sys.exit()
    else:
        log("There were no arguments. Exiting.")
        sys.exit()

Here is the output of my logging:

16/03/2023 10:55:16: Start Version 1.2
Checking to see if we have any arguments...
Number of arguments: 1
There were no arguments. Exiting.

I have no idea what the --execute argument is, how it is passed, or what it actually means, and I can't find much about it. Hoping someone can shed some light here. Thanks!
Hi all, I have one universal forwarder that is reporting to the deployment server and sending internal logs, but I am not getting any data into the index. The logs are present on the server. How do I troubleshoot this kind of issue?
Hello all, I have three individual searches for single value visualizations; the value for each viz is the sum of a field. I have bytes, bytes_in, and bytes_out, and each search is | stats sum(bytes) as Total, | stats sum(bytes_in) as In, or | stats sum(bytes_out) as Out. So: three searches, one per field, and a single value viz for each. I have looked at the trellis viz, but it is not much help. My actual SPL uses the same formula for each field:

index=squid
| stats sum(bytes_in) as TotalBytes
| eval gigabytes=TotalBytes/1024/1024/1024
| rename gigabytes as "Bytes In"
| table "Bytes In"

Is there some way to put all three stats calculations in the same search, so that maybe the trellis can pick up each one? I also looked at trying to put each single value in a 3-column by 1-row table, etc. How can this be accomplished? Thanks again, eholz1
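For reference, one stats call can produce all three sums, and foreach can apply the same byte-to-GB conversion to each field. A sketch against the index named in the question:

```
index=squid
| stats sum(bytes) as Total sum(bytes_in) as In sum(bytes_out) as Out
| foreach Total In Out
    [ eval <<FIELD>>=round('<<FIELD>>'/1024/1024/1024, 2) ]
```

This yields one row with three columns; piping it through | transpose turns it into three rows, a shape that trellis (or three separate single-value panels sharing a base search) can split on.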
I can't seem to find an efficient way to do this. I have a text box where a user's first and last name is entered; depending on the search, the token will be used, but the text box value is "first last" and I need to transform it to either first.last or first-last. Please help, as everything I have tried does not work.
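One commonly used approach is an eval change handler on the text input, which derives the extra tokens whenever the value changes. A Simple XML sketch (the token names here are made up):

```xml
<input type="text" token="fullname" searchWhenChanged="true">
  <label>User name (first last)</label>
  <change>
    <!-- "first last" becomes "first.last" and "first-last" -->
    <eval token="name_dot">replace($value$, " ", ".")</eval>
    <eval token="name_dash">replace($value$, " ", "-")</eval>
  </change>
</input>
```

The panels would then reference $name_dot$ or $name_dash$ as appropriate, instead of the raw $fullname$ token.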