All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Just want to ask: is the approach to mapping groups/users in the AppDynamics controller the same in Account Portal? I already configured the SAML federation in Azure. Then, in Azure, I added a group and mapped it in Account Portal for SSO. However, users are still required to enter a password upon login. Is there something I missed?
This is the current query, but it's not really providing the needed data for the search: index=main sourcetype=XmlWinEventLog EventCode=4624 Logon_Type=3 | transaction src maxspan=10m maxpause=2m | stats dc(dest) as Dest_Count, values(dest) as Target_Systems by src | search Dest_Count > 35 | sort - Dest_Count. I don't really care about the Dest_Count > 35; it was an attempt to gather something to start with. I was told to research the transaction command to obtain the required results.
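For counting distinct destinations per source within a time window, `bin` plus `stats` is usually a cheaper and more predictable alternative to `transaction`, which is memory-intensive and unnecessary when you only need counts. A sketch, reusing the index, sourcetype, and field names from the query above; the 10-minute window and the threshold are placeholders to tune:

```
index=main sourcetype=XmlWinEventLog EventCode=4624 Logon_Type=3
| bin _time span=10m
| stats dc(dest) as Dest_Count, values(dest) as Target_Systems by src, _time
| where Dest_Count > 35
| sort - Dest_Count
```

This groups logons into fixed 10-minute buckets per source rather than the sliding sessions `transaction` builds, which is often close enough for a lateral-movement-style detection and scales far better.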
Hi, My search result brings back a GUID in the ID field. The GUID refers to a customer. I would like it to show the customer's name instead. Can I make a Splunk search do this every time? Best wishes, Michael
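The usual way to do this is a lookup table that maps GUIDs to names. A minimal sketch, assuming a hypothetical CSV lookup named customer_guids.csv with columns guid and customer_name:

```
... | lookup customer_guids.csv guid AS ID OUTPUT customer_name
```

To make it happen "every time" without adding the lookup command to each search, the lookup can be configured as an automatic lookup on the sourcetype (Settings > Lookups > Automatic lookups, or a LOOKUP- stanza in props.conf), so customer_name is populated at search time for every matching event.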
Hi, I need to filter data to reduce my ingestion volume, and for that I need to change the two files below. But I don't see these files in any option in the Splunk Cloud Platform. I am attaching the settings menu in Splunk Cloud. I have sc_admin privileges but still don't see these files anywhere. Can you please let me know where to find these files and how to edit them in Splunk Cloud? File props.conf: [httpevent] TRANSFORMS-t1=eliminate-okhttp3 Below needs to be edited in transforms.conf: [eliminate-okhttp3] REGEX = okhttp3 DEST_KEY = queue FORMAT = nullQueue Thanks, Dee
There is an SPL search, ending with stats, that generates 300 events. That search, let's call it "SEARCH-1", is saved as a saved search, and in the saved search one extra line is added, i.e. | collect index=sec_apps_summary source="savedSearch_1d" with earliest and latest set to -1@d and @d. There is another SEARCH-2 that invokes the saved search, and the SPL starts like | index=sec_apps_summary source="savedSearch_1d" .... What confuses me is that SEARCH-1 and SEARCH-2 should show the same result count, but I see 300 events for SEARCH-1 and only 16 events for SEARCH-2. I suspect something about the way the saved search is utilized; I don't quite understand the difference in results. Any idea why?
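One common cause of this mismatch: `collect` preserves each result's original `_time` (unless `addtime` behavior dictates otherwise), so a search over the summary index only returns the collected events whose `_time` falls inside its own time range picker. Also note that a search over an index should not begin with a pipe. A sketch for verifying how many summary events actually landed, using the index and source names from the post and a deliberately wide, explicit window:

```
index=sec_apps_summary source="savedSearch_1d" earliest=-7d@d latest=now
| stats count
```

If this returns 300 while the original SEARCH-2 returns 16, the discrepancy is the time range, not the collection itself.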
Hi All, We are facing an issue with the Splunk Add-on for AWS where the Configuration and Inputs pages in the UI aren't loading; they keep showing a loading spinner without ever rendering. We are observing it after we recently upgraded Splunk Enterprise to 8.2.3.3. We also updated the add-on to the latest version, 5.2.1. In splunkd.log we can see some REST errors. PFB:

12-23-2021 07:40:25.606 -0500 ERROR AdminManagerExternal [11556 TcpChannelThread] - Unexpected error "<class 'splunklib.binding.HTTPError'>" from python handler: "HTTP 500 Internal Server Error -- Unexpected error "<class 'splunktaucclib.rest_handler.error.RestError'>" from python handler: "REST Error [500]: Internal Server Error -- Traceback (most recent call last):\n File "/opt/splunk/hf/etc/apps/Splunk_TA_aws/bin/3rdparty/python3/urllib3/connectionpool.py", line 667, in urlopen\n self._prepare_proxy(conn)\n File "/opt/splunk/hf/etc/apps/Splunk_TA_aws/bin/3rdparty/python3/urllib3/connectionpool.py", line 930, in _prepare_proxy\n conn.connect()\n File "/opt/splunk/hf/etc/apps/Splunk_TA_aws/bin/3rdparty/python3/urllib3/connection.py", line 316, in connect\n self._tunnel()\n File "/opt/splunk/hf/lib/python3.7/http/client.py", line 931, in _tunnel\n message.strip()))\nOSError: Tunnel connection failed: 403 Forbidden\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File "/opt/splunk/hf/etc/apps/Splunk_TA_aws/bin/3rdparty/python3/solnlib/packages/requests/adapters.py", line 449, in send\n timeout=timeout\n File "/opt/splunk/hf/etc/apps/Splunk_TA_aws/bin/3rdparty/python3/urllib3/connectionpool.py", line 725, in urlopen\n method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]\n File "/opt/splunk/hf/etc/apps/Splunk_TA_aws/bin/3rdparty/python3/urllib3/util/retry.py", line 439, in increment\n raise MaxRetryError(_pool, url, error or ResponseError(cause))\nurllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='0', port=8089): Max retries exceeded with url: /servicesNS/nobody/Splunk_TA_aws/configs/conf-aws_sqs_tasks/_reload (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 403 Forbidden')))\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File "/opt/splunk/hf/etc/apps/Splunk_TA_aws/bin/3rdparty/python3/splunktaucclib/rest_handler/handler.

Error from python.log:

Traceback (most recent call last):
File "/opt/splunk/hf/etc/apps/Splunk_TA_aws/bin/3rdparty/python3/solnlib/splunk_rest_client.py", line 145, in request verify=verify, proxies=proxies, cert=cert, **kwargs)
File "/opt/splunk/hf/etc/apps/Splunk_TA_aws/bin/3rdparty/python3/solnlib/packages/requests/api.py", line 60, in request return session.request(method=method, url=url, **kwargs)
File "/opt/splunk/hf/etc/apps/Splunk_TA_aws/bin/3rdparty/python3/solnlib/packages/requests/sessions.py", line 533, in request resp = self.send(prep, **send_kwargs)
File "/opt/splunk/hf/etc/apps/Splunk_TA_aws/bin/3rdparty/python3/solnlib/packages/requests/sessions.py", line 646, in send r = adapter.send(request, **kwargs)
File "/opt/splunk/hf/etc/apps/Splunk_TA_aws/bin/3rdparty/python3/solnlib/packages/requests/adapters.py", line 510, in send raise ProxyError(e, request=request)
solnlib.packages.requests.exceptions.ProxyError: HTTPSConnectionPool(host='0', port=8089): Max retries exceeded with url: /servicesNS/nobody/Splunk_TA_aws/configs/conf-aws_sqs_tasks/_reload (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 403 Forbidden')))

It has only impacted the UI; the inputs continue to work in the background. Any help with this would be appreciated. Regards, Sumeet
Does anyone know how to write log events to another index via SPL? I'm assuming the answer is something like writing records from a lookup file to an index by executing an SPL search on a schedule. If there is a way like that, please share your answer.
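The standard SPL mechanism for this is the `collect` command, which writes search results into another index (typically a summary index the user has write access to). A sketch of the lookup-to-index pattern described above, run as a scheduled search; the lookup name my_records.csv and the index my_summary are hypothetical:

```
| inputlookup my_records.csv
| eval _time=now()
| collect index=my_summary source="lookup_import" sourcetype="stash"
```

Any search can end in `| collect index=...`, so the same approach works for copying arbitrary search results, not just lookup contents.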
ERROR TailReader - File will not be read, is too small to match seekptr checksum (file=c:\logs\MailBoxAudit\mailboxaudit_23_12_2021_13_48.csv). Last time we saw this initcrc, filename was different. You may wish to use larger initCrcLen for this sourcetype, or a CRC salt on this source.

inputs.conf (/opt/splunk/etc/deployment-apps/TA-Exchange-Mailbox/local/inputs.conf):

[monitor://c:\logs\MailBoxAudit]
whitelist = \.csv$|\.CSV$
sourcetype = csv
index = indexname
disabled = false
crcSalt = <SOURCE>
initCrcLength = 8192

After making this change to inputs.conf, I ran /opt/splunk/bin/splunk reload deploy-server -class heavy_forwarders for the changes to be accepted. The file makes it to the index, but some files do not start being read. What could be the problem?
I would like to build SPL searches against Salesforce logs for the following requirements: (1) users who have failed login multiple times every day for a week or more; (2) detecting cases where multiple user IDs have been locked out of login from the same IP. What SPL should I write?
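A sketch for both requirements, with heavily hedged assumptions: the index name salesforce, the sourcetype sfdc:loginhistory, the field names USER_NAME, SOURCE_IP, and LOGIN_STATUS, and the status values are all placeholders that depend on which Salesforce add-on is ingesting the data and must be adjusted to match your events. For requirement (1), users failing login more than once per day on 7 or more days:

```
index=salesforce sourcetype="sfdc:loginhistory" LOGIN_STATUS!="LOGIN_NO_ERROR"
| bin _time span=1d
| stats count AS daily_failures BY USER_NAME, _time
| where daily_failures > 1
| stats dc(_time) AS days_with_failures BY USER_NAME
| where days_with_failures >= 7
```

For requirement (2), the same pattern grouped by IP: filter on the lockout status value your data actually uses, then `| stats dc(USER_NAME) AS locked_users, values(USER_NAME) AS users BY SOURCE_IP | where locked_users > 1`.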
Hello everyone, According to the Splunk blog Splunk Security Advisory for Apache Log4j (CVE-2021-44228 and CVE-2021-45046) | Splunk, the affected versions are: "All supported non-Windows versions of 8.1.x and 8.2.x only if DFS is used." I'm using a Splunk Enterprise Search Head & Indexer on version 7.3.1, and I can see various log4j-1.2.17.jar files under locations such as "/bin/jars/vendors/spark/2.3.0/lib/", "/etc/apps/splunk_app_db_connect/bin/lib/", and "/etc/apps/splunk_archiver/java-bin/jars/vendors/spark/". I am also attaching the result I received from a search query to determine if DFS is enabled on my Splunk servers. Should I be concerned about this vulnerability? Also, to remediate, do I just need to replace this log4j-1.2.17.jar with the latest files directly in the respective directories, or do I need to make changes in the conf files as well? Any help will be appreciated. Thank you!
Our TA is cloud vetted on Splunkbase. It works on the Splunk Cloud Classic Experience, whereas the same TA throws the below error when accessing the setup page on the Splunk Cloud Victoria Experience. Please let me know how to identify the root cause and resolve the error. Thanks in advance.
Currently all our applications are using version 1.6.0 of splunk-library-javalogging. This dependency pulls in the vulnerable log4j2 version, and I am trying to upgrade to the latest version, 1.11.3. I downloaded the dependency from the Splunk website and tried installing it using Maven. The installation is successful, but this version is unable to recognize the appenders we have configured in log4j2.xml. Below are the errors I am seeing when using the latest splunk-library-javalogging dependency:

2021-12-22 14:12:55,484 WrapperListener_start_runner ERROR Error processing element HttpSplunk ([AppenderSet: null]): CLASS_NOT_FOUND
2021-12-22 14:12:55,485 WrapperListener_start_runner ERROR Error processing element HttpSplunk ([AppenderSet: null]): CLASS_NOT_FOUND
2021-12-22 14:12:55,642 WrapperListener_start_runner ERROR No node named SplunkDevAppender in org.apache.logging.log4j.core.appender.AppenderSet@7ac97772
2021-12-22 14:12:55,643 WrapperListener_start_runner ERROR Null object returned for ScriptAppenderSelector in Appenders.
2021-12-22 14:12:55,649 WrapperListener_start_runner ERROR Unable to locate appender "SelectSplunkInstance" for logger config "root"

Log4j2 configuration:
=============================
<Configuration status="warn" name="splunk-cloudhub" packages="com.splunk.logging,com.mulesoft.ch.logging.appender">
    <Properties>
        <Property name="target.env">${sys:env}</Property>
        <Property name="target.app">${project.name}</Property>
        <Property name="dev.splunk.host">{{host}}</Property>
        <Property name="dev.splunk.token">{{token}}</Property>
    </Properties>
    <Appenders>
        <ScriptAppenderSelector name="SelectSplunkInstance">
            <Script language="JavaScript"><![CDATA[
                "${target.env}".search("prd") > -1 ? "SplunkPrdAppender" : "SplunkDevAppender";]]>
            </Script>
            <AppenderSet>
                <HttpSplunk name="SplunkDevAppender"
                        url="https://${dev.splunk.host}"
                        token="${dev.splunk.token}"
                        host=""
                        index="index1"
                        source="${target.app}-${target.env}"
                        sourcetype="application"
                        messageFormat="json"
                        middleware=""
                        send_mode="sequential"
                        batch_size_bytes="0"
                        batch_size_count="0"
                        batch_interval="0"
                        disableCertificateValidation="true">
                    <PatternLayout pattern="%m"/>
                </HttpSplunk>
                <HttpSplunk name="SplunkPrdAppender"
                        url="https://${prd.splunk.host}"
                        token="${prd.splunk.token}"
                        host=""
                        index="index2"
                        source="${target.app}-${target.env}"
                        sourcetype="application"
                        messageFormat="json"
                        middleware=""
                        send_mode="sequential"
                        batch_size_bytes="0"
                        batch_size_count="0"
                        batch_interval="0"
                        disableCertificateValidation="true">
                    <PatternLayout pattern="%m"/>
                </HttpSplunk>
            </AppenderSet>
        </ScriptAppenderSelector>
    </Appenders>
</Configuration>
=============================
Could you please let us know if there are any additional steps we have to follow to get this to work? Attaching our log4j2 file for your reference.
Hi Experts, We are planning to decommission on-prem Splunk Enterprise 8.0. Can anyone advise on how to back up and archive the existing Splunk indexed data for future reference? Also, if we have to open this archived data in the future, how can we open it without Splunk? We have 1x SH, 1x Indexer, and 2x HF, all on 8.0.
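Splunk's standard archiving mechanism is freezing buckets to a directory via indexes.conf; the frozen raw data can later be thawed back into a running Splunk instance. A sketch, with a hypothetical index name and archive path, applied on the indexer per index:

```
[your_index]
coldToFrozenDir = /archive/splunk/your_index/frozen
```

One caveat on the "without Splunk" part: buckets are a proprietary format, so reading the archive later realistically requires thawing into some Splunk instance (a free/trial install works). If the data must be readable with no Splunk at all, exporting search results to CSV (e.g. `| outputcsv` or the CLI export tools) before decommissioning is the safer route.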
I can see that there is a new version of the splunk-library-javalogging dependency released for the log4j2 vulnerability. Can we just override the log4j2 versions to the newer version (2.17.0) in the parent POM instead of updating the splunk-library-javalogging dependency to 1.11.*?
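Pinning the transitive log4j2 version from a parent POM is generally done with dependencyManagement, which takes precedence over versions declared by transitive dependencies. A sketch using the standard log4j2 coordinates; whether the older splunk-library-javalogging code is actually compatible with the 2.17.0 APIs needs to be verified by testing, so this is a mitigation sketch rather than a guaranteed fix:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-core</artifactId>
      <version>2.17.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-api</artifactId>
      <version>2.17.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

After building, `mvn dependency:tree` can confirm that no module still resolves the vulnerable log4j2 version.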
Searching _internal for source=sc4s shows:

srlssydr01 syslog-ng 174 - [meta sequenceId="32595295"] Message(s) dropped while sending message to destination; driver='d_hec_fmt#0', worker_index='5', time_reopen='10', batch_size='19'

and

srlssydr01 syslog-ng 174 - [meta sequenceId="32594764"] http: handled by response_action; action='drop', url='https://http-inputs-acme.splunkcloud.com:443/services/collector/event', status_code='400', driver='d_hec_fmt#0', location='root generator dest_hec:5:5'
I want to join two sourcetypes: ST1 (has fields id and title) and ST2 (no fields, only _raw="xid https://www.example.com?q1=test1&q2=test2"). I have tried join and it works, but due to the subsearch row constraint I am getting the wrong result. I have also tried without join (the sourcetype="ST1" OR sourcetype="ST2" approach) and am still getting an incorrect result.

sourcetype="ST1" (id and title are fields here):
id=1 title=one
id=2 title=two
id=3 title=three

sourcetype="ST2" _raw:
1 "GET https://www.example.com?q1=one"
2 "GET https://www.example.com?q1=test&q2=test2"
3 "GET https://www.example.com?q3=thr"

I want to join these sourcetypes and get the output below (grabbing just the URL params from sourcetype ST2). Can you please help me with this?

id  title  params
1   one    q1=one
2   two    q1=test&q2=test2
3   three  q3=thr
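The subsearch-free pattern for this is to search both sourcetypes, extract id and params out of ST2's raw text with `rex`, and then merge per id with `stats`. A sketch; the regex assumes ST2's raw format is exactly as shown above (an id, a space, then a quoted GET line) and will need adjusting if the real events differ:

```
(sourcetype="ST1") OR (sourcetype="ST2")
| rex field=_raw "^(?<raw_id>\d+)\s+\"GET\s+[^?]+\?(?<params>[^\"]+)\""
| eval id=coalesce(id, raw_id)
| stats values(title) AS title, values(params) AS params BY id
```

Because `stats ... BY id` merges rows from both sourcetypes, this avoids the join subsearch row limit entirely.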
I have a base search below, but I need to use a time_window that is in a lookup table, since various logs come in at different times. I'm trying to create alerts for indexes not reporting, but I don't want false positives for indexes that have an expected time lag. splunk_security_indexes.csv is used to get a specific subset of indexes.

| tstats max(_time) as _time where index=* by index sourcetype
| lookup splunk_security_indexes.csv index as index OUTPUT index time_window
| eval time_window="-7d@d"
| where _time < relative_time(now(),'time_window')
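One issue in the query above: `eval time_window="-7d@d"` overwrites whatever the lookup returned, so every index ends up with the same window. Using `coalesce` keeps the per-index value from the lookup and falls back to the default only when the lookup has no entry for that index. A sketch, keeping the lookup and field names from the post:

```
| tstats max(_time) as _time where index=* by index sourcetype
| lookup splunk_security_indexes.csv index OUTPUT time_window
| eval time_window=coalesce(time_window, "-7d@d")
| where _time < relative_time(now(), time_window)
```

In `relative_time`, passing time_window unquoted makes it a field reference, so each row is compared against its own window.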
I have started getting Event processing errors in the MC & messages on the ES main page. I looked for skipped & delayed searches but did not find the root cause. Please advise. Thanks a lot. Happy holidays.
Is there a way to remove or relocate the floating "Splunk Product Guidance" button that appears on the lower right of search results? It has a tendency to block useful information and it's fairly annoying.