All Topics


Any help is greatly appreciated. How can I convert the following JSON into a table?

{
  "Summary": {
    "jobType": "jobA",
    "summaryId": 22746666,
    "objectsArchived": [
      { "name": "tableA", "count": 855 },
      { "name": "tableB", "count": 678 }
    ]
  }
}

Desired output:

Jobtype | SummaryId | Table  | Count
jobA    | 22746666  | tableA | 855
jobA    | 22746666  | tableB | 678
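For reference, a sketch of one possible way (untested; it assumes the JSON above is the full _raw of each event):

| spath path=Summary.jobType output=Jobtype
| spath path=Summary.summaryId output=SummaryId
| spath path=Summary.objectsArchived{} output=archived
| mvexpand archived
| spath input=archived path=name output=Table
| spath input=archived path=count output=Count
| table Jobtype SummaryId Table Count

spath extracts the scalar fields, objectsArchived{} pulls the array into a multivalue field, and mvexpand turns each array element into its own row before the inner name/count are extracted.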
Hey there, I have a field, let's say "abc", with values such as: 1,3,5,7,5,3,2,1,5,7,8,5,1,1,2,2,3,2,1,1,2,3,2,3. What I am trying to do is first run | stats count by abc | where count > 2, and then run | stats dc(abc) by "some other field". I have tried to do it but am unable to get any results; I'm not sure if there is another option to perform it. Thanks.
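One possible shape for this, as an untested sketch (it assumes the second field is literally named other_field; eventstats keeps every event, so the other field survives the count filter):

| eventstats count by abc
| where count > 2
| stats dc(abc) by other_field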
I have an accelerated CIM data model. The indexes used to populate the data model (and its accelerated summaries) are defined by a macro (the typical CIM approach - cim_Email_indexes, cim_Network_Traffic_indexes, and so on). What will happen if I change this macro to include an additional index? Will Splunk:
a) Just add data from the new index at the next summary rebuild, starting from the last summarized timestamp?
b) Add data from the new index looking back up to the Summary Range during the next rebuild?
c) Rebuild the whole summaries back up to the Summary Range?
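For reference, the change in question is just an edit to the macro definition in macros.conf, along these lines (index names are hypothetical):

[cim_Email_indexes]
definition = (index=mail_prod OR index=mail_new)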
Hi all, I'm trying to set up the Splunk Add-on for Microsoft Office 365: https://docs.splunk.com/Documentation/AddOns/released/MSO365/Configuretenant When adding a tenant I receive an error message: "ConnectionResetError 104". What could be the reason? What are all the required Azure URLs that the add-on needs to connect to? Thank you in advance.
Is there a way for a user without admin privileges to export an existing lookup file locally, process it, and then upload a CSV with the same file name after the manual update?
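As an untested sketch of the round trip itself (lookup names are hypothetical): export with

| inputlookup my_lookup.csv

(then download the results as CSV from the UI), and after editing and re-uploading the file, overwrite the original with

| inputlookup my_updated.csv
| outputlookup my_lookup.csv

outputlookup should only need write permission on the lookup object, though uploading a brand-new file may still require extra capabilities depending on the deployment.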
Hi all, does anyone know if there's any way to make transaction start and end with the proper results? I have a transaction: URL startswith=STATUS=FAIL endswith=STATUS=PASS. The data has a pattern like FAIL,PASS,FAIL,PASS,PASS,FAIL,FAIL,FAIL,PASS... The transaction command doesn't work well here. My requirement is to get the immediate PASS after each run of FAILs: in a situation like FAIL,...,FAIL,PASS it currently takes only the last FAIL with the PASS, but I want it to take the whole FAIL,...,FAIL,PASS span. Does anyone know how to do this?
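As an untested alternative sketch to transaction: number each event by how many PASS events precede it, so an entire FAIL run lands in the same group as the first PASS that follows it:

| sort 0 _time
| streamstats count(eval(STATUS=="PASS")) as pass_seen by URL
| eval group_id = pass_seen - if(STATUS=="PASS", 1, 0)
| stats min(_time) as start_time, max(_time) as end_time, list(STATUS) as statuses by URL, group_id

The eval moves each PASS back into the group of the FAILs before it, giving FAIL,...,FAIL,PASS spans rather than just the last FAIL,PASS pair.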
Does Splunk support HIDS features such as monitoring data traffic and suspicious activity on computer infrastructure?
Hello, has anyone else encountered this problem on a search head? KV Store changed status to failed: "No suitable servers found: `serverSelectionTimeoutMS` expired." I tried all the solutions that I could find related to this problem, but without success.
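For reference, two checks that are often useful here (the path assumes a default Linux install):

splunk show kvstore-status
tail -100 /opt/splunk/var/log/splunk/mongod.log

The first reports the KV store's current state and replication status; mongod.log usually contains the underlying error behind the serverSelectionTimeoutMS message.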
Hi, I'm trying to group items by a specific field and get all the values returned (i.e. without aggregation). I have the following [screenshot], and I'm trying to convert it to [screenshot]. I have tried

| chart values(value) by field
| transpose header_field=field

However, values(value) only selects unique values - I'm looking for all values.
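A sketch that may do it: stats list() keeps duplicates, unlike values(), though it is capped at 100 values per group:

| stats list(value) as value by field
| transpose 0 header_field=field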
We would like to use React to create a frontend for our Splunk app. We were able to integrate react and react-router to create and route to different pages successfully (e.g. myApp/eventPage). However, on page refresh Splunk is unable to maintain the current route and we get a 404 page.

page refresh at /myApp --> works
page refresh at /myApp/eventPage --> 404

This is also an issue if we would like users to be able to navigate to a specific page of our app from an external link. Do you have an example or documentation on how to maintain multiple pages in a Splunk app? Or is this a known limitation of using React with Splunk? Thanks in advance for your time and help!
Hi Team, I am facing an issue with a few servers where the client has requested to onboard a new set of log data into Splunk. We deployed the monitoring stanza and parsing stanza by updating an existing app, and the app was successfully deployed to the respective servers. But we are unable to see any data being ingested from the new monitoring stanza in Splunk. While troubleshooting, we could see the INFO message below related to the monitoring stanza in the _internal logs. Apart from this INFO message, there are no other messages or events related to this source in the _internal logs.

Monitoring stanza:

[monitor:///usr/local/tet/t12/var/was/log/server.log]
sourcetype = usr:genericapp:server
index = test_index
disabled = 0
ignoreOlderThan = 14d

Parsing stanza:

[usr:genericapp:wfserver]
NO_BINARY_CHECK = true
LINE_BREAKER = ([\r\n]+)\d{4}\-\d{2}\-\d{2}\s\d{2}\:\d{2}\:\d{2}\.\d{3}
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 23
SHOULD_LINEMERGE = false

_internal logs:

02-25-2022 13:40:04.292 +0000 INFO TailingProcessor - Parsing configuration stanza: monitor:///usr/local/tet/t12/var/was/log/server.log

Kindly guide me on how to fix this.
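A couple of diagnostic sketches that might narrow it down, run on the affected server (the grep string is just the monitored file name):

splunk list inputstatus
splunk btool inputs list --debug | grep -A 6 "server.log"

The first shows the tailing status of each monitored file; the second confirms which app actually delivers the stanza. It may also be worth checking whether ignoreOlderThan = 14d is excluding the file, since files whose modification time is older than that threshold are skipped.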
Hello, I will ask several questions, and thank you for taking it step by step, because I am a student and this is my first time using Splunk Enterprise. I want to monitor my Active Directory. I found the application "Splunk App for Windows Infrastructure" and I have successfully configured the Splunk_TA_microsoft_ad add-on on the portal. Of course, both add-ons exist in C:\Program Files\SplunkUniversalForwarder\etc\apps on my Active Directory server. For licensing reasons I only enabled the [WinEventLog://Security] input of the Splunk_TA add-on:

[WinEventLog://Security]
disabled = 0
start_from = oldest
current_only = 0
evt_resolve_ad_obj = 1
checkpointInterval = 5
whitelist = 4724,4725,4726,4624,4625,4720,4732,4722,4738,4742,4729,4715,4719
blacklist1 = EventCode="4662" Message="Object Type:(?!\s*groupPolicyContainer)"
blacklist2 = EventCode="566" Message="Object Type:(?!\s*groupPolicyContainer)"
renderXml = true

I created the local folder, into which I copied the inputs.conf and app.conf files after modification. But when I run the Splunk App for Windows Infrastructure, I find no information: either the search is waiting for input or there are no results found. I don't know what configuration I missed. Of course, I carefully deactivated the firewall, and when I do raw searches in Search & Reporting I do get the information, so the logs are being sent from the server, but there is a problem at the application level. If not, do you have another proposal for AD monitoring? Thank you.
Hi, why are we receiving this error on "o365:cas:api" while the other sourcetypes listed below are working as expected?

o365:graph:api
o365:management:activity
o365:service:updateMessage

We didn't put a Cloud App Security token in the tenant configuration, since we already have the client secret, tenant ID, client ID, tenant subdomain, and tenant data center. Is the token needed for "o365:cas:api" to work?

ERROR:

2022-02-28 07:02:42,801 level=ERROR pid=23110 tid=MainThread logger=splunk_ta_o365.modinputs.cloud_app_security pos=utils.py:wrapper:72 | datainput=b'at_rbi_cloud_microsoft_cloud_application_security_files' start_time=1646031762 | message="Data input was interrupted by an unhandled exception."
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/splunksdc/utils.py", line 70, in wrapper
    return func(*args, **kwargs)
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/splunk_ta_o365/modinputs/cloud_app_security.py", line 184, in run
    return consumer.run()
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/splunk_ta_o365/modinputs/cloud_app_security.py", line 47, in run
    for message in reports.get(self._session):
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/splunk_ta_o365/common/portal.py", line 639, in get
    raise O365PortalError(response)
splunk_ta_o365.common.portal.O365PortalError: 401:{"detail":"Invalid token"}

2022-02-28 07:02:42,801 level=ERROR pid=23110 tid=MainThread logger=splunk_ta_o365.common.portal pos=portal.py:__init__:50 | datainput=b'at_rbi_cloud_microsoft_cloud_application_security_files' start_time=1646031762 | message="failed to get error code" body=b'{"detail":"Invalid token"}'
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/splunk_ta_o365/common/portal.py", line 44, in __init__
    self._code = data['error']['code']
Hi All, I am trying to onboard AWS S3 bucket logs to Splunk Cloud using the ARN name. My client's question is: how do we restrict access to the S3 bucket in their AWS account so that only Splunk Cloud can access it, and no one else can access the S3 bucket using the ARN name?
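As an untested sketch, this is usually done with a bucket policy that grants only the Splunk Cloud IAM principal access (the account ID, role name, and bucket name below are placeholders):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowSplunkCloudRoleOnly",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111111111111:role/splunk-cloud-ingest-role" },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-log-bucket",
        "arn:aws:s3:::example-log-bucket/*"
      ]
    }
  ]
}

A matching explicit Deny for all other principals (or an account-level public access block) is typically added alongside the Allow.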
Dear professionals, I have a search like this:

index="hcg_oapi_prod" relatedPersons

The matching values are shown in the attached image. I now want to add a filter that returns only the logs that contain "relatedPersons":[], like this one:

2022-02-28 13:18:24.253 [c8058db8c5664bd1b3c49b749b607df8/c8058db8c57dcac5] - DEBUG OncePerRequestFilter - obtained request content: {"offerCode":"aaaa","application":{"applicantPerson":{"name":{"firstName":"aaa","middleName":"aaa","lastName":"aaa"},"gender":"MALE","birthDate":"1aaa","addresses":[],"phoneNumbers":[{"phoneType":"PRIMARY_MOBILE","number":"11111","verificationID":"ascve"}],"identificationDocuments":[{"type":"ID_CARD","number":"2222"}]},"additionalDocuments":[{"documentType":"PHOTO_PERSON","documentInfo":[],"photoTakingResult":"TECHNICAL_PROBLEM"},{"documentType":"MEDICAL_INS_CARD","documentInfo":[{"type":"INSURANCE_CARD_NUM"}]},{"documentType":"BANK_STATMENT_3","documentInfo":[{"type":"BANK_ACCOUNT_NUM"}]}],"employmentInfo":{"econimicalStatus":"OTHER","monthlyIncome":{"amount":8000000.0,"currency":"VND"},"monthlyPaymentLoan":{"amount":0.0,"currency":"USD"}},"relatedPersons":[]},"userStatistics":[{"key":"tongdun_appname","value":"HomeCredit_vn_and"},{"key":"tongdun_blackbox","value":"eyu"},{"key":"tongdun_source","value":"dqshand"},{"key":"tracksessionid","value":"0459588f-b583-4cf0-954c-8ecbbcc31a8e_16460289063825295"}]}

Please help me.
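Two untested sketches for matching the empty array literally:

index="hcg_oapi_prod" "\"relatedPersons\":[]"

or, if the literal phrase search is unreliable because of tokenization:

index="hcg_oapi_prod" relatedPersons
| regex _raw="\"relatedPersons\":\s*\[\]"

The regex variant also tolerates whitespace between the colon and the brackets.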
Hi all, I applied for a 14-day trial license for Splunk Cloud to develop a cloud app. I followed https://dev.splunk.com/enterprise/docs/releaseapps/manageprivatecloud to deploy a private app on the cloud platform. However, when I launch 'App Management' and click the 'Uploaded Apps' tab, the browser keeps 'loading' and never displays the desired page. I checked the browser's console; the response indicates '<msg type="WARN">DMC is disabled</msg>'. How can I fix this issue? Regards, Blake
Hi, the warning message below is showing in our search head cluster:

Search peer XXXBIXX has the following message: Received event for unconfigured/disabled/deleted index=A with source="B" host="host::C" sourcetype="D". So far received events from 2 missing index(es).

I have verified that index "A" does not exist on our indexers, and from this host no internal logs are received except license_usage.log. How can I figure out where the inputs are configured for this host (host="host::C")?
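Two sketches that may help locate the source (the index and host names are the placeholders from the message). On the search head, the indexers log the same complaint in splunkd.log:

index=_internal source=*splunkd.log* "unconfigured/disabled/deleted"

And on the forwarder for host C, btool can show which app defines the offending input:

splunk btool inputs list --debug | grep -B 8 "index = A"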
I have the data format below, and I would like to fill down Field2 with a specific value based on Field1, i.e.:

Fill Field2 with character 'B' if Field1 is 'A'
Fill Field2 with character 'C' if Field1 is 'B'

Data:

Field1 | Field2 | Field3 | Field4
A      |        | fooA   | barB
A      |        | abc    | def
A      | B      | ghi    | jkl
B      | C      | fooB   | barC
B      |        | aaa    | bbb
B      |        | ccc    | ddd

Change to the format below:

Field1 | Field2 | Field3 | Field4
A      | B      | fooA   | barB
A      | B      | abc    | def
A      | B      | ghi    | jkl
B      | C      | fooB   | barC
B      | C      | aaa    | bbb
B      | C      | ccc    | ddd
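Since the mapping is fixed, an untested sketch with eval rather than filldown (it keeps any Field2 values that are already set):

| eval Field2 = case(isnotnull(Field2) AND Field2!="", Field2, Field1=="A", "B", Field1=="B", "C")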
Hello, thank you for taking the time to consider my question. I'm trying to visualize the health of several Windows and Linux systems using IT Essentials Work (ITEW), and no matter what I do, I can't get the data to actually be read by ITEW. For testing purposes I started with only Windows machines, since I figured those would be better documented and easier. I have installed the Splunk Add-on for Microsoft Windows on both the indexer/search head and the client, and added the custom inputs.conf that is linked from the Splunk Security Essentials app for monitoring CPU/memory performance on remote Windows systems. I have installed ITEW on my indexer/search head, and it automatically created the "itsi_im_metrics" index, which should collect the data being reported by the remote host and then allow ITEW to read and visualize it, right? When I go into "Indexes" on the indexer/search head, it shows thousands of events within that index and shows it was updated as of just a few minutes prior, so the flow is working. However, this index doesn't show any events when I search for it in either the normal Search & Reporting search bar or the ITEW search bar. It's probably something simple that I missed on my end, since I feel like it's missing one small configuration and then it will work fine, but the fact that there are no guides or videos on this practice, and only some very generic documentation on ITSI/ITEW, is very disappointing. Thank you in advance for considering and assisting me with this; I look forward to your responses so I can resolve this issue. Any help that leads to the solution will of course be accepted and rewarded with karma for those who appreciate that. Thanks again.
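One detail that might explain the empty searches: itsi_im_metrics is a metrics index, so the plain event search bar returns nothing for it; metrics data has to be queried with mstats or mcatalog. A quick sketch to confirm data is arriving:

| mcatalog values(metric_name) WHERE index=itsi_im_metrics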
Hello, thank you for taking the time to consider my question. I'm currently working on getting the InfoSec app (https://splunkbase.splunk.com/app/4240/) integrated via the Common Information Model with Active Directory logs that are obtained either through the Splunk Supporting Add-on for Active Directory or the Splunk Add-on for Microsoft Windows. There doesn't seem to be any really good documentation for this process for beginners, even though this is likely a very easy integration for Splunk admins, given how many use cases there are for it and the prevalence of AD in large organizations. My question is: how do people normally ingest data from AD through an inputs.conf (please link documentation with an example inputs.conf that does this, if it exists; I can't find one), and what are some best practices for indexes that are supported for mapping AD auth data to CIM by default? I'm not trying to do anything special here; it just seems like this should have tutorials all over the place, and nobody has taken the time to really explain the process from start to finish, which is extremely frustrating for people trying to teach this to themselves without expensive Splunk OnDemand support walking them through it. Any help regarding this would be greatly appreciated. For context, I have already installed both supporting add-ons for Microsoft and AD on the indexer/search head, and installed the Splunk TA for Windows on the actual AD host, where I'm assuming I need to use some sort of admon configuration to monitor Active Directory, but it's unclear what index I should be sending the events to and how that index should be configured on the search head.
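For what it's worth, a minimal inputs.conf sketch of the kind of thing I mean on the AD host (the index names are placeholders and would have to exist on the indexers first):

# Security event log -> CIM Authentication fields via Splunk_TA_windows
[WinEventLog://Security]
disabled = 0
index = wineventlog

# Active Directory object monitoring (admon)
[admon://default]
disabled = 0
index = msad
monitorSubtree = 1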