All Posts


Hello and thanks.   It looks like this app wants to store data in some sort of "cloud"-based storage. Is this correct? The data I have cannot reside anywhere but on a private LAN.  I am not sure how to use this app; is there a posting or other source of instructions on how to use this Splunkbase app?   Thanks for the reply, ewholz
Hi Marco, I don't really remember what the problem or the solution was. We are currently working with Splunk Edge Hub on Splunk 9.2.0.1 and we have no problems with the device registration. What I do remember from the time we implemented the SSG for mobile use is that there was a problem with certificate inspection on the firewall: it changed something on the certificate itself, and the certificate was no longer recognized as valid by the SSG. Regards.
This seems to fix the REST endpoint issue (especially the changes to outputs.conf): https://docs.splunk.com/Documentation/Splunk/latest/Updating/Upgradepre-9.2deploymentservers
Hi @Shane.Tembo, Since it's been a few days with no reply from the community, did you happen to find a solution or anything you can share? If you still need help, you can reach out to Cisco AppDynamics Support: How do I submit a Support ticket? An FAQ 
Hello @Eduardo.Rosa, It's been a few days with no reply from the Community. Have you happened to find a solution or anything you can share? If you still need help, you can contact Cisco AppDynamics Support: How do I submit a Support ticket? An FAQ 
Hey everyone!    Would anyone have any resources on how this works? We have working scripts, mostly external API calls, that run fine in the testing environment; however, when attempting to configure the app to work as an adaptive response action, we are running into some problems. Are there any resources out there on this at all? We have been unable to find anything really helpful on this topic, or anyone to offer insight or guidance into how it is supposed to work. Any insight or suggestions at all would be greatly appreciated. Thank you in advance!
Hi @Anonymous, I got your message and have responded. I have a ticket open now getting your account closed.
Hi everyone, I'm trying to extract fields from Salesforce in a complex architecture. I created a dedicated index for a log that contains the order summary with its various items. The structure of the objects is not editable, and the query I would like to be able to execute is this:

SELECT Id,
    (SELECT Id,
        (SELECT Description FROM FulfillmentOrderLineItems)
     FROM FulfillmentOrders)
FROM OrderSummary

Is there a way to extract this log?
You can continue to use the props/transforms from the Splunk Windows TA, BUT you need to use local/props.conf and local/transforms.conf. So, create a local folder within the TA and add the two files; if you change the default props and transforms instead, they will get overwritten during future upgrades.

Configure as per the example below. You will need to work out which events you want to discard (so some regex); those go to the null queue, and the rest will get logged into Splunk.

# props.conf
[MSAD:NT6:Netlogon]
TRANSFORMS-send_to_null_events = send_null_netlogin_events

# transforms.conf
[send_null_netlogin_events]
REGEX = <YOUR REGEX FOR LINES YOU DONT WANT>
DEST_KEY = queue
FORMAT = nullQueue

The above configuration then needs to be placed on the indexers, or on the heavy forwarder if the data is sent there first (i.e. on full Splunk instances), so deploy the Windows TA that contains your new settings there. (Note: the UF will not apply it.) Another way is to create your own sidecar TA, put the configuration there, and run it alongside the Windows TA.
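The sidecar-TA option mentioned above can be sketched like this. All names here are purely illustrative (the app name TA-netlogon-filter is not from the post): a minimal app that carries only the filtering stanzas and is deployed alongside Splunk_TA_windows on the indexers/heavy forwarders.

```
# Hypothetical sidecar app layout (all names are examples)
$SPLUNK_HOME/etc/apps/TA-netlogon-filter/
    default/
        app.conf          # standard app metadata
        props.conf        # the [MSAD:NT6:Netlogon] TRANSFORMS-... stanza
        transforms.conf   # the nullQueue transform
```

Because props stanzas for the same sourcetype merge across apps, this keeps your filter out of Splunk_TA_windows entirely, so TA upgrades never touch it.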
Can I configure a Darktrace asset that routes through the automation broker? When attempting to configure it, the broker is greyed out and will not let me select it. (Yes, the broker is on and active.)
| eval routingkey=if(routingkey="routingdynatrace_2","dynatrace_2",routingkey) | stats sum(count) as count by routingkey
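If more than one pair of aliases needs merging, a more general sketch is to strip the prefix before aggregating. This assumes every legacy key differs from its new name only by a leading "routing" prefix, which may not hold for all of your data:

```
| eval routingkey=replace(routingkey, "^routing", "")
| stats sum(count) as count by routingkey
```

With this, routingdynatrace_2 and dynatrace_2 both normalize to dynatrace_2 and their counts are summed together.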
@sylim_splunk It's not just a display problem. Queries using the REST interface also return no results. The strange thing is that some deployment servers on the same version work without problems, but a large number do not allow REST queries on /services/deployment/server/clients. On the servers where the REST query doesn't work, there are also zero results under Forwarder Management / Clients. Regards, Marco
Hi @Orange_girl , it seems that something changed: Splunk no longer has the required permissions to read the files. Check them. Ciao. Giuseppe
Thanks Giuseppe. The logs I shared here are the last logs I received for this index.  I also checked the logs for ABC.csv, which is used by the index, and it's the same here - logs only until May 26th:

26/05/2024 02:19:39.647 // 05-26-2024 02:19:39.647 -0400 WARN TailReader [12321 tailreader0] - Access error while handling path: failed to open for checksum: '/opt/splunk/etc/apps/.../.../ABC.csv' (No such file or directory)
26/05/2024 02:19:38.208 // 05-26-2024 02:19:38.208 -0400 INFO WatchedFile [12321 tailreader0] - Will begin reading at offset=0 for file='/opt/splunk/etc/apps/.../.../ABC.csv'.
26/05/2024 02:19:38.208 // 05-26-2024 02:19:38.208 -0400 INFO WatchedFile [12321 tailreader0] - Checksum for seekptr didn't match, will re-read entire file='/opt/splunk/etc/apps/.../.../ABC.csv'.
26/05/2024 02:19:37.621 // 05-26-2024 02:19:37.621 -0400 WARN TailReader [12321 tailreader0] - Insufficient permissions to read file='/opt/splunk/etc/apps/.../.../ABC' (hint: No such file or directory , UID: 0, GID: 0).
26/05/2024 02:19:37.512 // 05-26-2024 02:19:37.512 -0400 INFO WatchedFile [12321 tailreader0] - Will begin reading at offset=0 for file='/opt/splunk/etc/apps/.../.../ABC.csv'.
26/05/2024 02:19:37.512 // 05-26-2024 02:19:37.512 -0400 WARN LineBreakingProcessor [12299 parsing] - Truncating line because limit of 10000 bytes has been exceeded with a line length >= 50968856 - data_source="/opt/splunk/etc/apps/.../.../ABC.csv", data_host="host", data_sourcetype="sourcetype"
26/05/2024 02:19:37.512 // 05-26-2024 02:19:37.512 -0400 INFO WatchedFile [12321 tailreader0] - Will begin reading at offset=0 for file='/opt/splunk/etc/apps/.../.../ABC.csv'.
26/05/2024 02:19:37.143 // 05-26-2024 02:19:37.143 -0400 WARN LineBreakingProcessor [12299 parsing] - Truncating line because limit of 10000 bytes has been exceeded with a line length >= 50276856 - data_source="/opt/splunk/etc/apps/.../.../ABC.csv", data_host="host", data_sourcetype="sourcetype"
26/05/2024 02:19:36.947 // 05-26-2024 02:19:36.947 -0400 INFO Dashboard - group=per_source_thruput, series="/opt/splunk/etc/apps/.../.../ABC.csv", kbps=219.057, eps=482.877, kb=6791.592, ev=14971, avg_age=0.000, max_age=0

Would this be of any help?
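As a side note on the LineBreakingProcessor warnings above: lines of roughly 50 MB are being truncated at the default 10000-byte limit. If the CSV legitimately contains very long lines, a hedged sketch of the relevant setting (the sourcetype name is a placeholder, and the right value depends on your data) is:

```
# props.conf on the parsing tier - sourcetype name is a placeholder
[your_csv_sourcetype]
TRUNCATE = 100000    # raise the per-line byte limit (0 disables truncation)
```

This only addresses the truncation warnings; the "No such file or directory" errors point at the file itself disappearing.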
Hello community, I'm having a problem with what is probably a trivial addition, but I can't find a solution. I run a simple query which returns a count by a field called "routingKey". However, in this example I have duplicate routingKeys with different names (for example, routingdynatrace_2 and dynatrace_2 are actually the same source). This is due to a change in the way I collect my data, which changed the name of the routingKey. The events themselves are not duplicated (the events under the routingKey "routingdynatrace_2" are not the same as those under "dynatrace_2"). My question is: how do I add the counts of the two routingKeys together to get the overall total? I tried to rename the routingKey upstream, but the query does not add them up after renaming. If you have any ideas, I'm interested. Sincerely, Rajaion
Hi @Orange_girl , check whether you received logs until the 31st of May. If yes, and the data flow stopped on the 1st of June, check the timestamp format: you probably missed a configuration, but until the 31st of May you didn't notice it. So check the time format of your data. Ciao. Giuseppe
Hi Giuseppe,  I haven't changed anything in Splunk and the indexing used to work well; would this just randomly change by itself?  I'm happy to check it though. Could you let me know where and what I should be looking for? Are you referring to the time value in the logs? Thank you.
Hi @ganeshkumarmoha  I suppose you need to check whether you received events from each host with the corresponding source, is that correct? If this is your requirement, and the source column has a fixed part that you can use for matching (e.g. the file name without the path), please try something like this:

<your_search>
| rex field=source "\\(?<Source>logpath\d*\.txt)$"
| rename host AS Host
| stats count BY Host Source
| append
    [ | inputlookup your_lookup.csv
      | eval count=0
      | fields Host Source count ]
| stats sum(count) AS total BY Host Source
| where total=0

Ciao. Giuseppe
Hi Team, for a business requirement I need to validate that a log file was generated in the last hour for each combination of host and source, as below:

Host       Source
server001  c:\...\logpath1.txt
server002  c:\...\logpath2.txt
server003  c:\...\logpath3.txt
server004  c:\...\logpath4.txt
server005  c:\...\logpath5.txt

I know the inputlookup approach I've seen is single-column based; however, I need to match on two columns to check the log files. Can you please suggest the best way to accomplish this requirement? Thanks in advance!
I'm using Splunk Enterprise 9.1 with Windows universal forwarders. I'm ingesting the Windows Domain Controller netlogon.log file. The Splunk Add-on for Windows has all the parsing/extraction rules defined for me to parse netlogon.log via its sourcetype=MSAD:NT6:Netlogon definition. Now, my use case is that I only wish to retain certain lines from netlogon.log and discard all others. How can I achieve this? Is it a case of defining a new sourcetype and copying the props/transforms from Splunk_TA_windows, or is there a way to keep using sourcetype=MSAD:NT6:Netlogon and discard the lines via some other mechanism that does not require modifying the Splunk_TA_windows app?