The field is "Country" not "country". Try ... | iplocation src_ip | geostats count by Country   Happy Splunking! -Rich
The iplocation command generates the capitalized field "Country", not "country", so it should work if you capitalize Country: | geostats count by Country
Any fixes for this?
Hi @Osama.Abbas  I noticed the Community has not yet jumped in to help. Did you happen to make any discoveries or find a solution you could share?
Hi @David.Machacek, I noticed the Community has not yet jumped in to help. Did you happen to make any new discoveries or find a solution you could share? If you have not, you can always contact AppD Support:
How do I submit a Support ticket?
An FAQ
If you do contact Support, it would be awesome if you came back and shared any learnings or outcomes here as a reply.
I am now seeing the following error log:

2024-03-08 13:06:35,386 level=ERROR pid=34152 tid=MainThread logger=modular_inputs.mscs_azure_event_hub pos=mscs_azure_event_hub.py:run:939 | datainput="PFG-AzureEventHub" start_time=1709903177 | message="Error occurred while connecting to eventhub: Failed to initiate the connection due to exception: Websocket failed to establish connection: SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1106)') Error condition: ErrorCondition.SocketError Error Description: Failed to initiate the connection due to exception: Websocket failed to establish connection: SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1106)')"
Hi @gcusello, I tried the query, but I am still getting alerts.
Any reason why this can't be visualized in a geo cluster map? source="udp:514" index="syslog" NOT src_ip IN (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16, 17.0.0.0/8) action=DROP src_ip!="162.159.192.9" | iplocation src_ip | geostats count by country      
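Per the replies above, iplocation outputs the capitalized field Country, so the same search should render on a geo cluster map once the final clause is capitalized. A sketch, reusing the index and filters from the question:

```
source="udp:514" index="syslog" NOT src_ip IN (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16, 17.0.0.0/8) action=DROP src_ip!="162.159.192.9"
| iplocation src_ip
| geostats count by Country
```

geostats is case-sensitive about field names, so `by country` silently produces no split field while `by Country` matches what iplocation actually emits.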
I have a Splunk index configured in my OpenShift cluster as a ConfigMap. Now, if I change the index on the cluster, my container logs still go to the old index. Is there something I am missing?
Hi @Harish2, instead of appendcols, which runs over its own time range, please try this:

| tstats count latest(_time) as _time WHERE index=app-idx host="*abfd*" sourcetype=app-source-logs NOT [ | inputlookup calendsr.csv WHERE type="holyday" | fields date ] BY host

Ciao. Giuseppe
Hello all - I am trying to get Azure Event Hub data to flow into Splunk, but I am having issues configuring it with the Add-on for Microsoft Cloud Services. I have configured an app in Azure that has the Reader & Event Hub Receiver roles, and the Event Hub has been configured to receive various audit information. When I try to configure the input, I receive this error message in splunk_ta_microsoft_cloudservices_mscs_azure_event_hub_XYZ.log:

2024-03-08 16:20:31,313 level=ERROR pid=22008 tid=MainThread logger=modular_inputs.mscs_azure_event_hub pos=mscs_azure_event_hub.py:run:939 | datainput="PFG-AzureEventHub1" start_time=1709914805 | message="Error occurred while connecting to eventhub: CBS Token authentication failed. Status code: None Error: client-error CBS Token authentication failed. Status code: None"

I then tried to enter the Connection string-primary key in the FQDN field, but received the error below. This occurs because the add-on tries to create a ckpt file, but the resulting file path is too long and contains invalid characters.

2024-03-08 14:41:32,112 level=ERROR pid=34216 tid=MainThread logger=modular_inputs.mscs_azure_event_hub pos=utils.py:wrapper:72 | datainput="PFG-AzureEventHub1" start_time=1709908886 | message="Data input was interrupted by an unhandled exception."
Traceback (most recent call last):
  File "L:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\lib\splunksdc\utils.py", line 70, in wrapper
    return func(*args, **kwargs)
  File "L:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\lib\modular_inputs\mscs_azure_event_hub.py", line 933, in run
    consumer = self._create_event_hub_consumer(workspace, config, credential, proxy)
  File "L:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\lib\modular_inputs\mscs_azure_event_hub.py", line 851, in _create_event_hub_consumer
    args.consumer_group,
  File "L:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\lib\modular_inputs\mscs_azure_event_hub.py", line 238, in open
    checkpoint = SharedLocalCheckpoint(fullname)
  File "L:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\lib\modular_inputs\mscs_azure_event_hub.py", line 103, in __init__
    self._fd = os.open(fullname, os.O_RDWR | os.O_CREAT)
FileNotFoundError: [Errno 2] No such file or directory: 'L:\\Program Files\\Splunk\\var\\lib\\splunk\\modinputs\\mscs_azure_event_hub\\Endpoint=sb://REDACTED.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=REDACTED-insights-activity-logs-$Default.v1.ckpt'

Here is my inputs.conf file for the add-on:

[mscs_azure_event_hub://PFG-AzureEventHub1]
account = AzureActivity
consumer_group = $Default
event_hub_name = insights-activity-logs
event_hub_namespace = REDACTED.servicebus.windows.net
index = azure-activity
interval = 300
max_batch_size = 300
max_wait_time = 10
sourcetype = mscs:azure:eventhub
use_amqp_over_websocket = 1

I have been stuck on this for the past couple of days. Any advice would be greatly appreciated!
Hi @gcusello, my goal is to not receive alerts on certain days; for example, in the CSV file I put today's date. My alert condition is count > 0, and the cron job runs every 15 minutes over a 15-minute time range. I used the query below, but I am still receiving alerts.

| tstats count latest(_time) as _time WHERE index=app-idx host="*abfd*" sourcetype=app-source-logs BY host | where count>0 | appendcols [ | makeresults | eval date=strftime(_time, "%m/%d/%Y") | lookup calendsr.csv date OUTPUT type | eval type=if(isnotnull(type),type,"NotHoliday")]
Recently our TA was rejected for Splunk Cloud compatibility due to a configuration option that would allow our customers to disable SSL verification so that they can make REST API calls to a server that has a self-signed TLS certificate. The TA uses Python code for the inputs, and one of the configuration options when setting up the input was to enable or disable SSL verification. Customers using servers with self-signed certificates could opt to disable verification, which would set the verify parameter of helper.send_http_request to False. This option passed Cloud compatibility until recently, when we were notified that external network calls must be made securely, so our TA no longer qualifies for Cloud compatibility with the option to set verify=False. Has anyone else run into this issue, and is there a solution other than forcing customers to purchase TLS certificates from a trusted CA? I did see there is an option on the helper.send_http_request call to specify the CA bundle, but we have no control over what CA is used to generate the self-signed certificate, so there is no way to include a bundle in the TA. Any suggestions are welcome.
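One possible middle ground (a sketch, not a vetted Cloud-compliance pattern) is to keep verification enabled in every case but let the customer point the input at their own CA bundle, so a self-signed issuer can still be trusted without ever setting verify=False. The config key ca_bundle_path and the function name below are hypothetical, not the add-on's real schema; requests-style HTTP clients accept either a boolean or a CA-bundle file path for their verify argument:

```python
def resolve_verify(config):
    """Map input configuration to a `verify` argument for an HTTP client.

    TLS verification stays on in every case: either the default trust
    store (verify=True) or a customer-supplied CA bundle path. The key
    "ca_bundle_path" is illustrative, not a real add-on setting.
    """
    ca_bundle = (config or {}).get("ca_bundle_path")
    if ca_bundle:
        # requests/urllib3 accept a filesystem path to a PEM bundle here,
        # letting a self-signed issuer validate without disabling checks.
        return ca_bundle
    return True  # fall back to the system/default CA store


# Usage sketch: helper.send_http_request(url, verify=resolve_verify(cfg))
```

The customer, not the TA author, supplies the bundle (e.g. as a file path on the search head or an uploaded app asset), which sidesteps the problem of not knowing the issuing CA in advance.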
Hi @rbakeredfi, forgetting for a moment the use of Windows, which I'd avoid in production systems! How many logs must this HF manage? Are there many syslogs? If yes, do you ingest them using Splunk inputs or an external rsyslog server? Are you sure you have a performant network between the HF and the indexers? Are the indexers overloaded or not? Ciao. Giuseppe
Hello, how do I modify _time when running a summary index on a scheduled search? Please advise; I appreciate your help. Thank you. When running a summary index on a scheduled search, by default _time is set to info_min_time (the start time of the search duration) instead of search_now (the time when the search ran). So if I collect into the summary index now over the last 30 days, _time is spread across the last 30 days instead of being the current time. The problem is that if I then search over the past 24 hours, the data doesn't show up because _time is dated within the last 30 days, so I had to search the past 30 days instead.
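One common workaround (a sketch; the index name my_summary is a placeholder) is to overwrite _time just before collecting, so summary events are stamped with the time the scheduled search ran rather than info_min_time:

```
<your scheduled search>
| eval _time=now()
| collect index=my_summary
```

The trade-off is that the summary events no longer carry the original event times, so keep the source timestamp in a separate field first (e.g. | eval orig_time=_time) if you still need it for later analysis.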
Hi @Harish2, the condition you required is that the hour must NOT satisfy hour<8 OR hour>11. Are the hours of the events in the results compliant with this condition? Maybe you should change the hour condition. Ciao. Giuseppe
I am looking for more of a generic mapping of resources to parts of the pipeline. However, this specific case is regarding a HF.

Machine Name: redacted
Machine CPU Cores (Physical / Virtual): 16 / 32
Physical Memory Capacity (MB): 131020
Operating System: Windows
Architecture: x64
Yes, events are present; my alert condition is results greater than zero for the last 15 minutes. But as per my requirement I put today's date in the CSV file, so the alert should not trigger, right?
Hi @Harish2, check whether the hours of these events match the condition in your search. Ciao. Giuseppe
IHAC that is trying to ingest logs from their self-hosted Trellix instance. When I try to add an account, the URL field only lists:
Global
Frankfort
India
Singapore
Sydney
There is no other input field to specify an actual FQDN/IP. Am I missing something, or is this feature not present?