All Posts


I've been researching this for the last 30 minutes and can't find anything that would let you read that file. Everything is built around the .conf files only, so not even scripts or the like. You could maybe look into a dashboard with a custom JavaScript call, but that is outside my wheelhouse, so I can't even say whether it's possible.
Hello,

We have a lookup CSV file with 1 million records (data1) and a KV store with 3 million records (data2). We need to compare the street address in data2 against a fuzzy match of the street address in data1, returning the property owner. For example:

data2 street address: 123 main street
data1 street address: 123 main street apt 13

We ran a regular lookup command and it took well over 7 hours. We also tried creating a sub-address field (data1a) with the apt/unit numbers removed, but the search still took 7 hours. Also, if there is more than one apt/unit at an address, there may be more than one property owner. This is why a fuzzy-type comparison is what we are looking for.

Hope my explanation is clear; ask if not. Thanks and God bless, Genesius (Merry Christmas and Happy Holidays)
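One approach sometimes suggested for this (a sketch only, not a tested solution for your data): pre-normalize the addresses on both sides into a shared join key, so that a cheap exact lookup can stand in for the fuzzy match. The field names street_address and owner, and the lookup names, are assumptions here:

```
| inputlookup data1
| eval addr_key=lower(trim(replace(street_address, "(?i)\s+(apt|unit|ste|#)\s*\S+$", "")))
| outputlookup data1_normalized.csv
```

At search time, build the same addr_key from the data2 address and look it up against data1_normalized.csv; exact matching on a normalized key is far cheaper than a wildcard or fuzzy comparison across millions of rows. As you noted, collapsing the apt/unit suffix can map one key to several owners, so expect multivalue owner results at shared addresses.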
I have a client who wants to share the README file in their app with end users so that they can reference it in the UI. Seems reasonable, and it prevents them from having to duplicate content into a view. Otherwise the README file is only available to admins who have CLI access. I have tried using the REST endpoint to locate the file, and I have checked that the metadata allows read; it is just the path and the actual capability I am unclear on.

https://<splunk-instance>/en-GB/manager/<redacted>/apps/README.md

Thanks
Hi @Dawoo, how are you? You can follow the documentation steps to install the UF on macOS.
Hi @inventsekar, thanks for the reply. I entered just one site. I would be happy to share the error logs, but none are showing. I configured the logging level on the app as "ERROR" and "DEBUG", but no logs appear when I run:

index=_internal source="/opt/splunk/var/log/splunk/python.log" level!=INFO

Regards, PB
@user487596, an easier way to manipulate passwords is by using the Splunkbase app: https://splunkbase.splunk.com/app/4013
Thank you. It worked for me.
@isoutamo Thanks for the reply. I'm hesitant to go the scripts route, but I'll keep checking whether there is another solution, as I'm looking to move the existing users from the old application to the renamed one without much effort.
@PickleRick Thanks for the reply. Yes, I meant the same, i.e. Splunkbase is a channel for application distribution. I agree that we can release a new app along with migration steps. Still, I'm looking for a solution where existing application users can move seamlessly to the newly renamed application without having to replicate their saved searches and app setup in the new app.
Hi @inventsekar, thanks for responding. I have user gate logs. My logs look something like this:

user=john gate=gate1 action="IN" use_id=12345

I am trying to visualise this as a live dashboard that shows which users are passing through which gate.
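Not knowing your index or sourcetype, here is a minimal sketch (the index name gatelogs is an assumption) of a stats search you could put on a dashboard panel with a short refresh interval or a real-time time range:

```
index=gatelogs
| stats latest(action) as last_action, latest(_time) as last_seen by user, gate
| convert ctime(last_seen)
```

Each row then shows a user's most recent gate and direction (IN/OUT), which a table or a per-gate count visualisation can render live.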
Hi, first of all, I'm a total beginner with Splunk. I just started my free trial of Splunk Cloud and want to install the UF on my MacBook. I don't know how to install the credential file, splunkclouduf.spl. I have unpacked that file, but into what directory should I move the contents? You can also see the directory of SplunkForwarder.
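For reference, the documented way to apply the credentials package is the forwarder's own CLI rather than copying the unpacked files by hand. Assuming the UF is installed under /Applications/SplunkForwarder (adjust the paths for your machine), something like:

```
# Install the Splunk Cloud credentials app on the forwarder,
# then restart so the forwarder picks it up.
/Applications/SplunkForwarder/bin/splunk install app /path/to/splunkclouduf.spl
/Applications/SplunkForwarder/bin/splunk restart
```

The exact path to the splunk binary depends on where the UF was installed.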
Hi @BRFZ, since others have not mentioned it yet, please have a look at this doc; it has pretty good details: https://docs.splunk.com/Documentation/Splunk/9.4.0/Security/WhatyoucansecurewithSplunk
Hi @arunkuriakose, good day to you. May I ask you to provide more details, please? 1) What kinds of data have you ingested into Splunk so far? 2) What details about the employees are already available in Splunk? 3) Are the employee ID card login and logout details already available in Splunk?
Hi @PolarBear01, I am not sure how to help with this, but I thought to ask: 1) Did you enter only one site or multiple sites? 2) Regarding that error text, have you had a chance to check python.log for more details?
Thank you so much, the eval command is magical !!!
My bad - the LogText column has the keyword (connected or disconnected) along with other text, so it will need some kind of wildcard lookup for either of these two words. I'm looking to extract rows 4 and 5, which have the "disconnected" text and where there isn't an associated connected row within, say, 120 seconds.

Row  Time       LogText
1    7:00:00am  text connected text
2    7:30:50am  text disconnected text
3    7:31:30am  text connected text
4    8:00:10am  text disconnected text
5    8:10:30am  text disconnected text
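One way to express "a disconnect with no connect within 120 seconds afterwards" is to look ahead with streamstats. A sketch against the sample rows above (it assumes _time is parsed and LogText is already extracted):

```
| sort 0 _time
| eval state=if(like(LogText, "%disconnected%"), "disconnected", "connected")
| reverse
| streamstats current=f window=1 latest(state) as next_state, latest(_time) as next_time
| reverse
| where state="disconnected"
    AND (isnull(next_time) OR next_state="disconnected" OR next_time - _time > 120)
```

Against the five sample rows this keeps rows 4 and 5: row 4's next event is another disconnect, and row 5 has no later event at all, while row 2 is dropped because a connect follows only 40 seconds later.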
Yes, that's what I did, and it's working now - thanks for your advice. Cheers
Hello, I am configuring statsd to send a custom metric from an AWS EC2 instance, on which splunk-otel-collector.service is running, to Splunk Observability Cloud. I followed the steps in https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/receiver/statsdreceiver to set up statsd as a receiver:

receivers:
  statsd:
    endpoint: "localhost:8125"      # default
    aggregation_interval: 60s       # default
    enable_metric_type: false       # default
    is_monotonic_counter: false     # default
    timer_histogram_mapping:
      - statsd_type: "histogram"
        observer_type: "histogram"
        histogram:
          max_size: 50
      - statsd_type: "distribution"
        observer_type: "histogram"
        histogram:
          max_size: 50
      - statsd_type: "timing"
        observer_type: "summary"

I have a problem setting up the service section for this statsd receiver. The GitHub doc shows the configuration below for the exporters, but I am not sure how this will work:

exporters:
  file:
    path: ./test.json
service:
  pipelines:
    metrics:
      receivers: [statsd]
      exporters: [file]

I also tried adding statsd to the receivers ("receivers: [hostmetrics, otlp, signalfx, statsd]") with "exporters: [signalfx]" in the service section of agent_config.yaml, as shown below. When I run "systemctl restart splunk-otel-collector.service", the collector stops sending any metrics to Splunk Observability Cloud; when I remove statsd again (receivers: [hostmetrics, otlp, signalfx]), it resumes sending metrics.
# pwd
/etc/otel/collector
# ls
agent_config.yaml  config.d  fluentd  gateway_config.yaml  splunk-otel-collector.conf  splunk-otel-collector.conf.example  splunk-support-bundle.sh

service:
  extensions: [health_check, http_forwarder, zpages, smartagent]
  pipelines:
    traces:
      receivers: [jaeger, otlp, zipkin]
      processors:
        - memory_limiter
        - batch
        - resourcedetection
        #- resource/add_environment
      exporters: [otlphttp, signalfx]
      # Use instead when sending to gateway
      #exporters: [otlp/gateway, signalfx]
    metrics:
      receivers: [hostmetrics, otlp, signalfx, statsd]
      processors: [memory_limiter, batch, resourcedetection]
      exporters: [signalfx]
      # Use instead when sending to gateway
      #exporters: [otlp/gateway]

What should be the correct/supported exporter for statsd as a receiver? Thanks
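Not a verified answer for this environment, just a sketch of one thing to try, based on the default agent config: statsd metrics can go to the same signalfx exporter the default metrics pipeline already uses, and putting statsd in its own named pipeline keeps the default pipeline untouched while debugging:

```
service:
  pipelines:
    metrics/statsd:   # separate, named pipeline just for statsd
      receivers: [statsd]
      processors: [memory_limiter, batch, resourcedetection]
      exporters: [signalfx]
```

If the collector stops exporting entirely as soon as statsd is added, it is worth checking the collector's own logs (journalctl -u splunk-otel-collector) for a configuration validation error, since one invalid component can prevent the whole service from starting its pipelines.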
You were correct, this solved the issue
Requirement: We need to monitor the Customer Decision Hub (CDH) portal, including Campaigns and Dataflows, using Real User Monitoring (RUM) in AppDynamics.

Steps taken: We injected the AppDynamics JavaScript agent code into the UserWorkForm HTML fragment rule. This successfully captures OOTB (out-of-the-box) screens but does not capture Campaigns-related screens.

Challenges: Pega operates as a Single Page Application (SPA), which complicates page-load event tracking for Campaigns screens. Additionally, the CDH portal lacks a traditional front-end structure (HTML/CSS/JS), as Pega primarily serves server-generated content, which may restrict monitoring.

Has anyone here successfully implemented such an integration? What are the best practices for passing this kind of contextual data from Pega to AppDynamics? Looking forward to your insights! Best regards,