All Posts


Thank you for your questions @PickleRick . I'm using the forwarding mechanism. Here are the stanzas from the forwarder:

inputs.conf

[monitor:///daten/datasources/data/mg_test/entry2group/*.json]
disabled = false
index = mg_test
sourcetype = json_test
crcSalt = <SOURCE>
whitelist = .*\d{8}_Q\d_entry_entry2group\.v\d\.(\d\d\.){2}json$

props.conf

[json_test]
DATETIME_CONFIG =
TIMESTAMP_FIELDS = test.sys_created_on
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
category = Structured
description = test json
disabled = false
pulldown_type = true

I've copied this props.conf from my first try to upload (via Splunk Web). Here is the stanza from ../etc/system/local/props.conf:

[test_json]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
TIMESTAMP_FIELDS = test.sys_created_on
category = Structured
description = JavaScript Object Notation format. For more information, visit http://json.org/
disabled = false
pulldown_type = true

Further investigation shows you are on the right track! I found the following event in _internal:

08-25-2024 19:31:28.338 +0200 ERROR JsonLineBreaker [1737739 structuredparsing] - JSON StreamId:1586716756715697390 had parsing error:Unexpected character while looking for value: ',' - data_source="daten/datasources/data/mg_test/entry2group/20240825_Q2_entry_entry2group.v0.03.01 .json]", data_host="socmg_local_fw", data_sourcetype="json_test"

So in the next step I will isolate one event (object) that gets lost when there are special characters in the data.
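A rough sketch of how that isolation could work: walk the file object by object with Python's json module and report where the first parse error occurs. This assumes the comma-separated stream of pretty-printed objects shown elsewhere in the thread; the file name is taken from the error message above (stray space removed).

import json

def find_bad_object(path):
    decoder = json.JSONDecoder()
    with open(path, encoding="utf-8") as f:
        text = f.read()
    idx, count = 0, 0
    while idx < len(text):
        # skip whitespace and the commas separating top-level objects
        while idx < len(text) and text[idx] in " \t\r\n,":
            idx += 1
        if idx >= len(text):
            break
        try:
            _, idx = decoder.raw_decode(text, idx)
            count += 1
        except json.JSONDecodeError as err:
            print(f"object {count + 1} fails near offset {err.pos}: {err.msg}")
            return
    print(f"all {count} objects parse cleanly")

find_bad_object("20240825_Q2_entry_entry2group.v0.03.01.json")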
There is something not right about this. If your events are indeed formed this way (as multiline entries) and your LINE_BREAKER is set to ([\r\n]+), there is no way they are ingested as a whole. Tell us more about how you are ingesting it (and if you're reading a file with a forwarder, show us the relevant inputs.conf stanza and props.conf stanza from the forwarder).
Further investigation: I shortened the JSON objects from 44 to 43 lines.

{
"1.Entry": "1.Data",
...
"43.Entry": "43.Data"
},
... 48.186 similar entries ...
{
"1.Entry": "1.Data",
...
"43.Entry": "43.Data"
}

But forwarding the JSON file led to a count of 45.352 events (representing 45.352 JSON objects), instead of 48.188 objects. That's a little bit 'loco', I think.
We don't have access to your Splunk so we can't provide links to it. To find your license page, go to Settings->Licensing.  You must have (or inherit) the Admin role and be using Splunk Enterprise.
Hi, this path thing really did my head in, but I found a way: I tried the LongPath Tool program and that sorted it.
The 260-character limit, you mean? Yeah, it can be a real headache. I tried the LongPath Tool program, which helped a lot.
Thank you @manjunathmeti . But it doesn't work; the result is the same as before. I think your advice helps when Splunk doesn't import a whole file because it is not salted and/or its first characters don't differ from another file imported before.

Further investigation: I exported the items from Splunk (CSV) and compared the original file with the export. I can't see any pattern in which objects are imported and which are not. A pattern would be, for example, that only the first 22.256 objects were imported; instead I see that objects 66, 104, 108, 113, and so on are not imported. I think there is a limit on importing JSON objects. But what is it?
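To pin down exactly which objects are missing, a rough sketch like the following could compare the original file against the CSV export. It assumes each object carries a unique field (the name sys_id below is hypothetical) that also appears as a column in the export:

import csv
import json

# "sys_id" is a hypothetical unique field name - substitute one that
# really exists in your objects and in the CSV export
def missing_objects(json_path, csv_path, key="sys_id"):
    decoder = json.JSONDecoder()
    with open(json_path, encoding="utf-8") as f:
        text = f.read()
    original, idx = set(), 0
    while idx < len(text):
        # skip whitespace and the commas separating top-level objects
        while idx < len(text) and text[idx] in " \t\r\n,":
            idx += 1
        if idx >= len(text):
            break
        obj, idx = decoder.raw_decode(text, idx)
        original.add(obj.get(key))
    with open(csv_path, newline="", encoding="utf-8") as f:
        exported = {row.get(key) for row in csv.DictReader(f)}
    return original - exported

print(missing_objects("original.json", "splunk_export.csv"))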
We have created a new app registration as per the document and assigned the correct permissions as per the document. Still not able to pull the logs. The Splunk support portal has been down for 5 days. Need urgent support.

invalid_client","error_description":"AADSTS7000216: 'client_assertion', 'client_secret' or 'request' is required for the 'client_credentials' grant type. Trace ID
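For reference, AADSTS7000216 means the token request for the client_credentials grant reached Azure AD without a client_secret (or client_assertion). A minimal sketch of the flow, with placeholder tenant/client values and the Microsoft Graph scope used purely as an example (the add-on may request a different resource):

import requests

tenant_id = "<tenant-id>"
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

resp = requests.post(token_url, data={
    "grant_type": "client_credentials",
    "client_id": "<app-registration-client-id>",
    # AADSTS7000216 is raised when this (or a client_assertion) is absent
    "client_secret": "<client-secret>",
    "scope": "https://graph.microsoft.com/.default",
})
resp.raise_for_status()
print(resp.json()["access_token"][:20] + "...")

If the add-on configuration was saved with an empty or expired client secret, regenerating the secret on the app registration and re-entering it is the first thing to check.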
Hi, I still get an error regarding GlusterFS during another fresh install of the latest Splunk SOAR, even though I already updated the wording from mirror to vault in install_common.py and saved the changes. Any idea where else I should update those links that throw an error? I already installed the packages manually, but it still checks those links. Please help.
As I said - while the connection might be working without properly authenticating the server (verifying the server's certificate), the proper way of working with a TLS-protected connection is to make sure the server is who it claims to be. So you should make sure your Python app can properly verify the server's certificate - the server's cert should be issued by a CA that your Python code trusts. That is one thing, but it's just a general security-related point, not directly causing the server to return 401. A 401 means you're not providing correct authentication data. As I said before - if supposedly the same token works with another host or app, compare the requests made by the working app and the non-working app and check what is different. We can't know what's wrong, as so far the only thing we have is "the server says 401".
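To make both points concrete, here is a minimal Python sketch; the endpoint, token, and CA bundle path are all placeholders to adjust to your setup:

import requests

resp = requests.get(
    "https://splunk.example.com:8089/services/server/info",
    # the CA bundle that issued the server's certificate; this is the
    # "verify the server is who it claims to be" part
    verify="/path/to/ca-bundle.pem",
    # the credentials part - a 401 means the server rejected these
    headers={"Authorization": "Bearer <your-token>"},
)
print(resp.status_code)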
When I try to log in to Splunk it gives me authentication options. Once the username and password are provided, it gives me the error below.

Also, when I checked web_service.log I see this error: 'Error connecting to /services/authentication/users/splunk: timed out'
Hi @whales , could you describe your question in more detail? Because the answer is too simple: when you have to search the data stored in that index! Ciao. Giuseppe
Hi Team, hope this message finds you well. I have a new Splunk on-premise instance and we are planning to implement the Splunk TrackMe app on our SHC to monitor data latency, missing data, etc. for our instance. I read through a few docs (https://trackme.readthedocs.io/en/latest/deployment.html) that say it is resource-consuming. I want to understand if it will impact our license consumption, apart from CPU and memory, post deployment. Also, do we need a separate license for Splunk TrackMe? What are the cons of it? Please reply soon, thanks in advance.
Please provide the link to my license page. It's too difficult to navigate; I have been navigating for 3 days and found no link.
@MK3- I believe it's a permission and/or app-context issue. When you create the service object:

Provide the same username you use to log in on the Splunk UI.
Provide the same app name you use on the UI where the search works fine.

service = client.connect(host="<ip/hostname>", username="<username>", password="<user-passwd>", app="<same app as you use on UI>")

I hope this helps!
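A minimal end-to-end sketch of the above with the Splunk Python SDK (splunk-sdk package); host, credentials, and app name are placeholders:

import splunklib.client as client
import splunklib.results as results

# connect in the same app context you use in the UI, so the search
# resolves the same knowledge objects (lookups, macros, permissions)
service = client.connect(
    host="<ip/hostname>",
    port=8089,
    username="<username>",
    password="<user-passwd>",
    app="<same app as you use on UI>",
)

# run a quick search to confirm the connection and app context work
rr = service.jobs.oneshot("search index=_internal | head 5",
                          output_mode="json")
for result in results.JSONResultsReader(rr):
    print(result)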
hi @a101755, Try adding the configs below to your monitor stanzas in inputs.conf:

crcSalt = <SOURCE>
initCrcLength = 2048
So, what should I do in my program? Do I need to add the SSL certificate? Also, how do I know whether I am properly authenticating to the server? Can I ask for your help with these matters? Thank you for your attention; I am waiting for your response.
I have a JSON file with 23.904 objects in it. They all look like:

{
"1.Entry": "1.Data",
...
"44.Entry": "44.Data"
},
... 23.902 similar entries ...
{
"1.Entry": "1.Data",
...
"44.Entry": "44.Data"
}

But forwarding the JSON file led to a count of 22.256 events (representing 22.256 JSON objects).

My props.conf:

[json_test]
DATETIME_CONFIG =
TIMESTAMP_FIELDS = test.sys_created_on
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
category = Structured
description = test json
disabled = false
pulldown_type = true

So the problem is not that a single event is truncated, but that the JSON file is.
Hi, I'm trying to instrument my .NET application for Splunk Observability Cloud. I'm using this package for that and it's working; I can see traces coming in. However, in the Database Query Performance section I can only see the queries executed by Hangfire (which we use to manage background jobs in the application). Other DB queries are not captured. We are using a PostgreSQL database hosted in Amazon RDS, which is compatible. SQL Database MetricSets is also active. How can I make sure all the DB queries are captured?