Hi everyone, I want to ask where I can get the latest update for legit_domains.csv? I'm asking here because when I check it in the lookup it says no owner, so I think it was created automatically by Splunk. I know it can be updated manually, but that takes time. It would be helpful if you could point me to the latest update for this .csv.
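One way to trace where the file comes from is to ask the REST API which app and owner the lookup file belongs to. A minimal sketch with the Splunk Python SDK; the connection details are placeholders, not values from this thread:

import json
import splunklib.client as client

# Hypothetical connection details - adjust to your environment.
service = client.connect(host="localhost", port=8089,
                         username="admin", password="<password>")

# List lookup table files and show which app/owner each belongs to.
resp = service.get("data/lookup-table-files", output_mode="json", count=0)
for entry in json.loads(resp.body.read())["entry"]:
    if "legit_domains" in entry["name"]:
        acl = entry["acl"]
        print(entry["name"], "app:", acl["app"], "owner:", acl["owner"])

If the owning app shows up here, its update channel (Splunkbase or the app vendor) is where a fresh copy of the lookup would come from.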
Checking the "host" details these complaints are all coming from the heavy forwarders, so I assumed this is where I should be checking. Running the command on a heavy forwarder produced the followin...
See more...
Checking the "host" details these complaints are all coming from the heavy forwarders, so I assumed this is where I should be checking. Running the command on a heavy forwarder produced the following output: /opt/splunk/etc/system/default/props.conf DATETIME_CONFIG = /etc/datetime.xml While on any search head or indexer the output is as you suggested: /opt/splunk/etc/apps/Splunk_TA_nix/default/props.conf DATETIME_CONFIG = CURRENT There is no '/etc/datetime.xml', however there is an '/opt/splunk/etc/datetime.xml' file. I have no idea of the configuration is reffering to a non-existing file or the built in Splunk one. I don't know if or why anyone would have modified this setting, or the consequences of doing such. I'll do some reasearch on my own but any feedback and/or suggestions are more than welcome
I've parsed my InputFile (json-parser), and right before one of the missing events there is an error, something like "unexpected non-whitespace character". So I think it is not a problem with Splunk!
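To reproduce that finding in a scriptable way, the standard-library decoder can walk the file object by object and report exactly where the first bad character sits. A sketch, assuming the file is a series of top-level objects separated by commas/whitespace; the path is a placeholder:

import json

path = "<your-input-file>.json"  # hypothetical placeholder
decoder = json.JSONDecoder()

with open(path, encoding="utf-8") as fh:
    text = fh.read()

pos, count = 0, 0
while pos < len(text):
    # skip whitespace, commas and array brackets between top-level objects
    while pos < len(text) and text[pos] in " \t\r\n,[]":
        pos += 1
    if pos >= len(text):
        break
    try:
        _obj, pos = decoder.raw_decode(text, pos)
        count += 1
    except json.JSONDecodeError as err:
        line = text.count("\n", 0, err.pos) + 1
        print(f"object {count + 1} breaks at line {line}: {err.msg}")
        print(repr(text[max(0, err.pos - 60):err.pos + 60]))
        break

print(f"parsed {count} objects before stopping")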
Thank you for your questions @PickleRick. I'm using the forwarding mechanism. Here are the stanzas from the forwarder:

inputs.conf

[monitor:///daten/datasources/data/mg_test/entry2group/*.json]
disabled = false
index = mg_test
sourcetype = json_test
crcSalt = <SOURCE>
whitelist = .*\d{8}_Q\d_entry_entry2group\.v\d\.(\d\d\.){2}json$

props.conf

[json_test]
DATETIME_CONFIG =
TIMESTAMP_FIELDS = test.sys_created_on
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
category = Structured
description = test json
disabled = false
pulldown_type = true

I've copied this props.conf from my first try to upload (over Splunk Web). Here is the stanza from ../etc/system/local/props.conf:

[test_json]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
TIMESTAMP_FIELDS = test.sys_created_on
category = Structured
description = JavaScript Object Notation format. For more information, visit http://json.org/
disabled = false
pulldown_type = true

Further investigation shows you are on the right track! I found the following event in _internal:

08-25-2024
19:31:28.338 +0200 ERROR JsonLineBreaker [1737739 structuredparsing] -
JSON StreamId:1586716756715697390 had parsing error:Unexpected character
while looking for value: ',' -
data_source="daten/datasources/data/mg_test/entry2group/20240825_Q2_entry_entry2group.v0.03.01 .json]", data_host="socmg_local_fw", data_sourcetype="json_test" So in the next step i will isolate one event (object) which is lost if there are special sign in the data.
There is something not right about this. If your events are indeed formed this way (as multiline entries) and your LINE_BREAKER is set to ([\r\n]+), there is no way they are being ingested as a whole. Tell us more about how you are ingesting the data (and if you're reading a file with a forwarder, show us the relevant inputs.conf stanza and props.conf stanza from the forwarder).
Further investigation: I shortened the JSON objects from 44 to 43 lines.

{
"1.Entry": "1.Data",
...
"43.Entry": "43.Data"
},
... 48,186 similar entries ...
{
"1.Entry": "1.Data",
...
"43.Entry": "43.Data"
}

But forwarding the JSON file led to a count of 45,352 events (representing 45,352 JSON objects) instead of 48,188 objects. That's a little bit 'loco', I think.
We don't have access to your Splunk so we can't provide links to it. To find your license page, go to Settings->Licensing. You must have (or inherit) the Admin role and be using Splunk Enterprise.
Thank you @manjunathmeti. But it doesn't work; the result is the same as before. I think your advice helps when Splunk doesn't import a whole file, i.e. when it is not salted and/or the first characters in it don't differ from another file imported before. Further investigation: I have exported the items from Splunk (CSV) and compared the original file with the export. I can't see any pattern in which objects are imported and which are not. A pattern would be something like "the first 22,256 objects were imported"; instead I see that objects 66, 104, 108, 113, and so on are not imported. I think there is a limit on importing JSON objects. But what is it?
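One way to get the exact list of missing objects is to diff a unique key between the original file and the CSV export. A sketch, assuming each object carries a unique field (here called sys_id - a hypothetical name, use whatever key your records actually have), that the export contains that field as a column, and that the original parses as a JSON array (otherwise wrap the objects in [ ] first or reuse the object-by-object walker shown earlier):

import csv
import json

key = "sys_id"  # hypothetical unique field - adjust to your data

with open("<your-input-file>.json", encoding="utf-8") as fh:
    original = {obj[key] for obj in json.load(fh)}  # assumes a JSON array

with open("<your-splunk-export>.csv", newline="", encoding="utf-8") as fh:
    exported = {row[key] for row in csv.DictReader(fh)}

missing = sorted(original - exported)
print(f"{len(missing)} objects missing from the index, e.g. {missing[:10]}")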
We have created a new app registration as per the document and assigned the correct permissions as per the document, but we are still not able to pull the logs. The Splunk support portal has been down for 5 days; we need urgent support. The error is: invalid_client","error_description":"AADSTS7000216: 'client_assertion', 'client_secret' or 'request' is required for the 'client_credentials' grant type. Trace ID
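That AADSTS7000216 message says the token request reached Azure AD without any client credential, which usually means the configured client secret is empty or not being picked up. For comparison, a minimal client_credentials token request looks like the sketch below; tenant, client id, secret and scope are placeholders, not values from this thread:

import requests

# Hypothetical placeholders - fill in from the Azure app registration.
tenant_id = "<tenant-id>"
client_id = "<application-client-id>"
client_secret = "<client-secret-value>"  # the secret VALUE, not its ID

token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
payload = {
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret,  # AADSTS7000216 = this field was missing
    "scope": "https://graph.microsoft.com/.default",
}
resp = requests.post(token_url, data=payload, timeout=30)
print(resp.status_code, resp.json())

A common pitfall is pasting the secret's ID instead of its value, or the secret having expired since the registration was created.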
Hi, I still get an error regarding GlusterFS during another fresh install of the latest Splunk SOAR, even though I already updated the wording from mirror to vault in install_common.py and applied the changes to save them. Any idea where else I should update the links that throw the error? I have already installed the packages manually, but the installer still checks those links. Please help.
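If the installer still hits the old host, the string may live in more files than install_common.py. A quick hedged sketch to find every Python file in the installer tree that still references it; the root path and the search string are assumptions based on this post, not confirmed SOAR internals:

import os

root = "/path/to/soar-installer"  # hypothetical - your unpacked installer dir
needle = "mirror"  # the old repo hostname fragment mentioned above

for dirpath, _dirs, files in os.walk(root):
    for name in files:
        if not name.endswith(".py"):
            continue
        path = os.path.join(dirpath, name)
        try:
            with open(path, "r", errors="ignore") as fh:
                for lineno, line in enumerate(fh, 1):
                    if needle in line:
                        print(f"{path}:{lineno}: {line.strip()}")
        except OSError:
            pass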
As I said - while the connection might work without properly authenticating the server (verifying the server's certificate), the proper way of working with a TLS-protected connection is to make sure the server is who it claims to be. So you should make sure your Python app can properly verify the server's certificate - the server's cert should be issued by a CA that your Python code trusts. That is one thing, but it's a general security point and not what directly causes the server to return 401. A 401 means you're not providing correct authentication data. As I said before - if supposedly the same token works with another host or app, compare the requests made by the working app and the non-working app and check what is different. We can't know what's wrong, as so far the only thing we have is "the server says 401".
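To make that comparison concrete, you can print exactly what your client is about to send before sending it. A sketch with the requests library; the URL, token, and CA bundle path are placeholders:

import requests

url = "https://splunk.example.com:8089/services/server/info"  # hypothetical
token = "<bearer-token>"

session = requests.Session()
req = requests.Request("GET", url,
                       headers={"Authorization": f"Bearer {token}"})
prepared = session.prepare_request(req)

# Print exactly what will be sent, to compare with the working client.
print(prepared.method, prepared.url)
for name, value in prepared.headers.items():
    print(f"{name}: {value}")

# verify should point at a CA bundle that trusts the server's issuer.
resp = session.send(prepared, verify="/path/to/ca_bundle.pem", timeout=30)
print(resp.status_code)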
When I try to log in to Splunk, it gives me authentication options. Once the username and password are provided, it gives me an error. Also, when I checked web_service.log, I see the following error:

'Error connecting to /services/authentication/users/splunk: timed out'
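That log line suggests Splunk Web timed out talking to splunkd on the management port (8089 by default - an assumption, since the port isn't shown in the post). A quick probe sketch to check whether splunkd answers at all from the same host:

import requests

# Hypothetical: splunkd management endpoint, default port 8089.
url = "https://localhost:8089/services/server/info"

try:
    # verify=False only for a quick local probe; use a CA bundle normally.
    resp = requests.get(url, verify=False, timeout=5)
    print("splunkd answered:", resp.status_code)
except requests.exceptions.Timeout:
    print("splunkd did not answer within 5s - same symptom as web_service.log")
except requests.exceptions.ConnectionError as err:
    print("connection failed:", err)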
Hi @whales, could you better describe your question? Because the answer is too simple: when you have to search data stored in that index! Ciao. Giuseppe
Hi Team, hope this message finds you well. I have a new Splunk on-premise instance, and we are planning to implement the Splunk TrackMe app on our SHC to monitor data latency, missing data, etc. for our instance. I read in a few docs (https://trackme.readthedocs.io/en/latest/deployment.html) that it is resource-consuming. I want to understand whether it will impact our license consumption, apart from CPU and memory, after deployment. Also, do we need a separate license for Splunk TrackMe? What are its cons? Please reply soon; thanks in advance.
@MK3- I believe it's a permission and/or app-context issue. When you create the service object, provide the same username you use to log in to the Splunk UI, and provide the same app name in which your search works fine on the UI:

import splunklib.client as client

service = client.connect(
    host="<ip/hostname>",
    username="<username>",
    password="<user-passwd>",
    app="<same app as you use on the UI>",
)

I hope this helps!