Activity Feed
- Posted Re: props and transforms to extract multiline multivalue event on Splunk Search. 02-06-2023 11:48 PM
- Posted How can I use props and transforms to extract multiline multivalue event? on Splunk Search. 02-06-2023 11:26 PM
- Tagged How can I use props and transforms to extract multiline multivalue event? on Splunk Search. 02-06-2023 11:26 PM
- Posted Re: Stuck with Splunk ES Upgrade on Splunk Enterprise Security. 05-01-2022 08:00 PM
- Posted Stuck with Splunk ES Upgrade on Splunk Enterprise Security. 04-27-2022 08:42 PM
- Posted Re: Splunk Apps throwing Invalid key python3 warnings on All Apps and Add-ons. 03-03-2022 03:17 PM
- Posted Re: Splunk Apps throwing Invalid key python3 warnings on All Apps and Add-ons. 03-03-2022 03:13 PM
- Posted Splunk Apps throwing Invalid key python3 warnings on All Apps and Add-ons. 03-02-2022 07:16 PM
- Posted Re: eval not working in props.conf on Getting Data In. 02-15-2022 08:48 PM
- Posted Re: eval not working in props.conf on Getting Data In. 01-26-2022 03:25 PM
- Posted Re: eval not working in props.conf on Getting Data In. 01-26-2022 03:24 PM
- Posted Why does eval search work but eval in the props conf file doesn't creating new field? on Getting Data In. 01-24-2022 07:18 PM
- Posted Re: Search Time extraction not working on Splunk Enterprise. 12-09-2021 03:01 PM
- Karma Re: Search Time extraction not working for isoutamo. 12-09-2021 03:00 PM
- Posted Re: Search Time extraction not working on Splunk Enterprise. 12-09-2021 02:18 PM
- Posted Search Time extraction not working on Splunk Enterprise. 12-08-2021 04:52 PM
- Got Karma for Re: How to configure a sourcetype for JSON data to parse each line as a distinct event?. 01-28-2021 03:55 PM
- Posted Data Anonimization - Multiple transforms not working for single _raw event on Getting Data In. 09-17-2020 12:23 AM
- Posted Re: JSON payloads not getting indexed into Splunk on Getting Data In. 08-24-2020 06:54 AM
02-06-2023
11:48 PM
Sure @gcusello, and what else should I put in the conf files to extract those fields as multivalue?
02-06-2023
11:26 PM
Hi experts,
I am trying to extract multivalue output from a multiline JSON field through props and transforms. How best can I achieve this for the below sample data (the my_mvdata field)?
I can write a regex in props.conf with a \t delimiter, but I only get the first line. How do I use MV_ADD and do it through transforms?
{
something: false
somethingelse: true
blah:
blah:
my_mvdata: server1 count1 country1 code1 message1
server2 count1 country1 code1 message2
server3 count1 country1 code1 message3
server4 count1 country1 code1 message4
blah:
blah:
}
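A sketch of search-time config that could turn repeated rows like these into multivalue fields (the stanza names, field names, and sourcetype are assumptions; the regex assumes whitespace-separated columns and would need adjusting if the real delimiter is a tab):

```
# transforms.conf -- [mv_rows] is a made-up stanza name
# The optional "my_mvdata:" prefix lets the first row match too.
[mv_rows]
REGEX = (?m)^\s*(?:my_mvdata:\s+)?(\S+)\s+(\S+)\s+(\S+)\s+(\S+)\s+(\S+)\s*$
FORMAT = server::$1 count::$2 country::$3 code::$4 message::$5
MV_ADD = true

# props.conf
[your_sourcetype]
REPORT-mv_rows = mv_rows
```

With MV_ADD = true, each regex match appends to the named fields instead of overwriting the first match, which is what turns the repeated lines into multivalue fields.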
- Tags:
- multiline
- multivalue
Labels:
- field extraction
04-27-2022
08:42 PM
Hi Helpers - Below is the use case where I am stuck with my ES upgrade. My Splunk version was recently upgraded from 7.2.7 to 8.1.3. After the Splunk upgrade, Splunk ES views were throwing pop-up messages "Timelines could not be loaded". Splunk ES was on 4.5.2, which worked fine on Splunk 7.2.7. Since it looked incompatible, we planned to upgrade it to 6.2.0. Below is the process followed (it's an SHC environment with 3 search heads):

1) On the ES deployer, took backups of the etc/shcluster/apps folder to etc/apps.
2) On the ES deployer, copied the apps (SA-*, DA-*, SplunkEnterpriseSecuritySuite) from etc/shcluster/apps to the etc/apps folder.
3) Ran the upgrade command: /opt/splunk/bin/splunk install app ./splunk-enterprise-security_620.spl -update 1
4) Ran the essinstall command per the install documentation: /opt/splunk/bin/splunk search '| essinstall --deployment_type shc_deployer' -auth admin:<password> action=upgrade (output attached)
5) /opt/splunk/bin/splunk restart (multiple invalid stanzas; output attached)
6) Planning to replace all conf files from the backup app directories into the upgraded app directories, since we noticed the conf files changed. Not sure which ones to replace or what the consequences are - PENDING.

I am a bit confused by the documentation. The upgrade documentation didn't have the essinstall action=upgrade part, but I read about it in a blog. Am I supposed to run it or not? When I followed the upgrade documentation, only the SplunkEnterpriseSecuritySuite app folder changed and the remaining SA-* and DA-* apps were unchanged. But SA-* and DA-* did change when I ran the essinstall command followed by a splunk restart. All of this is just on the deployer; I haven't pushed any changes to the search heads. Has anyone recently done an ES upgrade and can share clear steps to follow? I raised a Splunk support case and they are advising to just follow the upgrade documentation, which is not fully clear. Thanks & Regards, Naresh
Labels:
- upgrade
- using Enterprise Security
03-03-2022
03:17 PM
And I didn't go with the Python Upgrade Readiness App because it said "This app can scan Splunk apps installed on Splunk Enterprise version 7.3 and higher", while I'm sitting on 7.2.x and didn't want to upgrade my suite to 7.3.x just to use this app.
03-03-2022
03:13 PM
Thanks for responding @gcusello. Below are the steps performed from my end (I am currently on 7.2.x):

1) Installed the "Splunk Platform Upgrade Readiness App" - https://splunkbase.splunk.com/app/4698/
2) Ran a full scan of apps and got blockers/warnings for the mentioned apps.
3) The scan recommended upgrading the apps, so I picked the dual-compatible versions and upgraded them.
4) Got the python3 messages after a splunk restart. Since I am on 7.2.x, I commented out those warning lines (python.version = python3) from the conf files.

My question is: why are these versions described as dual compatible when python3 is mentioned in them? And was what I did the right thing? After the app upgrades, I ran the scan again and the warnings and blockers are still there, with a different message this time, giving detailed syntax changes for the py scripts: "You can ask the app developer to update this app using their contact information on Splunkbase. If the app is unsupported, you can make changes yourself. Learn more." Is the "Splunk Platform Upgrade Readiness App" a legitimate and accurate scanning app?
03-02-2022
07:16 PM
My current Splunk environment is on 7.2.x. As part of the Splunk 8.x upgrade, I am first trying to upgrade the below apps to dual-compatible versions (for both 7.2.x and 8.x):

- Splunk Supporting Add-on for Active Directory - to version 3.0.1
- Splunk App for Unix - to version 6.0.0

Though both are listed as compatible with 7.2.x through 8.2.x, I am getting the below warnings on my deployer. Can someone who has recently upgraded in the same scenario confirm they faced no issues, so that I can ignore the warnings and push the apps to the search heads?

Invalid key in stanza [ldapsearch] in /opt/splunk/etc/apps/SA-ldapsearch/default/commands.conf, line 2: python.version (value: python3).
Invalid key in stanza [ldapfetch] in /opt/splunk/etc/apps/SA-ldapsearch/default/commands.conf, line 11: python.version (value: python3).
Invalid key in stanza [ldapfilter] in /opt/splunk/etc/apps/SA-ldapsearch/default/commands.conf, line 21: python.version (value: python3).
Invalid key in stanza [ldapgroup] in /opt/splunk/etc/apps/SA-ldapsearch/default/commands.conf, line 31: python.version (value: python3).
Invalid key in stanza [ldaptestconnection] in /opt/splunk/etc/apps/SA-ldapsearch/default/commands.conf, line 41: python.version (value: python3).
Invalid key in stanza [script://./bin/update_hosts.py] in /opt/splunk/etc/apps/splunk_app_for_nix/default/inputs.conf, line 2: python.version (value: python3).
Invalid key in stanza [admin_external:unix_conf] in /opt/splunk/etc/apps/splunk_app_for_nix/default/restmap.conf, line 6: python.version (value: python3).
Invalid key in stanza [admin_external:alert_overlay] in /opt/splunk/etc/apps/splunk_app_for_nix/default/restmap.conf, line 12: python.version (value: python3).
Invalid key in stanza [admin_external:sc_headlines] in /opt/splunk/etc/apps/splunk_app_for_nix/default/restmap.conf, line 22: python.version (value: python3).
Invalid key in stanza [admin_external:unix_configured] in /opt/splunk/etc/apps/splunk_app_for_nix/default/restmap.conf, line 32: python.version (value: python3).
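For context, the flagged keys come from stanzas like the following (an abridged sketch reconstructed from the warning lines above; the exact file contents may differ). `python.version` was introduced in Splunk 8.0, so a 7.2.x splunkd reports it as an invalid key but otherwise ignores it:

```
# /opt/splunk/etc/apps/SA-ldapsearch/default/commands.conf (abridged sketch)
[ldapsearch]
python.version = python3

[ldapfetch]
python.version = python3
```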
Labels:
- upgrade
02-15-2022
08:48 PM
@PickleRick - your point 1) is valid. I have defined the field extraction inside a custom app, but I am searching from the default search app. How can I run a search from the search app with that sourcetype and still get my extracted field? I can see the permissions on that custom app are read/write for everyone.
01-26-2022
03:25 PM
1) Does the user you're searching with have proper permissions to the app the field is defined in? Yes - searching as the admin user.
2) Are you sure you're not using fast mode? Smart mode.
01-24-2022
07:18 PM
Hi,
My environment has multiple apps. I have a requirement to default a value into a temp field. The eval works in a search, but the eval in props.conf isn't creating the new field. Please help me troubleshoot.
My conf files are below:
INPUTS ON FORWARDERS:
[monitor:///var/log/omega]
index=foo_bar_transaction
sourcetype=foo_car
PROPS ON SHC:
[foo_car]
EVAL-tempvariable = "Test_Eval"
EVAL-datacenter = if(IN(mvindex(split(host,"-"),1),"clc","dkn"),"DANGER",mvindex(split(host,"-"),1))
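The expression itself can be sanity-checked from the search bar before relying on props.conf (a sketch with a made-up host value; for "abc-clc-01" the segment after the first dash is "clc", so datacenter should come out as DANGER):

```
| makeresults
| eval host="abc-clc-01"
| eval datacenter=if(IN(mvindex(split(host,"-"),1),"clc","dkn"),"DANGER",mvindex(split(host,"-"),1))
```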
INDEXER:
/opt/splunk/bin/splunk cmd btool props list foo_car --debug
/opt/splunk/etc/slave-apps/INFRA_APP_logs/default/props.conf [foo_car]
/opt/splunk/etc/system/default/props.conf ADD_EXTRA_TIME_FIELDS = True
/opt/splunk/etc/system/default/props.conf ANNOTATE_PUNCT = True
/opt/splunk/etc/system/default/props.conf AUTO_KV_JSON = true
/opt/splunk/etc/system/default/props.conf BREAK_ONLY_BEFORE =
/opt/splunk/etc/system/default/props.conf BREAK_ONLY_BEFORE_DATE = True
/opt/splunk/etc/system/default/props.conf CHARSET = UTF-8
/opt/splunk/etc/system/default/props.conf DATETIME_CONFIG = /etc/datetime.xml
/opt/splunk/etc/system/default/props.conf DEPTH_LIMIT = 1000
/opt/splunk/etc/system/default/props.conf HEADER_MODE =
/opt/splunk/etc/system/default/props.conf LEARN_MODEL = true
/opt/splunk/etc/system/default/props.conf LEARN_SOURCETYPE = true
/opt/splunk/etc/system/default/props.conf LINE_BREAKER_LOOKBEHIND = 100
/opt/splunk/etc/system/default/props.conf MATCH_LIMIT = 100000
/opt/splunk/etc/system/default/props.conf MAX_DAYS_AGO = 2000
/opt/splunk/etc/system/local/props.conf MAX_DAYS_HENCE = 40
/opt/splunk/etc/system/default/props.conf MAX_DIFF_SECS_AGO = 3600
/opt/splunk/etc/system/default/props.conf MAX_DIFF_SECS_HENCE = 604800
/opt/splunk/etc/system/default/props.conf MAX_EVENTS = 256
/opt/splunk/etc/system/default/props.conf MAX_TIMESTAMP_LOOKAHEAD = 128
/opt/splunk/etc/system/default/props.conf MUST_BREAK_AFTER =
/opt/splunk/etc/system/default/props.conf MUST_NOT_BREAK_AFTER =
/opt/splunk/etc/system/default/props.conf MUST_NOT_BREAK_BEFORE =
/opt/splunk/etc/system/default/props.conf SEGMENTATION = indexing
/opt/splunk/etc/system/default/props.conf SEGMENTATION-all = full
/opt/splunk/etc/system/default/props.conf SEGMENTATION-inner = inner
/opt/splunk/etc/system/default/props.conf SEGMENTATION-outer = outer
/opt/splunk/etc/system/default/props.conf SEGMENTATION-raw = none
/opt/splunk/etc/system/default/props.conf SEGMENTATION-standard = standard
/opt/splunk/etc/system/default/props.conf SHOULD_LINEMERGE = True
/opt/splunk/etc/system/default/props.conf TRANSFORMS =
/opt/splunk/etc/system/default/props.conf TRUNCATE = 10000
/opt/splunk/etc/system/default/props.conf detect_trailing_nulls = false
/opt/splunk/etc/system/default/props.conf maxDist = 100
/opt/splunk/etc/system/default/props.conf priority =
/opt/splunk/etc/system/default/props.conf sourcetype =
Labels:
- props.conf
12-09-2021
03:01 PM
@isoutamo - it worked after setting up permissions in default.meta. Thanks for your reply 🙂
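For anyone hitting the same symptom: the kind of default.meta change that makes app-scoped knowledge objects visible outside their app looks roughly like this (a sketch, not the exact stanza used in this environment):

```
# my_app/metadata/default.meta -- sketch; exports all objects in the app
[]
access = read : [ * ], write : [ admin ]
export = system
```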
12-09-2021
02:18 PM
Yes, the sourcetypes and indexes are just examples for this forum. My config doesn't have typos.
12-08-2021
04:52 PM
Hi, I am currently working in a new environment where I am trying to do field extraction based on a pipe delimiter.

1) A new app (say my_app) with only an inputs.conf is pushed onto the target UF through the deployment server.

inputs.conf:
[monitor:///path1/file1]
index=my_index
sourcetype=my_st

2) Data is getting ingested, and the requirement is to do field extraction on all the events separated by the pipe delimiter (12345|2021-09-12 11:12:34 345|INFO|blah|blah|blah blah).

My approach: create a new app (a plain my_app folder) on my deployer and push it to the search heads with the below conf files. I felt this was simple to achieve and did it, but somehow it's not working. Did I miss any step to link the app on the forwarder and the SHC?

ls my_app/default/
app.conf props.conf transforms.conf
props.conf
[my_st]
REPORT-getfields = getfields
transforms.conf
[getfields]
DELIMS = "|"
FIELDS = "thread_id","timestamp","loglevel","log_tag","message"
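Once the app is pushed to the search heads (and its objects are shared outside the app), the extraction can be verified with a search like this (a sketch using the field names from the transforms above):

```
index=my_index sourcetype=my_st
| table thread_id timestamp loglevel log_tag message
```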
Labels:
- configuration
09-17-2020
12:23 AM
Hi Punters, I am facing issues with data anonymization. Below are my conf files. My transforms.conf anonymizes the data when my _raw event has either one of the regex patterns, but it's not anonymizing the _raw event when it has both patterns. Need help please. The xml-anonymizer also doesn't work if my _raw event contains a JSON message, though it works fine when the _raw event is a normal line.

props.conf

[dp_logs_multiline]
CHECK_METHOD = modtime
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}.\d{3}
category = Custom
disabled = false
pulldown_type = 1
MAX_TIMESTAMP_LOOKAHEAD = 24
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N
TIME_PREFIX = ^
TRANSFORMS-anonymize = json-anonymizer, xml-anonymizer
ANNOTATE_PUNCT = false
TRUNCATE = 100000
MAX_EVENTS = 10000

transforms.conf

[json-anonymizer]
REGEX = (?ms)^(.*\"[sS]hippingAddress\"\s+\:\s+\{)[\s\S]*?(\}.*)$
FORMAT = $1#########JSON PCC DATA ANONIMIZED#############$2
REPEAT_MATCH = true
MV_ADD = true
DEST_KEY = _raw

[xml-anonymizer]
REGEX = (?ms)^(.*\<[bB]illTo\>)[\s\S]*?(\<\/[rR]equestMessage\>.*)$
FORMAT = $1#########XML PCC DATA ANONIMIZED#############$2
REPEAT_MATCH = true
MV_ADD = true
DEST_KEY = _raw
Labels:
- heavy forwarder
- props.conf
- transforms.conf
08-24-2020
06:54 AM
Hi @to4kawa, do you reckon that resolves my issue - moving the SEDCMDs after INDEXED_EXTRACTIONS=json? I am not sure if my conf is wrong, as I picked it up from another Splunk Answers post. It works for me except for the issue I raised in this ticket.
08-23-2020
11:57 PM
Hi, below is the props.conf on my heavy forwarder. I have recently found that a few JSON messages are completely missing from indexing in Splunk. It's a high-transaction system. When I check my source JSON logs, e.g. out of 10 JSON payloads, 1-2 don't get indexed, even though all 10 payloads have similar content and the same number of lines.

[dp_json]
SEDCMD-strip_prefix = s/^[^{]+//g
SEDCMD-dumpxml = s/(\<|\>\\r\\n).*//g
SEDCMD-remove = s/\"(shippingAddress)\"\s+\:\s+{[\s\S]*?(?=\n.*?{)//g
INDEXED_EXTRACTIONS = JSON
NO_BINARY_CHECK = true
category = Custom
description = dp_json_custom
disabled = false
pulldown_type = true
DATETIME_CONFIG = CURRENT
TRUNCATE = 100000
MAX_EVENTS = 10000

I couldn't troubleshoot via splunkd.log on the forwarder because I continuously get the below messages in it. I can't ask the source application to change the JSON payloads to rectify this error, so I am living with it.

08-24-2020 13:19:52.474 +1000 ERROR JsonLineBreaker - JSON StreamId:10360380474397151566 had parsing error:Unexpected character while looking for value: 'a' - data_source="D:\Logs\myjson.log", data_host="myjsonhost", data_sourcetype="dp_json"

Is there a way to get notified if any events miss indexing? Hope someone has faced the same issue. Need an urgent resolution, as we don't want to miss any data in Splunk. Thanks, Naresh
08-23-2020
08:31 PM
Hi, I have a remote file (on server 2) which can be accessed directly from my indexer (on server 1). What is the best and recommended way to ingest data from that file into the indexer?

1) Read directly via the indexer's inputs.conf (monitor:// the remote path to the file) - everything on server 1.
2) Install a universal forwarder on the target machine and forward the data (the complete log file; no props and transforms) - indexer on server 1 and forwarder on server 2.

What's the main difference between these two options? Pros and cons? Thanks
08-23-2020
08:25 PM
Hi @richgalloway, I am struggling with the regex, actually. My regex only captures a partial JSON message (up to the first "}"). I am trying to match all lines between a line starting with "{" and a line starting with "}", but ^ is not matching in my search. So I am currently stuck with this regex -- \{[\s\S]*?\} -- against nested data like { { {}, }, }.
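One hedged suggestion: in PCRE (which SPL's rex uses), ^ only matches at the start of every line when the multiline flag is set, so a (?ms) prefix may be the missing piece. A sketch (the json_block field name is made up, and this is untested against the real data):

```
... | rex field=_raw "(?ms)(?<json_block>^\{.*?^\})"
```

(?m) makes ^ anchor at each newline, (?s) lets . cross newlines, and the non-greedy .*? stops at the first line that begins with "}". Note that a nested "}" at the start of a line would still end the match early.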
08-20-2020
06:51 PM
Hi,
I want to extract all the log events (normal lines) except JSON messages. There should be an easy way for this. Any hints, please?
My log file is a mix something like below
----------
normal line
normal line
json events {
{json messages}
}
normal line
etc
etc
Thanks,
Naresh
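One common pattern for this (sketched here; the stanza names and regex are assumptions, and it presumes the JSON blocks are broken into their own events rather than merged with normal lines) is to route the JSON events to the nullQueue at index time:

```
# props.conf -- sketch
[your_sourcetype]
TRANSFORMS-dropjson = drop_json_events

# transforms.conf -- sketch; drops events whose first line starts with {, }, or "
[drop_json_events]
REGEX = ^\s*[{}"]
DEST_KEY = queue
FORMAT = nullQueue
```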
Labels:
- JSON
01-29-2020
03:24 PM
Hi @niketnilay - my series is not fixed, and I just want to make the bigger pie slices bold.
Any working use case with a similar requirement? Somehow I couldn't attach the image in a reply to your comment, so I'm posting it as an answer.
01-28-2020
06:35 PM
@niketnilay - Is there a way to make the font of "ERROR" alone bold using CSS?
I have a trellis pie chart and am looking to bold the slice with the maximum count (the bigger pie slice) in each trellis panel.
Thanks
Naresh
12-16-2019
07:12 PM
Is there a way to use Splunk to extract data from a SQL DB and send it (using a heavy forwarder?) as a CSV to a remote share without indexing the actual data? I don't want to see any of the data in Splunk - rather, I'd use Splunk purely as an integration tool.
Is this possible? Any successful use cases would help me implement this.
TA,
Naresh
10-03-2019
07:33 AM
I want to exclude part of a JSON message before indexing. How can I achieve that? Below is a sample JSON. I used a SED command in props.conf to exclude the first line and keep it JSON-only for indexed extraction, and another 2 SEDs in props.conf to ignore a section of XML message (part of the below JSON) as well.
How can I write a SED to ignore the "Ignore this" section below?
2019-10-02_09:09:09.234 My JSON message is
{A_message
{"blah": is blah
"blah2": is blah2
"blah3" : {
blah blah blah
}
"Ignore this" :
{
"ignore1": ignore1
ignore2: "ignore2"
}
}
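A hedged sketch of the kind of SEDCMD that could drop that section (the stanza name is made up, and the pattern assumes the "Ignore this" block never contains a nested closing brace before its own):

```
# props.conf -- sketch
[my_json_sourcetype]
SEDCMD-drop_ignore = s/"Ignore this"\s*:\s*\{[^}]*\}//g
```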
09-30-2019
10:19 PM
Agree with Renjith's comments. But if you also need to capture the time of the max event, then try this:
"your search"
|untable _time Host AV
|eventstats max(AV) as max_AV by Host
| where AV=max_AV
| table _time Host AV