All Posts



I'll see if I can remove the time stamps in the raw data, since they are causing parsing issues.
I've been searching for the same answer, as Splunk ES is limiting in this regard. Most of our other tools are found elsewhere. To expedite review or mitigation, it would be very helpful to add a link in the next steps that says "go to the EDR, the Proofpoint server, O365, etc." vs. the SOC analyst needing to fumble through his/her bookmarks. If this doesn't exist, I sure hope it's on the roadmap.
For the one single _raw event, it would be the following: eval _raw="TimeStamp Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value" My apologies, I didn't include the TimeStamp since it didn't appear important when evaluating the data. Still trying to figure out the Splunk lingo.
There is now a conflict between the corrected mock data and the emulation pseudo code. The former seems to imply that Component contains what you want as Field-Type, but the latter directly uses Field-Type as the field name.

Let's take baby steps. First, can you confirm that your _raw events look like, or contain, something like the following emulation? In other words, is the mock data you gave emulating _raw?

| makeresults
| eval data=split("Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value", " ")
| mvexpand data
| rename data AS _raw
``` emulation assuming Splunk "forgets" to extract ```

_raw                                                          _time
Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value   2024-02-14 11:10:02
Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value   2024-02-14 11:10:02
Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value   2024-02-14 11:10:02

(See how similar this is to my previous emulation? You can simply adapt the formula to the field names.) Whether you use a forwarder or some other mechanism to ingest data is not a factor in Splunk extraction. But if Splunk does NOT give you Component and Section_5, you should dig deeper with your admin. Maybe post the props.conf that contains this sourcetype. You can always run | extract with _raw, but it would be so much better if you don't have to.

Or, do you mean all these 3 (and more) lines form one single _raw event, i.e.,

TimeStamp Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value

In other words, does this emulation better resemble your _raw events?
| makeresults
| eval _raw="TimeStamp Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value"

_raw:  TimeStamp Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value
_time: 2024-02-14 11:20:05
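As a side note, the split-then-mvexpand pattern in the first emulation can be sketched outside Splunk. This toy Python script (values shortened and illustrative, not your real data) mirrors splitting one _raw string on spaces into one row per component, then extracting the key=value pairs:

```python
raw = ("Component=F_Type_1,Section_5=V1 "
       "Component=F_Type_2,Section_5=V2 "
       "Component=F_Type_3,Section_5=V3")

# split(data, " ") produces a multivalue field; mvexpand makes one row per value
rows = raw.split(" ")

for row in rows:
    # KV extraction: each row is a comma-separated list of key=value pairs
    fields = dict(pair.split("=", 1) for pair in row.split(","))
    print(fields["Component"], fields["Section_5"])
```

This is only an emulation of what Splunk's automatic key=value extraction would do if the events were already broken into one component per event.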
Hi @Fernando.Moreira, I was informed that is correct. 
Hello All, I have the below SPL to compare hourly event data and indexed data, to find whether they follow a similar pattern and whether there is a big gap.

|tstats count where index=xxx sourcetype=yyy BY _indextime _time span=1h
|bin span=1h _indextime as itime
|bin span=1h _time as etime
|eventstats sum(count) AS indexCount BY itime
|eventstats sum(count) AS eventCount BY etime
|timechart span=1h max(eventCount) AS event_count max(indexCount) AS index_count

However, when I compare the hourly results, I get more data points in the indexed data than in the event data. Can you please guide me to resolve the problem? Thank you, Taruchit
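For what it's worth, one common reason the index-time series shows more buckets than the event-time series is indexing lag: an event's _indextime can fall into a later hour than its _time, spreading the same events across more index-time buckets. A toy Python sketch of the two span=1h binnings (epoch values made up for illustration):

```python
from collections import Counter

# (event_time, index_time) epoch pairs; the third event is indexed
# after the hour boundary, so it lands in a later index-time bucket
pairs = [(3600, 3650), (3700, 3720), (7100, 7300)]

def hour_bucket(epoch):
    return epoch // 3600 * 3600  # equivalent of bin span=1h

event_count = Counter(hour_bucket(e) for e, i in pairs)
index_count = Counter(hour_bucket(i) for e, i in pairs)

print(event_count)  # Counter({3600: 3})
print(index_count)  # Counter({3600: 2, 7200: 1})
```

Same three events, but the index-time series occupies two buckets while the event-time series occupies one.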
@ITWhisperer No, I mean I already tried what you suggested. As below, under field PBC the field values come under A00001 and A00002, and the same for KCG and TTK, so these can show up in the table like above.
env="*A00001*" as "PBC"
env="*A00002*" as "PBC"
env="*A00005*" as "KCG"
env="*A00020*" as "TTK"
My forwarder refuses to connect to the manager over 8089. The firewall is allowing traffic, set deploy-poll is working, and yet I cannot see the connection even being attempted via netstat on the Splunk universal forwarder (nix). UF ---> HF

Here is my deploymentclient.conf:

[deployment-client]

[target-broker:deploymentServer]
#this was part of default after command was run
deploymentServer=x.x.x.x:8089
targetUri = 10.1.10.69:8089  #this was part of default after command was run
Hi @Jahnavi.Vangari, I just had a look at your ticket, and Support suggests this would need to be a feature request, which you can submit on the Idea Exchange. 
Hi @Vinit.Jain, Thanks for asking your question on the Community. I did some searching and found this AppD Docs page that may help. https://docs.appdynamics.com/appd/21.x/21.12/en/application-monitoring/install-app-server-agents/php-agent/install-the-php-agent/install-the-php-agent-by-shell-script If that doesn't lead to anything, you may want to contact AppD Support for more help. How do I submit a Support ticket? An FAQ 
For clarification, I am currently using the SplunkForwarder to monitor a custom log file which auto-updates every 10 seconds. This custom log file is monitored on multiple hosts. After looking at my previous example, I realized I incorrectly stated the data format; this is the correct data structure displayed in Splunk:

TimeStamp Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value

There are 4 other sections, but for brevity, the only value I am getting from each component is the last section. If there is a better way of structuring the data so Splunk can auto-detect the new fields, rather than using regex extraction, that would be wonderful. Splunk will be getting the latest entry for each host:

Host           Component  Value                     Time
Unique_Host_1  F_Type_1   F_Type_1_Section_5_Value  00:00:00
Unique_Host_1  F_Type_2   F_Type_2_Section_5_Value  00:00:00
Unique_Host_1  F_Type_3   F_Type_3_Section_5_Value  00:00:00
Unique_Host_2  F_Type_1   F_Type_1_Section_5_Value  00:00:00
Unique_Host_2  F_Type_2   F_Type_2_Section_5_Value  00:00:00
Unique_Host_2  F_Type_3   F_Type_3_Section_5_Value  00:00:00
.....

The Splunk table creation would be something like this:

index="hosts" sourcetype="logname"
| eval data=split("Field-Type=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value Field-Type=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value Field-Type=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value", " ")
| stats latest(data) by host
| mvexpand data
| rename Section_5 AS Value
| extract
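If it helps, the "latest entry per host and component" step (stats latest(...) BY host, Component) can be sketched in plain Python; the hosts, values, and times below are mock data, not the real log contents:

```python
# Mock events as (host, component, value, time); later times are newer
events = [
    ("Unique_Host_1", "F_Type_1", "old_value", "00:00:00"),
    ("Unique_Host_1", "F_Type_1", "new_value", "00:00:10"),
    ("Unique_Host_2", "F_Type_1", "v2",        "00:00:10"),
]

# stats latest(Value) BY host Component: keep the last-seen value per key
latest = {}
for host, component, value, _time in sorted(events, key=lambda e: e[3]):
    latest[(host, component)] = value

print(latest[("Unique_Host_1", "F_Type_1")])  # new_value
```

The dict keyed on (host, component) plays the role of the BY clause: each new event for the same key overwrites the older value.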
That worked perfectly!
Do you mean something like this?
index=foo sourcetype=json_foo source="az-foo"
|rename tags.envi as env
|search env="*A00001*" OR env="*A00002*" OR env="*A00005*" OR env="*A00020*"
|stats count by env
I need to list which data sources have data models. I tried a few ways, but none of them were effective. Can you help me, please? Best regards, Valderlúcio.
I am not quite sure I follow which string you want to extract, but try something like this:

| rex field=_raw "next\_best\_thing.+?description(?<NBT>.+?)}"

Note, the .+ that you used is greedy; by adding a question mark (.+?), it reduces the amount consumed after the _thing anchor until it reaches the next description (not the last description, as the greedy match does).
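The greedy-versus-lazy difference is easy to see outside Splunk too. This Python snippet (a toy string, not the real data) shows how .+ extends the group to the last closing brace while .+? stops at the first one after the anchor; note that both scan left to right, so the match does not start from the end of the string:

```python
import re

raw = "next_best_thing ... description AAA} description BBB} topic"

# Greedy: .+ consumes as much as possible, so the group runs to the LAST }
greedy = re.search(r"description(.+)\}", raw).group(1)

# Lazy: .+? consumes as little as possible, stopping at the FIRST }
lazy = re.search(r"description(.+?)\}", raw).group(1)

print(repr(greedy))  # ' AAA} description BBB'
print(repr(lazy))    # ' AAA'
```

The same quantifier semantics apply inside rex, since Splunk uses PCRE-style regular expressions.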
I'm new to REX and trying to extract strings from _raw (which is actually malformed JSON, so SPATH is not a good option either). I was able to create a REX to identify the pattern that I want (or kind of). However, I'm having trouble establishing the correct boundaries. This is where my lack of experience with REX is showing: I cannot establish the end of my pattern correctly. I have pasted the expression that I'm using and a cleaned-up sample of the text I'm dealing with.

| rex field=_raw "next\_best\_thing.+description(?<NBT>.+)topic"

I thought this would identify the beginning of my pattern as next_best_thing (as it does) and the end after the first description, and capture the group (NBT) as \\\":\\\"Another quick brown fox jumps over the lazy dog.\\\"},{\\\" (just before the first topic). Naturally, a lot of clean-up would still be necessary, but I would have something to work with. However, it seems that the search starts from the end of the _raw string, so the description that is being captured is in a different part, and the group becomes something completely different from what I intended (\\\":\\\"A third quick brown fox jumps over the lazy dog\xAE Bla Bla BlaBla?\xA0 And a forth The quick brown fox jumps over the lazy dog.\\\"},{\\\").

Also, if the expression is just | rex field=_raw "next\_best\_thing.+description(?<NBT>.+)", omitting the end boundary (topic), the whole pattern changes, with a completely different description being used as the end boundary, and naturally the group changes completely. The latter reinforces the impression that the searches are being performed from the end of _raw. Is there a way to change the search direction? Or am I even more wrong / lost than I think on how to establish the boundaries for the pattern and group?
"BlaBla_BlaBla_condition\\\":\\\"\\\",\\\"OtherBla\\\":{\\\"description\\\":\\\"The quick brown fox jumps over the lazy dog\\\",\\\"next_best_thing\\\":[{\\\"topic\\\":\\\"Target Public\\\",\\\"description\\\":\\\"Another quick brown fox jumps over the lazy dog.\\\"},{\\\"topic\\\":\\\"Benefit to Someone\\\",\\\"description\\\":\\\"A third quick brown fox jumps over the lazy dog\xAE Bla Bla BlaBla?\xA0 And a forth The quick brown fox jumps over the lazy dog.\\\"},{\\\"topic\\\":\\\"Call to Something\\\",\\\"description\\\":\\\"The fith quick brown fox jumps over the lazy dog.\\\"}]}},\\\"componentTemplate\\\":{\\\"id\\\":\\\"tcm:999-111111-99\\\",\\\"title\\\":\\\"BlaBlaBla_Bla_Bla\\\"},\\\"ia_rendered\\\":\\\"data-slot-id=\\\\\\\"BlaBlaBla\\\\\\\" lang=\\\\\\\"en\\\\\\\" data-offer-id=\\\\\\\"BLABLABLABLABLABLA\\\\\\\" \\\"}\",\"Rank\":\"1\"},\"categoryName\":\"\",\"source\":\"BLA\",\"name\":\"OTHETHINGSHERE_\",\"type\":null,\"placementName\":\"tvprimary\",\"presentationOrderWitinSlot\":1,\"productDetails\":{\"computerApplicationCode\":null,\"productCode\":\"BLA\",\"productSubCode\":\"\"},\"locationProductCode\":null,\"locationProductSubCode\":null,\"priorityWithInProductAndSubCode\":null}],\"error\":null},\"custSessionAvailable\":false},\"ecprFailed\":false,\"svtException\":null}"
Hi @ITWhisperer

index=foo sourcetype=json_foo source="az-foo"
|rename tags.envi as env
|search env="*A00001*" OR env="*A00002*" OR env="*A00005*" OR env="*A00020*"
|table env

These are the fields I am using:
env="*A00001*" as "PBC"
env="*A00002*" as "PBC"
env="*A00005*" as "KCG"
env="*A00020*" as "TTK"

From this SPL, I am trying to create a table like:
------------------------------------------------------
PBC           |            KCG           |           TTK
-------------------------------------------------------
all values    |        all values        |        all values
count         |        count             |        count
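As an aside, the mapping from env substrings to the PBC/KCG/TTK labels amounts to a substring-to-label lookup plus a count per label. A rough Python sketch with made-up env values (only the A00001/A00002/A00005/A00020 codes come from the post above):

```python
# Substring-to-label mapping taken from the field list above
label_for = {"A00001": "PBC", "A00002": "PBC", "A00005": "KCG", "A00020": "TTK"}

# Made-up env values for illustration
envs = ["x-A00001-1", "y-A00002-2", "z-A00005-3", "w-A00020-4", "q-A00001-5"]

# Count matches per label, so two env codes can share one PBC column
counts = {}
for env in envs:
    for key, label in label_for.items():
        if key in env:
            counts[label] = counts.get(label, 0) + 1

print(counts)  # {'PBC': 3, 'KCG': 1, 'TTK': 1}
```

In SPL terms this is roughly what a case()-style eval of a label field followed by stats count by label would produce, with the labels then pivoted into columns.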
Thank you for that suggestion.  Now I'm even more confused.  The events are coming in as sourcetype=cef, and there are a lot more differences than I would have expected, including what's in system/default...  I've got some digging to do.
Thanks @isoutamo. Yes, I updated the email password via the GUI and rebooted Splunk, but the problem is still present.
Quite probably your encryption key has changed? You could try updating the old email password via the GUI so that it gets encrypted again.