All Posts



There is no API that will provide every event Splunk receives.  Splunk does not want to make it easy to transition to a different product.  To use the API, you'll have to run a search (perhaps a real-time search) and collect the events from the search results. Depending on the other tool, you may be able to use Ingest Actions to fork the data to S3 where the other tool may be able to pick them up.
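To illustrate collecting events via a search over REST: a minimal Python sketch, assuming a reachable management port and hypothetical host details. The `/services/search/jobs/export` endpoint streams results as the search runs; this sketch only builds the request, leaving authentication and transport to your HTTP client of choice.

```python
from urllib.parse import urlencode

# Hypothetical deployment details -- adjust for your environment.
BASE = "https://splunk.example.com:8089"

def export_request(search, output_mode="json"):
    """Build the URL and POST body for Splunk's streaming export
    endpoint, /services/search/jobs/export.  SPL sent over REST must
    start with a generating command such as `search`."""
    if not search.lstrip().startswith(("search ", "|")):
        search = "search " + search
    url = BASE + "/services/search/jobs/export"
    body = urlencode({"search": search, "output_mode": output_mode})
    return url, body

url, body = export_request("index=main earliest=-5m")
```

POSTing `body` to `url` with basic auth (via `urllib.request`, `requests`, or curl) streams one JSON object per result row.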
It appears no props.conf has been created; I'll talk more with the admin. As for the raw data, it's a single multi-line event:

TimeStamp Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value
Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value
Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value

But the emulation ignores that TimeStamp:

| makeresults
| eval data=split("Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value", " ")
| mvexpand data
| rename data AS _raw
``` emulation assuming Splunk "forgets" to extract ```
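Outside SPL, the split-then-mvexpand step in that emulation amounts to splitting the raw text on spaces and parsing each chunk's key=value pairs. A small Python sketch with invented sample values:

```python
# Invented sample: space-separated component chunks, comma-separated k=v pairs
raw = ("Component=F_Type_1,Section_5=V1 "
       "Component=F_Type_2,Section_5=V2 "
       "Component=F_Type_3,Section_5=V3")

# split -> mvexpand analogue: one record per space-separated chunk
records = [dict(kv.split("=", 1) for kv in chunk.split(","))
           for chunk in raw.split(" ")]
```

Each element of `records` plays the role of one expanded `_raw` row with its fields already extracted.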
Hello, yuanliu. Thank you for reaching out. While I agree that the excerpt I posted is indeed JSON, the full _raw has much more text, and a lot of cleanup would be necessary before spath could be useful. Considering my limited experience with Splunk at this point, it would be much more difficult to figure out which errors are caused by my shortcomings and which are caused by the need to prep _raw for spath to work its magic.
You do not need to remove the timestamp per se.  Just let us know whether the mock data is a single, multi-line event (emulation 2) or multiple events (emulation 1).
Even so, your code will be more robust and much more maintainable if you don't treat JSON data as text.  The mock data looks too much like an excerpt from compliant JSON, but part of the object contains an embedded escaped JSON string, hence you want some special handling. If you can post complete mock data with the original structure, you will see that there is nothing that Splunk's QA-tested spath command cannot handle.
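The "embedded escaped JSON string" case mentioned above is a two-pass decode problem: the outer parse leaves the inner document as a plain string, and a second parse is needed — roughly what running spath again over the extracted field does. A minimal Python sketch with invented field names:

```python
import json

# An object whose "payload" value is itself a JSON document,
# stored as an escaped string (invented example).
outer_text = '{"payload": "{\\"status\\": \\"ok\\"}"}'

outer = json.loads(outer_text)        # first pass: payload is still a str
inner = json.loads(outer["payload"])  # second pass: decode the embedded JSON
```

After the first pass `outer["payload"]` is an ordinary string; only the second `loads` yields the nested structure.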
I'll see if I can remove the time stamps in the raw data, since it is causing parsing issues.
I've been searching for the same answer, as Splunk ES is limiting in this regard.  Most of our other tools are found elsewhere - to expedite review or mitigation, it would be very helpful to add a link in the next steps to say go to the EDR, the Proofpoint server, O365, etc., vs. the SOC analyst needing to fumble through his/her bookmarks.  If this doesn't exist, I sure hope it's on the roadmap.
For one single _raw event, it would be the following:

eval _raw="TimeStamp Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value"

My apologies, I didn't include the TimeStamp since it didn't appear important when evaluating the data. Still trying to figure out the lingo for Splunk.
There is now a conflict between the corrected mock data and the emulation pseudo code.  The former seems to imply that Component contains what you want as Field-Type, but the latter directly uses Field-Type as the field name. Let's take baby steps.  First, can you confirm that your _raw events look like, or contain, something like the following emulation? In other words, is the mock data you give emulating _raw?

| makeresults
| eval data=split("Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value", " ")
| mvexpand data
| rename data AS _raw
``` emulation assuming Splunk "forgets" to extract ```

_raw | _time
Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value | 2024-02-14 11:10:02
Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value | 2024-02-14 11:10:02
Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value | 2024-02-14 11:10:02

(See how similar this is to my previous emulation? You can simply adopt the formula with the field names.)  Whether you use a forwarder or some other mechanism to ingest data is not a factor in Splunk extraction.  But if Splunk does NOT give Component and Section_5, you should dig deeper with your admin.  Maybe post the props.conf that contains this source type.  You can always run | extract with _raw, but it would be so much better if you don't have to.

Or, do you mean all these 3 (and more) lines form one single _raw event?

TimeStamp Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value
Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value
Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value

In other words, does this emulation better resemble your _raw events?
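For reference, the key=value extraction that | extract (and Splunk's automatic search-time extraction) performs on such a line can be sketched in Python; the regex here is a naive stand-in, not Splunk's actual extraction logic:

```python
import re

# One raw chunk in the key=value,key=value shape from the mock data
raw = "Component=F_Type_1,Section_1=a,Section_5=F_Type_1_Section_5_Value"

# Naive extraction: word characters, '=', then everything up to the next comma
fields = dict(re.findall(r"(\w+)=([^,]+)", raw))
```

The resulting dict mirrors the search-time fields you would expect Splunk to surface for that event.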
| makeresults
| eval _raw="TimeStamp Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value"

_raw | _time
TimeStamp Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value | 2024-02-14 11:20:05
Hi @Fernando.Moreira, I was informed that is correct. 
Hello All, I have the below SPL to compare hourly event data and indexed data, to find whether they follow a similar pattern and whether there is a big gap.

| tstats count where index=xxx sourcetype=yyy BY _indextime _time span=1h
| bin span=1h _indextime as itime
| bin span=1h _time as etime
| eventstats sum(count) AS indexCount BY itime
| eventstats sum(count) AS eventCount BY etime
| timechart span=1h max(eventCount) AS event_count max(indexCount) AS index_count

However, when I compare hourly results, I get more data points in the indexed data than in the event data. Can you please guide me to resolve the problem? Thank you, Taruchit
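One reason the index-side series can have more buckets: an event's _indextime can fall in a later hour than its _time, so late-arriving events spread the same count over more index-hour buckets. A toy Python sketch with invented timestamps:

```python
from collections import Counter

# (_time, _indextime) pairs in epoch seconds; the second event is
# indexed ~30 minutes after it occurred, crossing an hour boundary.
events = [
    (10 * 3600 + 5, 10 * 3600 + 6),
    (10 * 3600 + 50 * 60, 11 * 3600 + 20 * 60),
]

hour = lambda ts: ts // 3600

event_hours = Counter(hour(t) for t, _ in events)  # both fall in hour 10
index_hours = Counter(hour(i) for _, i in events)  # spread over hours 10 and 11
```

Here the event-time series has one bucket but the index-time series has two, which is the asymmetry described above.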
@ITWhisperer No, I mean I already tried what you suggested. From below, under field PBC the field values come under A00001 and A00002, and likewise for KCG and TTK, so these can show up in the table like above.

env="*A00001*" as "PBC"
env="*A00002*" as "PBC"
env="*A00005*" as "KCG"
env="*A00020*" as "TTK"
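The code-to-label mapping described here can be prototyped outside SPL. A Python sketch using wildcard matching — the codes and labels are taken from the post, but the matching logic is an assumption about the intent:

```python
from fnmatch import fnmatch

# wildcard pattern -> display name, mirroring env="*A00001*" as "PBC"
LABELS = {
    "*A00001*": "PBC",
    "*A00002*": "PBC",
    "*A00005*": "KCG",
    "*A00020*": "TTK",
}

def label(env):
    """Return the display name for the first matching pattern,
    or the raw env value if nothing matches."""
    for pattern, name in LABELS.items():
        if fnmatch(env, pattern):
            return name
    return env
```

In SPL the same idea is usually expressed with a `case()` inside `eval`, so the `stats count by` runs over the friendly names instead of the raw env strings.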
My forwarder refuses to connect to the manager over 8089.
- The firewall is allowing traffic
- set deploy-poll is working
And yet I cannot see the connection even being attempted via netstat on the Splunk universal forwarder (*nix). UF ---> HF. Here is my deploymentclient.conf:

[deployment-client]
[target-broker:deploymentServer]  #this was part of default after command was run
deploymentServer=x.x.x.x:8089
targetUri = 10.1.10.69:8089  #this was part of default after command was run
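For comparison, a minimal working deploymentclient.conf usually needs only a targetUri in the target-broker stanza; the separate deploymentServer= line does not appear to be a documented key there and may be ignored. A sketch, assuming 10.1.10.69:8089 really is the deployment server:

```
[deployment-client]

[target-broker:deploymentServer]
targetUri = 10.1.10.69:8089
```

After editing, restart the forwarder and check splunkd.log for DeployedApplication/DC:DeploymentClient messages to confirm the phone-home attempts.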
Hi @Jahnavi.Vangari, I just had a look at your ticket, and Support suggests this would need to be a feature request, which you can submit on the Idea Exchange. 
Hi @Vinit.Jain, Thanks for asking your question on the Community. I did some searching and found this AppD Docs page that may help. https://docs.appdynamics.com/appd/21.x/21.12/en/application-monitoring/install-app-server-agents/php-agent/install-the-php-agent/install-the-php-agent-by-shell-script If that doesn't lead to anything, you may want to contact AppD Support for more help. How do I submit a Support ticket? An FAQ 
For clarification, I am currently using the SplunkForwarder to monitor a custom log file which auto-updates every 10 seconds. This custom log file is monitored on multiple hosts. After looking at my previous example, I see I incorrectly stated the data format; this is the correct data structure displayed in Splunk:

TimeStamp Component=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value
Component=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value
Component=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value

There are 4 other sections, but for brevity, the only value I am getting from each component is the last section. If there is a better way of structuring the data so Splunk can auto-detect the new fields, rather than using regex extraction, that would be wonderful. Splunk will be getting the latest entry for each host:

Host | Component | Value | Time
Unique_Host_1 | F_Type_1 | F_Type_1_Section_5_Value | 00:00:00
Unique_Host_1 | F_Type_2 | F_Type_2_Section_5_Value | 00:00:00
Unique_Host_1 | F_Type_3 | F_Type_3_Section_5_Value | 00:00:00
Unique_Host_2 | F_Type_1 | F_Type_1_Section_5_Value | 00:00:00
Unique_Host_2 | F_Type_2 | F_Type_2_Section_5_Value | 00:00:00
Unique_Host_2 | F_Type_3 | F_Type_3_Section_5_Value | 00:00:00
.....

The Splunk table creation would be something like this:

index="hosts" sourcetype="logname"
| eval data=split("Field-Type=F_Type_1,.....,Section_5=F_Type_1_Section_5_Value Field-Type=F_Type_2,.....,Section_5=F_Type_2_Section_5_Value Field-Type=F_Type_3,.....,Section_5=F_Type_3_Section_5_Value", " ")
| stats latest(data) by host
| mvexpand data
| rename Section_5 AS Value
| extract
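The "latest entry per host" step (what something like `stats latest(...) BY host, Component` computes) can be sketched in Python with invented rows:

```python
# (host, component, value, timestamp) rows -- invented sample data
rows = [
    ("Unique_Host_1", "F_Type_1", "old_value", 100),
    ("Unique_Host_1", "F_Type_1", "new_value", 200),
    ("Unique_Host_2", "F_Type_1", "other_value", 150),
]

# keep only the most recent value per (host, component)
latest = {}
for host, comp, value, ts in rows:
    key = (host, comp)
    if key not in latest or ts > latest[key][1]:
        latest[key] = (value, ts)
```

Each `(host, component)` key ends up holding exactly one row, the one with the highest timestamp, which is the shape of the desired table.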
That worked perfectly!
Do you mean something like this

index=foo sourcetype=json_foo source="az-foo"
| rename tags.envi as env
| search env="*A00001*" OR env="*A00002*" OR env="*A00005*" OR env="*A00020*"
| stats count by env
I need to list which data sources have data models. I tried a few ways, but none of them were effective. Can you help me, please? Best regards, Valderlúcio.
I am not quite sure I follow which string you want to extract, but try something like this | rex field=_raw "next\_best\_thing.+?description(?<NBT>.+?)}" Note, the .+ that you used is greedy, by ad... See more...
I am not quite sure I follow which string you want to extract, but try something like this

| rex field=_raw "next\_best\_thing.+?description(?<NBT>.+?)}"

Note, the .+ that you used is greedy; by adding a question mark, .+? reduces the amount consumed by the anchor after _thing until it reaches the next description (not the last description, as the greedy match does).
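The greedy-vs-lazy difference is easy to demonstrate outside Splunk. A Python sketch with an invented sample string containing two "description" markers:

```python
import re

s = 'next_best_thing xx description:"first"} description:"last"}'

# greedy .+ takes as much as possible, so it anchors on the LAST "description"
greedy = re.search(r"next_best_thing.+description(.+)}", s).group(1)

# lazy .+? expands minimally, so it stops at the FIRST "description"
lazy = re.search(r"next_best_thing.+?description(?P<NBT>.+?)}", s).group("NBT")
```

The greedy variant captures the text after the last marker, the lazy one the text after the first, which is exactly the behavior change the added question marks produce in the rex above.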