All Posts


Could you please share your current otel config with us?
Hi @karthi2809, Can you please try the below with eventstats?

index="mulesoft" applicationName="s-concur-api" environment=PRD
| eventstats values(tracePoint) as TracePoints values(message) as Messages by correlationId
| search TracePoints="EXCEPTION" Messages!="*(SUCCESS)*"
| fields - TracePoints - Messages
| transaction correlationId
| rename timestamp as Timestamp correlationId as CorrelationId tracePoint as TracePoint content.ErrorType as Error content.errorType as errorType content.errorMsg as ErrorMsg content.ErrorMsg as errorMsg
| eval ErrorType=if(isnull(Error),"Unknown",Error)
| dedup CorrelationId
| eval errorType=coalesce(Error,errorType)
| eval Errormsg=coalesce(ErrorMsg,errorMsg)
| table CorrelationId, Timestamp, applicationName, locationInfo.fileName, locationInfo.lineInFile, errorType, message, Errormsg
| sort -Timestamp
Thanks! I have tried your setting, and unfortunately it still doesn't work. I have also discovered that XML data from Sysmon has the exact same problem: it won't pick up the time from the expected field in the XML data. The Sysmon stanza looks like this, and it matches the text in the XML:

[source::XmlWinEventLog:Microsoft-Windows-Sysmon/Operational]
TIME_PREFIX = <Data Name='UtcTime'>
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N
TZ = UTC

I have used btool to look for any other stanzas that could cause this, for example for the common xmlwineventlog sourcetype, but haven't found anything. Tips for debugging this are welcome!
Check out: Route and filter data - Splunk Documentation If you have more specific questions about your data just ask.
How to keep specific events and discard the rest in props.conf and transforms.conf

We are receiving a large amount of data which is onboarded to Splunk via tar files. We don't require monitoring all the events; we only need certain events with specific data to be monitored, and the rest of the files/sources need to be sent to the nullQueue. Please give me some insights on it. Thanks in advance.
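A common pattern for this (a sketch only; the sourcetype name and the keep regex here are assumptions to adapt to your data) is to discard everything first, then route the events you want back to the indexQueue. Transforms listed on one TRANSFORMS- line run in order, so the later transform wins for matching events:

```
# props.conf
[your_sourcetype]
TRANSFORMS-routing = discard_all, keep_wanted

# transforms.conf
[discard_all]
# match every event and send it to the nullQueue
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[keep_wanted]
# hypothetical pattern -- events matching this are routed back for indexing
REGEX = ERROR|LOGIN_FAILURE
DEST_KEY = queue
FORMAT = indexQueue
```

Note that these are index-time transforms, so they must be deployed on the parsing tier (indexers or heavy forwarders), not on universal forwarders.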
@tscroggins thanks for the steer. I'm close to getting this working, but when I implement the transform it drops my event. The event line looks as follows:

SOMEDATA NO_CLIENT_SITE: MYSYSTEM 10.15.37.48

My props.conf is as follows:

[netlogon]
DATETIME_CONFIG =
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Custom
pulldown_type = 1
TRANSFORMS-netlogon_send_to_nullqueue = netlogon_send_to_nullqueue

My transforms.conf:

[netlogon_send_to_nullqueue]
REGEX = ^(?!NO_CLIENT_SITE).
DEST_KEY = queue
FORMAT = nullQueue

Is it the regex at fault here? I have been playing with it at regex101 (build, test, and debug regex) but I cannot see the issue.
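For reference, a likely explanation (a sketch, assuming the goal is to keep only events that contain NO_CLIENT_SITE): the `^` anchor means the lookahead only tests the very start of the line, so an event beginning with SOMEDATA does not start with NO_CLIENT_SITE, the negative lookahead succeeds, and the event is discarded. A contains-style check instead drops only events with no NO_CLIENT_SITE anywhere:

```
# transforms.conf
[netlogon_send_to_nullqueue]
# discard any event that does NOT contain NO_CLIENT_SITE anywhere in the line
REGEX = ^(?:(?!NO_CLIENT_SITE).)*$
DEST_KEY = queue
FORMAT = nullQueue
```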
Hi @uagraw01, You can also use Ingest Actions on UI. https://docs.splunk.com/Documentation/Splunk/9.2.1/Data/DataIngest#Mask_with_regular_expression  
rex has a mode option which can be set to sed to allow for edits to strings (rex - Splunk Documentation). props.conf has SEDCMD- stanzas which can do the editing before indexing (props.conf - Splunk Documentation).
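As a quick sketch of both (the field pattern and sourcetype name here are hypothetical):

```
# Search-time edit with rex in sed mode:
... | rex mode=sed field=_raw "s/password=\S+/password=########/g"

# Index-time edit in props.conf, applied before indexing:
[your_sourcetype]
SEDCMD-mask_password = s/password=\S+/password=########/g
```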
Hello, I could not find a clear answer. We have a setup where we run an IIS server on a Windows virtual machine. On the IIS server we run a PHP webshop that makes calls to different databases and external services. Does your Observability system work out of the box on the PHP webshop, or is this not supported? The reason for the question is that some monitoring solutions, such as AppDynamics and New Relic, do not support that setup. The question is mainly to know whether we should start moving the setup to a different tech stack or whether we can wait a little.
Assuming your ingest has already parsed your timestamp into the _time field, then you can just format that to get the time | eval Time=strftime(_time, "%I:%M %p")
Hello Splunkers!! Below is a sample event in which I want to mask the UserID and Password fields. There are no selected or interesting fields available, so I want to mask them in the raw event directly. Please suggest a solution from the UI using the rex sed mode command, and a second solution using props.conf & transforms.conf from the backend.

Sample log:

<?xml version="1.0" encoding="UTF-8"?> <HostMessage><![CDATA[<?xml version="1.0" encoding="UTF-8" standalone="no"?><UserMasterRequest><MessageID>25255620</MessageID><MessageCreated>2024-04-05T07:00:55Z</MessageCreated><OpCode>CHANGEPWD</OpCode><UserId>pnkof123</UserId><Password>Summer123</Password><PasswordExpiry>2024-06-09</PasswordExpiry></UserMasterRequest>]]><original_header><IfcLogHostMessage xsi:schemaLocation="http://vanderlande.com/FM/Gtw/GtwLogging/V1/0/0 GtwLogging_V1.0.0.xsd"> <MessageId>25255620</MessageId> <MessageTimeStamp>2024-04-05T05:00:55Z</MessageTimeStamp> <SenderFmInstanceName>CMP_GTW</SenderFmInstanceName> <ReceiverFmInstanceName>FM_BPI</ReceiverFmInstanceName>   </IfcLogHostMessage></original_header></HostMessage>
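One possible shape for both approaches (a sketch only; the sourcetype name is an assumption, and the sed patterns simply target the `<UserId>` and `<Password>` tags shown in the sample):

```
# Search-time (UI), masking the tag contents with rex in sed mode:
... | rex mode=sed field=_raw "s/<UserId>[^<]*<\/UserId>/<UserId>XXXX<\/UserId>/g"
    | rex mode=sed field=_raw "s/<Password>[^<]*<\/Password>/<Password>XXXX<\/Password>/g"

# Index-time, in props.conf on the parsing tier (indexer or heavy forwarder):
[your_sourcetype]
SEDCMD-mask_userid = s/<UserId>[^<]*<\/UserId>/<UserId>XXXX<\/UserId>/g
SEDCMD-mask_password = s/<Password>[^<]*<\/Password>/<Password>XXXX<\/Password>/g
```

Note that SEDCMD masks data before it is written to the index, while rex mode=sed only rewrites the results of that one search.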
It looks like you are excluding all the message=SUCCESS events, so you will never see them in the transaction data. If you want to include them, you will need to remove that message!="*(SUCCESS)*" constraint. Then your transaction will have the SUCCESS event included, and at that point you can filter out those events that have both succeeded and then failed. However, you will need to take care of ordering: you know your data, but can the SUCCESS come AFTER the fail?
You can also use the "fixedrange=f" setting in the timechart command.
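For example (the index and span here are assumptions):

```
index=your_index
| timechart span=1h count fixedrange=f
```

With fixedrange=f, timechart does not pad the time axis out to the full selected time range, so the chart only spans the buckets that actually contain data.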
Hi! I did uninstall the old version, cleaned up the folders and registry, and disabled McAfee antivirus too. I did the installation again and still hit the same issue. I then tried to install the old version back; it now encounters the same issue as the new version installation. My OS is Windows Server 2016 Standard.
Hello @anandhalagaras1  I believe that while changing the default value directly may not be possible, we can still achieve the desired outcome. Instead of adjusting the default setting, we can create a scheduled search with the preferred value of 20. This means that whenever the search is scheduled to run, it will automatically use the desired setting without needing to be manually adjusted each time. This ensures a consistent experience for users without worrying about the default value being reset. If this reply helps you, Karma would be appreciated.
Hi @whitecat001, you can use the regex hinted at by @ITWhisperer, but when you have a fieldname=fieldvalue pair, as in your case, you could simply use the field for searching your data:

index=your_index permission=Permission12345
| ...

Ciao. Giuseppe
Hello Lily, the two charts look different because of the time ranges used. The first chart comes from a Splunk search, which only draws a line on the chart where data exists. The second chart is from Dashboard Studio, which shows a timeline for the "entire period" even where there is no data. To make chart 2 look like chart 1, set its time range to match the data.

Setting the time range to match the data:

(I don't speak Korean well, so I had to use a translator.)
I would appreciate it if you could share some examples.
Hi @theprophet01, If you have an itoa_admin role, you can export services, entities, glass tables, KPI searches, templates, etc. from the following menu: Configuration > Backup/Restore. Click Create Job > Create Backup Job. Select Partial Backup, give it a name and description, uncheck "include conf files", then click Next. Select the services you'd like to back up, then click Save and Backup. You will be taken to the Backup/Restore jobs page, where your job will be queued. When it's finished, usually after a few minutes, you can download the backup as a zip. Go to the same page on a different Splunk instance to restore it; this time select Restore Job and upload the zip file. See the capabilities here: https://docs.splunk.com/Documentation/ITSI/4.18.1/Configure/Capabilities You'll need the ones listed under "Backup/Restore", which by default are only given to itoa_admin.
Data Summary is not showing the host at all, even though I have already added a UDP input on port 514 with the IP address.