All Topics

Situation: I have a panel. The panel creates a token for me from a field I extract from the search, and in the same panel I create a URL that enables me to "drilldown". The relevant fragments of the panel XML:

Open Tickets search... | table "ticket_num", "Creation Date", "other_field", "other_field"
$field3.earliest$ $field3.latest$ $result.ticket_num$ 5 none row true
<drilldown>
  <eval>replace($row.url$, "http://", "")</eval>
  <link target="_blank">
    <![CDATA[ https://other.tool/$ticket_num$ ]]>
  </link>
</drilldown>
</table>
</panel>

My table appears to work nicely, where each row represents a unique open ticket. Problem: all rows, upon clicking, "drilldown" only to the first row's "ticket_num", which I think makes sense. Question: can you please share whether it's possible, with ES Simple XML and without any other add-on, to modify the panel so that, upon clicking a row, I "drilldown" to the respective row's unique "ticket_num"?
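A minimal sketch of one way this is usually handled: in a Simple XML table drilldown, $row.fieldname$ resolves to the clicked row's value, while $result.fieldname$ always resolves to the first result row. The panel below is an illustrative skeleton, not the poster's actual panel (the search and target URL are placeholders taken from the question):

```xml
<table>
  <search>
    <query>Open Tickets search... | table ticket_num, "Creation Date"</query>
  </search>
  <option name="count">5</option>
  <drilldown>
    <!-- $row.ticket_num$ picks up the value from the row that was clicked -->
    <link target="_blank">
      <![CDATA[ https://other.tool/$row.ticket_num$ ]]>
    </link>
  </drilldown>
</table>
```

If the link is built via an `<eval>` element, the same substitution applies: use $row.ticket_num$ rather than $result.ticket_num$ inside the expression.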
We get a FIPS compliance error when upgrading to Enterprise Security 6.1.0, although FIPS is not enabled in our environment. We started on Enterprise 7.1.2 and ES 5.3.0, upgraded to Enterprise 8.0.2.1 first, and then upgraded to ES 6.1.0 (as we understand it, this path should be supported).

-bash-4.2$ splunk show fips-mode -auth admin:passwd
FIPS mode disabled.

Following the Splunk Enterprise Security post-install configuration (https://docs.splunk.com/Documentation/ES/6.1.0/Install/Upgradetonewerversion), step 4.4 fails with:

Error in 'essinstall' command: postinstall failed - Error updating FIPS compliance settings. See search.log for details.

Extract from search.log below:

04-01-2020 16:56:21.052 ERROR ChunkedExternProcessor - stderr: msg="Error updating FIPS compliance settings."
04-01-2020 16:56:21.052 ERROR ChunkedExternProcessor - stderr: Traceback (most recent call last):
04-01-2020 16:56:21.052 ERROR ChunkedExternProcessor - stderr: File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/deploy_fips_compliant_settings.py", line 142, in deployFips
04-01-2020 16:56:21.052 ERROR ChunkedExternProcessor - stderr: incident_review_lookup_empty = isLookupEmpty(IR_LOOKUP, IR_APP, DEFAULT_OWNER, key)
04-01-2020 16:56:21.052 ERROR ChunkedExternProcessor - stderr: File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/deploy_fips_compliant_settings.py", line 74, in isLookupEmpty
04-01-2020 16:56:21.052 ERROR ChunkedExternProcessor - stderr: for lineno, unused_line in enumerate(open(path, 'r', newline=None)):
04-01-2020 16:56:21.052 ERROR ChunkedExternProcessor - stderr: File "/opt/splunk/lib/python3.7/codecs.py", line 322, in decode
04-01-2020 16:56:21.052 ERROR ChunkedExternProcessor - stderr: (result, consumed) = self._buffer_decode(data, self.errors, final)
04-01-2020 16:56:21.052 ERROR ChunkedExternProcessor - stderr: UnicodeDecodeError: 'utf-8' codec can't decode byte 0xa3 in position 37: invalid start byte
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr:
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: Traceback (most recent call last):
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/deploy_fips_compliant_settings.py", line 142, in deployFips
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: incident_review_lookup_empty = isLookupEmpty(IR_LOOKUP, IR_APP, DEFAULT_OWNER, key)
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/deploy_fips_compliant_settings.py", line 74, in isLookupEmpty
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: for lineno, unused_line in enumerate(open(path, 'r', newline=None)):
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: File "/opt/splunk/lib/python3.7/codecs.py", line 322, in decode
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: (result, consumed) = self._buffer_decode(data, self.errors, final)
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: UnicodeDecodeError: 'utf-8' codec can't decode byte 0xa3 in position 37: invalid start byte
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: During handling of the above exception, another exception occurred:
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: Traceback (most recent call last):
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/essinstaller2.py", line 331, in _postinstall
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: deployFips(session_key, logger=self.logger)
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/deploy_fips_compliant_settings.py", line 165, in deployFips
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: raise Exception('Error updating FIPS compliance settings.')
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: Exception: Error updating FIPS compliance settings.
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: During handling of the above exception, another exception occurred:
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: Traceback (most recent call last):
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/essinstall.py", line 243, in do_install
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: output = fn(session_key, True)
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/essinstaller2.py", line 81, in wrapper
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: r = f(self, *args, **kwargs)
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/essinstaller2.py", line 571, in stage_postinstall
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: self._postinstall(session_key)
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/essinstaller2.py", line 335, in _postinstall
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: raise InstallException(str(e))
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - stderr: install.app_install_utils.InstallException: Error updating FIPS compliance settings.
04-01-2020 16:56:21.053 ERROR ChunkedExternProcessor - Error in 'essinstall' command: postinstall failed - Error updating FIPS compliance settings.

We have restarted, selected to enable all Technology Add-ons, and also tried disabling them all. The error message is always the same.
A sample from essinstaller2.log which might give a hint:

2020-04-01 14:56:21,051+0000 ERROR pid=6614 tid=MainThread file=deploy_fips_compliant_settings.py:deployFips:164 | msg="Error updating FIPS compliance settings."
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/deploy_fips_compliant_settings.py", line 142, in deployFips
    incident_review_lookup_empty = isLookupEmpty(IR_LOOKUP, IR_APP, DEFAULT_OWNER, key)
  File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/deploy_fips_compliant_settings.py", line 74, in isLookupEmpty
    for lineno, unused_line in enumerate(open(path, 'r', newline=None)):
  File "/opt/splunk/lib/python3.7/codecs.py", line 322, in decode
    (result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xa3 in position 37: invalid start byte
2020-04-01 14:56:21,052+0000 ERROR pid=6614 tid=MainThread file=essinstall.py:do_install:261 | Traceback (most recent call last):
  File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/deploy_fips_compliant_settings.py", line 142, in deployFips
    incident_review_lookup_empty = isLookupEmpty(IR_LOOKUP, IR_APP, DEFAULT_OWNER, key)
  File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/deploy_fips_compliant_settings.py", line 74, in isLookupEmpty
    for lineno, unused_line in enumerate(open(path, 'r', newline=None)):
  File "/opt/splunk/lib/python3.7/codecs.py", line 322, in decode
    (result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xa3 in position 37: invalid start byte
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/essinstaller2.py", line 331, in _postinstall
    deployFips(session_key, logger=self.logger)
  File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/deploy_fips_compliant_settings.py", line 165, in deployFips
    raise Exception('Error updating FIPS compliance settings.')
Exception: Error updating FIPS compliance settings.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/essinstall.py", line 243, in do_install
    output = fn(session_key, True)
  File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/essinstaller2.py", line 81, in wrapper
    r = f(self, *args, **kwargs)
  File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/essinstaller2.py", line 571, in stage_postinstall
    self._postinstall(session_key)
  File "/opt/splunk/etc/apps/SplunkEnterpriseSecuritySuite/bin/install/essinstaller2.py", line 335, in _postinstall
    raise InstallException(str(e))
install.app_install_utils.InstallException: Error updating FIPS compliance settings.
I have three columns in my search output plus a date, e.g. date, col1, col2, col3. Date will be the X-axis. The column chart shows bars for the value of col1 by date, and I want to overlay col2 and col3 on this column chart. At present I can overlay the col2 series on the chart, but I cannot figure out how to overlay the third series (col3) on the same column chart. Please help with this.

The attached image has the details (query output and the graph I want to show). The bars represent the average number of hours users worked on each date, and the yellow line represents the same thing (I used appendcols with the same search to overlay the value on the bars), so that it shows the trend. I want to add a third overlay (the red line) that shows the total number of users (user_count) on that particular day. user_count and "Avg User Session Time" should both be overlaid on the bars.

Also, I am not able to sort the dates properly: April values show first and then March values.

Below is my query:

index=paloalto sourcetype="syslog" userid!="tem*" zone="VPN"
| eval date=date_mday+" - "+date_month
| stats earliest(_time) as earliest latest(_time) as latest by userid,status,date
| sort date
| where status="login"
| eval duration=latest-earliest
| stats avg(duration) as duration dc(userid) as user_count by date
| eval duration=round(duration/60/60,0)
| rename date as Date duration as "Avg User Session Time (Hrs)"
| appendcols [ search index=paloalto sourcetype="syslog" userid!="tem*" zone="VPN"
  | eval date=date_mday+" - "+date_month
  | stats earliest(_time) as earliest latest(_time) as latest by userid,status,date
  | sort date
  | where status="login"
  | eval duration=latest-earliest
  | stats avg(duration) as duration by date
  | eval duration=round(duration/60/60,0)
  | rename date as Date duration as "Avg User Session Time" ]

[Attached image: /storage/temp/286728-overlay.jpg, showing the query output and the desired chart]
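A sketch of one way to simplify this: a single stats pass can emit all three series, so the appendcols is not needed, and binning on _time instead of building a "day - month" string makes the X-axis sort chronologically (the string "1 - April" sorts before "1 - March" lexically, which is why April showed first). Field names below follow the question; the approach itself is a sketch, not a verified rewrite of the original data:

```
index=paloalto sourcetype="syslog" userid!="tem*" zone="VPN" status="login"
| bin _time span=1d
| stats earliest(_time) as earliest latest(_time) as latest by userid _time
| eval duration=round((latest-earliest)/3600,0)
| stats avg(duration) as "Avg User Session Time (Hrs)" dc(userid) as user_count by _time
| eval "Avg User Session Time"='Avg User Session Time (Hrs)'
```

For the visualization, the Simple XML option charting.chart.overlayFields accepts a comma-separated list, so setting it to "Avg User Session Time","user_count" should draw both lines over the col1 bars, which is the part that is hard to reach from the format menu alone.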
Hi,

We are deploying Splunk Enterprise in AWS and want to know how, and for which components, SSL should be configured. A few points about our cluster; we have to work within these constraints:

- There are no forwarders (I see Splunk recommends using forwarders, but we chose another route), and therefore no deployment server.
- HEC is enabled on the indexers, and our Java-based application sends data to the HEC indexers.
- Our company provides all required certs for SSL, and we have to use these certs.
- Our sample cluster is roughly: 3 search heads in a SHC, 1 cluster/license master, 7 indexers in an indexer cluster, and a deployer.

Here are my questions about securing the different components of our cluster:

1. We are following https://docs.splunk.com/Documentation/Splunk/7.3.3/Security/SecureSplunkWebusingasignedcertificate to secure Splunk Web (search heads) with our own certs. Do we still need to perform this step if our search head cluster is fronted by an HTTPS load balancer? If yes, a detailed explanation would be helpful.
2. Do we need mutual TLS between the search heads in the SHC and the indexers in the indexer cluster? Since both are clusters, the search heads communicate first with the master and then with the indexers, so how can we secure the communication between search heads and indexers with our own certs?
3. How do we secure communication between our HEC indexers and the Java-based application? We plan to front the HEC indexers with an HTTPS load balancer. How do we achieve secure communication in this setup with our own certs?
4. Are there any other channels we need to secure with our own certs apart from the above?

I know this is a big list of questions, but any help here will really help us build a secure cluster. Any help is highly appreciated. Thanks in advance.
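For question 1, the answer usually depends on whether TLS should terminate at the load balancer or run end to end; if the LB-to-search-head hop must also be encrypted, Splunk Web still needs its own cert. A minimal sketch of the relevant web.conf settings on each search head (file paths are placeholders, not values from the question):

```
# web.conf on each search head (cert/key paths are assumptions)
[settings]
enableSplunkWebSSL = true
privKeyPath = /opt/splunk/etc/auth/mycerts/mySplunkWebPrivateKey.key
serverCert = /opt/splunk/etc/auth/mycerts/mySplunkWebCertificate.pem
```

Search-head-to-indexer and intra-cluster traffic are configured separately (server.conf sslConfig and the inputs/outputs SSL stanzas), and HEC has its own SSL settings on the [http] input stanza, so each of the numbered questions maps to a different .conf surface.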
Hi, I run pyagent and I get the following error:

10:50:33,311 WARN [AD Thread Pool-Global1] AgentErrorProcessor - 3 instance(s) remaining before error log is silenced
10:50:33,311 ERROR [AD Thread Pool-Global1] ConfigurationChannel - Exception: earhart2020040113170721.saas.appdynamics.com:443 failed to respond
org.apache.http.NoHttpResponseException: earhart2020040113170721.saas.appdynamics.com:443 failed to respond
    at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:141) ~[httpclient-4.5.4.jar:4.5.4]
    at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56) ~[httpclient-4.5.4.jar:4.5.4]
    at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259) ~[httpcore-4.4.9.jar:4.4.9]
    at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163) ~[httpcore-4.4.9.jar:4.4.9]
    at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:165) ~[httpclient-4.5.4.jar:4.5.4]
    at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273) ~[httpcore-4.4.9.jar:4.4.9]
    at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125) ~[httpcore-4.4.9.jar:4.4.9]
    at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272) ~[httpclient-4.5.4.jar:4.5.4]
    at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185) ~[httpclient-4.5.4.jar:4.5.4]
    at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89) ~[httpclient-4.5.4.jar:4.5.4]
    at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:111) ~[httpclient-4.5.4.jar:4.5.4]
    at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185) ~[httpclient-4.5.4.jar:4.5.4]
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:72) ~[httpclient-4.5.4.jar:4.5.4]
    at com.singularity.ee.util.httpclient.SimpleHttpClientWrapper.executeHttpOperation(SimpleHttpClientWrapper.java:289) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.util.httpclient.SimpleHttpClientWrapper.executeHttpOperation(SimpleHttpClientWrapper.java:204) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.rest.RESTRequest.sendRequestTracked(RESTRequest.java:384) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.rest.RESTRequest.sendRequest(RESTRequest.java:337) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.rest.controller.request.AControllerRequest.sendRequest(AControllerRequest.java:130) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.rest.controller.request.ABinaryControllerRequest.sendRequest(ABinaryControllerRequest.java:36) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.agent.appagent.kernel.config.xml.ConfigurationChannel.registerApplicationServer(ConfigurationChannel.java:1480) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.agent.appagent.kernel.config.xml.ConfigurationChannel.access$100(ConfigurationChannel.java:122) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.agent.appagent.kernel.config.xml.ConfigurationChannel$UnregisteredConfigurationState.nextTransition(ConfigurationChannel.java:742) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.agent.appagent.kernel.config.xml.ConfigurationChannel.refreshConfiguration(ConfigurationChannel.java:521) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.agent.appagent.kernel.config.xml.XMLConfigManager$AgentConfigurationRefreshTask.run(XMLConfigManager.java:649) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run(AgentScheduledExecutorServiceImpl.java:122) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_212]
    at com.singularity.ee.util.javaspecific.scheduler.ADFutureTask$Sync.innerRunAndReset(ADFutureTask.java:335) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.util.javaspecific.scheduler.ADFutureTask.runAndReset(ADFutureTask.java:152) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.util.javaspecific.scheduler.ADScheduledThreadPoolExecutor$ADScheduledFutureTask.access$101(ADScheduledThreadPoolExecutor.java:119) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.util.javaspecific.scheduler.ADScheduledThreadPoolExecutor$ADScheduledFutureTask.runPeriodic(ADScheduledThreadPoolExecutor.java:206) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.util.javaspecific.scheduler.ADScheduledThreadPoolExecutor$ADScheduledFutureTask.run(ADScheduledThreadPoolExecutor.java:236) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.util.javaspecific.scheduler.ADThreadPoolExecutor$Worker.runTask(ADThreadPoolExecutor.java:694) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at com.singularity.ee.util.javaspecific.scheduler.ADThreadPoolExecutor$Worker.run(ADThreadPoolExecutor.java:726) ~[proxy.jar:Proxy v${python-agent-version} GA SHA-1: DEV-BUILD Date ${TODAY}]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_212]

But I can reach AppDynamics:

telnet earhart2020040113170721.saas.appdynamics.com 443
Trying 34.208.167.103...
Connected to pdx-p-con-1501.saas.appdynamics.com.
Escape character is '^]'
What I wanted to do is a simple search in our proxy logs to find accesses to known bad domain names. Currently we do not have the threat intelligence app installed. I created a lookup table that consists of only one column, called murl, containing domain names hosting malicious sites.

| inputlookup table.csv

produces a simple list:

covidcyphers.com
covid19sci.com
suite401-covid19.com
covid-taskforce.com
titan-covid19.online

If I use that as a lookup in a search, I do not get matches, even when I use domains included in the log:

index="proxy" | eval murl=url | lookup table.csv murl AS url OUTPUTNEW murl AS new | where dst like new

(I also tried "%new%" and things like that.) I then tried to use inputlookup in a subsearch instead:

index="proxy" url!="" [inputlookup table.csv where url in(murl)]

and it told me that the in() function needs a list of strings concatenated by commas: string1,string2,string4. So I experimented with the format / return (1000 $murl) commands:

index="proxy" where url IN([inputlookup table.csv | fields murl | format "" "" "," "" "" ""])

but did not reach my goal.

Is there a way to change the inputlookup result into a comma-separated list to be used in the IN function? Or does anybody have a search command that can do a partial match against a list of values provided by a lookup table?

Thank you very much
Kai
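A sketch of a common pattern for this: instead of forcing the lookup into an IN() list, return a wildcarded url field from the subsearch, which Splunk expands into `( url="*domain1*" OR url="*domain2*" ... )` and matches as substrings. Field names follow the question; whether `url` is the right field to match in your proxy events is an assumption:

```
index="proxy" url=*
    [| inputlookup table.csv
     | eval url="*" . murl . "*"
     | fields url ]
```

An alternative is to define the lookup in transforms.conf with `match_type = WILDCARD(murl)` and store entries like `*covidcyphers.com*`, which lets the plain lookup command do the partial match.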
Hi, suppose my results retrieve 2 logs. There is a status message which is either true or false. I want the color to be set to green when both of them are true. What I want to ask: does Splunk set the color to green only when both statuses are true? What if one is true and one is false? If I want to give that kind of condition a separate color, is that possible?

index=***
| eventstats count as message_count
| eval number=message_count
| eval color=case((number)>2,"#ff9933", (status=="true" AND number==2),"#008000", (status=="false" AND number==2),"#FF0000")
| table color
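One sketch of how the mixed case can get its own color: count how many events have status true, then compare that against the total, so "all true", "all false", and "mixed" become three distinct case() branches. The fourth color value is an assumption chosen for illustration:

```
index=***
| eventstats count as message_count, count(eval(status=="true")) as true_count
| eval color=case(
    message_count>2,           "#ff9933",
    true_count==message_count, "#008000",
    true_count==0,             "#FF0000",
    1==1,                      "#FFFF00")
| table color
```

With the original case(), an event where status=="false" falls into the red branch even when the other event is true, because each event is evaluated on its own status; aggregating with true_count makes the decision per result set rather than per event.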
Hello,

My data looks like this (sender, receiver, _time):

userA, userB, _time1
userB, userC, _time2
userB, userD, _time3
userC, userD, _time4

I'd like to find the chains of users that send and receive emails. For example: userA sent to userB, who sent to userC and userD; userC sent to userD. My result would look like this:

userA, userB, userC, userD
userA, userB, userD
userB, userC, userD
userC, userD

Can you please help me? Thanks in advance.
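SPL has no recursive join, so arbitrary-depth chains are hard in one search, but a fixed number of hops can be sketched with a self-join. The example below (index and field names are assumptions) extends each sender/receiver pair by one hop; a third hop would repeat the same join pattern, and truly unbounded chains usually need `map` or an external script:

```
index=mail
| fields sender receiver
| join type=left max=0 receiver
    [ search index=mail
      | fields sender receiver
      | rename receiver AS hop2, sender AS receiver ]
| table sender receiver hop2
```

With the sample data, the row (userA, userB) picks up hop2 values userC and userD, reproducing the first two chains from the question one hop at a time.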
Hi Team,

I have written a small script (LinecountMonitor.sh) which counts the number of lines in a file. I can see the metric in the logs, but I am unable to see it in the Controller UI. A restart of the machine agent does not help. Each time I remove the script and replace it, the metric shows in the UI only once. After I update the lines in the script, the updated value shows in the logs but not in the UI. Can you please help me with this?

Below are my logs: the line count shows 17, but the same value is not displayed in the UI. The UI shows the path Custom Metrics|Linux Monitor|LineCount|TestFile but with no data.

[Log files redacted]

Thanks,
Apurva

^ Post edited by @Ryan.Paredez to remove log files. Please do not share or attach log files to community posts for security and privacy reasons.
Dear all,

I need your help parsing a file generated by Blue Coat, which contains data about our web proxy policy. The format is like this:

;; CPL generated by Visual Policy Manager: [Thu Mar 26 14:00:04 CET 2020]
;*************************************************************
; WARNING:
; THIS FILE IS AUTOMATICALLY GENERATED - DO NOT EDIT!
; ANY MANUAL CHANGES TO THIS FILE WILL BE LOST WHEN VPM
; POLICY IS REINSTALLED.
;*************************************************************
define category "Blacklisted"
  isdsdsd.com
  *sdsds.com
end category "Blacklisted"
define condition __GROUP5
  realm=admin group="admonui"
end condition __GROUP5
define condition __GROUP7
  realm=admin group="user1"
end condition __GROUP7
define condition __GROUP25
  realm=blablablabla"
end condition __GROUP25
define condition __GROUP28
  realm=bliblibli
end condition __GROUP28
;; Description:
define condition __CondList1
  url.domain="*ecurity.com" url.domain="sdsds*ecurity.com"
end condition __CondList1

It seems that the values sit between the words "define XXXXX" and "end XXXXX". We cannot predict the XXXXX, but the XXXXX after "define" and after "end" are always the same. For example:

define MY_OWN_Policy
  value1="dsdsds"
  value2="fdfdfdfd"
end MY_OWN_Policy

In addition, comments are allowed using ;; before the define statement. Do you have an idea how to parse such a format?

Regards
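A sketch of one ingestion approach: since every block starts with "define", a LINE_BREAKER with a lookahead can turn each define/end block into its own event, leaving the comment banner attached to the first block. The sourcetype name is an assumption:

```
# props.conf sketch (sourcetype name is an assumption): break the CPL file
# into one event per "define ... end ..." block
[bluecoat:cpl]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=define )
TRUNCATE = 0
```

At search time, something like `| rex "define (?<block_type>\S+) (?<block_name>[^\r\n]+)"` can then pull the block type (category, condition, ...) and its name out of each event, with the lines between define and end left in _raw for further extraction.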
Hello everyone,

I'm very new to Splunk, and I find it very different from what I have worked with so far. I am writing saved searches where I pass arguments to the search. I'm looking for a way to pass arguments to a search such that they default to a value if the parameter was not given. Something like this:

[My Search]
search = | savedsearch "Sample Search" \
argument1=$argument1$ \
argument2=$argument2$ \
argument3=if(isnull($argument3$), default_value, $argument3$)

Any advice? Thanks,
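The savedsearch command itself does not evaluate expressions in its arguments, so the if(isnull(...)) form will not work there. One workaround sketch is a pair of search macros, where the shorter-arity macro supplies the default by delegating to the full one; all names and the default value below are placeholders, not anything confirmed by Splunk docs for this exact use case:

```
# macros.conf sketch (names and default_value are placeholders)
[sample_search(3)]
args = argument1, argument2, argument3
definition = savedsearch "Sample Search" argument1="$argument1$" argument2="$argument2$" argument3="$argument3$"

[sample_search(2)]
args = argument1, argument2
definition = `sample_search($argument1$, $argument2$, default_value)`
```

Callers that know all three values use `sample_search(a, b, c)`; callers that omit the third use `sample_search(a, b)` and get default_value baked in by the two-argument variant.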
Hello All, how can we get stats on internet data utilization by multiple users for different time periods?
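The question gives no source details, so everything in this sketch (index, sourcetype, field names) is an assumption; the shape of the answer is usually a timechart summing a bytes field per user, with the span and time range picker controlling the period:

```
index=proxy
| timechart span=1d sum(bytes) as total_bytes by user
```

Changing span=1d to 1h or 1w, or adjusting earliest/latest, covers the "different time periods" part without changing the rest of the search.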
Hello,

My question is about dashboard visualization: I want to visualize my dashboard as a list, not a table. Can you help me achieve that? Thanks in advance for your cooperation.
Hi,

I want to know if there is some mechanism by which I can stop a particular kind of data, like events where segment_name="Enforced segment", from getting indexed. My inputs.conf has the following entry:

[monitor:///opt/splunk/logs/check/*/*.log]
disabled = 0
host_segment = 5
sourcetype = check_logs
index = check

Here I don't want lines to get indexed if any of the log files has this pattern in it (segment_name="Enforced segment"). Is that possible?

Thanks
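This is the standard nullQueue routing pattern: a transform matches the unwanted events and sends them to the null queue before indexing. A sketch using the sourcetype from the question (the transform name is a placeholder); note these files must live on the instance that parses the data, i.e. the indexers or a heavy forwarder:

```
# props.conf
[check_logs]
TRANSFORMS-drop_enforced = drop_enforced_segment

# transforms.conf
[drop_enforced_segment]
REGEX = segment_name="Enforced segment"
DEST_KEY = queue
FORMAT = nullQueue
```

Events whose raw text matches the REGEX are discarded and do not count against the license; everything else in those files is indexed normally.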
I would like to return all messages that contain tag 6410. Currently the search below returns all messages, even if they do not contain tag 6410:

index=gmrt_ett sourcetype=pubhub_emea TracingIncomingMessage "ET_OMS"
| search "6401=POV"
| extract pairdelim=";" kvdelim="=" clean_keys=false
| dedup _raw
| rex "6410=(?<tag6410>.*)"
| table 11, 6410
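A sketch of one way to keep only events that actually carry tag 6410: let extract create the field, then filter on its presence. The where clause uses single quotes because field names starting with a digit must be quoted that way in eval/where expressions; the rest follows the search from the question:

```
index=gmrt_ett sourcetype=pubhub_emea TracingIncomingMessage "ET_OMS" "6401=POV"
| extract pairdelim=";" kvdelim="=" clean_keys=false
| where isnotnull('6410')
| dedup _raw
| table 11, 6410
```

An equivalent filter is `| search 6410=*` after the extract; either way the filtering has to happen after the key/value extraction, since 6410 does not exist as a field before that point.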
I am looking at filtering Kafka messages in Splunk. For that I need to be able to filter which messages show up in my search. However, since I am new to Splunk, I have no idea how to filter the way I want, and two days of tutorials, reading, and messing about haven't brought me any closer to a solution. My event data looks like this:

2020-04-01 13:59:55:803 INFO [messageCoordinator] Sent Kafka message
{
  "body": {
    "id": "messageID",
    "name": "messageName",
    "channels": "affectedChannels",
    "order": messageOrder,
    ...
  },
  "headers": {
    "deltaFields": [
      "status",
      "otherString"
    ],
    "level": "messageLevel",
    "operationType": "UPDATE",
    "messageType": "messageType",
    "timestamp": 1585746000000,
    "trackingId": 6651814029
  }
}

Now I need to filter based on headers.deltaFields, which is always present and always a string array, but can have multiple values (up to 8 strings) in the array. I only want the event to show up in my search if certain strings are within the deltaFields. Any help would be highly appreciated.
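Because the JSON sits behind a log prefix, automatic extraction may not fire, so a sketch like this first captures the JSON with rex and then uses spath to explode the array into a multivalue field. The sourcetype is a placeholder:

```
sourcetype=kafka_messages "Sent Kafka message"
| rex "Sent Kafka message (?<json>\{(?s:.*)\})"
| spath input=json path=headers.deltaFields{} output=deltaFields
| search deltaFields="status"
```

Since deltaFields is multivalue after spath, `deltaFields="status"` matches when any element of the array equals "status"; for several acceptable values, `deltaFields IN("status","otherString")` works the same way.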
I have a query that looks for data from one source only if it is present in another source. It was working fine before; suddenly it stopped working, and I am not sure why. Please find the query below:

sourcetype="ms:o365:reporting:messagetrace" NOT SenderAddress=company.com NOT Status= [search index=notable source="Threat - Detect Spam Email - Rule" | fields Subject]
| stats dc(RecipientAddress) as recipientcount count by Subject SenderAddress
| where recipientcount > 10

The query should return results only if an event with the same Subject is present in the subsearch. However, it returns results even when the event is not present in the subsearch.
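Two things worth checking, sketched below: the dangling `NOT Status=` immediately before the subsearch may be swallowing the subsearch result as the Status value, and subsearches silently truncate at their default limits (roughly 10,000 results or 60 seconds), which can make a query "suddenly" stop matching as data volume grows. A cleaned-up sketch with the subsearch standing on its own as the Subject filter:

```
sourcetype="ms:o365:reporting:messagetrace" NOT SenderAddress=company.com
    [ search index=notable source="Threat - Detect Spam Email - Rule"
      | dedup Subject
      | fields Subject
      | format ]
| stats dc(RecipientAddress) as recipientcount count by Subject SenderAddress
| where recipientcount > 10
```

The dedup keeps the subsearch result set small and under the limits, and format renders it as an explicit `( Subject="..." OR ... )` clause, which makes the generated filter easy to inspect in the Job Inspector.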
Hi all,

I have to ingest Zimperium logs, which are in JSON format and very complicated. On Splunkbase there is a Zimperium app, but there is no information about log ingestion and no TA. Before I start parsing the logs: has anyone already done this? Can you give me some hints?

Thank you in advance.
Ciao.
Giuseppe
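As a starting-point sketch while waiting for anyone with Zimperium-specific experience: if the events are well-formed JSON, search-time JSON extraction often covers even deeply nested payloads without a TA. The sourcetype name is an assumption:

```
# props.conf sketch (sourcetype name is an assumption)
[zimperium:json]
KV_MODE = json
SHOULD_LINEMERGE = false
TRUNCATE = 0
```

If the data arrives via a universal forwarder, INDEXED_EXTRACTIONS = json on the forwarder is an alternative; use one or the other, not both, or fields get extracted twice. Nested keys then become dotted field names (e.g. parent.child) searchable directly or via spath.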
Hi Friends, could you please advise how to add a "Last updated" time to a dashboard page, so that viewers can tell when the page was last refreshed?
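A minimal sketch of one common approach: a small single-value panel whose search simply formats the current time, re-run on a refresh interval, so it always shows when the dashboard last refreshed. The refresh interval and time format are arbitrary choices:

```xml
<dashboard>
  <row>
    <panel>
      <single>
        <title>Last updated</title>
        <search>
          <query>| makeresults
| eval last_updated=strftime(now(), "%Y-%m-%d %H:%M:%S")
| table last_updated</query>
          <refresh>5m</refresh>
        </search>
        <option name="drilldown">none</option>
      </single>
    </panel>
  </row>
</dashboard>
```

Putting `<refresh>` on the dashboard's other searches as well keeps the timestamp honest, since each panel otherwise refreshes on its own schedule.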
index=itsi_summary source="Indicator - Shared - DA-ITSI-OS_Performance_Storage - ITSI Search"

How can we determine the drive value from the above base search query for OS Performance, Storage Free Space %? We need to add this value to the description in SNOW (ServiceNow).