All Posts

Unfortunately Splunk cannot process JSON at that level.  Also, the value in your REGEX property is not valid regex.  How rigid is the format of these events?  If it is rigid enough, you can use something as simple as

REGEX = \"binary\":\"/usr/bin/timeout\"

If there can be multiple subnodes with the key name "binary", you can build a better regex around those possibilities, all provided that the format is very rigid.  Granted, JSON format can change without altering semantics, so this is not going to be very robust.
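The original props/transforms stanzas are not shown in this thread, so whether the REGEX is meant for filtering or for extraction is a guess.  Purely as an illustration (sourcetype and stanza names are made up), if the goal were to keep only events containing that key/value pair at index time, the usual pattern would look something like this:

# props.conf -- hypothetical sourcetype
[my_json_sourcetype]
TRANSFORMS-filter = drop_everything, keep_timeout_events

# transforms.conf -- drop everything, then route matching events back to the index queue
[drop_everything]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[keep_timeout_events]
REGEX = "binary":"/usr/bin/timeout"
DEST_KEY = queue
FORMAT = indexQueue

Treat this as a sketch of the mechanism only; the regex itself carries the same rigidity caveat as above.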
1. The search head is the component which spawns searches against the indexers, which hold the already indexed data. So I assume you meant that you're sending data in some format but it's getting improperly split into events.
2. Sending a raw TCP or UDP data stream directly to a Splunk component is not the preferred way to go (for several reasons which I will not dig into at this point).
3. What do these events look like on the wire? I'm not 100% sure, but I think they might get split at the datagram boundary regardless of other settings.
4. Your "split" set of events contains a second event which is not a part of the original event. A typo in preparing the mockup data?
Let me see if I understand this correctly:
- Fields domain, min_score, max_score, attackerip, and hits are all available to you, either from an index search or some other SPL manipulation.
- The "threshold" mentioned in the title, the initial description, and subsequent comments is to be calculated by a mathematical formula based on max_score, e.g., 3/4 * max_score.
- You want to set up an alert based on hits per attackerip per domain.

The initial description and subsequent comments so far mention only one and a half relevant field names (domain + attacker_score) and nothing else.  How are volunteers to know? If these are the conditions, I will assume that max_score is derived from attacker_score per attackerip, that hits is derived from the event count, and that min_score is of no consequence in your formula.  In other words,

index=ss group="Threat Intelligence"
| stats max(attacker_score) as max_score count as hits by domain attackerip

If this speculation is correct, setting up the alert is a matter of applying your formula.

| where 4 * hits > 3 * max_score
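Putting those two fragments together (still assuming max_score comes from attacker_score and hits is simply the event count per attackerip per domain), the full alert search would be:

index=ss group="Threat Intelligence"
| stats max(attacker_score) as max_score count as hits by domain attackerip
| where 4 * hits > 3 * max_score

You can then schedule this as an alert that triggers when the number of results is greater than zero.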
Before we delve into SPL, I want to ask if you have any influence over the developers of this application.  They put in so much energy crafting a seemingly stiff log format.  With that energy, why don't they just give you compliant JSON?  That would easily satisfy your structural desire to have a parent node and child nodes.  For example,

{"Changed Attributes": { "SAM Account Name": "-", "Display Name": "-", "User Principal Name": "-", "Home Directory": "-", "Home Drive": "-", "Script Path": "-", "Profile Path": "-", "User Workstations": "-", "Password Last Set": "9/12/2023 7:30:15 AM", "Account Expires": "-", "Primary Group ID": "-", "AllowedToDelegateTo": "-", "Old UAC Value": "-", "New UAC Value": "-", "User Account Control": "-", "User Parameters": "-", "SID History": "-", "Logon Hours": "-" } }

And this sample will give the following fields:

field name                                    field value
Changed Attributes.Account Expires            -
Changed Attributes.AllowedToDelegateTo        -
Changed Attributes.Display Name               -
Changed Attributes.Home Directory             -
Changed Attributes.Home Drive                 -
Changed Attributes.Logon Hours                -
Changed Attributes.New UAC Value              -
Changed Attributes.Old UAC Value              -
Changed Attributes.Password Last Set          9/12/2023 7:30:15 AM
Changed Attributes.Primary Group ID           -
Changed Attributes.Profile Path               -
Changed Attributes.SAM Account Name           -
Changed Attributes.SID History                -
Changed Attributes.Script Path                -
Changed Attributes.User Account Control       -
Changed Attributes.User Parameters            -
Changed Attributes.User Principal Name        -
Changed Attributes.User Workstations          -

I believe this satisfies your structural requirement.

If you absolutely have no influence, AND if the developers are so disciplined that they will never make tiny changes to the log format, I want to ask how you expect Splunk to identify "Changed Attributes:" as the parent node.  Is it merely by the leading space in the line?  Such criteria are not robust at all.  Additionally, how many different parent nodes can there be?  Is the illustration the entirety of the log or just a portion of it?  Unless you can clearly describe the data characteristics, there is no way to give a meaningful solution.

Now, if the illustration is the entirety of the log, and your developers are extremely religious about spaces and swear on their souls never to make changes, you can use these characteristics to derive the information you need.  One such method is to convert the free-hand string to compliant JSON, then use spath to extract and flatten the structure.

| rex max_match=0 mode=sed "s/^/{\"/ s/ / \"/g s/: /\"&\"/g s/ /\", /g s/:\",/\": {/ s/$/\" } }/"
| spath

As to grouping the fields into changed and unchanged sets, that can also be achieved.  If your developers are flexible enough to make the log compliant JSON, they can just make unchanged fields JSON null.  Otherwise you can try to handle them as strings, provided that the free-hand text is extremely rigid as described above.

| foreach * [eval changed = mvappend(changed, if('<<FIELD>>' == "-", null(), "<<FIELD>>" . " => " . '<<FIELD>>')), unchanged = mvappend(unchanged, if('<<FIELD>>' == "-", "<<FIELD>>", null()))]
| table changed unchanged

This way, you get

changed:
Changed Attributes.Password Last Set => 9/12/2023 7:30:15 AM

unchanged:
Changed Attributes.Account Expires
Changed Attributes.AllowedToDelegateTo
Changed Attributes.Display Name
Changed Attributes.Home Directory
Changed Attributes.Home Drive
Changed Attributes.Logon Hours
Changed Attributes.New UAC Value
Changed Attributes.Old UAC Value
Changed Attributes.Primary Group ID
Changed Attributes.Profile Path
Changed Attributes.SAM Account Name
Changed Attributes.SID History
Changed Attributes.Script Path
Changed Attributes.User Account Control
Changed Attributes.User Parameters
Changed Attributes.User Principal Name
Changed Attributes.User Workstations

If your illustrated data is the entirety of the log, this is an emulation you can play with and compare with real data

| makeresults
| fields - _time
| eval data = "Changed Attributes: SAM Account Name: - Display Name: - User Principal Name: - Home Directory: - Home Drive: - Script Path: - Profile Path: - User Workstations: - Password Last Set: 9/12/2023 7:30:15 AM Account Expires: - Primary Group ID: - AllowedToDelegateTo: - Old UAC Value: - New UAC Value: - User Account Control: - User Parameters: - SID History: - Logon Hours: -"
``` data emulation above ```
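If it helps to see the changed/unchanged grouping in isolation, here is a tiny self-contained emulation: three already-flattened fields (names mirror the spath output above, values taken from the same sample) run through the same foreach.

| makeresults
| fields - _time
| eval 'Changed Attributes.SAM Account Name' = "-", 'Changed Attributes.Password Last Set' = "9/12/2023 7:30:15 AM", 'Changed Attributes.Logon Hours' = "-"
| foreach * [eval changed = mvappend(changed, if('<<FIELD>>' == "-", null(), "<<FIELD>>" . " => " . '<<FIELD>>')), unchanged = mvappend(unchanged, if('<<FIELD>>' == "-", "<<FIELD>>", null()))]
| table changed unchanged

This should return one changed entry (Password Last Set) and two unchanged entries, which is the same behaviour the full pipeline produces on the real extraction.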
Hi everyone, I need to group the 3 events below by correlation ID. I have tried the transaction command below, but it does not accept multiple endswith clauses. I also need to extract the event start timestamp and event end timestamp.

| transaction correlation_id startswith="processing_stage=Obtained data" endswith="processing_stage=Successfully obtained incontact response" endswith="processing_stage=Successfully obtained genesys response"

{"message_type": "INFO", "processing_stage": "Obtained data", "message": "Successfully received data from API/SQS", "correlation_id": "c5be6c24-d0e6-4f27-a11d-86f7f194ae50", "error": "", "invoked_component": "prd-start-step-function-from-lambda-v1", 'startDate': datetime.datetime(2023, 11, 1, 5, 17, 50, 326000, tzinfo=tzlocal()), 'date': 'Wed, 01 Nov 2023 05:17:50 GMT', "invocation_timestamp": "2023-11-01T05:17:50Z", "response_timestamp": "2023-11-01T05:17:50Z", }

{"message_type": "INFO", "processing_stage": "Successfully obtained genesys response", "message": "Successfully obtained genesys response", "correlation_id": "c5be6c24-d0e6-4f27-a11d-86f7f194ae50", "error": "", "invoked_component": "prd-ccm-genesys-ingestor-v1", "request_payload": "", "response_details": "", "invocation_timestamp": "2023-11-01T05:18:21Z", "response_timestamp": "2023-11-01T05:18:21Z"}

{"message_type": "INFO", "processing_stage": "Successfully obtained incontact response", "message": "Successfully obtained incontact response", "correlation_id": "['330dba31-3d3d-4bf0-91a3-dfba81b56abf']", "error": "", "invoker_agent": "arn:aws:sqs:eu-central-1:981503094308:prd-ccm-incontact-ingestor-queue-v1", "invoked_component": "prd-ccm-incontact-ingestor-v1",  "invocation_timestamp": "2023-11-01T06:57:09Z", "response_timestamp": "2023-11-01T06:57:09Z"}

Thanks in advance
Hi @inventsekar
1) In the Splunk search query we are using the index name for the search
2) Receiving logs via a UDP port
3) props.conf:
LINE_BREAKER = (\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2})
SHOULD_LINEMERGE = false
How can we hide the dynamic filter tokens that we are passing in the URL without using JS? We need this because we don't want the user to view the tokens in the URL.
Hi @Komal0113 Some more details are needed:
- Can we have your Splunk search query please (remove any hostname, IP address, etc. from the search query)?
- Are you using a HF or not? Mostly it is the props/transforms that cause this issue.
- Can we have your props/transforms (only the portion responsible for this app/add-on/TA is enough)?
From the Splunk user we are receiving logs, but when they reach the Splunk search head they are getting split into different events.

Expected log:
Oct 26 09:37:51 +02:00 10.191.248.38 -: Operation%%31051 # Minor # qaz# XYZ # 10.135.114.70 # Succeeded # Function:[Configuration Management][MML Command] PQR ME:; # 2023-10-26 09:37:51#

Splunk is dividing it into two separate events:
Oct 26 09:37:51 +02:00 10.191.248.38 -: Operation%%31051 # Minor # qaz# XYZ # 10.135.114.70  # Succeeded # Function:[Configuration Management][MML Command] & LST ME:; # 2023-10-26 09:37:51#

How can I resolve this? I cannot combine these two because I am getting separate events, not one after another.
Hi, as @inventsekar said, it should work without issues. Could you try this:

| makeresults
| eval c=avg(1, 2, 3)
| table c

It should give you 2. If this works but your current query doesn't, then you have some other issue in that query. You can check which Splunk version you have by clicking Help -> About on the top bar. r. Ismo
Hi, just define that base search as you have now, but don't use it directly as the query that creates the table on your dashboard. Just create another post-process search where the query is just "| table company, AvgScore". r. Ismo
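A minimal Simple XML sketch of that arrangement (index, lookup, and field names are copied from the question; note the base search keeps id so the second post-process search can still feed it to the lookup, while the first table simply doesn't select it):

<search id="base">
  <query>index=testindex | table company, ip, AvgScore, id</query>
</search>
<row>
  <panel>
    <table>
      <search base="base">
        <query>| table company, AvgScore</query>
      </search>
    </table>
  </panel>
  <panel>
    <table>
      <search base="base">
        <query>| lookup example.csv id as id OUTPUTNEW id, location | table company, id, ip, AvgScore, location</query>
      </search>
    </table>
  </panel>
</row>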
The CI_5 field extraction is not correct. As of now, all the last values (C:, srv048 & server) are going into CI_5, which is wrong.

"CI": "V2;Y;Windows;srv048;LogicalDisk;C:",
"CI": "V2;Y;Linx;srv048",
"CI": "V2;LX;apple;rose;server",
Hello, I have a table with a column recording the ID. I want to make each ID in the table a hyperlink so that clicking on it takes me to a different page. I want each ID in the table to be concatenated onto the static beginning of the hyperlink that I provide. I tried to use the drilldown option in my table, and the code below is what I used, but it is not working.

"eventHandlers": [
  {
    "type": "drilldown.setToken",
    "options": {
      "tokens": [
        {
          "token": "selection_tok",
          "key": "click.value"
        }
      ]
    }
  },
  {
    "type": "drilldown.customUrl",
    "options": {
      "url": "https://(the main part of my url)/$selection_tok$",
      "newTab": true
    }
  }

I already tried to replace $selection_tok$ with directly calling $click.value$ and also tried $click.value2$. I tried multiple approaches, but it is not picking out the ID from the table.
You are right, @PickleRick. I'm guessing I'm left with the 2nd option of building a Python script inside a custom command. I also need to spend some time building an algorithm that gives the best performance. I'll experiment.
I'm creating a dashboard that have 3 pie charts where the values for each pie chart are color-coded: green for Comply, red for Not Comply and yellow for Not Supported.  I also created 2 text inputs and 3 dropdown inputs. The text input will allow user to input hostname or IP address and the pie chart will return the results of the compliance tied to the hostname or IP address with the right color code. And the dropdown inputs will allow user to select "Comply", "Not Comply" or "Not Supported". When the user selects "Not comply", the pie chart should be red, but instead it is still green because the colors apply to the series from left to right so I get the first color. Is there a way to make the colors apply accordingly after text/dropdown input change? The XML looks like this:    <form version="1.1"> <label>Overview</label> <search id="base"> <query> | savedsearch Overview | search Hostname="$field1$" "IP Address"="$field2$" OS="$field3$" "ABC Compliance"="$field5$" "DEF Compliance"="$field6$" "GHI Compliance"="$field8$" </query> <earliest>-3d@d</earliest> <latest>now</latest> <sampleRatio>1</sampleRatio> </search> <fieldset submitButton="true"> <input type="text" token="field1" searchWhenChanged="true"> <label>Hostname</label> <default>*</default> <initialValue>*</initialValue> </input> <input type="text" token="field2" searchWhenChanged="true"> <label>IP Address</label> <default>*</default> <initialValue>*</initialValue> </input> <input type="dropdown" token="field3"> <label>OS</label> <choice value="*">All</choice> <default>*</default> <initialValue>*</initialValue> <fieldForLabel>OS</fieldForLabel> <fieldForValue>OS</fieldForValue> <search> <query>| savedsearch Overview | stats count by OS</query> <earliest>-7d@h</earliest> <latest>now</latest> </search> </input> <input type="dropdown" token="field5"> <label>ABC Compliance</label> <choice value="*">All</choice> <fieldForLabel>ABC Compliance</fieldForLabel> <fieldForValue>ABC Compliance</fieldForValue> <search> <query>| savedsearch Overview | stats count by "ABC Compliance"</query> <earliest>-7d@h</earliest> <latest>now</latest> </search> <default>*</default> <initialValue>*</initialValue> </input> <input type="dropdown" token="field6"> <label>DEF Compliance</label> <choice value="*">All</choice> <default>*</default> <initialValue>*</initialValue> <fieldForLabel>DEF Compliance</fieldForLabel> <fieldForValue>DEF Compliance</fieldForValue> <search> <query>| savedsearch Overview | stats count by "DEF Compliance"</query> <earliest>-7d@h</earliest> <latest>now</latest> </search> </input> <input type="dropdown" token="field8"> <label>GHI Compliance</label> <choice value="*">All</choice> <default>*</default> <initialValue>*</initialValue> <fieldForLabel>GHI Compliance</fieldForLabel> <fieldForValue>GHI Compliance</fieldForValue> <search> <query>| savedsearch Overview | stats count by "GHI Compliance"</query> <earliest>-7d@h</earliest> <latest>now</latest> </search> </input> </fieldset> <row> <panel> <chart> <title>ABC Compliance</title> <search> <query>| savedsearch Overview | search Hostname="$field1$" "IP Address"="$field2$" OS="$field3$" "ABC Compliance"="$field5$" "DEF Compliance"="$field6$" "GHI Compliance"="$field8$" | rename "ABC Compliance" as Compliance | stats count by Compliance | eval Compliance=Compliance." 
(".count.")"</query> <earliest>-24h@h</earliest> <latest>now</latest> </search> <option name="charting.chart">pie</option> <option name="charting.drilldown">none</option> <option name="charting.legend.labels">["Comply","Not Comply","Not Supported"]</option> <option name="charting.seriesColors">[0x70db70,0xff4d4d,0xffff66]</option> <option name="refresh.display">progressbar</option> </chart> </panel> <panel> <chart> <title>DEF Compliance</title> <search> <query>| savedsearch Overview | search Hostname="$field1$" "IP Address"="$field2$" OS="$field3$" "ABC Compliance"="$field5$" "DEF Compliance"="$field6$" "GHI Compliance"="$field8$" | rename "DEF Compliance" as Compliance | stats count by Compliance | eval Compliance=Compliance." (".count.")"</query> <earliest>-24h@h</earliest> <latest>now</latest> </search> <option name="charting.chart">pie</option> <option name="charting.drilldown">none</option> <option name="refresh.display">progressbar</option> <option name="charting.legend.labels">["Comply","Not Comply","Not Supported"]</option> <option name="charting.seriesColors">[0x70db70,0xff4d4d,0xffff66]</option> </chart> </panel> <panel> <chart> <title>GHI Compliance</title> <search> <query>| savedsearch Overview | search Hostname="$field1$" "IP Address"="$field2$" OS="$field3$" "ABC Compliance"="$field5$" "DEF Compliance"="$field6$" "GHI Compliance"="$field8$" | rename "GHI Compliance" as Compliance | stats count by Compliance | eval Compliance=Compliance." (".count.")"</query> <earliest>-24h@h</earliest> <latest>now</latest> </search> <option name="charting.chart">pie</option> <option name="charting.drilldown">none</option> <option name="refresh.display">progressbar</option> <option name="charting.legend.labels">["Comply","Not Comply","Not Supported"]</option> <option name="charting.seriesColors">[0x70db70,0xff4d4d,0xffff66]</option> </chart> </panel> </row> <row> <panel> <title>Host Summary</title> <table> <search base="base"> <query>| table Hostname, "IP Address", OS, "ABC Compliance", "DEF Compliance", "GHI Compliance"</query> </search> <option name="count">20</option> <option name="dataOverlayMode">none</option> <option name="drilldown">none</option> <option name="percentagesRow">false</option> <option name="refresh.display">progressbar</option> <option name="rowNumbers">false</option> <option name="totalsRow">false</option> <option name="wrap">true</option> <format type="color" field="ABC Compliance"> <colorPalette type="map">{"Comply":#70db70,"Not Comply":#ff4d4d, "Not Supported":#ffff66}</colorPalette> </format> <format type="color" field="DEF Compliance"> <colorPalette type="map">{"Comply":#70db70,"Not Comply":#ff4d4d, "Not Supported":#ffff66}</colorPalette> </format> <format type="color" field="GHI Compliance"> <colorPalette type="map">{"Comply":#70db70,"Not Comply":#ff4d4d, "Not Supported":#ffff66}</colorPalette> </format> </table> </panel> </row> </form>        
Summary

Gradle build configuration cache cannot be enabled in the project because of the AppDynamics plugin.

Context

Gradle has a build configuration cache capability, which is still not enabled by default. Information can be found at https://docs.gradle.org/current/userguide/configuration_cache.html The build configuration cache can improve build times drastically for large projects.

However, it has some requirements that Gradle tasks must meet for it to be usable (https://docs.gradle.org/current/userguide/configuration_cache.html#config_cache:requirements).

The current `debugAppDynamicsAddBuildInfo` AppDynamics Gradle task is not compatible with the Gradle build configuration cache. This makes it impossible to enable the configuration cache in any project that uses the AppDynamics Gradle plugin.

How to replicate the issue

In the current project, run the following Gradle command:

./gradlew clean assembleDebug

Key details

- AppDynamics version in use: 23.7.1
- Key configuration to enable the build configuration cache, in `gradle.properties`:

org.gradle.configuration-cache=true

Error message

* What went wrong:
Configuration cache problems found in this build.

1 problem was found storing the configuration cache.
- Task `:app:debugAppDynamicsAddBuildInfo` of type `com.appdynamics.android.gradle.AddToClassesTask`: invocation of 'Task.project' at execution time is unsupported.
  See https://docs.gradle.org/8.4/userguide/configuration_cache.html#config_cache:requirements:use_project_during_execution

See the complete report at file:///<REDACTED>

> Invocation of 'Task.project' by task ':app:debugAppDynamicsAddBuildInfo' at execution time is unsupported.

Stack trace

org.gradle.api.InvalidUserCodeException: Invocation of 'Task.project' by task ':app:debugAppDynamicsAddBuildInfo' at execution time is unsupported.
at org.gradle.configurationcache.problems.DefaultProblemFactory$problem$1$build$diagnostics$1.get(DefaultProblemFactory.kt:86) at org.gradle.configurationcache.problems.DefaultProblemFactory$problem$1$build$diagnostics$1.get(DefaultProblemFactory.kt:86) at org.gradle.internal.problems.DefaultProblemDiagnosticsFactory$DefaultProblemStream.getImplicitThrowable(DefaultProblemDiagnosticsFactory.java:111) at org.gradle.internal.problems.DefaultProblemDiagnosticsFactory$DefaultProblemStream.forCurrentCaller(DefaultProblemDiagnosticsFactory.java:100) at org.gradle.configurationcache.problems.DefaultProblemFactory$problem$1.build(DefaultProblemFactory.kt:86) at org.gradle.configurationcache.initialization.DefaultConfigurationCacheProblemsListener.onTaskExecutionAccessProblem(ConfigurationCacheProblemsListener.kt:134) at org.gradle.configurationcache.initialization.DefaultConfigurationCacheProblemsListener.onProjectAccess(ConfigurationCacheProblemsListener.kt:74) at jdk.internal.reflect.GeneratedMethodAccessor1182.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.base/java.lang.reflect.Method.invoke(Unknown Source) at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36) at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24) at org.gradle.internal.event.DefaultListenerManager$ListenerDetails.dispatch(DefaultListenerManager.java:472) at org.gradle.internal.event.DefaultListenerManager$ListenerDetails.dispatch(DefaultListenerManager.java:454) at org.gradle.internal.event.AbstractBroadcastDispatch.dispatch(AbstractBroadcastDispatch.java:83) at org.gradle.internal.event.AbstractBroadcastDispatch.dispatch(AbstractBroadcastDispatch.java:69) at org.gradle.internal.event.DefaultListenerManager$EventBroadcast$ListenerDispatch.dispatch(DefaultListenerManager.java:443) at org.gradle.internal.event.DefaultListenerManager$EventBroadcast$ListenerDispatch.dispatch(DefaultListenerManager.java:431) at org.gradle.internal.event.AbstractBroadcastDispatch.dispatch(AbstractBroadcastDispatch.java:43) at org.gradle.internal.event.AbstractBroadcastDispatch.dispatch(AbstractBroadcastDispatch.java:66) at org.gradle.internal.event.DefaultListenerManager$EventBroadcast$ListenerDispatch.dispatch(DefaultListenerManager.java:443) at org.gradle.internal.event.DefaultListenerManager$EventBroadcast.dispatch(DefaultListenerManager.java:232) at org.gradle.internal.event.DefaultListenerManager$EventBroadcast.dispatch(DefaultListenerManager.java:203) at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94) at jdk.proxy1/jdk.proxy1.$Proxy77.onProjectAccess(Unknown Source) at org.gradle.configurationcache.AbstractTaskProjectAccessChecker.notifyProjectAccess(TaskExecutionAccessCheckers.kt:33) at org.gradle.api.internal.AbstractTask.getProject(AbstractTask.java:238) at org.gradle.api.DefaultTask.getProject(DefaultTask.java:59) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.base/java.lang.reflect.Method.invoke(Unknown Source) at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:107) at groovy.lang.MetaBeanProperty.getProperty(MetaBeanProperty.java:59) at 
org.gradle.internal.metaobject.BeanDynamicObject$MetaClassAdapter.getProperty(BeanDynamicObject.java:256) at org.gradle.internal.metaobject.BeanDynamicObject.tryGetProperty(BeanDynamicObject.java:198) at org.gradle.internal.metaobject.CompositeDynamicObject.tryGetProperty(CompositeDynamicObject.java:55) at org.gradle.internal.metaobject.AbstractDynamicObject.getProperty(AbstractDynamicObject.java:60) at com.appdynamics.android.gradle.AddToClassesTask_Decorated.getProperty(Unknown Source) at org.codehaus.groovy.runtime.callsite.PogoGetPropertySite.getProperty(PogoGetPropertySite.java:49) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callGroovyObjectGetProperty(AbstractCallSite.java:341) at com.appdynamics.android.gradle.AddToClassesTask.instrumentationConfig(AddToClassesTask.groovy:41) at com.appdynamics.android.gradle.AddToClassesTask.taskAction(AddToClassesTask.groovy:32) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.base/java.lang.reflect.Method.invoke(Unknown Source) at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:125) at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:58) at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:51) at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:29) at org.gradle.api.internal.tasks.execution.TaskExecution$3.run(TaskExecution.java:248) at org.gradle.internal.operations.DefaultBuildOperationRunner$1.execute(DefaultBuildOperationRunner.java:29) at org.gradle.internal.operations.DefaultBuildOperationRunner$1.execute(DefaultBuildOperationRunner.java:26) at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:66) at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:59) at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:157) at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:59) at org.gradle.internal.operations.DefaultBuildOperationRunner.run(DefaultBuildOperationRunner.java:47) at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:73) at org.gradle.api.internal.tasks.execution.TaskExecution.executeAction(TaskExecution.java:233) at org.gradle.api.internal.tasks.execution.TaskExecution.executeActions(TaskExecution.java:216) at org.gradle.api.internal.tasks.execution.TaskExecution.executeWithPreviousOutputFiles(TaskExecution.java:199) at org.gradle.api.internal.tasks.execution.TaskExecution.execute(TaskExecution.java:166) at org.gradle.internal.execution.steps.ExecuteStep.executeInternal(ExecuteStep.java:105) at org.gradle.internal.execution.steps.ExecuteStep.access$000(ExecuteStep.java:44) at org.gradle.internal.execution.steps.ExecuteStep$1.call(ExecuteStep.java:59) at org.gradle.internal.execution.steps.ExecuteStep$1.call(ExecuteStep.java:56) at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:204) at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:199) at 
org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:66) at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:59) at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:157) at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:59) at org.gradle.internal.operations.DefaultBuildOperationRunner.call(DefaultBuildOperationRunner.java:53) at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:78) at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:56) at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:44) at org.gradle.internal.execution.steps.RemovePreviousOutputsStep.execute(RemovePreviousOutputsStep.java:67) at org.gradle.internal.execution.steps.RemovePreviousOutputsStep.execute(RemovePreviousOutputsStep.java:37) at org.gradle.internal.execution.steps.CancelExecutionStep.execute(CancelExecutionStep.java:41) at org.gradle.internal.execution.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:74) at org.gradle.internal.execution.steps.TimeoutStep.execute(TimeoutStep.java:55) at org.gradle.internal.execution.steps.CreateOutputsStep.execute(CreateOutputsStep.java:50) at org.gradle.internal.execution.steps.CreateOutputsStep.execute(CreateOutputsStep.java:28) at org.gradle.internal.execution.steps.CaptureStateAfterExecutionStep.executeDelegateBroadcastingChanges(CaptureStateAfterExecutionStep.java:100) at org.gradle.internal.execution.steps.CaptureStateAfterExecutionStep.execute(CaptureStateAfterExecutionStep.java:72) at org.gradle.internal.execution.steps.CaptureStateAfterExecutionStep.execute(CaptureStateAfterExecutionStep.java:50) at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:40) at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:29) at org.gradle.internal.execution.steps.BuildCacheStep.executeWithoutCache(BuildCacheStep.java:179) at org.gradle.internal.execution.steps.BuildCacheStep.lambda$execute$1(BuildCacheStep.java:70) at org.gradle.internal.Either$Right.fold(Either.java:175) at org.gradle.internal.execution.caching.CachingState.fold(CachingState.java:59) at org.gradle.internal.execution.steps.BuildCacheStep.execute(BuildCacheStep.java:68) at org.gradle.internal.execution.steps.BuildCacheStep.execute(BuildCacheStep.java:46) at org.gradle.internal.execution.steps.StoreExecutionStateStep.execute(StoreExecutionStateStep.java:36) at org.gradle.internal.execution.steps.StoreExecutionStateStep.execute(StoreExecutionStateStep.java:25) at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:36) at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:22) at org.gradle.internal.execution.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:91) at org.gradle.internal.execution.steps.SkipUpToDateStep.lambda$execute$2(SkipUpToDateStep.java:55) at java.base/java.util.Optional.orElseGet(Unknown Source) at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:55) at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:37) at org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:65) at 
org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:36) at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:37) at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:27) at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:77) at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:38) at org.gradle.internal.execution.steps.ValidateStep.execute(ValidateStep.java:108) at org.gradle.internal.execution.steps.ValidateStep.execute(ValidateStep.java:55) at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:71) at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:45) at org.gradle.internal.execution.steps.SkipEmptyWorkStep.executeWithNonEmptySources(SkipEmptyWorkStep.java:177) at org.gradle.internal.execution.steps.SkipEmptyWorkStep.execute(SkipEmptyWorkStep.java:81) at org.gradle.internal.execution.steps.SkipEmptyWorkStep.execute(SkipEmptyWorkStep.java:53) at org.gradle.internal.execution.steps.RemoveUntrackedExecutionStateStep.execute(RemoveUntrackedExecutionStateStep.java:32) at org.gradle.internal.execution.steps.RemoveUntrackedExecutionStateStep.execute(RemoveUntrackedExecutionStateStep.java:21) at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsStartedStep.execute(MarkSnapshottingInputsStartedStep.java:38) at org.gradle.internal.execution.steps.LoadPreviousExecutionStateStep.execute(LoadPreviousExecutionStateStep.java:36) at org.gradle.internal.execution.steps.LoadPreviousExecutionStateStep.execute(LoadPreviousExecutionStateStep.java:23) at org.gradle.internal.execution.steps.CleanupStaleOutputsStep.execute(CleanupStaleOutputsStep.java:75) at org.gradle.internal.execution.steps.CleanupStaleOutputsStep.execute(CleanupStaleOutputsStep.java:41) at org.gradle.internal.execution.steps.ExecuteWorkBuildOperationFiringStep.lambda$execute$2(ExecuteWorkBuildOperationFiringStep.java:66) at java.base/java.util.Optional.orElseGet(Unknown Source) at org.gradle.internal.execution.steps.ExecuteWorkBuildOperationFiringStep.execute(ExecuteWorkBuildOperationFiringStep.java:66) at org.gradle.internal.execution.steps.ExecuteWorkBuildOperationFiringStep.execute(ExecuteWorkBuildOperationFiringStep.java:38) at org.gradle.internal.execution.steps.AssignWorkspaceStep.lambda$execute$0(AssignWorkspaceStep.java:32) at org.gradle.api.internal.tasks.execution.TaskExecution$4.withWorkspace(TaskExecution.java:293) at org.gradle.internal.execution.steps.AssignWorkspaceStep.execute(AssignWorkspaceStep.java:30) at org.gradle.internal.execution.steps.AssignWorkspaceStep.execute(AssignWorkspaceStep.java:21) at org.gradle.internal.execution.steps.IdentityCacheStep.execute(IdentityCacheStep.java:37) at org.gradle.internal.execution.steps.IdentityCacheStep.execute(IdentityCacheStep.java:27) at org.gradle.internal.execution.steps.IdentifyStep.execute(IdentifyStep.java:47) at org.gradle.internal.execution.steps.IdentifyStep.execute(IdentifyStep.java:34) at org.gradle.internal.execution.impl.DefaultExecutionEngine$1.execute(DefaultExecutionEngine.java:64) at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeIfValid(ExecuteActionsTaskExecuter.java:145) at 
org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:134) at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:46) at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:51) at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57) at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:74) at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36) at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:77) at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:55) at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52) at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:204) at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:199) at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:66) at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:59) at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:157) at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:59) at org.gradle.internal.operations.DefaultBuildOperationRunner.call(DefaultBuildOperationRunner.java:53) at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:78) at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:52) at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:42) at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:331) at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:318) at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.lambda$execute$0(DefaultTaskExecutionGraph.java:314) at org.gradle.internal.operations.CurrentBuildOperationRef.with(CurrentBuildOperationRef.java:80) at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:314) at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:303) at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:463) at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:380) at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64) at org.gradle.internal.concurrent.AbstractManagedExecutor$1.run(AbstractManagedExecutor.java:47) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) at java.base/java.lang.Thread.run(Unknown Source)
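For anyone blocked by this in the meantime, one possible stopgap (not a fix, and only if the team can accept configuration-cache problems being reported as warnings rather than failing the build) is to downgrade the problem severity in gradle.properties:

# gradle.properties
org.gradle.configuration-cache=true
# report configuration cache problems as warnings instead of failing the build
org.gradle.configuration-cache.problems=warn

The incompatible AppDynamics task may still behave incorrectly when the cache entry is reused, so treat this as a workaround to evaluate, not a solution; the plugin task itself needs to stop calling Task.project at execution time.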
Hi @yuanliu, the field name for IP is attackerip, and the score field is exactly attacker_score. Regarding "Is that the number of hits from a single IP": I'm trying to find the threshold values based on the attacker_score. If, for example, the attackerip max score is 2000, then the threshold should be around 1500 to raise an alert.

domain    min_score  max_score  attackerip   hits
xyz.com   110        1985       191.168.1.1  2135
abc.com   520        1760       192.153.1.1  2165

Thanks
How can I hide a field of a table but keep it for a separate search? Thank you for your help.

For example: field "id" exists in the index. I don't want to display field "id" in the first table (base search), but I do want to display it in the second table (which uses the first search as its base search).

<search id="base">
  <query>index=testindex | table company, ip, AvgScore</query>
</search>

company    ip   AvgScore
CompanyA   ip1  1
CompanyA   ip2  3
CompanyA   ip3  4

<search base="base">
  <query>| lookup example.csv id as id OUTPUTNEW id, location | table company, id, ip, AvgScore, location</query>
</search>

company    id   ip   AvgScore  location
CompanyA   idA  ip1  1         loc1
CompanyA   idA  ip2  3         loc1
CompanyA   idA  ip3  4         loc1
Still unclear.  Your original sample code mentions no IP address.  What is the field name for IP?  How are volunteers going to know what data you have?  And what is attacker_score?  Is that the number of hits from a single IP?  Are you simply looking for the maximum of attacker_score? To ask a data analytics question that other people can help with, illustrate relevant data (anonymized as needed), explain any characteristics others need to know, illustrate the desired results, then explain the logical relationship between the illustrated data and the results.
As @ITWhisperer said, illustrate structured data in raw format, not with Splunk's condensation. If you already have a top-level key "tag", I suspect that you actually want the key-value pairs in that value ("service=z2-qa1-local-z2-api-endpoint APPID=1234 cluster=z2-qa1-local application=z2 full-imagename=0123456789.dkr.10cal/10.20/xyz container-id=asdfgh503 full-container-id=1234567890") extracted, not to extract that line again.  Maybe the key "tag" is not top level.  In that case, you will need to tell us the path leading to tag.  In all cases, raw format will help volunteers diagnose.

If "tag" is top level, you can use kv (aka extract) to extract fields like service, APPID, etc., like

| rename _raw AS temp, tag AS _raw
| kv
| rename _raw AS tag, temp as _raw

Your sample should give

APPID: 1234
application: z2
cluster: z2-qa1-local
container_id: asdfgh503
full_container_id: 1234567890
full_imagename: 0123456789.dkr.10cal/10.20/xyz
service: z2-qa1-local-z2-api-endpoint

Is this something you are looking for?
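If you want to try the rename/kv round trip without touching your index, here is a small emulation using the tag value from the sample above (the placeholder _raw is only there so the renames have something to swap; note that kv cleans the hyphens in key names into underscores, which is why the resulting fields are container_id, full_container_id, and full_imagename):

| makeresults
| eval _raw = "placeholder raw event", tag = "service=z2-qa1-local-z2-api-endpoint APPID=1234 cluster=z2-qa1-local application=z2 full-imagename=0123456789.dkr.10cal/10.20/xyz container-id=asdfgh503 full-container-id=1234567890"
| rename _raw AS temp, tag AS _raw
| kv
| rename _raw AS tag, temp AS _raw
| table APPID application cluster container_id full_container_id full_imagename service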