All Posts


It would help to see the props.conf settings for that sourcetype, but these should get you started:

[mysourcetype]
LINE_BREAKER = ([\r\n]+)\d\d\d\d-\d\d-\d\dT
EVENT_BREAKER = ([\r\n]+)\d\d\d\d-\d\d-\d\dT
EVENT_BREAKER_ENABLE = true
SHOULD_LINEMERGE = false
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%6N%:z
MAX_TIMESTAMP_LOOKAHEAD = 30
TRUNCATE = 10000
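Once the settings are in place and new data has been ingested, a quick sanity check is to look for events that still span multiple lines. This is just a sketch that reuses the placeholder sourcetype name from the stanza above:

sourcetype=mysourcetype linecount>1
| table _time linecount _raw

If line breaking is working, every event should come back with linecount=1; anything higher means events are still being merged.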
The RHS of arguments in the savedsearch command is expected to be a string rather than a field name.  You might try putting the value in single quotes to see if that forces it to be treated as a field:

| appendcols [| savedsearch "Events_list" perc='Critical']

There's a wrinkle, however.  The Critical field is multi-value (because of the values function) and most commands don't work well with multi-value fields.
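If the single quotes don't help, another pattern sometimes used is to drive the saved search with the map command, which substitutes field values into $...$ tokens before the search runs. This is a sketch that assumes the lookup and saved-search names from the question and that a single Critical value is enough; note it replaces the appendcols structure, so the results come back from the saved search alone:

| inputlookup alert_thresholds.csv
| search Alert="HTTP 500"
| stats first(Critical) as Critical
| map search="| savedsearch Events_list perc=$Critical$" maxsearches=1

The first(Critical) collapses the multi-value result to one value, and map then expands $Critical$ to that literal before handing perc to the saved search.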
Thanks for your reply.  Honestly, I wish I knew what I was talking about so that my question could be clearer.  Our company has different 6-digit asset numbers for each site, and there are multiple of them.  So in the search, I am trying to figure out how to have Splunk search only our asset for overnight type 2 logins, as we do not need the data from all of the other assets.  Hopefully that paints a better picture.  Thanks again!
What you said doesn't make sense. In the end I can only choose one variable to return, so all of the custom blocks must share the same variable and change it on-the-fly depending on the passed result. Sure, you can use a custom function, e.g. passthrough, for that, but that seems so unnecessary for such a simple task. Thanks anyway, I understand the only solutions aren't quite built-in.
Again, there are many ways to do things in SOAR and we're just trying to give you ideas.  Sometimes, if it's just one action and one filter/decision, the work needed to build a reusable playbook isn't worth it compared with adding those blocks to the main playbook each time, since wiring up the inputs to the action/decision takes just as much time as wiring up an input playbook.  It really depends on what you want to do with the True/False items. If the true ones are sent to another action, then yes, just use the input playbook to work out True and then output a list of them for use later. If you want to get more complicated you can, as playbooks allow that.
I think what you may be looking for is something like the following:
1. Action block that queries AD for the OU the object is in. The object (ip_hostname in your example) is configured as the input at the start of the playbook.
2. Decision block checking if the OU matches your condition. My org had some trouble with these initially; you should configure them like below (rather than the opposite direction, which is what we were trying initially):
if <artifact_datapath> == <the OU you're trying to match>
3. Each path on your decision block then has a custom code block. You'll need to configure an output variable for the custom blocks themselves and set it to either True/False.
4. The output of your entire playbook will be set to the variable you configured in step 3.
Hi @Nour.Alghamdi, Can you share what documentation you were looking at? I'll share it with the Docs team to see if we can get changes made to it.  In the meantime, I'm also looking around to see what I can find. 
Hi, I am having issues passing a value into savedsearch. Below is the simplified version of my query:

| inputlookup alert_thresholds.csv
| search Alert="HTTP 500"
| stats values(Critical) as Critical
| appendcols [| savedsearch "Events_list" perc=Critical]

Basically, what I want to do is use the Critical value as the value of perc in the subsearch, but it does not seem to work correctly. I get no results. When I replace Critical with 10 in the subsearch it works just fine.
Hi @Yogesh.Joshi, Did you ever hear from anyone from AppD when you had those questions a few months ago? 
We have log ingestion from an AWS cloud environment into Splunk via HTTP Event Collector. One of the users is reporting that some logs are missing in Splunk. Is there any log file to validate this? And if there is a connectivity drop between HTTP and the cloud apps, how do we validate that?
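If you can search the _internal index, a couple of searches are often used as a starting point for HEC troubleshooting. This is a sketch assuming a standard deployment (on Splunk Cloud your access to _internal may be restricted), and <your_hec_sourcetype> is a placeholder for whichever sourcetype the HEC input writes to:

index=_internal sourcetype=splunkd component=HttpInputDataHandler log_level=ERROR

index=_internal source=*metrics.log* group=per_sourcetype_thruput series=<your_hec_sourcetype>

The first shows HEC-side errors (bad tokens, malformed payloads, and the like); the second shows whether data for that sourcetype keeps arriving over time. Drops on the sending side (e.g. retries in the AWS forwarding layer) would only show up in the AWS-side logs, not in Splunk.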
Hi @Harikiran.Kanuru, You can find info here on email templates and actions: https://docs.appdynamics.com/appd/onprem/latest/en/appdynamics-essentials/alert-and-respond/actions/notification-actions
Hello @Ankur.Sharma @Evgeniy.Ziangirov, The latest version of the iOS agent, which came out in early December, now supports Alamofire. Sorry for getting back to everyone so late.  https://docs.appdynamics.com/appd/23.x/23.12/en/product-and-release-announcements/release-notes#id-.ReleaseNotesv23.12-agent-enhancements-23-12AgentEnhancements
Events are merging like this:

2022-02-02T15:26:46.593150-05:00 mycompany: syslog initialised2022-02-02T15:26:48.970328-05:00 mycompany: [Portal|SYSTEM|20001|*system] Portal is starting2022-02-02T15:26:50.032387-05:00 mycompany: [Portal|SYSTEM|20002|*system] Portal is up and running2022-02-02T15:26:50.488943-05:00 mycompany: [Portal|CONTENTMANAGER|20942|-] Created fields (category), uid=5fdc6ec-01f0-41d5-8a33-d58b5efre2022-02-02T15:26:50.496126-05:00 mycompany: [Portal|CONTENTMANAGER|20942|-] Created fields (category), uid=6fe48c-20ee-4f7b-bf88-22ed5dfdd2022-02-02T15:26:50.502563-05:00 mycompany: [Portal|CONTENTMANAGER|20942|-] Created fields (category), uid=bcd5c461-9d23-4c79-8509-4af76c03ff5a2022-02-02T15:26:50.505764-05:00 mycompany: [Portal|CONTENTMANAGER|20942|-] Created fields (category), uid=bbb9449e-2893-4d06-bc51-edfdd42022-02-02T15:26:50.512171-05:00 mycompany: [Portal|CONTENTMANAGER|20942|-] Created fields (category), uid=155c7a37-69bc-44d2-98ac-cb75831a7c472022-02-02T15:26:50.517049-05:00 mycompany: [Portal|CONTENTMANAGER|20942|-] Created fields (category), uid=a575dfde3eb-4ca6-be2d-4491a4b59fe02022-02-02T15:33:33.669982-05:00 mycompany: syslog initialised2022-02-02T15:33:40.935228-05:00 mycompany: [Portal|SYSTEM|20001|*system] Portal is starting2022-02-02T15:33:41.990171-05:00 mycompany: [Portal|SYSTEM|20002|*system] Portal is up and running2022-02-02T15:35:34.533063-05:00 mycompany: syslog initialised2022-02-02T15:35:42.168799-05:00 mycompany: [Portal|SYSTEM|20001

I am expecting the logs to break on timestamps like this:

2022-02-02T15:26:46.593150-05:00 mycompany: syslog initialised
2022-02-02T15:26:48.970328-05:00 mycompany: [Portal|SYSTEM|20001|*system] Portal is starting
2022-02-02T15:26:50.032387-05:00 mycompany: [Portal|SYSTEM|20002|*system] Portal is up and running
I didn't quite understand what you mean. Are you saying, for example, that the input will return the OU of the domain computer and then, in automation playbooks, I filter it based on the value? If yes, that kind of defeats the purpose of input playbooks?
Why not have the input playbook act as a filter, with anything that matches your requirement coming out as one output and, if you want, the others under another output? Then you can work out which was True and which was False. Or tag/update the artifact that contains the value with something to indicate the result of the check. There are many ways to do things in SOAR; it just depends how janky you want to get!
Ok, two things. Three actually.
1. Check out the Splunk edu site for entry-level courses on Splunk searching.
2. If you want to search only for logon type 2, add the condition to the initial search (searching for a particular value is much more effective than excluding a value from your search, so if "not logon type 3" can be simplified to "logon type 2", that's great); see the sketch below.
3. I honestly have no idea what you mean by "add our asset to the search" (in Splunk terminology those are called searches, not queries).
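A minimal sketch of point 2, assuming the Windows field names from the original question and a hypothetical host value standing in for the asset (adjust host= to however that machine is identified in your data):

source=WinEventLog:Security EventCode=4624 Logon_Type=2 host="YOUR-ASSET-HOST"

Filtering on Logon_Type=2 and the host in the base search means only interactive logons from that machine are pulled off disk, instead of being discarded later in the pipeline.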
For the time being I have solved the issue by saving the code one piece at a time. Saving the 200 lines of code in one shot was generating the problem...   Restarting Splunk in DEBUG mode can point you in the right direction to understand the root cause, but the volume of messages is really huge.
Hello to all, really hoping I can make sense while asking this... I'm an entry-level IT Security Specialist and I have been tasked with re-writing our current query for overnight logins, as our existing query does not output the correct information we need.  Here is the current query:

source=WinEventLog:Security EventCode=4624 OR (EventCode=4776 Keywords="Audit Success")
| eval Account = mvindex(Account_Name, 1)
| eval TimeHour = strftime(_time, "%H")
| eval Source = coalesce(Source_Network_Address, Source_Workstation)
| eval Source = if(Source="127.0.0.1" OR Source="::1" OR Source="-" OR Source="", host, Source)
| where (TimeHour > 20 AND TimeHour < 24) OR (TimeHour > 0 AND TimeHour < 5)
| bin _time span=12h aligntime=@d+20h
| eval NightOf = strftime(_time, "%m/%d/%Y")
| lookup dnslookup clientip as Source OUTPUT clienthost as SourceDevice
| search NOT Account="*$" NOT Account="HealthMail*" NOT Account="System"
| stats count as LoginEvents values(SourceDevice) as SourceDevices by Account NightOf
| sort NightOf Account SourceDevices
| table NightOf Account SourceDevices LoginEvents

I need to somehow add an exclusion to the query for logon type 3 (meaning for Splunk to omit them from its search), as well as add our asset to the query, so that Splunk will only search data from that particular asset.  I know nothing about coding or scripts, and my boss just thought it would be super fun if the guy with the least experience tried to figure it all out, since the current query does not give us the data that we need for our audits.  In a nutshell, we need Splunk to tell us who was logged in between 8pm-5am, that it was a logon type 2, and what computer system they were on.  If anyone could help out an absolute noob here I would greatly appreciate it!
I think I found the answer on Reddit. It's in Spanish though: https://www.reddit.com/user/Splunker1123/comments/198992x/splunk_y_el_esquema_nacional_de_seguridad_ens/?utm_source=share&utm_medium=web2x&context=3
The CIM manual should help.  It describes each data model field so you can determine which of the fields in your data map best.  See https://docs.splunk.com/Documentation/CIM/5.3.1/User/Endpoint#Processes
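Once you've matched things up, field aliases in props.conf are one common way to rename your raw fields to the CIM names. This is only a sketch; the sourcetype and raw field names (my:custom:sourcetype, proc_name, computer_name) are placeholders for whatever your data actually contains:

[my:custom:sourcetype]
FIELDALIAS-cim_process = proc_name AS process_name
FIELDALIAS-cim_dest = computer_name AS dest

Running | datamodel Endpoint Processes search afterwards is a quick way to confirm the events start showing up under the Processes dataset.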