All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I am searching far and wide for recommendations, best practices, or even just conversations on this topic - all for naught. Here's my big dilemma, for which I would love to get something other than an "It Depends" answer: which platform is more appropriate for metrics, Splunk Cloud or Splunk O11y? I am not an expert in Splunk, but I know there are certain challenges with metrics indexes in Splunk Core/Cloud. I can also see that there are Metrics Rules in O11y that help with filtering, plus some pre-built dashboards. But I am still lost for direction. Short of actually ingesting the same volume of metrics into both platforms and seeing what comes out... do I want to do that? Well, "that depends"! Any guidance deeply appreciated.
I recently upgraded my search head cluster to 9.x, and since then my skipped/deferred searches have skyrocketed.

index=_internal source=*scheduler.log status=*
| timechart span=60s count by status
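To see why the scheduler is skipping, the same log can be broken down by skip reason. A minimal follow-up sketch, assuming the `reason` field is populated in scheduler.log on your version:

```
index=_internal source=*scheduler.log status=skipped
| stats count by reason
```

After an upgrade, the reasons often point at concurrency limits, which narrows down whether the fix is limits tuning or search rescheduling.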
Hello! I'm using a text input box to input a username. If I simply put that username into my base search, it works great and is very quick. I have other search input parameters, so my problem is that if I DON'T specify a username, I want it to include all values, including null values. I started by using an asterisk as the default input value, but that doesn't include null values.

The only way I've been able to make this partially work is by removing the username from the base search, using an eval command to give the null entries a value, and then searching the base results for either "*" (to include everything) or the username I typed in. This is horribly inefficient because I have to search my entire database for every entry before I can filter it. I also think it doesn't work properly, because the base search has a limit on the number of results.

I've done a lot of searching for a way to run an eval command BEFORE the base search, but that doesn't seem to be possible. This can't be a unique scenario. How do I search for both null and NOT null values in the base search without removing my username input box?
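The eval-based workaround described above is commonly written with `fillnull` rather than `eval`. A minimal sketch, assuming a dashboard token named `user_tok` with a default value of `*` (the index, sourcetype, field, and token names here are placeholders, not from the original post):

```
index=my_index sourcetype=my_sourcetype
| fillnull value="NULL" username
| search username="$user_tok$"
```

This is essentially the same approach the post describes: it still scans all events before filtering, which is exactly the inefficiency being asked about.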
Hello, I wonder if somebody can please help me to sort the following data into a table. Any ideas are welcome. I was trying to run this query, but it is not separating the values of the fields properly:

index=query_mcc
| eval data = split(_raw, ",")
| eval Date = strftime(_time, "%Y-%m-%d-%H:%M:%S")
| eval Category = mvindex(data, 1)
| eval Status = mvindex(data, -1)
| eval Command = mvindex(data, 0)
| table host, Date, Category, Status, Command

Instead, it only shows the first line of each event.
Thank you, I will have our team look into these and see if there is anything we can salvage of our current system.  I feel like Goofy from Disney's Jack & the Beanstalk, slicing bread so thin it is transparent.
Hello, there must be something `rex`-specific with my query below, since it is not extracting the fields, while the regex works as expected when I test it on regex101 (see https://regex101.com/r/g0TMS4/1):

eventtype="my_event_type"
| rex field=responseElements.assumedRoleUser.arn /arn:aws:sts::(?<accountId>\d{12}):assumed_role\/(?<assumedRoled>.*)\/vault-oidc-(?<userId>\w+)-*./
| fields accountId, assumedRole, userId

Sample data that fails to match:

arn:aws:sts::984086324016:assumed-role/foo-admin-app/vault-oidc-foo-admin-app-1687793763-Qen4JHeRXYlB8Eoplkjs

Thanks, Alex.
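For what it's worth, the sample ARN contains `assumed-role` (hyphen) while the regex looks for `assumed_role` (underscore), and the capture group is named `assumedRoled` while the `fields` command lists `assumedRole`. A corrected sketch based only on those observations (untested against live data):

```
eventtype="my_event_type"
| rex field=responseElements.assumedRoleUser.arn "arn:aws:sts::(?<accountId>\d{12}):assumed-role/(?<assumedRole>[^/]+)/vault-oidc-(?<userId>.+)"
| fields accountId, assumedRole, userId
```

With a double-quoted rex expression, the forward slashes do not need escaping.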
I am still facing the same issue after disabling the data input from the Splunk Assist Supervisor app. Is there any other step you followed to bypass this error?
I made the suggested change to disable the data input, but I am still getting the same error. I do not see permission to disable the whole app, though. Can you please help, as I am stuck!
Hello, I have the following search:

index=wineventlog EventCode=4728 OR EventCode=4731 OR EventCode=4729 OR EventCode=4732 OR EventCode=4756 NOT src_user=*$
| rename src_user as admin, name as action
| table admin, Group_Name, user_name

This spits out output like this:

admin   Group_Name  user_name
adminx  GroupA      UserA
adminx  GroupB      UserA
adminx  GroupC      UserA
adminy  GroupD      UserB
adminy  GroupE      UserB
adminy  GroupF      UserC
adminy  GroupF      UserD

I'm trying to combine the rows into single entries that look like this:

admin   Group_Name            user_name
adminx  GroupA,GroupB,GroupC  UserA
adminy  GroupD,GroupE         UserB
adminy  GroupF                UserC,UserD

What would be the best way to achieve that?
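One common approach for this kind of row collapsing is `stats values(...)` followed by `mvjoin` to flatten the multivalue result into a comma-separated string. A minimal sketch, grouping by admin and user (the exact `by` clause depends on which column should collapse):

```
index=wineventlog (EventCode=4728 OR EventCode=4729 OR EventCode=4731 OR EventCode=4732 OR EventCode=4756) NOT src_user=*$
| rename src_user as admin
| stats values(Group_Name) as Group_Name by admin, user_name
| eval Group_Name=mvjoin(Group_Name, ",")
```

Note the desired output mixes two groupings: the first two rows collapse Group_Name per user, while the last row (UserC and UserD sharing GroupF) would instead need grouping by admin and Group_Name.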
Hi - I would like to join the results of two searches and sum the output. The searches:

index=test_index sourcetype="test_source" className=export | table message.totalExportedProfileCounter
index=test_index sourcetype="test_source" className=export | table message.exportedRecords

From the two searches above, I am looking to add message.totalExportedProfileCounter and message.exportedRecords. For a given call, only one of the two fields shows up. I am looking for message.totalExportedProfileCounter + message.exportedRecords.

Thanks in advance!
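Since only one of the two fields appears per event, `coalesce` can merge them into a single field before summing, avoiding a join entirely. A minimal sketch (field names containing dots need single quotes on the right side of eval; the output field name is a placeholder):

```
index=test_index sourcetype="test_source" className=export
| eval exported=coalesce('message.totalExportedProfileCounter', 'message.exportedRecords')
| stats sum(exported) as totalExported
```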
Hi, for these issues you've answered some of it. What I was reaching out to the community for was a search query for each of these issues. I've tried different types of search queries and can't seem to get anything to stick for each one. I'm trying to make a table for each issue, but if you think columns for those issues would be better, I'll experiment with that idea as well. I just can't seem to get anything to show up. I'm using them for note purposes and just need someone to show me how to build a search query for each.
In my search I have no lookup command. Does anyone know why I am getting this error?
Unfortunately, there are two different rules at play. One needs everything after the URL and the other only needs the URI which is a service name. That's what I've been struggling with.
I had the same error using a local instance of Cisco Cyber Vision and local Splunk Enterprise on Windows. I modified the file located at C:\Program Files\Splunk\etc\apps\TA-cisco_cybervision\bin\TA_cisco_cybervision_utils.py and changed VERIFY_SSL=True to VERIFY_SSL=False at the top of the file. The location of this file may differ depending on your installation. I saved the file and restarted Splunk Enterprise (Settings > Server Controls > Restart Splunk). Now when I try it again, it works.
Hello @shriram.m, Please fill out this Contact Sales form. https://www.appdynamics.com/company/contact-us
The desired extractions are inconsistent.  The first two want everything after the domain, but the third wants only the first segment after the domain.  Please specify the rules for extracting URI_ABR.
Hello, I am collecting data via the AWS add-on, and I have found that my timestamp recognition isn't working properly. I have a single AWS input using the [aws:s3:csv] sourcetype. This then uses transforms to update the sourcetype based on the file name the data comes from.

Config snips:

props.conf

[aws:s3:csv]
LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = true
BREAK_ONLY_BEFORE_DATE = true
FIELD_DELIMITER = ,
HEADER_FIELD_DELIMITER = ,
TRUNCATE = 20000
TRANSFORMS-awss3 = sourcetypechange:awss3-object_rolemap_audit,sourcetypechange:awss3-authz-audit-logs

[awss3:object_rolemap_audit]
TIME_FORMAT = %d %b %Y %H:%M:%S
LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = false
BREAK_ONLY_BEFORE_DATE = true
FIELD_DELIMITER = ,
HEADER_FIELD_DELIMITER = ,
FIELD_QUOTE = "
INDEXED_EXTRACTIONS = CSV
HEADER_FIELD_LINE_NUMBER = 1

[awss3:authz_audit]
TIME_FORMAT = %Y-%m-%d %H:%M:%S,%3Q
#TZ = GMT
FIELD_DELIMITER = ,
HEADER_FIELD_DELIMITER = ,
FIELD_QUOTE = "
INDEXED_EXTRACTIONS = CSV
HEADER_FIELD_LINE_NUMBER = 1

transforms.conf

[sourcetypechange:awss3-object_rolemap_audit]
SOURCE_KEY = MetaData:Source
REGEX = .*?object_rolemap_audit.csv
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::awss3:object_rolemap_audit

[sourcetypechange:awss3-authz-audit-logs]
SOURCE_KEY = MetaData:Source
REGEX = .*?authz-audit.csv
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::awss3:authz_audit

From what I can see, the timestamp is assigned at index time, even though I set timestamp recognition for each sourcetype. I believe timestamping happens on the initial pass into Splunk, before the transforms are applied.

How can I set timestamping via the initial sourcetype if there are multiple timestamp formats depending on the file? Splunk is not honoring the timestamp-recognition settings post-transform. Thanks for the help.
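If the root cause is indeed that timestamp extraction runs against the original sourcetype before the index-time sourcetype rewrite, one possible workaround is to assign the final sourcetype at input time instead of via transforms, so that each file's props (including TIME_FORMAT) apply on the first parse. A sketch assuming separate S3 inputs can be created per file pattern (the stanza names below are hypothetical):

```
# inputs.conf — one input per file pattern; stanza names are placeholders
[aws_s3://rolemap-audit-input]
sourcetype = awss3:object_rolemap_audit

[aws_s3://authz-audit-input]
sourcetype = awss3:authz_audit
```

Alternatively, if both file types carry their timestamp in a predictable position, TIME_PREFIX and TIME_FORMAT could be set directly on the base [aws:s3:csv] stanza so the initial pass recognizes it.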
Hi, practically speaking, you must have the admin role to share knowledge objects globally to all apps. r. Ismo
I tried to change the permission to the "All apps" option, but I don't see that option. Is there any other way to make my macro available to all apps?
My mistake, it should be max(_time). I've fixed it in the other reply.