All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hi everyone, I installed Splunk Enterprise from the MSI file on my own Windows laptop. For practice, I would also like to install Splunk in a Linux environment on the same machine. As we know, we can use PuTTY or third-party tools to run Linux on Windows, and at that point we would need to install Splunk again. Is this possible? Or do I need to uninstall the Windows (.exe) installation of Splunk in order to install it through Linux?
Of the data models provided with Enterprise Security, one accelerated data model suddenly has a much higher run time than usual. Any suggestions as to what the issue could be, or where in the internal logs we should look to identify the potential root cause?
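A minimal sketch for profiling the acceleration jobs in the scheduler logs (hedged: this assumes the default _ACCELERATE_* naming Splunk uses for data model acceleration searches):

index=_internal sourcetype=scheduler savedsearch_name="_ACCELERATE_*"
| stats avg(run_time) AS avg_runtime, max(run_time) AS max_runtime, count BY savedsearch_name
| sort - max_runtime

Comparing avg_runtime against max_runtime per job can show which data model regressed and, with a timechart instead of stats, when the regression started.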
Hi, I want to see my data in the ES dashboard Security Domains -> Endpoint -> Endpoint Changes. I created the following things:

props.conf with CIM-compliant field aliases.

eventtypes.conf:
[MyEventType]
search = index=MyIndex sourcetype=MySourcetype

tags.conf:
[eventtype=MyEventType]
change = enabled
endpoint = enabled

I can successfully search the events with tag=change and tag=endpoint. I can also successfully search the data with the data model constraint:

`cim_Change_indexes` tag=change NOT (object_category=file OR object_category=directory OR object_category=registry) tag=endpoint

However, the dashboard stays empty. When I manually execute one of the dashboard searches,

| `tstats` append=T count from datamodel=Change.All_Changes where nodename="All_Changes.Endpoint_Changes"

I get no results. When I change nodename="All_Changes.Endpoint_Changes" to nodename="All_Changes", I see my events. So the question is: what do I need to do to get my events into the node All_Changes.Endpoint_Changes?
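One hedged thing to check (an assumption, not a confirmed fix): the ES `tstats` macro typically reads the accelerated summaries, which are rebuilt on a schedule, so events tagged after the last acceleration run may be missing from the child node. Running plain tstats with summariesonly=f searches the raw events and confirms whether the node constraint itself matches:

| tstats summariesonly=f count from datamodel=Change.All_Changes where nodename="All_Changes.Endpoint_Changes"

If this returns results while the macro version does not, rebuilding the acceleration (or waiting for the next scheduled run) should populate the dashboard.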
Hey there Splunk heroes,

Story/Background: There is a variable called "src_ip" in my correlation search, which covers more than 5,000 IP addresses. What I am doing is matching the IP addresses that should not be in particular CIDR ranges using the cidrmatch function, which works perfectly. It looks something like this:

| where (NOT cidrmatch("34.20.223.128/25",src_ip) AND NOT cidrmatch("13.9.22.0/25",src_ip) AND NOT cidrmatch("13.56.21.18/25",src_ip) AND NOT cidrmatch("35.17.29.0/26",src_ip) AND NOT(many-more,src_ip))

SOLUTION REQUIRED: Now, coming to the part where I need your help: I want to simplify this.

SOLUTIONS TRIED:

Part 1: The solutions I found on the forum suggest creating a lookup table and searching through it. So I created a lookup table named "match_cidr.csv". This CSV/lookup file consists of more than 100 CIDR blocks in a field called cidr_match_src_ip. There is a tstats command in the search as well, and I tried querying the lookup like this:

[ | inputlookup match_cidr.csv | where src_ip != cidr_match_src_ip ] ==> this won't work, since I am comparing a CIDR block to an IP address directly.

where NOT cidrmatch([| inputlookup match_cidr.csv], src) ==> tried this as well.

What can I use here, or what else can you recommend? Feel free to ask me any questions if my message isn't clear.
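A minimal sketch of the usual lookup-based approach, assuming you create a lookup definition named match_cidr over match_cidr.csv with CIDR matching enabled on the cidr_match_src_ip field (Settings -> Lookups -> Lookup definitions -> Advanced options, or in transforms.conf):

[match_cidr]
filename = match_cidr.csv
match_type = CIDR(cidr_match_src_ip)

The search then compares each src_ip against every CIDR block in one pass and keeps only the non-matching events:

| lookup match_cidr cidr_match_src_ip AS src_ip OUTPUT cidr_match_src_ip AS matched_block
| where isnull(matched_block)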
Hi, I want to search for the string xyzetc\";0, (this is my exact string). I am unable to search for this exact pattern; I get an "Unbalanced quotes" error because the string itself contains a quote.
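A minimal sketch, assuming the raw event literally contains xyzetc\";0, (a backslash followed by a double quote): inside an SPL quoted string, escape the backslash as \\ and the embedded quote as \", so the search term becomes:

index=your_index "xyzetc\\\";0,"

(index=your_index is a placeholder for wherever the data lives.)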
I want to use a subsearch to get the start and end time of the newest transaction (here, a bot session). The subsearch alone gives me:

starttime=09/01/2021:17:28:49
endtime=09/01/2021:19:42:50

At first I used the subsearch without strftime(), but Splunk said earliest/latest can't parse epoch time and that it wants the format %m/%d/%Y:%H:%M:%S. That brings me to my current search, where Splunk says "Invalid value "starttime" for time term 'earliest'". When I paste in the results the subsearch produces on its own, it works. How can I make use of the start/end time? Or is there a better method to limit my main search to the newest bot session? My search (not the final search, but I want to work with the events from a specific session):

index="fishingbot"
  [search index=fishingbot
  | transaction startswith="Anmeldung erfolgreich!" endswith="deaktiviert!"
  | eval endtime=strftime((_time+duration), "%m/%d/%Y:%H:%M:%S")
  | eval starttime=strftime(_time, "%m/%d/%Y:%H:%M:%S")
  | top starttime endtime limit=1
  | table starttime endtime] earliest=starttime latest=endtime
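A hedged sketch of one common fix (untested against your data): if the subsearch returns fields literally named earliest and latest, Splunk substitutes their values into the outer search as time modifiers, so the trailing earliest=starttime latest=endtime is not needed, and epoch values are normally accepted when passed this way:

index="fishingbot"
  [search index=fishingbot
  | transaction startswith="Anmeldung erfolgreich!" endswith="deaktiviert!"
  | head 1
  | eval earliest=_time, latest=_time+duration
  | return earliest latest]

head 1 assumes transaction emits the newest session first (its usual behavior on time-descending events); if not, sort on _time first.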
So, I have multiple IP addresses, and I want to combine them, using regex or otherwise, and compare them to the variable. For example, this is my existing query:

| search NOT src IN (10.161.5.50, 10.161.5.51, 10.161.5.52, 10.161.5.53, 10.161.10.20, 192.168.1.120, 192.168.1.130)

This gave me 15 matched results. What I tried in order to shorten it is:

| search NOT src IN ("10.161.5.5[0-3]", 10.161.10.20, 192.168.1.120, 192.168.1.130)

Doing this increased the matched results to 30. Why is this happening, and what can I do to prevent it? Any solutions?
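A hedged explanation and sketch: the IN operator supports * wildcards but not regex character classes, so "10.161.5.5[0-3]" is treated as a literal string that matches nothing; the four .50–.53 addresses therefore stop being excluded, and the result count grows. One way to keep the range compact is to split the exclusion into a regex plus an explicit list:

| regex src!="^10\.161\.5\.5[0-3]$"
| search NOT src IN (10.161.10.20, 192.168.1.120, 192.168.1.130)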
Hi,

[screenshot: current pie chart]

In the pie chart above, the highlighted cities' details are not displayed; I have to mouse over them to see the values. Please let me know how to also show the hidden values in the pie chart.

Regards,
Madhusri R
I'm trying to calculate percentages based on the number of events per group. There are actually a lot of events, so I can't use a method like count(eval(...)). The summary of events is as follows:

color
------
green
red
greed
greed
red

Here's my search so far:

index="test" sourcetype="csv"
| stats count as numColor by color
| eval total=5
| eval percent=printf("%.2f", (numColor/total)*100)
| sort num(percent)
| table color numColor percent

How do I replace the hardcoded value of "total" with a count() function or some other method? Any help would be appreciated.
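A minimal sketch using eventstats, which adds the grand total to every row after the stats so that no hardcoded value is needed:

index="test" sourcetype="csv"
| stats count as numColor by color
| eventstats sum(numColor) as total
| eval percent=printf("%.2f", (numColor/total)*100)
| sort num(percent)
| table color numColor percent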
Hello, I have a table with 3 columns: 1 with strings and 2 with numbers. Is there a way to sort the table from the highest number to the lowest across all the values in the table? For example:

[screenshot: part of the table]

This is part of my table, and I want to sort by the numbers in "priority" and "silverpop", regardless of which column a value is in, just to see the row with the highest value first.
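A minimal sketch, assuming the two numeric columns are named priority and silverpop: compute a row-wise maximum with eval, sort on it descending, then drop the helper field:

| eval row_max=max(priority, silverpop)
| sort - row_max
| fields - row_max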
Hi guys, I would like to check whether it's possible to prevent some data from showing up in searches. Below is what I want to prevent from showing up:

=============
Aug 31 23:59:43 a.b.c.d hey_audit: INFO 2021-08-31 23:59:43 12,345 HelloType=External |userName=zzz| |xxxapiversion=1.0| |httpMethod=POST| | restEndPoint=/v1/insights_transport/transfer_data_to_multicluster| |entityUuid=| |queryParams=| |payload=
=============

I understand I can use regex or eval. Can someone show me how it's done?
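A hedged sketch of the index-time approach, which drops these events before they are indexed (the sourcetype name my_sourcetype is a placeholder, and the REGEX is illustrative; tune both to your data):

props.conf:
[my_sourcetype]
TRANSFORMS-drop_multicluster = drop_multicluster_transfer

transforms.conf:
[drop_multicluster_transfer]
REGEX = restEndPoint=/v1/insights_transport/transfer_data_to_multicluster
DEST_KEY = queue
FORMAT = nullQueue

If the events should stay indexed and only be hidden from a particular search, a simple exclusion term also works:

index=your_index NOT "transfer_data_to_multicluster"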
Hi all, kindly let me know if there is any document or link that provides the steps to onboard OpenTelemetry data into Splunk Enterprise.

1) Do we need to buy Splunk Observability Cloud in order to monitor/analyze OpenTelemetry data?
2) What are the steps or procedure we need to follow to onboard OpenTelemetry data into Splunk? Kindly provide a link to the documentation.
3) I had gone through this link but am getting confused about which components are needed to perform this task: https://docs.splunk.com/Observability/get-started/welcome.html#nav-Welcome-to-Splunk-Observability-Cloud

Thanks in advance.
Hello,

I need to onboard Linux and Windows hosts to ITSI.

1) I have installed the UF on Linux along with the Unix and Splunk Infrastructure add-ons, and configured them to connect to Splunk.
2) I am getting data in Splunk, but nothing is showing in ITSI.

Thanks,
Lalit
We have the Splunk DB Connect add-on connected to a SQL Server, and all connections are successful. We monitor the database activity and see queries in sleeping mode; the DBAs mentioned that the connection should be closed after a query completes. Is this possible?
Hi all, we have an existing index cluster which was installed with version 6.x and gradually upgraded to version 8.1.3. In the process of adding two new heavy forwarders, we cannot get the HFs to communicate properly with the index cluster. The HFs are fresh installations using the latest 8.1.3 package. We get the error shown in the subject. Since we do not use SSL, we are a bit lost with regard to this message.
Hello, I have to search events across many sourcetypes whose names begin with "ezop:web", so I use a trailing wildcard:

index="tutu" sourcetype=ezop:web*

Is this good practice, or is it better to do something like this:

(sourcetype=ezop:web1 OR sourcetype=ezop:web2 OR sourcetype=ezop:web3)

Or perhaps something else? Thanks
Hi all, I need to integrate Trend Micro Portable Security (an antivirus security program on a portable USB drive) with Splunk. However, the add-ons available for Splunk are for Trend Micro Deep Security and Trend Micro Deep Discovery. Will the Trend Micro Deep Security add-on work for Trend Micro Portable Security logs, or is there another way to integrate Trend Micro Portable Security logs with Splunk? A detailed answer and suitable links would be appreciated. Thanks
My query is:

index="stage*" source="*record service*"
| eval type=case(like(message, "%successful generated account%"),"Success Accounts", like(message, "%Granting failed Accounts%"),"Granting failed Accounts", like(message, "%Inbound setup failed accounts%"),"Inbound failed Accounts")
| stats count as Results by type

I am getting the result as:

type                         Results
Success Accounts             10
Granting failed Accounts     20

I am unable to get a row for the string "Inbound failed Accounts" because its count is zero. I need the output as:

type                         Results
Success Accounts             10
Granting failed Accounts     20
Inbound failed Accounts      0

Please help me with a query that displays the strings with a zero count as well.
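A hedged sketch of one common approach: append a zero-count row for every expected type, then take the maximum per type so real counts win over the placeholders:

index="stage*" source="*record service*"
| eval type=case(like(message, "%successful generated account%"),"Success Accounts", like(message, "%Granting failed Accounts%"),"Granting failed Accounts", like(message, "%Inbound setup failed accounts%"),"Inbound failed Accounts")
| stats count as Results by type
| append
    [| makeresults
    | eval type=split("Success Accounts,Granting failed Accounts,Inbound failed Accounts", ",")
    | mvexpand type
    | eval Results=0
    | fields type Results]
| stats max(Results) as Results by type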
Hello, we get this error and I'm not entirely sure how we can resolve it. It looks like a timeout issue:

2021-09-01 14:07:44,983 level=ERROR pid=12315 tid=Thread-20 logger=splunk_ta_o365.modinputs.management_activity pos=management_activity.py:do:159 | datainput=b'at_rbi_management_activity_sharepoint' start_time=1630505194 | message="Failed to retrieve content blob." content_id=b'20210901140215154005183$20210901140215154005183$audit_sharepoint$Audit_SharePoint$emea0023'
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/3rdparty/urllib3/connectionpool.py", line 383, in _make_request
    six.raise_from(e, None)
  File "<string>", line 2, in raise_from
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/3rdparty/urllib3/connectionpool.py", line 379, in _make_request
    httplib_response = conn.getresponse()
  File "/opt/splunk/lib/python3.7/http/client.py", line 1369, in getresponse
    response.begin()
  File "/opt/splunk/lib/python3.7/http/client.py", line 310, in begin
    version, status, reason = self._read_status()
  File "/opt/splunk/lib/python3.7/http/client.py", line 271, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/opt/splunk/lib/python3.7/socket.py", line 589, in readinto
    return self._sock.recv_into(b)
  File "/opt/splunk/lib/python3.7/ssl.py", line 1071, in recv_into
    return self.read(nbytes, buffer)
  File "/opt/splunk/lib/python3.7/ssl.py", line 929, in read
    return self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/3rdparty/requests/adapters.py", line 449, in send
    timeout=timeout
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/3rdparty/urllib3/connectionpool.py", line 637, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/3rdparty/urllib3/util/retry.py", line 368, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/3rdparty/urllib3/packages/six.py", line 686, in reraise
    raise value
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/3rdparty/urllib3/connectionpool.py", line 599, in urlopen
    chunked=chunked)
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/3rdparty/urllib3/connectionpool.py", line 385, in _make_request
    self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/3rdparty/urllib3/connectionpool.py", line 305, in _raise_timeout
    raise ReadTimeoutError(self, url, "Read timed out. (read timeout=%s)" % timeout_value)
urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='manage.office.com', port=443): Read timed out. (read timeout=60)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/splunk_ta_o365/modinputs/management_activity.py", line 153, in do
    response = self._subscription.retrieve_content_blob(session, content.uri)
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/splunk_ta_o365/common/portal.py", line 166, in retrieve_content_blob
    return self._request(session, 'GET', url)
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/splunk_ta_o365/common/portal.py", line 179, in _request
    response = session.request(method, url, params=params, timeout=self._request_timeout)
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/3rdparty/requests/sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/3rdparty/requests/sessions.py", line 646, in send
    r = adapter.send(request, **kwargs)
  File "/opt/splunk/etc/apps/splunk_ta_o365/bin/3rdparty/requests/adapters.py", line 529, in send
    raise ReadTimeout(e, request=request)
requests.exceptions.ReadTimeout: HTTPSConnectionPool(host='manage.office.com', port=443): Read timed out. (read timeout=60)
Before I go and re-invent the wheel, has anyone looked at indexing the results from running an inspection with the CLI version of splunk-appinspect? The --output-file is, by default, JSON and has a start_time field in it which could be used for the event's _time. And if you run it with --generate-feedback, you get a YAML file which can be converted to JSON using the yq command. The resulting JSON file also has a start_time field in it which could be used for the event's _time. As for a use case... I don't know (yet). At this stage, it's really just a "wouldn't it be cool to ..."
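For anyone who wants to try, here is a minimal props.conf sketch for ingesting the JSON report, one run per event (the sourcetype name appinspect:report is a made-up placeholder, and it assumes start_time parses with Splunk's default timestamp detection; add a TIME_FORMAT if your reports use a format it misses):

[appinspect:report]
# parse the whole report as structured JSON
INDEXED_EXTRACTIONS = json
# avoid double extraction at search time
KV_MODE = none
TIMESTAMP_FIELDS = start_time
# reports can be large; don't truncate the event
TRUNCATE = 0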