All Topics
Attempting to exclude results based on UserId/City/Country from an inputlookup CSV file, but City/Country are not matching because iplocation runs afterwards. How can I fix this?

index="o365data" eventtype="user_logins" NOT [| inputlookup exemptusers | fields UserId,City,Country | format]
| spath input=AuditData
| eval User = coalesce(UserId, UserIds)
| search Workload=AzureActiveDirectory User!=Unknown User!="*@emailaddress.org"
| rename ExtendedProperties{}.* as *
| eval ExtendedProperties=mvzip(Name,Value, "=")
| mvexpand ExtendedProperties
| eval Key=mvindex(split(ExtendedProperties,"="),0), Value=mvindex(split(ExtendedProperties,"="),1)
| rename ExtendedProperties{}.* as *
| eval ExtendedProperties=mvzip(Name,Value, "=")
| mvexpand ExtendedProperties
| eval Key=mvindex(split(ExtendedProperties,"="),0), Value=mvindex(split(ExtendedProperties,"="),1)
| iplocation ClientIP
| eval checkfail1=if(like(LogonError, "FaultDomainRedirect"), "true", "false")
| eval checkfail2=if(like(LogonError, "UserAccountNoFound"), "true", "false")
| where Country != "Canada" AND checkfail1 != "true" AND checkfail2 != "true"
| stats values(ClientIP) as "ClientIP", values(City) as "City", values(Country) as "Country", latest(_time) as "Latest Timestamp" by User
| convert ctime("Latest Timestamp")
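One possible restructuring, sketched under the assumption that the City/Country values in exemptusers are meant to match what iplocation derives from ClientIP: run iplocation before filtering, then apply the lookup-based exclusion afterwards. The field, index, and lookup names below are taken from the question; the rename of UserId to User is an assumption to line the lookup up with the coalesced field:

index="o365data" eventtype="user_logins"
| spath input=AuditData
| eval User = coalesce(UserId, UserIds)
| iplocation ClientIP
| search NOT [| inputlookup exemptusers | fields UserId, City, Country | rename UserId as User | format]

The rest of the pipeline (the ExtendedProperties handling, checkfail evals, and stats) would follow unchanged; the key point is that the exclusion runs only after City and Country exist on the events.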
In the timeline summary, only Activities tracking is shown. As declared in the spec, the library should also track Fragments, but it doesn't happen. Library version and Gradle plugin: 20.12.1. Is this an issue in the library implementation or a wrong configuration? Any tips on how to track Fragments properly? ^ Edited by @Ryan.Paredez for clarity and readability.
Hi, I have an issue in Forwarder Management: it says there are errors installing a specific app on some clients, while for other clients the app has been installed without errors. I uninstalled the app, but the errors still exist. What should I do? Thanks
Hello, I currently have 2 queries that produce 2 alert emails, each sending a separate CSV file. The 2 files have the same fields/columns. I want to merge them into 1 CSV file, ideally in the same sheet when opened in Excel. Example:

Table 1

MERCHANT_ID | Sales | TYPE
Merchant A  | 14    | Domestic
Merchant B  | 5     | Domestic

Table 2

MERCHANT_ID | Sales | TYPE
Merchant C  | 2     | Foreign
Merchant D  | 52    | Foreign

The result would be:

MERCHANT_ID | Sales | TYPE
Merchant A  | 14    | Domestic
Merchant B  | 5     | Domestic
Merchant C  | 2     | Foreign
Merchant D  | 52    | Foreign
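A sketch of one way to get a single CSV from a single alert, assuming the two existing searches can be dropped in where the placeholders sit (the <first query> and <second query> names are placeholders, not real searches): run one as the base search and append the other, so the combined result set is attached as one CSV:

<first query>
| table MERCHANT_ID, Sales, TYPE
| append [ search <second query> | table MERCHANT_ID, Sales, TYPE ]

Because both tables share the same columns, the appended rows simply stack under the first set, which matches the desired result above.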
I can see there is a slight difference (from 1 to 10 seconds) between _time and the timestamp field in the raw data. Is this expected, or should they match exactly? Please note that the difference is only a matter of seconds. Is there some way to fix this? What could be the reason _time does not show the exact time given in the timestamp field?
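If the gap comes from Splunk guessing the timestamp location in the event rather than parsing the intended timestamp field, pointing timestamp extraction at that field usually makes _time match exactly. A hedged props.conf sketch; the sourcetype name and the TIME_PREFIX/TIME_FORMAT values here are assumptions that must be adapted to the actual event layout:

[your_sourcetype]
TIME_PREFIX = "timestamp":\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N%z
MAX_TIMESTAMP_LOOKAHEAD = 64

This belongs on the first full Splunk instance that parses the data (indexer or heavy forwarder), and only affects events indexed after the change.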
Issue importing pandas module into Splunk Add-on Builder

import re
import sys
import os

ta_name = 'TA-urlparse'
ta_lib_name = '/usr/local/lib/python3.7/site-packages'
pattern = re.compile(r"[\\/]etc[\\/]apps[\\/][^\\/]+[\\/]bin[\\/]?$")
new_paths = [path for path in sys.path if not pattern.search(path) or ta_name in path]
new_paths.insert(0, os.path.sep.join([ta_lib_name]))
sys.path = new_paths

import numpy as np
import pandas
import requests
import ipaddress
import time
from datetime import datetime

The Error:

Traceback (most recent call last):
  File "/apps/splunk/etc/apps/TA-urlparse/bin/urltest_1615256627_866.py", line 14, in <module>
    import input_module_urltest_1615256627_866 as input_module
  File "/apps/splunk/etc/apps/TA-urlparse/bin/input_module_urltest_1615256627_866.py", line 16, in <module>
    import pandas
  File "/usr/local/lib/python3.7/site-packages/pandas/__init__.py", line 52, in <module>
    from pandas.core.api import (
  File "/usr/local/lib/python3.7/site-packages/pandas/core/api.py", line 29, in <module>
    from pandas.core.groupby import Grouper, NamedAgg
  File "/usr/local/lib/python3.7/site-packages/pandas/core/groupby/__init__.py", line 1, in <module>
    from pandas.core.groupby.generic import DataFrameGroupBy, NamedAgg, SeriesGroupBy
  File "/usr/local/lib/python3.7/site-packages/pandas/core/groupby/generic.py", line 57, in <module>
    from pandas.core.aggregation import (
  File "/usr/local/lib/python3.7/site-packages/pandas/core/aggregation.py", line 27, in <module>
    from pandas.core.series import FrameOrSeriesUnion, Series
  File "/usr/local/lib/python3.7/site-packages/pandas/core/series.py", line 68, in <module>
    from pandas.core import algorithms, base, generic, nanops, ops
  File "/usr/local/lib/python3.7/site-packages/pandas/core/generic.py", line 102, in <module>
    from pandas.io.formats import format as fmt
  File "/usr/local/lib/python3.7/site-packages/pandas/io/formats/format.py", line 71, in <module>
    from pandas.io.common import stringify_path
  File "/usr/local/lib/python3.7/site-packages/pandas/io/common.py", line 7, in <module>
    import mmap
ModuleNotFoundError: No module named 'mmap'

"Extensive" Troubleshooting:

/usr/bin/python3 -m pip install mmap
ERROR: Could not find a version that satisfies the requirement mmap
ERROR: No matching distribution found for mmap

Per https://community.splunk.com/t5/All-Apps-and-Add-ons/how-can-I-enable-MMAP-caching-in-Google-GeoIP-of-MaxMind/m-p/33754 I copied mmap.so and am now getting the following (the same pandas import chain as above, now ending with):

  File "/usr/local/lib/python3.7/site-packages/pandas/io/common.py", line 7, in <module>
    import mmap
ImportError: /apps/splunk/lib/python3.7/mmap.so: undefined symbol: PL_mmap_page_size

Any ideas?
Hi, I have a weird requirement where I have to count the distinct values of a multivalue field. I have an XML document where a particular node can appear one or more times, and there are many nodes like this. How do I count the distinct number of nodes per request ID? Basically I am looking for something like this:

requestID | node           | count
12345     | networkpremise | 2
          | networkdetails | 3
          | mysubscription | 2
3456778   | networkpremise | 6
          | networkdetails | 2
          | mysubscription | 4

And so on. Not exactly like the above — if there is some other interpretation that gives a better view, please let me know. I've looked into some posts like this one, but the solution has not worked for me: https://community.splunk.com/t5/Splunk-Search/Can-I-get-a-count-of-distinct-values-in-multivalue-field/m-p/59488 This is the query I was trying, based on the reference above:

index=test_prod MyServiceGateway "SoapMessage Incoming"
| rex field=_raw "\<(?<nodes>[^\>]+)\>\s+?\<action\>" max_match=0
| rex field=_raw "\>(?<requestID>[^\<]+)\<\/ns:requestID>" max_match=0
| table requestID nodes
| untable requestID field value
| makemv delim="," value
| mvexpand value
| stats count by requestID field value
| eval pair=value." (".count.")"
| stats list(pair) as values by requestID field

Let me know if someone can help with this.
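One sketch that yields a per-request, per-node count like the table above, reusing the rex extractions from the question; since nodes is a multivalue field per event, mvexpand splits it into one row per occurrence before counting (this assumes each node occurrence lands as one value in the nodes multivalue field):

index=test_prod MyServiceGateway "SoapMessage Incoming"
| rex field=_raw "\<(?<nodes>[^\>]+)\>\s+?\<action\>" max_match=0
| rex field=_raw "\>(?<requestID>[^\<]+)\<\/ns:requestID>" max_match=0
| mvexpand nodes
| stats count by requestID, nodes

For a count of distinct node names per request instead, the last line would become | stats dc(nodes) by requestID.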
Hi Splunkers. I'm wondering how people out there are deploying the UF to Mac fleets. We are trying to deploy the Splunk UF to macOS machines in an automated fashion, i.e. scripted so that it requires no input from the user. If we try the install by extracting the .tgz, Big Sur complains that it cannot confirm the Splunk executables are legitimate and blocks the install. Likewise, when we deploy via the .dmg (scripted), partway through the install Splunk's Little Helper pops up. Although we can click through this, it breaks the unattended workflow we are trying to use for the installation. Note that we are performing any configuration items needed from the command line, such as:
- Setting admin credentials using the user-seed file.
- Starting Splunk for the first time with the --accept-license option.
- Setting Splunk to start on boot.
This means we don't need (or want) Splunk's Little Helper to run; we just need the install to finish without any interaction with the user. Is there any way of running the install without the helper popping up? Alternatively, is there another way people are doing scripted UF installs on Macs that requires no user interaction/click-throughs and no Little Helper popping up during install? I've not found any special options to prevent this. The version of the Splunk UF being installed is 7.2.5.1; macOS is Big Sur. Thanks in advance.
Hello, OKTA Add-on users, I was wondering if you have figures for event size per user per day for the OKTA log, user, group, and app inputs. I am trying to get a baseline for sizing. We're currently in a pilot, and the pilot data seem small. I am looking for a simple answer like 5 MB per user per day.
I need an example script for sending buckets to Google Cloud Storage when they reach the frozen state. Due to the high volume of frozen buckets, making a mount in the operating system is not an option; I need to upload them to Google Cloud Storage on demand.
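A minimal sketch of a coldToFrozenScript that pushes each frozen bucket to GCS with gsutil, assuming gsutil is installed and authenticated on the indexer. The bucket name my-frozen-buckets, the destination prefix, and the script filename are all assumptions. Splunk invokes the script with the bucket directory as its only argument, and deletes the bucket only when the script exits 0, so any copy failure must exit non-zero:

```python
#!/usr/bin/env python3
"""Sketch of a coldToFrozenScript that copies a frozen bucket to GCS."""
import os
import subprocess
import sys

# Assumed destination; replace with your real GCS bucket and prefix.
GCS_PREFIX = "gs://my-frozen-buckets/splunk-frozen"

def build_copy_command(bucket_dir, gcs_prefix=GCS_PREFIX):
    """Build the gsutil command that recursively copies one bucket directory."""
    dest = gcs_prefix + "/" + os.path.basename(bucket_dir.rstrip("/"))
    # -m parallelizes the copy, which matters at high freeze volume.
    return ["gsutil", "-m", "cp", "-r", bucket_dir, dest]

if __name__ == "__main__" and len(sys.argv) == 2:
    # check_call raises (and the script exits non-zero) on any copy failure,
    # so Splunk will not delete a bucket that was never uploaded.
    subprocess.check_call(build_copy_command(sys.argv[1]))
```

It would be wired up per index in indexes.conf via coldToFrozenScript, e.g. coldToFrozenScript = "$SPLUNK_HOME/bin/python3" "$SPLUNK_HOME/bin/coldToFrozenGcs.py" (paths are assumptions).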
Hello, I am trying to use a regex to search for a hostname that begins with WB, is 13 characters long, has "I" as character number 10, and ends with three digits from 001 to 099. Thanks.
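Reading the constraints literally (2 + 7 + 1 + 3 = 13 characters), one candidate pattern is ^WB.{7}I0(0[1-9]|[1-9]\d)$, which in a search could be applied as | regex host="^WB.{7}I0(0[1-9]|[1-9]\d)$" (the host field name is an assumption). A quick sanity check of the pattern outside Splunk:

```python
import re

# Starts with WB, 13 chars total, char 10 is "I", last three chars are 001-099.
PATTERN = re.compile(r"^WB.{7}I0(?:0[1-9]|[1-9]\d)$")

assert PATTERN.match("WBSERVER1I001")       # valid: 13 chars, I at position 10
assert not PATTERN.match("WBSERVER1I000")   # 000 is outside 001-099
assert not PATTERN.match("WBSERVER1X001")   # character 10 is not I
assert not PATTERN.match("WBSERVERS1I001")  # 14 characters, too long
```

The leading 0 is written explicitly so 100-999 are rejected, and the alternation excludes 000.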
I created a TA which works fine in our Splunk test environment. However, it seems that none of the configuration from props.conf and transforms.conf is applied in the production environment. There are events in XML format, and the KV_MODE = xml setting in props.conf makes field extraction work just fine in test, yet the same setup doesn't produce any field extraction in production. I am running the search in verbose mode. All TA objects are set to global with read permissions for everyone, i.e. in default.meta:

[]
access = read : [ * ], write : [ admin ]
export = system

btool props check shows that all the objects are applied as expected. How do I troubleshoot this issue? Where should I start?
I am working on the Splunk Fundamentals lab 4 module. I have added data as administrator, and when I log in as a power user, I am unable to view the data summary. Can anyone help me understand why I am not able to see the data summary?
Hello, good day to you. We are experiencing an issue wherein, when our Splunk instance is accessed from outside its host (a Windows server), it logs us in but immediately kicks us out regardless of the user account, and we receive this error message on the login screen: "Your session has expired. Log in to return to the system". Additional info: we are using a proxy server to pass traffic destined for Splunk server IP ADDRESS2 to Splunk server IP ADDRESS1. Please advise. Thank you.
Hi, I need help with some data model acceleration issues in CIM. The problem is that I accelerated a data model with a 1y summary range, but after rebuilding it, I can't get results older than the last month, neither in pivot nor with tstats from the data model. The datamodel command works fine with older events. I tried to tweak the advanced configuration in the data model acceleration settings, but nothing changed. The query using tstats:

| tstats prestats=true local=false summariesonly=true allow_old_summaries=true count from datamodel=Network_Traffic.All_Traffic by _time,All_Traffic.transport span=10m
| timechart minspan=10m count by All_Traffic.transport

Am I missing some configuration that forces the acceleration date range to the last month? I appreciate any insight into this issue.

EDIT: these are the configuration settings:

allow_old_summaries = true
allow_skew = 0
backfill_time = -1y
cron_schedule = 1-56/5 * * * *
earliest_time = -1y
hunk.compression_codec = -
hunk.dfs_block_size = 0
hunk.file_format = -
manual_rebuilds = false
max_concurrent = 9
max_time = 3600
poll_buckets_until_maxtime = true
schedule_priority = highest
workload_pool = -

Update 1: We noticed that several acceleration maintenance searches were skipped, so we optimized the data model acceleration by specifying the index used, made the scheduled maintenance searches run at different times so they don't overlap, and reduced the time range of the data models.
How do I create a custom license report covering selected indexes, including a graphical view? Is the license master a good server to do it on? Thank you.
Hello, I have two indexes in which I need to compare the source IPs, and if an IP appears in both, show a message like true or false. This is what I did, but I suspect a lookup is needed and I don't know how to use it well; could you give me a hand?

index=firewall
| eval Type_Cnx = if(dst_port="1446" OR dst_port="1444", "B2B", "B2C")
| stats count by Type_Cnx src
| fields Type_Cnx src
| appendcols [ search index=linux UserIp description="my message*"]
| eval a= if(src==UserIp, "true", "false")
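Note that appendcols pairs rows by position rather than by IP, so src and UserIp will rarely line up. A sketch of a position-independent comparison using stats over both indexes at once (field and index names are taken from the question; no lookup is required for this approach):

(index=firewall) OR (index=linux UserIp description="my message*")
| eval ip = coalesce(src, UserIp)
| stats dc(index) as index_count, values(index) as indexes by ip
| eval match = if(index_count > 1, "true", "false")

Each ip row shows "true" when the address was seen in both indexes and "false" when it appeared in only one.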
I'm trying to use a lookup and some search results to display a table that includes both where something matched in the lookup, and where no matches were found in the lookup. Sample search syntax would be: [very long search, ends up in table with fieldA, fieldC] | lookup MyLookup fieldA output fieldB | chart values(fieldA) by fieldB, fieldC This shows the values of field A, split by rows for the matching fieldB (pulled from lookup) and split by columns for fieldC. I want to see all of the fieldB rows in the lookup added, even if there was no matching fieldA. Either blank or with a default value of 'None Found' or something in the chart results, since knowing something is 'missing' is important. How can I do this? 
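One sketch: after the lookup, append every fieldB from MyLookup itself so each lookup row is represented, then fill the gaps, so fieldB values that never matched show 'None Found' under fieldA (placeholders and field names are taken from the question). Caveat: fieldB values that did match will also pick up a 'None Found' entry from their appended row, which may be acceptable or may need a further filtering step:

[very long search, ends up in table with fieldA, fieldC]
| lookup MyLookup fieldA output fieldB
| append [| inputlookup MyLookup | fields fieldB]
| fillnull value="None Found" fieldA fieldC
| chart values(fieldA) by fieldB, fieldC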
Hi, I wanted to know the best way to upgrade from Splunk 7342 running on Windows Server 2012 R2 to the latest version of Splunk for Windows running on Windows Server 2016 or 2019.
Please, can anyone help? I need to create a dashboard based on a lookup file which will be updated every 30 minutes, and the dashboard should have filters such as customer name. Thanks.