Hi Splunkers, we have a SH with Splunk Enterprise Security installed on it. It is a standalone instance that queries some indexer clusters. We are in the process of configuring it, and we loaded some .csv files for Asset and Identity management. Once we uploaded those files and ran a search, we got this situation: the search is executed, but errors about the inability to load the lookups that store merged asset and identity data in Splunk Enterprise Security are collected. The error syntax is the following:

[<indexers listed here>] Could not load lookup=LOOKUP-zu-asset_lookup_by_str-_risk_system
[<indexers listed here>] Could not load lookup=LOOKUP-zu-asset_lookup_by_str-dest
[<indexers listed here>] Could not load lookup=LOOKUP-zu-asset_lookup_by_str-dvc
[<indexers listed here>] Could not load lookup=LOOKUP-zu-asset_lookup_by_str-src
[<indexers listed here>] Could not load lookup=LOOKUP-zv-asset_lookup_by_cidr-_risk_system
[<indexers listed here>] Could not load lookup=LOOKUP-zv-asset_lookup_by_cidr-dest
[<indexers listed here>] Could not load lookup=LOOKUP-zv-asset_lookup_by_cidr-dvc
[<indexers listed here>] Could not load lookup=LOOKUP-zv-asset_lookup_by_cidr-src

First thing I thought: OK, this is probably a permission issue. However, even when I execute the search with the admin user that loaded the .csv files into the Asset and Identity inventory, I get the same error. I can add that we modified some out-of-the-box data models to add some fields needed by our SOC. What could be the root cause?
Try something like this:

| timechart span=1d sum(abc) by xyz
| where strftime(_time,"%w") = 1
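As a quick sanity check on the weekday logic (a Python sketch, not SPL itself): Python's strftime shares SPL's %w convention, where Sunday is "0" and Monday is "1", so comparing against 1 keeps only Mondays.

```python
from datetime import datetime

# %w formats the weekday as a decimal string, Sunday = "0" through Saturday = "6",
# the same convention SPL's strftime(_time, "%w") uses; "1" selects Mondays.
week = [datetime(2024, 5, day) for day in range(20, 27)]  # 2024-05-20 is a Monday
mondays = [d.strftime("%Y-%m-%d") for d in week if d.strftime("%w") == "1"]
print(mondays)  # → ['2024-05-20']
```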
The session is only present 3 times in the hour; the fourth one at 13:00 is in the next hour. Anyway, assuming you still want to count different sessions for the same user separately, you can do the stats twice:

| bin _time span=1h
| stats count by _time, userName, sessionKey
| stats count by _time, userName

Depending on what count you actually want, you could also do this:

| bin _time span=1h
| stats count by _time, userName, sessionKey
| stats count by _time
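To see why the two-stage stats yields one session per hour here, the same collapse can be sketched in plain Python (hypothetical events mirroring the question's sample data):

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical sample: one session reported four times, with the last
# report falling into the next hour (as in the question).
events = [
    ("2024-05-20T12:00:00Z", "fred", "a0b360d9"),
    ("2024-05-20T12:30:00Z", "fred", "a0b360d9"),
    ("2024-05-20T12:45:00Z", "fred", "a0b360d9"),
    ("2024-05-20T13:00:00Z", "fred", "a0b360d9"),
]

# Stage 1 -- like "bin _time span=1h | stats count by _time, userName, sessionKey":
# collapse repeated reports of the same session within one hour.
per_session = {
    (datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ").replace(minute=0, second=0),
     user, session)
    for ts, user, session in events
}

# Stage 2 -- like "stats count by _time, userName": count distinct sessions per hour.
per_user = defaultdict(int)
for hour, user, _session in per_session:
    per_user[(hour, user)] += 1

for (hour, user), n in sorted(per_user.items()):
    print(hour.strftime("%H:%M"), user, n)  # → 12:00 fred 1, then 13:00 fred 1
```

The duplicate 12:30 and 12:45 reports disappear in stage 1; the 13:00 report survives as a separate hour, matching the behaviour described above.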
Hi @triva79, you could use timechart or dedup:

<your_search>
| timechart span=1h count BY userName

Ciao. Giuseppe
Hi @Yousef.Raafat  The issue is still there; I've raised a support ticket and am checking with the AppD support team. I'll update here once I get a solution. Thanks, Shubham
We have data in Splunk for user sessions in an app, and I am trying to produce a line graph showing usage every hour. The session information is added 4 times an hour, so I am trying to remove the extra results per hour. Below is an example for one user, but there will be other users' data as well:

userName: fred sessionKey: a0b360d9-a471-45a1-9dcc-0dee39ed6ba8 timestamp: 2024-05-20T12:00:00Z
userName: fred sessionKey: a0b360d9-a471-45a1-9dcc-0dee39ed6ba8 timestamp: 2024-05-20T12:30:00Z
userName: fred sessionKey: a0b360d9-a471-45a1-9dcc-0dee39ed6ba8 timestamp: 2024-05-20T12:45:00Z
userName: fred sessionKey: a0b360d9-a471-45a1-9dcc-0dee39ed6ba8 timestamp: 2024-05-20T13:00:00Z
Hey @tejasode , Not sure why you mention the app being archived. I see that the app is available for use and download. Additionally, similar inputs can also be found in the Microsoft Azure Add-on for Splunk (https://splunkbase.splunk.com/app/3757). There are a lot of add-ons available on Splunkbase to fetch data from Azure. Can you elaborate on what particular functionality you are looking for? Thanks, Tejas.
Hi @karthi2809, you could put the list of links and names in a lookup (called e.g. links.csv) containing at least two columns (Name, Link). Then you could run something like this:

<row>
  <panel>
    <table>
      <title>Use Cases</title>
      <search>
        <query>
| inputlookup links.csv
| sort Name Link
| table Name Link
        </query>
        <sampleRatio>1</sampleRatio>
      </search>
      <option name="count">10</option>
      <option name="dataOverlayMode">none</option>
      <option name="drilldown">row</option>
      <option name="percentagesRow">false</option>
      <option name="rowNumbers">false</option>
      <option name="totalsRow">false</option>
      <option name="wrap">true</option>
      <drilldown>
        <condition match="isnotnull($row.Link$)">
          <link target="_blank">/app/your_app/$row.Link$</link>
        </condition>
        <condition match="isnull($row.Link$)"/>
      </drilldown>
    </table>
  </panel>
</row>

In this way, with a click you run your dashboard. If the dashboards are in a different app than the present one, you have to add the full path. Ciao. Giuseppe
Hello @splunky_diamond , I am unsure if there are any apps/TAs available for Fudo PAM data. The best approach would be to write the "magic 8" props settings for parsing the data. You can find the relevant documentation links below:
- https://docs.splunk.com/Documentation/Splunk/latest/Data/WhatSplunkdoeswithyourdata
- https://docs.splunk.com/Documentation/Splunk/latest/Data/Overviewofeventprocessing
- https://docs.splunk.com/Documentation/Splunk/latest/Data/Createsourcetypes
- https://lantern.splunk.com/Splunk_Platform/Product_Tips/Data_Management/Configuring_new_source_types

Thanks, Tejas.

--- If the above solution helps, an upvote is appreciated.
The message is a warning, not an error. It says the free space on C: is getting low, not that you don't have any. You can free up some space to get back below the yellow threshold, or change the threshold to alert at a different level.
Hi @ViniciusMariano , if the structure of the right dataset name is always the same, e.g. left_dataset_name + _ + other, you could extract the common part using a regex:

<your_search>
| rex field=right_dataset_name "^(?<left_dataset_name>[^_]+)"
| stats values(right_dataset_name) AS right_dataset_name count BY left_dataset_name
| table left_dataset_name right_dataset_name

I would suppose that you mean the join concept and not the use of the join command, which should be avoided because it's very slow and expensive in system resources. Ciao. Giuseppe
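Outside of SPL, the same anchored capture can be checked with an ordinary regex (a Python sketch using the example name from the question):

```python
import re

right_dataset_name = "RU3NDS_sdsavdg_SoKdsVI3"

# Same pattern as the rex above: from the start of the string,
# capture every character up to (but not including) the first underscore.
match = re.match(r"^(?P<left_dataset_name>[^_]+)", right_dataset_name)
print(match.group("left_dataset_name"))  # → RU3NDS
```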
Hi All, how can I map a Splunk dashboard link based on the values of a field? I have an existing dashboard, so I need to map based on the values, and on clicking the link it should open the existing dashboard. Example:

Name  link
abc   click here
bbc   click here
ccd   click here
Hello @Shubham.Kadam, Kindly update us here if you have any good news, since I've encountered the same issue while applying the extension to an MA with SIM enabled (without APM agent):

R01-SMAX-NFS==> [Monitor-Task-Thread3] 20 May 2024 08:43:12,727 WARN MetricWriteHelper-Linux Monitor - The metric is not valid. Not reporting the metric to the machine agent. Name - inuse, Value - 0, Path - Server|Component:<TIER_ID>|Custom Metrics|Linux Monitor|sockUsedStats|ifrag|inuse, Qualifiers - AVERAGE, CURRENT, INDIVIDUAL
R01-SMAX-NFS==> [Monitor-Task-Thread3] 20 May 2024 08:43:12,727 WARN MetricWriteHelper-Linux Monitor - The metric is not valid. Not reporting the metric to the machine agent. Name - inuse, Value - 23, Path - Server|Component:<TIER_ID>|Custom Metrics|Linux Monitor|sockUsedStats|tcp|inuse, Qualifiers - AVERAGE, CURRENT, INDIVIDUAL
R01-SMAX-NFS==> [Monitor-Task-Thread3] 20 May 2024 08:43:12,727 WARN MetricWriteHelper-Linux Monitor - The metric is not valid. Not reporting the metric to the machine agent. Name - inuse, Value - 8, Path - Server|Component:<TIER_ID>|Custom Metrics|Linux Monitor|sockUsedStats|udp|inuse, Qualifiers - AVERAGE, CURRENT, INDIVIDUAL
R01-SMAX-NFS==> [Monitor-Task-Thread3] 20 May 2024 08:43:12,727 WARN MetricWriteHelper-Linux Monitor - The metric is not valid. Not reporting the metric to the machine agent. Name - HeartBeat, Value - 1, Path - Server|Component:<TIER_ID>|Custom Metrics|Linux Monitor|HeartBeat, Qualifiers - AVERAGE, AVERAGE, INDIVIDUAL
R01-SMAX-NFS==> [Monitor-Task-Thread1] 20 May 2024 08:43:12,822 ERROR NFSMountMetricsTask-Linux Monitor - Exception occurred collecting NFS I/O metrics
java.lang.NullPointerException: null
    at com.appdynamics.extensions.linux.NFSMountMetricsTask.getMountIOStats(NFSMountMetricsTask.java:173) [?:?]
    at com.appdynamics.extensions.linux.NFSMountMetricsTask.run(NFSMountMetricsTask.java:66) [?:?]
    at com.appdynamics.extensions.executorservice.MonitorThreadPoolExecutor$TaskRunnable.run(MonitorThreadPoolExecutor.java:113) [?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [?:?]
    at java.lang.Thread.run(Unknown Source) [?:?]
R01-SMAX-NFS==> [Monitor-Task-Thread1] 20 May 2024 08:43:12,822 WARN MetricWriteHelper-Linux Monitor - The metric is not valid. Not reporting the metric to the machine agent. Name - Availability, Value - 0, Path - Server|Component:<TIER_ID>|Custom Metrics|Linux Monitor|mountedNFSStatus|NFS1|Availability, Qualifiers - AVERAGE, AVERAGE, INDIVIDUAL
R01-SMAX-NFS==> [Monitor-Task-Thread2] 20 May 2024 08:43:12,822 INFO LinuxMonitorTask-Linux Monitor - Completed the Linux Monitoring task
R01-SMAX-NFS==> [Monitor-Task-Thread2] 20 May 2024 08:43:12,822 INFO LinuxMonitorTask-Linux Monitor - All tasks for Linux Monitor finished
R01-SMAX-NFS==> [Monitor-Task-Thread2] 20 May 2024 08:43:12,822 WARN MetricWriteHelper-Linux Monitor - The metric is not valid. Not reporting the metric to the machine agent. Name - Metrics Uploaded, Value - 1, Path - Server|Component:<TIER_ID>|Custom Metrics|Linux Monitor|Metrics Uploaded, Qualifiers - AVERAGE, AVERAGE, COLLECTIVE
R01-SMAX-NFS==> [Monitor-Task-Thread2] 20 May 2024 08:43:12,822 INFO MetricWriteHelper-Linux Monitor - Finished executing Linux Monitor at 2024-05-20 08:43:12 EDT
R01-SMAX-NFS==> [Monitor-Task-Thread2] 20 May 2024 08:43:12,822 INFO MetricWriteHelper-Linux Monitor - Total time taken to execute Linux Monitor : 165 ms
R01-SMAX-NFS==> [Monitor-Task-Thread2] 20 May 2024 08:43:12,822 INFO ABaseMonitor - Finished processing all tasks in the job for Linux Monitor

Thanks, Yousef
Hello @Santosh2 , You can create an additional input for country and use the country value as a token for the site input. Your XML code should look something like below:

<input type="dropdown" token="country">
  <label>Country</label>
  <choice value="*">All</choice>
  <prefix></prefix>
  <suffix></suffix>
  <default>*</default>
  <fieldForLabel>country</fieldForLabel>
  <fieldForValue>country</fieldForValue>
  <search>
    <query>
| makeresults
| eval country="India"
| fields country
| append [ | makeresults | eval country="China" | fields country]
| sort country
| table country
    </query>
  </search>
</input>

Now use the country token as below in the Site input:

<input type="dropdown" token="site">
  <label>SITE</label>
  <choice value="*">All</choice>
  <prefix>site="</prefix>
  <suffix>"</suffix>
  <default>*</default>
  <fieldForLabel>site</fieldForLabel>
  <fieldForValue>site</fieldForValue>
  <search>
    <query>
| makeresults
| eval site=case("$country$"="India","BDC","$country$"="China","SOC",true(),"BDC")
| fields site
    </query>
  </search>
</input>

Thanks, Tejas.

--- If the above solution helps, an upvote is appreciated.
@Everyone, can anyone help with this?
Hey guys, I'm having trouble joining two datasets with similar values. I'm trying to join two datasets; both have a common "name" field, but the one on the left has the correct value and the one on the right follows this pattern: the left dataset's name field plus some characters, e.g.:

left dataset name    right dataset name
RU3NDS               RU3NDS_sdsavdg_SoKdsVI3

Is there any way to use a wildcard when joining?
Hello @dc18 , There are plenty of apps on Splunkbase that can be used for visualizing AWS data. One of them is the following: https://splunkbase.splunk.com/app/6311 Additionally, you can also check the AWS Content Pack, which can assist with a similar purpose.

Thanks, Tejas.

--- If the above solution helps, an upvote is appreciated.
I want to show data for every Monday on a weekly basis.
Hello community, I aim to compare the 'src_ip' values referenced below with the CIDR IP ranges in the lookup file 'zscalerip.csv' using the query provided. If there is a match, the result should be recorded as true in the 'Is_managed_device' field; otherwise, it should be marked as false. However, upon executing this query, I'm obtaining identical results for all IPs, irrespective of whether they match the CIDR range. I have created a new lookup definition for the lookup and implemented the following changes:

Type = file-based
min_matches = 0
default_match = NONE
filename = zscalerip.csv
match_type = CIDR(CIDR)

CIDR IP ranges in the lookup file:

CIDR
168.246.*.*
8.25.203.0/24
64.74.126.64/26
70.39.159.0/24
136.226.158.0/23

Splunk query:

| makeresults
| eval src_ip="10.0.0.0 166.226.118.0 136.226.158.0 185.46.212.0 2a03:eec0:1411::"
| makemv delim=" " src_ip
| mvexpand src_ip
| lookup zscalerip.csv CIDR AS src_ip OUTPUT CIDR as CIDR_match
| eval Is_managed_device=if(cidrmatch(CIDR_match,src_ip), "true", "false")
| table src_ip Is_managed_device

I am getting results in the format below:

src_ip            Is_managed_device
10.0.0.0          FALSE
166.226.118.0     FALSE
136.226.158.0     FALSE
185.46.212.0     FALSE
2a03:eec0:1411::  FALSE
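For reference, here is what the CIDR comparison itself should yield for these inputs, sketched with Python's standard ipaddress module (independent of the Splunk lookup mechanics; the wildcard entry 168.246.*.* is rewritten as the equivalent 168.246.0.0/16):

```python
import ipaddress

# CIDR ranges from the lookup file (168.246.*.* expressed as 168.246.0.0/16)
networks = [ipaddress.ip_network(c) for c in [
    "168.246.0.0/16",
    "8.25.203.0/24",
    "64.74.126.64/26",
    "70.39.159.0/24",
    "136.226.158.0/23",
]]

src_ips = ["10.0.0.0", "166.226.118.0", "136.226.158.0",
           "185.46.212.0", "2a03:eec0:1411::"]

# An address is "managed" if it falls inside any of the ranges;
# a v6 address is never contained in a v4 network, so it stays false.
results = {ip: any(ipaddress.ip_address(ip) in net for net in networks)
           for ip in src_ips}
for ip, managed in results.items():
    print(ip, "true" if managed else "false")
```

Only 136.226.158.0 falls inside a listed range (136.226.158.0/23), so getting "false" for every row suggests the problem lies in the lookup/eval wiring rather than in the ranges themselves.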
I'm trying to change the font size of a table in a Dashboard Studio visualization. How is this done in the code? I've tried a few ways but am having no luck. If it is possible, from which version can the font size of a table be increased? Thanks in advance, and I appreciate the help.