All Topics

Hi, I'm trying to merge the "dropped" and "blocked" values of the "IDS_Attacks.action" field in the output of my datamodel search, and to roll their counts up into the new "blocked" value, so that I can add the result to a dashboard. Current output:

IDS_Attacks.action    count
allowed               130016
blocked               595
dropped               1123
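A minimal sketch of one way to do this, assuming the counts come from a tstats search over the Intrusion_Detection datamodel (swap in your actual source search): remap the action value with eval, then re-aggregate.

| tstats count from datamodel=Intrusion_Detection.IDS_Attacks by IDS_Attacks.action
| rename IDS_Attacks.action as action
``` fold "dropped" into "blocked" before the final aggregation ```
| eval action=if(action IN ("dropped", "blocked"), "blocked", action)
| stats sum(count) as count by action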
Hi, I'm not sure how to get a continuous bar between login and logout. As you can see in the picture, the login is marked, then there is a lot of empty space, and then the logout. Ideally everything would be color-marked from login until logout. It could perhaps be done through Format options, but not in this case. Hope someone can help me with it. Regards
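Without the underlying search it's hard to be precise, but one hedged approach to a continuous login-to-logout bar (the sourcetype and field names here are assumptions) is to collapse each login/logout pair into a single event whose duration spans the whole session, which suits a timeline- or Gantt-style visualization:

sourcetype=your_auth_logs (action="login" OR action="logout")
``` one event per session; duration is the login-to-logout gap in seconds ```
| transaction user startswith="login" endswith="logout"
| table _time user duration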
I have a response from one of the client applications like this:

{
  "employees": {
    "2023-03-16": {
      "1": {
        "id": 1,
        "name": "Michael Scott",
        "email": "demo@desktime.com",
        "groupId": 1,
        "group": "Accounting",
        "profileUrl": "url.com",
        "isOnline": false,
        "arrived": false,
        "left": false,
        "late": false,
        "onlineTime": 0,
        "offlineTime": 0,
        "desktimeTime": 0,
        "atWorkTime": 0,
        "afterWorkTime": 0,
        "beforeWorkTime": 0,
        "productiveTime": 0,
        "productivity": 0,
        "efficiency": 0,
        "work_starts": "23:59:59",
        "work_ends": "00:00:00",
        "notes": { "Skype": "Find.me", "Slack": "MichielS" },
        "activeProject": []
      },
      "2": {
        "id": 2,
        "name": "Andy Bernard",
        "email": "demo3@desktime.com",
        "groupId": 106345,
        "group": "Marketing",
        "profileUrl": "url.com",
        "isOnline": true,
        "arrived": "2023-03-16 09:17:00",
        "left": "2023-03-16 10:58:00",
        "late": true,
        "onlineTime": 6027,
        "offlineTime": 0,
        "desktimeTime": 6027,
        "atWorkTime": 6060,
        "afterWorkTime": 0,
        "beforeWorkTime": 0,
        "productiveTime": 4213,
        "productivity": 69.9,
        "efficiency": 14.75,
        "work_starts": "09:00:00",
        "work_ends": "18:00:00",
        "notes": { "Background": "Law and accounting" },
        "activeProject": {
          "project_id": 67973,
          "project_title": "Blue Book",
          "task_id": 42282,
          "task_title": "Blue Book task",
          "duration": 6027
        }
      },
      ...
    },
    "__request_time": "1678957028"
  }
}

I am facing a problem with the date field "2023-03-16", because this key changes every day. I want to create statistics based on all employee IDs, late employees, email, etc. for the last 7 days. I have used spath, but I cannot do a wildcard search for the late employees across all days. Thanks
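Since the date key changes daily, a wildcard foreach after spath can flatten the per-day, per-employee fields without hard-coding the date. A sketch under that assumption, collecting every "late" flag (the field paths follow the JSON above):

| spath
``` gather every employees.<date>.<id>.late value into one multivalue field ```
| foreach employees.*.*.late [ eval all_late=mvappend(all_late, '<<FIELD>>') ]
| eval late_count=mvcount(mvfilter(match(all_late, "true")))

The same pattern should work for id, email, etc. by changing the trailing segment of the wildcard path.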
I am getting the error: (502) Insufficient Privileges: You do not have View privilege on Course. I am enrolled in the Splunk Power User training and I cannot access my learning path because of this error.
Can you advise: if we remove the 2022 files, will there be any impact on Splunk?

</opt/app/splunk/var/lib/splunk/os/db>ls -lrt
total 644
-rw------- 1 splunk splunk  10 Jan 18 2022 CreationTime
drwx--x--- 2 splunk splunk 4096 Jan 18 2022 GlobalMetaData
drwx--x--- 3 splunk splunk 4096 Jan 18 2022 db_1642559010_1641112260_0
drwx--x--- 3 splunk splunk 4096 Feb 26 2022 db_1645905109_1644968889_4
drwx--x--- 3 splunk splunk 4096 Feb 26 2022 db_1625407961_1565097054_1
drwx--x--- 3 splunk splunk 4096 Feb 26 2022 db_1564424430_1323199008_2
drwx--x--- 3 splunk splunk 4096 Feb 26 2022 db_1645912526_1645346582_5
drwx--x--- 3 splunk splunk 4096 Feb 26 2022 db_1644968878_1642559018_3
drwx--x--- 3 splunk splunk 4096 Feb 26 2022 db_1645931413_1641472459_8
drwx--x--- 3 splunk splunk 4096 Feb 27 2022 db_1646022282_1645905131_11
drwx--x--- 3 splunk splunk 4096 Feb 28 2022 db_1646061049_1646022278_12
drwx--x--- 3 splunk splunk 4096 Mar 31 2022 db_1648760328_1646061038_13
drwx--x--- 3 splunk splunk 4096 May 1 2022 db_1651428760_1648760301_14
drwx--x--- 3 splunk splunk 4096 Jun 1 2022 db_1654064390_1651428766_16
drwx--x--- 3 splunk splunk 4096 Jul 1 2022 db_1656658688_1654064392_17
drwx--x--- 3 splunk splunk 4096 Jul 30 2022 db_1659238089_1656658690_18
drwx--x--- 3 splunk splunk 4096 Aug 6 2022 db_1625407961_1569499319_9
drwx--x--- 3 splunk splunk 4096 Aug 6 2022 db_1625407908_1587017816_6
drwx--x--- 3 splunk splunk 4096 Aug 6 2022 db_1568123891_1361996942_7
drwx--x--- 3 splunk splunk 4096 Aug 6 2022 db_1566397752_1323199008_10
drwx--x--- 3 splunk splunk 4096 Aug 6 2022 db_1659536784_1659238115_19
drwx--x--- 3 splunk splunk 4096 Aug 6 2022 db_1590756532_1590756532_15
drwx--x--- 3 splunk splunk 4096 Sep 12 2022 db_1662507027_1659807171_20
drwx--x--- 3 splunk splunk 4096 Sep 19 2022 db_1663592993_1662507051_21
drwx--x--- 3 splunk splunk 4096 Sep 19 2022 db_1663597969_1663592971_24
drwx--x--- 3 splunk splunk 4096 Sep 19 2022 db_1663600052_1663597937_25
drwx--x--- 3 splunk splunk 4096 Oct 20 2022 db_1666239485_1663600060_26
drwx--x--- 3 splunk splunk 4096 Nov 15 2022 db_1668525038_1666239467_27
drwx--x--- 3 splunk splunk 4096 Nov 15 2022 db_1668525264_1668525013_29
drwx--x--- 3 splunk splunk 4096 Dec 13 2022 db_1660748402_1645073785_31
drwx--x--- 3 splunk splunk 4096 Dec 15 2022 db_1671120985_1668526212_32
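Rather than deleting bucket directories by hand from a live index path (which risks corrupting the index if Splunk is running), the usual route is to let Splunk retire them through retention settings in indexes.conf. A sketch, assuming this is the os index; the retention value is illustrative:

# indexes.conf on the indexer
[os]
# buckets whose newest event is older than ~1 year roll to frozen,
# which means deletion unless coldToFrozenDir/coldToFrozenScript is set
frozenTimePeriodInSecs = 31536000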
Does anyone know where I can find information on installing and configuring the ESET TA and app on Splunk Enterprise on Linux (Debian), with ESET Administrator on Windows? I don't have any information on installing newer versions compatible with Splunk Enterprise 9.0.5. Despite having configured ESET syslog according to the documentation, I do not see any logs arriving on my search head.
https://help.eset.com/protect_admin/90/en-US/admin_server_settings_syslog.html
https://splunkbase.splunk.com/app/3931/
https://splunkbase.splunk.com/app/3867/#/details
or
https://splunkbase.splunk.com/app/6808
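If the ESET server is sending syslog straight to a Splunk instance, there also needs to be a network input listening on the matching port; a minimal sketch (the port, index, and sourcetype here are assumptions - check the TA's documentation for the sourcetype it actually expects, and note that ports below 1024 need elevated privileges):

# inputs.conf on the receiving Splunk instance
[udp://514]
index = eset
sourcetype = eset:protect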
Each time I run a search query and click Visualization, the default is "column chart". How do I make this default to "line chart" for myself, and how do I set it for other users? Thanks in advance
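I don't know of a supported per-user default for ad-hoc searches, but for dashboards the chart type can be pinned per panel in Simple XML, which fixes it for everyone viewing that dashboard; a sketch (the query is a placeholder):

<chart>
  <search>
    <query>index=_internal | timechart count</query>
  </search>
  <!-- force a line chart regardless of the user's last-used visualization -->
  <option name="charting.chart">line</option>
</chart>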
CAT to Splunk logs failing:
host = 161.209.202.108
user = sv_cat
port = 22
Start time: 10/24/2023 at 4:21am
Hello
As far as I understand, the Splunk datamodel has two main goals:
1) Data models enable users of Pivot to create compelling reports and dashboards without designing the searches that generate them. So the Pivot tool lets you report on a specific data set without using the Splunk Search Processing Language.
2) It's possible to refer to the CIM data models to normalize differently named data that serves the same function. In this case, we need to normalize data by using tags, aliases, eventtypes, etc.:
Alerts
Application State
Authentication
Certificates
Databases
Data Loss Prevention
Email
Interprocess Messaging
Intrusion Detection
Inventory
Java Virtual Machines
Malware
Network Resolution (DNS)
Network Sessions
Network Traffic
Performance
Ticket Management
Updates
Vulnerabilities
Web
Is this correct? Thanks
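As a concrete flavor of point 2, normalization toward a CIM model typically combines field aliases, eventtypes, and tags; a sketch using a hypothetical vendor:fw sourcetype mapped toward the Network Traffic model:

# props.conf
[vendor:fw]
FIELDALIAS-cim_action = act AS action

# eventtypes.conf
[vendor_fw_traffic]
search = sourcetype=vendor:fw

# tags.conf -- the Network Traffic datamodel selects events carrying these tags
[eventtype=vendor_fw_traffic]
network = enabled
communication = enabled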
The Splunk SOAR team shares the latest and greatest updates in version 6.2. During this session, the team provides a deep dive into new features like Logic Loops, the new integration with CyberArk, and the latest connectors featured in the Firewall Manager Connector Pack. Splunk SOAR provides security orchestration, automation, and response capabilities that empower your SOC. It lets security analysts work smarter, not harder, by automating repetitive tasks; triaging security incidents faster with automated investigation and response; increasing productivity, efficiency, and accuracy; and strengthening defenses by connecting and coordinating complex workflows across their teams and tools.
This isn't a question, rather just a place to drop a PDF I put together that I titled "Bare Bones Splunk".

I've seen a lot of people try to get started with Splunk, but then get stuck right after getting Splunk Enterprise installed on their local machine. It can be daunting to log into Splunk for the first time and know what the heck you should do. A person can get through the install to the What Happens Next page and be pretty overwhelmed with what to do next: Learn SPL and search? What should they search? How should they start getting their data in? What sort of data should they start getting in? What dashboard should they build? They've started... but need that ah-ha example to see how this tool will fit into their existing environment and workflow.

The attached Bare_Bones_Splunk.pdf file guides the reader from the point of install to using the data already being indexed in index=_internal to replicate a few common use cases of Splunk:
Monitor a web server
Monitor an application server
Monitor security incidents
The examples are really simple, and the resulting dashboard created in the tutorial is a poor example of something your boss might want (or not... how observant is your boss - do they just want a few graphs with nice colors?). But this will give someone a really quick intro to Splunk without having to do anything other than install (and then maybe they will be ready to tackle a broader introduction, like the Search Tutorial).
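For a taste of the kind of search the PDF builds on, index=_internal already holds Splunk Web's own access logs, so a "monitor a web server" panel can start as simply as:

index=_internal sourcetype=splunkd_ui_access
``` requests to Splunk Web over time, split by HTTP status ```
| timechart count by status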
I have a user who asked me to look into some of his reports. He wanted the permissions of report 2 to match report 1. The reports are owned by two different people, but two people with similar roles and access.

After we tweaked the settings for the report - shared in the app, read access for all, and write permissions for those with the appropriate roles - they are still having issues viewing and editing.

The owner of report 1 is the owner/creator of the report. The report runs as owner and is shared globally, yet he doesn't have permission to edit the actual alert. He created the report initially, so how come he can't edit it? I even cloned it and reassigned ownership, to no avail. Report 1 runs as owner, while report 2 has the option to run as owner or as the user. How come one report has that option while the other is locked to running as owner?

As far as user two goes, his roles include permissions to the indexes used, as well as access to the app and the default search app, and he has even more roles and permissions than user 1. Yet he receives an error when trying to view the link that Splunk sends out with the attached report.

My question is: is there anywhere else I should be looking to find permission discrepancies? From everything I've seen, both users have access to the required indexes, have pretty much soft-admin on Splunk, and I assume they have viewed these reports in the past. From roles to users to capabilities, they have everything in order, or at least it seems so. Is there something I should check in the configs?

Thanks for any guidance.
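One more place worth checking is the saved search's effective ACL and dispatch mode through the REST API, which sometimes disagrees with what the permissions UI shows; a sketch (swap in the real report name):

| rest /servicesNS/-/-/saved/searches splunk_server=local
| search title="Report 1"
| table title eai:acl.owner eai:acl.app eai:acl.sharing eai:acl.perms.read eai:acl.perms.write dispatchAs

The dispatchAs field in particular shows whether each report is set to run as owner or as user, which may explain why one report offers the option and the other appears locked.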
I often run into a case where I need to take the same dataset and compute aggregate statistics over different group-by sets, for instance if you want the output of this:

index=example | stats avg(field1) by x,y,z
| append [ search index=example | stats perc95(field2) by a,b,c ]

I am using the n=2 case (two group-by sets) for convenience. In the general case there are N group-by sets and arbitrary stats functions... what is the best way to optimize this kind of query without using append (which runs into subsearch limits)? Some of the patterns I can think of are below.

One way is to use appendpipe:

index=example
| appendpipe [ stats avg(field1) by x,y,z ]
| appendpipe [ stats perc95(field2) by a,b,c ]

Unfortunately this seems kind of slow, especially once you start adding more subsearches and preserving and passing a large number of non-transformed events through the search.

Another way is to use eventstats to preserve the event data, finishing with a final stats:

index=example
| eventstats avg(field1) as avg_field1 by x,y,z
| stats first(avg_field1) as avg_field1, perc95(field2) by a,b,c

Unfortunately this is not much faster. I think there is another way using streamstats in place of eventstats, but I still haven't figured out how to retrieve the last event without just invoking eventstats last() or relying on an expensive sort.

Another way I've tried is intentionally duplicating the data using mvexpand, which has the best performance by far:

index=example
``` duplicate all the data ```
| eval key="1,2"
| makemv delim="," key
| mvexpand key
``` set groupby = concatenation of the group-by field values ```
| eval groupby=case(key=1, x.",".y.",".z, key=2, a.",".b.",".c, true(), null())
| stats avg(field1), perc95(field2) by groupby

Are there any other patterns that are easier/faster? I'm curious how Splunk processes things under the hood; I know something called "map-reduce" is part of it, but I would be curious to know if anyone knows how to optimize this computation and why it's optimal in a theoretical sense.
Hi, I am trying to create a custom app using the Add-on Builder. In the request, I am trying to use the global account details, but it's throwing an error. Not sure what I am missing here. Does anyone know about this issue? I am using the latest version of Add-on Builder.
Reference: https://docs.splunk.com/Documentation/AddonBuilder/4.1.3/UserGuide/ConfigureDataCollectionAdvanced
Thanks
I was asked to create a query that will allow the user to see only the open ports. An example log looks something like this:

10/24/2023 06:00:04,source=SXXXX-88880000,destination=10.10.100.130,DuBlIn_,11.11.119.111,port_80=True,port_443=True,port_21=False,port_22=True,port_25=False,port_53=False,port_554=False,port_139=False,port_445=False,port_123=False,port_3389=False

It looks easy enough: I want to table port_*=True. I want destination, src_ip, and the open ports. I asked our equivalent of ChatGPT about it, and I got this:

index=gpss sourcetype=acl "SXXXXXXX" destination="11.11.111.11"
| eval open_ports = case(
    port_123=="True", "123",
    port_139=="True", "139",
    port_21=="True", "21",
    port_22=="True", "22",
    port_25=="True", "25",
    port_3389=="True", "3389",
    port_443=="True", "443",
    port_445=="True", "445",
    port_53=="True", "53",
    port_554=="True", "554",
    port_80=="True", "80",
    true(), null()
  )
| where open_ports!=null()
| mvexpand open_ports
| table _time, destination, gpss_src_ip, open_ports

But the open_ports!=null() wasn't allowed. I get:

Error in 'where' command: Type checking failed. The '!=' operator received different types.

During testing, I have a baseline event with three open ports, but that search only outputs the first one in the list. It hits port_22 first, since that's the first one in the case statement that is true. My main question is: how do I successfully tell Splunk to grab only the ports that are True? Can I even do a wildcard somewhere and request to pull port_* where True?

Thank you for any help.
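One hedged alternative to the case() approach: since case() stops at the first true clause, build the list with a wildcard foreach instead, appending every port whose flag is True to a multivalue field. <<MATCHSTR>> expands to the part of the field name matched by the wildcard, i.e. the port number:

index=gpss sourcetype=acl "SXXXXXXX" destination="11.11.111.11"
| foreach port_* [ eval open_ports=if('<<FIELD>>'=="True", mvappend(open_ports, "<<MATCHSTR>>"), open_ports) ]
| where isnotnull(open_ports)
| table _time, destination, gpss_src_ip, open_ports

This also sidesteps the where open_ports!=null() error, since isnotnull() is the supported way to test for null.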
I have a multiselect that does not interact with my Trellis chart. I suspect it's not defined in my base search, but I'm not sure how to identify the issue or how to fix it.

Base search:
| eval Pat=spath(json, "Info.Pat.Time")
| eval Con=spath(json, "Info.Con.Time")
| eval Cov=spath(json, "Info.Cov.Time")
| eval Category = RED
| table _time, Pat, Con, Cov, Category

Multiselect:
| eval SysTime = Category + ":" + _time
| fields - Category
| untable SysTime Reason CurationValue
| eval Category = mvindex(split(SysTime, ":"), 0)
| eval _time = mvindex(split(SysTime, ":"), 1)
| fields - SysTime
| table Reason
| dedup Reason

Chart:
| search Category $t_category$ Reason $t_reason$
| timechart span=1h avg(Pat) as Pat, avg(Con) as Con, avg(Cov) as Cov
I have a query that retrieves user experience metrics from a Dynatrace index. I want to compare the response times for two different time frames. My query has a subsearch as well. In the dashboard I have two time range pickers: the main query picks its time range from time range picker 1, and the subsearch uses the token from time range picker 2.

<<main search>>
| appendcols [ search index="dynatrace" $tr_14AGuxUA.earliest$ - $tr_14AGuxUA.latest$
| spath output=user_actions path="userActions{}"
| stats count by user_actions ]

This is not retrieving any data from the subsearch. How do I fix this? If I pass hard-coded values - earliest=10/23/2023:10:00:00 latest=10/23/2023:11:00:00 - then it works fine.
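For comparison, the time tokens need to be passed as earliest=/latest= arguments inside the subsearch rather than as a bare range; a sketch assuming the picker's token really is tr_14AGuxUA:

<<main search>>
| appendcols [ search index="dynatrace" earliest=$tr_14AGuxUA.earliest$ latest=$tr_14AGuxUA.latest$
    | spath output=user_actions path="userActions{}"
    | stats count by user_actions ]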
I can't access the support portal; I get redirected to https://www.splunk.com/404?ErrorCode=23&ErrorDescription=Invalid+contact. Does anyone have the same issue?
Can anyone shed any light on an issue I am having with a Splunk Cloud deployment? I have a Splunk heavy forwarder set up on Red Hat Linux 8 ingesting Cisco switches via syslog. This appears to be working fine for the vast majority of devices: I can see the individual directories and logs dropping into /opt/splunklogs/Cisco/. There is just one Cisco device that isn't being ingested. I have compared the config on the switch to the others and it is set up correctly (logging host/trap etc.). I can telnet from the switch to the interface on the Linux server and see the syslog hitting the interface via tcpdump. I have never had to populate an allow list for the switch IPs; the forwarder looks to handle them automatically, and I can see the Cisco directories on the forwarder are generated by Splunk. For some reason this one switch just isn't being ingested. Does anyone have any guidance on troubleshooting steps to establish what the issue is? Thanks
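If the syslog daemon is writing a file for that switch but Splunk isn't picking it up, the forwarder's own logs usually say why; a hedged check to run from the search head (replace the host placeholder with your heavy forwarder's name):

index=_internal host=<your_heavy_forwarder> sourcetype=splunkd
    (component=TailingProcessor OR component=TailReader OR component=WatchedFile)
    "/opt/splunklogs/Cisco"

If nothing is logged for the missing device's directory, the gap is more likely on the syslog daemon side (e.g. a filter that doesn't match that switch's hostname or IP) than in Splunk's monitor input.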
Hi,
My Splunk server is reachable at http://127.0.0.1:8000/fr-FR/app/launcher/home. I am trying to send data to my Splunk server with the curl command below:

curl -H "Authorization: Splunk 1f5de11f-ee8e-48df-b4f1-eb1bbb6f3db0" https://localhost:8088/services/collector/event -d '{"event":"hello world"}'

But I get the message:

curl: (7) Failed to connect to localhost port 8088 after 2629 ms: Couldn't connect to server

Could you help please?
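A couple of hedged checks, assuming a default install: the "Failed to connect" error means nothing is listening on port 8088, so HEC may simply not be enabled (Settings > Data Inputs > HTTP Event Collector > Global Settings). Also, HEC uses a self-signed certificate by default, so curl needs -k once it does connect:

# is anything answering on the HEC port at all?
curl -k https://localhost:8088/services/collector/health
# then resend the event; -k skips verification of the default self-signed certificate
curl -k -H "Authorization: Splunk 1f5de11f-ee8e-48df-b4f1-eb1bbb6f3db0" https://localhost:8088/services/collector/event -d '{"event":"hello world"}'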