All Topics
Watch our on-demand webinar to learn strategies for achieving full-stack observability across containerized, virtual, hybrid, cloud-native, and enterprise applications with an OpenTelemetry-based solution.

Embark on your OpenTelemetry-based full-stack observability journey. You'll learn:

Why OpenTelemetry is the gold standard for observability, and how to simplify its adoption.
How observability increases your influence over decision-making in IT and lines of business.
Ways to use the CloudFabrix observability data modernization service with the Cisco FSO Platform.

It's on demand! Watch now! And don't forget to stick around for the Q&A portion, where you can submit your own questions, such as:

"What challenges do organizations typically face when trying to achieve full-stack observability across their applications and infrastructure?"
"What specific use cases can users solve using this FSO platform?"
"Which other connectors are you building next? Any connectors for other infra products such as firewalls, databases, or load balancers?"
Hello, I was wondering: if I learn Splunk, can I get a job? Or do I need a specific career title, like a cyber security title, with Splunk as an additional skill? Can I learn Splunk while not working at a company that uses Splunk and still get a job? If so, which Splunk job role do you think would help me get a remote job? I live on a small island. Also, are there any websites where people practice and help each other by doing Splunk projects together, both to learn and to gain experience to put on a resume?
Hi, can anyone please advise whether there is a quick way to find the list of servers that are not managed by the deployment server but are transmitting data to Splunk, from the deployment server or the search head itself? Thanks
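A minimal sketch of one approach, assuming the search runs on the deployment server itself (the REST endpoint and its hostname field can vary by version, so verify against your instance): list all hosts seen in the indexes, then subtract the deployment clients.

| metadata type=hosts index=*
| search NOT [| rest /services/deployment/server/clients splunk_server=local | fields hostname | rename hostname as host]
| table host lastTime

Anything left over is sending data but is not phoning home to this deployment server.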
The fourth leaderboard update for The Great Resilience Quest is out >> Check out the Leaderboard

Cheers to our leading participants and a hearty welcome to our new players! Plenty of rewards are yet to be claimed! We are working to unveil levels 3 and 4 soon, so if you have just started your quest, you are right on time. Jump into the game and deepen your understanding of digital resilience with Splunk. Push yourself and aim for that leaderboard spot!

Best regards,
Splunk Customer Success
We use an asset file correctly configured in ES, but we noticed that the enrichment based on "asset_lookup_by_cidr" is not working correctly because the lookup is not sorted by CIDR prefix length. For example, in the following sample the sorting is "lexicographic" instead of following real CIDR logic:

1.2.30.0/26
1.2.30.128/25
1.2.31.0/24
1.2.32.0/24
1.2.33.0/25
1.2.33.128/25

We tried to solve the problem by creating a saved search that automatically performs the right sort, but soon after it executes, the lookup "asset_lookup_by_cidr" is rewritten back into "lexicographic" order. My saved search:

| inputlookup asset_lookup_by_cidr
| eval ip=replace(ip,"\s+","")
| eval sorted=case(match(ip,"\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\/\d{2}"),substr(ip,-2),match(ip,"\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\/\d{1}"),substr(ip,-1),1=1,"0")
| sort limit=0 - sorted
| fields - sorted
| outputlookup asset_lookup_by_cidr

Is there a quick solution to this problem? It is a big problem for notables based on IP addresses.
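A minimal sketch of a simpler sort, assuming the ip field always contains a /-delimited CIDR: extract the prefix length numerically instead of slicing characters, then sort descending so the most specific prefixes come first. It carries the same caveat that ES may rewrite the file again on its next merge.

| inputlookup asset_lookup_by_cidr
| eval ip=replace(ip,"\s+","")
| eval prefix_len=tonumber(mvindex(split(ip,"/"),1))
| sort limit=0 - prefix_len
| fields - prefix_len
| outputlookup asset_lookup_by_cidr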
How do I create a "top 10 DB queries" view in AppDynamics dashboards and reports?
Hi all, is there a way to disable the audience restriction verification on the SAML response? In our case, based on the SiteMinder configuration, this is the only way to resolve our issue. Thank you!
Hi everyone

I have a question regarding the configuration of the Microsoft Teams Add-on for Splunk. When I configure a Webhook, a TeamsSubscription, and a CallRecord according to this guide, MS Teams data flows into my Splunk instance. Just like the guide suggests, I use ngrok, since the server my Splunk instance is running on is not accessible via HTTPS. Ngrok is fine for testing, but I want to swap it out for my actual proxy server. I tried several different settings, but no more data comes in. Given that data came in for as long as I used ngrok, all settings related to Azure (Tenant ID, Client ID, Client Secret) must be correct. The issue lies somewhere in the proxy server settings. Can anyone share some insights on how to configure the MS Teams Add-on as well as the proxy server settings? Here is my current setup.

Webhook
- Name: Webhook
- Interval: 30
- Index: ms_teams
- Port: 4444

Subscription
- Name: Subscription
- Interval: 86400
- Index: ms_teams
- Global Account: MSAzure
- Tenant ID: mytenantidfromazure
- Environment: Public
- Webhook URL: myproxy.server.com <------- splunkinstanceserver.com:4444 or myproxy.server.com?
- Endpoint: v1.0

CallRecord
- Name: CallRecord
- Interval: 30
- Index: ms_teams
- Global Account: MSAzure
- Tenant ID: mytenantidfromazure
- Environment: Public
- Endpoint: v1.0
- Max Batch Size: 5000

Proxy
- Enable: checked
- Host: myproxyserver.com
- Port: 4444 <--------- Is this meant to be the port of my webhook or the port where my proxy takes HTTPS requests?
- Username: userformyproxyserver
- PW: userpwformyproxyserver

splunkd.log (paths shortened for readability):

.../TA_MS_Teams/bin/TA_MS_Teams_rh_settings.py persistent}: WARNING:root:Run function: get_password failed: Traceback (most recent call last):
.../TA_MS_Teams/bin/TA_MS_Teams_rh_settings.py persistent}: File ".../TA_MS_Teams/bin/ta_ms_teams/aob_py3/solnlib/utils.py", line 148, in wrapper
.../TA_MS_Teams/bin/TA_MS_Teams_rh_settings.py persistent}: return func(*args, **kwargs)
.../TA_MS_Teams/bin/TA_MS_Teams_rh_settings.py persistent}: File ".../TA_MS_Teams/bin/ta_ms_teams/aob_py3/solnlib/credentials.py", line 128, in get_password
.../TA_MS_Teams/bin/TA_MS_Teams_rh_settings.py persistent}: "Failed to get password of realm=%s, user=%s." % (self._realm, user)
.../TA_MS_Teams/bin/TA_MS_Teams_rh_settings.py persistent}: solnlib.credentials.CredentialNotExistException: Failed to get password of realm=__REST_CREDENTIAL__#TA_MS_Teams#configs/conf-ta_ms_teams_settings, user=proxy.
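For context, a minimal sketch of what the reverse proxy side could look like, assuming nginx (the add-on does not mandate nginx; the hostnames and webhook port 4444 are taken from the post, and the certificate paths are hypothetical). The idea is that the Webhook URL in the Subscription should be the externally reachable HTTPS address of the proxy, and the proxy forwards inbound Graph notifications to the webhook port on the Splunk host:

server {
    # Terminate HTTPS on the proxy; Microsoft Graph must be able to reach this URL.
    listen 443 ssl;
    server_name myproxy.server.com;

    ssl_certificate     /etc/nginx/certs/myproxy.crt;   # hypothetical paths
    ssl_certificate_key /etc/nginx/certs/myproxy.key;

    location / {
        # Forward notifications to the add-on's webhook listener on the Splunk host.
        proxy_pass http://splunkinstanceserver.com:4444;
        proxy_set_header Host $host;
    }
}

Note this inbound reverse proxy is separate from the add-on's Proxy settings page, which configures the outbound proxy the add-on uses to call Microsoft Graph.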
Hi, I'm trying to set a specific color for each of the 4 dynamic labels of my 3 trellis pie charts. I already added the series color option:

<option name="charting.seriesColors">[#CFD6EA,#C45AB3,#735CDD,#8fba38]</option>

My issue is that my labels are "dynamic" and I don't have a constant number of categories (it changes in each chart between 0 and 4 according to the data I receive), so my color palette sequence is not aligned with the number of categories. For example, I want to set the following:

type_A - Red
type_B - Blue
type_C - Green

The problem is that sometimes category "type_A" is missing from one or more of my charts, and category type_B then gets its color (Red) instead of Blue. Here is my query:

<---Search-->
| stats dc(sessions) as Number_of_Sessions by type
| sort type
| eval type = type." - ".Number_of_Sessions

I would very much appreciate any help I can get :-) Thank you!
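A minimal sketch of one approach, assuming Simple XML: charting.fieldColors maps each series name to a fixed color, so a missing category no longer shifts the palette. The fourth mapping below is a placeholder. Note that the eval in the query appends the count to the label, so the keys must match the final label strings; it may be simpler to drop that eval so the keys are just the type values:

<option name="charting.fieldColors">{"type_A": 0xFF0000, "type_B": 0x0000FF, "type_C": 0x00FF00, "type_D": 0x8FBA38}</option>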
I use the Splunk Machine Learning Toolkit command

| fit LinearRegression blah, blah into ModelName

to generate a ModelName file. Using the command

| summary ModelName

I can generate a result set that has feature and coefficient fields. How can I "extract" the numerical coefficients so that I can create a regression equation for future use? For example, I'm trying to create the equation y = c0 + c1 * Term1 + c2 * Term2 for a future modeling activity.
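A minimal sketch, assuming the summary output has one row per feature with `feature` and `coefficient` fields and labels the intercept row `_intercept_` (check your actual summary output, as the label can vary by MLTK version):

| summary ModelName
| eval term=if(feature="_intercept_", tostring(coefficient), coefficient." * ".feature)
| stats list(term) as terms
| eval equation="y = ".mvjoin(terms, " + ")
| table equation

This produces a single string like y = 0.42 + 1.7 * Term1 + -0.3 * Term2 that you can store in a lookup for later use.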
Hi

When I run the command below, it works fine:

index=toto event_id=4688
| eval file_name=if(event_id==4688, replace(NewProcessName, "^*\\\\([^\\\\]+)$","\\1"),null)

Now I need to combine this search with a subsearch:

index=toto event_id=4688
| eval file_name=if(event_id==4688, replace(NewProcessName, "^*\\\\([^\\\\]+)$","\\1"),null)
[| inputlookup test where software=pm | table pm | rename pm as file_name | format]
| stats values(file_name) as file_name.....

But I get the message "Error in 'EvalCommand': The expression is malformed". What is wrong, please?
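A hedged sketch of one likely fix: a subsearch cannot sit bare in the middle of a pipeline (the parser tries to attach it to the preceding eval, hence the EvalCommand error), but it can feed a search command placed after the eval, which is needed anyway since file_name only exists once the eval has run:

index=toto event_id=4688
| eval file_name=if(event_id==4688, replace(NewProcessName, "^*\\\\([^\\\\]+)$","\\1"),null)
| search [| inputlookup test where software=pm | table pm | rename pm as file_name | format]
| stats values(file_name) as file_name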
Hello Team, I need to use the predict command, but currently I have only 110 data events, so to get more data points I am trying to add mock data where only the time field is different. Also, my dataset has only a MonthYear field, with data collected since March of this year. I read about the repeat function and dataset literals; can we use them in this scenario?

Quarter | Subscription ID | Subscription name | Azure service | Azure region | Usage | MonthYear
Qtr 1 | 020b3b0c-5b0a-41a1-8cd7-90cbd63e06 | SUB-PRD-EDL | Azure Data Factory | West | 9,10E-12 | March 2023
Qtr 1 | 020b3b0c-5b0a-41a1-8cd7-90cbd63e06 | SUB-PRD-EDL | Azure Data Factory | West | 0 | March 2023
Qtr 1 | 020b3b0c-5b0a-41a1-8cd7-90cbd63e06 | SUB-PRD-EDL | Azure Data Factory | West | 4,40303E-09 | March 2023
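A minimal sketch of generating mock monthly rows with makeresults and appending the real data (index and source names in the subsearch are placeholders; adjust the row count and the Usage values to whatever mock distribution you want):

| makeresults count=12
| streamstats count as i
| eval _time=relative_time(now(), "-".i."mon@mon")
| eval MonthYear=strftime(_time, "%B %Y"), Usage=0
| append [ search index=your_index source=your_source | table _time MonthYear Usage ]
| sort 0 _time

predict needs a regular time series, so run timechart (or bin _time) over the combined set before piping into predict.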
Hi Team, how do I download and install the Splunk universal forwarder on Ubuntu 20.04 by downloading a file?
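A minimal sketch, assuming the .deb package on a 64-bit system; the download URL and version below are illustrative placeholders, so copy the exact link from the Splunk downloads page:

# Download the universal forwarder package (URL/version are placeholders)
wget -O splunkforwarder.deb "https://download.splunk.com/products/universalforwarder/releases/<version>/linux/splunkforwarder-<version>-linux-2.6-amd64.deb"

# Install it and start the forwarder, accepting the license on first run
sudo dpkg -i splunkforwarder.deb
sudo /opt/splunkforwarder/bin/splunk start --accept-license

# Point it at your indexer (hostname/port are placeholders)
sudo /opt/splunkforwarder/bin/splunk add forward-server your-indexer.example.com:9997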
Hello, can anyone help me extract the file names that begin with OU_ from the raw data below?

12:04:19.85 14/09/2023
directory="E:\data\Test"
ECHO is off.
Volume in drive E is Data
Volume Serial Number is 7808-CA1B

Directory of E:\data\Test

13/09/2023 13:22 <DIR> XXX\xxxx .
13/09/2023 13:22 <DIR> xxx\xxx ..
12/09/2023 09:31 95 xxx\xxx dir_details.bat
13/09/2023 13:41 171 xxx\xxx dir_details_copy.bat
07/09/2023 13:26 0 xxx\xxx edsadsad.txt
07/09/2023 13:26 22 xxx\xxx OU_kljdajdklsajkdl.zip
07/09/2023 13:26 22 xxx\xxx OU_kljdajdklsajkewew.zip
07/09/2023 13:26 22 xxx\xxx OU_kljdajdklsajkewewdsads.zip
6 File(s) 332 bytes
2 Dir(s) 20718067712 bytes free
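A minimal sketch, assuming the names always start with OU_ and end in .zip; max_match=0 makes rex capture every match in the event into a multivalue field:

... | rex max_match=0 "(?<ou_file>OU_\w+\.zip)"
| table ou_file

Loosen the pattern (for example OU_\S+) if the names can contain characters other than letters, digits, and underscores.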
Hello guys, I want to point my universal forwarders at a new deployment server, using the current deployment server to make the change. I am currently pushing apps to the universal forwarders, but I can't change the deployment_server setting.
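A minimal sketch of the usual approach, assuming you deploy a small app (the app and host names below are hypothetical) from the current deployment server containing a deploymentclient.conf that points at the new one; once each client phones home and installs it, it switches servers:

# deployment-apps/switch_ds/local/deploymentclient.conf on the current deployment server
[deployment-client]

[target-broker:deploymentServer]
targetUri = new-deployment-server.example.com:8089

Set restartSplunkd = true for that app in serverclass.conf so the forwarders restart and pick up the new target.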
Hi all,

We have a newly built Splunk Enterprise setup situated in a temporary location, and we are looking to perform a data center migration from the temporary location to the permanent location. We want to know whether the Splunk Enterprise installation on the component servers is impacted in any way by the change of IP address that comes with the DC migration.
Hello, I have installed Sysmon and I am trying to send its logs with a UniversalForwarder on that machine to my Splunk indexer and search head... I have tried to add

[WinEventLog://Microsoft-Windows-Sysmon/Operational]
disabled = 0

[WinEventLog://"Applications and Services Logs/Microsoft/Windows/Sysmon/Operational"]
disabled = 0

[WinEventLog://Applications and Services Logs/Microsoft/Windows/Sysmon/Operational]
disabled = 0

to the inputs.conf, but none of those versions worked... I have also restarted the UniversalForwarder, and the indexer / search head has the Sysmon app installed. What am I doing wrong?!

PS: Sysmon is running and I see the logged data in the Event Viewer on that machine...
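For reference, a minimal sketch of the stanza that usually works: the input name is the event log channel name (the first variant in the post), not the Event Viewer folder path. The index and renderXml lines are assumptions matching what the Splunk Add-on for Sysmon typically expects, so adjust them to your environment and make sure the index exists on the indexer:

[WinEventLog://Microsoft-Windows-Sysmon/Operational]
disabled = 0
renderXml = true
index = sysmon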
Hi Team, I have the query below:

index="abc" sourcetype=$Regions$ source="/amex/app/gfp-settlement-raw/logs/gfp-settlement-raw.log" "ReadFileImpl - ebnc event balanced successfully"
| eval True=if(searchmatch("ebnc event balanced successfully"),"✔","")
| eval EBNCStatus="ebnc event balanced successfully"
| dedup EBNCStatus
| table EBNCStatus True

I am deduping my EBNC status, so when I select "yesterday" in the date filter it shows one count, but when I select the last 7 days it still shows one count. I want it to show 7 counts when I select 7 days. Can someone help me with this?
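A hedged sketch of one way to get one row per day: dedup on a per-day key instead of on the constant status string (field names reused from the post), so each day in the selected range keeps one event:

index="abc" sourcetype=$Regions$ source="/amex/app/gfp-settlement-raw/logs/gfp-settlement-raw.log" "ReadFileImpl - ebnc event balanced successfully"
| eval day=strftime(_time, "%Y-%m-%d")
| eval EBNCStatus="ebnc event balanced successfully", True="✔"
| dedup day
| table day EBNCStatus True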
Hello

I have this simple input (from a Linux server -> indexers) that stopped working after renaming the sourcetype:

[monitor:///opt/splunk_connect_for_kafka/kafka_2.13-3.5.1/logs/connect.log]
disabled = false
index = _internal
sourcetype = kafka_connect_log

I restarted the universal forwarder many times, but it is not helping. Any other troubleshooting steps?
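A hedged sketch of some standard checks, assuming a universal forwarder installed under /opt/splunkforwarder: first confirm the stanza actually loaded, then check whether the tailing processor is reading the file.

# Verify the effective input configuration on the forwarder
/opt/splunkforwarder/bin/splunk btool inputs list monitor:///opt/splunk_connect_for_kafka/kafka_2.13-3.5.1/logs/connect.log --debug

# Check the tailing status of all monitored files
/opt/splunkforwarder/bin/splunk list inputstatus

On the search head, the forwarder's internal logs often show why a file is skipped (the host value is a placeholder):

index=_internal host=<forwarder> source=*splunkd.log* (component=TailReader OR component=WatchedFile) "connect.log"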
Hi, are there any plans to make this add-on compatible with Splunk Cloud?