All Topics

Our dev Splunk instance was recently upgraded from Splunk Enterprise 8.2.2.1 to 9.0.2. I am getting the following error on our primary search head, in python.log, on splunkd restart:

ERROR config:149 - [HTTP 401] Client is not authenticated
Traceback (most recent call last):
  File "/opt/splunk/lib/python3.7/site-packages/splunk/appserver/mrsparkle/lib/config.py", line 147, in getServerZoneInfoNoMem
    return times.getServerZoneinfo()
  File "/opt/splunk/lib/python3.7/site-packages/splunk/appserver/mrsparkle/lib/times.py", line 163, in getServerZoneinfo
    serverStatus, serverResp = splunk.rest.simpleRequest('/search/timeparser/tz', sessionKey=sessionKey)
  File "/opt/splunk/lib/python3.7/site-packages/splunk/rest/__init__.py", line 625, in simpleRequest
    raise splunk.AuthenticationFailed
splunk.AuthenticationFailed: [HTTP 401] Client is not authenticated

I did a re-scan of everything using the newest version of the Upgrade Readiness App (from Splunkbase). Some apps did have a Python warning. I verified the currently installed version of each app (with the exception of apps included in the Splunk Enterprise package, like Splunk Secure Gateway and Splunk RapidDiag), and the documentation states our installed versions are compatible with Enterprise 9.0. It does not appear that any installed app is using a deprecated version of Python. I also ran the following command and verified our Python version as Python 3.7.11:

splunk cmd python -V

After combing over the known issues for the 9.0 release and other Answers threads, I've had no luck. I don't know if this error is meaningful, so any direction would be appreciated. Thank you!
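A minimal sketch for scoping how widespread the error is, via the internal index (the source path is an assumption based on the default python.log location; adjust if yours differs):

index=_internal source=*python.log* ("Client is not authenticated" OR "AuthenticationFailed")
| stats count by host, source

If the count is non-zero only on the search head, that narrows the problem to that instance rather than the whole deployment.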
index=data severity IN ("critical","high","medium","low")
| eval TopHost = [ search index=tenable severity IN ("critical","high","medium","low")
    | where len(dnsName)>0
    | dedup dnsName,solution
    | dedup dnsName,pluginText
    | rex field=pluginName "^(?<VulnName>(?:\w+\s+){2})"
    | dedup dnsName,VulnName
    | top limit=1 dnsName
    | rename dnsName as query
    | fields query
    | head 1 ]
| where dnsName=TopHost
| table dnsName, ip

My query above works, but is missing one thing. Right now it gets only the first result (using the head command). I am trying to get the first 5 results and store them in my eval variable. I tried changing head to 5 but got errors. Any help is appreciated. Thanks. Error attached.
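An eval expression can only hold a single value per event, which is why head 5 errors out. One way around that is to skip the eval entirely and let the subsearch act as a filter on the base search; a subsearch that returns five dnsName rows expands to dnsName="a" OR dnsName="b" OR ... automatically. A minimal sketch under that approach, keeping your original dedup and rex logic:

index=data severity IN ("critical","high","medium","low")
    [ search index=tenable severity IN ("critical","high","medium","low")
      | where len(dnsName)>0
      | dedup dnsName,solution
      | dedup dnsName,pluginText
      | rex field=pluginName "^(?<VulnName>(?:\w+\s+){2})"
      | dedup dnsName,VulnName
      | top limit=5 dnsName
      | fields dnsName ]
| table dnsName, ip

The trailing fields dnsName drops the count and percent columns that top adds, so only the host filter is handed back to the outer search.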
Hello, I have a .csv file with two columns: IoC and added_timestamp. I compared the data and I get a few matches, but what I want is to use just a portion of the .csv: based on the added_timestamp column, I want to compare only the IoCs added to the .csv in the last 7 days. Can someone help me accomplish this? Thank you in advance.
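A minimal sketch, assuming the file is named ioc_list.csv, the timestamp format is %Y-%m-%d %H:%M:%S, and your events carry the indicator in a field named ioc (all three are assumptions to adjust to your data):

index=your_index
    [ | inputlookup ioc_list.csv
      | where strptime(added_timestamp, "%Y-%m-%d %H:%M:%S") >= relative_time(now(), "-7d@d")
      | fields IoC
      | rename IoC as ioc ]

The inner where keeps only lookup rows added in the last 7 days, and the subsearch then expands to an OR of ioc=<value> filters against the events.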
Hi Team,  Seeing this error when I go into the Mission Control App.  Any ideas? Thanks, Mike
Hi team, I heard that MC went GA. Congratulations! So, now that this is GA, is this our permanent SOAR instance, and can you confirm that we have 100 actions per day as part of using MC?
I am attempting to calculate the following:
- Total number of "Requests Per Day"
- Average/mean "Requests Per Day"
- Standard deviation of "Requests Per Day"

I am using the following search:

index=myCoolIndex cluster_name="myCoolCluster" sourcetype=myCoolSourceType label_app=myCoolApp ("\"statusCode\"")
| rex .*\"traceId\"\s:\s\"?(?<traceId>.*?)\".*
| dedup traceId
| rex "(?s)\"statusCode\"\s:\s\"?(?<statusCode>[245]\d{2})\"?"
| timechart span=1d count(statusCode) as "Number_Of_Requests"
| where Number_Of_Requests > 0
| eventstats mean(Number_Of_Requests) as "Average Requests Per Day" stdev(Number_Of_Requests) as "Standard Deviation"

I am getting results back, but I am unsure whether they are correct for what I am trying to measure. For instance, I would have thought stdev() would need some eval statement to know what the total and average requests per day are. Does the "where Number_Of_Requests > 0" skew the results, since those rows are not added to the result set? I was hoping someone could take a look at my query and provide a little insight on what I may still need to do to get an accurate standard deviation. Also, below is the output I am getting from the current query:

Number_Of_Requests   Average Requests Per Day   Standard Deviation
25687                64395                      54741.378572337766
103103               64395                      54741.378572337766

Any help is appreciated!
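No eval is needed: eventstats computes its aggregates over all rows that reach it (here, the daily rows produced by timechart) and attaches the result to every row, which is why the average and standard deviation repeat. The where clause does remove zero-request days before eventstats runs, so they are excluded from the mean and stdev; whether that skews the numbers depends on whether zero days are legitimate days with no traffic. A minimal sketch that also produces the total, keeping your field names (assuming one traceId per request, so each deduped traceId is one request):

index=myCoolIndex cluster_name="myCoolCluster" sourcetype=myCoolSourceType label_app=myCoolApp "\"statusCode\""
| rex "\"traceId\"\s:\s\"?(?<traceId>[^\"]+)\""
| rex "(?s)\"statusCode\"\s:\s\"?(?<statusCode>[245]\d{2})\"?"
| dedup traceId
| timechart span=1d count(statusCode) as Number_Of_Requests
| eventstats sum(Number_Of_Requests) as "Total Requests" avg(Number_Of_Requests) as "Average Requests Per Day" stdev(Number_Of_Requests) as "Standard Deviation"

Drop the where clause if zero-request days should count toward the average and standard deviation; keep it only if those days are artifacts.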
Hi folks, I'm looking for the configuration of Splunk in Palo Alto XSOAR. I'm running Splunk ES on Windows Server 2012, and I have installed a universal forwarder on Windows for log collection (Active Directory). I would like to configure the Splunk instance in Palo Alto XSOAR. When I installed the Splunk API in PyCharm, I was not able to connect to the Splunk server: I entered the correct IP address while configuring the instance and allowed an inbound rule on the Windows Splunk ES host, but I still can't connect. Can anyone help me? Thanks in advance.
Can you please explain how to do a partial fit on DensityFunction? I have created a lookup file for a day, say today; it has the fields src_ip, dest_ip, bytes, hod. First I created a search as below:

| inputlookup log_21.csv
| fit DensityFunction bytes by "hod" into model1 partial_fit=true

After that, I have to do a partial fit for another lookup with the same fields. How do I do it?

| inputlookup log_22.csv
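With MLTK, an incremental update is done by fitting into the same model name again with partial_fit=true; each such fit call updates model1 in place with the new data rather than retraining from scratch. A minimal sketch for the second day's file:

| inputlookup log_22.csv
| fit DensityFunction bytes by "hod" into model1 partial_fit=true

Repeat the same pattern for each subsequent day's lookup, always targeting the same model name.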
I think I have a conceptual problem understanding these two commands, but in my mind you'd build a model with fit and somehow use that model to forecast (predict) future events, right? But for the life of me I can't find any examples of this in practice. In pseudo-SPL it might look something like this:

| mstats avg(some_metric) avg(some_other_metric) avg(yet_another_metric) WHERE index=my_index span=1d
| table some_metric some_other_metric yet_another_metric _time
| fit Ridge some_metric from some_other_metric yet_another_metric into my_model
| predict some_metric as prediction

I guess everything I read and every example I see treats fit and predict as exclusive commands. So I have a couple of questions. What's the point of a model created with fit? Do fit and predict work together to forecast? Thank you to anyone who can clear up my thinking on this.
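In MLTK, the command that consumes a saved model is apply, not predict; predict is an older built-in time-series forecaster that works directly on a timechart-style series and knows nothing about MLTK models. A minimal sketch under that assumption, training first and then scoring fresh data with the stored model:

Training (run once or on a schedule):

| mstats avg(some_metric) as some_metric avg(some_other_metric) as some_other_metric avg(yet_another_metric) as yet_another_metric WHERE index=my_index span=1d
| fit Ridge some_metric from some_other_metric yet_another_metric into my_model

Scoring later, without needing the target field:

| mstats avg(some_other_metric) as some_other_metric avg(yet_another_metric) as yet_another_metric WHERE index=my_index span=1d
| apply my_model

apply adds a field named predicted(some_metric) to each row. So fit builds and persists the model, apply reuses it for inference, and fit plus the predict command do not work together.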
Hello, after redeploying my UF (with a props.conf, because my logs are in CSV), I end up with my new parsing mixed with the old parsing. Does this ring a bell for anyone? In my case, my logs end up with a non-existent header (it takes the first line of the CSV, which is a log line, as the header). And at the same time it also parses correctly (because I fixed the problem). Thanks.
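A minimal props.conf sketch on the forwarder for a CSV file that has no header row, using a hypothetical sourcetype name and field list (both placeholders; the sourcetype must match what inputs.conf assigns to the monitored file):

[my_csv_sourcetype]
INDEXED_EXTRACTIONS = csv
FIELD_NAMES = field1,field2,field3
TIMESTAMP_FIELDS = field1

FIELD_NAMES tells Splunk not to treat the first data line as a header. Note that events indexed before the change keep whatever parsing was in effect at index time; only newly indexed data picks up the new settings, which can explain old and new parsing appearing mixed.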
I recently installed the Splunk Add-on Builder in my local environment (not in the cloud) and developed an add-on using Python and a REST API. Things worked well; I created two data collections and data comes in. I decided to remove the REST API data collection, and right after that the Inputs page failed to load. I used to get the generic status code 500, but now it is "Request failed with status code 404". I went through all of the discussions in the community and none of the answers helped me.

Internal logs search: index=_internal "error"

Result:

message from "/Applications/Splunk/bin/python3.7 /Applications/Splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" RuntimeError: assist binary not found, path=/Applications/Splunk/etc/apps/splunk_assist/bin/darwin_x86_64/assistsup
message from "/Applications/Splunk/bin/python3.7 /Applications/Splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" raise RuntimeError(f'assist binary not found, path={full_path}')

Thanks in advance, Aviv.
Hello, I'm not a JavaScript expert, but I have found useful examples for building setup pages for apps and add-ons: https://github.com/splunk/splunk-app-examples

These examples show forms with input type "text". I see that "textarea" works too. But I can't build other input types, such as "select". How can I add a "select" here? For instance,

e("label", null, [
    "My label ",
    e("input", {
        type: "select",
        name: "mytype",
        options: [
            { label: "File", value: "file" },
            { label: "Folder", value: "folder" },
        ],
        onChange: this.handleChange
    }),
]),

doesn't work. Thank you very much. Kind regards, Marco
Hi Team, I am a newbie to Splunk. I have installed Splunk Enterprise on a server, installed forwarders on other machines, and I'm collecting the application, security, system, and setup logs. But now I want to monitor the websites users visit and the Outlook application, whether it is a separate application or running in a browser, in order to detect phishing attacks. There are no web proxy servers or IIS servers. Can anyone suggest some ideas to complete this use case (detecting phishing attacks)?
So I currently have a stats sum of donuts for the last 90 days and I am getting results like below:

sum(donuts)
54000

But I need a line chart showing the total number of donuts (the sum(donuts) field) for the last 90 days, on a 4-week interval, so I should have something like the chart below. I have a field for the lastEaten date, but that only shows how many were eaten on that specific date.
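A minimal sketch that buckets events into 4-week (28-day) intervals and sums within each bucket (index and field names are assumptions to adjust):

index=your_index
| bin _time span=28d
| stats sum(donuts) as total_donuts by _time

Set the visualization to a line chart. If the relevant date lives in lastEaten rather than _time, convert it first with something like eval _time=strptime(lastEaten, "%Y-%m-%d"), where the format string is an assumption about how lastEaten is stored.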
I am building a search head cluster, but when I initiate clustering I get the error below. I am using the following syntax for cluster initiation. Please note I have checked my server.conf file in the local directory and set cliVerifyServerName and sslVerifyServerName to false. Also, enableSplunkWebSSL = false is already set in web.conf.

./splunk init shcluster-config -auth <username>:<password> -mgmt_uri <uri>:<managementport> -replication_port <replication port> -replication_factor <n> -conf_deploy_fetch_url <url>:<managementport> -secret <security_key> -shcluster_label <label>

WARNING: Server Certificate Hostname Validation is disabled. Please see server.conf/[sslConfig]/cliVerifyServerName for details.
Login failed

I am not looking to set up any third-party or self-signed certificates, as this is just for self-learning. I would appreciate your support.
I am running a search like:

index="main" app="student-api" "path"="/v1/enroll"

And in the events (when I select Raw) I see this type of output:

{
  "application":"student-api",
  "environment":"prod",
  "timestamp":"2023-02-23T08:24:23.163Z",
  "traceId":"2a2e3980-e61b-4927-b270-1785569d5af8",
  "response":{
    "statusCode":"200",
  },
  "request":{
    "protocol":"HTTP/1.1",
    "method":"POST",
    "path":"/v1/enroll",
    "headers":{
      "Accept-Encoding":[ "gzip" ],
      "Accept-Language":[ "en_US" ],
      "Content-Type":[ "application/json; charset=UTF-8" ],
      "Experiments":[ "{\"n\":\"first_enroll\",\"p\":\"BACKEND_SERVICE\",\"v\":\"FIRST\"},{\"n\":\"ttl_ios\",\"p\":\"BACKEND_SERVICE\",\"v\":\"default\"}]}" ],
      "TraceId":[ "2a2e3980-e61b-4927-b270-1785569d5af8" ]
    },
    "cookies":"",
    "body":""
  },
  "duration":115
}

Now I am trying to generate a count of how many ttl_ios entries have value = default ({\"n\":\"ttl_ios\",\"p\":\"BACKEND_SERVICE\",\"v\":\"default\"} is where it is default) vs non-default, so the result should look like:

variant        count
default        10
non-default    3

(default and non-default are the only two values it can have.) Appreciate your help on this. Thanks.
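A minimal sketch that extracts the ttl_ios variant with a deliberately loose regex; the \W+ runs are an assumption that bridges the escaped quotes and punctuation in the raw JSON, so tighten the pattern if your raw format differs:

index="main" app="student-api" "path"="/v1/enroll"
| rex "ttl_ios\W+p\W+BACKEND_SERVICE\W+v\W+(?<ttl_variant>\w+)"
| eval variant=if(ttl_variant="default", "default", "non-default")
| stats count by variant

Events without a ttl_ios experiment fall into non-default here; add a where isnotnull(ttl_variant) before the eval if those should be excluded instead.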
Hello, I have results like below:

Host    Type           Type Duplicate Field
ABCD    Coca Cola      Coca Cola
EFGH    7up - Sprite   7up - Sprite

but I want my search to remove everything after the first space in the Type Duplicate Field, so my table should look like below:

Host    Type           Type Duplicate Field
ABCD    Coca Cola      Coca
EFGH    7up - Sprite   7up

Any help would be greatly appreciated.
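A minimal sketch using split and mvindex to keep only the first space-delimited token; the field name is single-quoted on the read side because it contains spaces:

| eval "Type Duplicate Field"=mvindex(split('Type Duplicate Field', " "), 0)

An equivalent rex version, which rewrites the field in place with a sed expression:

| rex field="Type Duplicate Field" mode=sed "s/\s.*$//"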
Hi, there is a need to send an automated daily summary of emails that are stopped by the anti-spam filter. In Splunk we have act=QUARANTINED and we can see all stopped emails, but how do we create one alert or report for specific users, where each user gets an email containing only the results where their email address matches?

Example: if I do something like this

index="email" act="QUARANTINED"
| lookup email_notify_quarantine email AS orig_recipient OUTPUTNEW notify AS notify
| where notify="true"
...

I'll get results for quarantined messages for all users that need to get that report, and if I create a scheduled report or alert, I am afraid each user will get the full list of quarantined messages, not only theirs.
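One pattern that uses only built-in commands is to drive one sub-search per recipient with map and let sendemail deliver each user's own rows. A minimal sketch, assuming the index, lookup, and field names from the example above (the time window, maxsearches, and the table columns are assumptions to adjust):

index="email" act="QUARANTINED" earliest=-1d@d latest=@d
| lookup email_notify_quarantine email AS orig_recipient OUTPUTNEW notify AS notify
| where notify="true"
| stats count by orig_recipient
| map maxsearches=100 search="search index=email act=QUARANTINED orig_recipient=\"$orig_recipient$\" | table _time subject sender | sendemail to=\"$orig_recipient$\" subject=\"Daily quarantine digest\" sendresults=true inline=true"

Each map iteration substitutes one recipient's address into both the filter and the To field, so every user receives only their own quarantined messages. If installing apps is an option, there is also a third-party "sendresults" app on Splunkbase built for exactly this per-recipient pattern.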
I want to colour a table column based on other values I am reading in Splunk that relate to that field, not based on the range or the data of that field itself.
I've got the following to calculate our quota:

index=summary source="splunk-storage-summary"
| stats latest(activeStorageLicenseGB)

and the following to give a list of how much is in each of our indexes:

index=summary source="splunk-storage-detail"
| stats latest(rawSizeGBCustomer) as "size" by idxName
| sort -size
| fields idxName size

What I'd like to do is display 'size' in the second query as a percentage of our quota, using the result of the first query. I can do it with a join and then an eval, but is there a way to store the result of the first query in a variable I can then use in the second query?
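A minimal sketch using a subsearch with return, which injects the single quota value into an eval as if it were a literal (field names taken from the searches above):

index=summary source="splunk-storage-detail"
| stats latest(rawSizeGBCustomer) as size by idxName
| eval quota=[ search index=summary source="splunk-storage-summary"
    | stats latest(activeStorageLicenseGB) as quota
    | return $quota ]
| eval pct_of_quota=round(100*size/quota, 2)
| sort -size
| table idxName size pct_of_quota

return $quota emits just the value, so the subsearch is substituted into the eval expression before the outer search runs, with no join needed.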