All Topics


Hi, I'm trying to create apps dynamically using the Python script below:

import os
import requests
from requests.auth import HTTPBasicAuth
from requests.exceptions import RequestException

def get_config():
    config = {
        'base_url': 'https://localhost:8089',
        'csv_filename': "test9.csv",
        'template_name': 'template_placeholder',
        'username': 'chakrad8',
        'password': 'random_password'
    }
    config['app_name'] = os.path.splitext(config['csv_filename'])[0]
    return config

config = get_config()

def make_request(method, url, data=None):
    try:
        response = requests.request(method, f"{config['base_url']}{url}",
                                    auth=HTTPBasicAuth(config['username'], config['password']),
                                    data=data, verify=False)
        return response
    except RequestException as e:
        print(f"An error occurred while making the request: {e}")
        return None

response = make_request("GET", f"/services/apps/local/{config['app_name']}")
if response and response.status_code == 200:
    print(f"App '{config['app_name']}' already exists.")
else:
    response = make_request('POST', "/services/apps/local/",
                            {'name': config['app_name'], 'template': config['template_name']})
    if response and response.status_code == 201:
        print(f"App '{config['app_name']}' created successfully.")
    else:
        print(f"Error creating app: {response.text}")

I've created a custom template app called 'template_placeholder' and I want to use it as the template when creating the new app, so I was calling:

response = make_request('POST', "/services/apps/local/", {'name': config['app_name'], 'template': config['template_name']})

This creates the app, but the label stays the same as the custom template app, i.e. 'template_placeholder'. Is there a workaround for this, or is my request malformed in some way? Please help with this.
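A possible follow-up (an assumption, not verified against a live Splunk instance): app properties such as the label can typically be changed after creation by POSTing to the app's own endpoint, /services/apps/local/&lt;app_name&gt;, with a label field. A minimal helper that composes that second request, meant to be fed into the make_request() function from the script above:

```python
def build_label_update(base_url, app_name, label):
    """Compose the POST that would rename the app's label after creation.

    Assumes the standard Splunk endpoint /services/apps/local/<name>
    accepts a 'label' field on POST -- check your version's REST API
    reference before relying on it.
    """
    return {
        "method": "POST",
        "url": f"{base_url}/services/apps/local/{app_name}",
        "data": {"label": label},
    }

req = build_label_update("https://localhost:8089", "test9", "test9")
# Pass req["method"], req["url"].removeprefix("https://localhost:8089"),
# and req["data"] to make_request(), or req["url"] directly to requests.request().
```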
I have many events with the following format:

EVENT 1
{
    'colors': [
        {'color': 'red', 'appearances': 3},
        {'color': 'blue', 'appearances': 2},
        ...
        {'color': 'yellow', 'appearances': 4}
    ]
}

EVENT 2
{
    'colors': [
        {'color': 'green', 'appearances': 1},
        {'color': 'blue', 'appearances': 4},
        ...
        {'color': 'yellow', 'appearances': 2}
    ]
}

I want to accumulate the appearances field across events, grouped by the color field, to get the following output:

Color     Appearances
blue      6
red       3
yellow    6
green     1

Does anyone know how to obtain this result? I have been playing with the mv functions, but I am not able to produce the expected output.
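In SPL this kind of accumulation is usually done with spath to extract the array, mvexpand to get one row per array entry, then stats sum(appearances) by color. The same logic expressed in Python for reference, with illustrative data standing in for the real events:

```python
from collections import Counter

# Illustrative events matching the format in the question
events = [
    {"colors": [{"color": "red", "appearances": 3},
                {"color": "blue", "appearances": 2},
                {"color": "yellow", "appearances": 4}]},
    {"colors": [{"color": "green", "appearances": 1},
                {"color": "blue", "appearances": 4},
                {"color": "yellow", "appearances": 2}]},
]

totals = Counter()
for event in events:                 # one row per event
    for entry in event["colors"]:    # one row per array entry ("mvexpand")
        totals[entry["color"]] += entry["appearances"]   # "stats sum by color"

# totals holds blue=6, yellow=6, red=3, green=1
```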
Is there a way to format the floating-point values displayed on the charts?
So I have an Angular app that compiles and runs, and I get a response, but it is sending me all the data. I have compared it to a Postman call where I put nothing in the body, and the result seems to be the same, so I think the issue is that the POST is not sending the body data. I have tried to send it both in the body and using the params feature. One complication is that when I console.log the params I don't get the specifics like I do with the body. I'm not going to bother including the header here, as it has the token in it and the call does go through, so the preflight and everything seems to be working.

For the body approach I am using:

const body = JSON.stringify({
  search: 'search index=dct_claims_dev dct_tenantID=10061675a sourcetype=\"mscs:azure:eventhub\" \"body.ApplicationName\"=* correlation_id!=null log_level=\"*\" \"body.@timestamp\"=\"*\" message=\"*\" \"body.Data\"=\"*\" | rex field=message \"(?i)(?<Message>.+?)(stack|\\Z)\" | rex field=body.Data \"(?i)(?<Data>.+?)(stack|\\Z)\" | rename \"body.@timestamp\" as \"Timestamp\", \"body.ApplicationName\" as Source, \"correlation_id\" as \"CorrelationId\", \"log_level\" as \"LogLevel\" | table Timestamp dct_tenantID Source dest CorrelationId LogLevel Message Data | sort - Timestamp',
  earliest_time: '-5m',
  latest_time: 'now',
  adhoc_search_level: 'fast'
});
this.http.post('/api', body, { responseType: 'text', headers: headers }).subscribe(response => {
  this.apiResult = response;
  console.log(body);
  console.log(response);
});

and the console.log of body (remember, the response is just everything) is:

{"search":"search index=dct_claims_dev dct_tenantID=10061675a sourcetype=\"mscs:azure:eventhub\" \"body.ApplicationName\"=* correlation_id!=null log_level=\"*\" \"body.@timestamp\"=\"*\" message=\"*\" \"body.Data\"=\"*\" | rex field=message \"(?i)(?<Message>.+?)(stack|\\Z)\" | rex field=body.Data \"(?i)(?<Data>.+?)(stack|\\Z)\" | rename \"body.@timestamp\" as \"Timestamp\", \"body.ApplicationName\" as Source, \"correlation_id\" as \"CorrelationId\", \"log_level\" as \"LogLevel\" | table Timestamp dct_tenantID Source dest CorrelationId LogLevel Message Data | sort - Timestamp","earliest_time":"-5m","latest_time":"now","adhoc_search_level":"fast"}

The params version I have is:

const params = new HttpParams()
  .set('search', 'search index=dct_claims_dev')
  .set('earliest_time', '-5m')
  .set('latest_time', 'now')
  .set('adhoc_search_level', 'fast');
const options = { headers: headers, params: params };
this.http.post('/api', null, { responseType: 'text', headers: headers, params: params }).subscribe(response => {
  this.apiResult = response;
  console.log(params);
  console.log(headers);
  console.log(response);
});

and the console.log of headers and params (I included headers because I wanted to compare the output to params):

HttpParams {updates: null, cloneFrom: null, encoder: HttpUrlEncodingCodec, map: Map(4)}
HttpHeaders {normalizedNames: Map(2), lazyUpdate: null, lazyInit: null, headers: Map(2)}

Any ideas?
I am new to Splunk and I wanted to make a dashboard to show the count of Linux machines and their distributions in the environment. The search is almost what I want, except one value in the output statistic is named wrong. This is the current search:

index=main host=* sourcetype=syslog process=elcsend "\"config " "CentOS Linux release 7.9.2009 (Core)" OR "Rocky Linux release 8.7 (Green Obsidian)" OR "Rocky Linux release 9.1 (Blue Onyx)"
| rex "CentOS Linux release (?P<vers>\d.\d)"
| eval vers = "CentOS ".vers
| rex "Rocky Linux release (?P<vers>\d.\d)"
| eval vers = "Rocky ".vers
| dedup host
| stats count by vers
| addcoltotals label=Total labelfield=vers
| sort count desc

In the output, I am looking to have "Rocky CentOS 7.9" be named just "CentOS 7.9", while the others remain as they are.
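A plausible explanation (an assumption from reading the search, not tested): rex leaves vers untouched when its pattern fails to match, but the eval that follows still runs unconditionally, so CentOS hosts get the "Rocky " prefix glued onto their already-built value. The pipeline's behavior, mirrored in Python:

```python
import re

def classify(raw):
    """Mirror the SPL pipeline: each eval runs unconditionally,
    even when the preceding rex did not match."""
    vers = None
    m = re.search(r"CentOS Linux release (\d\.\d)", raw)
    if m:
        vers = "CentOS " + m.group(1)   # first eval (null-safe here for clarity)
    m = re.search(r"Rocky Linux release (\d\.\d)", raw)
    # second eval: prepends "Rocky " whether or not the Rocky rex matched
    vers = "Rocky " + (m.group(1) if m else vers)
    return vers

# classify("CentOS Linux release 7.9.2009 (Core)") reproduces "Rocky CentOS 7.9"
```

Guarding the second prefix with a conditional in SPL (something like eval vers=if(match(_raw, "Rocky Linux release"), "Rocky ".vers, vers)) should avoid the mislabeling.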
In order to satisfy the "Upgrade Readiness App" in 9.0.2, it seems we must set "requireClientCert = true" in our server.conf under the [sslConfig] stanza. However, when I do this I begin to see a lot of errors in splunkd.log of the following nature:

03-31-2023 10:53:36.274 -0400 ERROR ExecProcessor [1996726 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_secure_gateway/bin/ssg_enable_modular_input.py" Enter PEM pass phrase:

Presumably this is because the key has a passphrase set on it, but we do have "sslPassword" set, so I'm not sure what's causing the issue. Our complete [sslConfig] stanza looks like this:

[sslConfig]
enableSplunkdSSL = true
sslRootCAPath = /opt/splunk/etc/auth/certs/ca.pem
serverCert = /opt/splunk/etc/auth/certs/combined.pem
sslPassword = <REDACTED>
sslVerifyServerCert = true
verifyServerCert = true
requireClientCert = true

I haven't been able to find this issue mentioned anywhere. Any help would be appreciated. TIA.
Hi there,

Before installing the Windows TA add-on on a server, Windows event logs were shown in a different format; they are now shown in XML. I want to see searches in the original format. I have checked the inputs.conf file and noticed "renderXml=true". I changed this to "renderXml=false", but it hasn't made a difference.

Any help would be appreciated.
Jamie
This is what the body field of the email looks like in Splunk:

<html><head> <meta http-equiv="Content-Type" content="text/html; charset=utf-8"><meta name="Generator" content="Microsoft Word 15 (filtered medium)"><style> <!-- @font-face {font-family:"Cambria Math"} @font-face {font-family:Calibri} p.MsoNormal, li.MsoNormal, div.MsoNormal {margin:0in; font-size:11.0pt; font-family:"Calibri",sans-serif} span.EmailStyle17 {font-family:"Calibri",sans-serif; color:windowtext} .MsoChpDefault {font-family:"Calibri",sans-serif} @page WordSection1 {margin:1.0in 1.0in 1.0in 1.0in} div.WordSection1 {} --> </style></head><body lang="EN-US" link="#0563C1" vlink="#954F72" style="word-wrap:break-word"><div class="WordSection1"><p class="MsoNormal"><br>Dear [User],</p><p class="MsoNormal">&nbsp;</p><p class="MsoNormal"><br>I am writing to provide you with an update on the recent test email incident that occurred.<br><br><br><br>If you have any questions or concerns, please do not hesitate to contact our team.<br>Thank you for your attention to this matter.<br><br></p><p class="MsoNormal">Sincerely,<br><br></p><p class="MsoNormal">&nbsp;</p><p class="MsoNormal">&nbsp;</p></div></body></html>

How do I get this to look like the plain-text version below in Splunk? What should the line-breaker configuration be in the sourcetype?

Dear [User],

I am writing to provide you with an update on the recent test email incident that occurred.

If you have any questions or concerns, please do not hesitate to contact our team.
Thank you for your attention to this matter.

Sincerely,
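If the goal is to strip the markup outside Splunk, or to prototype what a cleaned event should look like, the standard-library html.parser can flatten such a body to plain text. A sketch under the assumption that <br> and closing </p> are the only line breaks that matter:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text; <br> and closing </p> become newlines.

    HTMLParser decodes entities such as &nbsp; automatically
    (convert_charrefs defaults to True).
    """
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "br":
            self.parts.append("\n")

    def handle_endtag(self, tag):
        if tag == "p":
            self.parts.append("\n")

    def handle_data(self, data):
        self.parts.append(data)

def html_to_text(html):
    extractor = TextExtractor()
    extractor.feed(html)
    return "".join(extractor.parts).strip()

# Shortened stand-in for the email body above
sample = "<p>Dear [User],</p><p>Thank you.<br>Sincerely,</p>"
text = html_to_text(sample)
```

Within Splunk itself, a SEDCMD in props.conf is the usual place to strip tags at index time, but that rewrites the data as it is indexed and is worth testing on a scratch index first.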
Exchange TA: I have the scripted input for the Admin Audit Log. In a nutshell, the script runs the command below, with additional scripting to dump the output to a file, etc.:

Search-AdminAuditLog -StartDate $LastSeen -EndDate (Get-Date)

I am receiving the following error:

WARNING: An unexpected error has occurred and a Watson dump is being generated: Object reference not set to an instance of an object.

I have created a new domain account with the appropriate roles in Exchange, and I can run this command successfully as the new user. Any ideas on how to troubleshoot this? I'm at a loss; there is nothing in the event viewer that helps.
I am running a script to get the ping status of servers. I onboarded the logs and extracted a field called Servers. In my inputlookup I have 5 fields (ServerName, ApplicationName, Environment, Alias, IPAddress), and I need to map the query results against the inputlookup.

index=* sourcetype=StatusPing
| rex field=_raw "^[^\|\n]*\|\s+(?P<Servers>[^ ]+)"
| eval Status=case(Lost=0, "UP", Lost=2, "Warning", Lost=4, "Down")
| append [| inputlookup PingStatus.csv | rename Servers as ServerName]
| table Alias, EnvironmentName, ApplicationName, ServerName, IPAddress, Lost, Status

Thanks in advance.
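Note that append only stacks the lookup rows underneath the events; mapping usually wants a lookup instead (something like | lookup PingStatus.csv ServerName AS Servers OUTPUT ApplicationName Environment Alias IPAddress). The intended row-by-row enrichment, sketched in Python with made-up rows:

```python
# Hypothetical lookup table keyed by server name (stand-in for PingStatus.csv)
lookup = {
    "web01": {"ApplicationName": "Portal", "Environment": "Prod",
              "Alias": "web01a", "IPAddress": "10.0.0.1"},
}

# Hypothetical search results with the extracted Servers field
results = [{"Servers": "web01", "Lost": 0}]

status_map = {0: "UP", 2: "Warning", 4: "Down"}   # mirrors the case() eval

enriched = []
for row in results:
    merged = {**row, **lookup.get(row["Servers"], {})}  # the "lookup" step
    merged["Status"] = status_map.get(row["Lost"])
    enriched.append(merged)
```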
Hello,

I have a Windows 10 machine on which I have installed Splunk Enterprise (indexer), and on another Linux VM I have installed the forwarder as a Docker container. For the forwarder, I created a volume (monitoring /var/lib/docker) and mapped it to the container:

sudo docker run -d -p 9997:9997 -v simplevol:/var/lib/docker -e "SPLUNK_START_ARGS=--accept-license" -e "SPLUNK_PASSWORD=admin@123" --name splunkcontainer splunk/universalforwarder:8.2

I need to monitor the /var/lib/docker folder on Linux and send any changes to my Windows server's index, but I am not able to achieve that when I install the agent as a container. (With a normal install it works fine and data is sent to the index.)

Could anyone help me with how to achieve communication between the forwarder agent container and the Windows indexer?
Hello,

Does ingesting all the initial data from firewalls, network appliances, servers, etc. into the SC4S log collector stay free, like open-source rsyslog, or does it count as Splunk Enterprise license usage?

Can we also use it to forward data to Elastic/Logstash (ELK)?

Thanks!
Hi,

Can someone recommend a way to save the results of a Splunk search locally or to a shared drive? We're using a hybrid deployment (Cloud and Enterprise), but the data we require is available in Cloud only.

Many thanks,
Toma
Hi. I have added a Splunk DB connection, and I want to retrieve all connections and their metadata via the API. Can anyone help, or point me to the docs?
Hi there!

I need to choose the color in the dashboard based on the text results: where the value is "OK" it should be green, and where the value is "Ko" it should be red. This is the only field that the dashboard returns.

Thanks in advance!
Hi there,

I have a panel "OS" that returns the value of the os field. Based on that value, if it is "Windows" the dashboard should display the "defender version" panel, not "Agent version"; if it is "MAC" it should display "Agent version", not "defender version". I don't want a dropdown for selecting values in the "OS" panel; the os value itself should determine which panel is shown.

Thanks in advance!
Hello, sharing my experience for beginners, especially new Splunk customers.

Connected UF / forwarders:

index=_internal source=*metrics.log group=tcpin_connections | eval sourceHost=if(isnull(hostname), sourceHost,hostname) | rename connectionType as connectType | eval connectType=case(fwdType=="uf","univ fwder", fwdType=="lwf", "lightwt fwder",fwdType=="full", "heavy fwder", connectType=="cooked" or connectType=="cookedSSL","Splunk fwder", connectType=="raw" or connectType=="rawSSL","legacy fwder") | eval version=if(isnull(version),"pre 4.2",version) | rename version as Ver | fields connectType sourceIp sourceHost destPort kb tcp_eps tcp_Kprocessed tcp_KBps splunk_server Ver | eval Indexer= splunk_server | eval Hour=relative_time(_time,"@h") | stats avg(tcp_KBps) as average_kbps sum(tcp_eps) sum(tcp_Kprocessed) sum(kb) by Hour connectType sourceIp sourceHost destPort Indexer Ver | dedup sourceHost | sort - avg(tcp_KBps) | search connectType="univ fwder" | stats dc(sourceHost) as nb_hosts

Current license usage:

index=_internal source=*license_usage.log type=Usage | fields h, b | rename h as host_name | timechart span=1h sum(eval(round(b/1024,2))) AS Total_KB | streamstats sum(Total_KB) as Cumul | fields - Total_KB | tail 1 | eval etatlic=round(Cumul/1024,0) | table etatlic

Chart over the last days:

index=_internal source=*license_usage.log type=Usage earliest=-0d@d latest=now | timechart span=15m sum(eval(round(b/1024/1024,2))) AS Total_MB | eval ReportKey="today" | streamstats sum(Total_MB) as cumul | append [search index=_internal source=*license_usage.log type=Usage earliest=-1d@d latest=-0d@d | timechart span=15m sum(eval(round(b/1024/1024,2))) AS Total_MB | eval ReportKey="yesterday" | streamstats sum(Total_MB) as cumul | eval _time=_time+86400 ] | append [search index=_internal source=*license_usage.log type=Usage earliest=-2d@d latest=-1d@d | timechart span=15m sum(eval(round(b/1024/1024,2))) AS Total_MB | eval ReportKey="2 days ago" | streamstats sum(Total_MB) as cumul | eval _time=_time+172800] | append [search index=_internal source=*license_usage.log type=Usage earliest=-3d@d latest=-2d@d | timechart span=15m sum(eval(round(b/1024/1024,2))) AS Total_MB | eval ReportKey="3 days ago" | streamstats sum(Total_MB) as cumul | eval _time=_time+259200] | timechart span=15m avg(cumul) by ReportKey

Predictive license usage for today:

index=_internal source=*license_usage.log type=Usage | eval MB = round(b/1024,2) | timechart span=1h sum(MB) as totalkb | eval hour = strftime(_time,"%H") | streamstats sum(totalkb) as totalCumulativeMB reset_before="("hour==0")" | eval htilmnight=24-hour | predict totalCumulativeMB future_timespan=24 | where _time=relative_time(now(),"+1d@d") | rename prediction(totalCumulativeMB) as midprediction | eval midprediction=round((midprediction/1024),0) | table midprediction

Most consuming sources (run over today, or over yesterday by adjusting the time range):

index=_* source=*license* | eval h = lower(replace(h,"myFQDN","")) | eval h = lower(replace(h,".myotherFQDN.fr","")) | eval mb=round((b/1024),2) | stats sum(mb) as totalkb by s,h,idx | sort - totalkb | search s!="" | eval totalkb=round(totalkb/1024) | rename totalkb as totalmb | search totalmb>100

Diff license per host:

index=_internal source=*license_usage.log type=Usage earliest=@d latest=@h | stats sum(eval(round(b/1024,2))) AS Total_KB by h,s | join h,s [search index=_internal source=*license_usage.log type=Usage earliest=-1d@d latest=-1d@h | rename b as b_y | stats sum(eval(round(b_y/1024))) AS Total_KB_y by h,s] | eval diff_Total_KB=Total_KB-Total_KB_y | fields - Total_KB* | where (diff_Total_KB<-1000 OR diff_Total_KB>1000) | sort - diff_Total_KB | eval diff_Total_KB=round(diff_Total_KB/1024) | rename diff_Total_KB as diff_Total_MB | eval h = lower(replace(h,"myFQDN","")) | eval h = lower(replace(h,".myotherFQDN.fr","")) | search s!=""

Missing sources:

index=_* source="*license_usage.log" earliest=-1d@d latest=@d | eval h = lower(replace(h,".myFQDN.fr","")) | eval h = lower(replace(h,".myotherFQDN.fr","")) | eval indexname=idx | eval hostname=h | eval sourcename=s | stats sum(b) as sumByesterday by indexname hostname sourcename | eval sumByesterday=round(sumByesterday/1024,0) | search sumByesterday>0 | join indexname hostname sourcename type=left [search index=_* source="*license_usage.log" earliest=@d latest=now | eval h = lower(replace(h,".myFQDN.fr","")) | eval h = lower(replace(h,".myotherFQDN.fr","")) | eval indexname=idx | eval hostname=h | eval sourcename=s | stats sum(b) as sumBtoday by indexname hostname sourcename | eval sumBtoday=round(sumBtoday/1024,0)] | search sumBtoday=0 | sort indexname

More may come later, or don't hesitate to reply. Have a nice day!
Column1    column2    column3
abc        1
def        2
ghi        3
jkl        4
abc                   7
def                   8
ghi                   9
jkl                   5
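The table reads as two partial row sets to be collapsed into one row per Column1 value (in SPL, typically something like | stats values(column2) as column2, values(column3) as column3 by Column1). The intended merge, sketched in Python:

```python
# (key, column2, column3) tuples mirroring the table above;
# None marks the cell that is empty in each half
rows = [
    ("abc", 1, None), ("def", 2, None), ("ghi", 3, None), ("jkl", 4, None),
    ("abc", None, 7), ("def", None, 8), ("ghi", None, 9), ("jkl", None, 5),
]

merged = {}
for key, c2, c3 in rows:
    entry = merged.setdefault(key, {"column2": None, "column3": None})
    if c2 is not None:
        entry["column2"] = c2   # fill from the first row set
    if c3 is not None:
        entry["column3"] = c3   # fill from the second row set

# merged["abc"] collapses to {"column2": 1, "column3": 7}, and so on
```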
Hi,

Could anyone help write a query for this use case: alert when a user triggers both alerts (alert_name="*pdm*" AND alert_name="*encrypted*") within 2 hours.

Another use case is alert_name!="*pdm*": alert when a user triggers an alert other than pdm within 2 hours.

I need a query for the above use cases. Thanks!
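For the first case, a common SPL shape is to bucket by user and check that both alert names appear in a 2-hour window (e.g. | bin _time span=2h | stats values(alert_name) as names by user, _time | where ...). The pairing logic itself, illustrated in Python with made-up alert rows:

```python
from datetime import datetime, timedelta

# Hypothetical alert stream: (user, alert_name, time)
alerts = [
    ("user1", "pdm policy hit", datetime(2023, 4, 1, 10, 0)),
    ("user1", "encrypted upload", datetime(2023, 4, 1, 11, 30)),
    ("user2", "pdm policy hit", datetime(2023, 4, 1, 10, 0)),
]

WINDOW = timedelta(hours=2)

flagged = set()
for user_a, name_a, t_a in alerts:
    for user_b, name_b, t_b in alerts:
        # both alert families, same user, within 2 hours of each other
        if (user_a == user_b and "pdm" in name_a
                and "encrypted" in name_b and abs(t_a - t_b) <= WINDOW):
            flagged.add(user_a)

# user1 triggered both within the window; user2 only triggered the pdm alert
```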
I got an error on the Jetty server while instrumenting the app agent in the jetty.ini file:

Error opening zip file or JAR manifest missing : /D:/APP
Error occurred during initialization of VM
agent library failed to init: instrument