All Topics

I have tried everything I can think of; this should work, but I get nothing. Any help would be greatly appreciated. Here is my CSS:

.dashboard-row .dashboard-panel div[id^="#panel1"] { background-image: url("/static/app/xxx/blue-background.png") !important; background-repeat: no-repeat !important; background-size: cover !important; }
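One likely culprit, as a hedged sketch: in a CSS attribute selector the value is matched literally against the element's id attribute, which does not include a leading "#" (the "#" belongs to id selectors like #panel1, not to attribute values). If the panel's id really starts with "panel1", something like the following may be what was intended; the image path is taken from the post as-is:

```css
/* match panels whose id attribute starts with "panel1" -- no leading "#" */
.dashboard-row .dashboard-panel div[id^="panel1"] {
    background-image: url("/static/app/xxx/blue-background.png") !important;
    background-repeat: no-repeat !important;
    background-size: cover !important;
}
```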
Hello, we want to collect information from Azure Synapse and found that this is not an out-of-the-box DB Connect setup. I've seen a few posts about setting up DB Connect to pull data out of Azure Synapse using the MSAL4J Java driver and a custom stanza in db_connection_types.conf. I wanted to see if anyone has recommendations on how to configure Splunk to collect data from Synapse? TIA!
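For reference, a minimal sketch of the kind of custom stanza those posts describe, assuming Synapse's SQL endpoint speaks the SQL Server wire protocol; every name and value here is an unverified assumption, not a tested configuration:

```ini
# hypothetical local/db_connection_types.conf stanza for DB Connect
[synapse_sql]
displayName = Azure Synapse (SQL)
serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
jdbcDriverClass = com.microsoft.sqlserver.jdbc.SQLServerDriver
jdbcUrlFormat = jdbc:sqlserver://<host>:<port>;databaseName=<database>
port = 1433
```

The MSAL4J jar (and an authentication option in the JDBC URL) would additionally be needed for Azure AD logins.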
Hi, I'm trying to create a report with information about all the hosts, including the kernel version, OpenSSL version, and SSH version. The package.sh script (sourcetype=package) in the Splunk_TA_*nix app fetches all the information, but the issue is that I'm not able to extract fields from sourcetype=package. Any help is appreciated.
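A hedged starting point, assuming the package.sh output is whitespace-separated columns beginning with the package name and version (check the raw events first; the column layout and the index name here are guesses):

```
index=os sourcetype=package ("openssl" OR "openssh" OR "kernel")
| rex "^(?<package_name>\S+)\s+(?<package_version>\S+)"
| stats latest(package_version) as version by host, package_name
```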
I have Splunk logs of two types, successes and failures. They contain two things:

"SUCCESS" "ID: <IDNumber>"
"FAILURE" "ID: <IDNumber>"

My goal is to find IDs that are identified with failures but not also identified with a success. So for the data: "SUCCESS" "ID: 0000", "FAILURE" "ID: 0000", "SUCCESS" "ID: 1111", "FAILURE" "ID: 2222", "SUCCESS" "ID: 3333", "FAILURE" "ID: 4444", the result would be the IDs 2222 and 4444.

My current search is:

index=sampleindex source=samplesource "SUCCESS" | rex field=_raw "ID: (?<id1>+)" | join [search index=sampleindex source=samplesource "FAILURE" | rex field=_raw "ID: (?<id2>+)"] | table id1, id2

I am able to perform each of these searches separately and output the IDs, but when I combine them I don't get results for either id1 or id2, so I can't compare them. Does anyone know how I can structure my search to achieve my final goal?
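A common single-search pattern for this kind of question, as a sketch (it avoids join entirely; note that (?<id1>+) is not a valid regex, so \d+ is assumed below for numeric IDs):

```
index=sampleindex source=samplesource ("SUCCESS" OR "FAILURE")
| rex field=_raw "ID: (?<id>\d+)"
| eval status=if(searchmatch("SUCCESS"), "success", "failure")
| stats values(status) as statuses by id
| where statuses="failure"
```

The where clause keeps only IDs whose complete set of statuses is "failure", i.e. IDs that never appeared with a success.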
I need to plot a world map, color the countries based on the count, and display it on a Studio dashboard. This is my query:

...<omitted> AS iso2 | search iso2=* | stats count by iso2 | lookup geo_attr_countries iso2 OUTPUT country | fields + count, country | geom geo_countries featureIdField="country"

This works fine on the search page: when Visualization=Choropleth Map is chosen, it is plotted correctly. I then add it to a Classic Dashboard, which also works fine. But when I add the same query to a Dashboard Studio dashboard, it doesn't work; I get this error:

Cannot read properties of undefined (reading 'warn')

How do I fix this?
This is a two-parter:

1. Is there a way to export Splunk logs from an indexer to an offline Splunk search head and conduct searches/create dashboards using those imported logs? Is there a licensing issue with this approach?

2. When exporting to the offline SH, I'd like to be able to differentiate which systems I'm searching/viewing in the dashboards, but my different test/dev/prod instances of the UFs that I'm pulling logs from will have the same IP addresses and hostnames. Is there a way to differentiate which instance I'm searching/viewing when dumping those logs into the offline SH?

Thank you.
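For part 2, one hedged option is to tag events at the forwarder with an indexed field before they are exported, e.g. in inputs.conf on each UF (the field name env and its values are placeholders):

```ini
# hypothetical inputs.conf fragment on a dev-environment UF
[default]
_meta = env::dev
```

Events can then be filtered on the offline search head with env::dev even when hostnames and IPs collide across instances.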
Hello, I'm wondering if anyone has created an add-on to onboard Workday business data on employees and contractors. I know Workday has an API to use for this. The add-on on Splunkbase is great, but it doesn't onboard this type of information. Any help is appreciated. Thanks
I have a search that returns all of my correlation searches for a given app:

| rest splunk_server=local count=0 /services/saved/searches | where match('action.correlationsearch.enabled', "1|[Tt]|[Tt][Rr][Uu][Ee]") | rename eai:acl.app as app, title as csearch_name, action.correlationsearch.label as csearch_label, action.notable.param.security_domain as security_domain | search app=my_app | table csearch_name, csearch_label, app, security_domain, qualifiedSearch, description

This works fine and gives the desired output. However, I would like to add a line that would automatically expand any macros in the qualifiedSearch field, e.g. search `azuread` "body.operationName"="Add member to role" would instead return: search sourcetype=mscs:azure:eventhub "body.operationName"="Add member to role"

Is there a lookup or macro-expand function that I could add to my SPL to do this?
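There is no documented SPL function that expands macros inside a field value. One hedged workaround for simple zero-argument macros is to pull the definitions from the configs endpoint and do the substitution yourself, as a sketch (multi-argument and nested macros would need more work):

```
| rest splunk_server=local count=0 /servicesNS/-/-/configs/conf-macros
| table title, definition
```

Joining that table to the correlation-search results by macro name and applying replace() on qualifiedSearch is then a manual, per-macro exercise.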
I have a big query that produces output like this. The rows are a guid, a count of occurrences, and then IP addresses (they're stored as CSV like that in the raw data). What I'm attempting to do is combine instances of the same guid, sum all occurrences, and then have a column that is one big CSV of ALL the IP addresses for that guid. I've tried many things, but I'm not having any luck.
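Assuming the three columns are named guid, count, and ips (the names are guesses from the description), a sketch:

```
| eval ips=split(ips, ",")
| stats sum(count) as total_count, values(ips) as ips by guid
| eval ips=mvjoin(ips, ",")
```

The initial split() turns each row's CSV into a multivalue field so that values() can deduplicate individual addresses, before mvjoin() rebuilds one big CSV per guid.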
My first search, with regex, is the following:

index=bigip "Storefront_v243" | rex ".*Common:(?<sid>.*?): New session from client IP (?<ip>.*?) \(ST.*\) at VIP 123.45.78.172"

In my second search, I have to reference the two fields matched by the first search:

index=bigip "Storefront_v243" | rex "Common:$sid$: Username '(?<un>.*?)' | stats count as nrs by sid, un, $ip$ | dedup un $ip$

How can I combine these two search queries into one using pipes? Thanks a lot in advance!
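The $sid$/$ip$ token syntax only substitutes dashboard tokens, not fields from another search, so as a sketch, one combined search (assuming both message types share the same Common:<sid> prefix so the session id ties them together):

```
index=bigip "Storefront_v243"
| rex ".*Common:(?<sid>.*?): New session from client IP (?<ip>.*?) \(ST.*\) at VIP 123.45.78.172"
| rex "Common:(?<sid2>.*?): Username '(?<un>.*?)'"
| eval sid=coalesce(sid, sid2)
| stats values(ip) as ip, values(un) as un, count as nrs by sid
```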
I need to extract the values between >>>>|| || and after the >>>>|| ||, referring to the sample below. The output should be:

values between >>>>||1407|| should be temp=1407
values after >>>>||1407|| should be message=[POD CleanUp] File deleted from POD : /dfgd/dfgdfgdfg.dat

Here is the sample log:

{"source":"fdgdfdfg","log":"2023-08-21 04:07:12.400 INFO 42 --- [dfgdf] c.j.t.f.dgf.dfgd.dgf : >>>>||1407|| [POD CleanUp] File deleted from POD : /dfgd/dfgdfgdfg.dat","host":"xx-ret353.svr.gg.fghs.net","tags":["_dateparsefailure"],"@version":"1","Kubernetes.pod":"gkp-xcs-services-black-prd-67986d784-b6c5j","s_sourcetype":"tyu","@timestamp":"2023-08-21T08:07:28.420Z","Kubernetes.namespace":"80578d64606-56-fyt-ty-prod","appId":"1235","app_id":"2345","log_file":"/app/logs/app.log","Kubernetes.node":"sd-1564sw32b0f.svr.us.sdf.net"}

@ITWhisperer
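A hedged sketch, assuming the events are parsed as JSON so the text lives in a field named log (otherwise run the rex against _raw instead):

```
| spath log
| rex field=log ">>>>\|\|(?<temp>\d+)\|\|\s+(?<message>.+)$"
| table temp, message
```

The pipes must be escaped as \| because a bare | is the regex alternation operator.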
Hi Team, I have integrated AppD with ServiceNow, which was successful, but in the description of the created incident I am getting <br> or </b> in between the sentences. Could you please let me know how to remove these junk characters from the description of the created incident? Thanks, Kamal Rath
Hi, I am looking for a query to extract, respectively, the lists of alerts, reports, and dashboards whose code contains more than one join. Thank you. Best regards, Marta
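A hedged starting point for the alerts/reports half, using the saved-searches endpoint and a crude substring count (this will also match words like "joined", and dashboards would need a similar pass over /servicesNS/-/-/data/ui/views):

```
| rest splunk_server=local count=0 /services/saved/searches
| eval join_count = mvcount(split(lower(search), "join")) - 1
| where join_count > 1
| table title, eai:acl.app, join_count
```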
Hello Community, I am trying to calculate the number of days (the difference) between today's date and a list of dates, but I'm getting the desired result for only 1 record out of the list of records (different dates). Can anyone help here please? This is the query I am using:

index="cds_prod_app" sourcetype=httpevent source="lambda:dip-prod-certs-validity-Splunk" | eval today=strftime(now(), "%d-%m-%Y") | eval todaydate=strptime(today, "%d-%m-%Y") | eval t1 = mvindex(expiry,0) | eval expirydate=strptime(t1, "%d-%m-%Y") | eval diff = round((expirydate - todaydate)/86400) | table expiry, today, diff

***Attaching screenshot

~TIA
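One thing worth checking, as a sketch: strptime() returns null whenever a value does not exactly match the format string, so mixed date formats in expiry would explain only one record working. A slightly tightened variant that also anchors "today" at midnight:

```
| eval expirydate = strptime(mvindex(expiry, 0), "%d-%m-%Y")
| eval diff = round((expirydate - relative_time(now(), "@d")) / 86400)
| table expiry, diff
```

Rows where diff comes back empty point at expiry values stored in a different format.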
Hi All, we have created multiple reports on our Splunk Cloud search head. Once we schedule them, we want the reports to reach a shared network drive. How can we get that enabled from our end?
Hi, I am trying to get the overall score for the Reg type. Each system already has a score; I'm just trying to calculate the overall score by Registration Type. Any help would be appreciated. Thanks
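Assuming the events carry fields named score and registration_type (both names are guesses, since the actual fields aren't shown), a sketch of one common reading of "overall score":

```
| stats avg(score) as overall_score by registration_type
```

Swap avg() for sum() or max() depending on how the overall score is defined.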
I'm trying to make a high-level view dashboard that has multiple dashboards in it. I want to use the sparkline because it is a compact chart with a lot of information. My problem is that the sparkline shouldn't show any partial time buckets, because someone might look at the dashboard and think something is wrong because of the dip at the end of the sparkline. From my understanding I cannot use partial=f in my query because I use stats. Is there any other way to achieve this? This is my query:

| stats sparkline(max(field7)) as "sparkline" max(field7) by field10 | rename field10 as "Environment" | rename max(field7) as "Response time max" | rex field="Response time max" mode=sed "s/(\.\d{2})\d*/\1/"
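One hedged workaround: snap the search window itself so no partial bucket exists, e.g. with an hourly bucket size (your_base_search is a placeholder, and @h/1h should match the sparkline span you actually use):

```
your_base_search earliest=-24h@h latest=@h
| stats sparkline(max(field7), 1h) as "sparkline" max(field7) by field10
```

With latest=@h the time range ends on a bucket boundary, so the final sparkline bucket is always complete and there is no trailing dip.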
Hello Community, I get a version via a REST call every 24h. How do I get an alert mail with the new version (like 2023-09) as text in the message body when a new, different version appears, like:

Line 1: 2023-09
Line 2: 2023-08

index="my_index" source="/var/log/my_version" | head 2 | spath version

Regards, Markus
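A sketch of an alert search building on the query above (it assumes each event carries exactly one version value; the search returns a row only when the two newest versions differ, so the alert can fire on "number of results > 0" and the email body can reference $result.new_version$):

```
index="my_index" source="/var/log/my_version"
| head 2
| spath version
| stats list(version) as versions
| where mvcount(versions)=2 AND mvindex(versions,0)!=mvindex(versions,1)
| eval new_version=mvindex(versions,0)
```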
Hello, I'm using Splunk Cloud and I'm looking for an option to disable multiple alerts using the REST API or a script, so it will be semi-automatic. Since I'm using Cloud, I don't have access to the savedsearches.conf file. Any ideas? Thanks
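The saved-searches REST endpoint does accept a disabled attribute, so a hedged sketch (host, app, credentials, and alert name are all placeholders, and REST access to a Splunk Cloud search head on port 8089 may first have to be enabled via Splunk Support):

```shell
# disable one alert; repeat or loop over a list of names
curl -k -u <user>:<password> \
  "https://<search-head>:8089/servicesNS/nobody/<app>/saved/searches/<alert_name>" \
  -d disabled=1
```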
Hi, we are working on developing a second version of the app. Our use case is that when the app is upgraded, we need to clear the existing KV store contents programmatically. Note that we need to clear this KV store only at the beginning of the app upgrade. It would be great if you could share some guidelines on achieving this.
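One hedged approach: the KV store data endpoint supports DELETE to drop all records in a collection, so the upgrade step could call it once (app and collection names are placeholders; the trigger, e.g. a setup script that compares the installed version in app.conf, is left open):

```shell
# remove all records from the collection, keeping the collection itself
curl -k -u <user>:<password> -X DELETE \
  "https://localhost:8089/servicesNS/nobody/<app_name>/storage/collections/data/<collection_name>"
```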