All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi, is there any app in Splunk to monitor UPS logs, or any sample/demo UPS monitoring dashboard available that I can use as a reference? Thanks.
How to fill null values in a JSON field

Hello community, good afternoon. I am stuck on a challenge and cannot get the expected result. Currently I have a log that contains a field in JSON format:

{ "data": { "steps": [
{ "code": "11501", "index": "0", "counter": "1", "stepType": "Req & Resp", "optional": "False", "parallel": "False", "parents": "-1", "children": "1", "begin": "2021-03-24T10:00:02.597-03:00", "serviceOut": "2021-03-24T10:00:02.598-03:00", "serviceIn": "2021-03-24T10:00:02.857-03:00", "end": "2021-03-24T10:00:02.859-03:00", "timedout": "False", "lateResp": "False", "status": "C", "return": "" },
{ "code": "11502", "index": "1", "counter": "2", "stepType": "Req & Resp", "optional": "False", "parallel": "False", "parents": "0", "children": "2", "begin": "2021-03-24T10:00:02.860-03:00", "serviceOut": "2021-03-24T10:00:02.886-03:00", "serviceIn": "2021-03-24T10:00:03.238-03:00", "end": "2021-03-24T10:00:03.243-03:00", "timedout": "False", "lateResp": "False", "status": "C", "return": "03546" },
{ "code": "11505", "index": "4", "counter": "3", "stepType": "Req & Resp", "optional": "False", "parallel": "False", "parents": "3", "children": "5", "begin": "2021-03-24T10:00:03.246-03:00", "serviceOut": "2021-03-24T10:00:03.250-03:00", "serviceIn": "2021-03-24T10:00:03.293-03:00", "end": "2021-03-24T10:00:03.294-03:00", "timedout": "False", "lateResp": "False", "status": "C", "return": "03546" }
] } }

Splunk query:

index="pagos_moviles" source="/veritran/vt-net/log/MP_TRANSAC_LOG_BECP_ND*.log" trxl_resp!="000"
| spath input=trxl_tech_detail
| fields _time trxl_tech_detail data.steps*
| fillnull value="-" data.steps*
| rename data.steps{}.begin AS begin, data.steps{}.children AS step, data.steps{}.code AS code, data.steps{}.counter AS counter, data.steps{}.end AS end, data.steps{}.index AS index, data.steps{}.lateResp AS lateResp, data.steps{}.optional AS optional, data.steps{}.parallel AS parallel, data.steps{}.parents AS parents, data.steps{}.return AS return, data.steps{}.serviceIn AS serviceIn, data.steps{}.serviceOut AS serviceOut, data.steps{}.status AS status, data.steps{}.stepType AS stepType, data.steps{}.timedout AS timedout
| eval campos=mvzip(begin,mvzip(step,mvzip(code,mvzip(end,mvzip(index,mvzip(status,mvzip(return,timedout)))))))
| mvexpand campos
| eval campos=split(campos,",")
| eval begin=mvindex(campos,0)
| eval step=mvindex(campos,1)
| eval code=mvindex(campos,2)
| eval end=mvindex(campos,3)
| eval index=mvindex(campos,4)
| eval status=mvindex(campos,5)
| eval return=mvindex(campos,6)
| eval timedout=mvindex(campos,7)
| table _time trxl_tech_detail campos Servicio Operacion begin step code counter end index return status timedout trxl_resp Detalle Tipo_Error trxl_transaction_code
| fillnull value="-" return

Result: As you can see in the screenshot, it only expands 2 rows when there should be 3. I also noticed that the return value does not always exist, and I think this is what causes it not to expand the whole JSON. Is there a way to fill the null values in the JSON with some character? Thank you very much in advance, and please excuse my English; it is not my native language.
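A sketch of one possible workaround, assuming the empty "return": "" values are what make the mvzip chain come up short: substitute a placeholder into the JSON field before spath extracts it (the "-" placeholder is arbitrary):

```spl
index="pagos_moviles" source="/veritran/vt-net/log/MP_TRANSAC_LOG_BECP_ND*.log" trxl_resp!="000"
| eval trxl_tech_detail=replace(trxl_tech_detail, "\"return\":\s*\"\"", "\"return\": \"-\"")
| spath input=trxl_tech_detail
```

With every step carrying a non-empty return, each multivalue field should end up with the same number of entries, so the mvzip/mvexpand chain should produce one row per step.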
I am currently trying to parse data to map it to a specific CIM-compliant field name. Specifically, I have set up a field alias like this: AffectedItems{}.Attachments ASNEW file_name. After creating this alias, when I search the data I can see the original field, but file_name appears in only a fraction of the total events (percentages are based on results at the time of my most recent search): AffectedItems{}.Attachments: 25.52% coverage; file_name: 0.08% coverage. To clarify, I am trying to normalize this data for the CIM Email data model. The small file_name coverage comes from another sourcetype where I had created a field alias: messageParts{}.filename ASNEW file_name. That second sourcetype has a much smaller amount of data, but there the two fields have identical coverage of 98.9%. At first we theorized it might be an issue with the curly braces, but one alias works and the other does not. Looking to see if anyone has encountered a similar issue and knows the cause.
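For comparison, one way such an alias is commonly written in props.conf, quoting the field name because of the {} characters (the stanza name below is hypothetical):

```
[your_email_sourcetype]
FIELDALIAS-attachments = "AffectedItems{}.Attachments" ASNEW file_name
```

Checking whether the working and non-working aliases differ in quoting, or live in apps with different permission scopes, may help narrow this down.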
I'm trying to track state changes but am having a difficult time. Ideally I'd like to know when a state changes from 0 to either 1, 2, or 3 and then back to 0, capturing in a dashboard panel the date/time of the event where the value first changed from 0 and of the event where it went back to 0. Is that possible?
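A sketch of how such transitions are often detected with streamstats; the index, sourcetype, and state field names below are placeholders for whatever your data uses:

```spl
index=your_index sourcetype=your_sourcetype
| sort 0 _time
| streamstats current=f last(state) as prev_state
| where (prev_state=0 AND state!=0) OR (prev_state!=0 AND state=0)
| eval transition=if(state!=0, "0 -> ".state, prev_state." -> 0")
| table _time prev_state state transition
```

Each result row then marks either the moment the value left 0 or the moment it returned to 0, which a dashboard panel can table directly.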
I have 3 search heads and 1 search head deployer. I need to create a new app, for which I will create a role that can query a few indices assigned to it in the Resources tab. I need to create the roles on the search heads, because I have SAML auth set up there. When I create this new app on one search head (via UI -> Manage Apps -> Create App), I do not see it on the other search heads, or in the list of resources when I try to create the role. When I create the app on the search head deployer, I also cannot see it on the SHs when creating the role. I cannot run `splunk apply shcluster-bundle`; when I try, it fails because Splunk tries to create something in `/dev/null/.splunk` (it runs under a system account that does not have shell access, so I guess that's related and it cannot change). Where do I create the app, and how do I push it to the search heads so I can see it there? Thanks for your time.
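For reference, apps created through the UI on a single cluster member are not replicated to the other members; the usual flow is to place the app under $SPLUNK_HOME/etc/shcluster/apps/ on the deployer and push it. A sketch, with a placeholder member URI and credentials:

```
splunk apply shcluster-bundle -target https://sh1.example.com:8089 -auth admin:yourpassword
```

The -target can be any one member; the deployer distributes the bundle to all of them. The /dev/null/.splunk failure suggests the service account has no valid home directory, which is worth fixing separately (e.g. setting HOME or SPLUNK_HOME appropriately for that account).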
I have a search that I am using for tracking VPN connections, and I have found that some users have multiple connections throughout a day; I would like to add the same-day connections together.

Here is what I get from my search:

user  Duration  Termination Reason  _time  LocationIP
jhicks  0h21m23s  User Requested  2021-03-25T14:43:48.000-0400  Waverly, United States
jhicks  0h31m16s  Idle Timeout  2021-03-25T14:15:42.000-0400  Waverly, United States
jhicks  0h09m49s  User Requested  2021-03-25T13:23:03.000-0400  Waverly, United States
jhicks  1h53m07s  Idle Timeout  2021-03-25T12:57:42.000-0400  , United States
jhicks  2h27m12s  Idle Timeout  2021-03-24T15:32:43.000-0400  , United States

And here is roughly what I am looking for, just adding the Duration fields together when they occur on the same day:

jhicks  2h55m35s  Idle Timeout  2021-03-25  , United States
jhicks  2h27m12s  Idle Timeout  2021-03-24  , United States

Here is what my search looks like:

Cisco_ASA_message_id=113019 NOT "AnyConnect-Parent"
| transaction user endswith="Duration:" keepevicted=true
| eval full_duration = duration_hour."h".duration_minute."m".duration_second."s"
| eval Start_time=strftime(_time,"%Y/%m/%d %H:%M:%S"), End_time=(strftime(_time + duration,"%Y/%m/%d %H:%M:%S")), Total_time=if(isnull(full_duration), Start_time." --> current session", Start_time." --> ".End_time)
| mvexpand src
| iplocation src
| eval LocationIP=City.", ".Country
| stats values(host) as host values(Total_time) as "Session Time" values(src) as "PublicIP" values(LocationIP) as LocationIP values(assigned_ip) as "Assigned IP" values(reason) as "Termination Reason" values(bytesMB) as bytesMB values(bytes_inMB) as bytes_inMB values(bytes_outMB) as bytes_outMB values(full_duration) as Duration by _time, user
| sort -_time
| search PublicIP=*
| search user=$_user$
| table user Duration "Termination Reason" _time LocationIP "PublicIP" Duration
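One way to roll same-day sessions together, assuming the numeric duration field (in seconds) produced by transaction is still available at that point in the pipeline, is to bucket by calendar day before the final stats. A sketch to adapt, not a drop-in replacement:

```spl
...
| eval day=strftime(_time, "%Y-%m-%d")
| stats sum(duration) as total_secs values(LocationIP) as LocationIP latest(reason) as "Termination Reason" by user, day
| eval Duration=tostring(total_secs, "duration")
```

Summing the numeric seconds and formatting at the end avoids having to parse the "0h21m23s" strings back apart.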
Hello everyone. I have a task of matching sessions: a user connects via VPN and then uses the same account over RDP. If the VPN user equals the RDP user, session = "matched"; otherwise "not matched".

VPN sessions:

index=fortigate eventtype=ftnt_fgt_event subtype=vpn (tunneltype="ssl-tunnel" OR tunneltype="ssl-web")
| transaction startswith=(logdesc="SSL VPN tunnel up") endswith=(command="SSL tunnel established")
| dedup tunnelid

RDP sessions:

index=windows source=WinEventLog:Microsoft-Windows-TerminalServices* (EventCode=1149 OR EventCode=21)
| mvexpand User
| search User!=NOT_TRANSLATED
| rex field=User "(?<user>[\w+\.\w+]*$)"
| table _time, user, Source_Network_Address

I haven't built the combined query myself yet. Grateful for any thoughts on this.
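A sketch of one way to combine the two searches and flag matches, assuming each sourcetype yields a comparable user field (vpn_user and rdp_user below are placeholders for whatever your extractions actually produce):

```spl
(index=fortigate eventtype=ftnt_fgt_event subtype=vpn (tunneltype="ssl-tunnel" OR tunneltype="ssl-web"))
OR (index=windows source=WinEventLog:Microsoft-Windows-TerminalServices* (EventCode=1149 OR EventCode=21))
| eval user=lower(coalesce(vpn_user, rdp_user))
| stats dc(index) as source_count by user
| eval session=if(source_count=2, "matched", "not matched")
```

Normalizing case (and domain prefixes, if present) before the stats is usually necessary for the comparison to line up.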
In the Cofense add-on (https://splunkbase.splunk.com/app/5253/) the credentials confused me for a while. I tried adding a username and password in the credentials section, which I assumed meant the username and password you use to access the GUI. In the Cofense v2 API you can create a Client ID and Client Secret, but that doesn't seem to work with the app (I think it uses v1). I eventually found, under Account Management, that a user can create an API v1 key: use the GUI login as the username and the API key as the password. It seems obvious now, but I'm posting for people like me who might find this useful.
Hello my dear friends
Regarding correlation searches: do they have to be configured in both Splunk Enterprise and ES? Could they live on only one of the two and be reused across the whole environment? If they can be on one side only, how do I benefit from them across the whole environment?
I need to get the average license utilization per sourcetype and host over 30 days for a particular index. I was trying the following, which I got from Answers, but I am not sure whether the query is correct:

index=_internal source=*license_usage.log* type="Usage" idx="xxx" earliest=-30d@d latest=@d
| eval h=if(len(h)=0 OR isnull(h),"(SQUASHED)",h)
| eval s=if(len(s)=0 OR isnull(s),"(SQUASHED)",s)
| eval st=if(len(st)=0 OR isnull(st),"(UNKNOWN)",st)
| fields _time, pool, b, h, st
| bin _time span=1d
| stats sum(b) as b by _time, pool, h, st
| stats sum(b) AS volume by h, _time, st
| stats avg(volume) AS avgVolume max(volume) AS maxVolume by h, st
| eval avgVolumeGB=round(avgVolume/1024/1024/1024,3)
| eval maxVolumeGB=round(maxVolume/1024/1024/1024,3)
| fields h, st, avgVolumeGB, maxVolumeGB
| rename avgVolumeGB AS "average" maxVolumeGB AS "peak", st AS "sourcetype", h AS "hostname"
| sort -sourcetype, hostname
| head 10
My Splunk query is giving results, but it is showing latitude & longitude details for all the countries. I want my results to show the country name and Event_Title details. Please help.

Splunk query: index=graphsecuityalert | iplocation LogonIP | geostats count by Event_Title
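geostats aggregates by latitude/longitude because it feeds the map visualization. If a table by country name is what you want, a sketch using plain stats over the iplocation output (assuming LogonIP holds the source IP):

```spl
index=graphsecuityalert
| iplocation LogonIP
| stats count by Country, Event_Title
```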
Hi there! I am new to Splunk and I have a task: "Find the count of employees based on their experience range: 0-5, 5-10, 10-15, and 15-20" for a company. I have tried a lot using comparison operators but didn't get the answer. Can you please help me with this?
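A sketch of the usual approach with eval and case; the index name and the experience field name below are placeholders for your actual data:

```spl
index=company_data
| eval exp_range=case(experience>=0 AND experience<5, "0-5",
    experience>=5 AND experience<10, "5-10",
    experience>=10 AND experience<15, "10-15",
    experience>=15 AND experience<=20, "15-20")
| stats count by exp_range
```

Each event gets a bucket label, and stats then counts employees per bucket.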
How do I get the status and list of my correlation searches via the GUI, and how do I get the best out of them?
Hi all, I would like to get the last event time of each day; my search window is the last 30 days. For example: if my query returns 3 events for day 1 and 5 events for day 2, I need only two events in the output: the last event time of day 1 and the last event time of day 2, and so on. I tried to do this with the table command and it works for me, but I need to do it without the table command. It would help if you could show me how to rename or create a duplicate field of date_mday and _time. My current search: search | table date_mday, _time | dedup date_mday | sort date_mday
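A sketch without table, bucketing by calendar day rather than date_mday (which repeats across months inside a 30-day window):

```spl
search
| bin span=1d _time AS day
| stats max(_time) as last_event_time by day
| eval day=strftime(day, "%Y-%m-%d"), last_event_time=strftime(last_event_time, "%Y-%m-%d %H:%M:%S")
```

stats max(_time) keeps exactly one row per day: the timestamp of that day's latest event.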
Hi, I need your help in knowing whether it is possible to have an alert that triggers at 1 PM every day, and if the search result is zero, the same alert should trigger again at 4 PM. But if the 1 PM search result is not zero, then the 4 PM alert shouldn't run. Thanks in advance.
Hi everyone, I have one requirement. I have two queries like this:

index=abc ns=sidh-datagraph3 "NullPointerException"
index=abc ns=sidh-datagraph3 "IllegalStateException"

I want to combine them into a single query. Can someone guide me on this? Thanks in advance.
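The two searches can be combined with an OR; a sketch that also labels which exception each event matched:

```spl
index=abc ns=sidh-datagraph3 ("NullPointerException" OR "IllegalStateException")
| eval exception=case(searchmatch("NullPointerException"), "NullPointerException",
    searchmatch("IllegalStateException"), "IllegalStateException")
| stats count by exception
```

Drop the last two lines if you only need the raw events from both queries in one result set.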
Hello all, I am not good with regular expressions and need your assistance. In my data, I have a field containing IPs and ports in a specific sequence: ...some text... SourceIP DestIP SrcPort DestPort ...some text... with one SPACE between them. As an example: message=...w 2-APIS 0-External-1 tcp 10.0.12.13 40.126.31.8 55373 443 msg=\"HTTS... I need to extract fields for SrcIP, DestIP, SrcPort and DestPort. When I use \b(?:[0-9]{1,3}\.){3}[0-9]{1,3}\b \b(?:[0-9]{1,3}\.){3}[0-9]{1,3}\b \d* \d* or \b(?:[0-9]{1,3}\.){3}[0-9]{1,3} (?:[0-9]{1,3}\.){3}[0-9]{1,3}\b \d* \d* I can grab the two IPs and ports with spaces between them, but I am confused about how to assign each to a new field. Can someone help? Or do I have to use rex for a search-time extraction? Even if rex is needed, I'd appreciate your advice. Regards, -Ali
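A sketch of a rex extraction that wraps each value in a named capture group, assuming the four values always appear in that order separated by single spaces:

```spl
... | rex field=message "(?<SrcIP>(?:\d{1,3}\.){3}\d{1,3})\s(?<DestIP>(?:\d{1,3}\.){3}\d{1,3})\s(?<SrcPort>\d+)\s(?<DestPort>\d+)"
```

Each named group becomes its own field automatically, so no separate assignment step is needed; the same pattern also works in a search-time EXTRACT in props.conf.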
I have five different weights under each OrderNr.
__________________________
Weight: 0.898, WeightTypeId: 1, OrderNr: 8478
Weight: 0.094, WeightTypeId: 2, OrderNr: 8478
Weight: 7.45, WeightTypeId: 3, OrderNr: 8478
Weight: 0.0, WeightTypeId: 4, OrderNr: 8478
Weight: 7.45, WeightTypeId: 5, OrderNr: 8478
...............................
___________________________________
I would like to calculate the total weight and the yields, which are calculated like this:
Total = Weight(WeightTypeId 1) + Weight(WeightTypeId 2) + Weight(WeightTypeId 4) + Weight(WeightTypeId 5)
Yield1 = Weight(WeightTypeId 1) / Total
Yield2 = Weight(WeightTypeId 3) / Total
I am thinking of using eval to assign the Weight under WeightTypeId 1 to weight1, WeightTypeId 2 to weight2, and so on; then it is easy to do the calculations: | eval weight1=if(WeightTypeId=1, Weight, 0). But somehow I feel this is not correct. Can someone help me with that? Thanks a lot!
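A sketch of one way to pivot the five weights onto one row per order with stats and inline eval, then compute the totals and yields:

```spl
...
| stats sum(eval(if(WeightTypeId=1, Weight, null()))) as w1
        sum(eval(if(WeightTypeId=2, Weight, null()))) as w2
        sum(eval(if(WeightTypeId=3, Weight, null()))) as w3
        sum(eval(if(WeightTypeId=4, Weight, null()))) as w4
        sum(eval(if(WeightTypeId=5, Weight, null()))) as w5
        by OrderNr
| eval Total=w1+w2+w4+w5, Yield1=w1/Total, Yield2=w3/Total
```

This avoids the mvexpand/alignment problems of keeping five separate eval fields on per-event rows.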
What do I need to check or do to resolve this, please? What causes "searches delayed" alerts in Splunk Enterprise? The error message says "searches delayed".