All Topics



When I try to save my experiment with the Smart Forecasting Assistant, I get a "Cannot validate experiment" error. I checked all of the mandatory fields in Learn and found nothing wrong, along with all the items in the Review screen. Do I need to be granted additional access to be able to save MLTK experiments?
I'm working with a CSV file with this header:

Filenm,EIN,Status,Business Function,Maintained By, Region,Manufacturer Name,Building Name,Service Area,Model Name,Model Number,Serial Number,AM Tag Number,Equipment Type,Equipment Type Description,Network Connection Type Wired,IP Address v4 Wired,Nuvolo Flag,MAC Address Wired,Equipment Status Detail,Network Connection Type Wireless,IP Address v4 Wireless,IP Address Type Wireless,IP Address Type Wired,MAC Address Wireless,Host Name,Fully Qualified Domain Name,OS Version,Asset Type,Contains ePHI,Application Software Name

What I would like to do is have Splunk transform these to something closer to Splunk field names, such as:

filenm,ein,status,business_function,maintained_by,region,manufacturer_name,building_name,service_area,model_name,model_number,serial_number,am_tag_number,equipment_type,equipment_type_description,network_connection_type_wired,ip_address_v4_wired,nuvolo_flag,mac_address_wired,equipment_status_detail,network_connection_type_wireless,ip_address_v4_wireless,ip_address_type_wireless,ip_address_type_wired,mac_address_wireless,host_name,fully_qualified_domain_name,os_version,asset_type,contains_ephi,application_software_name

The only thing I've been able to find is putting something like this in the TA's transforms.conf:

[edge_asset_header]
DELIMS = ","
FIELDS = "filenm","ein","status","business_function","maintained_by","region","manufacturer_name","building_name","service_area","model_name","model_number","serial_number","am_tag_number","equipment_type","equipment_type_description","network_connection_type_wired","ip_address_v4_wired","nuvolo_flag","mac_address_wired","equipment_status_detail","network_connection_type_wireless","ip_address_v4_wireless","ip_address_type_wireless","ip_address_type_wired","mac_address_wireless","host_name","fully_qualified_domain_name","os_version","asset_type","contains_ephi","application_software_name"

Is this the only solution, or did I miss something? TIA, Joe
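Two things worth noting here, as a sketch rather than a definitive answer: a transforms.conf stanza like the one above only takes effect once it is wired to the sourcetype via a REPORT- line in props.conf; and if the file is ingested as structured data, props.conf can rename the columns at index time with INDEXED_EXTRACTIONS and FIELD_NAMES (the sourcetype name below is a placeholder):

# props.conf on the instance that monitors the file (placeholder sourcetype name)
[edge_asset_csv]
INDEXED_EXTRACTIONS = csv
# treat line 1 as the header row so it is not indexed as data
HEADER_LINE_NUMBER = 1
# override the header with Splunk-friendly field names
FIELD_NAMES = filenm,ein,status,business_function,maintained_by,region,manufacturer_name,building_name,service_area,model_name,model_number,serial_number,am_tag_number,equipment_type,equipment_type_description,network_connection_type_wired,ip_address_v4_wired,nuvolo_flag,mac_address_wired,equipment_status_detail,network_connection_type_wireless,ip_address_v4_wireless,ip_address_type_wireless,ip_address_type_wired,mac_address_wireless,host_name,fully_qualified_domain_name,os_version,asset_type,contains_ephi,application_software_name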
This is a bit of a sanity check / what I don't know can still hurt me. I have an indexer cluster with a couple of large (>1GB) apps that need to be deployed, and they are relatively static in their config changes. So I currently have those apps being deployed via a deployment server, with the apps set to not restart splunkd. This helps reduce bundle deployment and validation times. What risks/issues am I unaware of in running this setup, besides the obvious one that peers may get restarted if an app ever gets changed to restart splunkd?
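For reference, the knob in question lives in serverclass.conf on the deployment server; a minimal sketch (server class and app names are placeholders):

# serverclass.conf on the deployment server
[serverClass:indexer_apps:app:big_static_app]
# leave splunkd running when this app is deployed or updated
restartSplunkd = false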
Hello experts, I'm looking for an app or add-on for HPE model 5130 switches. I have been looking for a supported app for these switches with no success so far. Can you point me in the right direction and/or share any details related to these devices? Any feedback would be very welcome! Thanks in advance!
Unable to pull a large number like 53726516638.77 (in the billions) using chart for the past 7 days; the dashboard only pulls data for one day. rex field example:

rex field=_raw "(?ms)^\\w+/\\w+\\s+\\w+\\s+\\w+\\s+\\w+\\.\\s+\\w+/\\w+\\s+\\w+\\s+\\w+\\s+\\w+\\.\\s+\\w+\\s+\\w+\\s+\\w+\\s+\\w+\\s+\\w+:\\s+\\d+\\.\\d+\\s+\\w+\\s+\\w+\\s+\\w+\\s+\\w+\\s+\\w+:\\s+\\d+\\.\\d+\\s+\\w+\\s+\\w+\\s+\\w+\\s+:\\s+\\.\\d+\\s+\\w+\\s+\\w+\\s+\\w+\\s+\\w+\\s+\\w+\\s+\\w+\\s+\\w+:\\s+(?P<TotalAsset>[^ ]+)" offset_field=_extracted_fields_bounds
| eval mytime=strftime(_time, "%b%d")
| chart values(Total_Assets_Price) by mytime

I want to pull the total asset price for previous dates as well, but somehow it only pulls data for one day.
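One thing that stands out in the search above, offered as a hedged observation: the rex extracts a field named TotalAsset, while the chart references Total_Assets_Price, so the chart may be reading a different (and sparser) field. A minimal variant charting the rex-extracted field per day (index/sourcetype and the regex placeholder are stand-ins for the values in the post):

index=your_index sourcetype=your_sourcetype
| rex field=_raw "<regex exactly as above>" offset_field=_extracted_fields_bounds
| eval mytime=strftime(_time, "%b%d")
| chart values(TotalAsset) AS total_asset by mytime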
I would like to add a column to a chart that is the difference of the two columns before it, in an application where I am showing the cost for each employee for the last 2 weeks. Using the search:

index=SampleData
| where _time>relative_time(now(),"-2w@w")
| convert timeformat="%m-%d-%Y" ctime(_time)
| chart sum(Cost) over Employee by _time

The above produces a chart that looks like the following:

Employee      05-08-2022   05-15-2022
Employee1     100.00       150.00
Employee2     200.00       175.00

How can I add a column at the end that shows the difference between the two weeks?
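A minimal sketch, assuming the index and Cost field from the search above, and that the goal is the last two complete weeks: bucket each event into a named week so the chart columns get stable names, then subtract.

index=SampleData earliest=-2w@w latest=@w
| eval week=if(_time < relative_time(now(), "-1w@w"), "week_1", "week_2")
| chart sum(Cost) over Employee by week
| eval Difference = week_2 - week_1

With stable column names (week_1, week_2) the final eval works directly; with date-named columns such as 05-08-2022 you would otherwise need foreach or transpose to address them.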
Hi, if someone could help me produce, or point me toward, a search which shows if/when a token (for API calls) was generated or deleted through either the UI or API calls, I would really appreciate it. To be clear, the search itself should be run from the UI, though the creation/deletion of tokens could be done either way. Thank you in advance.
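As a hedged starting point: splunkd records every REST call in the _internal index under sourcetype splunkd_access, and token creation/deletion goes through the /services/authorization/tokens endpoint, so a search along these lines may surface both UI- and API-driven changes (exact field names depend on the extractions for that sourcetype):

index=_internal sourcetype=splunkd_access uri="*authorization/tokens*" (method=POST OR method=DELETE)
| table _time user method uri status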
Good afternoon, we are attempting to make Splunk fit our compliance needs. The auditors want us to check for certain things on the network (user locked out, user added to security group, etc.) and verify each day that we checked. We were doing this with Alert Logic previously. Basically, Alert Logic had an internal "cases" interface where each search would put a "case" in the list to be reviewed. If it found something, one employee notes the reason after investigation and another employee closes it. The auditors want "dual control" to prevent one admin from falsifying things, I guess.

The part where it gets tricky is when a search finds nothing. The auditors would like us to confirm that we checked even those "no findings" reports. Alert Logic did this out of the box (before they started changing their product into something wholly unrecognizable to us), and Splunk seemed to do it, but I'm finding it's tougher than I first thought. The "cases" interface could be had via the Alert Manager app or the InfoSec app, neither of which is functioning in my cloud trial. I've resorted to an e-mail to a free Jira Cloud instance to get these cases.

Accepting that, I need to figure out how to get an alert to trigger both when no items are found and when items are found; the trigger options force me to choose. Any help is appreciated. I've been working with Splunk support on this; they think some of the apps not working is due to the trial, but they can't seem to get the alert triggering going either. I'm sure there is a phrase I can stick in "custom" that'll work. I just don't know what. Thank you in advance.
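A common sketch for the always-trigger piece: collapse each compliance check to a single stats row, which exists even when the event count is zero, and let a plain "Number of Results > 0" trigger fire every run (the base search below is a placeholder using the account-lockout example; EventCode 4740 is the Windows lockout event):

index=wineventlog EventCode=4740
| stats count AS lockouts
| eval finding_status=if(lockouts=0, "No findings - reviewed", "Findings - investigate")

Since stats count always returns exactly one row, the alert fires on both outcomes, and finding_status can drive the downstream case/ticket text.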
Hi, I'm trying to capture 2 fields within the same regular expression from LON_RTI2_SND.TRACE: the first part of the word (LON) and the remainder (RTI2_SND.TRACE). Thanks
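A minimal rex sketch, assuming the value lives in a field called source_string (rename to suit): split on the first underscore. The makeresults lines just set up a self-contained demo.

| makeresults
| eval source_string="LON_RTI2_SND.TRACE"
| rex field=source_string "^(?<site>[^_]+)_(?<remainder>.+)$"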
Hey, I want a regex to extract fields from this event:

10.66.189.62 -- -- -[17/May/2022:05:59:16--0400]--502- "POST /astra/sliceHTTP/1.1" req_len=1776-req_cont_len=117-req_cont_enc="-"-res_body_len=341 res_len=733 "https://ninepoint.blackrock.com/astra/ ". "Mozilla/5.0- (Macintosh; Intel-Mac-OS-X-10_15_7) -AppleWebKit/537.36-(KHTML,-Like-Gecko) Chrome/10.0.4896.127 Safari/537.36" x_fw_for="-".req_time=278.326-ups_res_time=278.326 ups_con_time=0.011-ups_status=502-pipe=. -VNDRegID=undefined- as; ninepoint

Could you please help me with my query? Thank you.
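A hedged first cut, based only on the sample event above (the stray dashes look like mangled spaces, so the pattern stays permissive; all field names are my own inventions):

| rex field=_raw "^(?<clientip>\d{1,3}(?:\.\d{1,3}){3}).*?\[(?<req_timestamp>[^\]]+)\][- ]*(?<status>\d{3}).*?\"(?<method>\w+)\s*(?<uri>[^\s\"]+)"

From there, the key=value pairs (req_len, ups_status, and so on) might be reachable with additional rex calls once the delimiters in the real data are confirmed.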
I have the following _raw field in my index:

Response Headers: {&#x27;Date&#x27;: &#x27;Fri, 13 May 2022 02:59:34 GMT&#x27;, &#x27;Content-Type&#x27;: &#x27;application/json; charset=utf-8&#x27;}

So I realized &#x27; = '. But I have found no way to convert that string into a human-readable string, like this:

Response Headers: {'Date': 'Fri, 13 May 2022 02:59:34 GMT', 'Content-Type': 'application/json; charset=utf-8'}

I tried something like this, without success:

| eval myfield = replace(tostring(_raw),"x27","'")

Then I checked whether the string contains "x27", and it turns out it is not being detected:

| eval exists=if(like(tostring(_raw), "%x27%"), "YES", "NO")

Is there a way to convert that weird string into a human-readable string?
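A sketch worth trying: match the full entity &#x27; rather than the bare x27. If even that finds nothing, the literal characters may exist only in the rendered view and not in the indexed _raw, which would explain why like() sees no match.

| eval decoded=replace(_raw, "&#x27;", "'")
| eval exists=if(like(_raw, "%&#x27;%"), "YES", "NO")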
How do we renew SAML authentication credentials in Splunk?
I have a new UF installation and wish to register it with an existing deployment server. When I run the command:

$SPLUNK_HOME/bin/splunk set deploy-poll <FQDN of DS>:<management port>

I am prompted to log in, and I provide the credentials that I use to log into the DS web interface. It gives me "Login failed" though. How can I diagnose this further?
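One detail that may explain it, sketched below: splunk set deploy-poll writes local config, so the login prompt is for the UF's own splunkd admin account (created when the UF was installed), not for the deployment server's credentials. Passing them inline looks like this (hostname and port are placeholders):

$SPLUNK_HOME/bin/splunk set deploy-poll ds.example.com:8089 -auth admin:<local-UF-admin-password>
$SPLUNK_HOME/bin/splunk restart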
Hello there, the deal is that I have 2 forwarders that have exactly the same logs (I'm using 2 forwarders so as not to have a SPOF), and I want to find a solution to avoid duplicated logs. I thought of using a load balancer, but first I just want to know if there is some config in Splunk that allows this, please. Best regards, Abir
Hello Splunkers, @SPL. I was working on some development activity and got stuck at one point. We have a scenario where I need to check which users, on a single day, have done transactions with more than 3 different vendors.

Date         User ID    Vendor          Transactions
10/5/2021    user 1     SAAS (User1)    $$$$$
10/5/2021    user 2     PAAS (User1)    $$$$$
10/7/2021    user 3     IAAS            $$$$$
10/8/2021    user 4     AAA             $$$$$
10/9/2021    user 5     CCCC            $$$$$
10/10/2021   user 6     FFFF            $$$$$
10/5/2021    user 7     XXXX (User1)    $$$$$
10/6/2021    user 8     ZZZZ            $$$$$
10/8/2021    user 9     EEE             $$$$$
10/9/2021    user 10    QQQQ            $$$$$
10/10/2021   user 11    SSSS            $$$$$
10/11/2021   user 12    PPPP            $$$$$
10/12/2021   user 13    WWW             $$$$$
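A minimal sketch, with placeholder index and field names mirroring the table columns:

index=transactions
| bin _time span=1d
| stats dc(Vendor) AS distinct_vendors values(Vendor) AS vendors by _time user_id
| where distinct_vendors > 3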
In our Splunk environment, we currently ingest Azure AD logs and we have three different sourcetypes:

azure:aad:signin
azure:aad:audit
azure:aad:user

There are no missing events and the ingested data is very rich. However, I don't see any way within the Splunk-ingested Azure sign-in data to filter by authentication method (single-factor vs. multi-factor). This is something that can be done via Azure Active Directory > Monitoring > Sign-in logs, but I do not see any reference to it in my Splunk data (I do see a lot of conditional access enforcement and the other primary fields, but not any of the secondary fields that could be used for filtering in Azure).
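One hedged avenue: the Microsoft Graph sign-in record carries an authenticationRequirement property (values like singleFactorAuthentication and multiFactorAuthentication), so if the add-on version in use ingests it, something like this would split the events:

sourcetype="azure:aad:signin"
| spath authenticationRequirement
| stats count by authenticationRequirement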
Hi team, our vendor needs MIB files from our Splunk heavy forwarder (Linux) for monitoring purposes. How can we get those? Can someone please provide steps? We are using SNMP v2. Thanks in advance.
I am working on a partner integration project using Splunk Security Essentials (SSE) with my custom security content. Locally, I have the security use cases in JSON format that SSE accepts, but I want to do this integration through my private GitLab by uploading these security use cases there. However, there is a need to keep this GitLab private, so I can't just make SSE download the formatted JSON content by simply passing it the URL in the `content_download_url` setting from `essentials_update.conf`. Is there a setting in the `essentials_update.conf` file, or in some other file that I can also include an access token for my GitLab? If not, what other ways can I download content from this private GitLab page in order to integrate with SSE?  
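I can't confirm a token setting in essentials_update.conf, but assuming SSE performs a plain HTTP GET against content_download_url, one workaround sketch is GitLab's raw-file API, which accepts a private_token query parameter (the project ID, file path, branch, and token below are all placeholders):

content_download_url = https://gitlab.example.com/api/v4/projects/12345/repository/files/use_cases%2Fcontent.json/raw?ref=main&private_token=<YOUR_TOKEN>

Embedding a token in a .conf file has obvious secret-handling tradeoffs, so a project access token with read-only scope would limit the blast radius.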
Hey Splunkers, I am not sure if this is possible, but what I am trying to do is pass the values of a search into an eval command, to basically form a statement or an event. For example, say the search below returns multiple users' first name, last name, and country details. With those field values, I am trying to create an eval statement like this:

index=foo source=user_detail
| table first_name last_name country
| eval statement = My name is "$first_name $ $last_name$ and i come from $country$
| table statement

But this is not passing those field values to the eval statement, so does anyone know if there is a way to do this? Thanks.
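For what it's worth, a working sketch: eval builds strings with the dot concatenation operator (the $...$ syntax is dashboard-token substitution, not eval syntax), so the statement can be assembled like this:

index=foo source=user_detail
| eval statement="My name is ".first_name." ".last_name." and I come from ".country
| table statement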
Good morning, I am working on connecting my Splunk Cloud (trial at the moment, purchase coming soon) to Jira Cloud Free, and I'm able to retrieve all Jira data in Splunk (projects, issue types, autofill when setting alerts), but the alerts that are supposed to create tickets are still pending. I can see Splunk accessing the API on the Jira side, and none of the troubleshooting steps here helped. I tried the other Jira Splunk add-on and that one wouldn't even function, so I got further with this one, but it's still just short of working. Any ideas? Is it something simple I'm missing? Thank you!

As an aside: are any of you aware of any other Splunk-to-free-ticketing add-ons out there? I can't get Alert Manager to create incidents on Splunk Cloud (just on the trial? not sure). I'm trying to get incidents created from alerts, and nothing seems to fit the bill. Thank you!