All Posts


This is awesome and just solved my custom panel width issue after banging my head for the last hour. Do you know where this attribute is documented, either for its behavior or for which CSS items are supported per version of Splunk?
Hello, I am building a dashboard in Splunk Enterprise. I included a map with the Choropleth layer type and that worked for me, but I have a table that runs a query based on the region clicked on the map, and that part does not work in Splunk Dashboard Studio. I have already defined the token on the map and adjusted the token in the table's query, but it seems the clicked area is not captured. I did the same process in Splunk Classic and it worked as expected.

Below is the source code of the map:

    {
      "dataSources": { "primary": "ds_4lhwtNWq" },
      "eventHandlers": [
        {
          "type": "drilldown.setToken",
          "options": {
            "tokens": [
              { "key": "row.UF.value", "token": "clicked_uf" }
            ]
          }
        }
      ],
      "options": {
        "backgroundColor": "#294e70",
        "center": [ -13.79021870397439, -52.07072204233867 ],
        "layers": [
          {
            "additionalTooltipFields": [ "Quantidade de erros" ],
            "areaIds": "> primary | seriesByName('UF')",
            "areaValues": "> primary | seriesByName('Quantidade de erros')",
            "bubbleSize": "> primary | frameBySeriesNames('Quantidade de erros')",
            "choroplethOpacity": 0.5,
            "choroplethStrokeColor": "transparent",
            "latitude": "> primary | seriesByName('LATITUDE')",
            "longitude": "> primary | seriesByName('LONGITUDE')",
            "resultLimit": 50000,
            "type": "choropleth"
          }
        ],
        "scaleUnit": "imperial",
        "zoom": 5.38493379665208
      },
      "title": "mapa",
      "type": "splunk.map",
      "context": {},
      "containerOptions": {},
      "showProgressBar": false,
      "showLastUpdated": false
    }

Below is the SPL query of the table:

    index=<index> coderropc="0332"
    | eval PC = replace(codpcredeprop, "^0+", "")
    | stats count as "Erros por PC" by PC
    | join type=left PC
        [| inputlookup PcFabricante.csv
         | eval CODPC=replace(CODPC, "^0+", "")
         | rename CODPC as PC
         | fields PC NOMEFABR MODELO]
    | join type=left PC
        [| search index=ars source=GO earliest=-30d@d latest=now
         | eval CODPC=replace(CODPC, "^0+", "")
         | rename CODPC as PC
         | fields PC UF]
    | search UF="$token_mapa$"
    | table PC, NOMEFABR, MODELO, UF, "Erros por PC"

Is there any configuration that differs between Splunk Classic and Splunk Dashboard Studio? When I add a default value for the token on the map, the table receives the value, but it does not register the clicks.
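One thing stands out when reading the two snippets together: the map's drilldown handler sets a token named clicked_uf, while the table's query reads $token_mapa$. In Dashboard Studio the token name set by the handler must match the one referenced in the query. A minimal sketch of the handler aligned to the query above (token_mapa here is taken from the SPL, not from the original map JSON, so this is an assumption about the intended name):

    "eventHandlers": [
      {
        "type": "drilldown.setToken",
        "options": {
          "tokens": [
            { "key": "row.UF.value", "token": "token_mapa" }
          ]
        }
      }
    ]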
Thank you, that helped!
Hi @RowdyRodney

How are you doing this extraction? Is it a search-time extraction in Splunk Enterprise/Cloud? These use PCRE-based regex, whereas you have provided a Python-style named capturing group (?P...).

Please can you update this to a PCRE-based regex and see if this resolves the issue?

    "FileName":\s".+\.(?<Domain>.[a-zA-Z0-9]*)

Can I also check: is the intention that it matches the file extension (docx) in your sample data?

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
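A quick way to test the PCRE form against the sample event from the question is a throwaway search with makeresults (a sketch; the Domain field name comes from the original extraction, and the stray leading dot inside the original capture group is dropped here):

    | makeresults
    | eval _raw="\"FileName\": \"John Test File.docx\""
    | rex "\"FileName\":\s\".+\.(?<Domain>[a-zA-Z0-9]+)\""
    | table Domain

If Domain comes back as docx here but stays empty in your real search, the problem is likely in how the extraction is configured rather than in the regex itself.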
Hi @Esky73

The app uses the HttpNtlmAuth/requests-ntlm library, which, as you've found, does require the username in 'domain\\username' format. There doesn't appear to be a way around this. It should be possible to authenticate using domain\\username, but the domain isn't always the first part after the @ symbol in the full domain; e.g. it could be "mydomain", "mydomain.ad", or something completely different. Are you able to check with your AD team to see what this value should be?

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
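As a quick check outside the TA, you could try the same curl call with the short domain form (a sketch; MYDOMAIN is a placeholder for whatever NetBIOS-style name your AD team confirms):

    # Test NTLM auth with the short domain name rather than the full FQDN
    curl -v http://mywebsite.com --ntlm -u 'MYDOMAIN\username'

If this returns 200 where the FQDN form did not, the same domain\username value should work in the TA.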
Hello - I created a field extraction to look for a file extension. The raw log looks like this:

    "FileName": "John Test File.docx"

The regex I used was:

    "FileName":\s".+\.(?P<Domain>.[a-zA-Z0-9]*)

This tests out in any regex tester I use. When I first created this, I ran a search query and some of the fields populated, but some were blank. I then checked which records weren't being extracted correctly and found the regex matched the raw log pattern, so I was unsure why it wouldn't have extracted. However, ~30 minutes after creating this field extraction, it stopped extracting anything. In the state I'm in now, I can see that each raw log record matches my extraction regex, but the fields are still empty and nothing is being extracted. Why would that be? Each raw log matches the regex in the extraction...
Hi @Jayanthan

There are a number of approaches you could take to do this, such as Edge Processor, Ingest Actions, props/transforms, or segregating at source. What tools/apps/processes are you currently using to bring the data into Splunk? The most efficient way to reduce the amount of data ingested into Splunk is to omit it at source (i.e. not send/pull it)! Please let us know and we can hopefully drill further into the options for you. A sketch of the props/transforms approach follows below.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
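For reference, the props/transforms route drops unwanted events before indexing by sending them to nullQueue. A minimal sketch (the sourcetype name and the regex are placeholders for your data):

    # props.conf
    [my:app:sourcetype]
    TRANSFORMS-drop_noise = drop_unrelated_events

    # transforms.conf
    [drop_unrelated_events]
    REGEX = (?i)unrelated_app_name
    DEST_KEY = queue
    FORMAT = nullQueue

Events dropped this way never reach an index, so they do not count against license usage.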
What is the correct format for domain users, please? If I curl from a HF, I get the desired 200 response using:

    curl -v http://mywebsite.com --ntlm -u username@mydomain.ad.ltd.com.au

If I use this format in the TA, I see an error message in the logs asking for the domain\\username format. I have tried several variations of mydomain\\username but have not been successful. What should the format be for this domain? Or is the issue with --ntlm? If we use the --negotiate flag or remove --ntlm, we get a 401. Cheers
Thank you @livehybrid. But I am using Splunk Enterprise. Is there any way to add and filter logs from applications hosted in the cloud in Splunk Enterprise?
Hi everyone, we are using Splunk Enterprise in our company. We want to ingest logs from applications hosted on the cloud, but when we try to connect we get a lot of logs that are unrelated to our application, which in turn causes high license utilization. Is there any method to filter for only the logs we want (such as logs of a specific application or log source) before ingesting into Splunk, so as to reduce license utilization while still getting the required security logs for the application?
Hi @Jayanthan

Data Manager is only available in Splunk Cloud. Please see https://help.splunk.com/en/splunk-cloud-platform/ingest-data-from-cloud-services/data-manager-user-manual/1.11/introduction/about-data-manager for more information. There is a useful page at https://lantern.splunk.com/Splunk_Success_Framework/Data_Management/GDI_-_Getting_data_in which links out to various methods to onboard different data sources.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
Hi everyone, I want to ingest logs from applications hosted in the cloud (such as AWS, Azure). In our company we are using Splunk Enterprise. Can Data Manager be used to ingest and filter out only the logs pertaining to that application's security in Splunk Enterprise?
Yes, I'm using the UCC framework to build the app. I'm using a custom validator class to validate the configuration field.

This is the data value sent to the validate function. As I debug the code, only the field data is set on the data argument during the validate method invocation. When I look at the account validator Python file, name is set to None.

Basically, my need is: during configuration validation, I want to know the account name. How can I fetch this account name?
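For context, a UCC custom validator typically receives the submitted form fields in its data argument. A minimal sketch that logs the full payload to see exactly which keys are present, assuming the splunktaucclib Validator base class (the class and logger names here are hypothetical):

    import logging

    from splunktaucclib.rest_handler.endpoint.validator import Validator

    logger = logging.getLogger("my_ta_validator")

    class PayloadLoggingValidator(Validator):
        """Logs whatever the framework passes in, to inspect the available keys."""

        def validate(self, value, data):
            # 'value' is the field being validated; 'data' is the submitted form payload
            logger.info("validate called with value=%r data=%r", value, data)
            return True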
Hi @Vasavi29

Please can you share a screenshot of where/how you are seeing this? Can you confirm the timezone you have set in the user preferences?

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
Hi @PoojaDevi

Are you using the UCC Framework to build this app? Can you provide a little more of your code? Have you tried adding a log method to print out what is passed to the function? Can you share what is sent?

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
Hi @kfsplunk

What distinction do you need to make between the logs? You mention that they become hard to differentiate, but I think you could probably create an eventtype or use a field extraction to determine whether the FTD code is in the 43k range, like you mentioned.

I would avoid onboarding it as one sourcetype and then using props/transforms to overwrite the sourcetype, because you risk breaking the built-in field extractions and CIM mappings you get from the app's configuration. However, if you want to segregate into a separate index, or change the source to tell them apart, then you could do this with props/transforms.

The Cisco Security Cloud app does look a lot richer in terms of functionality and dashboards (if that helps you) and also gets much more frequent updates than the ASA app; not that this should necessarily sway your decision, but it might help!

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
Managed to get this resolved by ensuring the submitted token model was updated by adding submittedTokenModel.set() and submittedTokenModel.trigger() to the code. The title displaying the token value was a bit of a red herring. It showed that the default model was being updated, but it didn't reflect the state of the submitted token model.
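For anyone hitting the same thing, the fix looked roughly like this in a SimpleXML JS extension (a sketch; the token name and value are placeholders):

    require(['splunkjs/mvc', 'splunkjs/mvc/simplexml/ready!'], function (mvc) {
        var defaultTokens = mvc.Components.get('default');
        var submittedTokens = mvc.Components.get('submitted');

        // Update both models so panels driven by submitted tokens react
        defaultTokens.set('my_token', 'some_value');
        submittedTokens.set('my_token', 'some_value');
        // Nudge listeners that only watch the submitted model
        submittedTokens.trigger('change:my_token');
    });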
Alright, thank you for your answer!
Onboarding Cisco FTD firewalls presents the choice of which Add-On to use. Apparently Cisco FTD firewalls run both the ASA core and the FTD core, which means they send different types of events. The ASA events are best handled with the cisco:asa sourcetype, whereas the FTD events are handled by cisco:ftd:syslog. However, all events in our environment use %FTD to tag their events, which makes them harder to differentiate.

Which Add-On is preferred (I'd expect Cisco Security Cloud, but it still has some flaws)? And how should we get these events in with the correct sourcetype? My suggestion would be to send all events with the cisco:asa sourcetype and include a transform which checks whether the FTD code is in the 43k range, e.g. REGEX=%FTD-\d-43\d+.
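For illustration, the transform described here could look something like the sketch below. Following the advice in the reply above, it routes matching events to a separate index rather than rewriting the sourcetype; the index name ftd_events is a placeholder:

    # props.conf
    [cisco:asa]
    TRANSFORMS-route_ftd = route_ftd_43k

    # transforms.conf
    [route_ftd_43k]
    REGEX = %FTD-\d-43\d+
    DEST_KEY = _MetaData:Index
    FORMAT = ftd_events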
python.version = python3.10 is not a valid setting.  Allowed values are default, python3, python3.7, or python3.9.  I don't know how you were able to add Python 3.10 to Splunk, but doing so does not change the validation of python.version settings.  I strongly recommend using only the versions of Python that ship with Splunk (3.7 or 3.9).
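For reference, a valid stanza pins one of the shipped runtimes, e.g. for a scripted input (the script path here is hypothetical):

    # inputs.conf
    [script://./bin/my_input.py]
    python.version = python3.9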