All Posts


Hi @randoj! We just created a lookup definition manually in a local/transforms.conf, as you would for any other KV Store lookup. We also needed to do the same for the mc_incidents collection, since it is needed to correlate notable_ids and incident_ids, the latter of which are used in mc_notes. It is probably easier to access the collections using the Python SDK and scripts, but this solution worked for us and required less setup. Hope this helps!
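For reference, a KV Store lookup definition in local/transforms.conf generally looks like the sketch below. The stanza name and fields_list are illustrative assumptions, not the poster's actual config; the real field names depend on the schema of the mc_incidents collection in your environment.

[mc_incidents_lookup]
external_type = kvstore
collection = mc_incidents
fields_list = _key, incident_id, notable_id, status

Once defined, the collection can be queried like any other lookup, e.g. with | inputlookup mc_incidents_lookup.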
Can someone please guide me on this.  
Hi @malisushil119  To ensure we can answer thoroughly, could you please confirm a few things. Are you sending these logs to your own indexers *and* a 3rd party indexer(s), or just to the 3rd party? You say you can see the data on your SH; when you search it, please check the splunk_server field in the interesting fields on the left: is the server(s) listed there your indexers, or the SH? How have you configured the connectivity to the 3rd party? Please could you also check your _internal logs for any TcpOutputFd errors (assuming standard Splunk-to-Splunk forwarding). Did this answer help you? If so, please consider adding karma to show it was useful, marking it as the solution if it resolved your issue, or commenting if you need any clarification. Your feedback encourages the volunteers in this community to continue contributing.
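For context, forwarding from an indexer to a third-party destination is typically configured in outputs.conf, along the lines of the sketch below. The group names, host names, and ports are placeholders, and whether you send raw or cooked data depends on what the 3rd party expects, so treat this as a rough outline rather than a drop-in config.

[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
server = idx1.example.com:9997

[tcpout:third_party]
server = thirdparty.example.com:514
sendCookedData = false

If only selected data (e.g. the Windows logs) should go to the 3rd party, that usually also needs routing via a props.conf/transforms.conf rule that sets _TCP_ROUTING on the forwarding tier.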
We have installed Splunk on Windows and want to send Windows logs from the Search Head, LM, and CM to a 3rd party via an indexer. The logs can be seen in Search Head queries, but the indexer is not forwarding them to the 3rd party.
Hi @malisushil119, don't attach a new question to another post, even if it's on the same topic; open a new one instead and you'll receive a faster and probably better answer. Ciao. Giuseppe
To ensure Splunk fully reindexes a file whenever the datestamp changes, consider using initCrcLength and crcSalt in your inputs.conf. The default CHECK_METHOD = modtime may not detect content changes if the file is overwritten with similar data. Including a unique timestamp in the file or path can also help.        
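As a rough illustration (the monitor path and values below are placeholders, not recommendations for this environment), the relevant inputs.conf settings look like this:

[monitor:///opt/data/daily_report_*.csv]
initCrcLength = 1024
crcSalt = <SOURCE>

initCrcLength extends how many bytes Splunk uses for the initial CRC check, and crcSalt = <SOURCE> salts the CRC with the file path so a file that reappears under a new name or date is treated as new. Be aware that this can also cause duplicate indexing if identical content shows up under a different path.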
Hi All, we have a requirement where a user needs a dashboard emailed to them periodically. The dashboard is built in Dashboard Studio, so the Export option is available. I configured the export option and sent a mail, but the PDF output shows no data on the individual panels; it is generated while the panels are still searching for their results. The dashboard has a time picker in it, and no matter which value I set (last 4 hours to last 30 days) the result is the same. Has anybody faced a similar issue, and is there any workaround? Please help.
I am facing the same issue.
When we delete a row in a CSV lookup file, it gets deleted for the moment, but on saving, that row re-appears. This looks like a bug in the latest version 4.0.5; it works perfectly fine in version 4.0.4. We are upgrading to 4.0.5 because of vulnerabilities in 4.0.4. Has anyone noticed this issue?
@tah7004  To use an ingest-time lookup, the field you want to use as the lookup input must be an indexed field. You can apply it by configuring the configuration files as follows.

1. $SPLUNK_HOME/etc/apps/myapp/lookups/test.csv

field1,field2,field3
value1,value2,value3

2. $SPLUNK_HOME/etc/apps/myapp/local/props.conf

[test_ingest_lookup]
DATETIME_CONFIG =
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
category = Custom
pulldown_type = true
TRANSFORMS-ingest_time_lookup = regex_extract_av_pairs, lookup_extract

3. $SPLUNK_HOME/etc/apps/myapp/local/transforms.conf

[regex_extract_av_pairs]
SOURCE_KEY = _raw
REGEX = \s([a-zA-Z][a-zA-Z0-9-]+)=([^\s"',]+)
REPEAT_MATCH = true
FORMAT = $1::"$2"
WRITE_META = true

[lookup_extract]
INGEST_EVAL = field3=json_extract(lookup("test.csv", json_object("field1", new_field, "field2", field2), json_array("field3")),"field3")

You can also refer to another solution using INDEXED_EXTRACTIONS=json in the link below.
- How to filter data at Splunk ingest time (list matching): https://qiita.com/chobiyu/items/aec5ef3a75a8bab96546
Splunk, as software running on top of the OS, doesn't have any privilege to choose between swap and real memory; that is decided purely by the OS. There used to be many swap issues in Linux which could be better addressed or explained by the vendor's support. Frequent swap access could impact Splunk performance negatively, so you may want to control 'swappiness' with the help of your OS admin. https://www.techtarget.com/searchdatacenter/definition/Linux-swappiness FYI.
Hi, I recently created a Dashboard Studio dashboard. While creating it, the dashboard title and widget titles are in one font, but once I finished the dashboard and shared it publicly, the page opened via the public URL shows a different font. The first screenshot shows the normal font of the dashboard as I created it; the second shows the font when the dashboard is opened via the shared URL. Please help on how to restore the original font.

[Screenshot: normal dashboard font format]
[Screenshot: dashboard opened via shared URL]
Hi, after updating to version 8.x, do I need to create new indexes? Please advise. Is there any documentation for this? @inveinvestigation #index
@sreeranjan wrote: We are currently working with the Splunk Enterprise product. The client has informed us that we will be transitioning to Splunk Cloud. From what I understand, Splunk Cloud refers to the Splunk Cloud Platform, where the entire infrastructure is hosted and managed by Splunk on AWS. Even though it runs on AWS, it's still referred to as Splunk Cloud—not AWS Cloud—since the architecture and services are maintained by Splunk. Is that correct? Yes, it's exactly this way. Usually when we talk about Splunk Cloud, it means just the Splunk core platform running in the cloud; that cloud can be on AWS, Azure, or GCP. Then there are the Classic and Victoria experiences on top of it. From the user's point of view, this determines which options you have, e.g. for deploying apps; you can see those in the Splunk Cloud Platform descriptions on docs.splunk.com. With SCP you could also expand your environment with Edge Processor or Ingest Processor, which help with data ingestion configurations.
We are currently working with the Splunk Enterprise product. The client has informed us that we will be transitioning to Splunk Cloud. From what I understand, Splunk Cloud refers to the Splunk Cloud Platform, where the entire infrastructure is hosted and managed by Splunk on AWS. Even though it runs on AWS, it's still referred to as Splunk Cloud—not AWS Cloud—since the architecture and services are maintained by Splunk. Is that correct?
That's great @Numb78, glad you were able to get it sorted. Did this answer help you? If so, please consider adding karma to show it was useful, marking it as the solution if it resolved your issue, or commenting if you need any clarification. Your feedback encourages the volunteers in this community to continue contributing.
@smahoney To set a refresh, you must specify the refresh option on your search, e.g. adding "refresh": "3s" to the underlying search's options to refresh every 3 seconds. Refresh is a setting that belongs to data sources, not visualizations; it can be set on individual data sources or as a default for all data sources. https://docs.splunk.com/Documentation/Splunk/9.4.2/DashStudio/Default#Set_defaults_by_data_source_or_visualization_type If this helps, please upvote!
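As a minimal sketch of where this goes in the Dashboard Studio source JSON, assuming a ds.search data source (the data source name and query below are placeholders, and "refreshType" is shown only as an example of a related option):

"dataSources": {
    "ds_example": {
        "type": "ds.search",
        "options": {
            "query": "index=_internal | stats count by sourcetype",
            "refresh": "3s",
            "refreshType": "delay"
        }
    }
}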
That works, but I can't set refresh per panel from what I see in the documentation.
Hi @Sarvesh_Fenix  The 429 error you're seeing might be due to Graph throttling; Microsoft limits users to approximately 15 queries per 5-second window. Try increasing your polling interval in the Azure add-on configuration, or split your subscription monitoring into separate inputs. The Resource Graph approach you're currently using will continue to hit these limits. Microsoft's documentation (https://docs.microsoft.com/en-us/azure/governance/resource-graph/concepts/guidance-for-throttled-requests) recommends implementing pagination, staggering requests, and proper retry logic. Check your Azure application permissions as well - you'll need AuditLog.Read.All and Directory.Read.All for SignIn logs. If this helps, please upvote.
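For illustration only, a generic retry-with-backoff pattern for 429 responses looks roughly like the sketch below. The URL and headers are placeholders and this is not the add-on's actual code, just the shape of the retry logic Microsoft's guidance describes.

import time
import requests

def get_with_retry(url, headers, max_retries=5):
    # Retry GET requests that come back 429, honouring Retry-After when present.
    for attempt in range(max_retries):
        resp = requests.get(url, headers=headers, timeout=30)
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp.json()
        # Fall back to exponential backoff if the Retry-After header is missing.
        wait = int(resp.headers.get("Retry-After", 2 ** attempt))
        time.sleep(wait)
    raise RuntimeError(f"Still throttled after {max_retries} retries")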