All Posts


Hi @VatsalJagani Yes, I raised a case with Splunk Support and they confirmed they do not have such a capability in place; I advised them to add it to their future enhancements list. I hope this will be considered. Appreciate your response.
Hi @Sankar, Correlation Searches in ES write triggered alerts to the notable index. You can search this index and build a statistic by search_name:
index=notable | stats count BY search_name
Ciao. Giuseppe
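As a sketch of the same idea (the notable index name is the ES default; the time range and sort are just one way to present it), you can count triggered notables per correlation search over a recent window:

```
index=notable earliest=-30d
| stats count AS triggered_alerts BY search_name
| sort - triggered_alerts
```

Correlation searches that never triggered in the window will simply be absent from this result; comparing against the full list of enabled correlation searches requires a separate search over the content management data.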
Hello team, I've made a script which uses the sudo command. I've deployed it on my forwarders and I get this error: message from "/opt/splunkforwarder/etc/apps/app/bin/script.sh" sudo: effective uid is not 0, is /usr/bin/sudo on a file system with the 'nosuid' option set or an NFS file system without root privileges? Please help me fix this issue.
Hi @varsh_6_8_6, in this case, please try:
index="xyz" host="*" "total payment count :"
| eval messagevalue=mvindex(split(messagevalue,":"),1)
| appendpipe [ stats count | eval messagevalue="No File Found" | where count==0 | fields - count ]
Ciao. Giuseppe
Hi! I recently wanted to test sending traces using the SignalFx splunk-otel-collector. In general everything works as expected; however, when sending spans containing links to other spans, these links don't show up in the waterfall UI, even though they should be working according to the documentation. When downloading the trace data, span links are not mentioned at all. The (debug) logs of the splunk-otel-collector don't seem to mention any errors or abnormalities either. The following example shows my test span. It should be linking to two other spans, but it doesn't show up as such. Additionally, I tested using Jaeger All-in-One, and there the span links show up properly. I am thankful for any hints you can provide that might help me debug this problem.
Are those tables individual sourcetypes on an index or results of your SPL queries? If the latter, can you share them so we can modify them to create your requested result?
They are actually results coming from different event types. Each event contains different fields.  
We have 100+ use cases onboarded into Splunk ES. We are receiving alerts for a few of them, but I want to know the exact count: how many use cases are onboarded into Splunk, and of those, how many have triggered alerts? Any guidance is much appreciated.
OK. It seems that the Field Extractor only creates inline extractions. If you want to create transform-based extractions, you need to do them from the Settings menu: Settings -> Fields -> Field transformations - there you can create a new transform with the option to check "Create multivalued fields". Then you can use the transform created here to create an extraction in Settings -> Fields -> Field extractions.
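For reference, the UI steps above end up writing configuration along these lines (a sketch only; the stanza names, sourcetype, field name, and regex are hypothetical examples, not taken from this thread):

```
# transforms.conf -- a search-time transform; MV_ADD keeps every
# match as an additional value instead of only the first one
[extract_tags_mv]
REGEX = tag=(\w+)
FORMAT = tag::$1
MV_ADD = true

# props.conf -- attach the transform to a sourcetype
[my:sourcetype]
REPORT-tags = extract_tags_mv
```

The "Create multivalued fields" checkbox corresponds to the MV_ADD setting.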
@VatsalJagani Thanks for your response. In parallel to this blog post, I had already created the ticket and had some really good conversations with Splunk Support. They confirmed that this is indeed a bug. It will be fixed in the upcoming 9.3.x and 9.4.x versions. Appreciate your reply.
What do you mean by "table"? There are several different possible approaches depending on where those "tables" come from.
Gotcha. I'll admit I hope you're mistaken and the Field Extractor can properly extract multivalue fields... I say this because I just use Splunk. Unfortunately, I don't have any access to the actual conf files on the server beyond what can be edited in the Web UI.
The field extractor is a feature which admittedly looks good and is a "selling feature" - you can show a potential customer that you don't have to be a master of regexes to extract fields from data. And it might be useful if you have a Splunk Free instance at home processing negligible amounts of data, where it doesn't matter how "pretty" and efficient the resulting extractions are. But of course it doesn't cover all possible use cases, like your multivalue fields or a tokenizer. I'd have to double-check, but you might be able to reach more advanced settings either by directly editing transforms in the field extractions section of the configuration menu or via the "all configurations" section.
Hi @gcusello Thank you for the inputs. I have voted for the idea, which is essential. Also, I have both numbers and strings. The one mentioned worked perfectly for the number. Is there any way to display "No files found" in case there are no recent events in a particular time range? Regards, Varsh
It's difficult to answer without a detailed review of the system, but based on your comment it seems to be an upgrade issue. I would recommend opening a case with Splunk, and they should be able to help you fix the issue.
Hi all, I have the following issue. I have a table A:
col1 col2
A    aa
B    bb
C    aa
And a table B:
colA colB
aa   FYI
bb   LOL
I need to add to table A the column colB based on the matching values from col2 (table A) and colA (table B), so it should look like:
col1 colB col2
A    FYI  aa
B    LOL  bb
C    FYI  aa
So basically map the values from col2 to colA and add colB based on the matches. Thanks for your support,
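One way to sketch this in SPL, assuming both tables are produced by searches (the base searches below are placeholders, not real queries from this thread): rename colA so the join keys share a name, then left-join on it:

```
... base search producing table A ...
| table col1 col2
| join type=left col2
    [ search ... base search producing table B ...
      | rename colA AS col2
      | table col2 colB ]
| table col1 colB col2
```

For larger result sets, writing table B out with outputlookup and enriching table A with the lookup command tends to scale better than join.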
Hi, I finally solved this issue by downloading the source code from Splunkbase and manually moving the wheel file to the path wheels/py3/. Then I tarred the source code and installed it on SOAR Cloud. The JSON file is also updated with the following configuration.
It worked for me after adding my file under this path: $SPLUNK_HOME/etc/apps/<your_app>/local Thanks @gcusello @kiran_panchavat
As others have already mentioned, those configurations must be on the first HF/indexer if there is no HF on the path from the source system (where this input is) to the indexers. If the input is on a heavy forwarder, then they must be there. If there are any intermediate HFs between the UF and the indexers, then they must be on that IHF. If the source side is a UF and there is nothing else before the indexers (or your sandbox), then they must be in your sandbox. Also ensure that those KOs are shared globally (all apps/system); otherwise they may not be valid in the ingestion phase.
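To illustrate the placement rule above (the sourcetype, stanza names, and regex here are hypothetical): an index-time rule like this only takes effect on the first "parsing" tier the data reaches - the first HF on the path, or the indexers if there is none - and deploying it anywhere later in the pipeline does nothing:

```
# props.conf -- index-time configuration; must live on the first parsing tier
[my:sourcetype]
TRANSFORMS-drop_debug = drop_debug_events

# transforms.conf -- route matching events to the null queue (discard them)
[drop_debug_events]
REGEX = level=DEBUG
DEST_KEY = queue
FORMAT = nullQueue
```

Search-time configurations (field extractions, aliases, lookups), by contrast, belong on the search head.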