All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


That does seem like it would work as far as getting the results I want, though it leaves one of my issues unsolved. The parent query "index=ind1 earliest=-1d field1=abc" returns many, many results without the inclusion of some filter on field2. My initial approach (plus your fix for it) filters those results after that broad search is done, which isn't great from a performance perspective. Perhaps I'm better off just using a join at that point, not sure. Anyway, thanks for the reply.
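One pattern that may help with the performance concern, sketched here under assumptions: if the field2 values can be produced by a subsearch (the lookup name below is hypothetical, since the thread doesn't show where those values come from), placing the subsearch inside the base search lets Splunk apply the filter at search time rather than post-filtering the broad result set:

```
index=ind1 earliest=-1d field1=abc
    [| inputlookup my_field2_values.csv | fields field2 ]
```

The subsearch expands into `( field2=... OR field2=... )` terms in the base search, so only matching events are ever retrieved. Whether this beats a join depends on how many distinct field2 values there are.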
need query to remove duplicates from count stats

Sample input:

event    email
abc      xyz@email.com
abc      xyz@email.com
abc.     test@email.com
abc.     test@email.com
xyz      xyz@email.com

Expected output:

event    count
abc      2
xyz      1

What I am getting:

event    count
abc      4
xyz      1
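One way to get the expected counts, sketched as a hedged guess rather than a confirmed fix: deduplicate on the event/email pair before counting, and normalize the trailing "." on "abc." (which looks like a variant of "abc", given the expected output groups them together):

```
... | eval event=rtrim(event, ".")
    | dedup event email
    | stats count by event
```

With the sample input this should yield abc=2 (two distinct email values after dedup) and xyz=1. If "abc." is actually a distinct event name, drop the `rtrim` step.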
Hi everyone, I updated the version of my database agent, and by default AppDynamics sets the name to "default database agent", but I need to customize the name for each one, and I couldn't find out where to set up this configuration. Can anyone help me find where to change the database agent name? Thanks.
Thanks sir, I was thinking of something complex, but you made it very simple.
Hi @Satish.Kumar Yadav, Have you been able to check out the past two replies? If one of them has answered your question, click the "Accept as Solution" button for the reply that did. If you still need help or have follow-up questions, reply back to keep the conversation going.
I get the error shown in the title when trying to upload a CSV as a lookup. I tried the solution mentioned here: https://community.splunk.com/t5/Splunk-Search/What-does-the-error-quot-File-has-no-line-endings-quot-for-a/m-p/322387 but that doesn't work. Any suggestions?
Hi @Srujana.Mora, Looking into this for you! I was having trouble downloading the file too. 
I get weekly email updates with results from weekly URA scans. After noticing that we had outdated apps, we rolled out updates for three public apps: Sankey Diagram, Scalable Vector Graphics, and Splunk Dashboard Examples.

In our testing environment URA is now content and all apps pass jQuery scans without issues. However, in our production environment the URA scan still fails all three apps. It does not specify which files, or if there is a problem on one or all instances, so I don't know what is causing the results.

I have double- and triple-checked the apps, comparing hash values for every file both on the deployment server and on all individual test and production search heads. Everything except for the "install hash" in "meta.local" is identical in both the test and production environments. The apps are identical between cluster members in the test and production environments respectively. There are no additional files present on any search head in the production environment.

Why is URA still failing these apps only in the production environment? How can I identify the reason for the scan failures, as they should all pass in both environments, being identical and all? Any and all suggestions are most welcome. All the best
Hi, Thanks for asking your question on the community. Please check out this Community Knowledge Base Article and let me know if it helps you out. https://community.appdynamics.com/t5/Knowledge-Base/How-does-AppDynamics-license-consumption-work/ta-p/34449
https://docs.splunk.com/Documentation/Splunk/9.3.1/Admin/MigrateKVstore#Migrate_the_KV_store_in_a_single-instance_deployment Seems your first install of Splunk contained MongoDB for the KV store. Very early releases of version 9 were supposed to contain an out-of-band upgrade of the storage engine from mmap to wiredTiger. The link above describes how the DB upgrade was to be handled and may still work for you. I would recommend reaching out to Splunk support, if you have a support contract with them, while you work through this.
Replied in the wrong thread, ignore!
Hi @_olivier_ , it seems to be a comma-separated file; in this case, you must also put props.conf on the UF. Ciao. Giuseppe
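For reference, a minimal props.conf sketch for structured CSV parsing on the UF (the sourcetype name below is a placeholder; the exact settings depend on the file's layout):

```
[my_csv_sourcetype]
INDEXED_EXTRACTIONS = csv
FIELD_DELIMITER = ,
```

With `INDEXED_EXTRACTIONS`, the parsing happens on the forwarder itself, which is why the props.conf must live there rather than only on the indexer.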
Upgraded my HF and Deployment servers from 9.0.4 to 9.2.2.  These are Windows servers.  Then I received KVStore/MongoDB failure messages.  I've tried the KVStore migration steps.  I've tried removing the .lock files from the mongo folder.  I am trying to get the MongoDB updated/upgraded to the latest version so the KVStore will stop complaining.  Any help to clear this up would be greatly appreciated.  
Your general thinking was right, but you have to cast your hostname to lowercase (or uppercase; it doesn't matter as long as it's consistent) _before_ you do your stats. EDIT: I didn't notice it started with tstats. Of course in this case @gcusello 's solution is the way to go.
Hi @RanjiRaje , please try this:

| tstats max(_time) as latest where index=indexname by host
| eval host=upper(host)
| stats max(latest) AS latest BY host
| convert ctime(latest)

Ciao. Giuseppe
Hi All, Can anyone please help me with this? I am framing an SPL query to get a list of hosts with their last event time. SPL query:

| tstats max(_time) as latest where index=indexname by host
| convert ctime(latest)

From this query, I am getting the list as expected, but with one bug: if I have a host both in lower case and in upper case, I get 2 different entries. E.g.:

host          latest
HOSTNAME1     09/17/2024 15:27:49
hostname1     08/30/2024 15:27:00
hostname2     09/15/2024 15:27:49
HOSTNAME2     09/13/2024 15:27:49

From here, I have to get only one entry per host along with the latest time. (For hostname1 I should get 09/17/2024 15:27:49; similarly, for hostname2 I should get 09/15/2024 15:27:49.) I tried adding the commands

| eval host=upper(host), latest=max(latest)
| dedup host

but it is not taking the max of "latest"; it just shows a single row for each host with a random value of "latest". Can you please suggest a better way to achieve this? Thanks
@ITWhisperer , this will be good if I am doing a transforming search using mvexpand, but any idea how I can achieve the same results through search-time field extractions using props.conf & transforms.conf?
https://docs.splunk.com/Documentation/SplunkCloud/9.2.2406/DashStudio/chartsSV#Single_value_2

{
    "type": "splunk.singlevalue",
    "dataSources": {
        "primary": "ds_2x8aw5k1"
    },
    "title": "Title Font",
    "description": "Description Font",
    "options": {
        "majorFontSize": 21
    },
    "context": {},
    "containerOptions": {},
    "showProgressBar": false,
    "showLastUpdated": false
}

This will help you statically set the font size for the values you are displaying. By default it is set to Auto, which sizes the text to fit the width of the data returned; the longer the returned value, the smaller the auto text.
Hi everyone! Is it possible to pass a parameter from a search to the next "action|url" step, like $result$ in the description? If not, is it possible to somehow change this behavior by modifying this next step, and if so, how? Thanks.
Thanks for sharing the useful link, but unfortunately, after adding the CA-CERT chain to the two locations below and restarting Splunk, I am still receiving the same error.

1) /opt/splunk/lib/python3.7/site-packages/certifi
2) /etc/apps/<APP_FOLDER>/lib/certify

Any further suggestions, please?