All Posts

Thank you! This pointed me in the right direction! It turned out that the issue was that the token was somehow picking up the nat_source_address field as well.    
Thank you very much for your comment and for sharing the source code! It has helped me out. I am not very well versed in XML, HTML, web design, etc., but this is bringing back some memories and I'm starting to get more accustomed to it again. Ken
Thanks @gcusello for your response. From the doc, I read that for the Splunk Classic experience it is recommended to install the TA on the IDM, whereas for Splunk Victoria it is recommended to install the TA on the search head. I like the second approach; I might as well strip the KV store logic out of the TA and place it in an app, so that whether the deployment is on-prem or cloud there shouldn't be an issue updating the KV store data, since the app is installed on the search head and that would take care of updating the KV store. Does this sound reasonable?
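As a rough sketch of that idea (the collection, lookup, and field names below are made up for illustration, not from this thread), the KV store collection and its lookup definition can live entirely in an app installed on the search head:

    # collections.conf (in the app installed on the search head)
    [my_assets_collection]
    field.host = string
    field.owner = string

    # transforms.conf (same app) - exposes the collection as a lookup
    [my_assets_lookup]
    external_type = kvstore
    collection = my_assets_collection
    fields_list = _key, host, owner

Searches on the search head can then read and update the collection with inputlookup/outputlookup, so the same app should work on-prem and in Splunk Cloud.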
Your data that's already ingested needs to be made CIM compliant. It might be worth spending some time getting your head around the CIM concepts; after that you can look at developing correlation rules. https://lantern.splunk.com/Splunk_Platform/Product_Tips/Data_Management/Complying_with_the_Splunk_Common_Information_model
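For illustration only (the fields here come from the CIM Authentication data model; which data model applies depends on your data), CIM-compliant data lets you run accelerated data model searches such as:

    | tstats count from datamodel=Authentication where Authentication.action="failure" by Authentication.src, Authentication.user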
+1! « $Request failed with the following error code: 400 » Tx!
This did not work. Thank you.
Hi, basically I had an HTML button in a panel, next to a text box. When clicked, the button was supposed to run a search which added the text entered in the panel into a lookup. The problem is, the only way to get the button to do this was to use the full URL of the search, opening in a new tab. When I tried the JavaScript approach, I was getting messages about running potentially unsafe scripts, but the original method worked so I stuck with that. I just want to know if it's possible to use an HTML button to run a search without opening a new tab. I have tried various ways but haven't had any success. Thanks
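One possible pattern (a rough sketch, not the original poster's setup; the element id, token name, and lookup file are assumptions) is to wire the button to a SearchManager in the dashboard's JavaScript so the search runs in the background without opening a new tab:

    // dashboard.js, referenced from the Simple XML via <form script="dashboard.js">
    require([
        'jquery',
        'splunkjs/mvc',
        'splunkjs/mvc/searchmanager',
        'splunkjs/mvc/simplexml/ready!'
    ], function ($, mvc, SearchManager) {
        var tokens = mvc.Components.get('default');

        $('#add_button').on('click', function () {
            // Token assumed to be set by the text input in the panel
            var value = tokens.get('user_text') || '';
            // Background search that appends the value to a (hypothetical) lookup
            new SearchManager({
                id: 'append_search_' + Date.now(),
                search: '| makeresults | eval value="' + value + '" | table value '
                      + '| outputlookup append=true my_lookup.csv',
                autostart: true
            });
        });
    });

Since the search is never bound to a visualization, nothing is displayed and nothing opens in a new tab.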
Holy Cow! Per that document, I tried enabling: mc_auto_config = enabled ...and it removed all my indexers from the cluster. Good times. I think I'll just learn to live without those volume dashboards, wouldn't be the first time I had to ignore missing functionality. Pro tip:  don't bother taking the advanced troubleshooting class at Splunk.conf -- didn't prepare me for anything useful...
Hi there, I am trying to get some data from MS Defender into a Splunk query. My original KQL query in Azure contains a | join kind=inner to combine the DeviceProcess and DeviceRegistry tables. The Splunk app I am using: https://splunkbase.splunk.com/app/5518. So basically I'd like to do the same join between DeviceProcess and DeviceRegistry events in an advanced hunting query (| advhunt) in Splunk SPL. Is there a suitable Splunk query for this kind of purpose?
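One way to approximate a KQL inner join once both event types are indexed in Splunk is to correlate them with stats on a shared key (the index, sourcetypes, and field names below are placeholders, not confirmed for this app):

    index=defender (sourcetype="DeviceProcessEvents" OR sourcetype="DeviceRegistryEvents")
    | stats values(ProcessCommandLine) as ProcessCommandLine values(RegistryKey) as RegistryKey by DeviceId
    | where isnotnull(ProcessCommandLine) AND isnotnull(RegistryKey)

Dropping rows where either side is missing makes this behave like an inner join; SPL's join command also works but is constrained by subsearch result limits.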
@dtburrows3 Would you be able to share your code or a snippet of the relevant function calls? I am trying to create a similar expansion command but have not yet been able to locate the appropriate functions to use in the Splunk Python SDK.
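Not @dtburrows3's code, but as a generic starting point, a custom streaming command built with the Splunk Python SDK (splunklib) usually looks something like the sketch below; the command name and field-splitting logic are invented for illustration:

    #!/usr/bin/env python
    # bin/expandfield.py - hypothetical streaming command using splunklib.searchcommands
    import sys
    from splunklib.searchcommands import dispatch, StreamingCommand, Configuration, Option, validators

    @Configuration()
    class ExpandFieldCommand(StreamingCommand):
        """Emit one event per comma-separated value of the given field."""
        field = Option(require=True, validate=validators.Fieldname())

        def stream(self, records):
            for record in records:
                raw = str(record.get(self.field, ''))
                for part in raw.split(','):
                    expanded = dict(record)
                    expanded[self.field] = part.strip()
                    yield expanded

    dispatch(ExpandFieldCommand, sys.argv, sys.stdin, sys.stdout, __name__)

The command would also need a commands.conf stanza (chunked = true) pointing at the script; that part is omitted here.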
What sort of button are you using? Do you mean the Submit button? Please provide more details of what you tried and how it failed.
Hi, sorry for the lack of reply. I'm afraid it didn't work for me; I ended up having to use the button to open a search in a new tab. Is there actually a way to run a search from a button without displaying the results?
Ah, gotcha. Yes, it's configured, the setup is correct, server roles are set, and I use it often for various things. I can see data in pretty much every other dashboard, and even "Index Detail: Deployment" *does* show some volume information, as does "Indexes and Volumes: Deployment". When I "open in search" pretty much any panel in the Monitoring Console, I can see the query and the macro or rest call it uses, but on these volume detail pages they are all "undefined". The info reported on the individual indexes is correct; I use it to trim and set limits on various indexes. In "Volume Detail: Instance" I do see each indexer populated in the dropdown, but the Volume (token) is empty. To recap: all my dashboards in the Monitoring Console on my management server have data except for "Volume Detail: Instance" and "Volume Detail: Deployment". Honestly, if it's NOT configured correctly (the management server/console), then I'm not sure what to fix. I know this doesn't always mean a Good Thing, but I have been using Splunk since v3.x.
Hey, your message made me realize that some information is missing from my question. The IBMi data is in JSON format and is ingested through the HTTP Event Collector. I didn't understand what I have to do to make Splunk understand the data and do the correlation.
Hello @Maxime, By default Splunk tries to parse the data ingested from whatever log source has been onboarded. However, there's no guarantee that Splunk will be able to understand the log source completely and provide you with the fields. There are lots of apps and add-ons available on Splunkbase for exactly this purpose (to collect and parse the data). However, if you do not find an associated app/add-on, you can write the sourcetype configuration as per your requirement and you should then be able to get the necessary fields. Also, if the data generated is in a structured format (JSON, XML, CSV, etc.), Splunk has parsing for those by default. In that case, you'll be able to directly visualize the data. You can find the relevant documentation links below:
- https://docs.splunk.com/Documentation/Splunk/latest/Data/WhatSplunkdoeswithyourdata
- https://docs.splunk.com/Documentation/Splunk/latest/Data/Overviewofeventprocessing
- https://docs.splunk.com/Documentation/Splunk/latest/Data/Createsourcetypes
- https://lantern.splunk.com/Splunk_Platform/Product_Tips/Data_Management/Configuring_new_source_types
Thanks, Tejas.
--- If the above solution helps, an upvote is appreciated.
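As a minimal sketch (the sourcetype name is hypothetical), a custom sourcetype for JSON events arriving over HEC could be declared in props.conf so that the fields are extracted automatically at search time:

    # props.conf
    [ibmi:json]
    KV_MODE = json
    SHOULD_LINEMERGE = false
    # TIME_PREFIX / TIME_FORMAT would point at whatever timestamp field the IBMi events carry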
At a high level: Splunk can ingest most types of log data, using different methods of collection. Splunk works on key-value pairs, so if IBMi can send the data, you can then search it. The first thing to do is to look at the IBMi data and work out what format it is in and how to collect it; for example, is it in a text log file, JSON, XML, a database, syslog, or an API? You then need to set up the data collection method. This could involve a UF (the Splunk universal forwarder agent), HEC using the HTTP API, syslog, etc.; it has to be based on your environment and preferred method of collecting IBMi data, and it places that data into an index. You then need to check whether there's a Splunk add-on on Splunkbase for IBMi data, which is used for parsing the data; if there isn't one, you need to develop Splunk props and transforms for parsing the data. You then have to make the IBMi data CIM compliant: analyse what type of data it is, extract it via parsing, and map those fields to CIM fields so that Splunk Enterprise Security can make use of that data.
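As a rough example of that last CIM-mapping step (the sourcetype name is hypothetical; Remote_IP is the field mentioned elsewhere in this thread), a props.conf alias can map the raw field onto the CIM src_ip field:

    # props.conf in the app that owns the IBMi sourcetype
    [ibmi:json]
    FIELDALIAS-ibmi_cim = Remote_IP AS src_ip

Eventtypes and tags (eventtypes.conf / tags.conf) then mark the events so the relevant CIM data model picks them up.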
Hello, I have been using the Splunk SIEM tool for some time. I have integrated security data from IBMi servers so it can be reused. Since the information included in Splunk is generated by the IBMi, I wonder whether Splunk understands the data it receives? An example: when the IBMi sends a field called Remote_IP, can Splunk know that it is an IP address? Do I have to change the format of this data? I also wonder how to do data correlation in Splunk. Thanks in advance for reading.
Thanks @inventsekar, I don't have or want access to the Splunk system or user files; I only have access through the Web UI. The Splunk documentation at https://docs.splunk.com/Documentation/Splunk/9.2.0/Alert/CronExpressions actually states: You can customize alert scheduling using a time range and cron expression. The Splunk cron analyzer defaults to the timezone where the search head is configured. This can be verified or changed by going to Settings > Searches, reports, and alerts > Scheduled time. But nowhere below that do they explain how to change the timezone for the cron schedule. And when I go to my alert and choose "Advanced Edit", I get a huge page with ~450 fields but no time zone anywhere. There is the field cron_schedule (and next_scheduled_time), but again, no way to change the time zone for the schedule. So I conclude it's simply not possible.
Thanks. I want to append the IP to the existing lookup test_MID_IP.csv.
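A minimal sketch of appending a value to that lookup (the ip field name and the literal address are assumptions; in practice the value would come from your search results):

    | makeresults
    | eval ip="10.0.0.1"
    | table ip
    | outputlookup append=true test_MID_IP.csv

With append=true, outputlookup adds the new rows to the existing file instead of overwriting it; any columns in the lookup that the search does not supply are left blank for the appended rows.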
Hi @SaintNick, let us know if we can help you more, or please accept one answer for the other people of the Community. Ciao and happy splunking, Giuseppe. P.S.: Karma Points are appreciated by all the Contributors.