All Posts

+1! « $Request failed with the following error code: 400 » Tx!
This did not work. Thank you.
Hi, basically I had an HTML button in a panel, next to a text box. When clicked, the button was supposed to run a search that added the text entered in the panel to a lookup. The problem is, the only way I could get the button to do this was to use the full URL of the search, which opens in a new tab. When I tried the JavaScript approach, I got messages about running potentially unsafe scripts, but since the original method worked I stuck with that. I just want to know if it's possible to use an HTML button to run a search without opening a new tab. I have tried various ways but haven't had any success. Thanks
Holy Cow! Per that document, I tried enabling mc_auto_config = enabled ... and it removed all my indexers from the cluster. Good times. I think I'll just learn to live without those volume dashboards; it wouldn't be the first time I had to ignore missing functionality. Pro tip: don't bother taking the advanced troubleshooting class at Splunk .conf -- it didn't prepare me for anything useful...
Hi there, I am trying to get some data from MS Defender into a Splunk query. My original KQL query in Azure contains | join kind=inner to combine the DeviceProcess and DeviceRegistry tables. The Splunk app I am using: https://splunkbase.splunk.com/app/5518. So basically I'd like to join DeviceProcess and DeviceRegistry events in an advanced hunting query (| advhunt) in Splunk SPL. Is there a suitable Splunk query for this kind of purpose?
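One possible SPL translation of a KQL inner join (a sketch only -- the exact advhunt arguments and the DeviceId join key below are assumptions, not verified against the add-on's documentation): run one advanced hunting query, then join the second on the shared key.

```
| advhunt query="DeviceProcessEvents"
| join type=inner DeviceId
    [| advhunt query="DeviceRegistryEvents"]
```

Note that SPL's join is subject to subsearch row and time limits; for larger result sets, an append followed by | stats values(*) as * by DeviceId often scales better than join.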
@dtburrows3 Would you be able to share your code or a snippet of the relevant function calls? I am trying to create a similar expansion command but have not yet been able to locate the appropriate functions to use in the Splunk Python SDK.
What sort of button are you using? Do you mean the Submit button? Please provide more details of what you tried and how it failed.
Hi, sorry for the lack of reply. I'm afraid it didn't work for me; I ended up having to use the button to open a search in a new tab. Is there actually a way to run a search from a button without displaying the results?
Ah, gotcha. Yes, it's configured, the setup is correct, server roles are set, and I use it often for various things -- I can see data in pretty much every other dashboard. Even "Index Detail: Deployment" *does* show some volume information, as does "Indexes and Volumes: Deployment". Opening pretty much any panel in the Monitoring Console in search, I can see the query and the macro or rest call it uses, but in these volume detail pages they are all "undefined". The info reported on the individual indexes is correct -- I use it to trim and set limits on various indexes. In "Volume Detail: Instance" I do see each indexer populated in the dropdown, but the Volume token is empty. To recap: all my dashboards in the Monitoring Console on my management server have data except for Volume Detail: Instance and Volume Detail: Deployment. Honestly, if the management server/console is NOT configured correctly, then I'm not sure what to fix. I know this doesn't always mean a Good Thing, but I have been using Splunk since v3.x.
Hey, your message made me realize that some information is missing from my question. The IBMi data is in JSON format and is ingested via the HTTP Event Collector. I didn't understand what I had to do to make Splunk understand the data and perform the correlation.
Hello @Maxime, By default Splunk tries to parse the data ingested from whatever log source has been onboarded. However, there's no guarantee that Splunk will be able to understand the log source completely and provide you with the fields. There are lots of apps and add-ons available on Splunkbase for exactly this purpose (to collect and parse the data). However, if you do not find an associated app/add-on, you can write the sourcetype configuration as per your requirement, and you should then be able to get the necessary fields. Also, if the data is generated in a structured format (JSON, XML, CSV, etc.), Splunk has parsing for those by default; in that case, you'll be able to visualize the data directly. You can find the relevant documentation links below: - https://docs.splunk.com/Documentation/Splunk/latest/Data/WhatSplunkdoeswithyourdata - https://docs.splunk.com/Documentation/Splunk/latest/Data/Overviewofeventprocessing - https://docs.splunk.com/Documentation/Splunk/latest/Data/Createsourcetypes - https://lantern.splunk.com/Splunk_Platform/Product_Tips/Data_Management/Configuring_new_source_types   Thanks, Tejas.   --- If the above solution helps, an upvote is appreciated.
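As a concrete illustration of writing a custom sourcetype configuration, a minimal props.conf stanza for JSON data arriving over HEC might look like this (a sketch only -- the stanza name and the timestamp field are hypothetical, not from this thread):

```
# props.conf (sketch -- sourcetype name and timestamp field are assumptions)
[ibmi:security:json]
KV_MODE = json
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TIME_PREFIX = "timestamp":\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S
```

With KV_MODE = json, Splunk extracts the JSON keys as search-time fields automatically; the timestamp settings only matter if the event carries its own timestamp rather than relying on the HEC-assigned time.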
At a high level: Splunk can ingest most types of log data, using different methods of collection. Splunk works on key-value pairs - so if IBMi can send the data, you can then search it. The first thing to do is to look at the IBMi data and work out what format it is in and how to collect it: for example, is it in a text log file, JSON, XML, a DB, syslog, or an API? You then need to set up the data collection method - this could involve a UF (Splunk agent), HEC using the HTTP API, or syslog, etc. - based on your environment and preferred method of collecting IBMi data, and place that data into an index. Next, look for a Splunk add-on for IBMi data on Splunkbase; this is used for parsing the data. If there isn't one, you need to develop Splunk props and transforms to parse the data. Finally, you have to make the IBMi data CIM compliant: analyse what type of data it is, extract fields via parsing, and map those fields to CIM fields, so Splunk Enterprise Security can make use of that data.
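The CIM-mapping step described above can be done at search time with field aliases and evals in props.conf. A minimal sketch (the sourcetype name is hypothetical; Remote_IP is the field mentioned elsewhere in this thread, and src_ip is the corresponding CIM field name):

```
# props.conf (sketch -- search-time CIM mapping, stanza name is an assumption)
[ibmi:security:json]
FIELDALIAS-ibmi_src = Remote_IP AS src_ip
EVAL-vendor_product = "IBM i"
```

Tags and eventtypes would then associate the events with a CIM data model so Enterprise Security correlation searches can pick them up.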
Hello, I have been using the Splunk SIEM tool for some time. I have integrated security data generated by IBMi servers. The data arrives in Splunk exactly as the IBMi generates it, so I wonder whether Splunk understands the data it receives. For example, when the IBMi sends a field called Remote_IP, can Splunk know that it is an IP address? Do I have to change the format of this data? I also wonder how to do data correlation in Splunk. Thanks in advance for reading.
Thanks @inventsekar, I don't have (or want) access to the Splunk system or user files; I only have access through the Web UI. The Splunk documentation at https://docs.splunk.com/Documentation/Splunk/9.2.0/Alert/CronExpressions actually states: "You can customize alert scheduling using a time range and cron expression. The Splunk cron analyzer defaults to the timezone where the search head is configured. This can be verified or changed by going to Settings > Searches, reports, and alerts > Scheduled time." But nowhere below do they explain how to change the timezone for the cron schedule. And when I go to my alert and choose "Advanced Edit", I get a huge page with ~450 fields but nowhere the time zone. There are the fields cron_schedule (and next_scheduled_time), but again, no way to change the time zone for the schedule. So I conclude it's simply not possible.
Thanks. I want to append the IP to the existing lookup  test_MID_IP.csv
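A common pattern for appending a value to an existing lookup is outputlookup with append=true. A sketch (the makeresults/eval lines stand in for whatever search produces the IP value in your dashboard; the field name IP is an assumption about the lookup's column):

```
| makeresults
| eval IP="203.0.113.10"
| fields IP
| outputlookup append=true test_MID_IP.csv
```

With append=true, the rows are added to the end of test_MID_IP.csv instead of overwriting it; without it, outputlookup replaces the whole file.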
Hi @SaintNick , let us know if we can help you more, or, please, accept one answer for the other people of Community. Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated by all the Contributors
Start with this.  Adjust the values as necessary.  Have the alert trigger when the number of results is not zero.

index=<<index where your perfmon data is stored>> source=disk
| where storage_free_percent < <<your desired value>>
The missing information can be the result of one or several missing/wrong configurations either on the MC or the IDXs; that will depend on the architecture you have, so it's important to frame your case. That dashboard displays "Search is waiting for input" because there are probably missing tokens, like the Volume dropdown, right? As an example, my system doesn't have any volumes defined, so the "Volume" dropdown will not populate, preventing the dashboard from running searches and thus showing Undefined. Did you set up the Monitoring Console on that Management Node? Can it "see" all the indexers in Settings > Distributed Search? Is the info about individual indexes accurate? If you go to MC > Settings > General Setup, do all instances show with correct information? https://docs.splunk.com/Documentation/Splunk/9.2.1/DMC/Configureindistributedmode
Hi @P_vandereerden  Yes, as per the log pattern there are distinct transaction IDs with the ORA-00001 error message. The requirement is to identify all such transactions with the error message. Please suggest.
Hi All, I want to extract an email address from a JSON event in Splunk. The query I am using is:

index=*sec sourcetype=test
| eval tags_json=spath(_raw, "Tag{}"), final_tag_json=json_object()
| foreach mode=multivalue tags_json
    [ | eval final_tag_json=json_set(final_tag_json, spath('<<ITEM>>', "Key"), spath('<<ITEM>>', "Value"))]
| spath input=final_tag_json
| rex field=Email "(?<email>^\w+@abc.com$)"

Raw data:

"Tag": [{"Key": "app", "Value": “test”_value}, {"Key": "key1", "Value": "value1"}, {"Key": "key2", "Value": "value2"}, {"Key": “email”, "Value": “test@abc.com}],

I want the email to be mapped to contact when indexed. How can I achieve this? Please help me. Regards, pnv
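One search-time approach for this kind of key/value tag array (a sketch only -- it assumes the Tag structure shown above, and it renames at search time rather than mapping at index time): expand the Tag array, pick out the email entry, and extract it into a contact field. Note the escaped dot in the regex, and that the casing of the "email" key in the sample data varies.

```
index=*sec sourcetype=test
| spath path=Tag{} output=tags
| mvexpand tags
| eval key=lower(spath(tags, "Key")), value=spath(tags, "Value")
| where key="email"
| rex field=value "(?<contact>^\w+@abc\.com$)"
```

True index-time mapping would instead require an indexed extraction via transforms.conf, which is usually unnecessary when a search-time extraction like this suffices.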