Where can I find the HL7 add-on for Splunk? We built a solution around it for the healthcare field, and we now have an official go-ahead for a POC with Splunk in Asia. We need the HL7 add-on. Can you please help us? Thanks, Sanjay
As @mmccul_slac says, INDEXED_EXTRACTIONS = json is what causes this behaviour. When JSON data comes in with that set, Splunk parses the JSON and creates indexed fields at index time; then at search time Splunk parses the JSON again and creates the same fields, hence the duplicates. See https://community.splunk.com/t5/Getting-Data-In/Why-is-my-sourcetype-configuration-for-JSON-events-with-INDEXED/td-p/188551 — it may also depend on how the data reaches HEC and whether it passes through an intermediate Splunk universal forwarder.
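A minimal props.conf sketch of the two usual fixes, assuming a hypothetical sourcetype name my_json (pick one option, not both):

```conf
# Option A: keep index-time extraction, suppress search-time JSON parsing
# (INDEXED_EXTRACTIONS must live on the instance that first parses the data,
#  e.g. the UF/HF; KV_MODE and AUTO_KV_JSON apply on the search head)
[my_json]
INDEXED_EXTRACTIONS = json
KV_MODE = none
AUTO_KV_JSON = false

# Option B: skip index-time extraction and rely on search-time parsing only
[my_json]
INDEXED_EXTRACTIONS = none
KV_MODE = json
```

Either way the point is the same: the JSON should be parsed once, not twice.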
Have you configured your forwarder, firstly to collect data from the host and secondly where to send it? https://docs.splunk.com/Documentation/Forwarder/9.1.0/Forwarder/Configuretheuniversalforwarder https://docs.splunk.com/Documentation/Forwarder/9.1.0/Forwarder/Configureforwardingwithoutputs.conf Have you created an index that the UF will send its data to?
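For reference, a minimal sketch of the two configs on the universal forwarder. The monitored path, sourcetype, index name, and indexer address here are all placeholders, not values from your environment:

```conf
# inputs.conf on the universal forwarder: what to collect
[monitor:///var/log/myapp.log]
index = uf_data
sourcetype = myapp

# outputs.conf on the universal forwarder: where to send it
[tcpout]
defaultGroup = primary

[tcpout:primary]
server = 192.0.2.10:9997
```

The index named in inputs.conf must already exist on the indexer, or the data will be dropped or land in the last-chance index depending on version and settings.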
You're most of the way there. In your original search, replace the hard-coded date with a subsearch in square brackets and put your makeresults search inside it. The bracketed search runs before the rest of the search, and its return value is substituted in place:
| ldapsearch search="(&(objectClass=user)(whenChanged>=[| makeresults | eval whenChanged=strftime(relative_time(now(),"-2d@d"),"%Y%m%d%H%M%S.0Z") | return $whenChanged])(!(objectClass=computer)))"
| table cn whenChanged whenCreated
I forgot to tell you what my inputs.conf contains:
[WinEventLog://Application]
disabled = 0
[WinEventLog://Security]
disabled = 0
[WinEventLog://System]
disabled = 0
[WinEventLog://Setup]
disabled = 0

My outputs.conf:
[tcpout]
defaultGroup = default-autolb-group
[tcpout:default-autolb-group]
server = 192.168.1.2:9997
[tcpout-server://192.168.1.2:9997]
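The forwarder side above looks plausible; the other half is that the indexer must actually be listening on that port. A sketch of the receiving-side config (the same thing Settings > Forwarding and receiving > Configure receiving does in Splunk Web):

```conf
# inputs.conf on the indexer, e.g. $SPLUNK_HOME/etc/system/local/inputs.conf
[splunktcp://9997]
disabled = 0
```

After adding it, restart Splunk on the indexer and check splunkd.log on the forwarder for connection errors.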
Hello, I'm trying to learn Splunk, and for that I have set up a demo version in my home lab on a Linux system. I have Splunk running and I have added the local files. Then I enabled port 9997 and installed a universal forwarder on my Windows 10 PC. With tcpdump on the Linux box I can see packets arriving on port 9997, but I can't get the data into Splunk! When I try to add data from a forwarder manually, I see a message that I have no forwarders configured. What am I doing wrong?
Hi Shawno: The Palo Alto app and add-on are supported by Palo Alto, and they're very happy to work with folks on getting the app working. I recommend you use your power as a PA customer to reach out to them for more specific help. If you'd like this community to help, you will need to be more specific: tell us which dashboard, and whether there is more information about the error (is there actual context, or does it just say "error"?). If you can give a bit more info here, people will be able to help out. Also, you mention an error on a dashboard but are also talking about data no longer ingesting. Are you sure? Is the data there if you just search index=blah sourcetype=blah... ? Or is it that the error is stopping data from po...
"A custom JavaScript error caused an issue loading your dashboard." I'm experiencing this error on both the Palo Alto Networks app and the add-on, and I'm unsure why reporting is no longer ingesting data. Thanks
I need to run a daily LDAP search that grabs only the accounts that have changed in the last 2 days. I can hard-code a date into the whenChanged attribute:
| ldapsearch search="(&(objectClass=user)(whenChanged>=20230817202220.0Z)(!(objectClass=computer)))"
| table cn whenChanged whenCreated
I am trying to make whenChanged a "last 2 days" variable that works with ldapsearch. I can create a whenChanged value using:
| makeresults | eval whenChanged=strftime(relative_time(now(),"-2d@d"),"%Y%m%d%H%M%S.0Z") | fields - _time
I could use some help getting that dynamic value into the LDAP search so that I am matching whenChanged values >= it.
You'd need to use btool to check at the OS level for any configs for that source and sourcetype, e.g.:
splunk btool props list RanorexJSon
splunk btool props list source::ElectraExtendedUI
(Make sure to get the sourcetype and source names exactly right.) You're looking for parameters about indexed extractions. Since props can apply to both a sourcetype and a source (as well as a host, but that's less likely), search for both.
Thank you for your input. I found a workaround; here is the code. Luckily I have a timestamp field in my lookup file, so I am making use of that. If you have any ideas to make it better, please let me know.
| inputlookup lkp_sds_wms_trw_slislo.csv
| eval start_date = strftime(relative_time(now(),"-60d@d"), "%Y-%m-%d")
| eval Endtimestamp = strptime(start_date, "%Y-%m-%d")
| where timestamp > Endtimestamp
| outputlookup lkp_sds_wms_trw_slislo.csv
Thanks again, Regards, Amit
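A possibly simpler variant of the same idea, collapsing the two evals into one comparison. This is an untested sketch that assumes, as the search above does, that the lookup's timestamp field holds epoch seconds:

```spl
| inputlookup lkp_sds_wms_trw_slislo.csv
| where timestamp > relative_time(now(), "-60d@d")
| outputlookup lkp_sds_wms_trw_slislo.csv
```

Since relative_time already returns an epoch value, the strftime/strptime round-trip should not be needed.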
It's EST. Also, that is not the problem; it's the date too. The document name is 09042023_test.txt and inside it has something like: ID=101010 processed_date=09/03/2023. Today's date is 09/05/2023, but when the forwarder forwards the file, Splunk takes the date inside the document as the event timestamp, so the search has to go 2 days back to find the data.
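If the intent is to index these events with the current time rather than the processed_date found inside the file, one option is to disable timestamp extraction for that sourcetype in props.conf on the instance that parses the data (the sourcetype name here is a placeholder):

```conf
[my_txt_files]
DATETIME_CONFIG = CURRENT
```

Alternatively, if the date inside the file is the one you want but it is being parsed in the wrong format or timezone, TIME_PREFIX, TIME_FORMAT, and TZ can pin Splunk to the exact field instead of letting it guess.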
Just add any additional character groupings into the allowed character ranges, i.e. | rex field=Group max_match=0 "'(?<g>[A-Za-z_\.]+)':'"
| rex field=Value max_match=0 "'(?<v>[A-Za-z_\.]+)'"
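For example, if the values can also contain digits or hyphens, the character classes can be widened accordingly (a sketch, not tested against your actual data):

```spl
| rex field=Group max_match=0 "'(?<g>[A-Za-z0-9_\.\-]+)':'"
| rex field=Value max_match=0 "'(?<v>[A-Za-z0-9_\.\-]+)'"
```

The general rule: any character that can legitimately appear in the value must be inside the square brackets, or the match will stop short at it.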
The problem turned out to be that, since I have the add-on on a heavy forwarder where I had only applied the Forwarder license, DB Connect needs something called the KV store, which is only available with a paid license. After asking support, I was provided with a free license and the problem was solved.
The forwarder is forwarding (e.g.) /var/log/test.txt, and the file is test.txt. It is active, because I can see the events in search; it's just the dates that are not coming through correctly. props.conf is sitting in /etc/apps/test/local/props.conf
I don't know what dataset you're working with, but the first thing that comes to mind is that your datamodel is not accelerated. If you don't have accelerated summaries, tstats has nothing to operate on. And it's completely irrelevant whether it's a Docker image, a VM, or a bare-metal install.
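A quick way to check: run tstats once against the summaries only and once allowing a raw-data fallback. If the first returns nothing while the second returns counts, acceleration is the problem. The datamodel and object names below are placeholders for your own:

```spl
| tstats summariesonly=true count from datamodel=My_DM.My_Object

| tstats summariesonly=false count from datamodel=My_DM.My_Object
```

You can also confirm acceleration status under Settings > Data models, where each model shows whether acceleration is enabled and how complete the summary is.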