All Posts

Thanks @gcusello. Is it possible to define it like what you did, i.e. [TMAO_sourcetype]? And if so, that should be the sourcetype of the data source, right?
Hi @KhalidAlharthi ,
in props.conf you have to use only the sourcetype of the logs to send to syslog. If there is more than one, add more stanzas to props.conf.

# props.conf
[TMAO_sourcetype]
TRANSFORMS-send_foo_to_remote_siem = send_foo_to_remote_siem

# transforms.conf
[send_foo_to_remote_siem]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = remote_siem

# outputs.conf
[tcpout:remote_siem]
server = remotesiem:1234
sendCookedData = false

As I said, check the exact sourcetype name: I recently solved an issue similar to yours, where the error was the exact sourcetype name.
Ciao.
Giuseppe
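For reference, "more stanzas in props" could look like the following minimal sketch; the second stanza name is hypothetical and should be replaced with your real sourcetype:

# props.conf
[TMAO_sourcetype]
TRANSFORMS-send_foo_to_remote_siem = send_foo_to_remote_siem

[second_sourcetype]
TRANSFORMS-send_foo_to_remote_siem = send_foo_to_remote_siem

Both stanzas can point at the same transforms.conf stanza, so the routing logic is defined only once.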
Hi @rmo23 ,
as @yuanliu also said, you should share more details about your infrastructure. Anyway, in ITSI there's an asset inventory that should be complete (otherwise you have a much bigger issue!). So you could use the lookup containing these assets (I don't remember its name) and run a search like the following:

| tstats count where index=* BY host
| append [ | inputlookup your_asset_lookup | eval count=0 | fields host count ]
| stats sum(count) AS total BY host
| where total=0

Ciao.
Giuseppe
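One caveat: the final stats groups by host, so the lookup must expose its asset column under that same field name. A minimal sketch of the subsearch, assuming a hypothetical column called asset_name (adjust to your lookup's actual field):

| inputlookup your_asset_lookup
| rename asset_name AS host
| eval count=0
| fields host count

Without the matching field name, the appended rows are dropped by the stats BY host clause and the missing hosts never surface in the results.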
By this you are sending all the events to the remote SIEM. I need to send just TMAO (Trend Micro). So what is the best approach to do this using syslog?
Hi @shimada-k ,
good for you, see you next time!
Let us know if we can help you more, or, please, accept one answer for the other people of the Community.
Ciao and happy splunking
Giuseppe
P.S.: Karma Points are appreciated by all the contributors
Hi @irisk ,
good for you, see you next time!
Ciao and happy splunking
Giuseppe
P.S.: Karma Points are appreciated
Hi @KhalidAlharthi ,
does your solution run? I found an error: the transform name is missing from props.conf. I'm not sure that you can put the TRANSFORMS in the [default] stanza, and I don't like using a regex on the index field, so I'd use a different approach:

# props.conf
[your_sourcetype]
TRANSFORMS-send_foo_to_remote_siem = send_foo_to_remote_siem

# transforms.conf
[send_foo_to_remote_siem]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = remote_siem

# outputs.conf
[tcpout:remote_siem]
server = remotesiem:1234
sendCookedData = false

Then pay attention to the sourcetype: you must be sure that, in props.conf, you are using the original sourcetype and not one transformed by the add-on.
Ciao.
Giuseppe
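To confirm which props.conf stanza actually applies, one option, assuming command-line access to the forwarder, is Splunk's btool utility (your_sourcetype is a placeholder):

$SPLUNK_HOME/bin/splunk btool props list your_sourcetype --debug

This prints the merged settings for that stanza together with the file each one comes from, which helps spot an add-on that renames the sourcetype before your transform runs.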
I used the Splunk web interface, went to Reports > Edit Acceleration for the specific report, clicked Save, and it says "This search cannot be accelerated". Please find the screenshot in the other reply.
Splunk says "This search cannot be accelerated" when I go to enable acceleration for the report and hit save,
@VatsalJagani
1. Duplicate logs are ingested into Splunk. We tried to change the checkpoint file value, but even after that, duplicates are ingested at 2 AM.
2. We are using a Python script to ingest TYK MongoDB logs into Splunk.
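A quick way to confirm the 2 AM events are exact duplicates is to hash the raw events; a sketch with a hypothetical index name:

index=your_tyk_index earliest=-24h
| eval raw_hash=md5(_raw)
| stats count BY raw_hash
| where count > 1

If many hashes show count > 1, the script is re-reading past its checkpoint; the usual fix is to persist the last-seen MongoDB _id or timestamp after each successful batch instead of relying on a static checkpoint value.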
I am not particularly concerned about the vulnerability right now, but this old OpenSSL version is causing problems when I try to develop new apps. I know it might not be a Splunk problem. urllib3 v2 is a dependency of a package that I need to use, and as I understand it, this version of OpenSSL is not supported by the library or by newer versions of Python. The error message is:

urllib3 v2 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with 'OpenSSL 1.0.2zi-fips 1 Aug 2023'. See: https://github.com/urllib3/urllib3/issues/2168
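A common workaround, assuming the package you need accepts it, is to pin urllib3 below v2, since the 1.26.x line still supports OpenSSL 1.0.2:

# requirements.txt (hypothetical pin; verify your dependency allows urllib3 1.x)
urllib3<2

This only sidesteps the error; the real fix is a Python build linked against OpenSSL 1.1.1 or later.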
I have used this approach to forward logs from a specific index to a third-party system, in my case QRadar. So I need to do the same, forwarding a specific index, but using syslog instead of TCP, because TCP takes time (I ran tcpdump to figure that out). This is the approach I follow:

# props.conf
[default]
TRANSFORMS-send_foo_to_remote_siem

# transforms.conf
[send_foo_to_remote_siem]
REGEX = foo
SOURCE_KEY = _MetaData:Index
DEST_KEY = _TCP_ROUTING
FORMAT = remote_siem

# outputs.conf
[tcpout:remote_siem]
server = remotesiem:1234
sendCookedData = false

Thanks
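For syslog output specifically, a minimal sketch under the same assumptions (index named foo, hostname and port are placeholders) swaps _TCP_ROUTING for _SYSLOG_ROUTING and uses a [syslog:...] output group:

# transforms.conf
[send_foo_to_remote_siem_syslog]
REGEX = foo
SOURCE_KEY = _MetaData:Index
DEST_KEY = _SYSLOG_ROUTING
FORMAT = remote_siem_syslog

# outputs.conf
[syslog:remote_siem_syslog]
server = remotesiem:514
type = udp

Note that syslog output requires a heavy forwarder (universal forwarders cannot send to a syslog server), and events are sent raw with a syslog header prepended.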
Thanks for your reply @tscroggins. Can I forward using syslog instead of TCP, because the TCP handshake takes time? Thanks again.
The problem was with the JSON: a single quote was used instead of a double quote. Thanks for the help.
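For anyone hitting the same error: JSON requires double quotes around keys and string values, so the first form below is invalid while the second parses:

{'key': 'value'}
{"key": "value"}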
@tscroggins I did the following:

outputs.conf

[tcpout:tmao]
server = xxx.xxx.xxx.xxx:9997
# Forward data for the "tmao" index
forwardedindex.0.whitelist = tmao
sendCookedData = false

props.conf

[source::udp:1517]
TRANSFORMS-routing = route_to_tmao_index

transforms.conf

[route_to_tmao_index]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = tmao

Is my configuration good? I want to forward the tmao index to another third-party system.

Thanks
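One thing to double-check against the outputs.conf spec: the forwardedindex.<n>.whitelist settings are only honored under the global [tcpout] stanza, not under a named group like [tcpout:tmao], and with REGEX = . the transform routes every event arriving on udp:1517. A sketch that filters on the index inside the transform instead (same stanza names as above):

# transforms.conf
[route_to_tmao_index]
REGEX = ^tmao$
SOURCE_KEY = _MetaData:Index
DEST_KEY = _TCP_ROUTING
FORMAT = tmao

This routes only events whose index is tmao, regardless of which input they arrived on.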
I have 2 queries, each of which has a subsearch with an inputlookup.

Query 1: outputs the timechart for CPU1. It counts each process listed in the CPU1 field of test.csv.

index=custom
| eval SEP=split(_raw,"|"), CPU1=trim(mvindex(SEP,1))
| bin _time span=1m
| stats count(CPU1) as CPU1_COUNT by _time CPU1
| search [ | inputlookup test.csv | fields CPU1 | fillnull value=0 | format ]

Query 2: outputs the timechart for CPU2. It counts each process listed in the CPU2 field of test.csv.

index=custom
| eval SEP=split(_raw,"|"), CPU2=trim(mvindex(SEP,1))
| bin _time span=1m
| stats count(CPU2) as CPU2_COUNT by _time CPU2
| search [ | inputlookup test.csv | fields CPU2 | fillnull value=0 | format ]

test.csv (sample)

CPU1         CPU2         CPU3
process_a    process_b    process_c
process_d    process_e    process_f
process_g    process_i    process_h

What I want is to display the CPU1 and CPU2 timecharts in one chart. Any advice on that would be a great help. Thanks
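A sketch that combines both series in one chart, under the same assumptions as the queries above (same raw format, delimiter position, and lookup fields):

index=custom
| eval SEP=split(_raw,"|"), proc=trim(mvindex(SEP,1))
| search [ | inputlookup test.csv | fields CPU1 | rename CPU1 AS proc | format ]
| eval series="CPU1"
| append
    [ search index=custom
      | eval SEP=split(_raw,"|"), proc=trim(mvindex(SEP,1))
      | search [ | inputlookup test.csv | fields CPU2 | rename CPU2 AS proc | format ]
      | eval series="CPU2" ]
| timechart span=1m count by series

Renaming both lookup columns to a common field (proc) lets a single timechart split by series, so CPU1 and CPU2 appear as two lines on the same chart.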
Hi @ViniciusMariano,

In Simple XML, you can use row and panel elements to group inputs and visualizations. To display objects side-by-side, place them in separate panel elements. To display objects stacked top-to-bottom, place them in the same panel element. Combine panel elements within row elements for mixed layouts.

<form version="1.1" theme="light">
  <label>Quality Management Storage Rework</label>
  <fieldset submitButton="false"></fieldset>
  <row>
    <panel>
      <input type="dropdown" token="region_tok">
        <label>Region</label>
        <choice value="All">All</choice>
        <default>All</default>
        <initialValue>All</initialValue>
      </input>
      <input type="dropdown" token="info_tok">
        <label>Info</label>
        <choice value="General">General</choice>
        <default>General</default>
        <initialValue>General</initialValue>
      </input>
      <chart>
        <title>Chart</title>
        <search>
          <query>| makeresults count=10 | streamstats count as x | eval y=random()%10 | table x y</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
        </search>
        <option name="charting.chart">column</option>
        <option name="charting.drilldown">none</option>
      </chart>
    </panel>
    <panel>
      <table>
        <title>Table</title>
        <search>
          <query>| makeresults count=10 | eval x=random()%10 | table _time x</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
        </search>
        <option name="drilldown">none</option>
      </table>
    </panel>
  </row>
</form>
Hi @alfredoh14,

The "noise" is the Base64 certificate data. To combine multiple PEM certificate files into a single file, simply concatenate the files:

-----BEGIN CERTIFICATE-----
...
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
...
-----END CERTIFICATE-----

General information on PEM is readily available online. Splunk also provides a quick tutorial at https://docs.splunk.com/Documentation/Splunk/latest/Security/HowtoprepareyoursignedcertificatesforSplunk#How_to_configure_a_certificate_chain. You can reference the combined PEM file in the server.conf [sslConfig] stanza sslRootCAPath setting, e.g.:

# $SPLUNK_HOME\etc\system\local\server.conf
[sslConfig]
sslRootCAPath = X:\path\to\cacerts.pem
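Given the Windows-style path in the server.conf example, a concatenation sketch for that platform (file names are placeholders):

REM Windows command prompt
type server_cert.pem intermediate.pem root.pem > cacerts.pem

On Linux or macOS, cat with the same argument order does the same job.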
Hi @KhalidAlharthi,

The basic process is documented at https://docs.splunk.com/Documentation/Splunk/latest/Forwarding/Forwarddatatothird-partysystemsd. Summarizing:

1. Define a props.conf stanza matching your source data. Use [default] to match data at the index level.
2. Define a transforms.conf stanza matching your index and targeting one or more output groups.
3. Define an outputs.conf stanza for your remote system.

For example, to redirect all index=foo events from a heavy forwarder to a remote SIEM on port 1234:

# props.conf
[default]
TRANSFORMS-send_foo_to_remote_siem

# transforms.conf
[send_foo_to_remote_siem]
REGEX = foo
SOURCE_KEY = _MetaData:Index
DEST_KEY = _TCP_ROUTING
FORMAT = remote_siem

# outputs.conf
[tcpout:remote_siem]
server = remotesiem:1234
sendCookedData = false

If defined on an indexer, the events will be indexed locally and forwarded. Note that when using [default], all events will be inspected. The exact settings you need depend on your Splunk architecture and the remote SIEM. I would start by reading Splunk Enterprise Forwarding Data at https://docs.splunk.com/Documentation/Splunk/latest/Forwarding/Aboutforwardingandreceivingdata and asking new questions as needed.
Hi @ianthomas, The Event Timeline Viz add-on is developed and supported by @danspav. You can contact them directly via email (see Splunkbase) or tag them as I have here for support.