All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I am trying to deploy a SH cluster, but when I run the command below:

./splunk init shcluster-config -auth <username>:<password> -mgmt_uri <URI>:<management_port> -replication_port <replication_port> -replication_factor <n> -conf_deploy_fetch_url <URL>:<management_port> -secret <security_key> -shcluster_label <label>

I get this error:

WARNING: Server Certificate Hostname Validation is disabled. Please see server.conf/[sslConfig]/cliVerifyServerName for details.
Login failed

But when I apply the configuration below:

[sslConfig]
cliVerifyServerName = true
sslVerifyServerCert = true

I get this error instead:

ERROR: certificate validation: self signed certificate in certificate chain
Couldn't complete HTTP request: error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed
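A sketch of a possible fix, not from the original post: with sslVerifyServerCert = true, the splunk CLI must be able to validate the server certificate against a trusted CA. If the cluster members use certificates signed by an internal CA (rather than Splunk's default self-signed certificates), pointing sslRootCAPath at that CA bundle usually resolves the "self signed certificate in certificate chain" error. The path below is an example:

```ini
# server.conf on the node running the CLI -- paths/values are examples
[sslConfig]
cliVerifyServerName = true
sslVerifyServerCert = true
# Point this at the CA bundle that signed your management-port certificates
sslRootCAPath = /opt/splunk/etc/auth/mycerts/cacert.pem
```

With Splunk's default out-of-the-box self-signed certificates, strict verification is not really meaningful; deploying certificates issued by a CA you control is the usual recommendation before enabling these settings.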
Hello, I am writing to ask at what point, in terms of EPS or daily ingested GB/day and the number of users simultaneously accessing the search head, I should consider a search head cluster. That is, when is a single SH enough, and when should it be three SHs plus a deployer? What is your technical perspective?
Hi Ryan, unfortunately, applying what is recommended in the doc you shared did not work. In C:\inetpub\wwwroot\wss\VirtualDirectories\{your-site} I added the CSP header to the <httpProtocol> section of the Web.config file:

<system.webServer>
  <httpProtocol>
    <customHeaders>
      <add name="Content-Security-Policy" value="script-src 'unsafe-inline' cdn.appdynamics.com; connect-src peum.kaska.com; img-src cdn.appdynamics.com; child-src cdn.appdynamics.com;" />
    </customHeaders>
  </httpProtocol>
</system.webServer>

The application crashed and we had to roll back. Note: the agent is loaded successfully. Any other suggestions? Where else should I look?
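One way to debug a policy like this without risking another crash (a sketch, not from the original thread): deliver the same policy as Content-Security-Policy-Report-Only first. Browsers then log violations to the developer console instead of blocking resources, which reveals what the enforced policy would have broken:

```xml
<!-- Hypothetical Web.config fragment: same policy, delivered in
     report-only mode so violations are logged but nothing is blocked -->
<system.webServer>
  <httpProtocol>
    <customHeaders>
      <add name="Content-Security-Policy-Report-Only"
           value="script-src 'unsafe-inline' cdn.appdynamics.com; connect-src peum.kaska.com; img-src cdn.appdynamics.com; child-src cdn.appdynamics.com;" />
    </customHeaders>
  </httpProtocol>
</system.webServer>
```

Once the console shows which sources are missing from the policy, they can be added before switching back to the enforcing header.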
It's not about DB Connect itself. It's about JDBC, because that's what's responsible for the actual connection. See https://learn.microsoft.com/en-us/sql/connect/jdbc/setting-the-connection-properties There is an interesting paragraph in the authentication parameter description which might pertain to you.
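If NTLM authentication turns out to be what applies here, a connection URL might look roughly like this (host, port, database, and domain are placeholders; check the linked page for the exact property names supported by your driver version):

```
jdbc:sqlserver://dbhost.example.com:1433;databaseName=mydb;integratedSecurity=true;authenticationScheme=NTLM;domain=EXAMPLE
```

With this scheme the username and password are still supplied explicitly, but the driver authenticates via NTLM rather than SQL Server authentication.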
Hi, I'm interested to know more about RBA Navigator, anyone have the communication method to Matt Snyder the app creator? I would like to know more information about the list of available features,... See more...
Hi, I'm interested to know more about RBA Navigator, anyone have the communication method to Matt Snyder the app creator? I would like to know more information about the list of available features, Use Cases (if possible), and installation guide. Thanks.
It's a bit more complicated than that. Data is not sent from a UF as events (unless you're using indexed extractions); it's sent as chunks (which can cause issues if you have big events and don't have an event breaker configured properly). And it's actually the other way around: you want an event breaker (not a line breaker! since no line breaking happens on the UF) set so that events are _not_ split between two different chunks. The reason is that two chunks of data can go to different outputs from the same group and end up on two different indexers, so even if there were a way to reassemble an event, you wouldn't have anything to reassemble it from. Long story short: you want to make sure your events are _not_ getting split.
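A minimal props.conf sketch of the event-breaker configuration described above, assuming a hypothetical sourcetype `my:sourcetype` whose events start with a timestamp like 2024-10-07 12:34:56:

```ini
# props.conf on the universal forwarder (sourcetype name is hypothetical)
[my:sourcetype]
# Enable per-event chunking so an event never straddles two chunks
EVENT_BREAKER_ENABLE = true
# Break before each leading timestamp; the capture group marks the boundary
EVENT_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}
```

Note that EVENT_BREAKER only controls chunk boundaries on the forwarder; the indexers still need their normal LINE_BREAKER/timestamp settings for final event breaking.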
Hi, I am trying to use a custom JavaScript file to customize some button actions in my dashboard, but it doesn't work and I don't know why. I'm using the latest version of Splunk Enterprise. My custom script is in the folder $SPLUNK_HOME/etc/apps/app_name/appserver/static/. I have tried restarting Splunk Web and using the bump button, but nothing works. Can anyone help me?

Simple XML dashboard code:

<form version="1.1" theme="dark" script="button.js">
  <search>
    <query> | makeresults | eval field1="test", field2="test1", field3="lll", field4="sgsgsg" </query>
    <earliest></earliest>
    <latest>now</latest>
    <done>
      <set token="field1">$result.field1$</set>
      <set token="field2">$result.field2$</set>
      <set token="field3">$result.field3$</set>
      <set token="field4">$result.field4$</set>
    </done>
  </search>
  <label>stacked_inputs</label>
  <fieldset submitButton="false" autoRun="true"></fieldset>
  <row>
    <panel>
      <title>title</title>
      <input id="test_input1" type="text" token="field1">
        <label>field1</label>
        <default>$field1$</default>
        <initialValue>$field1$</initialValue>
      </input>
      <input id="test_input2" type="text" token="field2">
        <label>field2</label>
        <default>$field2$</default>
        <initialValue>$field2$</initialValue>
      </input>
      <html>
        <style>
          #test_input2 { padding-left: 30px !important; }
        </style>
      </html>
    </panel>
  </row>
  <row>
    <panel>
      <input id="test_input3" type="text" token="field3">
        <label>field3</label>
        <default>$field3$</default>
        <initialValue>$field3$</initialValue>
      </input>
      <input id="test_input4" type="text" token="field4">
        <label>field4</label>
        <default>$field4$</default>
        <initialValue>$field4$</initialValue>
      </input>
    </panel>
  </row>
  <row>
    <panel>
      <html>
        <form>
          <div>
            <div>
              <label>Password</label>
              <input type="text" value="$field4$"/>
              <br/>
              <input type="password" id="exampleInputPassword1" placeholder="Password"/>
            </div>
          </div>
          <button type="submit" class="btn btn-primary">Submit</button>
        </form>
        <button onclick="test()">Back</button>
        <button onclick="test1()">Back1</button>
        <button id="back" data-param="test">Back2</button>
      </html>
    </panel>
  </row>
</form>

JavaScript code: as you can see, I have tried different methods. Thanks for your help.
If WSUS writes events to the event log or flat files, you can use the usual methods (wineventlog and monitor inputs) to obtain that data. WID is another story: it's an embedded component and cannot be queried remotely, so the only way to access it would be via some component installed directly on the WSUS server. The most obvious way to access an MSSQL database, which is using DB Connect, will fail, however, because Microsoft's JDBC driver for MSSQL is a pure-Java implementation and only uses TCP/IP connectivity, while WID accepts only local connections. You could try the jTDS driver, but this is unsupported and generally unexplored territory; in other words, you're on your own here. You could also try using SQL Server Management Studio and the tools contained therein to script some queries against the database and write the results to a file, but again, I don't think that's something people do often, and you're unlikely to find a ready-made solution. There is a third-party (not Splunk-supported) add-on and app for WSUS on Splunkbase, but the add-on assumes connectivity to the WSUS database using DB Connect (which means a WSUS setup with an external MS SQL instance). You can still look into it to find the queries you need if you decide to implement the ingestion process on your own.
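As a rough sketch of the script-it-yourself route (the named pipe below is the documented default for WID on Windows Server 2012 and later, but the table, columns, and output path are illustrative only; check the actual SUSDB schema before relying on them), something like this could be scheduled locally on the WSUS server, with the resulting CSV picked up by a monitor input:

```
:: Hypothetical batch script run locally on the WSUS server.
:: WID only accepts local connections over its named pipe.
sqlcmd -E -S \\.\pipe\MICROSOFT##WID\tsql\query ^
  -d SUSDB ^
  -Q "SET NOCOUNT ON; SELECT TOP 100 EventInstanceID, TimeAtServer FROM dbo.tbEventInstance ORDER BY TimeAtServer DESC" ^
  -s "," -W -o C:\wsus_export\wsus_events.csv
```

The -E flag uses Windows integrated authentication, which is required for WID; the account running the script needs rights on the SUSDB database.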
They seem to correspond to different Carbon Black products:
https://splunkbase.splunk.com/app/5775 - Carbon Black App Control (formerly Bit9)
https://splunkbase.splunk.com/app/5774 - Carbon Black Defense
https://splunkbase.splunk.com/app/5947 - Carbon Black Response
https://splunkbase.splunk.com/app/6732 - VMware Carbon Black Cloud
Which Carbon Black product are you using? If you have a contact associated with your Carbon Black license, perhaps you can ask them which SOAR connector is the most appropriate for your Carbon Black products. Or you could try your API keys on each connector and see which one succeeds in its actions.
Thanks. I will create a support case for this. Do you have the old case ID on hand?
Hello, I have a WSUS server that is using the Windows Internal Database (WID). I would like to ingest WSUS service logs into Splunk, store them, and then parse them for further analysis. Could someone guide me on the best approach to achieve this? Specifically:
- What is the best way to configure Splunk to collect logs from the WSUS service (and database, if necessary)?
- Are there any best practices or recommended add-ons for parsing and indexing WSUS logs in Splunk?
Thanks in advance for your help!
Hi @whipstash,
add to the stats command, using the values option, all the fields you need from both the searches:

index=INDEX sourcetype=sourcetypeA
| rex field=eventID "\w{0,30}+.(?<sessionID>\d+)"
| do some filter on infoIWant fields here
| append
    [ search index=INDEX sourcetype=sourcetypeB
    | stats count AS eventcount earliest(_time) AS earliest latest(_time) AS latest BY sessionID
    | eval duration=latest-earliest
    | where eventcount=2
    | fields sessionID duration eventcount field3 field4 ]
| stats values(eventID) AS eventID values(duration) AS duration values(field1) AS field1 values(field2) AS field2 values(field3) AS field3 values(field4) AS field4 values(eventcount) AS eventcount BY sessionID

Ciao.
Giuseppe
Hi @BB2,
only one question: why? If the issue is the 50,000-character limit, you can simply increase the TRUNCATE limit. There is no facility (and no point, even if it were possible, which it is not) to truncate an event on the forwarders and then reassemble it on the indexers, because events are compressed, stored in packets, and sent from forwarders to indexers with no relation to the length of the event. So I ask you again: why? The only action you need to take is raising the event-length limit via the TRUNCATE parameter.
Ciao.
Giuseppe
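A minimal sketch of that TRUNCATE change, assuming a hypothetical sourcetype `my:sourcetype` (the setting lives in props.conf wherever parsing happens, i.e. on the indexers or on a heavy forwarder):

```ini
# props.conf (sourcetype name is hypothetical; apply where parsing happens)
[my:sourcetype]
# Raise the per-event character limit from the default of 10000.
# A value of 0 disables truncation entirely -- use with care.
TRUNCATE = 100000
```

A restart (or deploy via the cluster manager) is needed for the change to take effect.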
Hi @alex12,
as documented here, https://docs.splunk.com/Documentation/Forwarder/9.3.1/Forwarder/Installleastprivileged, the CAP_DAC_READ_SEARCH capability works only with a UF (not with an HF).
The HF installation method is a regular Splunk Enterprise installation: https://docs.splunk.com/Documentation/Splunk/9.3.1/Installation/InstallonLinux
Hi @DATT,
you can't do arithmetic inside the earliest/latest search arguments, but you can compute a second token from the dropdown selection in the dashboard XML and use that as latest:

<input type="dropdown" token="token_epoch">
  ...
  <change>
    <eval token="token_epoch_end">$value$ + 604800</eval>
  </change>
</input>

and then in the search:

index=someIndex earliest=$token_epoch$ latest=$token_epoch_end$
helper.get_arg("interval") did not work for me. I used helper.get_input_stanza() to retrieve the stanza information as a dict; in that dict you will find the interval value.
Thanks,
Awni
I have a working dashboard that displays a number of metrics and KPIs for the previous week. Today, I was asked to expand that dashboard to include a dropdown of all previous weeks over the last year.

Using this query I was able to fill in my dashboard dropdown pretty easily:

| makeresults
| eval START_EPOCH = relative_time(_time,"-1y@w1")
| eval END_EPOCH = START_EPOCH + (60 * 60 * 24 * 358)
| eval EPOCH_RANGE = mvrange(START_EPOCH, END_EPOCH, 86400 * 7)
| mvexpand EPOCH_RANGE
| eval END_EPOCH = EPOCH_RANGE + (86400 * 7)
| eval START_DATE_FRIENDLY = strftime(EPOCH_RANGE, "%m/%d/%Y")
| eval END_DATE_FRIENDLY = strftime(END_EPOCH, "%m/%d/%Y")
| eval DATE_RANGE_FRIENDLY = START_DATE_FRIENDLY + " - " + END_DATE_FRIENDLY
| table DATE_RANGE_FRIENDLY, EPOCH_RANGE
| reverse

With this I get a dropdown with values such as:

10/07/2024 - 10/14/2024
09/30/2024 - 10/07/2024

and so on, going back a year. Adding it to my search as a token has been more challenging, though. Here's what I'm trying to do:

index=someIndex earliest=$token_epoch$ latest=$token_epoch$+604800

Doing this I get "Invalid latest_time: latest_time must be after earliest_time."

I've seen some answers around here that involve running the search and then using WHERE to apply earliest and latest. I'd like to avoid that because the number of records that would have to be pulled before I could filter on earliest and latest is in the many millions. I've also considered using the timepicker, but my concern there is that the users of this dashboard will pick the wrong dates. I'd like to prevent that by hardcoding the first and last days of the search via the dropdown. Is there a way to accomplish relative earliest and latest dates/times like this?
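The windowing arithmetic behind that dropdown (each option carries a week-start epoch, and the search window is that epoch plus 604800 seconds) can be sketched outside Splunk like this; the function names are made up for illustration:

```python
from datetime import datetime, timezone

WEEK = 7 * 86400  # 604800 seconds, the same constant the dashboard uses


def week_windows(start_epoch, n_weeks):
    """Return (earliest, latest) epoch pairs for n_weeks consecutive weeks."""
    return [(start_epoch + i * WEEK, start_epoch + (i + 1) * WEEK)
            for i in range(n_weeks)]


def label(window):
    """Render a window as 'MM/DD/YYYY - MM/DD/YYYY', like the dropdown values."""
    fmt = lambda t: datetime.fromtimestamp(t, tz=timezone.utc).strftime("%m/%d/%Y")
    return fmt(window[0]) + " - " + fmt(window[1])


windows = week_windows(1727654400, 2)  # 1727654400 = 2024-09-30 00:00:00 UTC
print([label(w) for w in windows])
# ['09/30/2024 - 10/07/2024', '10/07/2024 - 10/14/2024']
```

Note that each latest is exactly the next window's earliest, which is why `latest = earliest + 604800` produces non-overlapping, gap-free weekly searches.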
Hi @darkins,
it is actually simple, as long as you are comfortable with regex syntax. Note that rex has no if clause; instead, rely on the fact that rex only extracts on events where its pattern matches, and tag the events first if you want an explicit condition field:

| eval condition=case(match(_raw, "thisword"), "first_condition",
                      match(_raw, "thisotherword"), "second_condition",
                      1=1, "default_condition")
| rex field=_raw "<rex_pattern_for_first_condition>"
| rex field=_raw "<rex_pattern_for_second_condition>"
| rex field=_raw "<rex_pattern_for_default_condition>"

Each rex quietly skips the events its pattern doesn't match, so chaining them is safe. Give it a try and let me know how it goes.
Like in the subject: I am looking at events with different fields and delimiters. I want to say: if the event contains thisword, then rex blah blah blah; else if the event contains thisotherword, then rex blah blah blah. I suspect this is simple but thought to ask.
I think it's a permission issue; the Google Workspace user should have the "Organization Administrator" role. That's the only requirement for the account. Your account might be read-only?