All Posts


@doli

1. Go to the add-on and configure an account:
* Account Name: Enter a unique name for this account.
* IP Address/Domain: Enter the IP address of the Cisco Cyber Vision appliance in the format https://<ip address> or https://<domain-name>.
* API Token: Enter the API token generated in Cyber Vision for the above account.
* If you have a proxy, configure the proxy details.

2. Create an input:
Navigate to the inputs section and create a new input based on your requirements.

Note:
* Create an index for this data source to store incoming events (a sample stanza follows below).
* Check and open the necessary firewall ports/rules for data ingestion.
* Ensure communication between the data source and the Splunk components.
* If events are not visible after configuration, check the internal index (_internal).

For Splunk clusters: create the index on the Cluster Master (CM) and push it to the indexers. Also create the same index on the Heavy Forwarder (HF).
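A minimal sketch of such an index definition, assuming an index name of cisco_cybervision (the name and the retention/size values are placeholders to adapt to your environment). On a cluster this stanza goes into the configuration bundle on the Cluster Master and is pushed to the indexers:

# indexes.conf - example only; "cisco_cybervision" is a placeholder name
[cisco_cybervision]
homePath   = $SPLUNK_DB/cisco_cybervision/db
coldPath   = $SPLUNK_DB/cisco_cybervision/colddb
thawedPath = $SPLUNK_DB/cisco_cybervision/thaweddb
# Adjust retention and size limits to your environment
frozenTimePeriodInSecs = 7776000
maxTotalDataSizeMB = 102400

The same index name then also needs to exist on the Heavy Forwarder so the input can reference it.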
@doli

There is no direct access to a specific document titled "Cisco Cyber Vision Integration with Splunk", so please follow this. Based on standard practice from Cisco and Splunk:

1. Install the Add-On: In Splunk, go to Apps > Manage Apps > Install App from File and upload the add-on package (.spl file) from Splunkbase.
2. Configure the Add-On: Navigate to the add-on’s configuration page in Splunk, where you’ll input your Cisco Cyber Vision API details (e.g., the IP address of the Cyber Vision portal and the API token generated from Cyber Vision). You may also specify proxy settings or custom CA certificates if needed.
3. Set Data Inputs: Define the time interval for data polling and the Splunk index to store the data.
4. Install the App: Install the Splunk App similarly and use its dashboards to visualize the data.
5. Syslog Option: Alternatively, configure Cyber Vision to send CEF syslog data to Splunk via TCP/UDP inputs (see the Cisco Catalyst Add-on for Splunk for syslog setup details); a minimal input sketch follows below.
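For the syslog option, a minimal inputs.conf sketch on the receiving Splunk instance could look like the following; the ports, sourcetype, and index names here are assumptions to adapt to your environment (check the Cisco Catalyst Add-on documentation for the sourcetype it expects):

# inputs.conf - example only; ports, sourcetype and index are placeholders
[udp://514]
sourcetype = cisco:cybervision:syslog
index = cisco_cybervision
connection_host = ip

[tcp://6514]
sourcetype = cisco:cybervision:syslog
index = cisco_cybervision
connection_host = ip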
@doli

You can find the necessary documentation for integrating Cisco Cyber Vision with Splunk on Splunkbase. The Cisco Cyber Vision Splunk Add-On allows organizations to pull information from Cisco Cyber Vision using its RESTful API interface. This add-on helps configure and pull component information, vulnerabilities, activities, and events from Cyber Vision to be used with the Cyber Vision Splunk App.

For detailed instructions and to download the add-on, you can visit:
Cisco Cyber Vision Splunk Add On | Splunkbase
Cisco Security and Splunk SIEM - Cisco
Hi @livehybrid

First of all, thank you for your quick response. It is greatly appreciated. In the end, it was a much simpler mistake: I forgot to include the port in the SMTP FQDN, since it is under SSL.

Regards
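For anyone hitting the same thing, a minimal sketch of that setting, assuming the email alert configuration lives in alert_actions.conf (the host name and port here are placeholders):

# alert_actions.conf - example only; host and port are placeholders
[email]
# Append the port to the mail host when SSL is in use (465 is the common SMTPS port)
mailserver = smtp.example.com:465
use_ssl = 1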
Hi @lar06

No, there isn't anything that I have found in relation to this error, as this is a relatively low-level error which I believe you wouldn't encounter unless there were issues with the Splunk2Splunk (S2S) communication. I think you will need to consult Splunk and/or Cribl Support in order to get to the bottom of this.

Ironically, the only times I have heard of this issue referenced before is when using Cribl (hence my question about sending via non-supported versions or external systems), so it might be worth speaking to Cribl support first, as they may have resolved this for other people.

Please let me know how you get on and consider adding karma to this or any other answer if it has helped.

Regards
Will
Hi @imam29

In this case, I would expect the timeout to be after 30 minutes of no activity in the UI. The other timeout option is sessionTimeout under the [general] stanza of server.conf - see https://docs.splunk.com/Documentation/Splunk/latest/Admin/Serverconf

The ui_inactivity_timeout is the number of minutes with no user-interface activity (clicking, mouseover, scrolling, or resizing). It notifies the client-side pollers to stop, resulting in the session expiring at the 'tools.sessions.timeout' value, so any mouseover, scrolling, etc. in the browser resets this timer.

Just to check, have you restarted since making these changes?

Please let me know how you get on and consider adding karma to this or any other answer if it has helped.

Regards
Will
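For reference, a minimal sketch of that server.conf setting (the value shown is only an example):

# server.conf - example value only
[general]
sessionTimeout = 30m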
Hi Will

Yes, both are 9.3.2, but we have Cribl in between, so I assume the message is due to Cribl. Is there any documentation about those terms?

Thanks
Lionel
I tried this and the result is the same...
@Rhidian

In DBConnect, you can modify the SQL query to cast the SQL_TEXT field from a CLOB to a different data type, such as VARCHAR2 (Oracle’s variable-length string type).

Please have a look:
https://community.splunk.com/t5/All-Apps-and-Add-ons/Is-it-possible-to-get-Character-Large-Object-CLOB-values-from/m-p/183455
https://community.splunk.com/t5/All-Apps-and-Add-ons/Is-it-possible-to-get-Character-Large-Object-CLOB-values-from/m-p/183453
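A minimal sketch of what that cast could look like in the DB Connect input query; the view name, column list, and the 4000-character limit are assumptions to adapt to your setup:

-- Example only: return the CLOB as VARCHAR2 so DB Connect receives a plain string.
-- DBMS_LOB.SUBSTR(lob, amount, offset) returns up to <amount> characters of the CLOB.
SELECT EVENT_TIMESTAMP,
       DBUSERNAME,
       ACTION_NAME,
       DBMS_LOB.SUBSTR(SQL_TEXT, 4000, 1) AS SQL_TEXT
FROM   UNIFIED_AUDIT_TRAIL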
Hey @livehybrid, Thanks! I appreciate it. Wishing you the best too!
Wouldn't casting this to a new data type in DBConnect fix this, or would that merely convert the additional text along with the rest of the data?
Screenshot 2
@Rhidian

Since you mentioned that SQL_TEXT is a CLOB (Character Large Object) in Oracle, the issue likely stems from how this data type is processed and ingested by Splunk versus how DBConnect handles it. This happens because DB Connect is not handling the CLOB properly: it may append metadata or placeholders (such as null indicators, buffer sizes, or offsets).
Thanks. Making this a bit odder: if I look at the event, the text in the fields seems OK, but if I click on it in the Interesting fields, that is when I see the spurious data - see the screenshots.
@Rhidian

Used makeresults to create sample events.

To apply this logic permanently to your oracle:audit:unified sourcetype in Splunk, you’ll need to configure props.conf and transforms.conf to clean the raw event during ingestion. This ensures the spurious text (e.g., 4,,1,,,,,,) is stripped out before the data is indexed, so all your searches will see the cleaned version.

In props.conf, you associate the oracle:audit:unified sourcetype with a transform that cleans the SQL_TEXT value. Note that index-time transforms operate on _raw (the extracted SQL_TEXT field does not exist yet at that point), so the transform rewrites the raw event rather than the field itself.

Location: typically $SPLUNK_HOME/etc/system/local/props.conf, or an app-specific directory like $SPLUNK_HOME/etc/apps/<your_app>/local/props.conf, deployed to the parsing tier (indexers or Heavy Forwarder).

props.conf

[oracle:audit:unified]
SHOULD_LINEMERGE = false
TRUNCATE = 10000
TRANSFORMS-clean_sql = clean_sql_text

transforms.conf

[clean_sql_text]
# Keep everything in the quoted SQL_TEXT value up to the first comma,
# drop the rest, and retain the closing quote
SOURCE_KEY = _raw
REGEX = (?s)^(.*SQL_TEXT="[^,"]*)[^"]*(".*)$
FORMAT = $1$2
DEST_KEY = _raw
@Rhidian

Start with the rex solution in your search to quickly verify that you can isolate the valid SQL:

index=<your_index> sourcetype=oracle:audit:unified
| rex field=SQL_TEXT "^(?<cleaned_sql>[^,]+)"
| table cleaned_sql

If that works, apply it in props.conf and transforms.conf for a permanent fix at ingestion time. If the pattern of spurious text varies (e.g., it is not always commas), adjust the regex accordingly.
Hello Team,

I’ve been trying to ingest Splunk notable events into Splunk SOAR (Phantom), but I’m facing an issue where not all details are being transferred automatically. I’ve experimented with multiple methods, including:

* Using the "Send to Adaptive Response" option with "Send to SOAR".
* Utilizing the Splunk App for SOAR Export (forwarding).
* Installing the Splunk App directly on my Splunk SOAR instance (on poll).

During the application configuration, I enabled the polling option, and data is being ingested through both methods. However, critical fields like event_id and others are missing in the ingested data. Interestingly, when I manually select the "Send to SOAR" option under a notable event, all fields are successfully transferred to SOAR without any issues.

Is there a way to automate the process so that all details, including event_id and other fields, are sent to SOAR consistently? I’ve also attached screenshots for reference to help clarify the issue.

Manually sending from SIEM to SOAR:

On poll:
Where do we set the splunkweb timeout and the splunkd timeout? We have only set it in web.conf.
Hi @imam29

Just to check, have you restarted Splunk since making this change? It’s also worth considering the following from the docs, as well as: how long did you wait before checking for a timeout, and what is your tools.sessions.timeout set to?

The countdown for the splunkweb/splunkd session timeout does not begin until the browser session reaches its timeout value. So, to determine how long the user has before timeout, add the value of ui_inactivity_timeout to the smaller of the timeout values for splunkweb and splunkd. For example, assume the following:

splunkweb timeout: 15m
splunkd timeout: 20m
browser (ui_inactivity_timeout) timeout: 10m

The user session stays active for 25 minutes (15m + 10m). After 25 minutes of no activity, the session ends, and the instance prompts the user to log in again the next time they send a network request to the instance.

Please let me know how you get on and consider adding karma to this or any other answer if it has helped.

Regards
Will
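For reference, a minimal sketch of where the splunkweb and browser values above live in web.conf (the numbers are just the example values from the illustration; the splunkd timeout is sessionTimeout in server.conf):

# web.conf - example values only
[settings]
# browser inactivity timeout, in minutes
ui_inactivity_timeout = 10
# splunkweb session timeout, in minutes
tools.sessions.timeout = 15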
Hi,

I'm having an issue parsing the SQL_TEXT field from oracle:audit:unified. When the field comes through, it contains spurious text that isn't returned by the query when run through DBConnect with the oracle:audit:unified template. For example:

DBConnect: grant create tablespace to test_splunk
Splunk: grant create tablespace to test_splunk,4,,1,,,,,,

The raw event seems to come through as a CSV by virtue of the Oracle TA, but we have a regex for the event extraction that looks like the one below, which seems to work in regex101:

SQL_TEXT="(?<SQL_TEXT>(?:.|\n)*?)(?=(?:",\s\S+=|"$))

I know the data type is CLOB, so I have tried converting it using the substring command, but I get the same result. Any idea what is going on here?