All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi, is there someone who can help me with this one? I set up TA-Akamai_SIEM on our heavy forwarders. After configuring the APIs I do not see any data being pulled, but rather SSL-related messages in _internal. Has anybody had this kind of issue? We are using this Java version:

java version "1.8.0_291"
Java(TM) SE Runtime Environment (build 1.8.0_291-b10)
Java HotSpot(TM) 64-Bit Server VM (build 25.291-b10, mixed mode)

Appreciate the help. Every line below is logged as message from "/opt/splunk/etc/apps/TA-Akamai_SIEM/linux_x86_64/bin/TA-Akamai_SIEM.sh"; the stack frames are:

... 25 more
at java.net.SocketOutputStream.socketWrite(Unknown Source)
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.write(Unknown Source)
at sun.security.ssl.SSLSocketOutputRecord.encodeAlert(Unknown Source)
... 22 more
Suppressed: java.net.SocketException: Broken pipe (Write failed)
at com.akamai.siem.Main.main(Main.java:116)
at com.akamai.siem.Main.streamEvents(Main.java:474)
at com.splunk.modularinput.Script.run(Script.java:48)
at com.splunk.modularinput.Script.run(Script.java:74)
at java.net.SocketInputStream.read(Unknown Source)
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:384)
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:436)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:376)
at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:393)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at sun.security.ssl.Alert.createSSLException(Unknown Source)
at sun.security.ssl.SSLSocketImpl.decode(Unknown Source)
at sun.security.ssl.SSLSocketImpl.readHandshakeRecord(Unknown Source)
at sun.security.ssl.SSLSocketImpl.startHandshake(Unknown Source)
at sun.security.ssl.SSLSocketInputRecord.decode(Unknown Source)
at sun.security.ssl.SSLSocketInputRecord.read(Unknown Source)
at sun.security.ssl.SSLSocketInputRecord.readHeader(Unknown Source)
at sun.security.ssl.SSLTransport.decode(Unknown Source)
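A hedged first step toward the root cause: the frames above are only fragments, so it can help to pull the TA's full ExecProcessor output from _internal and look for the underlying javax.net.ssl exception (handshake, protocol, or trust-store problems are common with Java 8 runtimes). A sketch; adjust the time range and terms as needed:

index=_internal sourcetype=splunkd component=ExecProcessor "TA-Akamai_SIEM"
| search "SSLHandshakeException" OR "SSLException" OR "Broken pipe" OR "Received fatal alert"
| table _time host log_level _raw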
Hi Team, I am trying to upgrade Splunk from 7.3.1 to Splunk 8.1 using the following steps:

1. Stop Splunk on the server.
2. tar -xzf ${SPLUNK_HOME}/splunk-8.1.5-9c0c082e4596-Linux-x86_64.tgz -C ${SPLUNK_HOME}/
3. ${SPLUNK_HOME}/bin/splunk start --accept-license --answer-yes

When Splunk starts on this server, it still shows version 7.3. The Python version on the Linux machine is 2.7.5; can you please confirm whether that is the root cause of this behavior? Thanks in advance!
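One hedged observation (an assumption based on the steps shown, not something visible in your output): the Splunk tarball's top-level directory is splunk/, so extracting it with -C ${SPLUNK_HOME}/ puts the 8.1.5 files into a nested ${SPLUNK_HOME}/splunk/ directory and leaves the existing 7.3.1 install untouched, which would explain the old version still starting. The OS-level Python 2.7.5 does not determine which Splunk binaries run. A quick check along those lines:

# See the tarball's top-level directory (normally "splunk/")
tar -tzf splunk-8.1.5-9c0c082e4596-Linux-x86_64.tgz | head -5

# Extract into the parent of SPLUNK_HOME so splunk/ overlays the existing install,
# e.g. with SPLUNK_HOME=/opt/splunk this extracts into /opt
tar -xzf splunk-8.1.5-9c0c082e4596-Linux-x86_64.tgz -C "$(dirname "$SPLUNK_HOME")"

# Confirm which version the binary now reports before starting
"$SPLUNK_HOME/bin/splunk" version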
Hello, the Tenable Add-on for Splunk stores data with the following sources and source types.

Tenable.sc:
Source <username>|<address>, sourcetype tenable:sc:vuln: collects all vulnerability data.
Source <username>|<address>, sourcetype tenable:sc:assets: collects all asset data.
Source <username>|<address>, sourcetype tenable:sc:plugin: collects all plugin data.

Tenable.io:
Source tenable_io://<data input name>, sourcetype tenable:io:vuln: collects all vulnerability data.
Source tenable_io://<data input name>, sourcetype tenable:io:assets: collects all asset data.
Source tenable_io://<data input name>, sourcetype tenable:io:plugin: collects all plugin data.

In my production environment I am getting logs from the Tenable.sc sourcetypes (tenable:sc:vuln, tenable:sc:assets, tenable:sc:plugin), and these sourcetypes are visible in my data summary. However, the Tenable.io sourcetypes (tenable:io:vuln, tenable:io:assets, tenable:io:plugin) are not visible in the data summary and no logs are arriving from them.

Questions:
1) How can I confirm whether the Tenable.io input is configured, and if it is configured, why is it not visible in the data summary sourcetype list?
2) How can I identify where my Tenable add-on is installed?
3) The Tenable vulnerability dashboard is not working.

Requesting answers for the questions above. Thanks in advance.
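A hedged sketch for questions 1 and 2, run from a search head (the label filter is an assumption; the add-on's app folder is typically named something like Splunk_TA_tenable, but yours may differ):

| rest /services/apps/local
| search label="*Tenable*"
| table splunk_server title label version disabled

This lists where the add-on is installed across the search head and any connected search peers; a heavy forwarder that is not a search peer will not show up here and has to be checked locally. To see whether the Tenable.io modular input is running at all, its internal logging is usually visible with something like:

index=_internal source=*tenable* | stats count by host source sourcetype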
Hi,

LOOKUP-asset_lookup = server_summary host OUTPUTNEW serveros AS asset_os

I have an automatic lookup where serveros is one of the lookup fields and asset_os is the field enriched from serveros. Now I need one more field called os (for data modelling) that is the same as asset_os. I tried the options below, but none of them worked out (I need both the asset_os and os fields):

1) A field alias asset_os AS os: did not work.
2) A calculated field, case(isnotnull(asset_os), asset_os, 1==1, "unknown"): asset_os is not showing in the fields.
3) Adding the line below to props.conf: asset_os is also not showing in the fields here.

LOOKUP-asset_lookup1 = server_summary host OUTPUTNEW serveros AS os

Is there any other way I can get both the asset_os and os fields? We cannot use a field extraction because the required value is not available in the logs; it comes from the lookup table.
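For context, Splunk applies search-time operations in a fixed order (field extractions, then field aliases, then calculated fields, then automatic lookups), so options 1 and 2 cannot see asset_os because the lookup has not run yet when they are evaluated. A hedged sketch instead (assuming the lookup definition is server_summary; the stanza name below is a placeholder for whatever stanza currently carries the LOOKUP-asset_lookup setting): output serveros twice, under two names, from the single automatic lookup:

[your_sourcetype_or_source]
LOOKUP-asset_lookup = server_summary host OUTPUTNEW serveros AS asset_os, serveros AS os

If your version does not accept the same lookup field listed twice in OUTPUTNEW, keeping two separately named LOOKUP- classes against the same definition (as in your option 3) is the usual fallback, so it may be worth re-checking the permissions and sharing on the second one.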
How can you delete reports that have been created on the /reports page? I have administrator rights but can't see any option to delete, only create. Thanks
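Reports are stored as saved searches, so if the delete option is missing for you (it normally appears under Settings > Searches, reports, and alerts, or in the report's Edit menu, when your role has write access to the object), a hedged fallback is the REST API. The owner, app, and report name below are placeholders; URL-encode the name if it contains spaces:

curl -k -u admin https://localhost:8089/servicesNS/<owner>/<app>/saved/searches/<report_name> -X DELETE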
Hi folks, I need to create an alert action in C#. How can I do that? I have an alert_actions.conf stanza that describes a Python alert action; how can I add a new alert action that uses C#?

[send_mail]
is_custom = 1
python.version = python3
label = Send mail
icon_path = appIcon.png
payload_format = json

Thanks and have a nice day!
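A hedged sketch: a custom alert action does not have to be Python; alert_actions.conf lets you point alert.execute.cmd at any executable placed in the app's bin directory, and with payload_format = json the alert payload is delivered to that process on stdin. The stanza and wrapper names below are assumptions, and on Linux the C# program would typically be launched through the dotnet or mono runtime from a small wrapper script:

[send_mail_csharp]
is_custom = 1
label = Send mail (C#)
icon_path = appIcon.png
payload_format = json
# run <app>/bin/send_mail_csharp.sh instead of the default <stanza>.py
alert.execute.cmd = send_mail_csharp.sh

The wrapper script would then read the JSON payload from stdin and hand it to the compiled C# binary.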
I have JSON in a field which I would like to show on the dashboard pretty-printed rather than as a single-line string. Is there an option in a Splunk dashboard to do this? Currently I have the below:

{"A":"NAME", "B":"AGE"}

In the dashboard, one of the column values will be this JSON, which I would like to render as:

{
  "A": "NAME",
  "B": "AGE"
}
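There is no built-in pretty-print option for table cells in Simple XML that I know of, but one hedged workaround (a sketch, assuming the JSON string sits in a field called json_field) is to break the string into a multivalue field, since a table renders each value of a multivalue field on its own line:

| eval pretty=replace(json_field, ",", ",##")
| makemv delim="##" pretty

This only splits on commas, so each key/value pair lands on its own line; true indentation of nested objects would still need custom JavaScript/CSS in the dashboard.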
Hi everyone! Maybe someone has faced such a problem: I want to build a Layer 2 network topology, and I have enough data for this. I am working with the Network Diagram Viz app, and I have a table of links, something like this:

from, to, local_int, remote_int, linkcolor, type, linktext, value
AIC-switch-2960.aic.kz, SW9300test.aic.kz, Gi0/1, Gi1/0/23, green, deployment-server, Gi0/1 to Gi1/0/23, AIC-switch-2960.aic.kz
SW9300test.aic.kz, AIC-switch-2960.aic.kz, Gi1/0/23, Gi0/1, green, deployment-server, Gi1/0/23 to Gi0/1, SW9300test.aic.kz
SW9300test.aic.kz, SW3850test.aic.kz, Gi1/0/9, Gi1/0/9, green, deployment-server, Gi1/0/9 to Gi1/0/9, SW9300test.aic.kz
SW9300test.aic.kz, SW3850test.aic.kz, Gi1/0/10, Gi1/0/10, green, deployment-server, Gi1/0/10 to Gi1/0/10, SW9300test.aic.kz
SW3850test.aic.kz, SW9300test.aic.kz, Gi1/0/9, Gi1/0/9, green, deployment-server, Gi1/0/9 to Gi1/0/9, SW3850test.aic.kz
SW3850test.aic.kz, SW9300test.aic.kz, Gi1/0/10, Gi1/0/10, green, deployment-server, Gi1/0/10 to Gi1/0/10, SW3850test.aic.kz
AIC-switch-2960.aic.kz, SIP-W60B, Gi0/12, WAN PORT, green, phone-square, Gi0/12 to WAN PORT, AIC-switch-2960.aic.kz

And, accordingly, this is what the topology looks like (screenshot omitted here). I took the information about connected devices from AIC-switch-2960.aic.kz, SW9300test.aic.kz and SW3850test.aic.kz, so each link is reported from both ends. I just need to remove the redundant (reverse-direction duplicate) links from the table. What solution can you advise to delete such entries automatically, or is there some other way? Thanks!
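A hedged sketch for collapsing the two directions of each link into a single row (assuming the field names shown above): build a direction-independent key from the sorted device pair and the sorted interface pair, then keep one row per key:

| eval link_key=mvjoin(mvsort(mvappend(from, to)), "<->")
| eval int_key=mvjoin(mvsort(mvappend(local_int, remote_int)), "<->")
| dedup link_key int_key
| fields - link_key int_key

Because the key sorts the endpoints, "A to B on Gi0/1 and Gi1/0/23" and "B to A on Gi1/0/23 and Gi0/1" produce the same key, so only the first of the pair survives the dedup.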
Hi, we are exploring SplunkJS for displaying charts and data from Splunk. We use Splunk Enterprise, and it was recently changed to SSO login. Can I authenticate SplunkJS with SSO or with an authentication bearer token? What other possibilities do we have? Thanks in advance.
Hi all, I need some help with my Splunk query: basically, I need to exclude from the output all jobs with a job name ending in _fw, as shown below.

jobname
Abc_token_fw
def_file_fw
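A hedged sketch, assuming jobname is already an extracted field: exclude the events in the base search with a wildcard, or filter after a transforming command with a regex match:

index=your_index sourcetype=your_sourcetype NOT jobname="*_fw"

or, later in the pipeline:

| where NOT match(jobname, "_fw$")

The index and sourcetype above are placeholders for your actual base search.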
Hello, I am having an issue with piping the output of a custom reporting command, as documented here, into another SPL command. I can get the basic reporting command example to work. It's called "sum" and is provided within the "searchcommands_app" directory of the Splunk Python SDK hosted on GitHub. However, once I get statistics output from the "sum" command, I cannot pipe those results into another command.

This first query works fine:

index = _internal | head 200 | sum total=lines linecount

However, this query does not work:

index = _internal | head 400 | sum total=lines linecount | stats count

When I try to pipe the output of the "sum" command into the "stats" command, I get the following error:

KeyError at "/opt/splunk/etc/apps/t-digest-custom-command/bin/sum.py", line 63 : 'linecount'

Am I getting this error due to a bug in the custom search command API, or am I missing something? Thanks.

Follow-up question: why don't reporting commands reduce to a single value for sufficiently large numbers of input events? For example, this query yields a single statistic value as I expect:

index = _internal | head 50 | sum total=lines linecount

However, this query yields multiple statistic values, even when I only want one value:

index = _internal | head 400 | sum total=lines linecount
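Not a definitive diagnosis, but a hedged sketch of how the SDK expects a reporting command to be laid out may help: a ReportingCommand has a map phase (a streaming pre-operation that can run per chunk or on the indexers) and a reduce phase that only receives the records the map phase emitted. If line 63 of sum.py reads the original field name (linecount) inside reduce(), it will raise exactly this KeyError once Splunk actually splits the work, and partial results from separate map invocations might also explain seeing several values instead of one. A minimal sketch under those assumptions, loosely modeled on the SDK's sum example (the field and option names are illustrative):

#!/usr/bin/env python
# Hedged sketch of a reporting command using the Splunk Python SDK.
import sys
from splunklib.searchcommands import dispatch, ReportingCommand, Configuration, Option

@Configuration(requires_preop=True)
class SumCommand(ReportingCommand):
    total = Option(require=True)  # name of the output field, e.g. total=lines

    @Configuration()
    def map(self, records):
        # Streaming pre-operation: emit one partial sum per batch of input records.
        fieldname = self.fieldnames[0]          # e.g. "linecount"
        partial = 0.0
        for record in records:
            partial += float(record.get(fieldname, 0))
        yield {self.total: partial}

    def reduce(self, records):
        # Only sees the map output, so read self.total here, not self.fieldnames[0].
        grand_total = 0.0
        for record in records:
            grand_total += float(record.get(self.total, 0))
        yield {self.total: grand_total}

dispatch(SumCommand, sys.argv, sys.stdin, sys.stdout, __name__)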
Hi everyone! Could you please help with how to calculate the UP percentage by app_service? I have the query:

| eval status=if(success="successful .Statuscode:200", "UP", "DOWN")
| table app_service status

I get the table I need, but I have difficulty calculating the "UP" percentage and the total percentage per app_service from the above query. Thank you.
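A hedged sketch, assuming one event per check and the same status eval as above: count the UP events and the total events per app_service with stats, then derive the percentage:

| eval status=if(success="successful .Statuscode:200", "UP", "DOWN")
| stats count(eval(status="UP")) AS up_count count AS total BY app_service
| eval up_pct=round(100 * up_count / total, 2)

If you also want each service's share of all events, an eventstats sum over the whole result set can supply the grand total for a second percentage.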
Hi All, I think the subject of my question says it all: I want to add the numerical data from two multivalue fields and save the result to a new field.

Field1  Field2  Field3
4       8       12
8       9       17
3       2       5

I know mvappend is not the one to be used here, but I already tried:

| eval field3=mvappend(field1,field2)

Any ideas are greatly appreciated.
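A hedged sketch for pairwise addition of two multivalue fields (it assumes both fields always carry the same number of values and that mvmap is available, i.e. Splunk 8.0 or later): zip the values together, then split each pair and add the two numbers:

| eval pairs=mvzip(Field1, Field2)
| eval Field3=mvmap(pairs, tonumber(mvindex(split(pairs, ","), 0)) + tonumber(mvindex(split(pairs, ","), 1)))
| fields - pairs

If the two fields are actually single-valued on each event (one row per event, as the table could also be read), a plain | eval Field3=Field1+Field2 is all that is needed.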
Hi All, I'm not that familiar with DMA, as I have not really had any exposure to setting up data models so far, but I am currently having an issue with data model acceleration not staying active. We had to disable DMA on all ES data models where it was enabled, due to a recent incident. Now that the issues have been resolved, we need to re-enable DMA. I have attempted to do this by following the steps below:

1. Go to the ES app.
2. Click "Configure" -> "CIM Setup".
3. Check the checkbox next to "Accelerate", then change the Summary Range to 7 days (-7 days), then click Save.
4. To verify, click "Configure" -> "Content" -> "Content Management".
5. Filter the type to "Data Model".
6. Check that the lightning icon in the row of the data model is coloured yellow.

This looked like it was working for a while, but after checking on it a few hours later, all DMA had been disabled again. I'm not sure why DMA will not stay enabled; I have checked the settings and found nothing obvious as to why this would be happening. Has anyone else out there had this issue, or got some idea of something I can check as to why this would be happening?
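A hedged way to watch what is actually happening (a sketch; the exact content fields returned can vary by version): snapshot the acceleration state over REST right after you save, and again once it appears to revert, to see which app context the change lands in and when it flips back:

| rest /servicesNS/-/-/datamodel/model
| table title eai:acl.app eai:acl.sharing acceleration updated

Comparing the two snapshots (and the updated timestamps) at least narrows down whether the setting is being overwritten in a different app or user context than the one CIM Setup writes to.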
I need to access these saved searches and change their timing, because they conflict by running at the same time and many are being skipped. Any help would be appreciated.
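A hedged sketch for finding which scheduled searches are being skipped and what their current schedules are, before deciding what to re-time:

index=_internal sourcetype=scheduler status=skipped
| stats count BY app savedsearch_name reason

| rest /servicesNS/-/-/saved/searches
| search is_scheduled=1
| table title eai:acl.app cron_schedule next_scheduled_time

The schedules themselves can then be edited under Settings > Searches, reports, and alerts (or in savedsearches.conf) by adjusting cron_schedule, with schedule_window as another lever for reducing skips.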
I have the below query:

| inputlookup test.csv
| eval epochtime=strptime(_time, "%a %b %d %H:%M:%S %Y")
| eval desired_time=strftime(epochtime, "%d/%m/%Y")
| rename desired_time as Date
| eval desired_time=strftime(epochtime, "%b%y'")
| rename desired_time as Month

I am getting this output in the user field:

user
Abcd101
sv23010
ns03621

Here I want to remove the rows for users sv48840 and ns19075 from this table.
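A hedged sketch, using the two user names from the question as placeholders: filter those rows out right after the lookup with NOT ... IN, then continue with the existing eval and rename steps:

| inputlookup test.csv
| search NOT user IN ("sv48840", "ns19075")

The same filter also works at the end of the pipeline if you prefer to drop the rows after the time formatting.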
(index=* OR index=_*) (((index=azuread)) NOT (action=success user=*$))
| eval action=if(isnull(action) OR action="","unknown",action), app=if(isnull(app) OR app="",sourcetype,app), src=if(isnull(src) OR src="","unknown",src), src_user=if(isnull(src_user) OR src_user="","unknown",src_user), dest=if(isnull(dest) OR dest="","unknown",dest), user=if(isnull(user) OR user="","unknown",user)
| rename signature AS Authentication.signature signature_id AS Authentication.signature_id action AS Authentication.action app AS Authentication.app src AS Authentication.src src_user AS Authentication.src_user dest AS Authentication.dest user AS Authentication.user
| fields "_time" "host" "source" "sourcetype" "Authentication.signature" "Authentication.signature_id" "Authentication.action" "Authentication.app" "Authentication.src" "Authentication.src_user" "Authentication.dest" "Authentication.user"
| search Authentication.signature=UserLoginFailed
I need to change the timing for a few accelerated data model searches (saved searches) for a few apps in Enterprise Security. Thank you in advance.
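The acceleration jobs (the _ACCELERATE_* searches) follow the data model's acceleration settings rather than an ordinary saved-search schedule, so a hedged sketch is to stagger them through acceleration.cron_schedule in a local datamodels.conf; the stanza name below is a placeholder for the data model's name:

[Authentication]
acceleration = true
# run this model's acceleration at these minutes instead of the default every-five-minutes cadence
acceleration.cron_schedule = 7,22,37,52 * * * *

Each model can be given a different offset so the acceleration searches stop landing on the same minute.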
Hi, in a Splunk Cloud instance I installed Alert Manager with all default settings. The incident settings are all enabled, and the user setting is configured to "Both". I am trying to assign an incident to another member, but I cannot make any changes to the incident workflow in the Edit Incident box. What files need to be modified on the Splunk Cloud instance to allow any member to assign incidents to another member, as well as to leave comments? I am not sure if this is related to notifications, because those are not functional either: I created the member notification schemes but cannot add the email address for each member. There seem to be some files that need to be modified on the Splunk Cloud instance for this app; any help will be much appreciated.
Hi, I have the splunk_TA_NIX app installed on the indexers, a heavy forwarder, and the search heads. When I search index=os sourcetype=cpu on the indexers, I can see the fields shown below (screenshot). But when I run the same query on the search heads, I do not see any of those fields, only the default fields shown below (screenshot). Any solution for how to get all the fields on the search heads?
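A hedged first check, since the cpu sourcetype's fields come from search-time extractions that must run on the search head: confirm the add-on is present, enabled, and globally shared there (the app folder is usually named Splunk_TA_nix; adjust the filter if yours differs), for example by running this on a search head:

| rest /services/apps/local splunk_server=local
| search title="Splunk_TA_nix" OR label="*nix*"
| table splunk_server title label version disabled

If the app is there but the fields still do not appear, checking that its knowledge objects are exported globally (export = system in the app's metadata) and that the app is enabled on every search head cluster member would be the next hedged steps.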