All Topics



Hi, I have a dataset like the one below:

[ {classificationA: null, classificationB: null}, {classificationA: {name: 'Education'}, classificationB: {name: 'Education'}}, {classificationA: {name: 'IT'}, classificationB: {name: 'IT'}} ]

My aim is to find all the rows whose classificationA is not equal to classificationB. So, given the above dataset, it should return zero rows. I thought it should be:

| where classificationA != classificationB

But it is not working. Can anyone help? Thank you!
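If the events are the raw JSON objects, a minimal sketch (assuming the comparison should be made on the nested name values, and that a null-vs-null row should not count as a mismatch) would be:

| spath path=classificationA.name output=nameA
| spath path=classificationB.name output=nameB
| fillnull value="(null)" nameA nameB
| where nameA != nameB

Comparing the top-level classificationA/classificationB fields directly tends not to work because they are JSON objects (or nulls) rather than simple values, and where silently drops rows where either side is null.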
Hey guys, I am working on a requirement where I have to extract the values of some nodes in XML which are in a name/value pair. Those values are the products purchased from our website, but the XML also contains other elements like discount codes, additional products, etc., and I don't have a way to differentiate between them. I have attached the block of XML I am really interested in. Is there a way to extract those values and then display the top products purchased/used across different events? I am interested in the two fields productIdentifier and name.

<ns3:orderItem>
  <ns3:product>
    <ns3:productIdentifier>XXXXXXXXXXXXXXX</ns3:productIdentifier>
    <ns3:name>XXXXX</ns3:name>
    <ns3:action>New</ns3:action>
    <ns3:quantity>1</ns3:quantity>
    <ns3:product>
      <ns3:productIdentifier>P11845546565263</ns3:productIdentifier>
      <ns3:name>Mixit TV (M 2)</ns3:name>
      <ns3:instanceIdentifier>A</ns3:instanceIdentifier>
      <ns3:action>New</ns3:action>
      <ns3:quantity>1</ns3:quantity>
      <ns3:product>
        <ns3:productIdentifier>P1187877564259</ns3:productIdentifier>
        <ns3:name>360 Box</ns3:name>
        <ns3:instanceIdentifier>A</ns3:instanceIdentifier>
        <ns3:action>New</ns3:action>
        <ns3:quantity>1</ns3:quantity>
      </ns3:product>
      <ns3:product>
        <ns3:productIdentifier>P118565565656</ns3:productIdentifier>
        <ns3:name>360 Activation omph</ns3:name>
        <ns3:instanceIdentifier>A</ns3:instanceIdentifier>
        <ns3:action>New</ns3:action>
        <ns3:quantity>1</ns3:quantity>
      </ns3:product>
    </ns3:product>
    <ns3:product>
      <ns3:productIdentifier>P1068434343545681</ns3:productIdentifier>
      <ns3:name>Fibre Broadband</ns3:name>
      <ns3:instanceIdentifier>H</ns3:instanceIdentifier>
      <ns3:action>New</ns3:action>
      <ns3:quantity>1</ns3:quantity>
      <ns3:product>
        <ns3:productIdentifier>P1046134534341</ns3:productIdentifier>
        <ns3:name>Manned Install Only code</ns3:name>
        <ns3:instanceIdentifier>H</ns3:instanceIdentifier>
        <ns3:action>New</ns3:action>
        <ns3:quantity>1</ns3:quantity>
      </ns3:product>
      <ns3:product>
        <ns3:productIdentifier>P1015455566454</ns3:productIdentifier>
        <ns3:name>Manned Install Charge</ns3:name>
        <ns3:instanceIdentifier>H</ns3:instanceIdentifier>
        <ns3:action>New</ns3:action>
        <ns3:quantity>1</ns3:quantity>
      </ns3:product>
    </ns3:product>
    <ns3:product>
      <ns3:productIdentifier>P1243436565434</ns3:productIdentifier>
      <ns3:name>Weekend chatter</ns3:name>
      <ns3:instanceIdentifier>I</ns3:instanceIdentifier>
      <ns3:action>New</ns3:action>
      <ns3:quantity>1</ns3:quantity>
      <ns3:directoryServicesRequest>
        <ns3:includePhoneNumber>false</ns3:includePhoneNumber>
      </ns3:directoryServicesRequest>
      <ns3:product>
        <ns3:productIdentifier>A1000546567565</ns3:productIdentifier>
        <ns3:name>Voicemail Free</ns3:name>
        <ns3:instanceIdentifier>I</ns3:instanceIdentifier>
        <ns3:action>New</ns3:action>
        <ns3:quantity>1</ns3:quantity>
      </ns3:product>
      <ns3:product>
        <ns3:productIdentifier>P10454565656545</ns3:productIdentifier>
        <ns3:name>VOC Line Rental</ns3:name>
        <ns3:instanceIdentifier>I</ns3:instanceIdentifier>
        <ns3:action>New</ns3:action>
        <ns3:quantity>1</ns3:quantity>
      </ns3:product>
    </ns3:product>
    <ns3:product>
      <ns3:productIdentifier>D1057845454545</ns3:productIdentifier>
      <ns3:name>Free Install - Non QS address</ns3:name>
      <ns3:action>New</ns3:action>
      <ns3:quantity>1</ns3:quantity>
    </ns3:product>
    <ns3:product>
      <ns3:productIdentifier>P105704545458</ns3:productIdentifier>
      <ns3:name>Install Activation Fee</ns3:name>
      <ns3:action>New</ns3:action>
      <ns3:quantity>1</ns3:quantity>
    </ns3:product>
  </ns3:product>
</ns3:orderItem>

Let me know if anyone has worked on this type of requirement before and if they can be of any help. Best Regards, SA
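A rough, hedged sketch of one way to pull the two fields out at search time with rex (assuming each <ns3:name> sits directly after its <ns3:productIdentifier>, as in the sample, so the two multivalue lists line up positionally):

| rex field=_raw max_match=0 "<ns3:productIdentifier>(?<productIdentifier>[^<]+)</ns3:productIdentifier>"
| rex field=_raw max_match=0 "<ns3:name>(?<productName>[^<]+)</ns3:name>"
| eval pair=mvzip(productIdentifier, productName, "|")
| mvexpand pair
| eval productIdentifier=mvindex(split(pair, "|"), 0), productName=mvindex(split(pair, "|"), 1)
| top limit=20 productName productIdentifier

spath could also work if the whole order XML is one event, but the repeated, nested ns3:product elements make the rex/mvzip route easier to reason about here.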
Hello, can anyone please help me identify the Cisco add-on for Splunk that collects latency info from Cisco devices? I am seeing a bunch of add-ons on Splunkbase but can't exactly figure out which one to use. Please help with your thoughts. Thanks
Hi, we have recently switched from Phantom to SOAR and I'm trying to send our triggered alerts to SOAR. I have tested the connection from Splunk Enterprise to SOAR and it works, but I keep getting the following error for one alert:

11-04-2022 05:31:21.724 +1100 WARN sendmodalert [17285 AlertNotifierWorker-0] - action=sendtophantom - Alert action script returned error code=1
11-04-2022 05:31:21.724 +1100 INFO sendmodalert [17285 AlertNotifierWorker-0] - action=sendtophantom - Alert action script completed in duration=1394 ms with exit code=1
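Exit code 1 only says the sendtophantom script failed; the underlying reason is usually logged alongside it. A small sketch for digging further (index=_internal is standard, though the exact component and field names can vary by version):

index=_internal sourcetype=splunkd sendmodalert action="sendtophantom"

and, for the alert action's own script output:

index=_internal source=*python.log* sendtophantom

Comparing the failing alert's payload, search name, and destination/server configuration against one that works often narrows the cause down.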
Is it possible to give a user two options when drilling down on a panel? For example, the dashboard has a table with one column, A. If a user clicks a value, it updates a token in the dashboard. But now, when a user clicks the value in column A, I want a pop-up to appear that gives the user the option of either updating the current dashboard or going to another dashboard.
Can anyone help me resolve my issue? Here is the query I am using:

index="dynatrace" "userActions{}.name" = "clickonnotes"
| table "userActions{}.name","userActions{}.visuallyCompleteTime"

Output (both columns come back as multivalue fields):

userActions{}.name:
loadingofpage/cc/claimcenter.do
clickonsearch
keypressonc1
clickony3wc25120
clickonnotes
clickonlossdetails
clickonindemnity

userActions{}.visuallyCompleteTime:
9356
516
609
1276
981
1371
392
640
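A sketch of one way to pair each action name with its own visuallyCompleteTime and keep only the clickonnotes rows (mvzip pairs the two multivalue fields positionally):

index="dynatrace" "userActions{}.name"="clickonnotes"
| eval pair=mvzip('userActions{}.name', 'userActions{}.visuallyCompleteTime', "|")
| mvexpand pair
| eval action=mvindex(split(pair, "|"), 0), visuallyCompleteTime=mvindex(split(pair, "|"), 1)
| where action="clickonnotes"
| table action visuallyCompleteTime

Note that the pasted output shows 7 names but 8 times; if the two arrays don't always line up one-to-one, extracting the userActions array from _raw with spath and expanding it is more reliable than positional pairing.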
I want to display the output in a table format. Basically, I have a list of response value fields that I want to print out, but only if they have something in them. I don't want to routinely display 10 extra fields that are usually empty.
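One sketch of a way to drop columns that are entirely empty, assuming the final result is a single row (the response_* prefix is just a placeholder for your field names):

<your search>
| table response_*
| transpose 0 column_name=field
| where isnotnull('row 1') AND 'row 1' != ""
| transpose 0 header_field=field
| fields - column

The first transpose turns columns into rows so the empty ones can be filtered with where; the second transpose flips the survivors back into columns.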
I have a dashboard with tokens on it: Token1, Token2, and Token3. I have a table that contains multiple columns: x, y, z. When I click the value of column x, I want to update Token1 on the same page. When I click the value of column y, I want to update Token2 and so on. When I click a column I only want to update the corresponding token and not the rest of the tokens.
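A sketch of how this is usually done in Simple XML, with a drilldown block inside the table element: each condition matches the clicked column and $click.value2$ is the value of the clicked cell (token names here follow the description above):

<drilldown>
  <condition field="x">
    <set token="Token1">$click.value2$</set>
  </condition>
  <condition field="y">
    <set token="Token2">$click.value2$</set>
  </condition>
  <condition field="z">
    <set token="Token3">$click.value2$</set>
  </condition>
</drilldown>

Only the condition whose field matches the clicked column runs, so the other tokens are left untouched.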
Hello! I have a lookup in CSV format of about 1900 users. I have a panel in which we show application usage for every user, but it's built with a pivot. I'd like to make a new panel that only shows application usage for the users in this specific lookup table. If a pivot weren't being used, I'd use an inputlookup followed by an index subsearch, but I'm having a hard time figuring out how to do this with the pivot. This is the code for the current panel:

| pivot Process_Detail dc(AppVersion) as "#Versions" dc(ProcUser) as "#Users" dc(host) as "#Hosts" splitrow AppName as Name filter SessionID > 0 filter AppName is "*"
| eval sortfield = lower('Name')
| sort limit=0 sortfield
| table Name "#Versions" "#Users" "#Hosts"
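Since the pivot command's filter clauses don't take a subsearch, one possible route (a sketch only) is to query the same data model with tstats, where an inputlookup subsearch can act as a filter. This assumes the data model and its root dataset are both called Process_Detail and that the lookup is something like allowed_users.csv with a ProcUser field; adjust all names to your environment:

| tstats dc(Process_Detail.AppVersion) as "#Versions" dc(Process_Detail.ProcUser) as "#Users" dc(host) as "#Hosts"
    from datamodel=Process_Detail
    where Process_Detail.SessionID > 0
        [| inputlookup allowed_users.csv
         | fields ProcUser
         | rename ProcUser as "Process_Detail.ProcUser" ]
    by Process_Detail.AppName
| rename Process_Detail.AppName as Name
| eval sortfield = lower(Name)
| sort limit=0 sortfield
| table Name "#Versions" "#Users" "#Hosts"

The subsearch expands into an OR list of Process_Detail.ProcUser values, which is what restricts the results to the users in the lookup.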
Hi, I have a requirement to open an additional non-SSL HTTP REST port in Splunk and bind it to localhost, for a security requirement. The Splunk documentation for server.conf mentions the following. How do I open the non-SSL port bound to localhost?

############################################################################
# Open an additional non-SSL HTTP REST port, bound to the localhost
# interface (and therefore not accessible from outside the machine). Local
# REST clients like the CLI can use this to avoid SSL overhead when not
# sending data across the network.
############################################################################
[httpServerListener:127.0.0.1:8090]
ssl = false
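A minimal sketch of doing exactly that: add the documented stanza to a local server.conf (for example $SPLUNK_HOME/etc/system/local/server.conf) and restart splunkd. Port 8090 is just the example from the docs; any free port works, and binding to 127.0.0.1 keeps it unreachable from other machines.

# $SPLUNK_HOME/etc/system/local/server.conf
[httpServerListener:127.0.0.1:8090]
ssl = false

After the restart, local REST clients can use http://127.0.0.1:8090 instead of the SSL management port.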
Hi, we have two Splunk authentication systems: SAML and Splunk (the default, local one). We would like an alert if a user logs in to Splunk via the "Splunk" authentication system. Is there a way to do that? Can we do this via a Splunk query? Need help on this. Thanks, Mala S
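A rough sketch of a search that could back such an alert. Successful logins are recorded in the _audit index; the subsearch against the authentication/users REST endpoint is an assumption — verify on your version that it exposes a type field distinguishing locally defined (Splunk) users from SAML/LDAP ones before relying on it:

index=_audit action="login attempt" info=succeeded
    [| rest /services/authentication/users splunk_server=local
     | search type="Splunk"
     | fields title
     | rename title as user ]
| table _time user info

Saved as an alert (for example, run every 15 minutes over the last 15 minutes and trigger when the result count is greater than 0), this would fire whenever a local-account login succeeds.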
Hi Splunkers, I have a doubt about Splunk data forwarding to third-party systems. I know this task can be performed with forwarders; what I'm not able to understand is whether it can be performed after the data have arrived in Splunk and have been ingested, parsed and aggregated. Let me explain better: in a usual scenario, forwarders sit in front of a Splunk environment, something like:

Data sources -> Forwarders (on the data sources or intermediate) -> Splunk environment (SH + Indexer)

With forwarders I can achieve this scenario, routing a copy of the data to other systems before it reaches the SH + Indexer:

Data sources -> Forwarders (on the data sources or intermediate) -> Splunk environment (SH + Indexer)
                                                                 -> Other systems

But what about this flow:

Data sources -> Forwarders -> Splunk environment (SH + Indexer) -> parsing, aggregation and filtering -> forwarding to a third-party system

Is it possible? The requirement is that the data must have completed the whole Splunk lifecycle: they arrive raw (or only partially manipulated) from Mulesoft, then must be parsed, filtered and aggregated, and after that sent back to Mulesoft, which forwards them to the final systems. Is this something I can achieve with Splunk?
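One common pattern (a sketch, not the only option): do the parsing/aggregation in a scheduled search and push its results out with the built-in webhook alert action, a custom alert action, or the REST API. The stanza below is a hypothetical savedsearches.conf entry; the search, schedule and Mulesoft endpoint URL are placeholders:

# savedsearches.conf (sketch)
[forward_aggregated_data_to_mulesoft]
search = index=mulesoft_raw sourcetype=my_sourcetype | stats count by transaction_id, status
cron_schedule = */15 * * * *
enableSched = 1
counttype = number of events
relation = greater than
quantity = 0
action.webhook = 1
action.webhook.param.url = https://mulesoft.example.com/splunk-results

Index-time routing (props/transforms or Ingest Actions) happens before search-time aggregation, so for a flow that must run after parsing, filtering and aggregation, the scheduled search plus an alert action (or export via the REST API) is the usual fit.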
Hi, I want to change the date format and find the difference in days from the current day. The date format I have now is:

Timestamp
10/31/2022 1:28:20 PM

index=s sourcetype=Resources
| fillnull
| eval Timestamp=strftime(strptime('Timestamp',"%m/%d/%Y"),"%Y-%m-%d")
| eval diff=now()-Timestamp
| table Name _time Timestamp diff

Here I am not getting the diff field populated with the difference in number of days; it shows as blank. The date format I expect for the Timestamp column is %Y-%m-%d. Please help me achieve this.
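Two things seem to be going on: the strptime format doesn't cover the time portion of "10/31/2022 1:28:20 PM", and diff is computed against the reformatted string rather than the epoch value. A sketch of a corrected version (field names kept as in the question):

index=s sourcetype=Resources
| fillnull
| eval ts=strptime(Timestamp, "%m/%d/%Y %I:%M:%S %p")
| eval Timestamp=strftime(ts, "%Y-%m-%d")
| eval diff=floor((now() - ts) / 86400)
| table Name _time Timestamp diff

The subtraction is done on ts (epoch seconds) before Timestamp is overwritten with the %Y-%m-%d string, and dividing by 86400 converts seconds into whole days.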
Hello Community, I'm currently trying to configure the Splunk Add-on for Microsoft Azure. The add-on is installed on the heavy forwarder in my environment, and my goal is to forward the data from the API to an index in the indexer cluster. The problem I'm having is that the configuration of the app only accepts indexes that are visible locally on the machine. Since indexes that are configured through the cluster don't show up in the local index list, I am not able to choose mine. If I write the index into the config anyway, the app reports status=false and doesn't pull from the API. My question now is: how can I make the index from the cluster visible on the heavy forwarder so I can select it in the setup? Or is there any other way to forward the data from the app that I have missed? Any advice or hints to documentation I might have missed are appreciated!
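A common workaround (a sketch, assuming the heavy forwarder forwards everything to the indexers and stores nothing locally): define the index name locally on the HF in an indexes.conf so the add-on's validation and dropdown can see it. The index name below is a placeholder for the one defined in your cluster:

# indexes.conf in a local app on the heavy forwarder
[azure_activity]
homePath   = $SPLUNK_DB/azure_activity/db
coldPath   = $SPLUNK_DB/azure_activity/colddb
thawedPath = $SPLUNK_DB/azure_activity/thaweddb

Because the HF's outputs.conf forwards all events to the cluster (and indexAndForward is off by default), the local definition only satisfies the UI and validation; the events are still stored on the indexers.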
Hello, I'm following the steps here: https://docs.splunk.com/Documentation/Splunk/9.0.1/Installation/InstallonLinux#Next_steps After installing and starting the service, I'm of course unable to reach port 8000 for the web interface because the system firewall is blocking connections. Besides port 8000, what other ports should I open through the firewall, and why isn't this documented on the above page? If anyone has a link to Splunk documentation about the ports used, please let me know. I've seen lots of Splunk community answers showing different ports, but others say they are user-defined, like port 9997 for the forwarder to send data to the Splunk server... I haven't configured that yet (it wasn't in the above documentation). I see that my Splunk server is currently listening on ports 8000, 8089, and 8191, according to the output of "sudo ss -tunlp":

tcp LISTEN 0 128 0.0.0.0:8089 0.0.0.0:* users:(("splunkd",pid=1806,fd=4))
tcp LISTEN 0 128 0.0.0.0:8191 0.0.0.0:* users:(("mongod",pid=2285,fd=9))
tcp LISTEN 0 128 0.0.0.0:8000 0.0.0.0:* users:(("splunkd",pid=1806,fd=100))

I tried opening a support case, but apparently I can't do that either. I'm really not sure where to ask this question, or who to ask in order to get the installation documentation updated. If I should post this somewhere else, please let me know. Thank you, Jonathan
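For context: 8000 is Splunk Web, 8089 is the splunkd management/REST port, and 8191 is the KV store (mongod). Of those, usually only 8000 needs to be reachable from outside the host; 8089 only if remote REST/CLI clients or other Splunk components must reach this instance, and 8191 should stay local. The receiving port for forwarders (conventionally 9997) only exists once you enable receiving. A sketch for a firewalld-based system, assuming you want web access now and forwarder traffic later:

sudo firewall-cmd --permanent --add-port=8000/tcp   # Splunk Web
sudo firewall-cmd --permanent --add-port=9997/tcp   # only if/when this instance receives data from forwarders
sudo firewall-cmd --reload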
We have MFA logs being sent to one of our indexes and the field I'm looking at is as follows:

message: MFA challenge issued for account 1234556. Email is example@example.com

I want to be able to count the number of times "MFA challenge issued" occurs in the logs. I'm trying to use the rex command but it's just so confusing at the moment. Any help would be great!
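For a plain count you may not need rex at all; a sketch (the index name is a placeholder):

index=<your_mfa_index> "MFA challenge issued"
| stats count as mfa_challenges_issued

If you also want the count broken down by account, a rex over the message field (assuming the field really is called message) could look like:

index=<your_mfa_index> "MFA challenge issued"
| rex field=message "MFA challenge issued for account (?<account>\d+)"
| stats count by account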
Hi, I don't understand the goal of the summary range in an accelerated report. What is the difference with the report's own time range? For example, if I run my report over the last 30 days and I put 7 days in the summary range, what does that mean exactly? Rgds
We have a dropdown input with four static values:

1. cloud
2. hana
3. copy2
4. principal

Each dropdown value has its own separate query. I am looking to create a dashboard with a single panel: changing the dropdown value should change which search query's output is shown in that same panel.

1. cloud -> Query1
2. hana -> Query2
3. copy2 -> Query3
4. principal -> Query4
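One way to sketch this in Simple XML: the dropdown's change block sets a token to the whole query for the chosen value, and the panel's search just runs that token. The four queries below are placeholders for Query1-Query4:

<input type="dropdown" token="selection">
  <label>Source</label>
  <choice value="cloud">cloud</choice>
  <choice value="hana">hana</choice>
  <choice value="copy2">copy2</choice>
  <choice value="principal">principal</choice>
  <change>
    <condition value="cloud"><set token="panel_query">index=main sourcetype=cloud | stats count</set></condition>
    <condition value="hana"><set token="panel_query">index=main sourcetype=hana | stats count</set></condition>
    <condition value="copy2"><set token="panel_query">index=main sourcetype=copy2 | stats count</set></condition>
    <condition value="principal"><set token="panel_query">index=main sourcetype=principal | stats count</set></condition>
  </change>
</input>

<panel>
  <table>
    <search>
      <query>$panel_query$</query>
    </search>
  </table>
</panel>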
Hi everyone, I'm struggling with Splunk DB Connect and HEC. I have a single-instance Splunk that has all roles, plus multiple UFs managed with a deployment server, where the HEC inputs are deployed. I'm currently trying to use Splunk DB Connect on that instance to read some DB data and write it into an index, and I keep getting the error "Unsupported or unrecognized SSL message". I checked the port (8088) and it seems fine. SSL is enabled on HEC and splunkd (default parameters).

I wonder whether Splunk DB Connect can use the HEC input on localhost:8088 if I have moved the HEC inputs onto the UFs with "useDeploymentServer". Could that explain the SSL error? I tried to use dbx_settings.conf but it does not seem to use my second entry:

[hec]
maxRetryWhenHecUnavailable = 3
hecUris = localhost:8088,splunk-hec.qualgend:8088

Logs sample:

127.0.0.1 - - [03/nov./2022:15:30:08 +0000] "GET /api/taskserver HTTP/1.1" 200 414 "-" "python-requests/2.25.0" 2
2022-11-03 16:30:00.286 +0100 [Scheduled-Job-Executor-4] DEBUG c.s.d.s.dbinput.recordreader.DbInputRecordReader - action=closing_db_reader task=rising_mcipe_qualif
2022-11-03 16:30:00.286 +0100 INFO c.s.dbx.server.task.listeners.JobMetricsListener - action=collect_job_metrics connection=mcipe_qualif jdbc_url=null db_read_time=0 hec_record_process_time=3 format_hec_success_count=69 status=FAILED input_name=rising_mcipe_qualif batch_size=1000 error_threshold=N/A is_jmx_monitoring=false start_time=2022-11-03_04:30:00 end_time=2022-11-03_04:30:00 duration=21 read_count=69 write_count=0 error_count=0
2022-11-03 16:30:00.285 +0100 [Scheduled-Job-Executor-4] INFO org.easybatch.core.job.BatchJob - Job 'rising_mcipe_qualif' finished with status: FAILED
2022-11-03 16:30:00.285 +0100 [Scheduled-Job-Executor-4] ERROR org.easybatch.core.job.BatchJob - Unable to write records
javax.net.ssl.SSLException: Unsupported or unrecognized SSL message
at java.base/sun.security.ssl.SSLSocketInputRecord.handleUnknownRecord(SSLSocketInputRecord.java:451)
at java.base/sun.security.ssl.SSLSocketInputRecord.decode(SSLSocketInputRecord.java:175)
at java.base/sun.security.ssl.SSLTransport.decode(SSLTransport.java:111)
2022-11-03 16:30:00.285 +0100 [Scheduled-Job-Executor-4] ERROR c.s.d.s.dbinput.recordwriter.CheckpointUpdater - action=skip_checkpoint_update_batch_writing_failed
javax.net.ssl.SSLException: Unsupported or unrecognized SSL message
at java.base/sun.security.ssl.SSLSocketInputRecord.handleUnknownRecord(SSLSocketInputRecord.java:451)
at java.base/sun.security.ssl.SSLSocketInputRecord.decode(SSLSocketInputRecord.java:175)
at java.base/sun.security.ssl.SSLTransport.decode(SSLTransport.java:111)
2022-11-03 16:30:00.285 +0100 [Scheduled-Job-Executor-4] ERROR c.s.d.s.task.listeners.RecordWriterMetricsListener - action=unable_to_write_batch
javax.net.ssl.SSLException: Unsupported or unrecognized SSL message
at java.base/sun.security.ssl.SSLSocketInputRecord.handleUnknownRecord(SSLSocketInputRecord.java:451)
at java.base/sun.security.ssl.SSLSocketInputRecord.decode(SSLSocketInputRecord.java:175)
at java.base/sun.security.ssl.SSLTransport.decode(SSLTransport.java:111)
2022-11-03 16:30:00.279 +0100 [Scheduled-Job-Executor-4] INFO c.s.d.s.dbinput.recordwriter.HttpEventCollector - action=writing_events_via_http_event_collector record_count=69
2022-11-03 16:30:00.279 +0100 [Scheduled-Job-Executor-4] INFO c.s.d.s.dbinput.recordwriter.HttpEventCollector - action=writing_events_via_http_event_collector
2022-11-03 16:30:00.279 +0100 [Scheduled-Job-Executor-4] INFO c.s.dbx.server.dbinput.recordwriter.HecEventWriter - action=write_records batch_size=69

Do you have any idea what I did wrong? Any clue would be greatly appreciated! Thanks in advance, Ema
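That exception usually means a plain-HTTP request hit a TLS port (or the reverse). Since HEC has SSL enabled, one thing worth checking (a sketch; verify the exact setting names against your DB Connect version) is whether the HEC URIs carry an explicit https:// scheme:

[hec]
maxRetryWhenHecUnavailable = 3
hecUris = https://localhost:8088,https://splunk-hec.qualgend:8088

Also note that DB Connect writes to the HEC endpoint of the instance it points at, not to HEC inputs deployed to the UFs via the deployment server, so a local HEC input still needs to be enabled on the instance DB Connect actually talks to.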
Hello! I need your help, please. I need to be able to view the logs for complete processes and not for fractions of them: when I view them in Splunk, it shows me each event belonging to the same process separately, split across time intervals (it only happens with one specific sourcetype; the others are fine).
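Two usual causes, depending on what "fractions" means here: if single log entries are being split into several events, that is an event-breaking (props.conf) issue for that sourcetype; if the events themselves are fine but you want everything a process did shown as one result, they can be grouped at search time. A sketch of the latter, assuming a hypothetical field process_id common to all events of one process:

index=<your_index> sourcetype=<the_problem_sourcetype>
| transaction process_id maxspan=1h

A stats-based grouping (for example | stats min(_time) as start, max(_time) as end, list(_raw) as events by process_id) is a cheaper alternative when the groups are large.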