All Topics

In the Monitoring Console on our master we have a panel, and when I open it in search I get this query:

| rest splunk_server=test43 /services/server/introspection/queues
| eval current_fill_perc = round(current_size_bytes / max_size_bytes * 100, 0)
| fields title, current_fill_perc
| search title="parsingQueue.*" OR title="aggQueue.*" OR title="typingQueue.*" OR title="indexQueue.*"
| rex field=title "(?<queue_name>^\w+)\.(?<pipeline_number>\d+)"
| chart values(current_fill_perc) over pipeline_number by queue_name
| fields pipeline_number, parsingQueue, aggQueue, typingQueue, indexQueue
| rename pipeline_number as "Pipeline Number", parsingQueue as "Parsing Queue Fill Ratio (%)", aggQueue as "Aggregator Queue Fill Ratio (%)", typingQueue as "Typing Queue Fill Ratio (%)", indexQueue as "Index Queue Fill Ratio (%)"

I want to create a dashboard from this for some members who don't have access to our master server, but when I run that search on our search head it produces no results. Is there another way to show it on our SH?
What would be the regular expression, when using rex, to match fields whose names end with a range of values? Sample: "var0":0,"var1":10,"var2":20,"var10":100. I would like to extract fields var1 through var10 and exclude var0. Thanks
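A minimal sketch of one pattern that does this, shown in Python's re module (whose syntax is close to the PCRE that rex uses): alternate "10" with the character class [1-9], listing "10" first so it is tried before the single-digit branch.

```python
import re

sample = '"var0":0,"var1":10,"var2":20,"var10":100'

# Match keys var1..var10 only: the "10" alternative comes before
# [1-9] so "var10" is not truncated to "var1"; "var0" never matches.
pattern = re.compile(r'"(var(?:10|[1-9]))":(\d+)')

fields = dict(pattern.findall(sample))
print(fields)  # {'var1': '10', 'var2': '20', 'var10': '100'}
```

In SPL the equivalent idea would be something like `| rex max_match=0 "\"(?<field>var(?:10|[1-9]))\":(?<value>\d+)"` (untested sketch; adjust the capture names to your data).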
As the title says, I am looking to set up retrospective searches based on new threat intelligence indicators in ES. Is it possible? If yes, can someone please suggest the best way to do this?
Hi, I have installed the IBM IIB agent (https://docs.appdynamics.com/appd/4.5.x/en/application-monitoring/install-app-server-agents/ibm-integration-bus-agent/install-the-iib-agent) and everything appears to be working OK. When I start the IIB application, however, I get the error "Agent license request denied. Agent type: WMB/IIB", and I see nothing on the dashboard either. I am on the 16-day SaaS trial. Is the IIB agent not available in the SaaS trial? Thanks,
I've been reading about the differences between forward indexes and inverted indexes.  Which model does Splunk use?  I have not been able to find that information in the documentation.
Hi, I am using the following rex command to extract all text between "....device-group:" and "succeeded ...." from a field called "old", assigning the extracted value to a new field called "new": | rex field=old "device-group:\s*(?<new>\S+)" Currently it extracts the text correctly EXCEPT in cases where the value contains multiple words separated by spaces. Examples: 1) "Panorama push to device:013101009509 for device-group: Austin Cloud DMZ succeeded. JobId=2484595", where the extracted value should be "Austin Cloud DMZ"; 2) "Panorama push to device:013101014290 for device-group: Austin Bank Segmentation succeeded. JobId=2482583", where the extracted value should be "Austin Bank Segmentation". Can you please help with extracting such cases too? Thank you!
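The problem is that \S+ stops at the first space. A minimal sketch of the fix, shown in Python's re module (syntax close to rex's PCRE): capture lazily up to the literal " succeeded" instead.

```python
import re

samples = [
    "Panorama push to device:013101009509 for device-group: Austin Cloud DMZ succeeded. JobId=2484595",
    "Panorama push to device:013101014290 for device-group: Austin Bank Segmentation succeeded. JobId=2482583",
]

# .+? captures lazily, so it stops at the first " succeeded" and keeps
# multi-word group names (spaces included) intact.
pattern = re.compile(r"device-group:\s*(?P<new>.+?)\s+succeeded")

for line in samples:
    print(pattern.search(line).group("new"))
# Austin Cloud DMZ
# Austin Bank Segmentation
```

The corresponding SPL would be `| rex field=old "device-group:\s*(?<new>.+?)\s+succeeded"` (untested sketch).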
I have a CSV file which has the below lines:

=========================
METRIC_NAME,METRIC_UNIT,BEGIN_TIME,END_TIME,MAXVAL,MINVAL,AVERAGE
Buffer Cache Hit Ratio,% (LogRead - PhyRead)/LogRead,09/25/2022 14:59,09/25/2022 15:59,100,0,100
Memory Sorts Ratio,% MemSort/(MemSort + DiskSort),09/25/2022 14:59,09/25/2022 15:59,100,0,100
Redo Allocation Hit Ratio,% (#Redo - RedoSpaceReq)/#Redo,09/25/2022 14:59,09/25/2022 15:59,100,0,100
User Transaction Per Sec,Transactions Per Second,09/25/2022 14:59,09/25/2022 15:59,1.383,0,.528
Physical Reads Per Sec,Reads Per Second,09/25/2022 14:59,09/25/2022 15:59,1.05,0,.138
Physical Reads Per Txn,Reads Per Txn,09/25/2022 14:59,09/25/2022 15:59,2.296,0,.223
-----160 lines

I have modified the header, inputs.conf and props.conf, but I still see the error.

inputs.conf:
[monitor:///u01/app/oracle/scripts/Performance_metrics/output]
disabled = false
index = brm_db
initCrcLength = 2048
sourcetype = csv
crcSalt = <SOURCE>

props.conf:
[csv]
SHOULD_LINEMERGE = false
pulldown_type = true
INDEXED_EXTRACTIONS = csv
KV_MODE = none
category = custom
HEADER_FIELD_ACCEPTABLE_SPECIAL_CHARACTERS = _
HEADER_FIELD_DELIMITER = ,
FIELD_DELIMITER = ,
description = Comma-separated value format. Set header and other settings in "Delimited Settings"

Can someone help with this? I need to complete it ASAP and have been stuck for almost two days. Quick help would be appreciated.
Hello, I found a new failure on an app's setup page (secret storage) on Splunk Cloud when I tried to register a password to secret storage: [SPLUNKD] You (user=<my splunk user>) do not have permission to perform this operation (requires capability: $db_connect_read_app_conf$). Could this be just an account permissions issue? Can it be worked around by using an admin account? Do you have any information on this db_connect_read_app_conf Splunk capability? Unfortunately, my searches turned up no useful documentation or references for it. I would be happy for any information from you. Thank you in advance.
Hi, I have rows that are JSON based. Each row has a field that looks like this:

{ "students" : [ {"id":"123", "name":"abc"}, {"id":"456", "name":"def"}, {"id":"789", "name":"hij"} ], "student_id" : "456" }

Each row can have multiple students and always exactly one student_id. For each row I want to extract the name of the student whose id is equal to the student_id. How can I do that? I tried this: |spath path=students{} output=students|mvexpand students | spath input=students|multikv| table id, name, student_id and I do get 3 rows like this in the result:

id   name  student_id
123  abc   456
456  def   456
789  hij   456

But when I try to filter the matching row with | where id = student_id I get 0 rows back. TIA Asaf
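The intended logic — keep only the student whose id equals the row's student_id — can be sketched outside SPL in a few lines of Python, assuming the JSON shape shown above:

```python
import json

row = json.loads("""
{
  "students": [
    {"id": "123", "name": "abc"},
    {"id": "456", "name": "def"},
    {"id": "789", "name": "hij"}
  ],
  "student_id": "456"
}
""")

# Keep only the student whose id matches the row-level student_id.
matches = [s["name"] for s in row["students"] if s["id"] == row["student_id"]]
print(matches)  # ['def']
```

In SPL, a common reason `| where id = student_id` returns nothing is that one of the two fields is null on every row after the expansion; checking with `| table id student_id` that both survive, and dropping the `multikv` step in favour of a plain `spath input=students` on each expanded row, would be the first things to try (untested suggestion).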
New install and new to Splunk. SC4S issue. Professional Services left earlier in the week. We have firewall and Panorama log sources, but only the firewall logs are going to the right index as specified in splunk_metadata.csv. I can see the Panorama logs in tcpdump, but they aren't in the firewall index and aren't in lastchanceindex.

pan_panos_log,index,firewall
pan_panos_globalprotect,index,firewall
pan_panos_traffic,index,firewall
pan_panos_threat,index,firewall
pan_panos_system,index,firewall
pan_panos_config,index,firewall
pan_panos_hipmatch,index,firewall
pan_panos_correlation,index,firewall
I have created a custom command *| cloudcidrlookup cloud=azure*, but how do I change it to be just *| cloudcidrlookup azure*?

@Configuration()
class CloudCidrLookup(GeneratingCommand):
    cloud = Option(require=False)
    ...
    def generate(self):
        if self.cloud == 'azure':
            ...
dispatch(CloudCidrLookup, sys.argv, sys.stdin, sys.stdout, __name__)
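In the splunklib searchcommands framework, tokens that are not in key=value form are delivered to the command as positional arguments via self.fieldnames rather than as Options (worth verifying against your SDK version). A pure-Python sketch of that split — parse_command_args is illustrative, not a splunklib API:

```python
def parse_command_args(tokens):
    """Split custom-command tokens into options (key=value pairs) and
    positional arguments (bare words), mirroring how splunklib separates
    Option values from SearchCommand.fieldnames."""
    options, positional = {}, []
    for tok in tokens:
        key, sep, value = tok.partition("=")
        if sep:
            options[key] = value
        else:
            positional.append(tok)
    return options, positional

# "| cloudcidrlookup azure" -> the cloud name arrives as a bare token
opts, args = parse_command_args(["azure"])
print(opts, args)  # {} ['azure']
```

So inside generate(), something like `cloud = self.fieldnames[0] if self.fieldnames else self.cloud` would accept both spellings (untested against a live Splunk instance).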
Hi guys, I'm trying to do something that I expected to be very simple, so I guess I'm missing something big. This is my test SPL: | makeresults | eval data=json_object("id", "123")| spath input=data| table data.id I want to be able to refer to the data.id field, but I can't. How do I convert "data" into parsed JSON? TIA Asaf
How to apply props.conf EVENT_BREAKER on UF for better data distribution instead of using outputs.conf forceTimebasedAutoLB=true?
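A minimal props.conf sketch for the UF, assuming a sourcetype named my_sourcetype (hypothetical) whose events are newline-delimited:

```
[my_sourcetype]
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = ([\r\n]+)
```

With an event breaker configured, the forwarder can switch receivers at event boundaries during its normal autoLB cycle, which is why this is usually preferred over forcing time-based load balancing; the exact regex should match whatever separates your events.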
If I want to create a peer node, is it mandatory to have a master node, or is the master node optional? I learned how to create both, but I don't understand whether one needs the other.
Hello Splunk Ninjas! I'm new to the group (and to Splunk) and need your assistance designing a regex expression. I need to extract the value of Message from this sample log line: 2022-09-23T13:20:25.765+01:00 [29] WARN Core.ErrorResponse - {} - Error message being sent to user with Http Status code: BadRequest: {"Message":"Sorry, only real values are valid in this environment.","UserMessage":null,"Code":64,"Explanation":null,"Resolution":null,"Category":3} I'm interested in extracting the values of Message, Code, Resolution and Category. Any help much appreciated! Thanks again
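Since the tail of the line is valid JSON, one robust approach is to grab the object with a small regex and then parse it, rather than extracting each key with its own pattern. A Python sketch of the idea, using the sample line above:

```python
import json
import re

line = ('2022-09-23T13:20:25.765+01:00 [29] WARN Core.ErrorResponse - {} - '
        'Error message being sent to user with Http Status code: BadRequest: '
        '{"Message":"Sorry, only real values are valid in this environment.",'
        '"UserMessage":null,"Code":64,"Explanation":null,"Resolution":null,"Category":3}')

# Capture the JSON object that follows the status code, then parse it
# instead of fighting the quoting with one big regex per key.
m = re.search(r"BadRequest:\s*(\{.*\})", line)
payload = json.loads(m.group(1))
print(payload["Message"], payload["Code"], payload["Category"])
```

The SPL counterpart would be along the lines of `| rex "BadRequest:\s*(?<payload>\{.*\})" | spath input=payload` (untested sketch), which yields Message, Code, Resolution and Category as fields.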
I am pushing DNS logs to Splunk Cloud and I notice the QueryType is in numeric format; I would like to see it in string format. Sample log:   {"ColoID":378,"Datetime":"2022-09-23T23:55:23Z","DeviceID":"df34037e","DstIP":"xx.xx.xx.xx","DstPort":0,"Email":"non_identity@ec.com","Location":"London","Policy":"","PolicyID":"","Protocol":"https","QueryCategoryIDs":[26,81],"QueryName":"europe-west9-a-osconfig.googleapis.com","QueryNameReversed":"com.googleapis.europe-west9-a-osconfig","QuerySize":67,"QueryType":28,"RData":[{"type":"28","data":"F2V1cm9wZS13ZXN0OS1hLW9zY29uZmlnCmdvb2dsZWFwaXMDY29tAAAcAAEAAADdABAqABRQQAkIIAAAAAAAACAK"},{"type":"28","data":"F2V1cm9wZS13ZXN0OS1hLW9zY29uZmlnCmdvb2dsZWFwaXMDY29tAAAcAAEAAADdABAqABRQQAkIHwAAAAAAACAK"},{"type":"28","data":"F2V1cm9wZS13ZXN0OS1hLW9zY29uZmlnCmdvb2dsZWFwaXMDY29tAAAcAAEAAADdABAqABRQQAkIFQAAAAAAACAK"},{"type":"28","data":"F2V1cm9wZS13ZXN0OS1hLW9zY29uZmlnCmdvb2dsZWFwaXMDY29tAAAcAAEAAADdABAqABRQQAkIIQAAAAAAACAK"}],"ResolverDecision":"allowedOnNoPolicyMatch","SrcIP":"xx.xx.xx.xx","SrcPort":0,"UserID":"723f7"}     In the above log you will notice "QueryType":28; I'd like to replace 28 with the string AAAA. Other DNS query types can be found at https://en.wikipedia.org/wiki/List_of_DNS_record_types. Is there a way I could replace, or append, the query type string instead of the numeric value showing up in the logs by using techniques like lookup or join?
Desired Log: (only QueryType is changed from 28 to AAAA)     {"ColoID":378,"Datetime":"2022-09-23T23:55:23Z","DeviceID":"df34037e","DstIP":"xx.xx.xx.xx","DstPort":0,"Email":"non_identity@ec.com","Location":"London","Policy":"","PolicyID":"","Protocol":"https","QueryCategoryIDs":[26,81],"QueryName":"europe-west9-a-osconfig.googleapis.com","QueryNameReversed":"com.googleapis.europe-west9-a-osconfig","QuerySize":67,"QueryType":AAAA,"RData":[{"type":"28","data":"F2V1cm9wZS13ZXN0OS1hLW9zY29uZmlnCmdvb2dsZWFwaXMDY29tAAAcAAEAAADdABAqABRQQAkIIAAAAAAAACAK"},{"type":"28","data":"F2V1cm9wZS13ZXN0OS1hLW9zY29uZmlnCmdvb2dsZWFwaXMDY29tAAAcAAEAAADdABAqABRQQAkIHwAAAAAAACAK"},{"type":"28","data":"F2V1cm9wZS13ZXN0OS1hLW9zY29uZmlnCmdvb2dsZWFwaXMDY29tAAAcAAEAAADdABAqABRQQAkIFQAAAAAAACAK"},{"type":"28","data":"F2V1cm9wZS13ZXN0OS1hLW9zY29uZmlnCmdvb2dsZWFwaXMDY29tAAAcAAEAAADdABAqABRQQAkIIQAAAAAAACAK"}],"ResolverDecision":"allowedOnNoPolicyMatch","SrcIP":"xx.xx.xx.xx","SrcPort":0,"UserID":"723f7"}   Thanks!
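The mapping itself is just a table from RR-type number to name. A Python sketch of the substitution, with a deliberately partial mapping (extend it from the IANA/Wikipedia list as needed):

```python
import json

# Partial DNS RR-type mapping; 28 is AAAA. Extend as needed.
QUERY_TYPES = {1: "A", 5: "CNAME", 15: "MX", 16: "TXT", 28: "AAAA"}

event = json.loads('{"QueryName":"europe-west9-a-osconfig.googleapis.com","QueryType":28}')

# Fall back to the original number (as a string) for unmapped types.
event["QueryType"] = QUERY_TYPES.get(event["QueryType"], str(event["QueryType"]))
print(event["QueryType"])  # AAAA
```

In Splunk the same idea would typically be a CSV lookup at search time rather than rewriting the indexed event: a file with columns such as QueryType,QueryTypeName and then `| lookup <your_lookup> QueryType OUTPUT QueryTypeName` (file and field names here are illustrative, not a shipped lookup).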
My sample log is: 2022-09-12 34:45:12.456 info  Request uri [/asdff/aii/products] Request patameters [] Request payload [Request body size : : 5678 bytes Request body : : [{\activaterequest\:\ESRTYBBS\*\*, \"addresslines\":[{\"addressLineOrder\":\"NAME\"linevalues\":[\"esmal interger\"]}], \"productsio\":\"IM630\", \"productjourneykey\":\"IM630-p-6789778\",\"lineValues\":[\"sejo guleim ramo versa"]}], \"statusdesc\":\"unknown protocol version. http header [x-aacs-rest-version]. Assuming current version [v1.0]\"}],[{ \number\"4\",\"storePONumber\":\"3456\*}, \"app\",\"message\":\"Action taken when more than 10 points\"}], :[{\"serverstatuscode\":\"400 bad_request\",\"severity\", \"statusdesc\":\"Action taken when more than 10 points\"}], \"number\"6\"] My query is: index=axcf "Action taken when more than 10 points" but I want the following values (productsio, addressLineOrder, linevalues, storePONumber, message, serverstatuscode, statusdesc) in table format. How can I do this?
Hello, my goal is to send RRD file data to a Splunk indexer. I have a remote host that currently forwards linux_secure data to the indexer, which works fine. I am never able to create an input for any port, TCP or otherwise, from this dialog window: When I configure a TCP forward-server using the UF, the forward-server never goes active; I only get "cooked" data on the indexer. The host and source type are configured. If I configure a port (TCP or UDP) from here (this comes from Data / Data inputs / TCP; this setting comes from Settings / Data / Forwarding and receiving) I get data to the indexer. I may be missing something. I installed collectd on a remote host and configured it for the csv plugin and the cpu plugin; this data is being collected and saved to the /var/lib/collectd directory on the remote host. How can I get this data to Splunk and graph it? I can see data coming in but cannot do anything with it. The Splunk website says that HEC inputs must be used to get metrics into Splunk. How do I configure the remote host to do this, i.e. send the data from collectd to Splunk? I am open to suggestions and clarification. Thanks, eholz1
Hello, I have started my Splunk Cloud trial and it gives me the below error when I try to access the link: Too many HTTP threads (1267) already running, try again later. The server can not presently handle the given request.
I am trying to create a query that returns a table showing counts of different error codes and the percentage of transactions that are failing (error != 0) for each service:

service  0     3100  2000  1200  % Failure
Foo      1000  12    0     0     1.2%
Bar      100   0     3     2     5.0%

My query, which returns the above table, is: index=my_index | where error=0 OR error!=0 | chart count by service, error | eval "% Failure" = round(('3100'+'2000'+'1200')/('3100'+'2000'+'1200'+'0'),2)."%" How can I modify this query so that I don't need to hardcode each error code into the last part, since the error codes may vary?
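The calculation being generalized is: per service, sum the counts of all non-zero error codes and divide by the total, without naming the codes. A Python sketch of that logic over (service, error, count) rows (sample numbers taken from the table above):

```python
from collections import defaultdict

rows = [
    ("Foo", 0, 1000), ("Foo", 3100, 12),
    ("Bar", 0, 100), ("Bar", 3100, 0), ("Bar", 2000, 3), ("Bar", 1200, 2),
]

totals = defaultdict(int)
failures = defaultdict(int)
for service, error, count in rows:
    totals[service] += count
    if error != 0:          # any non-zero code counts as a failure
        failures[service] += count

for service in totals:
    pct = round(100 * failures[service] / totals[service], 1)
    print(f"{service}: {pct}%")
```

One possible SPL variant of the same idea (untested sketch): `| chart count by service, error | addtotals fieldname=total | eval "% Failure" = round(100 * (total - '0') / total, 1) . "%"` — addtotals sums every numeric column per row, so total minus the '0' column is the failure count, with no error codes hardcoded.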