All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello, using the query below I am able to get the title and definition of macros:

|rest /servicesNS/-/-/admin/macros |table title,definition

Can the same be achieved with a Postman call to https://*****:8089/servicesNS/-/-/admin/macros?output_mode=json, so that I get only title and definition in the API response? I tried the f filter and search parameters as per the documentation, but they do not give the required response. Thanks in advance.
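For reference, the same call can be sketched outside Postman with Python's standard library. The host below is a placeholder, and the repeated f parameter is the REST API's field filter; this is a sketch of the URL construction, not a verified Postman configuration:

```python
# Build the macros endpoint URL with repeated "f" parameters, which ask the
# REST API to return only the named fields. The host is a placeholder.
import urllib.parse

base = "https://splunk.example.com:8089/servicesNS/-/-/admin/macros"
query = urllib.parse.urlencode(
    [("output_mode", "json"), ("f", "title"), ("f", "definition")]
)
url = base + "?" + query
# Fetching this URL (with an Authorization header) should return entries
# whose content carries only the requested fields.
```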
Hi, I'm new to upgrading the platform and need help. We have a Splunk Cloud instance upgraded to 9.1; we are now due to upgrade the Deployment Server and Heavy Forwarders, followed by the UFs. Could anyone please let me know the stepwise process to upgrade the Deployment Server from 9.0.5 to 9.1, and similarly for the Heavy Forwarders? Also, which RPM packages should be chosen for both the DS and the HFs?
I have a case where we have some associated metrics for each request/response event, something like below:

{ "Key1": "val", "Array12": ["val1", "val2"], "NewList": [ {"K1": "v11", "K2": "v12", "K3": "v13"}, {"K1": "v21", "K2": "v22", "K3": "v23"} ] }

Now this list, NewList, is too big, and having the key-value pairs makes the log very bulky. Is there any way to make it concise and still be able to read it in a dashboard as below?

K1,K2,K3
v11,v12,v13
v21,v22,v23
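One possible approach, sketched in Python under the assumption that every NewList object has the same keys in the same order: emit the keys once as a header row and each object as a CSV-style value row before logging.

```python
# Compact a list of same-shaped dicts into one header row plus value rows,
# so the log carries each key name once instead of once per object.
def compact(new_list):
    if not new_list:
        return []
    keys = list(new_list[0])                 # assumes shared key order
    rows = [",".join(keys)]
    rows += [",".join(str(obj[k]) for k in keys) for obj in new_list]
    return rows

new_list = [
    {"K1": "v11", "K2": "v12", "K3": "v13"},
    {"K1": "v21", "K2": "v22", "K3": "v23"},
]
lines = compact(new_list)
# lines -> ["K1,K2,K3", "v11,v12,v13", "v21,v22,v23"]
```

On the dashboard side, the first row can then be split back out as column headers.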
Hello, I need your help with a field extraction. I have this type of data, and I'd like to extract the fields with a rex command. The syntax is as follows (excerpt):

{"data": [
  {"from": "2024-04-25T11:30Z", "to": "2024-04-25T12:00Z", "intensity": {"forecast": 152, "actual": null, "index": "moderate"}},
  {"from": "2024-04-25T12:00Z", "to": "2024-04-25T12:30Z", "intensity": {"forecast": 152, "actual": null, "index": "moderate"}},
  {"from": "2024-04-26T01:00Z", "to": "2024-04-26T01:30Z", "intensity": {"forecast": 149, "actual": null, "index": "moderate"}},
  ...
]}

Thank you very much.
Dear Splunk, I have a use case to send a notification/warning alert to users who meet certain criteria in a search. How can I send the alert only to the members (identified in the search) in the BCC list? The alert configuration has a mandatory To list (at least one recipient), which is not required in this use case. Simply put, I want to set up an alert with only BCC'd users and no one in the To list.
Hi all, how do I write a query in Splunk that accepts two occurrences of the same weekday only if the difference between the start day and end day is not more than 24 hours? For example, both days can be a Tuesday, but the query should check that the difference between the two Tuesdays is less than 24 hours, which means the end hour and the start hour fall on the same Tuesday.
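The condition being described can be sketched in Python (an SPL version would typically use eval with strftime/relative_time, but the logic is the same): two timestamps qualify only if they fall on the same weekday and are less than 24 hours apart, which together imply the same calendar day.

```python
# Check that two timestamps share a weekday and are under 24 hours apart.
from datetime import datetime

def same_day_within_24h(start, end):
    return (
        start.weekday() == end.weekday()
        and abs((end - start).total_seconds()) < 24 * 3600
    )

a = datetime(2024, 4, 23, 8, 0)   # Tuesday 08:00
b = datetime(2024, 4, 23, 20, 0)  # same Tuesday 20:00 -> qualifies
c = datetime(2024, 4, 30, 9, 0)   # the following Tuesday -> rejected
```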
Messages shows the below: "Search head cluster member A is having problems pulling configurations from the search head cluster captain B. Changes from the other members are not replicating to this member, and changes on this member are not replicating to other members. Consider performing a destructive configuration resync on this search head cluster member." Any ideas regarding the resync commands?
We have a requirement for a Splunk dashboard that shows all the test cases we have run from JMeter, for visibility purposes. I want to understand how we can achieve this. JMeter gives the option to export detailed reports in CSV format, which can be uploaded to Splunk, but that would be a manual approach; uploading a CSV file every time would be time-consuming. Is there any way to integrate JMeter with Splunk so that the reports are sent to Splunk automatically? The limitation is that, since this is a project in an organization, no external app or plugin that is not approved by the organization can be installed.
Is this intended behavior? After selecting only a single event with "head 1", fields from excluded events that occurred at the same time can be seen in a table when using wildcards, for example "table _time, tags.*, values.*".
Hi, I am trying to ingest botsv2 and botsv3 indexed data into Security Essentials for demo and learning purposes, but the onboarding background search only checks data from the last 30 days, and the two BOTS datasets are from about 6 years ago. I want to know how to modify the onboarding search to expand its search time range.
Hello, I have the following sample log lines from a Splunk search query:

line1 line2 line3: field1: some msg line4 line5 status: PASS
line6 line7 line3: field2: some msg line8 line9: status: PASS
line1 line2 line3: field3: some msg line4 line5: status: PASS
line1 line2 line3: field4: some msg line4 line5: status: PASS

I want to write a transaction that returns the lines between field1 and status: PASS, field2 and status: PASS, field3 and status: PASS, and so on. I have tried the following search query with multiple startswith values:

index="test1" source="test2" run="test3" | transaction source run startswith IN ("field1", "field2", "field3") endswith="status: PASS"

Instead of using the IN keyword for startswith, I want to use a CSV lookup table, messages.csv. Sample messages.csv content:

id,Message
1,field1
2,field2
3,field3
4,field4

I want to write the Splunk transaction command with the startswith parameter containing each Message field from messages.csv. My lookup CSV file may have 100 different rows with different messages. There is also a chance that my search results do not have any lines containing field1, field2, field3, or field4. Can someone please help with how to write a transaction where startswith is checked against each Message in messages.csv?
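One possible workaround is to collapse the lookup into a single regex alternation and use transaction's eval filter form, e.g. startswith=eval(match(_raw, "field1|field2|...")); in SPL the alternation could come from a subsearch over | inputlookup messages.csv. The string-building step can be sketched in Python using the sample lookup above (the resulting startswith syntax is an assumption to verify against the transaction documentation):

```python
# Build a regex alternation from the Message column of messages.csv and
# embed it in a transaction startswith eval filter.
import csv, io, re

messages_csv = """id,Message
1,field1
2,field2
3,field3
4,field4
"""

rows = csv.DictReader(io.StringIO(messages_csv))
alternation = "|".join(re.escape(r["Message"]) for r in rows)
startswith = 'startswith=eval(match(_raw, "%s"))' % alternation
```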
We are seeing a very different issue:
1. As shown in the table below, when there are no logs for one of the List values, the rows return null values, e.g. the App2 and App3 columns. We used |fillnull value=0 but it is not working.
2. If we have data for any one of the services (e.g. service=fast, App4 column), it shows the count and the rest of the rows are filled with zero.
How do we fill the null values with zero even when there is not a single count for those columns?
Query:

index=testindex source=sourcelogs | rex field=_raw "service :\s(?<Service>\w+)" | rex field=_raw "(?<List>Received main message|Application published to service|List status are in process|Application process running successfully)" | chart count over Service by List | rename "Received main message" as App1 "Application published to service" as App2 "List status are in process" as App3 "Application process running successfully" as App4 | table App1 App2 App3 App4

Result of the query:

Service   App1  App2  App3  App4
Token     10                0
Modern    40                0
Surb      3                 0
Fast      12                4
Normal    4                 0
Forward   6                 0
Medium    7                 0
How can I check whether a host has been correctly whitelisted to receive configuration from the Splunk Deployment Server?
Hi, I am Lily. I want to know how to customize the MLTK models used in ESCU rules. If that is not possible, is it at least possible to inspect the contents of the models inside MLTK? For example, I want to know how 'count_by_http_method_by_src_1d' was made.
Hello everyone. I need to create a metric or Health Rule which does the following:
Warning: 15% of calls with response time >= 50 secs
Critical: 30% of calls with response time >= 50 secs
Critical: 10% of calls with errors
Is this possible with AppDynamics? I'm trying with this formula:
({n_trx_rt}>=50000/{total_trx})*100
where n_trx_rt = Average Response Time and total_trx = Calls per minute. This gives me a result, but I'm not sure the operation is supported by AppDynamics.
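Setting AppDynamics expression syntax aside, the intended thresholds can be sketched as plain arithmetic; slow_calls, error_calls, and total_calls below are hypothetical per-minute counts, not AppDynamics metric names, and note that an average response time metric alone cannot yield "percentage of calls over 50 s":

```python
# Classify a minute of traffic by percentage of slow and errored calls.
def severity(slow_calls, error_calls, total_calls):
    slow_pct = 100.0 * slow_calls / total_calls
    error_pct = 100.0 * error_calls / total_calls
    if slow_pct >= 30 or error_pct >= 10:
        return "critical"
    if slow_pct >= 15:
        return "warning"
    return "normal"
```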
So far I have created this join:

index="index" "mysearchtext" | rex field=message ", request_id: \\\"(?<request_id>[^\\\"]+)" | fields _time request_id | eval matchfield=request_id | join matchfield [ search index="index" | spath request.id | rename request.id as id | fields mynewfield | eval matchfield=id ] | table _time request_id mynewfield

Basically I want to join two logs where request_id = id. The join is working as expected but, as you would expect, it is not efficient. I'd like to replace it with a more efficient search, leveraging the fact that the events of the subsearch (where I extract the field mynewfield) are always indexed some milliseconds after the main search events (where I extract the field request_id). Another useful piece of information is that the logs matching "mysearchtext" are far fewer than the logs in the subsearch. Here is a sample of the data:

{"AAA": "XXX","CCC":"DDD","message":{"request":{ "id": "MY_REQUEST_ID"} } }
{"AAA": "XXX","CCC":"DDD","message":"application logs in text format e.g. 2024/04/26 06:35:21 mysearchtext headers: [], client: clientip, server, host, request_id=\"MY_REQUEST_ID\" "}

The first event contains a message field which is a JSON string; we have thousands of these logs. The second kind are "alerts" and we have just a few of them; the format of their message field is plain text. Both contain the value MY_REQUEST_ID, which is the field I have to use to correlate the two logs. The output should be a table of ONLY the events with "mysearchtext" (the second event) with some additional fields coming from the first event. The events above are sorted by time (reverse order); the second event happens just a few milliseconds before the first one (basically the second one is just a log message of the same REST request as the first event; the first event is the REST response sent to the customer).
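A common join-free pattern is a single search over both event types followed by a stats aggregation by request_id, keeping only groups that contained an alert event. The grouping this performs can be illustrated in Python over simplified events (is_alert below is a stand-in for a "mysearchtext" match, not a real field):

```python
# Merge events sharing a request_id into one record, then keep only the
# ids that had an alert event -- the same shape as a stats-by-id approach.
def correlate(events):
    by_id = {}
    for ev in events:
        by_id.setdefault(ev["request_id"], {}).update(ev)
    return [v for v in by_id.values() if v.get("is_alert")]

events = [
    {"request_id": "MY_REQUEST_ID", "mynewfield": "extra"},
    {"request_id": "MY_REQUEST_ID", "is_alert": True},
]
rows = correlate(events)
```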
Hello experts, I'm trying to create a Python script to run ad hoc searches via an API request, but the documentation has me opening webpage after webpage. I've already created a token. Can someone please help me with this task? Thank you in advance.
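As a starting point, here is a minimal standard-library sketch of an ad hoc ("oneshot") search against the management port's REST API. The host and token values are placeholders, and the request is built but deliberately not sent:

```python
# Build a POST to /services/search/jobs that runs a search synchronously
# (exec_mode=oneshot) and returns JSON, authenticated with a bearer token.
import urllib.parse
import urllib.request

HOST = "https://splunk.example.com:8089"   # placeholder management host:port
TOKEN = "YOUR_AUTH_TOKEN"                  # placeholder authentication token

def build_oneshot_request(spl):
    if not spl.strip().startswith(("search", "|")):
        spl = "search " + spl              # REST searches need a leading command
    body = urllib.parse.urlencode({
        "search": spl,
        "exec_mode": "oneshot",            # results returned directly, no polling
        "output_mode": "json",
    }).encode()
    return urllib.request.Request(
        HOST + "/services/search/jobs",
        data=body,
        headers={"Authorization": "Bearer " + TOKEN},
        method="POST",
    )

req = build_oneshot_request("index=_internal | head 5")
# To actually execute it (self-signed certs may need an ssl context):
# with urllib.request.urlopen(req) as resp:
#     print(resp.read())
```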
Hi, I have extracted fields manually in Splunk Cloud. The regex works perfectly in the field extraction preview page, but when searching, the fields are not showing up. I have set the permissions to global, I am searching in verbose mode, and I have set the coverage to "All fields". I am trying to extract 6 fields (all the regexes work in the preview page), out of which only one field (IP address) shows up in search. I have debugged, refreshed the page, and bumped it as well. The funny part is that if I use Splunk's default regex expression instead of writing my own, the fields do pop up in search. I have also observed a couple of fields showing up and then disappearing while searching.

Sample data:
nCountry: United States\nPrevious Country

Regex:
nCountry\:\s(?<country>.+?)\\\nPrevious\sCountry

I have referred to almost all the Splunk Answers posts, but none of the solutions fixed my problem. I am intrigued to know why it is not working. Thanks in advance.
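One way to debug a regex like this offline is to test it with Python's re module. Note that Python spells named groups (?P<name>...), and the sample below assumes the event contains a literal backslash-n character pair (as in JSON-escaped text) rather than a real newline; the surrounding raw string is hypothetical:

```python
# Test the country extraction against a hypothetical raw event where "\n"
# is two literal characters, backslash and n, not a newline.
import re

raw = r"...\nCountry: United States\nPrevious Country: Canada..."

# In the pattern, a doubled backslash before "n" matches the literal
# backslash + n sequence; ".+?" captures lazily up to the next marker.
pattern = re.compile(r"\\nCountry:\s(?P<country>.+?)\\nPrevious\sCountry")

m = pattern.search(raw)
country = m.group("country") if m else None
```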
Hi, I have two panels with two different searches, say Panel A and Panel B, and both return a single value. I want to show the difference between these two values in another panel, but it should check that the Panel A and Panel B results are finalized before computing the difference. Could you please suggest an approach? Thanks, Selvam.
Hello all, can someone please help me with my query? I am using "base | stats count by field1", but I would like to add field2 to this query as well, in the form of a table. Please provide your valuable suggestions.