All Posts

Did you run the search again and get the same message?  If so, what did you find in search.log?
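If digging the job's search.log out of the Job Inspector is awkward, a rough first check (assuming you can search the _internal index) is to look for splunkd warnings or errors around the time the search ran, for example:

index=_internal sourcetype=splunkd (log_level=ERROR OR log_level=WARN)
| stats count by component

That won't replace search.log, but it often points at the peer or component behind the partial results.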
I am getting this error: "may have returned partial results. Try running your search again. If you see this error repeatedly, review search.log for details or contact your Splunk administrator." Can I please get a solution for this?     Thanks, sahitya
Ah, backups.  Splunk has this documented, so I'll just point you to their docs on "Backup and restore Splunk DB Connect version 3.10.0 or higher" Hope that helps! -Rich
Excellent point.  My answer should use list rather than values.
I used this search and it did work. However, something I probably should have mentioned earlier is that multiple hosts and users are linked to the same external IP, so I am now getting multivalue fields for Hostnames and Users. Is there anything that can be done about that? Or should I combine the two fields beforehand, then split them after the eventstats command?
Thank you for the response! I tried something like this (maybe not exactly the same) before posting, and it didn't work. However, this time I pasted yours and, after a slight change, it works! Now it's like: if(SDLC=="pm","ctpm",SDLC) So it seems I cannot use $ and quotes. After removing them, it's good!
Hi Guys, I want to show two field values in a single column of a table. The query and sample logs are given below.

index="mulesoft" applicationName="api"
| spath content.payload{}
| mvexpand content.payload{}
| transaction correlationId
| rename "content.payload{}.AP Import flow processing results{}.requestID" as RequestID "content.payload{}.GL Import flow processing results{}.impConReqId" as ImpConReqId content.payload{} as response
| eval OracleRequestId="RequestID: ".RequestID." ImpConReqId: ".ImpConReqId
| table response OracleRequestId

Actual Result
Row 1: response = GL Import flow related results : Extract has no GL records to Import into Oracle { "AP Import flow processing results" : [ { "concurBatchId" : "395", "requestID" : "101660728", "returnMessage" : null, "returnStatus" : "SUCCESS", "insertCount" : "72", "fileName" : "CONCUR_APAC_APINV_1711965640.csv" } ] }
       OracleRequestId = (empty)
Row 2: response = { "AP Import flow processing results" : [ { "concurBatchId" : "393", "requestID" : "101572722", "returnMessage" : null, "returnStatus" : "SUCCESS", "insertCount" : "66", "fileName" : "CONCUR_APAC_APINV_1711620043.csv" } ] } { "GL Import flow processing results" : [ { "concurBatchId" : "393", "batchId" : "6409", "count" : "5", "impConReqId" : "101572713", "errorMessage" : null, "filename" : "CONCUR_APAC_GLJE_51711620043.csv" } ] }
       OracleRequestId = RequestID: 101572722 ImpConReqId: 101572713

Expected Result
Row 1: response = GL Import flow related results : Extract has no GL records to Import into Oracle { "AP Import flow processing results" : [ { "concurBatchId" : "395", "requestID" : "101660728", "returnMessage" : null, "returnStatus" : "SUCCESS", "insertCount" : "72", "fileName" : "CONCUR_APAC_APINV_1711965640.csv" } ] }
       OracleRequestId = requestID:101660728
Row 2: response = { "AP Import flow processing results" : [ { "concurBatchId" : "393", "requestID" : "101572722", "returnMessage" : null, "returnStatus" : "SUCCESS", "insertCount" : "66", "fileName" : "CONCUR_APAC_APINV_1711620043.csv" } ] } { "GL Import flow processing results" : [ { "concurBatchId" : "393", "batchId" : "6409", "count" : "5", "impConReqId" : "101572713", "errorMessage" : null, "filename" : "CONCUR_APAC_GLJE_51711620043.csv" } ] }
       OracleRequestId = RequestID: 101572722 ImpConReqId: 101572713
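One idea I have not verified yet: since "." concatenation returns null when either field is null, maybe building OracleRequestId null-safely would give the expected first row, something like:

| eval OracleRequestId=ltrim(coalesce("RequestID: ".RequestID, "").coalesce(" ImpConReqId: ".ImpConReqId, ""))

so a row that only has RequestID (or only ImpConReqId) still shows that part instead of an empty value.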
Hi @dondef , I know it is a couple of years later, but for anyone who might need this: I enabled the Splunk integration manually and that worked successfully. I would try enabling it manually via the curl call. https://www.bitdefender.com/business/support/en/77211-171475-splunk.html
The problem with using values() is that each multivalue field is independently sorted lexicographically (and deduplicated), so the original relationship between the values from the same event is lost before the mvzip/mvexpand/mvindex fixup.
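Something along these lines (an untested sketch, using the field names from the searches in this thread) keeps the values paired by using list() for the per-event fields before the mvzip:

(index=index1 sourcetype=sourcetype1) OR (index=index2 sourcetype=sourcetype2)
| rename jsonevent.external_ip as exip
| rename aip as agentip
| eval external_ip = coalesce(agentip, exip)
| stats values(jsonevent.hostname) as Hostnames, values(jsonevent.Username) as Usernames, list(AppVendor) as Vendors, list(AppName) as Applications, list(AppVersion) as Version by external_ip
| eval tuple=mvzip(Vendors, mvzip(Applications, Version))
| mvexpand tuple
| eval tuple=split(tuple, ",")
| eval Vendors=mvindex(tuple, 0), Applications=mvindex(tuple, 1), Version=mvindex(tuple, 2)
| fields - tuple
| table external_ip Hostnames Usernames Vendors Applications Version

Unlike values(), list() preserves the original event order (and duplicates), so the vendor/application/version values from the same event stay at the same index.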
Try something like this
<change>
  <eval token="new_sdlc">if("$SDLC$"=="pm","ctpm","$SDLC$")</eval>
</change>
The mvexpand command will split the multi-value fields into separate events.  The problem is doing so breaks the relationships with other multi-value fields.  To work around that, combine the multi-value fields into a single multi-value field, use mvexpand, then split the fields apart.
| rename jsonevent.external_ip as exip
| rename aip as agentip
| eval external_ip = coalesce(agentip, exip)
| stats values(jsonevent.hostname) as Hostnames, values(jsonevent.Username) as Usernames, values(AppVendor) as Vendors, values(AppName) as Applications, values(AppVersion) as Version by external_ip
| eval tuple=mvzip(Hostnames, mvzip(Usernames, mvzip(Vendors, mvzip(Applications, Version))))
| mvexpand tuple
| eval tuple=split(tuple, ",")
| eval Hostnames=mvindex(tuple, 0), Usernames=mvindex(tuple, 1), Vendors=mvindex(tuple, 2), Applications=mvindex(tuple, 3), Version=mvindex(tuple, 4)
Try something like this
(index=index1 sourcetype=sourcetype1) OR (index=index2 sourcetype=sourcetype2)
| rename jsonevent.external_ip as exip
| rename aip as agentip
| eval external_ip = coalesce(agentip, exip)
| eventstats values(jsonevent.hostname) as Hostnames, values(jsonevent.Username) as Usernames by external_ip
| rename AppVendor as Vendors, AppName as Applications, AppVersion as Version
| where isnotnull(Vendors)
| table external_ip Hostnames Usernames Vendors Applications Version
Hi @Felipe.Windmoller, Thanks so much for following up with all this additional information! 
I have this query (below):
1) When I run this query in Splunk Web, I get back an SID and get data using the SID.
2) When I use a curl command, I get back an SID and get data using the SID.
3) But when I use Python, I get the SID in the response with status code 201, yet fetching the results fails.

# read this query from a file
with open("aquery2.txt", "r") as f:
    aQuery = f.read()

# derive earliest and latest
finalAQuery = "search " + "earliest=" + "1711982700.001" + " " + "latest=" + "1711983600.0" + " " + aQuery

url = "https://abc.splunkcloud.com:8089/servicesAB/-/xyz/search/jobs"

def getSid():
    try:
        response = requests.post(url, headers={'Authorization': TOKEN}, data={'search': finalAQuery}, verify=False)

I get back the SID. But when I use the SID to get the results, I get error 404:
<Response [404]> {"messages":[{"type":"FATAL","text":"Unknown endpoint."}]}

def getMetric():
    try:
        getData = url + '/' + sid + '/results'
        getSidResponse = requests.get(getData, headers={'Authorization': TOKEN}, data={'output_mode': 'json'}, verify=False)

aquery.txt contents below:

index=apigee sourcetype="apigee:Prod_access_logs" | rex field=proxyUri "(?P<proxyUri>(([a-zA-Z]+)\d)(?:\d\/[a-zA-Z]+|\/[a-zA-Z]+)+)" | convert num("requestTimeinSec") |rex field=_raw "(?<timeStamp>\d{4}\-\d{1,2}\-\d{1,2}T\d{1,2}\:\d{1,2}\:\d{1,2}\-\d{1,2}\:\d{1,2})\s+(?<hostValue>\w+)\s+\S+\s+\S+\s+(?<requestTimeinSec>\S+)\s+\-\s+-\s+(?P<httpStatusCode>\w+)\s+(?<upstreamHttpStatusCode>\w+)\s+\w+\s+\w+\s+(?<methodName>\w+)\s+(?<proxyUri>\S+)\s+(?<httpVersion>\S+)\s+(?<messageId>\S+)" |rex field=_raw "^([^\t]+\t){35}(?P<ClientId>[^\t]+)" | eval totalResponseTime=round(requestTimeinSec*1000) | replace "z1/credit/bank/info/usa" with "x1/credit/bank/info/canada" in proxyUri | replace "v1/taste" with "/connecticut/taste/v1/newyork" in proxyUri | rangemap field="httpStatusCode" "httpStatusCode"=0-499 | rename range as RangeSuccesshttpStatusCode | eval Product=case(like(ClientId, "JERSEY"), "aaa", like(ClientId, "APPLE"), "bbb", like(ClientId, "HELLO"), "ccc") | eval ATier=case((like(proxyUri,"/paypal/jersey/v1/newyork") AND like(methodName,"POST") AND IN (Product, "aaa", "bbb", "ccc")) , "Tier1", (like(proxyUri,"v1/credit/bank/info/canada") AND like(methodName,"GET") AND IN (Product, "aaa", "bbb", "ccc")) OR (like(proxyUri,"v1/credit/accounts/profile") AND like(methodName,"GET") AND IN (Product, "Venmo", "Cobrand", "PPC")), "Tier2",(like(proxyUri,"v1/alerts/preferences") AND like(methodName,"GET") AND IN (Product, "Venmo", "Cobrand", "PPC")), "Tier3",1==1,"NA") | stats count(totalResponseTime) as TotalTrans, count(eval(RangeSuccesshttpStatusCode="httpStatusCode")) as TotalSuccesTran, count(eval(httpStatusCode<500)) as GoodEvents, by ATier Product proxyUri methodName | where ((Product IN ("aaa", "bbb", "ccc")) AND (ATier IN ("Tier1", "Tier2","Tier3"))) |rename methodName AS Method | fields ATier proxyUri Method TotalTrans GoodEvents
Hi @vik  , I'm encountering the same issue. Did you resolve it?
So, I have two indexes and sourcetypes with the following fields:

index1 and sourcetype1:
aip = 34.465.45.234
AppVendor = vendor1, vendor2, vendor3 (these are all from different events)
AppName = app2, app3, app1 (all from different events, corresponding by position to the vendors above)
AppVersion = 3.0343, 1.354, 2.5465 (same convention)

index2 and sourcetype2:
jsonevent.external_ip = 34.465.45.234
jsonevent.hostname = Host1
jsonevent.Username = User1

I use the following search:

(index=index1 sourcetype=sourcetype1) OR (index=index2 sourcetype=sourcetype2)
| rename jsonevent.external_ip as exip
| rename aip as agentip
| eval external_ip = coalesce(agentip, exip)
| stats values(jsonevent.hostname) as Hostnames, values(jsonevent.Username) as Users, values(AppVendor) as Vendors, values(AppName) as Applications, values(AppVersion) as Version by external_ip
| search Hostnames=* Users=* Vendors=* Applications=* Version=*

I get the following:

external_ip      Hostnames   Usernames   Vendors    Applications   Version
34.465.45.234    Host1       User1       Vendor1    app1           1.354
                                         Vendor2    app2           2.5465
                                         Vendor3    app3           3.0343

What I want is the following:

external_ip      Hostnames   Usernames   Vendors    Applications   Version
34.465.45.234    Host1       User1       Vendor1    app2           3.0343
34.465.45.234    Host1       User1       Vendor2    app3           1.354
34.465.45.234    Host1       User1       Vendor3    app1           2.5465

Does anyone have any ideas how to achieve this?
Hi community, I have a dropdown for environments like DEV/CT/PROD, and I save the selection into a token `SDLC`. Now I would like to define another token `new_sdlc`: it's "ctpm" when `SDLC` is "pm"; otherwise, it's the same value as `SDLC`. In the end I found a way that works, but it feels clumsy, simply because it seems "!=" is not allowed, so I have to list all the conditions. I've checked a few posts but didn't find a working and elegant way. I bet there is one. Looking forward to your help. Here is my code:

<fieldset submitButton="false">
  <input type="time" token="field1">
    <label></label>
    <default>
      <earliest>-24h@h</earliest>
      <latest>now</latest>
    </default>
  </input>
  <input type="dropdown" token="SDLC">
    <label>SDLC</label>
    <choice value="prod">PROD</choice>
    <choice value="ct">CT</choice>
    <choice value="pm">PM</choice>
    <default>prod</default>
    <initialValue>prod</initialValue>
    <change>
      <condition label="CT">
        <set token="new_sdlc">ct</set>
      </condition>
      <condition label="PM">
        <set token="new_sdlc">ctpm</set>
      </condition>
      <condition label="PROD">
        <set token="new_sdlc">prod</set>
      </condition>
    </change>
  </input>
</fieldset>
This seems to be a limitation on the Microsoft end: https://learn.microsoft.com/en-us/previous-versions/office/developer/o365-enterprise-developers/jj984335(v=office.15)?redirectedfrom=MSDN#data-granularity-persistence-and-availability

The `delay_throttle` option in the input can be used to control it, but the docs seem to say that events can be delayed by 24 hours on the Microsoft end, so that seems to be why the default is 24 hours.
@abhi_2985  The openssl commands needed for the conversion are straightforward to use. Start by opening your terminal and navigating to the directory with your CRT file. Here’s how to convert .crt to .pem:

openssl x509 -in certificate.crt -out certificate.pem -outform PEM

Replace ‘certificate.crt’ with your .crt file name and ‘certificate.pem’ with your desired .pem file name. This command will convert your .crt to .pem, providing you with a new file in .pem format.
ODBC is not a database but a standardized way of accessing a database - any database for which there is an appropriate ODBC driver. So there is no such thing as an "ODBC database". The same goes for JDBC - it's another abstraction layer providing a common interface to databases. The question is what database you have, what you want to do with it, and whether there is a JDBC driver for your particular database - because if there is, you can use DB Connect with the proper JDBC data source.