All Posts

Perfect, thank you!
Is it a typo in the post, or is it just not valid JSON? (The comma after "sourcetype" should be a colon.)
Hi @lyngstad, Based on your search results screenshot, you get the values from the by clause, but the air_temp calculation is missing, so I guess the trick is to get the datamodel field into a measurable (numeric) form (so yes, it may be related to the double quotes around the values). Can you try this? It fetches the raw strings via values() so we can convert them to numbers, remove the quotes, and then average the metric.

| tstats values(Climate.air_temp) as air_temp_raw from datamodel="Climate" where sourcetype="TU_CLM_Time" host=TU_CLM_1 by host _time span=60m
| eval air_temp_numeric = tonumber(trim(air_temp_raw, "\""))
| stats avg(air_temp_numeric) as air_temp by host _time
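If the quotes should never reach the index at all, one index-time alternative (a sketch, assuming the raw DB Connect events really are key="value" pairs as described) is to strip them with a SEDCMD in props.conf on the indexer or heavy forwarder that parses the input. Note this only affects newly indexed data:

# props.conf on the parsing tier (sketch; the sed expression assumes values are wrapped as key="value")
[TU_CLM_Time]
SEDCMD-strip_value_quotes = s/="([^"]*)"/=\1/g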
Hello, Thank you @mattymo for taking the time to provide input on my question. These are the answers I received from the Splunk team after discussing your questions with them.

Is it a single search head or search head cluster? - No cluster, all single search heads
Have the indexers already been migrated to Azure? - No indexers have been migrated yet
Will the SH(C) be searching Azure or On-Prem indexers as well? - The SH will be searching Azure indexers
What "components" do you rely on most on this SH(C)? Premium apps like ES or ITSI, or just Splunk Enterprise apps? - The SH will rely on both premium apps and Splunk Enterprise apps
index=idx1 source="file1.log" sysname IN ("SYS1","SYS6")
| table sysname value_name value_info
| eventstats dc(value_info) as distinct_values by value_name
| where distinct_values > 1
| xyseries value_name sysname value_info
Hello, I get a "Failed processing http input" error when trying to collect the following JSON event with indexed fields: {"index" : "test",  "sourcetype", "test", "event":"This is a test", "fields" : { "name" : "test" , "values" : {}  }} The error is: "Error in handling indexed fields". Could anyone explain the reason for the error? Is it that a value inside "fields" cannot be an empty object? I can't prevent it at the source. Best regards, David
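For what it's worth, HEC's indexed-field extraction expects the values under "fields" to be strings (or arrays of strings), so an empty object for "values" would trigger exactly "Error in handling indexed fields". A sketch of a shape that should pass, with the comma after "sourcetype" also corrected to a colon; the empty string is an assumption about an acceptable placeholder when the source sends nothing:

{"index": "test", "sourcetype": "test", "event": "This is a test", "fields": {"name": "test", "values": ""}}

If the empty value carries no meaning, stripping the "values" key somewhere between the source and HEC (hence the question below about the collector) would be the cleaner fix.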
Hi, What collector are you using to ship the logs? 
After looking at some examples online, I was able to come up with the query below, which can display one or more columns of data based on the selection of "sysname". What we would like to do with this is optionally select just two sysnames and show only the rows where the values do not match.

index=idx1 source="file1.log" sysname IN ("SYS1","SYS6")
| table sysname value_name value_info
| eval {sysname}=value_info
| fields - sysname, value_info
| stats values(*) as * by value_name

The data format is below, and there are a couple hundred value_names for each sysname, with formats varying from integer values to long strings:

sysname, value_name, value_info

The above query displays the data something like this:

value_name    SYS1    SYS6
name1         X       Y
name2         A       A
name3         B       C
Hi! I think you are on the right track with field extraction and its behaviours. The search that works does so because it "looks for any match of your fruit string in the _raw event", whereas the ones you are struggling with look for a field=value pair, which does not actually exist in the raw event (there is no "ERROR="). Splunk would have to extract this to recognize it as a field. I would start with: what is the sourcetype of this data? Does it have any JSON parsing happening at search time, index time, or both? (HINT: KV_MODE = json / props.conf / transforms.conf) An easy way to start is: does the Splunk UI recognize this as properly formed JSON and show it to you "pretty printed"? Do you see the JSON key-value pairs extracted in "interesting fields"? If not, then we would need to extract them to be able to reference the fields and their values.
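For reference, the search-time JSON parsing that hint points at is a one-line props.conf setting on the search head; a minimal sketch, where the sourcetype name is hypothetical:

# props.conf on the search head (sketch; substitute your real sourcetype)
[my_json_events]
KV_MODE = json

With that in place, the JSON key-value pairs should show up in "interesting fields" and ERROR="..." style searches become possible.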
Good day, I am trying to figure out how I can join two searches to see if there is a ServiceNow ticket open for someone leaving the company and whether that person is still signing into some of our platforms.

This gets the sign-in details for the platform; as users might have multiple email addresses, I want them all:

index=collect_identities sourcetype=ldap:query
    [ search index=db_mimecast splunkAccountCode=* mcType=auditLog
    | fields user
    | dedup user
    | eval email=user, extensionAttribute10=user, extensionAttribute11=user
    | fields email extensionAttribute10 extensionAttribute11
    | format "(" "(" "OR" ")" "OR" ")" ]
| dedup email
| eval identity=replace(identity, "Adm0", "")
| eval identity=replace(identity, "Adm", "")
| eval identity=lower(identity)
| table email extensionAttribute10 extensionAttribute11 first last identity
| stats values(email) AS email values(extensionAttribute10) AS extensionAttribute10 values(extensionAttribute11) AS extensionAttribute11 values(first) AS first values(last) AS last BY identity

This checks all leavers in ServiceNow:

index=db_service_now sourcetype="snow:incident" affect_dest="STL Leaver"
| dedup description
| table _time affect_dest active description dv_state number

Unfortunately the Supporthub does not add the email to the description, only first names and surnames. So I would need to search the first query's 'first' and 'last' values against the second query to find leavers. This is what I tried, but it does not work:

index=collect_identities sourcetype=ldap:query
    [ search index=db_mimecast splunkAccountCode=* mcType=auditLog
    | fields user
    | dedup user
    | eval email=user, extensionAttribute10=user, extensionAttribute11=user
    | fields email extensionAttribute10 extensionAttribute11
    | format "(" "(" "OR" ")" "OR" ")" ]
    [ search index=db_service_now sourcetype="snow:incident" affect_dest="STL Leaver"
    | dedup description
    | rex field=description "*(?<first>\S+) (?<last>\S+)*"
    | fields first last ]
| dedup email
| eval identity=replace(identity, "Adm0", "")
| eval identity=replace(identity, "Adm", "")
| eval identity=lower(identity)
| stats values(email) AS email values(extensionAttribute10) AS extensionAttribute10 values(extensionAttribute11) AS extensionAttribute11 values(first) AS first values(last) AS last BY identity

Search one returns results like this:

identity: nsurname | email: name.surname@domain.com | extensionAttribute10: nsurnameT1@domain.com | extensionAttribute11: name.surname@consultant.com | first: name | last: surname

Search two will get all the tickets that were created for people leaving the company and will return results like this:

_time: 2024-10-31 09:46:55 | affect_dest: STL Leaver | active: true | description: Leaver Request for Name Surname - 31/10/2024 | dv_state: active | number: INC01

So the only way of matching would be to search the second query's description field for where first and last appear.

Expected output:

identity: nsurname | email: name.surname@domain.com | extensionAttribute10: nsurnameT1@domain.com | extensionAttribute11: name.surname@consultant.com | first: name | last: surname | _time: 2024-10-31 09:46:55 | affect_dest: STL Leaver | active: true | description: Leaver Request for Name Surname - 31/10/2024 | dv_state: active | number: INC01
identity: jdoe | email: john.doe@domain.com | extensionAttribute10: jdoeT1@domain.com | extensionAttribute11: jdoe@worker.com | first: john | last: doe | _time: 2024-11-11 12:46:55 | affect_dest: STL Leaver | active: true | description: John Doe Offboarding on  - 31/12/2024 | dv_state: active | number: INC02
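A possible approach (a sketch, not a tested solution): instead of a bare subsearch, extract the name from the ticket description, build a normalized name key on both sides, and join on it. The rex pattern below assumes the descriptions follow the "Leaver Request for Name Surname" wording in the first sample; descriptions with other wordings (such as the "John Doe Offboarding" example) would need their own patterns.

index=db_service_now sourcetype="snow:incident" affect_dest="STL Leaver"
| dedup description
| rex field=description "Leaver Request for (?<first>\w+) (?<last>\w+)"
| eval name_key=lower(first)." ".lower(last)
| join type=inner name_key
    [ search index=collect_identities sourcetype=ldap:query
    | eval name_key=lower(first)." ".lower(last)
    | stats values(email) AS email values(extensionAttribute10) AS extensionAttribute10 values(extensionAttribute11) AS extensionAttribute11 BY name_key identity ]
| table identity email extensionAttribute10 extensionAttribute11 first last _time affect_dest active description dv_state number

Note that join has subsearch result limits (around 50,000 rows by default), so for large identity sets a lookup-based approach may be more robust.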
To install an app from a file using the GUI requires signing in to Splunk. No other credentials are required. If you wish to install using the CLI, untar the app into /opt/splunk/etc/apps and restart the DS. Either way, you should then copy the created 100_splunkcloud directory to /opt/splunk/etc/deployment-apps. There are no commands to run to complete the installation. The DS does not use port 9997.
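A sketch of those CLI steps on the deployment server; the package filename is an assumption, substitute whatever the downloaded file is actually called:

# untar the app package (an .spl is a gzipped tarball), restart, then copy for deployment
tar -xzf splunkclouduf.spl -C /opt/splunk/etc/apps
/opt/splunk/bin/splunk restart
cp -r /opt/splunk/etc/apps/100_splunkcloud /opt/splunk/etc/deployment-apps/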
Hi @Siddharthnegi , this means that you have an issue with the account used for the connection. You probably used a domain user; try using a local user on the DB. Ciao. Giuseppe
Thank you, it's working.
| eval output=if(ABC="NO_Match" OR XYZ="NO_Match", "NO_Match", "Match")
Hi @PickleRick,  Actually, in phase 1 we are upgrading the RHEL version (6.10 to 9.x, the same OS, just a newer version). We are planning the RHEL upgrade on a new, separate server from scratch, and after the RHEL upgrade, in phase 2, we have a Splunk upgrade (8.2 to 9.x). So in phase 1 we are installing the existing Splunk version (8.2) on the new server, and we need to migrate the data from the existing server to the new server. Could you please help with this? Thank you in advance.
Here you will find the prerequisites, hardware requirements, and supported operating systems: Universal forwarder deployment prerequisites - Splunk Documentation
Can you please help me build an eval query?

Condition 1: ABC=Match, XYZ=Match, then the output of ABC compared to XYZ is Match
Condition 2: ABC=Match, XYZ=NO_Match, then the output of ABC compared to XYZ is NO_Match
Condition 3: ABC=NO_Match, XYZ=Match, then the output of ABC compared to XYZ is NO_Match
Condition 4: ABC=NO_Match, XYZ=NO_Match, then the output of ABC compared to XYZ is NO_Match
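For reference, since the output is Match only when both fields are Match, the four conditions collapse to a single test; the if() answer above does this, and an equivalent sketch using case() reads:

| eval output=case(ABC="Match" AND XYZ="Match", "Match", true(), "NO_Match")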
Hello, I have a DB Connect query that gets data from a database and then sends it to a Splunk index. Below are the query and also how it looks in Splunk. The data is being indexed as key=value pairs with double quotes around the "value". I have plenty of other data that does not come through DB Connect, and it doesn't have double quotes around the values. Maybe the quotes are there because I'm using DB Connect? Is it possible to index data from DB Connect without adding the quotes? When I try to search the data in Splunk I just don't get any data. I think it may have to do with the double quotes? I'm not sure. Here is the search string. The air_temp is defined in the Climate datamodel. The TA (air temperature) in the data is defined in props.conf with the right sourcetype TU_CLM_Time.

| tstats avg(Climate.air_temp) as air_temp from datamodel="Climate" where sourcetype="TU_CLM_Time" host=TU_CLM_1 by host _time span=60m ```Fetching relevant fields from CLM sourcetype in CLM datamodel.```
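A quick way to check whether the extracted value still carries the quotes (a sketch; the index name is a placeholder, and the field name air_temp is assumed from the datamodel mapping, so use whatever the raw extraction is actually called, possibly TA):

index=your_index sourcetype="TU_CLM_Time" host=TU_CLM_1
| head 5
| eval first_char=substr(air_temp, 1, 1), is_numeric=if(isnotnull(tonumber(air_temp)), "yes", "no")
| table _raw air_temp first_char is_numeric

If first_char is a double quote or is_numeric comes back "no", avg() has nothing numeric to work with, which would explain the empty results.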
Hey Splunk team, I'm facing an issue where Splunk fails to find certain key-value pairs in some events unless I use wildcards (*) in the value. Here's an example to illustrate the problem:

{ "xxxx_ID": "78901", "ERROR": "Apples mangos lemons. Banana blackberry blackcurrant blueberry.", "yyyy_NUM": "123456", "PROCESS": "orange", "timestamp": "yyyy-mm-ddThh:mm:ss" }

Query examples:

This works (using wildcards):
index="idx_xxxx" *apples mangos lemons*

These don't work:
index="idx_xxxx" ERROR="Apples mangos lemons. Banana blackberry blackcurrant blueberry."
index="idx_xxxx" ERROR=*apples mangos lemons*

The query below, using regex, does not include all error values when trying to find any value after the ERROR key:
index="idx_xxxx" | rex field=_raw "ERROR:\s*(?<error_detail>.+?)(?=;|$)" | table error_detail

Observations:
Non-Latin characters are not the issue; in other events, for example, Greek text in the ERROR field is searchable without wildcards.
This behavior is inconsistent: some events allow exact matches, but others don't.

Questions:
Could this issue stem from inconsistencies in the field extraction process?
Are there common pitfalls or misconfigurations during indexing or sourcetype assignment that might cause such behavior?
How can I debug and verify that the keys and values are properly extracted/indexed?

Any help would be greatly appreciated! Thank you!
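A sketch of a first debugging step: use the wildcard search that works to find the problem events, force JSON extraction with spath, and check whether ERROR comes out as expected:

index="idx_xxxx" *apples mangos lemons*
| spath
| eval error_extracted=if(isnotnull(ERROR), "yes", "no")
| table _raw ERROR error_extracted

If ERROR only appears after the explicit spath, search-time JSON extraction is likely missing or inconsistent for this sourcetype (see the KV_MODE = json hint in the reply above).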
"Login failed. The login is from an untrusted domain and cannot be used with Integrated authentication." This error is also shown when I try to save.