All Posts


Thank you all for this thread! I've been working with Splunk for close to 10 years and JUST ran into this today for the first time, and I thought I was losing my mind because I can't find any record of etc/system/default/deploymentclient.conf on any of my systems. Thank you for confirming that I'm not the only one who thinks this is really, really weird but apparently "normal".
Hi Team, I have set an alert for the query below: index="abc" "ebnc event did not balanced for filename" sourcetype=600000304_gg_abs_dev source!="/var/log/messages" | rex "-\s+(?&lt;Exception&gt;.*)" | table Exception source host sourcetype _time The search returned the results shown, and I configured an incident for the alert with the SAHARA Forwarder, but I am getting only 1 incident even though the statistics showed 6; 6 incidents should be created. The incidents are also arriving very late: if an event triggered at 8:20, the incident arrives at 9:16. Can someone guide me on this?
Instead of using nomv, try this | eval merged_environment=mvjoin(merged_environment, ",")
I have a SH cluster with 3 servers, but I'm getting a lot of replication errors because the datamodels fill up the dispatch directory. How are jobs released from dispatch? Are files cleaned automatically? There are many bad alloc errors. Thanks.  
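For what it's worth: dispatch artifacts are normally reaped automatically by splunkd once their TTL expires (the TTL comes from the saved search's dispatch.ttl, or from alert.expires for alerts that retain artifacts). If the directory has already ballooned, Splunk's troubleshooting docs describe a clean-dispatch command; a hedged sketch follows — verify the exact syntax against your version's documentation before running it:

```
# Move dispatch artifacts last used more than 24 hours ago out of the
# dispatch directory into a backup location (run on each search head).
$SPLUNK_HOME/bin/splunk cmd splunkd clean-dispatch /tmp/dispatch-backup/ -24h@h
```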
Thanks for the explanation! This is not working: | inputlookup fileB.csv table A E F | lookup fileA.csv A OUTPUT E This is the actual query, but it is not returning the proper results.
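One thing worth checking (a sketch, assuming A, E, and F are the actual column names in your CSVs): table is its own search command, so it needs its own pipe. As written, "table A E F" is being passed as arguments to inputlookup rather than running as a command. Something like:

```
| inputlookup fileB.csv
| table A E F
| lookup fileA.csv A OUTPUT E
```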
Hi @bishopolis - I'm a Community Moderator in the Splunk Community. This question was posted 12 years ago, so it might not get the attention you need for your question to be answered. We recommend that you post a new question so that your issue can get the visibility it deserves. To increase your chances of getting help from the community, follow these guidelines in the Splunk Answers User Manual when creating your post. Thank you!
@ITWhisperer Thank you!! It works. Since I am passing this token in the mail subject, can I separate it by comma or hyphen?
I'm trying to troubleshoot some Windows Event Log events coming into Splunk. The events are stream processed and come in as JSON. Here is a sample (obfuscated). {"Version":"0","Level":"0","Task":"12345","Opcode":"0","Keywords":"0x8020000000000000","Correlation_ActivityID":"{99999999-9999-9999-9999-999999999999}","Channel":"Security","Guid":"99999999-9999-9999-9999-999999999999","Name":"Microsoft-Windows-Security-Auditing","ProcessID":"123","ThreadID":"12345","RecordID":"999999","TargetUserSid":"AD\\user","TargetLogonId":"0xXXXXXXXXX"} There are a number of indexed fields as well, including "Computer" and "EventID". What's interesting: signature_id seems to be created, but when I search on it, the search fails. In this event, signature_id is shown under "Interesting Fields" with the value 4647, but if I put signature_id=4647 in the search line, it comes back with no results. If I put EventID=4647, it comes back with the result. I'm using Smart Mode. This led me to dig into the field configurations (aliases, calculated fields, etc.), but I couldn't figure out how signature_id was created in the Windows TA. Can anyone provide any insight? Thank you! Ed
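One common cause worth ruling out: signature_id may be produced by a search-time alias, eval, or automatic lookup that is scoped to the TA's own sourcetypes or eventtypes, and stream-processed JSON with a custom sourcetype may only pick it up in some contexts, which can make it visible under Interesting Fields while a bare filter on it finds nothing. As a purely hypothetical illustration of the kind of stanza to look for in props.conf (not the actual contents of Splunk_TA_windows):

```
# Hypothetical example only -- inspect the real props.conf/transforms.conf
# in the Windows TA to see how signature_id is actually derived.
[your_stream_sourcetype]
FIELDALIAS-signature_id = EventID AS signature_id
```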
Good day, ladies and gentlemen! It's my first Dashboard Studio experience, and one (1) space boggles me. I have a datasource that works: "ds_teamList": { "type": "ds.search", "options": { "query": "host=\"splunk.cxm\" index=\"jira\" sourcetype=\"csv\" \"Project name\"=\"_9010 RD\" \n| rename \"Custom field __Team\" as TEAM\n| table TEAM\n| dedup TEAM \n| sort TEAM" }, "name": "teamList" } A multiselect input that lists the correct data, with one (1) team name containing a space: "input_TEAM": { "options": { "items": [{}], "token": "ms_Team", "clearDefaultOnSelection": true, "selectFirstSearchResult": true }, "title": "TEAM", "type": "input.multiselect", "dataSources": { "primary": "ds_teamList" } } A chain search that uses the ms_Team token: | search TEAM IN ($ms_Team$) | search CLIENT IN ($dd_Client$) | search Priority IN ($ms_priority$) | chart count over Status by TEAM The result shows all the data correctly, except for the team that has a space in its name. I know that if I could add double quotes around the team name containing the space, it would work: | search TEAM IN (Detector,Electronic,Mechanical,Software,"Thin film") but I cannot find a solution for this seemingly minor issue. Either this is a bug, or this is not the way I'm supposed to use Dashboard Studio. I searched and tried many solutions about strings in tokens and searches; now I'm here for the first time. Any simple solution possible? Thank you! Sylvain
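One workaround that stays entirely in SPL (a sketch, not tested against your dashboard): have the populating search emit a pre-quoted value field, and point the multiselect's value at it, so every selected token value already carries its double quotes when $ms_Team$ expands:

```
host="splunk.cxm" index="jira" sourcetype="csv" "Project name"="_9010 RD"
| rename "Custom field __Team" as TEAM
| dedup TEAM
| sort TEAM
| eval TEAM_quoted="\"".TEAM."\""
| table TEAM TEAM_quoted
```

You would then use TEAM_quoted as the input's value field and TEAM as its label, so "Thin film" arrives in the token as "Thin film" with the quotes intact.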
Hi All,   My requirement is source data records data need to be encrypted. What does process need to follow? Is there any possibly  props.conf ?   Please help me the process.   Regards, Vij 
Thank you very much, it seems the previous recommendation works for me.
Thank you very much!   ... | eval output1=json(output) | eval output3 = json_extract(output1, "data.affected_items{}.id") | eval output3 = replace(output3,"[\[\]\"]","") | makemv output3 delim="," | mvexpand output3 | rename output3 as id   returns all ids in a column, so it seems it is what I need for further processing of this data. Thanks a lot!
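As a cross-check of what that path expression extracts, here is the same logic in plain Python, using a hypothetical payload shaped like the events above (not your actual data):

```python
import json

# Hypothetical payload with the same shape as the events discussed above.
raw = '{"data": {"affected_items": [{"id": "001"}, {"id": "002"}, {"id": "003"}]}}'
payload = json.loads(raw)

# Equivalent of json_extract(output1, "data.affected_items{}.id"):
# collect the "id" of every element of the affected_items array.
ids = [item["id"] for item in payload["data"]["affected_items"]]
print(ids)
```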
Space separated | stats values(environment) as merge_environment | nomv merge_environment
The link is dead, as is the link in the initial email: https://response.splunk.com/subscription There are obviously no links to disable marketing email on the dashboard either. How do I unsubscribe?
Perhaps if you shared your actual raw unformatted events (anonymised as appropriate) in a code block to preserve any formatting there might be in the event, we might be able to suggest something that might work with your data.
Thank you, but it doesn't work for some reason... | eval output1=json(output) | eval output3 = json_extract(output1, "data.affected_items{}.id") | table output3 works fine, but the result comes back in a single row, while | eval output1=json(output) | spath input=output1 path="data.affected_items{}.id{}" output=output3 | mvexpand output3 | table output3 shows "No results found."
OK so how do you want them combined e.g. do you want the times from the "unique objectIds" message and "data retrieved for Ids" message to be in the same row by object id? | spath uniqObjectIds{} output=uniqObjectIds | spath uniqueRetrievedIds{} output=uniqueRetrievedIds | spath eventBody.objectIds{} output=eventBodyObjectIds | eval eventBodyObjectIds=if(eventBodyObjectIds=="",null(),eventBodyObjectIds) | eval objectId=coalesce(eventBodyObjectIds,coalesce(uniqueRetrievedIds,uniqObjectIds)) | eval retrievedTime=if(msg=="data retrieved for Ids",time,null()) | eval uniqueTime=if(msg=="unique objectIds",time,null()) | stats values(retrievedTime) as retrievedTime values(uniqueTime) as uniqueTime by objectId
I have a field called environment which has values like dev, prod, uat, sit. Now I want to create a new field containing all the values of the environment field. Example (4 field values): environment dev prod uat sit After the query (1 field value, separated by any string): merge_environment= dev | prod | uat | sit How do I achieve this?
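The suggestions in this thread combine into one sketch (the delimiter is your choice; " | " is shown here to match the example):

```
| stats values(environment) as merge_environment
| eval merge_environment=mvjoin(merge_environment, " | ")
```

stats values() collapses the field into one multivalue result, and mvjoin turns that multivalue field into a single delimited string.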
Happy 12th Birthday, posting #107735 ! You're a tween now! Why, it seems like only yesterday we were commenting on how decade-old authentication code for Yum repo consumers makes the current auth wall completely pointless, and how easy it would be to set up a simple yum repo to make enterprise update staging and testing on-premise such a trivial thing. Now it's TWO decades old! Yay! Oh, how you've grown as the technology has aged. Remember all the times we've been told "we're just sorting it with [another group]" and progress went absolutely nowhere? Remember how we sadly pointed out the delayed development against its peers in that regard -- which is still a developmental delay today? This is your year, kiddo. Go on and be adequate!