All Topics


I currently have 4 indexers in my Splunk deployment and am replacing them with new hardware. I am going to join the 4 new indexers to the existing indexer cluster and then ultimately retire the 4 old indexers once the data has been redistributed across the cluster. But once all of the indexers are in the same cluster, I seem to have two options (I think) for making sure that data is distributed across the new indexers:

Option 1: Rebalance data across all 8 indexers...

splunk rebalance cluster-data -action start

...and then retire the old indexers as normal.

Option 2: Put each old indexer into detention one by one and then retire it in the following way, which as I understand it will move data off the indexer in the process:

splunk offline --enforce-counts

I've read the documentation around these topics, but Option 2 was mentioned to me in a previous post, so I just wanted clarification. Many thanks.

Edit: Or, thinking about it some more, would I just use Option 1 to rebalance the data and then use Option 2 to remove the old indexers one by one?
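For what it's worth, the combined sequence described in the edit can be sketched as follows. This is a sketch only — where each command runs and the order of operations are assumptions, and the rebalance should be allowed to finish before any peer is taken offline:

```
# On the cluster master: rebalance data across all 8 peers
splunk rebalance cluster-data -action start

# Check progress until the rebalance completes
splunk rebalance cluster-data -action status

# Then, on each old indexer in turn, take it offline while
# preserving replication and search factor counts
splunk offline --enforce-counts
```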
Hi, I need to extract a value from a message field that contains multiple data values, like below:

message:{user: xxxx,age:yy,gender:xxxx, position:"nnnn", place:yyy}

From the above I need to extract the position value; any number of other values may be present after it, so I need to extract it by its name alone. Note that the same value can also appear under the name designation instead of position.
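A minimal sketch of one way to do this with rex, assuming the field really is named message and the value is delimited by a comma or closing brace (both assumptions from the sample above):

```
... | rex field=message "(?:position|designation)[:=]\s*\"?(?<position>[^,\"}]+)"
| table position
```

The alternation (?:position|designation) covers both names the value may appear under.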
I have a query structured like below, with a main search and a subsearch, where the main search includes a lookup:

|inputlookup tci |search tag.name="ap" |rename tag.name as tags |dedup indicator |table indicator confidence rating ownerName tags |union [search sourcetype="cisco:*" action=allowed |rename src_ip as indicator |dedup indicator |table indicator confidence rating ownerName tags] |stats count values(confidence) as confidence values(rating) as rating values(ownerName) as ownerName values(tags) as tags by indicator |where count>1 |table indicator confidence rating ownerName tags

I want the results of this query to be used as a lookup into one more sourcetype, and to pull out the raw data. I have tried the below, but it doesn't work:

sourcetype="symantec:*" [|inputlookup tci |search tag.name="ap" |rename tag.name as tags |dedup indicator |table indicator confidence rating ownerName tags |union [search sourcetype="cisco:*" action=allowed |rename src_ip as indicator |dedup indicator |table indicator confidence rating ownerName tags] |stats count values(confidence) as confidence values(rating) as rating values(ownerName) as ownerName values(tags) as tags by indicator |where count>1 |table indicator confidence rating ownerName tags] |table _raw

Please suggest any alternatives for searching a sourcetype where the search terms have to be derived from nested subsearches with lookups.
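One common pattern is to have the subsearch emit only the field you want to match, renamed to search so the outer search treats the values as raw search terms. A sketch under that assumption (field names taken from the post, inner pipeline elided):

```
sourcetype="symantec:*"
    [ |inputlookup tci |search tag.name="ap"
      ...
      |rename indicator AS search
      |fields search
      |format ]
| table _raw
```

By default a subsearch returns fieldname=value pairs, so extra fields like tags in the returned table would never match raw symantec events; restricting the output with fields plus format avoids that.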
Hi, does anyone know how to onboard JBoss application servers to Splunk Enterprise? I need a tutorial for the configuration, with explanations. Please help.
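In the simplest case, onboarding usually means installing a Universal Forwarder on the JBoss host and monitoring its log files. A minimal inputs.conf sketch — the log path, sourcetype, and index here are assumptions, so adjust them to your JBoss layout:

```
[monitor:///opt/jboss/standalone/log/server.log]
sourcetype = jboss:server
index = jboss
disabled = 0
```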
Hi, we are able to fetch update logs from our WSUS server using the Splunk Add-on for Microsoft Windows. However, we want to display the approved/unapproved status of updates in Splunk itself, without having to go to the server. Any suggestions?
I have some error keywords that all appear in the raw event data. I put them in a lookup file named mylookup.csv. Now I need to get an email alert whenever one of the words in that file appears in an event. Thanks in advance.
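A minimal sketch of a search you could save as an alert with an email action. It assumes the lookup has a single column named keyword and that the events live in index=main — both assumptions to adapt:

```
index=main
    [ |inputlookup mylookup.csv
      |rename keyword AS search
      |fields search
      |format ]
```

Renaming the column to search makes the subsearch return the keywords as raw search terms, ORed together by format.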
Hello Team, in my org the certs below were installed on particular roles. Looking at the table below, I need to know which category each falls into. Can anyone please explain how this works? We are checking this link but do not understand it: About securing Splunk Enterprise with SSL - Splunk Documentation

Role: Internal Heavy Forwarders
Cert: /opt/splunk/etc/auth/myServerCertificate.pem, /opt/splunk/etc/auth/rootCA.pem

Roles with no cert listed: SH cluster, Cluster Master, DMZ HF, DS, ES SH Deployer, HF Cluster, IDX Cluster, Monitoring Console

Role: ES SH Cluster
Cert: /opt/splunk/etc/auth/webcerts/mySplunkWebCertificate.pem, /opt/splunk/etc/auth/myServerCertificate.pem, /opt/splunk/etc/auth/rootCA.pem
Hi, a new user here on Splunk. I've spent 4 hours going through multiple Splunk documents and I'm going in circles. Maybe someone can point me in the right direction to get started.

We have a new Splunk Cloud account, and I am trying to get my Cisco ASA and pfSense logs into Splunk Cloud. I installed the Splunk forwarder on a Windows server, but I can't figure out how to get the logs to the forwarder and then on to Splunk Cloud. I pointed the ASA's syslog server setting at the IP of the Splunk forwarder, but the forwarder doesn't seem to be picking it up.

PS: I already installed the .spl credentials package on the forwarder and restarted the service. (I believe that's all that is needed for the forwarder to send data to the cloud, right?)

Thank you for any help I can get.
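Pointing a device's syslog output at a forwarder does nothing unless the forwarder has a matching network input. A minimal inputs.conf sketch for the forwarder — the port, sourcetype, and index are assumptions, so match them to the ASA's syslog settings:

```
[udp://514]
sourcetype = cisco:asa
index = network
```

A dedicated syslog server (or Splunk Connect for Syslog) writing to files that the forwarder monitors is generally considered more robust than a UDP input on the forwarder itself.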
Hello experts, my Splunk search can return only a list of group IDs, but the group names can only be found separately. There is a groups.csv file which maps id to name:

groupid,groupname
"a1234","apple"
"b2345","balloons"
"c1144","cats"

How can I write the query to return each group id and the corresponding group name?

index=myidx type=groups | table _time groupid groupname

Thanks a lot!
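Assuming groups.csv has been uploaded as a lookup file on the search head (and shared with the app), a sketch:

```
index=myidx type=groups
| lookup groups.csv groupid OUTPUT groupname
| table _time groupid groupname
```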
Hello guys... we need some help, as always. We are a bunch of Splunk noobs and we want to create some basic dashboards about local performance (disk, CPU, memory) and dashboards about a few of the most important Windows event logs. Any idea how to start? I've been reading docs, forums, etc., but since it's so basic, no one seems to talk about it, lol. Hope you can give me a hand. We are using Splunk Enterprise on a local Windows 10 machine just to get our hands dirty and learn the basics, as you can see. Thank you again, and happy Halloween!
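The usual starting point is the Splunk Add-on for Microsoft Windows, which ships perfmon and event log inputs that you enable in inputs.conf. A minimal sketch — the counter choice and interval are illustrative assumptions:

```
[perfmon://CPU]
object = Processor
counters = % Processor Time
instances = _Total
interval = 30

[WinEventLog://System]
disabled = 0
```

Once data is flowing, a simple timechart over the collected counter values can back a first dashboard panel.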
Hi there, any guidance on how to find common values that start with the same characters, across two different sources?

Example:
Source 1, field SerialId, value: 123_abc
Source 2, field SerialId, value: 123_abcde

So if the values share the same first 6 letters and numbers, find those that match. Any advice on how to approach this?
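One sketch, assuming both sources populate SerialId and a 6-character prefix is the match key (the source filter is a placeholder for your real source names):

```
(source=1 OR source=2) SerialId=*
| eval prefix=substr(SerialId, 1, 6)
| stats dc(source) AS sources values(SerialId) AS SerialId BY prefix
| where sources > 1
```

dc(source) counts distinct sources per prefix, so the where clause keeps only prefixes that appear in both.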
I'm trying to use a key across three sourcetypes to show unique, non-multivalue rows using a stats by clause, where each sourcetype has a different field:

Sourcetype A: NumberA (key), Date (by clause)
Sourcetype B: NumberB (key), Username (by clause)
Sourcetype C: NumberC (key), Version (by clause)

If you use the number field (the key across the sourcetypes) as the stats by clause and add the different sourcetype fields as values(), it produces multivalue fields (e.g. a number may have multiple dates, or users), whereas I'm looking for unique rows showing number, Date, Username, Version. E.g.:

sourcetype=A OR sourcetype=B OR sourcetype=C
| eval number=coalesce(NumberA, NumberB, NumberC)
| stats values(sourcetype) values(Date) values(Username) values(Version) by number

I would have thought that you could add the different fields to the stats by clause after the key, but that returns nothing:

sourcetype=A OR sourcetype=B OR sourcetype=C
| eval number=coalesce(NumberA, NumberB, NumberC)
| stats values(sourcetype) by number Date Username Version

Would this make sense, and is it possible?
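The by-clause variant returns nothing because stats drops any event that is missing one of the by fields, and no single event here carries Date, Username, and Version at once. One sketch that keeps one row per key and then expands any remaining multivalues into separate rows (field names from the post):

```
sourcetype=A OR sourcetype=B OR sourcetype=C
| eval number=coalesce(NumberA, NumberB, NumberC)
| stats values(Date) AS Date values(Username) AS Username values(Version) AS Version BY number
| mvexpand Date
| mvexpand Username
```

mvexpand produces one row per combination, so apply it only to the fields that genuinely need separate rows.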
    index=IndexName | table username ip_address_new id_new desti | lookup file.csv user as username OUTPUT user id_old ip_address_old | where NOT (id_new = id_old AND ip_address_new = ip_address... See more...
    index=IndexName | table username ip_address_new id_new desti | lookup file.csv user as username OUTPUT user id_old ip_address_old | where NOT (id_new = id_old AND ip_address_new = ip_address_old AND username = user)   Can I combine "where" and "if" command together Or do something like this need to write something like this if  id_new != id_old:      | eval match_id = not match id elif username != user:      | eval match_user = not match user elif ip_address_new != ip_address_old:       | eval match_ip = not match IP address  else:       | eval ....
Hi all, I am trying to change the behavior of the chart legend so that it is NOT clickable. I have the below properties related to the bar chart:

<option name="charting.chart">column</option>
<option name="charting.drilldown">all</option>
<option name="charting.legend.labelStyle.overflowMode">ellipsisEnd</option>
<option name="charting.legend.mode">standard</option>
<option name="charting.legend.placement">right</option>
<option name="trellis.enabled">0</option>
<option name="refresh.display">progressbar</option>

Any suggestions, please? I am not seeing any other legend-related options in the Splunk docs.
I'm writing a Python script that assigns multiple roles to a user, but I'm having difficulty understanding what the roles data structure needs to look like. According to the REST documentation for authentication/users/{name}:

To assign multiple roles, pass in each role using a separate roles parameter value. For example, -d roles="role1", -d roles="role2".

In Postman, I can successfully construct a request with multiple roles parameters to produce the result I want, which is to assign multiple roles to the user. In Python, my code looks like this:

(response, content) = h.request(HOST + URL + OUTPUT_MODE, 'POST', headers=HEADERS, body=urllib.parse.urlencode({'roles': 'admin', 'roles': 'user'}))

But the end result is that the user is only assigned the 'user' role, presumably because the body data structure ends up being a dictionary with a single key: {'roles': 'user'}. Does anybody know the right Python data structure to pass to urlencode so that I can add multiple roles to the user in a single POST?
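The diagnosis in the post is right: a dict keeps one value per key, so the second 'roles' silently overwrites the first. urlencode accepts either a sequence of (key, value) tuples or, with doseq=True, a dict whose value is a list — both produce a repeated parameter:

```python
from urllib.parse import urlencode

# A sequence of tuples preserves repeated keys.
body = urlencode([("roles", "admin"), ("roles", "user")])
print(body)  # roles=admin&roles=user

# Equivalently, a list value plus doseq=True expands to repeated keys.
body2 = urlencode({"roles": ["admin", "user"]}, doseq=True)
print(body2)  # roles=admin&roles=user
```

Either string can be passed as the request body in place of the single-key dict.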
Hi, I have several files in "myindex". When I set the time range to "yesterday", I expect to see just yesterday's files, but it sometimes returns files older than yesterday! E.g. today is 10/31/2021, and I run this SPL with the time range set to yesterday:

command:
| metadata type=sources index=myindex

output:
/app/20211031/server1.20211031.zip
/app/20211031/server2.20211031.zip
/app/20211025/server2.20211025.zip

FYI: the modification date of the file server2.20211025.zip is 20211025.

Any idea? Thanks,
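Worth noting that | metadata works from bucket-level index metadata rather than filtering individual events, so sources whose buckets merely overlap the selected time range can appear even when their events fall outside it. A tstats sketch that filters on actual event times instead (index name from the post):

```
| tstats latest(_time) AS lastTime WHERE index=myindex earliest=-1d@d latest=@d BY source
| convert ctime(lastTime)
```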
On the search head, search queries are intermittently not completing and fail to fetch data. The error shows that the search can't be created, along with the warnings below:

1. Expected common latest bundle version on all peers after sync replication, found none. Reverting to old behavior: using most recent bundle on all.

2. Unable to distribute to peer named gbl20051204 at uri https:// because replication was unsuccessful. Replication status: Failed. Failure info: Failed_because_HTTP_Error_Code. Verify connectivity to the search peer, that the search peer is up, and that an adequate level of system resources is available.
Hi! My setup has a log archive account using AWS Landing Zone, where all the CloudTrail and VPC Flow Logs from multiple accounts get aggregated and stored in an S3 bucket. I want to send both kinds of logs to a Splunk HEC. Which architecture pattern is best suited for this?
Hi, I have a field called "servername" that returns this:

...| table servername

server1
server2
server3

I need SPL that, when I give it a list of server names, returns which names do not exist in the data.

Expected output for:

...|search server1 OR server2 OR server3 OR server4 | table servername status

servername    status
server4       X

Any idea? Thanks
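One sketch: build the expected list with makeresults, append it to what the data actually contains, and keep the names seen only in the expected list (the index producing servername is an assumption):

```
index=myindex
| stats count BY servername
| append
    [| makeresults
     | eval servername=split("server1,server2,server3,server4", ",")
     | mvexpand servername
     | fields servername]
| stats sum(count) AS count BY servername
| where isnull(count)
| eval status="X"
| table servername status
```

sum(count) is null for names that came only from the makeresults list, which is what the where clause keys on.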
I am trying to pull incidents resolved by each user, by date. Can anyone help me form the below table, with counts?

User Name   10/4/2021   10/5/2021   10/6/2021   Grand Total
AAAA                    3                       3
BBBBB       2                                   2
CCCCC       3           1                       4
DDD         1                                   1
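A sketch with chart plus addtotals, assuming an index, a resolved-status field, and a user field (all assumptions to adapt to your data):

```
index=incidents status=resolved
| eval day=strftime(_time, "%m/%d/%Y")
| chart count OVER user BY day
| addtotals fieldname="Grand Total"
```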