All Topics

Hi, I am trying to capture all event="DcSyncs" events from my index. This index also contains event="DcID". The "DcSyncs" event can occur at any time (pretty often, though), but "DcID" occurs once every 8 hours. I am trying to get all "DcSyncs" events, take the HostName field of those results, and check whether that HostName also has a result for event="DcID"; if it does, filter it out of the results. To summarize: I am trying to collect all HostNames that have a "DcSyncs" event but no "DcID" event. I have this set up to run on an 8-hour interval, so I don't think I need any time logic in the search. I keep trying different variations, but I think I am way off. Any help is appreciated. index=MyIndex event="DcSyncs" | join HostName [search NOT index=MyIndex event="DcID"] | table _time HostName event
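One common pattern for "has event A but not event B" is a single stats-based search rather than join (join subsearches are subject to result limits). A sketch, assuming both event types live in the same index and share the HostName field:

```spl
index=MyIndex (event="DcSyncs" OR event="DcID")
| stats latest(_time) as _time values(event) as events by HostName
| where isnotnull(mvfind(events, "DcSyncs")) AND isnull(mvfind(events, "DcID"))
| table _time HostName
```

mvfind returns null when the value is absent from the multivalue field, so the where clause keeps only hosts that saw DcSyncs and never DcID in the search window.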
I have 3 columns that I'm using: URL, website, count. The URL column is too wide, and I would like to reduce just its displayed width while keeping all the text. I tried using the following in the dashboard source, but it didn't work: }, "options": { "columnFormat": { "URL": { "width": 100
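For a Dashboard Studio table, column widths go under the visualization's options.columnFormat. A minimal sketch, assuming Dashboard Studio (not Simple XML), that "URL" is the exact field name, and that width is in pixels; check the placement of the block against your dashboard's JSON structure:

```json
{
    "options": {
        "columnFormat": {
            "URL": {
                "width": 150
            }
        }
    }
}
```

Note that width only narrows the rendered column; long values wrap or truncate visually, and the underlying data is unchanged.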
I guess my real question is: how do I move Splunk from one company to another, including some but not all of the data and the indexes for the selected data? I see I can copy config and indexes from $SPLUNK_HOME, but indexes are (I guess) just metadata referencing other data. So a search will read the index, then use that to get the data to return and display. I am going to guess Splunk makes a copy of the indexed data, because data sources can disappear for various reasons, and that would not be ideal for later searches.
I am trying to add fields from a lookup table. However, the matching field is a multivalue field. I need to expand the matching field, but I do not know how to combine the lookup command with a multivalue command. Lookup file assets.csv:

ip, host
10.10.1.1|10.100.1.1|10.10.200.1, srv1
10.10.1.2|10.100.1.2|10.10.200.2, srv2

My original search returns an IP value. | [lookup assets.csv ip OUTPUT host | makemv delim="|" ip] does not work.
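Since the lookup command matches literal strings, one approach is to expand the multivalue ip column when loading the lookup, then join on the expanded values. A sketch; the index, sourcetype, and event field name ip below are placeholders:

```spl
index=myindex sourcetype=mydata
| join type=left ip [
    | inputlookup assets.csv
    | makemv delim="|" ip
    | mvexpand ip ]
| table ip host
```

mvexpand turns each pipe-delimited list into one row per IP, so every address in assets.csv becomes individually joinable. For repeated use, the expanded result could also be written back with outputlookup to a flattened lookup file.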
I have a list of IPs and want to check whether they are sending data to Splunk, using a single query. The devices in this list need troubleshooting. Is there a query I could run referencing this list to get an output of stats or something similar? Any guidance, please?
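Assuming the list is saved as a lookup (a hypothetical troubleshoot_ips.csv with one column, ip) and that the devices' IPs appear in the host field, tstats can check last-seen times cheaply and a left join keeps the silent devices visible:

```spl
| inputlookup troubleshoot_ips.csv
| rename ip as host
| join type=left host [
    | tstats latest(_time) as last_seen where index=* by host ]
| eval status=if(isnull(last_seen), "not reporting", "ok")
| eval last_seen=strftime(last_seen, "%F %T")
```

Devices with no events in the time range come back with status "not reporting", which is usually the troubleshooting list you want.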
I have Splunk logs as given below, and I want to display the fields between square brackets "[ ]" in a table as shown. Please advise. Expected query result in a table:

sqsMsgId | snsMsgId | requestId
dec6c564-9e1c-4d0f-8e5e-ac9dc7bdf14a | 7d81b4cf-43c0-5bb4-8370-ef064a78da16 | d487108c-863f-5ab2-96df-4b458f97c74e

My Splunk logs:

{"level":"info","message":"[sqsMsgId=dec6c564-9e1c-4d0f-8e5e-ac9dc7bdf14a | snsMsgId=7d81b4cf-43c0-5bb4-8370-ef064a78da16 | workItemKey=CAMP:MI4:ORG_ID:103857:7fbf0f46-4131-404d-9a13-57cdff7c473a | requestId=d487108c-863f-5ab2-96df-4b458f97c74e | status=SUCCESS | ags=CAMP | component=MI4 | duration=383]","requestId":"d487108c-863f-5ab2-96df-4b458f97c74e"}
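The bracketed key=value pairs can be lifted out with rex. A sketch; the index and sourcetype are placeholders, and it assumes the JSON message field is auto-extracted (otherwise run the same rex against _raw):

```spl
index=myapp sourcetype=myapp:logs
| rex field=message "sqsMsgId=(?<sqsMsgId>[^ |]+)"
| rex field=message "snsMsgId=(?<snsMsgId>[^ |]+)"
| rex field=message "\| requestId=(?<reqId>[^ |\]]+)"
| table sqsMsgId snsMsgId reqId
```

Extracting into reqId avoids clobbering the top-level requestId field that the JSON already carries.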
@jkat54  I ran into an error with a long data parameter: command="curl", field larger than field limit (10485760). I see this value in a bunch of internals.py files under csv.field_size_limit and changed every possible one to a smaller value to see if the error message changes, but it is still the same. Do you know which config controls this issue?
Do we have a Terraform provider for replicating Splunk alerts across multiple environments? We have search queries and alerts created in one environment. Can we promote the same alerts to different environments? Is there a way to automate this, or is there a Terraform provider for replicating alerts across environments?
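There is a Splunk-maintained Terraform provider (splunk/splunk) that exposes saved searches, which is what alerts are under the hood, so the same .tf files can be applied per environment. A hedged sketch; the attribute names below follow my understanding of the provider's splunk_saved_searches resource and should be verified against its documentation:

```hcl
resource "splunk_saved_searches" "example_alert" {
  name            = "Example alert"
  search          = "index=main error | stats count"
  cron_schedule   = "*/15 * * * *"
  is_scheduled    = true
  actions         = "email"
  action_email_to = "oncall@example.com"
}
```

With per-environment provider configuration (different Splunk endpoints and credentials), `terraform apply` promotes the same alert definitions to each environment.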
Hi All, I want to see which users are using Splunk the most. I am using the query below: |rest /services/authentication/users splunk_server=local Here I am getting all users, but I need the list of users who are using Splunk the most.
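The rest endpoint only lists accounts; actual usage is recorded in the _audit index. A sketch that counts searches per user over the selected time range (reading _audit requires appropriate permissions):

```spl
index=_audit action=search info=granted
| stats count as searches by user
| sort - searches
```

Sorting descending puts the heaviest users at the top.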
I have recently run into an issue with multiple "WARN HttpListener [HttpDedicatedIoThread:0] Socket error from [Search Head IP] while accessing /services/streams/search: broken pipe" for each Indexer in the Index cluster. There is a SH cluster, and a standalone SH. The standalone houses an app that does heavy backend searching. And the majority of the errors are from the standalone. When looking at the "I/O Operations per second" and "Storage I/O Saturation (cold/hot/opt)" from the Monitoring Console all instances are below 1%. I am not sure which settings to adjust to fix this error. Would adjusting the maxSockets and/or maxThreads in the server.conf help? Currently they are both set to default. Or should I be looking at values in limits.conf? This is happening on version 9.0.1. Any suggestions to help solve this would be much appreciated. Thanks!
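For what it's worth, a broken pipe on /services/streams/search usually means the search head end closed the connection (for example, a search finished or was cancelled mid-stream), so the warnings can be benign. If you do want to raise the splunkd HTTP server's capacity, maxSockets and maxThreads live under the [httpServer] stanza of server.conf on the indexers. The values below are illustrative only, not recommendations:

```conf
# server.conf on each indexer
[httpServer]
maxSockets = 2048
maxThreads = 1024
```

Correlating the warning timestamps with search lifecycle events in _audit from the standalone search head would confirm whether they line up with cancelled or finalized searches before any tuning.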
Is it possible to set Splunk's timezone for each user based on their metadata in their SSO profile they use to log into Splunk? I'm pretty sure I can automate this information on the SSO side in a way that I can't using role-based configurations in Splunk (why can't roles have default timezones???), the only issue is getting Splunk to accept the timezone value. Using Splunk Cloud, currently on 8.2.something Thanks!
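There is indeed no role-level timezone, but each user's timezone is stored per user in user-prefs.conf, which a provisioning script could write on Splunk Enterprise; on Splunk Cloud the filesystem is not accessible, so the same setting would have to go through the user-prefs REST endpoint or support. The on-disk shape, for reference:

```conf
# $SPLUNK_HOME/etc/users/<username>/user-prefs/local/user-prefs.conf
[general]
tz = America/New_York
```

The tz value must be an IANA timezone name; the one above is just an example.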
I have a lookup table that I want to use in a search, so I load the lookup table and use format. However, I noticed there is a limit of 50,000 rows in format's result. What config allows me to increase that, or to convert a column of values into a single string? | inputlookup test.csv | table count | format The lookup has 50,001 values in it, but format only goes up to 50,000.
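If the goal is a single string rather than a format-style clause, stats plus mvjoin sidesteps the format cap entirely. A sketch (note that values() dedupes and is itself bounded by maxvalues in limits.conf; I believe the format cap is maxresults under [format] in limits.conf, worth confirming before editing):

```spl
| inputlookup test.csv
| stats values(count) as count
| eval count=mvjoin(count, " OR ")
```

This collapses the column into one multivalue field and then joins it into a single OR-separated string.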
Hi all, I am trying to configure a REST API (OAuth) input into a Splunk Cloud trial environment. I'm running into issues and not seeing a clear way to pull this data in. I've tried using the HTTP Event Collector with no luck. I want to consume and search the events that are pulled. Could this be a limitation of my cloud trial? Anything I'm missing? I did find an app called "REST API Modular Input" that seems to work with Splunk Enterprise but not Splunk Cloud. Is there a free equivalent for Splunk Cloud on Splunkbase? Thanks, Michelle
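As a first step, it may help to confirm HEC itself works with a minimal test before layering the OAuth pull on top. The hostname pattern and port below are assumptions that vary by stack (Splunk Cloud typically exposes HEC at http-inputs-<stack>.splunkcloud.com on port 443; trials may differ):

```shell
curl "https://http-inputs-<your-stack>.splunkcloud.com:443/services/collector/event" \
  -H "Authorization: Splunk <your-hec-token>" \
  -d '{"event": {"message": "hec test"}, "sourcetype": "_json"}'
```

A {"text":"Success","code":0} response means the token and endpoint are good, which narrows the problem to the input side.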
Hi Spelunker, I want to create a field "Credentialed checks:" with this field value. Please help. regards, Nessus version : 8.10.0 Nessus build : 20232 Plugin feed version : 202210171349 Scanner edition used : Nessus Scanner OS : LINUX Scanner distribution : es6-x86-64 Scan type : Normal Scan policy used : eb1cd575-c2d4-5be5-8010-1290128ec92e-23586099/01. PCI-INTERNAL-VA-SCAN Scanner IP : 10.6.6.51 Port scanner(s) : nessus_syn_scanner Port range : sc-default Ping RTT : Unavailable Thorough tests : no Experimental tests : no Plugin debugging enabled : no Paranoia level : 1 Report verbosity : 1 Safe checks : yes Optimize the test : yes Credentialed checks : no Patch management checks : None Display superseded patches : no (supersedence plugin launched) CGI scanning : disabled Web application tests : disabled Max hosts : 30 Max checks : 4 Recv timeout : 5 Backports : Detected Allow post-scan editing : Yes Scan Start Date : 2022/10/18 19:50 +07 Scan duration : 561 sec" ........................................................................................................ Nessus version : 8.10.0 Nessus build : 20232 Plugin feed version : 202210171349 Scanner edition used : Nessus Scanner OS : LINUX Scanner distribution : es6-x86-64 Scan type : Normal Scan policy used : eb1cd575-c2d4-5be5-8010-1290128ec92e-23586099/01. PCI-INTERNAL-VA-SCAN Scanner IP : 10.6.6.51 Port scanner(s) : netstat Port range : sc-default Ping RTT : Unavailable Thorough tests : no Experimental tests : no Plugin debugging enabled : no Paranoia level : 1 Report verbosity : 1 Safe checks : yes Optimize the test : yes Credentialed checks : yes, as 'isdscan' via ssh Attempt Least Privilege : no Patch management checks : None Display superseded patches : no (supersedence plugin launched) CGI scanning : disabled Web application tests : disabled Max hosts : 30 Max checks : 4 Recv timeout : 5 Backports : Detected Allow post-scan editing : Yes Scan Start Date : 2022/10/18 19:51 +07 Scan duration : 2483 sec"
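A rex against the event can lift that line into a field; the index and sourcetype below are placeholders, and the pattern follows the sample's spacing around the colon:

```spl
index=nessus sourcetype=nessus
| rex "Credentialed checks : (?<credentialed_checks>[^\r\n]+)"
| table credentialed_checks
```

This captures everything after "Credentialed checks : " up to the end of the line, so it returns "no" for the first sample event and "yes, as 'isdscan' via ssh" for the second.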
Hello, I find myself needing a jump start on Splunk API calls with tokens. Does anyone have more information than the docs? https://docs.splunk.com/Documentation/Splunk/8.2.6/Security/UseAuthTokens Any techniques to make testing easier? I need to confirm that a given token is valid, that I have the correct port number, and that some basic functionality works with the API calls. Anything would be greatly appreciated. Has this been a topic of a past Splunk .conf?
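One quick sanity check is hitting /services/server/info with the token as a bearer credential; a successful JSON response confirms the token, the port (8089 is the default management port), and basic API reachability in one shot:

```shell
curl -k -H "Authorization: Bearer <your-token>" \
  "https://localhost:8089/services/server/info?output_mode=json"
```

A 401 points at the token, a connection refused or timeout points at the host/port, and a JSON body means the basics are working.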
I'm setting up Splunk Cloud for my organization. Lately I've been getting errors when I try to establish a connection between my services and Splunk Cloud, both from on-premises systems and cloud environments. I've been getting timeout errors when Cloudflare tests the connection, and I also get timeout messages when other applications (ADAudit) send data to Splunk. Has anyone run into issues like this, or can anyone give me some pointers? My organization has reached out to Splunk about establishing a service contract (we thought it was included in our order) but has not heard anything back from our sales rep yet.
Hi, any thoughts appreciated. I have some connection data captured at connection termination; it has connection start and end times, CONSTA and CONEND, in the format "2022-10-18 15:40:00.000000". What I'd like to do is timechart, in say 5-minute intervals, the number of connections that were active in each interval: all connections that had started but not yet terminated during 15:40 - 15:45, then 15:45 - 15:50, and so on across the timechart. Hopefully that makes sense. Thanks in advance, Steve
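The concurrency command does essentially this once each event has a start time and a duration. A sketch, assuming one event per connection with CONSTA/CONEND in the stated format (the index name is a placeholder; concurrency expects events in ascending time order, hence the sort):

```spl
index=conns
| eval _time=strptime(CONSTA, "%Y-%m-%d %H:%M:%S.%6N")
| eval duration=strptime(CONEND, "%Y-%m-%d %H:%M:%S.%6N") - _time
| sort 0 _time
| concurrency duration=duration
| timechart span=5m max(concurrency) as active_connections
```

Re-basing _time on CONSTA makes each connection span its real lifetime, and taking max(concurrency) per 5-minute bucket approximates the peak number of simultaneously open connections in that interval.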
Hello, assuming I have numbers, let's say 1-2-3-4-5-6, and each of those represents the number of requests made by an IP address:

IP address | number of requests | method
1.1.1.2 | 1 | get
1.1.1.3 | 1 | get
1.1.1.4 | 2 | get
1.1.1.5 | 4 | get
1.1.1.6 | 4 | get
1.1.1.7 | 5 | get
1.1.1.8 | 7 | get

What could be the search to get the following table?

number of requests | number of IPs that make 'x' requests
1 | 2 (meaning two IPs each made 1 request)
2 | 1
3 | 0
4 | 2
5 | 1
6 | 0
7 | 1
8 | 0
9 | 0

Thanks
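A two-step stats aggregation produces this histogram, with makecontinuous filling the gaps with zero rows. A sketch, assuming the per-IP counts come from raw events with a clientip field (index and field names are placeholders; note makecontinuous only fills between the minimum and maximum observed values, so trailing rows like 8 and 9 would need to be appended separately):

```spl
index=web
| stats count as requests by clientip
| stats count as ip_count by requests
| makecontinuous requests
| fillnull value=0 ip_count
| rename ip_count as "number of IPs that make 'x' requests"
```

The first stats counts requests per IP; the second counts how many IPs share each request count.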
Hi guys, does anybody here know how to remove a user's email from any Splunk alert and add a new user's email in its place? I used this search to find any Splunk alerts related to the person I want to remove, but I'm getting 0 events. | `a_searches` | fields report_name email_recipients cc_email_recipients | search email_recipients="* A@gmail.com*" Any help will be appreciated!
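Alert email recipients live on saved searches as action.email.to and action.email.cc, which can be queried directly through REST (the `a_searches` macro may be reading from a different source, or the leading space in "* A@gmail.com*" may be spoiling the match). A sketch:

```spl
| rest /servicesNS/-/-/saved/searches
| search action.email.to="*A@gmail.com*" OR action.email.cc="*A@gmail.com*"
| table title eai:acl.app eai:acl.owner action.email.to action.email.cc
```

Once the matching alerts are listed, the recipient can be swapped in each alert's edit UI, or programmatically with a REST POST to the saved search updating action.email.to.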
I have a search: index="xyz" sourcetype="csv" | fillnull value="unknownMan" field1 field2 field3 field4 | eventstats dc(field1) as xyz by field2 field3 field4 | table field1 field2 field3 field4 While running this, I'm getting NULL values in the results. Why would NULL values appear when there are no NULL values in the events?