All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Same results: I get the IP address but no country in the geolocation output. I noticed that the IP address has a trailing space when extracted with this rex command. I ended up using the following command to remove the trailing space, and that resolved my problem: | eval ip_address=trim(ip_address)
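For anyone landing here later, the full pipeline from this thread with the trim folded in would look roughly like this (field names as used above; a sketch, not verified against your data):

```spl
index="eventlog" EventCode=1309
| rex field=Message "User host address:\s(?<ip_address>.*)"
| eval ip_address=trim(ip_address)
| iplocation ip_address
| table ip_address, Country
```

trim() removes leading and trailing whitespace; alternatively, tightening the rex capture to (?<ip_address>\S+) would avoid capturing the trailing space in the first place.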
I signed up for the Splunk Cloud Platform free trial as part of an online class. However, I'm unable to access my instance. I see that an instance has been created, but nothing happens when I click the "Access instance" button. I also got an email with a temporary password for the instance, but the login failed, and I got locked out after several attempts. Does anyone know how to resolve this?

Update: I was able to log in after resetting the password and waiting for the lockout to expire, but the "Access instance" button is still unresponsive.
Hello, I am trying to join two indexes to display data from our local printers. I have an index receiving data from our print server that contains the following data:

index=prntserver
_time                  prnt_name   username    location
2024-11-04 11:05:32    Printer1    jon.doe     Office
2024-11-04 12:20:56    Printer2    tim.allen   FrontDesk

I have an index receiving data from our DLP software that contains the following data:

index=printlogs
_time                  usersname   directory              file
2024-11-04 11:05:33    jon.doe     c:/desktop/prints/     document1.doc
2024-11-04 12:20:58    tim.allen   c:/documents/files/    document2.xlsx

I am trying to join the two indexes to give me the time, printer name, user name, and location from the print server index, plus the directory and file name recorded in my print log index. I want to use time to join the two indexes, but my issue is that the timestamp is off by one or even two seconds between the two index records. I tried the transaction command with maxspan=3s to be safe but cannot get it to work. Here is what I have been trying to work with:

index=printserver
| convert timeformat="%Y-%m-%d %H:%M:%S" ctime(_time) AS servtime
| join type=inner _time
    [ search index=printlogs
      | convert timeformat="%Y-%m-%d %H:%M:%S" ctime(_time) AS logtime ]
| transaction startswith=eval(src="<servtime>") endswith=eval(src="<logtime>") maxspan=3s
| table servtime prnt_name username location directory file

Thanks for any assistance given on this one.
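One hedged alternative to join/transaction for this kind of near-miss timestamp correlation (assuming the field names shown above, including the "usersname" spelling in the second index, and assuming the same user does not print twice within a few seconds) is to search both indexes together, bucket _time, and correlate with stats:

```spl
(index=prntserver) OR (index=printlogs)
| eval user=coalesce(username, usersname)
| bin _time span=5s
| stats values(prnt_name) AS prnt_name values(location) AS location
        values(directory) AS directory values(file) AS file
        by _time user
| table _time prnt_name user location directory file
```

span=5s is a guess based on the 1-2 second skew; pairs that straddle a bucket boundary can still be split, so treat this as a sketch rather than a guaranteed fix.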
I have data similar to:

Field-A   Field-B
A1        B1
A1        B2
A1        B3
A2        B4
A3        B5
A2        B6

Field-A repeats, but Field-B holds unique values. I am using | stats count by Field-A to get the number of occurrences of A1, A2, A3, and am trying to include a single example of Field-B. Something like:

Field -- Count -- Example
A1 -- 3 -- B2
A2 -- 2 -- B6
A3 -- 1 -- B5

Thank you for any suggestions.
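A minimal sketch for this (assuming the hyphenated field names survive extraction as-is; hyphens in field names sometimes need quoting or a rename in SPL):

```spl
... base search
| stats count first(Field-B) AS Example by Field-A
```

first() returns one arbitrary value per group (the one from the first result in search order), which matches the "single example" requirement; values() would return all of them instead.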
I have a working dashboard where a token is used as a variable, but now I am trying to use the same concept in a direct search within the Search & Reporting app. I have Windows events with multiple fields that carry a common value. In this example, the following search gives me usernames:

...base search (member_dn=* OR member_id=* OR Member_Security_ID=* OR member_user_name=*)

I would like to declare a variable that I can use as a value to search all four aforementioned fields. I tried the following with no luck:

index=windows_logs | eval userid=johnsmith | where $userid$ IN (member_dn, member_id, Member_Security_ID, member_user_name)
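$token$ substitution only happens in dashboards; in a plain search you compare field values directly, and string literals in eval need double quotes. A hedged sketch of what the attempt above could look like in Search & Reporting:

```spl
index=windows_logs
| eval userid="johnsmith"
| where userid=member_dn OR userid=member_id
     OR userid=Member_Security_ID OR userid=member_user_name
```

This keeps only events where any of the four fields equals the value; if case differs between sources, wrapping both sides in lower() may be needed.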
Hey @dswoff, AFAIK there is a problem in your logic. The | iplocation command accepts a few arguments, but not a key=value pair like that for the IP. I believe in your case you want to pass the IP and get the country as a result, so try this:

index="eventlog" EventCode=1309
| rex field=Message "User host address:\s(?<ip_address>.*)"
| iplocation ip_address
| table ip_address, Country

Or, for a fixed IP (iplocation takes a field name, so put the address into a field first):

| makeresults
| eval ip_address="<your_ip_here>"
| iplocation ip_address
| table ip_address, Country

iplocation accepts an IP field and will give you as a response the fields City, Continent, Country, MetroCode, Region, Timezone, lat and lon. Give it a try and let me know.
Hi roshnadabala, wondering if you were able to resolve it. I am seeing the same issue across multiple SH clusters.
@scelikok The regex is correct, but if it is applied as below, the timestamp won't be in the event and Splunk will take the current time, which is completely misleading. I want to have two events for a single log entry: the first event should contain everything up to 2024-11-04T19:05:46.323Z [INFO] ContentGenerator, and the second event should contain the full JSON. The JSON won't have a timestamp in it, but the first event's timestamp should be written to this JSON event.
So I am trying to find the geolocation for some IP addresses that keep crashing our webserver when they crawl it. I am getting the information from the event logs. The IP addresses come in on a generic field called Message that contains a lot of text, so I am pulling them out using a rex command, but the iplocation command shows no country code. I have used the iplocation command to get geo information about IP addresses in the past several hours in another search, so I know it works in my system. When I use | where ip_address="ip-address" it shows no data, so I'm guessing that Splunk doesn't see the text in the created ip_address field as actual IP addresses. Does anyone know how I can make it see this data as an IP address? Or might there be a leading space or something like that causing the issue, and if so, how do I get rid of that noise?

index="eventlog" EventCode=1309
| rex field=Message "User host address:\s(?<ip_address>.*)"
| iplocation ip_address=Country
| table ip_address, Country
Hi @Harinder.Rana, thanks for asking your question on the Community. Did you happen to find any more information or a solution you can share here? If you are still looking for help, you can contact AppDynamics Support: How to contact AppDynamics Support and manage existing cases with Cisco Support Case Manager (SCM)
I've been using dbxquery connection=my_connection procedure=my_procedure to build reports, and a few that my DBAs have built require time inputs; one I'm working on expects the parameter '@StartDate'. Is there a way to pass that through to the stored proc?
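Assuming DB Connect against SQL Server here (both unconfirmed): one common workaround is to call the procedure through query= with EXEC instead of procedure=, so the parameter value can be written, or token-substituted, into the SQL string. The connection, procedure, and parameter names below are just the ones from this post; the date value is a placeholder:

```spl
| dbxquery connection=my_connection
    query="EXEC my_procedure @StartDate='2024-11-01'"
```

In a dashboard, the date could come from a time-picker token; the EXEC syntax would differ for databases other than SQL Server.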
Hello all, hoping someone can help me. We are setting up IAM user keys that are supposed to rotate on a monthly basis. We use those keys to send email from AppDynamics. I can connect to the SMTP server just fine. What I need to find out is where this information is stored, so that I can create a script that will update it when the keys get rotated. Is it in the database, and if so, what table? Or if it's in a file, what file? Thanks for any and all help!
Hi @splunklearner, yes, sorry, it was a typo! I don't know exactly the differences between Splunk Cloud and Splunk on AWS; they are probably quite similar, because the infrastructure is the same and the product is the same, and the only difference is who manages it. If you want an on-premises solution, consider Splunk on-premises; if you want a cloud solution, consider Splunk Cloud. Ciao. Giuseppe
Adding to @ITWhisperer's question - remember that if you're detecting downtime as a lack of events, you either cannot detect downtime longer than your search window at all (if you're not using a list of values to compare your results against) or at least cannot determine its real length beyond your search window.
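The "list of values" approach mentioned here can be sketched roughly like this (the index name, the lookup name, the host field, and the 15-minute threshold are all assumptions; the lookup is expected to contain one host column listing every component that should be reporting):

```spl
| tstats latest(_time) AS last_seen WHERE index=myindex BY host
| inputlookup append=true expected_hosts.csv
| stats max(last_seen) AS last_seen BY host
| where isnull(last_seen) OR last_seen < relative_time(now(), "-15m")
```

Hosts present in the lookup but absent from the search window end up with a null last_seen, so outages longer than the window are still reported instead of silently disappearing.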
That thread linked by @gcusello is relatively old but still quite valid. Generally speaking, in terms of the basic user's experience, they are pretty similar and you could have a difficult time telling one from the other. The difference is who is responsible for the infrastructure and who does the "low-level" stuff on the environment (and what you can do there). Obviously, you don't have direct access to the underlying servers in Splunk Cloud. Some of the settings you can normally adjust from the CLI can only be manipulated via apps uploaded to the Cloud (and remember that your private apps go through the vetting process, so you can't just throw anything in them). Some settings may only be set by support. Some cannot be changed. But you don't need to worry about mundane stuff like backups. If you set up your Splunk environment in AWS (I assume that's what you mean by AWS Splunk), it's exactly like an on-prem Splunk Enterprise installation but without having to maintain the hardware.
Hi @gcusello, I expect it is not QWS, it is AWS (correct me if I am wrong). Can you please say more about Splunk Cloud vs AWS Splunk?
Hi @splunklearner, Splunk on-premises is installed on your own infrastructure. Splunk Cloud is a service that is managed by Splunk itself; it's located on QWS infrastructure, but that's transparent to you. Splunk on QWS is a service from AWS; it is similar to Splunk on-premises but installed on a private cloud on AWS. You can find a comparative analysis between on-premises and Cloud at:

https://community.splunk.com/t5/Splunk-Enterprise/Main-differences-between-Splunk-Enterprise-and-Splunk-Cloud/m-p/218797
https://www.conducivesi.com/about-splunk/splunk-enterprise-vs-splunk-cloud
https://www.gartner.com/reviews/market/security-information-event-management/compare/product/splunk-cloud-vs-splunk-enterprise

Ciao. Giuseppe
I am pretty new to Splunk. What is the difference between Splunk on-premises vs Splunk Cloud vs AWS Splunk? Please enlighten me.
How (in non-SPL terms) do you determine what the downtime for a component is?
Hi @PickleRick, thanks for the response. I agree that usually the web service would be disabled, but we keep the UI so that we can see the changes. I managed to completely clean the indexer of all the configurations, then recreated it from backup, and it worked. Thanks, Pravin