All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

I would like to change the color of a cell based on the date. If the date is in the future, then green. If the date is today, then yellow. If the date is in the past, then red. I have read many posts about how to do this based on _time and appending a word to the date ("overdue", for example), but I can't seem to make this work in my case. The field I'm working with is dueDate (not based on _time), and it's formatted as %Y-%m-%d. 
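A minimal SPL sketch of the categorization step, assuming dueDate really is a %Y-%m-%d string (the dueStatus values are illustrative and would drive the table's color formatting in the dashboard):

| eval dueEpoch=strptime(dueDate, "%Y-%m-%d")
| eval todayEpoch=relative_time(now(), "@d")
| eval dueStatus=case(dueEpoch > todayEpoch, "future", dueEpoch == todayEpoch, "today", true(), "past")

With dueStatus in the results, the cell color can be mapped per value (green/yellow/red) in the table's format options instead of relying on _time.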
Hi, I'm pretty new to Splunk. I've been reading a lot of documentation and other questions here, but I can't find the help that I need. I have this search, where every day is a left join, like this:

index=myIndex sourcetype=mySource
| eval weekday=strftime(_time,"%A")
| where weekday = "Monday"
| where Systems= "SYSTEM 1" OR "SYSTEM 2" OR "SYSTEM 3" OR "SYSTEM 4"
| eval ExpectedTime = case( System="SYSTEM 1", "6:30am", System="SYSTEM 2", "6:35am", System="SYSTEM 3", "6:45am", System="SYSTEM 4", "6:40am" )
| eval CurrentSLO = case( System="SYSTEM 1", "7:15am", System="SYSTEM 2", "7:20am", System="SYSTEM 3", "7:10am", System="SYSTEM 4", "7:10am" )
| eval EndHour=substr(time, 50, 1)
| eval EndMin=substr(time, 52, 2)
| eval time = EndHour.":".EndMin
| eval Mon = " (" .EndHour. ":" .EndMin. "am)"
| eval category="CATEGORY 1"
| table category Systems ExpectedTime CurrentSLO Mon Tue Wed Thu Fri
| rename ExpectedTime as "Expected Time"
| rename CurrentSLO as "Current SLO"
| rename category as "Category"
| join type=left Systems
    [ search index=myIndex sourcetype=mySource
    | eval weekday=strftime(_time,"%A")
    | where weekday = "Tuesday"
    | where Systems= "SYSTEM 1" OR "SYSTEM 2" OR "SYSTEM 3" OR "SYSTEM 4"
    | eval ExpectedTime = case( System="SYSTEM 1", "6:30am", System="SYSTEM 2", "6:35am", System="SYSTEM 3", "6:45am", System="SYSTEM 4", "6:40am" )
    | eval CurrentSLO = case( System="SYSTEM 1", "7:15am", System="SYSTEM 2", "7:20am", System="SYSTEM 3", "7:10am", System="SYSTEM 4", "7:10am" )
    | eval EndHour=substr(time, 50, 1)
    | eval EndMin=substr(time, 52, 2)
    | eval time = EndHour.":".EndMin
    | eval Tue = " (" .EndHour. ":" .EndMin. "am)"
    | eval category="CATEGORY 1"
    | table category Systems ExpectedTime CurrentSLO Mon Tue Wed Thu Fri
    | rename ExpectedTime as "Expected Time"
    | rename CurrentSLO as "Current SLO"
    | rename category as "Category"
. . .

I need to trigger an alert when there is no information for a day of the week. I've been trying with search count=0, transaction, and other approaches that didn't work.
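Rather than extending the chain of left joins, one way to catch a missing day is a scheduled alert that lists every expected system alongside what actually arrived. The sketch below is only illustrative and reuses the index, sourcetype, and Systems values from the question; run it daily over the previous day and alert when it returns any rows:

index=myIndex sourcetype=mySource earliest=-1d@d latest=@d Systems IN ("SYSTEM 1", "SYSTEM 2", "SYSTEM 3", "SYSTEM 4")
| stats count by Systems
| append
    [| makeresults
     | eval Systems=split("SYSTEM 1,SYSTEM 2,SYSTEM 3,SYSTEM 4", ",")
     | mvexpand Systems
     | eval count=0]
| stats max(count) as count by Systems
| where count==0

Any row left after the final where is a system that produced no events for the day, so the alert condition "number of results > 0" can fire on it.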
Hello Splunk Team, I registered on the Splunk SOAR Community Edition page two days ago but still have received no emails about next steps. Please let me know what I have to do to get the process moving. Thank you
I keep getting this error when trying to start Splunk; can anyone assist me? I am trying to install Splunk within a Kali Linux VM.

zsh: exec format error: ./splunk
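"zsh: exec format error" usually means the binary was built for a different CPU architecture than the VM (for example, an x86_64 Splunk package on an ARM-based Kali VM). A quick, purely illustrative check from the Splunk bin directory:

uname -m        # architecture of the VM (x86_64, aarch64, ...)
file ./splunk   # architecture the downloaded Splunk binary was built for

If the two don't match, downloading the package that matches the VM's architecture should resolve the error.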
I am trying to search over the REST API and I'm seeing an "All Time searches don't adhere to Splunk best practices" error. Is there any policy in Splunk that would block REST API searches?

curl -u 'XXXX' -k https://splunkapi.example.com/services/search/jobs -d search='search index="webaccess" status=403 earliest_time=-1d'
curl -u 'XXXX' -k https://splunkapi.example.com/services/search/jobs -d search='search index="webaccess" status=403 earliest=-1d@d latest=now()'

<?xml version="1.0" encoding="UTF-8"?>
<response>
  <messages>
    <msg type="FATAL">Please reduce your search to a smaller time range. All Time searches don't adhere to Splunk best practices</msg>
  </messages>
</response>
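The error usually means the time range never reached the search: earliest_time=-1d inside the search string is not a valid SPL time modifier (the modifier is earliest), so the job effectively ran over All Time and hit the restriction. One option, keeping the same endpoint and placeholder credentials from the question, is to pass the time range as separate job parameters:

curl -u 'XXXX' -k https://splunkapi.example.com/services/search/jobs \
    -d search='search index="webaccess" status=403' \
    -d earliest_time='-1d@d' \
    -d latest_time='now'

Alternatively, keep the modifiers inside the search string as earliest=-1d@d latest=now, without the parentheses that the eval function now() uses.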
Hello, I count events in a single panel over a relative time range like the one below. As you can see, I only search events between 7:00 and 20:00, 7 days ago:

earliest=-7d@d+7h latest=-7d@d+20h

Now, I don't know if it is possible, but I would like to add a condition to this relative time range, because even when I use the time picker the result count doesn't change. I would like to count events only for the last 60 minutes, within the 7:00-20:00 window, 7 days ago. Is that possible? Thanks
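If the intent is "the most recent 60 minutes, shifted back 7 days, but only when that window falls between 07:00 and 20:00", one way is to snap the relative range to the minute and filter on the hour. The index name below is only a placeholder:

index=my_index earliest=-7d@m-60m latest=-7d@m
| eval hour=tonumber(strftime(_time, "%H"))
| where hour>=7 AND hour<20
| stats count

Note that hard-coded earliest/latest values inside the search always override the time picker, which is why changing the picker doesn't change the count.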
We use Splunk Hadoop Data Roll to move our frozen data over to our Hadoop cluster. Writing the data to HDFS seems to work pretty well, but searching it through Splunk doesn't work well at all. We get lots of different errors, from the query not parsing correctly (some problem with how Splunk translates the parentheses) to mysterious failures in the MapReduce job on Hadoop. We use Cloudera and would like to be able to query the data there through Hue/Hive as an alternative to our terrible experience trying to query the Hadoop data through Splunk. Can anyone offer guidance on how to query the 'rolled' data on a Cloudera Hadoop cluster without going through Splunk search?
Hi All, can someone please explain in simple terms what seekAddress and seekCRC are in the CRC check? I tried to check the documentation, but it looks quite confusing. I read the scenario below but am still a little confused. "The CRC from the file beginning in the database has no matching record, indicating a file that Splunk hasn't seen before. Splunk picks it up and ingests its data from the start of the file and updates the database with the new CRCs and seek addresses as it ingests the file."
Hi, I have data that I can't access unless I use regex, but when I run the command that Splunk gives me I get an empty result. I can use this SPL, but the performance is not good. How do I get the extraction attribute to work for this, so I can get the performance gains? Is regex the only way I can see the data? What can I do, as the performance is very bad?
I am currently monitoring AD account data using InfoSec. However, the number of accounts being monitored under the "Compliance" tab and the "Health" tab is not correct, and the numbers presented don't even agree with each other. My AD data appears to be CIM compliant, as the CIM_Authentication section under "Health" is green. I am only pulling from the "main" index, from the WinEventLog and ActiveDirectory source types, and my data sources appear correct. Any ideas why InfoSec would not display the proper number of accounts in AD? Am I looking at this wrong?
I have a query similar to the one below.

index = "idx" source = "mysource" | spath path=myField output=res | stats count by res | where res="xyz"

Is there a way to get the search to return zero if no rows are returned? The reason I'm asking is that this is the query I'm using to populate values in a dashboard panel. If no rows are returned, I would prefer the panel to show zero instead of "no results".
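One way, assuming the panel only needs the single number: move the filter into a where clause and finish with a plain stats count (no by clause), which still returns one row with the value 0 when nothing matches:

index="idx" source="mysource"
| spath path=myField output=res
| where res="xyz"
| stats count

If the split by res has to stay, the usual alternative is an appendpipe that adds a zero-count row only when the main result set is empty.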
Hi Experts, The documentation indicates that Splunk Cloud supports encrypted assertions with SAML SSO: https://docs.splunk.com/Documentation/SplunkCloud/8.2.2202/Security/HowSAMLSSOworks  "Configure automatic decryption of SAML assertions from an IdP" However, the instructions for obtaining the Splunk instance's encryption certificate appear to be for Splunk Enterprise installs, not for Splunk Cloud. For example, "On your Splunk platform instance, change to the $SPLUNK_HOME/etc/auth directory." A little help?
While editing notables, there is an option called "Edit selected". Can anyone help me with how to put a limit on the number of notables that can be edited at one time? For example, if I want to update 20 notables with the same work note, I would check 20 notables and update the work note. However, I want to enforce a maximum of 10 notables that can be updated/edited at a single time.
I am trying to pull two fields from the lookup_ims lookup table and depending on the user entered I want to populate the category and department fields and place it in the USB.csv. <query> | inputlookup USB.csv | lookup_ims t fields category, Department | append [ | makeresults | eval user="$user_tok$", description="$description_tok$", revisit="$revisit_tok$", Action="$dropdown_tok$"] | eval _time=now() | table _time, user, category, department, description, revisit | outputlookup USB.csv </query>
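As written, lookup_ims is being invoked as if it were a command; the lookup command needs the lookup name plus the field to match on. A possible rearrangement, assuming lookup_ims is a lookup definition keyed on a user field and that category and Department are its output fields (adjust names and case to the real lookup):

| inputlookup USB.csv
| append
    [| makeresults
     | eval user="$user_tok$", description="$description_tok$", revisit="$revisit_tok$", Action="$dropdown_tok$"
     | lookup lookup_ims user OUTPUT category Department
     | eval _time=now()]
| table _time, user, category, Department, description, revisit, Action
| outputlookup USB.csv

Field names are case-sensitive, so the table and outputlookup columns must match whatever the lookup actually returns (Department versus department).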
Hello, I would like to know how I can generate a report of all the servers that are monitored with AppDynamics. Thanks.
Hi - I am a relatively novice Splunk user. I am looking at implicit vs explicit audit events and want to do a calculation based on a count of these two event types. I was trying to write an eval but wasn't getting anywhere. This is my search (redacted):

| multisearch
    [| search auditSource=SOURCE auditType=TYPE1 | regex tags.path=PATH ]
    [| search auditSource=SOURCE auditType=TYPE2 ]
| stats dc(SESSIONS) as Total by auditType

So now I have a count of the sessions in both audit types, where unique sessions in TYPE1 are journey starts and unique sessions in TYPE2 are completions. I want to calculate the completion rate, so essentially what I need is the distinct session count in TYPE1 divided by the distinct session count in TYPE2.

P.S. I should note the audit sources for both are the same, and there are no other unique fields I can use instead.
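One way to get both distinct counts onto a single row, keeping the placeholder names from the question, is to split them with eval expressions inside stats instead of splitting by auditType:

| multisearch
    [ search auditSource=SOURCE auditType=TYPE1 | regex tags.path=PATH ]
    [ search auditSource=SOURCE auditType=TYPE2 ]
| stats dc(eval(if(auditType=="TYPE1", SESSIONS, null()))) as starts
        dc(eval(if(auditType=="TYPE2", SESSIONS, null()))) as completions
| eval completion_rate=round(100 * completions / starts, 2)

The division here is completions/starts (TYPE2 over TYPE1); flip it if the intended ratio really is TYPE1 over TYPE2.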
The log says 017.002.100.103. I am receiving data from a universal forwarder and I would like to remove the leading zeros from each octet. Is there a way? I want 17.2.100.103.
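One search-time option is a sed-style rex that strips zeros only when they lead an octet; the field name src_ip below is a placeholder for wherever the address is extracted:

... | rex mode=sed field=src_ip "s/\b0+(\d)/\1/g"

Applied to 017.002.100.103 this yields 17.2.100.103. If the zeros should be removed before indexing, the same expression can be used as a SEDCMD in props.conf on the indexer or heavy forwarder (a universal forwarder does not parse events).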
Hi Team, I'm generating a report weekly and sending it across as an email. However, the team wants this file to be pushed onto a directory on Unix server. Any idea on how I can achieve this?
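If the target directory can live on the search head itself, the simplest route is to end the scheduled report with outputcsv, which writes under $SPLUNK_HOME/var/run/splunk/csv, and then copy the file from there; the report search and file name below are placeholders:

... your report search ...
| outputcsv weekly_report.csv

If the file has to land on a different Unix server, the usual options are a cron job that copies or scp's the file from that directory, or a custom alert action script attached to the scheduled search.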
When I try to start the splunkd service, it gives me the following crash log.   [build 51d9cac7b837] 2022-05-16 14:43:39 Received fatal signal 6 (Aborted). Cause: Signal sent by PID 2084 runni... See more...
When I try to start the splunkd service, it gives me the following crash log.   [build 51d9cac7b837] 2022-05-16 14:43:39 Received fatal signal 6 (Aborted). Cause: Signal sent by PID 2084 running under UID 26001. Crashing thread: TcpChannelThread
I need to extract the fields below and need a regex for each:
1) From trc, I need to get "Asva.nsearoon@peypafe.com"
2) From tsd, I need to get "flipkart.com"
3) From sip, I need to get "198.161.151.190"
Below is a sample log:
{"etype":"User","eid":"prvs=343333211os.com","ut":"Regular","tsd":"\"flipkart.com\" <Flipkart@youraccount-alerts.com>","sip":"198.161.151.190","srt":"1","trc":"Asva.nsearoon@peypafe.com","
Thanks,
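Since the event is JSON, one option is to let spath extract the fields and then pull just the quoted domain out of tsd with a small rex; the field names below come from the sample event:

... | spath
| rex field=tsd "^\"(?<sender_domain>[^\"]+)\""
| table trc sip sender_domain

trc and sip come straight out of spath ("Asva.nsearoon@peypafe.com" and "198.161.151.190"), and sender_domain captures the text between the first pair of quotes in tsd ("flipkart.com").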