All Topics

I am currently monitoring AD account data using InfoSec. However, the number of accounts being monitored under the "Compliance" tab and the "Health" tab is not correct, and the numbers presented don't even agree with each other. My AD data appears to be CIM compliant, as the CIM_Authentication section under "Health" is green. I am only pulling from the "main" index, from the source types WinEventLog and ActiveDirectory, and my data sources appear correct. Any ideas why InfoSec would not display the proper number of accounts in AD? Am I looking at this wrong?
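One way to sanity-check the counts InfoSec shows is to compare a distinct-account count from the Authentication data model against the raw events; a minimal sketch, assuming the default Authentication data model and the index/source types described above:

| tstats dc(Authentication.user) as dm_accounts from datamodel=Authentication where index=main
| appendcols
    [ search index=main (sourcetype=WinEventLog OR sourcetype=ActiveDirectory)
      | stats dc(user) as raw_accounts ]

If dm_accounts and raw_accounts diverge, the data model mapping or its acceleration is a likelier culprit than the InfoSec dashboards themselves.
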
I have a query similar to the one below.

index="idx" source="mysource"
| spath path=myField output=res
| stats count by res
| where res="xyz"

Is there a way to get the search to return zero if no rows are returned? The reason I'm asking is that this is the query I'm using to populate values in a dashboard panel. If no rows are returned, I would prefer the panel to show zero instead of no results.
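One approach is to move the filter inside stats so the search always returns exactly one row; a minimal sketch, assuming the same field names as above:

index="idx" source="mysource"
| spath path=myField output=res
| stats count(eval(res="xyz")) as count

Because stats emits a single row even when no events match the eval condition, the panel receives count=0 rather than "No results found".
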
Hi Experts, the documentation indicates that Splunk Cloud supports encrypted assertions with SAML SSO ("Configure automatic decryption of SAML assertions from an IdP"): https://docs.splunk.com/Documentation/SplunkCloud/8.2.2202/Security/HowSAMLSSOworks However, the instructions for obtaining the Splunk instance's encryption certificate appear to be written for Splunk Enterprise installs, not for Splunk Cloud - for example, "On your Splunk platform instance, change to the $SPLUNK_HOME/etc/auth directory." A little help?

While editing notables, we have an option called "Edit selected". Can anyone help me with how to put a limit on the number of notables that can be edited at one time? E.g., if I want to update 20 notables with the same work note, I check-mark 20 notables and update the work note. However, I want to enforce a maximum of 10 notables updated/edited at a single time.

I am trying to pull two fields from the lookup_ims lookup table; depending on the user entered, I want to populate the category and department fields and place them in USB.csv.

<query>
| inputlookup USB.csv
| lookup_ims t fields category, Department
| append
    [ | makeresults
      | eval user="$user_tok$", description="$description_tok$", revisit="$revisit_tok$", Action="$dropdown_tok$" ]
| eval _time=now()
| table _time, user, category, department, description, revisit
| outputlookup USB.csv
</query>
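The lookup line as written is not valid SPL - the command needs the keyword lookup, the lookup name, and the field to match on - and eval _time=now() as placed would restamp every existing row. A hedged sketch of a corrected flow, assuming lookup_ims is matched on the user field (the actual match field is an assumption) and noting that lookup field names are case sensitive (category/Department vs. department):

| inputlookup USB.csv
| append
    [ | makeresults
      | eval _time=now(), user="$user_tok$", description="$description_tok$", revisit="$revisit_tok$", Action="$dropdown_tok$"
      | lookup lookup_ims user OUTPUT category, department ]
| table _time, user, category, department, description, revisit, Action
| outputlookup USB.csv

This way only the newly appended row is timestamped and enriched, and the existing rows in USB.csv pass through unchanged.
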
Hello, I would like to know how I can generate a report of all the servers that are monitored with AppDynamics. Thanks.

Hi - I am a relatively novice Splunk user. I am looking at implicit vs. explicit audit events and looking to do a calculation based on a count of these two event types. I was trying to write an eval but wasn't getting anywhere. This is my search (redacted):

| multisearch
    [ | search auditSource=SOURCE auditType=TYPE1 | regex tags.path=PATH ]
    [ | search auditSource=SOURCE auditType=TYPE2 ]
| stats dc(SESSIONS) as Total by auditType

So now I have a count of the sessions in both audit types, where unique sessions in TYPE1 are journey starts and unique sessions in TYPE2 are completions. I want to calculate the completion rate, so essentially what I need is the distinct session count in TYPE1 divided by the distinct session count in TYPE2.

P.S. I should note the audit sources for both are the same, and there are no other unique fields I can use instead.
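Rather than grouping by auditType and then trying to divide across rows, both distinct counts can be computed in one stats call and the ratio derived with eval; a minimal sketch, assuming the field and type names from the question (swap numerator and denominator to taste):

| multisearch
    [ | search auditSource=SOURCE auditType=TYPE1 | regex tags.path=PATH ]
    [ | search auditSource=SOURCE auditType=TYPE2 ]
| stats dc(eval(if(auditType="TYPE1", SESSIONS, null()))) as starts
        dc(eval(if(auditType="TYPE2", SESSIONS, null()))) as completions
| eval completion_rate = round(completions / starts * 100, 2)
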
But the log says 017.002.100.103. I am receiving data from a universal forwarder and I would like to remove the leading zeros from each octet. Is there a way? I want 17.2.100.103.
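One search-time option is a sed-style rex over the extracted field, and the same substitution can be applied at index time with SEDCMD in props.conf on the indexer or heavy forwarder; a minimal sketch, assuming the address is in a field named ip (the field name and sourcetype are assumptions):

... | rex mode=sed field=ip "s/\b0+(\d)/\1/g"

# props.conf (index time, rewrites _raw)
[your_sourcetype]
SEDCMD-strip_octet_zeros = s/\b0+(\d)/\1/g

The pattern only strips zeros that start a number (017 -> 17, 002 -> 2) and leaves embedded zeros alone (100 stays 100); note the index-time variant affects every number in the event, so scope it more tightly if the events contain other zero-padded values.
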
Hi Team, I'm generating a report weekly and sending it out as an email. However, the team wants this file to be pushed to a directory on a Unix server. Any idea how I can achieve this?
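If the target directory is reachable from the search head (locally or via a mount), one option is to have the scheduled search write the file itself with outputcsv and move it afterwards; a minimal sketch, using a hypothetical file name weekly_report.csv:

<your report search>
| outputcsv weekly_report.csv

outputcsv writes to $SPLUNK_HOME/var/run/splunk/csv/ on the search head, so a cron entry there (or an scp in it) can pick the file up and drop it into the target directory on the Unix server.
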
When I try to start the splunkd service, it gives me the following crash log:

[build 51d9cac7b837] 2022-05-16 14:43:39 Received fatal signal 6 (Aborted).
Cause: Signal sent by PID 2084 running under UID 26001.
Crashing thread: TcpChannelThread

I need to extract the fields below and need a regex for each:
1) From the trc value, I need a regex for "Asva.nsearoon@peypafe.com"
2) From the tsd value, I need a regex for "flipkart.com"
3) From the sip value, I need a regex for "198.161.151.190"

Below is a sample log.

{"etype":"User","eid":"prvs=343333211os.com","ut":"Regular","tsd":"\"flipkart.com\" <Flipkart@youraccount-alerts.com>","sip":"198.161.151.190","srt":"1","trc":"Asva.nsearoon@peypafe.com","

Thanks,
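Since the events are JSON, spath can pull trc and sip directly, with one extra rex to isolate the quoted domain inside tsd; a minimal sketch, assuming the events parse as in the sample:

... | spath output=trc path=trc
| spath output=sip path=sip
| spath output=tsd_raw path=tsd
| rex field=tsd_raw "^\"(?<tsd>[^\"]+)\""

After spath, tsd_raw holds "flipkart.com" <Flipkart@youraccount-alerts.com>, and the rex captures just flipkart.com from between the literal quotes.
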
Hi all, this is probably a stupid question, but I have little experience with Splunk Cloud. I uploaded a custom app to Splunk Cloud, and the upload validation procedure asked me to move all images and JS files from $SPLUNK_HOME/etc/apps/my_app/appserver/static to $SPLUNK_HOME/etc/apps/my_app/appserver. After this update I was able to upload my custom app, but now the images aren't visible and the JS files aren't executed, because the path has changed from /static/app/my_app/appserver/my_image.png to what? Thank you in advance. Ciao. Giuseppe

Hey, I am tasked with creating a bar chart for one of my dashboard panels, and the colour of the bar chart must be pink. I am using:

<option name="charting.fieldColors">0xff66cc</option>

for the panel, but the bar chart still turns blue. Can you please help? Thanks, P
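charting.fieldColors expects a map from series name to colour rather than a bare colour value, which is likely why the option is being ignored; a hedged sketch, assuming the charted series is called count (substitute your own series name), with charting.seriesColors as a name-independent alternative:

<option name="charting.fieldColors">{"count": 0xFF66CC}</option>
<option name="charting.seriesColors">[0xFF66CC]</option>

Either option alone should do; seriesColors is the simpler choice when the panel charts a single series whose name may change.
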
Below is data that has multiple features for a single item. I want to write a regex that matches all occurrences of feature (not just the first occurrence) and then counts each feature. I have written the search string below, but the count value is not consistent. Can someone please take a look and advise? Many thanks in advance.

| makeresults
| eval _raw="[{\"\"feature\"\": \"\"INTDATA\"\"}, {\"\"feature\"\": \"\"INTDATA2\"\"}, {\"\"feature\"\": \"\"MGDAT0\"\"}, {\"\"feature\"\": \"\"MGPR2TI\"\"}, {\"\"feature\"\": \"\"MSTORE\"\"}, {\"\"feature\"\": \"\"PNINCLWAP\"\"}, {\"\"feature\"\": \"\"PRMCAFIND\"\"}, {\"\"feature\"\": \"\"3WY\"\"}, {\"\"feature\"\": \"\"CFC\"\"}, {\"\"feature\"\": \"\"CFU\"\"}, {\"\"feature\"\": \"\"CLIP\"\"}, {\"\"feature\"\": \"\"CLIR\"\"}, {\"\"feature\"\": \"\"CLW\"\"}, {\"\"feature\"\": \"\"DATA\"\"}, {\"\"feature\"\": \"\"CAMTAC\"\"}, {\"\"feature\"\": \"\"HOLD\"\"}, {\"\"feature\"\": \"\"INROAM\"\"}, {\"\"feature\"\": \"\"ISP\"\"}, {\"\"feature\"\": \"\"MSTORE\"\"}, {\"\"feature\"\": \"\"NWROAM\"\"}, {\"\"feature\"\": \"\"PERMGL\"\"}, {\"\"feature\"\": \"\"SMSO\"\"}, {\"\"feature\"\": \"\"VM\"\"}, {\"\"feature\"\": \"\"GFLEX\"\"}]"
| rex max_match=0 "\"\"feature\"\": \"\"(?<feature>.*?)\"\"}"
| stats count(feature) by feature
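With max_match=0 the rex emits a single result carrying a multivalue feature field, and stats count(feature) by feature behaves unintuitively over multivalue fields; expanding the values first gives a stable per-feature count. A minimal sketch with a shortened sample (using conventional single-backslash quote escapes; the doubled "" in the post looks like a paste artifact):

| makeresults
| eval _raw="[{\"feature\": \"INTDATA\"}, {\"feature\": \"MSTORE\"}, {\"feature\": \"MSTORE\"}]"
| rex max_match=0 "\"feature\": \"(?<feature>[^\"]+)\""
| mvexpand feature
| stats count by feature

This yields INTDATA = 1 and MSTORE = 2; applied to the full sample above, each feature gets its true occurrence count.
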
Hello, we have a clustered environment:
- Search Head Cluster (3 nodes)
- Indexer Cluster (4 sites, 10 nodes each)
currently still on version 7.3.9, running on CentOS. We have to migrate the OS to SUSE Linux and at the same time upgrade to Splunk 8.2.6, so we want to prepare a parallel environment with the same number of nodes and install the latest Splunk version on it. We would also like to use this new environment to migrate the apps and fix them to be compatible with the newer Python, XML, and jQuery, then bring the environment into production. We are struggling to find a way to migrate the index buckets (db_* and rb_*) and the KV store from the old environment to the new one with minimal downtime and data loss, if that is possible - and what about the GUIDs in the bucket names? Thank you.
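For the KV store there is a dedicated backup/restore CLI, and indexer peers can be drained gracefully so their buckets are re-replicated before removal; a hedged sketch of the relevant commands (the archive name is an assumption, and how bucket-name GUIDs are treated when copying buckets between clusters is version-specific, so worth confirming against the 8.2 docs):

# On a search head cluster member: archive the KV store
splunk backup kvstore -archiveName kvstore_migration

# On the new environment: restore the archive
splunk restore kvstore -archiveName kvstore_migration

# Gracefully take an old peer out of the cluster, waiting for re-replication
splunk offline --enforce-counts
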
Dear Splunkers, we are upgrading the UFs in our environment, and I noticed that the number of clients is increasing due to the upgrade process. Right now I'm seeing the same clients with the same information, except that the GUIDs (Client Name) are different, while the old entries have not phoned home for a while. Is this considered a problem, and how long will they stay there until the Forwarder Manager drops them? Thanks.

Just making sure that I didn't miss something: there is no way to set RF and SF based on which site the data originates from? I mean - let's say that I have two sites and I don't want the data to be replicated between sites in any way. The customer understands that there is no site-level data resiliency, and that a site outage means unavailability of all the buckets stored at that site, and is OK with that. I know that I could simply do - for example - origin:2, total:2, but that means that I have the same settings in both sites; what if I wanted different settings at each site? Of course I could run separate clusters at each site, but that also means more fuss with managing configurations - deploying apps and so on. Anything I missed?
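For reference, the origin-based policy is set once on the cluster manager and applies cluster-wide, which - as far as I know - is exactly why per-origin-site variation can't be expressed; a hedged sketch of the server.conf stanza in question (site names are placeholders):

[clustering]
# mode = master on pre-8.1 versions
mode = manager
multisite = true
available_sites = site1,site2
site_replication_factor = origin:2, total:2
site_search_factor = origin:2, total:2

With origin:2, total:2 every bucket keeps both copies on its originating site, but the same factors govern data born at either site.
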
I have the below data:

dc_number   argosweekstart   total_forecast
610         2022-10-23       23534.000003657507
610         2022-05-22       457659.9999990086
610         2022-06-19       457026.96672087134
610         2022-06-12       499736.9999989038

From the query below, I have produced the following table, which has the maximum, minimum, and current values per dc_number:

index="index"
| stats min(total_forecast) as Minimum max(total_forecast) as Maximum latest(total_forecast) as Current by dc_number
| table dc_number week_min Minimum week_max Maximum week_cur Current

dc_number   week_min   Minimum              week_max   Maximum               week_cur   Current
610                    23534.000003657507              499736.999998903800              23534.000003657507

But I am expecting the output below, with the corresponding week value from the first table - that is, week_min should pick up the week of the minimum value, and likewise for the maximum and current:

dc_number   week_min     Minimum              week_max     Maximum               week_cur     Current
610         2022-10-23   23534.000003657507   2022-06-12   499736.999998903800   2022-10-23   23534.000003657507
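One way to carry the week along is to mark the rows that hold the extreme values before collapsing with stats; a minimal sketch, assuming the field names above (latest() keys off event _time; if "current" is instead defined by the newest argosweekstart, sort on that field first):

index="index"
| eventstats min(total_forecast) as Minimum max(total_forecast) as Maximum by dc_number
| eval week_min=if(total_forecast=Minimum, argosweekstart, null()), week_max=if(total_forecast=Maximum, argosweekstart, null())
| stats first(Minimum) as Minimum values(week_min) as week_min first(Maximum) as Maximum values(week_max) as week_max latest(total_forecast) as Current latest(argosweekstart) as week_cur by dc_number
| table dc_number week_min Minimum week_max Maximum week_cur Current
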
Hi, I need to create a dashboard of statistics and graphs for firewall data. There is a huge volume of data generated on an hourly basis. What is the best way to plot IP addresses in a graph - e.g., a line graph or a pie chart of the counts? Thanks,
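With high-volume firewall data, a common pattern is to chart only the top talkers over time rather than every address; a minimal sketch, assuming the events live in an index called firewall with a src_ip field (both names are assumptions):

index=firewall
| timechart span=1h count by src_ip limit=10 useother=f

Rendered as a line or column chart, this keeps the panel readable; a companion pie or bar panel from index=firewall | top limit=10 src_ip covers overall counts.
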
I need a document on SolrCloud metrics.