All Topics


So I am trying to compare bar graphs of event count for our indexes on two separate days. We are upgrading our environment, and I want this query to show us the event count before and after the upgrade. I have tried using earliest=-<int>d and latest=-<int>d, but the query keeps using the time picker. I am using dbinspect, so I wasn't sure if that had something to do with it. Below is the working query, which outputs the same results for both EventCount and EventCount_1:

| dbinspect index=*
| search index!=_*
| fields bucketId eventCount index _time
| stats sum(eventCount) as EventCount max(_time) as Time by index
| table index EventCount
| join type=outer index
    [| dbinspect index=*
    | search index!=_*
    | fields bucketId eventCount index
    | stats sum(eventCount) as EventCount_1 by index
    | table index EventCount_1]
| table index EventCount EventCount_1

I have tried putting the time periods in a few places. Placed after the first dbinspect, the query runs but returns the same results as the time picker. If I place them after the search command, I don't get any results:

| dbinspect index=* earliest=-4d latest=-3d
| search index!=_*
| fields bucketId eventCount index _time
| stats sum(eventCount) as EventCount max(_time) as Time by index
| table index EventCount
| join type=outer index
    [| dbinspect index=*
    | search index!=_* earliest=2023-05-30T00:00:00 latest=2023-06-01T23:59:59
    | fields bucketId eventCount index
    | stats sum(eventCount) as EventCount_1 by index
    | table index EventCount_1]
| table index EventCount EventCount_1

This second query also runs, but it still uses the time from the time picker instead of the range stated in the query. Am I supposed to be using a different type of time selection with dbinspect? If I don't use dbinspect, I don't get the same results. Is there any other way to get these results? I'm just trying to get event count by index. Thank you for any help.
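One note that may frame this: dbinspect reports bucket metadata, and a bucket's span rarely lines up with a one-day window, so day-sized time modifiers have little effect on it. A minimal sketch of an alternative, assuming the goal is simply event counts per index for two explicit windows (the day offsets are placeholders); tstats counts events by their own _time and honors in-search time modifiers:

| tstats count as EventCount_before where index=* earliest=-4d@d latest=-3d@d by index
| append
    [| tstats count as EventCount_after where index=* earliest=-1d@d latest=@d by index]
| stats values(EventCount_before) as EventCount_before values(EventCount_after) as EventCount_after by index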
Hello, we upgraded to 9.0.4.1 from 8.2.2. In Forwarder Management we renamed a server class, but under Data inputs / Files & directories the server class is not updated. serverclass.conf is updated; the problem is only in the GUI, even after a restart / reload. Thanks for your help.
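A quick way to compare what is actually on disk with what the GUI shows, assuming a standard deployment server install on a *nix host (the class name is a placeholder):

$SPLUNK_HOME/bin/splunk btool serverclass list --debug | grep -i my_renamed_class
$SPLUNK_HOME/bin/splunk reload deploy-server

btool prints the merged configuration and which file each line comes from, so a stale entry left behind in another app's serverclass.conf would show up here.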
Here is the search I am trying to do, and I hope I can explain this correctly. I am searching for DLP events where there are x events within a period of time; for my testing I am using 1 hour.

index=epp "content threat" Policy="Content Aware Protection - Content Threat Detected" `comment("Creating buckets of 10 minutes")`
| bin _time span=1h
| stats count values(MatchedItem) by _time ClientUser, DestinationDetails, MatchedItem
| eval PotentialLeak=if(count >= 10, 1, 0)
| search PotentialLeak = 1

What I am trying to get out of this is a table of the following: _time, ClientUser, DestinationDetails, MatchedItem, etc. However, I only see one MatchedItem, not all of them; for one user I know there are 12, but I only see one. Hope that explains it well enough, and I appreciate your help.
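One thing worth noting: because MatchedItem is also in the by clause, each distinct value gets its own row, so values(MatchedItem) can only ever show one value per row, and the per-row count shrinks too. A sketch with MatchedItem removed from the grouping (field names as in the original search):

index=epp "content threat" Policy="Content Aware Protection - Content Threat Detected"
| bin _time span=1h
| stats count values(MatchedItem) as MatchedItem by _time ClientUser DestinationDetails
| where count >= 10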
Nothing is returned for SOT (I assume it is NULL). I don't understand what could be wrong. If I run the mstats command as a standalone search it works as expected, so I'm guessing it's because it's inside this map command?

| inputlookup blah.csv
| dedup ArrayName
| map maxsearches=1000 search="| mstats avg(some.statistic) WHERE index=myindex AND Array_Name=$ArrayName$ by sgname Array_Name Model
    | eval SOT=case(Model="ModelA", 94000, Model="ModelB", 104000), PctIOPS=round((sgIOPS/SOT)*100, 2)
    | sort - PctIOPS
    | head 5
    | table Array_Name Model SOT sgname PctIOPS"
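A few guesses, since the standalone version works: case() returns NULL when no clause matches and there is no default; inside map's quoted search string, literal quotes need escaping as \" or the string terminates early; and mstats names its output field literally avg(some.statistic), so sgIOPS stays undefined unless aliased. A sketch of the map line with those applied (index, names, and thresholds as in the original; -1 is a sentinel default so unmatched Model values become visible instead of silently null):

| map maxsearches=1000 search="| mstats avg(some.statistic) as sgIOPS WHERE index=myindex AND Array_Name=\"$ArrayName$\" by sgname Array_Name Model
    | eval SOT=case(Model=\"ModelA\", 94000, Model=\"ModelB\", 104000, true(), -1), PctIOPS=round((sgIOPS/SOT)*100, 2)
    | sort - PctIOPS
    | head 5
    | table Array_Name Model SOT sgname PctIOPS"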
index=os process=sshd name="session opened" action=success
| eval user=upper(user)
| lookup all_svc_samaccountname.csv SamAccountName as user OUTPUT Match
| search Match=1
| eval dest=upper(dest)
| fields dest user
| lookup cmdb_all_assets.csv name as dest OUTPUT sys_class_name
| search sys_class_name=cmdb_ci*server
| eval User=mvindex(user,-1)
| eval AccountUsed=upper(User)
| search AccountUsed IN (*)
| fillnull value="Not Provided" AccountUsed
| lookup user_info_all.csv Samaccountname as AccountUsed OUTPUT department Samaccountname Owner_Samaccountname
| stats values(department) as Department latest(_time) as Time count by Owner_Samaccountname AccountUsed dest
| convert ctime(Time)
| rename dest as Target_Computer
| append
    [search index=wineventlog EventID IN (4648) ProcessName="C:\\Windows\\System32\\lsass.exe" source="XmlWinEventLog:Security" action=success
    | stats count by user ComputerName]
| table Time Owner_Samaccountname AccountUsed Department Target_Computer user ComputerName
| sort - Time

I have this search query, where the appended fields' values are returned at the bottom, below the main search results. My question is: how can I match the fields?
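Since append just stacks the second result set underneath the first, the rows only line up if the two searches share field names. A sketch, assuming user in the wineventlog search corresponds to AccountUsed and ComputerName to Target_Computer (adjust the renames if that mapping is wrong); the final stats then merges rows that share those keys:

... main search as above ...
| append
    [search index=wineventlog EventID=4648 ProcessName="C:\\Windows\\System32\\lsass.exe" source="XmlWinEventLog:Security" action=success
    | stats count as wineventlog_count by user ComputerName
    | eval AccountUsed=upper(user)
    | rename ComputerName as Target_Computer]
| stats values(*) as * by AccountUsed Target_Computer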
Hi, first of all, thanks for taking the time to answer questions. I am trying to join two search results and show the output in a single table. Here is the data from the two logs:

Log1:
Book bought by AccountId=Account1, BookName=Book1
Book bought by AccountId=Account1, BookName=Book1
Book bought by AccountId=Account2, BookName=Book3

Log2:
Book sold by AccountId=Account1, BookName=Book1

Output wanted:

AccountId   bookBoughtName   Bought   Sold
Account1    Book1            2        1
Account2    Book3            1        0

Output I am getting:

AccountId   bookBoughtName   Bought   Sold
Account1    Book1            2        2
Account2    Book3            1        0

The Splunk query I am using:

"Book bought by"
| rex field=_raw "Book\sbought\sby\sAccountId=(?P<Account>\d+),\sBookName=(?P<bookBoughtName>.*)"
| join type=left Account
    [search "Book Sold By"
    | rex field=_raw "Book\ssold\sbuy\sAccountId=(?P<Account>\d+),\sBookName=(?P<bookSoldName>.*)"]
| stats count as "Bought" count(bookSoldName) as "Sold" by Account bookBoughtName

Please help me see what I am doing wrong, and thanks again for taking your precious time to help.
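A join-free sketch that avoids the double counting: join copies the single "sold" row onto each matching "bought" row, so counting rows inflates Sold. Note also that the original patterns use \d+ for Account, which cannot match values like Account1, and one rex says "buy" where the log says "by"; both are adjusted here:

("Book bought by" OR "Book sold by")
| rex field=_raw "Book\s(?<action>bought|sold)\sby\sAccountId=(?P<Account>\w+),\sBookName=(?P<bookBoughtName>.*)"
| stats count(eval(action="bought")) as Bought count(eval(action="sold")) as Sold by Account bookBoughtName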
I need to make a POST using the TA-Webtools add-on's curl command with the following JSON payload, but it keeps failing. I've done several tests and the JSON is correct. Can someone help me?

data={'ssd': '724b4ssd8-ce6f-4ss28-2ssc6-db324dssef949', '2r2ckerssd': '787d74db-88dss-4872-8b80-896ed980eb2c', 'u8erDocumen2': '2977606ss-07dc-447e-9399-889db982db47', 'u8erssd': '2977606ss-07dc-447e-9399-889db982db47', 'comp2nyssd': '', '8ervssceOrssgssn': 'lno-m2ke-de2l8', 'u8er2ype': 'PF', 'pl22form': 'LNO', 'even2Code': 802, 'clssen2Orssgssn': '2pp-2ndrossd', 'ver8sson': '8.0', 'd22e2ssme': '2023-0ss-87283:09:48.8849ss037Z', '8e88ssonssd': 'bf3387ss0-b04c-4dc3-2989-2ss6869f24f38', '8creenOrssgssn': '', 'd222': {'lno': {'ssd': 'd78229d8-92f3-4ss9e-90ss8-04b798088847', 'p2r2nerKey': '2rssb2nco', 'p2r2nerCnpj': '873ss8880', 'ssn2egr22sson2ype': '2pss-colch2o', 'deb28': [{'ssd': '2b2d07c8-9ec2-ss86ss-838d-d24928c68cc3', 'neg22ssv22ed': 2rue, 'con2r2c2Number': '94JN4ssbvEvd0obxy6e2w849Q+g==', 'documen2': '873ss8880', '2ype': 'C2r2ão decrédss2o', 'orssgssn2lV2lue': '880.98', '2c2u2lV2lue': '8498.44', 'neg22ssv22edV2lue': '880.98', '2nno222sson2ype': 'REFssN', 'ocurrenceD22e': '2023-02-80', 'neg22ssv22edD22e': '2023-03-28', 'ssnclu8ssonD22e': '2023-03-87', 'comp2nyOrssgssn': 'B2NCO 2Rss2NGULO 8/2 - 2RssC2RD', 'w2lle2': '', 'preNeg22ssv22ed': F2l8e, 'pre8crssbed': F2l8e}], 'd222': {'p2ymen2Me2hod': 'b2nk8lssp', 'c2mp2ssgn': {}, 'p2r2ner2u2om22sscDeb22ccoun2': {}}, 'offerssd': '8esse3b6e-bbb0-4208-2328-489993888ccf', 'offerV2lue': '8382.44', 'offerDss8coun2Percen22ge': '83.000887738024996', 'ssn822lmen2Number': 82, 'deb28Coun2': 8, '2greemen28222u8': 8, '2greemen2D22e': '2023-0ss-87280:09:48.ss36208', 'offer8ssgn22ure': 'ss093ssfcss-4bd6-sscbf-8d2b-427f6fdb9e2e', 'boo82': {'2c2ssve': F2l8e, 'v2lue': 0, '8core': 0, 'offer8222u8': 3}, 'c2llssnfo8': {}, 'checkou2': {'2c2ssve': F2l8e}, 'con2r2c2H28h': '9ss4d3bb8-d8b2-ssb26-2399-f788ef429ssc4', 'ssn822lmen28': [{'ssn822lmen2': 8, 'v2lue': '809.37', 'dueD22e': '2023-0ss-22', '2o22l': '809.37', '8222u8': 8, 'v2lue8': [{'ssof': 0, 'ce2': 0, 'v2lue': '809.37', '2o22l': '809.37'}], '22xe8': {'ssof': {'percen22ge': 0, '2o22lV2lue': 0}, 'ce2': {'ye2rPercen22ge': 0, 'mon2hPercen22ge': 0, '2o22lV2lue': 0}, 'ssn2ere82': {'ye2rPercen22ge': 0, 'mon2hPercen22ge': 0, '2o22lV2lue': 0}}}, {'ssn822lmen2': 2, 'v2lue': '809.37', 'dueD22e': '2023-06-22', '2o22l': '809.37', '8222u8': 6, 'v2lue8': [{'ssof': 0, 'ce2': 0, 'v2lue': '809.37', '2o22l': '809.37'}], '22xe8': {'ssof': {'percen22ge': 0, '2o22lV2lue': 0}, 'ce2': {'ye2rPercen22ge': 0, 'mon2hPercen22ge': 0, '2o22lV2lue': 0}, 'ssn2ere82': {'ye2rPercen22ge': 0, 'mon2hPercen22ge': 0, '2o22lV2lue': 0}}}, {'ssn822lmen2': 3, 'v2lue': '809.37', 'dueD22e': '2023-07-22', '2o22l': '809.37', '8222u8': 6, 'v2lue8': [{'ssof': 0, 'ce2': 0, 'v2lue': '809.37', '2o22l': '809.37'}], '22xe8': {'ssof': {'percen22ge': 0, '2o22lV2lue': 0}, 'ce2': {'ye2rPercen22ge': 0, 'mon2hPercen22ge': 0, '2o22lV2lue': 0}, 'ssn2ere82': {'ye2rPercen22ge': 0, 'mon2hPercen22ge': 0, '2o22lV2lue': 0}}}]}}}

| curl method=post uri="my_uri" debug=true datafield=data headerfield=header verifyssl=false
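One thing that stands out, offered as a guess since the actual error is not shown: as far as one can tell through the anonymization, the payload is a Python-style dict literal (single quotes, capitalized booleans), not valid JSON, and curl posts the field verbatim. A minimal shape that the same command can post (keys and uri are placeholders; headerfield omitted since its expected format is not shown):

| makeresults
| eval data="{\"key\": \"value\", \"flag\": true}"
| curl method=post uri="my_uri" debug=true datafield=data verifyssl=false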
Hello, here is the deal: I am following this link to ingest Cisco Umbrella logs into Splunk: https://support.umbrella.com/hc/en-us/articles/360001388406-Configuring-Splunk-with-a-Cisco-managed-S3-Bucket (I know, you will say: why not use one of the existing apps on Splunkbase? I will say: I really don't know which one to use, they all seem to do the same thing, plus I just want the raw logs, so getting them directly from the bucket is good enough.) I downloaded the logs and stored them on the HF under /opt/ciscologs/, and I configured a simple inputs.conf to read those files, with the idea that they would be sent to the indexers (I already have the outputs.conf that sends data from the HF to the indexers). But I don't see any logs being indexed, and I don't see any events on the search head. Here is the inputs.conf on the HF:

[monitor:///opt/ciscologs/dnslogs/*]
index = index_name
sourcetype = csv
#whitelist = 2023-*/*
disabled = 0
crcSalt = <SOURCE>
#_TCP_ROUTING = default-autolb-group

(The commented lines are settings I have tried, but still no luck.) Any suggestions? I am out of ideas. Thanks.
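Two checks that often narrow this down, assuming the files really sit under /opt/ciscologs/dnslogs/ and are readable by the splunk user. First, ask splunkd what it thinks of the input; second, look for tailing errors in _internal (which a HF forwards by default):

$SPLUNK_HOME/bin/splunk list inputstatus

index=_internal sourcetype=splunkd (component=TailReader OR component=WatchedFile OR component=TailingProcessor) "/opt/ciscologs"

It is also worth confirming that index_name actually exists on the indexers; events sent to a nonexistent index are dropped, with a warning in splunkd.log.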
Hi everyone, I'm trying to create an exception for EventID 8004 from the path C:\Program Files (x86)\Adobe\Acrobat Reader DC\Reader\acrocef_1\RdrCEF.exe. I need to receive EventID 8004 in Splunk, but not from RdrCEF.exe. I'm trying to use the blacklists below, but I still get events from this path. I suspect the regex is incorrect. Can someone help?

Path: C:\Program Files (x86)\Adobe\Acrobat Reader DC\Reader\acrocef_1\RdrCEF.exe

Regex usage:

blacklist = EventCode = "^8004$" FullFilePath = "C:\\Program\sFiles\s\(x86\)\\Adobe\\Acrobat\sReader\sDC\\Reader\\acrocef\_1\\RdrCEF\.exe"
blacklist1 = EventCode = "^8004$" Message = "C:\\Program\sFiles\s\(x86\)\\Adobe\\Acrobat\sReader\sDC\\Reader\\acrocef\_1\\RdrCEF\.exe"

In Event Viewer the trigger is: %PROGRAMFILES%\ADOBE\ACROBAT READER DC\READER\ACROCEF_1\RDRCEF.EXE
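A guess at what is going on: Event Viewer shows the path in its uppercase, environment-variable form, so a mixed-case pattern anchored on C:\Program Files may never match the actual Message text (and whether FullFilePath exists as a key in these events is worth verifying against a raw event). A sketch that matches case-insensitively on just the distinctive tail of the path, checked against one raw event first:

blacklist1 = EventCode="8004" Message="(?i)ACROCEF_1\\RDRCEF\.EXE"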
I'm trying to ship a JSON data set. My code works fine for files smaller than about 10 KB, but fails for larger files.

import json

import pandas as pd
import requests

url = "https://xxx-xxx-xxx.splunkcloud.com:8088/services/collector"
headers = {"Authorization": "Splunk xxx-xxx-xxx-xxx-xxx"}

with open('empire_psremoting_stager_2020-09-20170827.json') as file:
    data = pd.read_json(file, lines=True)

for count in range(data.shape[0]):
    # Serialize the row as a dict; str() of a Series truncates long rows
    # with "...", so let json.dumps handle the conversion instead
    event = data.loc[count].to_dict()
    payload = {"event": event}
    response = requests.post(url, headers=headers, data=json.dumps(payload, default=str), verify=False)
    if response.status_code == 200:
        print("Data sent")
    else:
        print("Failed", response.status_code, response.reason)

Does anyone have other alternatives, or can you look at my code?
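For larger files, one request per row is also slow and more likely to hit timeouts; HEC accepts multiple event objects concatenated in a single POST body, so batching is a common alternative. A sketch under the same URL/token placeholders as above (batch size is an assumption to tune against the HEC payload limit):

import json

import pandas as pd
import requests

url = "https://xxx-xxx-xxx.splunkcloud.com:8088/services/collector"
headers = {"Authorization": "Splunk xxx-xxx-xxx-xxx-xxx"}
BATCH_SIZE = 100  # events per request

with open('empire_psremoting_stager_2020-09-20170827.json') as file:
    data = pd.read_json(file, lines=True)

rows = [row.to_dict() for _, row in data.iterrows()]
for start in range(0, len(rows), BATCH_SIZE):
    batch = rows[start:start + BATCH_SIZE]
    # HEC allows concatenated event objects in one body
    body = "\n".join(json.dumps({"event": e}, default=str) for e in batch)
    response = requests.post(url, headers=headers, data=body, verify=False)
    response.raise_for_status()
    print(f"Sent {len(batch)} events starting at row {start}")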
Hello, I have a question about a custom table. I need a layout with merged cells, but in Splunk it's not possible to merge cells.

Here is what I need to produce: [screenshot not included]

For the moment, I have made this: [screenshot not included]
I need a query that will provide the earliest date of data within each index, as well as the indexer it is stored on; specifically, I am looking for which indexes are storing data for one year or more, and on which indexers that data resides. Any ideas?
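A sketch using bucket metadata, assuming "earliest date" can be taken from the oldest bucket's start time (dbinspect reports per-bucket spans plus the reporting indexer in splunk_server; run it over All Time, since dbinspect only returns buckets within the search window):

| dbinspect index=*
| stats min(startEpoch) as earliest_epoch by index splunk_server
| where earliest_epoch <= relative_time(now(), "-1y")
| eval earliest_date=strftime(earliest_epoch, "%Y-%m-%d")
| table index splunk_server earliest_date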
Hi Team, I want to get the time when the server goes down.

time             status
6/2/2023 12:55   down
6/3/2023 12:52   down
6/4/2023 12:50   down
6/4/2023 12:46   up
6/4/2023 12:45   down
6/4/2023 12:45   down

The output I want to display is the time the server went down:

6/4/2023 12:45   down

Thanks in advance!
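It is not fully clear which "down" row should win, so here is a sketch under one interpretation: flag transitions, i.e. each "down" event whose previous event in time order was not also "down" (field names as in the table above):

| sort 0 _time
| streamstats current=f window=1 last(status) as prev_status
| where status="down" AND (isnull(prev_status) OR prev_status="up")
| table _time status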
Hi Guys, we have a distributed environment with Search Heads/Indexers/Deployment Server/License Master/Heavy Forwarder etc. in our architecture. All servers are on Splunk version 8.2.4. We are thinking of upgrading to 9.0.4. What is the best way of doing this? I mean, can we upgrade the Search Head to 9.0.4 and upgrade the other servers later? In other words, can a 9.0.4 Search Head talk to an 8.2.4 indexer? I could not find a document on SH-IDX compatibility. Since we have multiple servers, we cannot upgrade them all at once. Any help would be appreciated.
Hello, I have an issue with JSON data being ingested into Splunk using a Universal Forwarder. Sometimes the JSON entries are ingested as individual events, and other times the entire content is loaded as one single event. I searched for special characters that might be causing this, but I wasn't able to find any. Attached is a screenshot with two examples: one that is loaded as expected, and another where the JSON is not correctly parsed. Has anyone faced something similar? What should I do to fix it?
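Assuming each JSON object sits on its own line in the source file, a props.conf sketch for the first full Splunk instance in the path (indexer or heavy forwarder; the sourcetype name is a placeholder) that breaks strictly on newlines instead of merging lines:

[my_json_sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TRUNCATE = 500000
KV_MODE = json

If the file instead contains one large pretty-printed JSON document, line-based breaking will not help and a different LINE_BREAKER would be needed.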
Hey Guys, I have a problem: I have two time fields, because one of the sources is a JSON file and there is another script running, but now I have to mix both, so I created this eval:

| eval _time=if(match(source, "[(\S).json]\w+"), strptime(time, "%Y/%m/%d %H:%M:%S.%f"), _time)

But when I search, I only receive results from the .json source. Is there another way to "join" them, or at least make the "if" statement leave the other sources alone and only modify the JSON source?
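One observation, hedged since the full search is not shown: the bracketed pattern is a regex character class, so it matches single characters rather than the literal ".json". A sketch that tests the source name directly and leaves every other event untouched:

| eval _time=if(match(source, "\.json$"), strptime(time, "%Y/%m/%d %H:%M:%S.%f"), _time)

If only JSON events still come back after this, the restriction is probably elsewhere in the search rather than in the eval, since if() here never removes events.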
I'm trying to store the results into a sourcetype, and then use that saved sourcetype to check whether an event is already there or not. If it's not in the saved sourcetype, I insert the event. If the event is there and its time is greater than the time from the current search, I update the event with the earliest one. But the query below fails to display the time (savedTime) from the saved sourcetype, so it fails to mark events for update. Any advice would be very much appreciated. Thanks in advance!

source=testSource
| stats count earliest(_time) as Time first(host) as host first(source) as source by EventCode
| join type=left EventCode
    [ search index=main sourcetype=saved_sourcetype
    | eval savedTime=strptime(Time, "%Y-%m-%d %H:%M:%S")
    | stats count as Known values(Time) as sTime values(host) as host values(source) as source by EventCode]
| fillnull Known value=0
| eval insertRequired=if(Known=0, "Yes", "No")
| eval UpdateRequired=if(Time < savedTime, "Yes", "No")
| eval SaveAction=case(
    insertRequired == "Yes" AND UpdateRequired != "Yes", "insert",
    UpdateRequired == "Yes", "update",
    1=1, "ignore")
| eval Time=strftime(Time, "%Y-%m-%d %H:%M:%S")
| stats count earliest(savedTime) as savedTime latest(Time) as Time values(Known) as Known first(host) as host last(source) as source by EventCode, SaveAction, insertRequired, UpdateRequired
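A likely cause, based on the query as posted: inside the subsearch, stats only keeps the fields it aggregates, so savedTime is dropped and never comes back through the join; the later comparison Time < savedTime then evaluates against a missing field and UpdateRequired is always "No". A sketch that carries it through (everything else unchanged):

    [ search index=main sourcetype=saved_sourcetype
    | eval savedTime=strptime(Time, "%Y-%m-%d %H:%M:%S")
    | stats count as Known min(savedTime) as savedTime values(Time) as sTime values(host) as host values(source) as source by EventCode]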
Hello all,

My setup has two hubs: Hub1 with two indexers and a Master Cluster node, and Hub2 with another two indexers and a standby Master Cluster node. Hub1 is working as expected, but Hub2 is unable to contact the MCN. None of the three servers in Hub2 can perform the health check of the MCN, and when I run

curl -k -vvv https://IP:8089/services/server/health/splunkd/local

it just times out. I can curl the local MCN in each hub and get the expected Unauthorized reply, but not between hubs. When I run tcpdump I can see the traffic hitting the MCN but never completing. Below is the output of the connection:

* Trying IP
* Connected to IP (IP) port 8089 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* Successfully set certificate verify locations:
*   CAfile: /etc/pki/tls/certs/ca-bundle.crt
*   CApath: none
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
* OpenSSL SSL_connect: SSL_ERROR_SYSCALL in connection to IP:8089
* Closing connection 0

Any help would be much appreciated.

Regards,
Leon
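Reading the trace, hedged since only the client side is shown: the TCP connect succeeds but the connection dies on the first TLS record (the Client Hello), which usually points at something on the path dropping or resetting the handshake (firewall, proxy, MTU) rather than a Splunk certificate problem. A way to test the handshake in isolation, independent of curl and of splunkd's HTTP layer:

openssl s_client -connect IP:8089

If this also stalls after the Client Hello between hubs but completes locally, the issue is on the network path rather than in the Splunk configuration.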
Hi, I am trying to establish a query that checks whether a given src IP is in a specific subnet. However, all the subnets and IP addresses are in string format, and I am unable to establish any mathematical relationship between the conditions. Here is part of my current query:

| inputlookup ABC.csv
| eval ip = "10.1.2.342"
| eval AMERICAS = if(ip >= "10.0.0.1" OR ip <= "10.63.255.254", "NOK", "OK")
| table AMERICAS

Can you please help? Many thanks as always.
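SPL has a function for exactly this, so no numeric comparison is needed: cidrmatch() tests an IP string against a CIDR subnet. A sketch, assuming the intended range 10.0.0.0–10.63.255.255, which is the single block 10.0.0.0/10 (the sample IP is a placeholder, corrected to a valid address):

| inputlookup ABC.csv
| eval ip="10.1.2.42"
| eval AMERICAS=if(cidrmatch("10.0.0.0/10", ip), "OK", "NOK")
| table AMERICAS

If the subnets live in the lookup as a CIDR column, cidrmatch(subnet_field, ip) works the same way against the field.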
index="go_pro" Appid="APP-5f" prod (":[ Axis" OR "ErrorCode" OR "System Error" OR "Invalid User :")
| rex field=_raw "ErrorDesc\:\s(?<error_caused_by>.*?)\Z"
| rex field=_raw "calldm\(\)\s\:\[\s(?<error_caused_by>.*?)\Z"
| rex field=_raw "app5f\-(?<Environment>.*?)\-\Z"
| convert timeformat="%m-%d-%Y %I:%M:%S" ctime(_time) AS time
| stats count by time error_caused_by Environment host
| reverse

I am using this query, but some transactions match each other, so the count gets up to 5 or 6. I want every transaction to come out on its own line even when they match. Please help me segregate the count, or limit the count to 1.
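If each event should stay on its own row, the aggregation can simply be dropped; a sketch that keeps one line per event with a constant count of 1 (same base search and extractions as above):

index="go_pro" Appid="APP-5f" prod (":[ Axis" OR "ErrorCode" OR "System Error" OR "Invalid User :")
| rex field=_raw "ErrorDesc\:\s(?<error_caused_by>.*?)\Z"
| rex field=_raw "calldm\(\)\s\:\[\s(?<error_caused_by>.*?)\Z"
| rex field=_raw "app5f\-(?<Environment>.*?)\-\Z"
| eval time=strftime(_time, "%m-%d-%Y %I:%M:%S")
| eval count=1
| sort 0 _time
| table time error_caused_by Environment host count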