
Hello, can anyone please help me with line breaking? Multiple security events are being merged into a single event; the timestamp where the event should break is highlighted below. The event below should be split into 2.

sourcetype = claroty:cef

2021-10-02 14:37:17 User.Info 10.10.32.132 Oct 02 2021 04:37:17 server026 CEF:0|Claroty|CTD|4.3.1|Alert/Host Scan|Host Scan|10|src=10.206.164.120 smac=2c:00:00:0f:9a:ad:73 dmac=28:0000:61:f7:7e:c3 externalId=24459 cat=Security/Host Scan start=Oct 02 2021 12:43:36 msg=ICMP Host scan: Asset 10.5.164.10 sent packets to different IP destinations deviceExternalId=INDO - CIBITUNG cs1Label=SourceAssetType cs1=Endpoint cs3Label=SourceZone cs3=Endpoint: Other cccs4Label=DestZone cs4=Endpoint: MQTT cs6Label=CTDlink cs6= 2021-10-02 14:37:17 User.Info 10.10.32.132 Oct 02 2021 04:37:17 server026 CEF:0|Claroty|CTD|4.3.1|Alert/Host Scan|Host Scan|10|src=10.206.163.251 smac=28:00:00:f7:7e:cb shost=DESKTOP-0C11AFV dmac=00:1b:1b:2c:12:1a externalId=24460 cat=Security/Host Scan start=Oct 02 2021 12:43:42 msg=ICMP Host scan: Asset 10.51.163.251 sent packets to different IP destinations deviceExternalId=INDO - CIBITUNG cs1Label=SourceAssetType cs1=Endpoint cs3Label=SourceZone cs3=Endpoint: Other cccs4Label=DestZone cs4=PLC: S7 cs6Label=CTDlink cs6=

Current props.conf on the search head:

[claroty:cef]
LINE_BREAKER = ([\r\n]*)[0-9]{4}-(0[1-9]|1[0-2])-(0[1-9]|[1-2][0-9]|3[0-1])(2[0-3]|[01][0-9]):[0-5][0-9]
BREAK_ONLY_BEFORE = ([\r\n]*)[0-9]{4}-(0[1-9]|1[0-2])-(0[1-9]|[1-2][0-9]|3[0-1])(2[0-3]|[01][0-9]):[0-5][0-9]
REPORT-cs1 = claroty_cef_1
REPORT-cs2 = claroty_cef_2
REPORT-cs3 = claroty_cef_3
REPORT-cs4 = claroty_cef_4
REPORT-cs5 = claroty_cef_5
REPORT-cs6 = claroty_cef_6
REPORT-cs7 = claroty_cef_7
REPORT-cs8 = claroty_cef_8
REPORT-cs9 = claroty_cef_9
EXTRACT-msg = msg=(?<msg>.*?).x00$
EXTRACT-rt = rt=(?<rt>\w\w\w\s\d\d\s\d\d\d\d\s\d\d:\d\d:\d\d)
EXTRACT-alert = Alert\|(?<Alert>.+?)\|
EXTRACT-event = Event\|(?<Event>.+?)\|
FIELDALIAS-AssignedTo = ResolvedAs AS AssignedTo

An additional question: once we get the correct settings into props.conf and run the search again, will the existing data be corrected, or will the change only apply to new events ingested into Splunk?
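A sketch of a corrected stanza, untested against this particular feed. Two things stand out in the original: the regex has no space between the date and the hour, so it never matches "2021-10-02 14:37:17", and LINE_BREAKER should not normally be combined with BREAK_ONLY_BEFORE (the latter implies line merging). Also note these are index-time settings: they belong on the indexer or heavy forwarder that first parses the data, not the search head, and they only affect events ingested after the change; already-indexed events are not re-broken.

```
[claroty:cef]
SHOULD_LINEMERGE = false
# Break before "YYYY-MM-DD HH:MM:SS" (the original pattern was missing
# the space between the date and the time).
LINE_BREAKER = ([\r\n]*)\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 19
```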
I use the SPL below to find which hosts are logging in my environment and how far the timestamp of the last event sent by each host is from the current time.

| eval time_zone_diff = Now() - lastTime
| eval recent_time = Now() - recentTime
| sort time_diff

The results are hard to read: instead of reporting which hosts are affected, it just shows that so many are behind and so many are ahead, with no host listed. Would you help with an SPL (not the Meta Woot! app, which I could not configure properly) to list which hosts are late and which are reporting events in the future, please? Thanks a million in advance.
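A sketch built on the metadata command, whose results carry host, firstTime, lastTime, and recentTime fields. The one-hour "late" threshold is an arbitrary assumption, and note that the eval function is lowercase now(), not Now():

```
| metadata type=hosts index=*
| eval lag_seconds = now() - lastTime
| eval status = case(lag_seconds < 0, "reporting in the future",
                     lag_seconds > 3600, "late",
                     true(), "current")
| convert ctime(lastTime) AS last_event_time
| table host last_event_time lag_seconds status
| sort - lag_seconds
```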
Hi, my organisation is using Splunk SaaS and would like to integrate with ServiceNow using add-ons. In order to get security sign-off, I would like to know the following:
1. Where are the basic ServiceNow credentials stored in the Splunk Web UI? Can someone edit them, and who has permission to view that password?
2. What TLS version does Splunk SaaS use to communicate with the ServiceNow add-ons?
Hi, I am trying to import a SAML cert into Phantom 5.0.1. The phenv script is deprecated in this version. Is there another script to do this, or will copying the cert into cacerts.pem work in this case? Thanks
I've seen a few of my colleagues recently use a command called multireport, which seems to be largely undocumented, to solve some search conditions that have been very challenging to solve otherwise. Is there a Splunk dev close to the author of multireport who would care to shed light on this, with a full synopsis of how the command can be used and any arguments it accepts?
I'm having trouble getting all the fields from Sysmon to parse automatically with the Microsoft Sysmon add-on. Could someone tell me what I might be missing? The events are coming into my home Splunk instance (8.2.2) but are not being fully parsed. I'm pretty sure I need to use a transform, but the one I've tried isn't working (I may well have done it wrong). I've installed Sysmon on my home computer and pointed the universal forwarder at my home Splunk instance. I followed the guide I found here: https://hurricanelabs.com/splunk-tutorials/splunking-with-sysmon-series-part-1-the-setup/

As you can see in the screenshot, only some of the fields were extracted, and the IMPHASH value carried over into some other data.

inputs.conf for Sysmon:

[WinEventLog://Microsoft-Windows-Sysmon/Operational]
disabled = false
renderXml = true
source = XMLWinEventLog:Microsoft-Windows-Sysmon/Operational

Output of my transforms.conf, with the path:

cat /opt/splunk/etc/apps/search/default/transforms.conf
[geo_us_states]
external_type = geo
filename = geo_us_states.kmz
[geo_countries]
external_type = geo
filename = geo_countries.kmz
[geo_attr_us_states]
filename = geo_attr_us_states.csv
[geo_attr_countries]
filename = geo_attr_countries.csv
[geo_hex]
external_type=geo_hex
[xmlwineventlog]
REGEX = "Data Name\=\'(?<_KEY_1>[A-Za-z]+)\'>(?<_VAL_1>[^<]+)<\/Data>"
DELIMS = "'>"

Here's a sample event straight from _raw (I looked this event over; nothing seemed overly sensitive):

<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Sysmon' Guid='{5770385f-c22a-43e0-bf4c-06f5698ffbd9}'/><EventID>1</EventID><Version>5</Version><Level>4</Level><Task>1</Task><Opcode>0</Opcode><Keywords>0x8000000000000000</Keywords><TimeCreated SystemTime='2021-10-05T22:36:40.9004216Z'/><EventRecordID>7090</EventRecordID><Correlation/><Execution ProcessID='5908' 
ThreadID='7428'/><Channel>Microsoft-Windows-Sysmon/Operational</Channel><Computer>eelo</Computer><Security UserID='S-1-5-18'/></System><EventData><Data Name='RuleName'>-</Data><Data Name='UtcTime'>2021-10-05 22:36:40.899</Data><Data Name='ProcessGuid'>{ce4bb586-d378-615c-5b1e-000000007100}</Data><Data Name='ProcessId'>10476</Data><Data Name='Image'>C:\Program Files\SplunkUniversalForwarder\bin\splunk-powershell.exe</Data><Data Name='FileVersion'>-</Data><Data Name='Description'>-</Data><Data Name='Product'>-</Data><Data Name='Company'>-</Data><Data Name='OriginalFileName'>-</Data><Data Name='CommandLine'>"C:\Program Files\SplunkUniversalForwarder\bin\splunk-powershell.exe"</Data><Data Name='CurrentDirectory'>C:\WINDOWS\system32\</Data><Data Name='User'>NT AUTHORITY\SYSTEM</Data><Data Name='LogonGuid'>{ce4bb586-a1ce-615c-e703-000000000000}</Data><Data Name='LogonId'>0x3e7</Data><Data Name='TerminalSessionId'>0</Data><Data Name='IntegrityLevel'>System</Data><Data Name='Hashes'>MD5=1F8722C371906F7B659FA38B39B21661,SHA256=383581B2E6BE7003CCCC0DAFAE75CBA3B0885C441ACDBD9AE76EAAFD9602A022,IMPHASH=1BDECF92268D3D3EF70015DDFEB0FFB9</Data><Data Name='ParentProcessGuid'>{ce4bb586-d191-615c-2a1d-000000007100}</Data><Data Name='ParentProcessId'>18812</Data><Data Name='ParentImage'>C:\Program Files\SplunkUniversalForwarder\bin\splunkd.exe</Data><Data Name='ParentCommandLine'>"C:\Program Files\SplunkUniversalForwarder\bin\splunkd.exe" service</Data></EventData></Event>      
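A few things in the pasted transforms.conf look off, for what it's worth: REGEX and DELIMS are mutually exclusive within a single transform, the surrounding quotes on the REGEX value become literal characters in the pattern, and a transform does nothing until a props.conf stanza for the events' sourcetype references it with REPORT-. A minimal sketch follows; the sourcetype stanza name is an assumption based on the inputs.conf source setting, so verify the actual value first with `| stats count by sourcetype`:

```
# transforms.conf -- pull each <Data Name='X'>value</Data> pair out as a
# dynamic key/value extraction
[sysmon_xml_kv]
REGEX = <Data Name='(?<_KEY_1>[^']+)'>(?<_VAL_1>[^<]*)</Data>

# props.conf -- the transform only runs if the sourcetype references it
[XmlWinEventLog:Microsoft-Windows-Sysmon/Operational]
REPORT-sysmon_kv = sysmon_xml_kv
```

Since the events are well-formed XML, another option on the same props stanza is KV_MODE = xml, though that produces longer field names derived from the XML paths.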
We are running a single Splunk Enterprise 8.1 instance on a Linux AWS EC2 instance, using SmartStore backed by S3. We have /opt/splunk (including the index volume/cache) on a separate partition from the OS. I have not found any documentation addressing OS-level partitioning. Are there performance concerns with our current configuration? Could a spike in the size of non-index files on the Splunk partition cause performance issues or caching abnormalities with SmartStore? Does it make sense to separate the indexes (./lib/) onto their own partition, apart from both the OS and the Splunk config/app/log and other transient files?
Hey, we have cisco:esa:cef coming in, and while we are getting the extractions we need from the main part of the log, we are also getting thousands of other fields extracted because part of the log message contains non-CEF data. Is there a way to disable these extra search-time extractions? I've tried KV_MODE=none but it doesn't affect it.
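For reference, KV_MODE is a search-time props.conf setting keyed by sourcetype, and it must live in an app whose props are visible to the users running the search; it also does not disable REPORT- or EXTRACT- based extractions shipped in a TA, which may be what is firing here. A minimal sketch:

```
# props.conf on the search head, in an app shared with the searching users
[cisco:esa:cef]
KV_MODE = none
```

To see which stanza actually wins for this sourcetype, check the effective configuration with `$SPLUNK_HOME/bin/splunk btool props list cisco:esa:cef --debug`.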
I'm attempting to get counts for two fields, Description and ActionDescription, along with their values, after first counting by another field with a where clause over a period of time. This is what I'm wanting: UserName Description DescriptionCount ActionDescription ActionDescriptionCount Count _time Andy SSO Send to home update password 1 1 1 1 Sign in Sign in successful 1 1 4 10/5/2021 15:00 Bob Authentication Successful Sending to SecondFactor Sent token via SMS Successfully Authorized 1 2 1 3 1 Sign in Sign in successful Sign in failed 1 1 2 8 10/5/2021 17:00

This is the closest I've got, but there are times when either DescriptionCount or ActionDescriptionCount misses a count for the corresponding Description or ActionDescription:

index=foo source=bar
| bin _time span=1h
| fillnull value="0"
| eventstats count by UserName _time
| where count > 500
| rename count as UserNameCount
| eventstats count by Description
| rename count as DescriptionCount
| eventstats count by ActionDescription
| rename count as ActionDescriptionCount
| stats values(ActionDescription) as ActionDescriptionValues values(ActionDescriptionCount) as ActionDescriptionCount values(Description) as Description values(DescriptionCount) as DescriptionCount values(_time) as "Time Frame(s)" count by UserName
| convert ctime("Time Frame(s)")
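One likely cause of the mismatched counts: the `eventstats count by Description` and `by ActionDescription` calls count across all users and hours, not per user per hour. A sketch that scopes every count to the UserName/_time pair (index, source, and the 500 threshold taken from the question as-is):

```
index=foo source=bar
| bin _time span=1h
| eventstats count as UserNameCount by UserName _time
| where UserNameCount > 500
| eventstats count as DescriptionCount by UserName _time Description
| eventstats count as ActionDescriptionCount by UserName _time ActionDescription
| stats values(Description) as Description
        values(DescriptionCount) as DescriptionCount
        values(ActionDescription) as ActionDescription
        values(ActionDescriptionCount) as ActionDescriptionCount
        max(UserNameCount) as Count
        by UserName _time
| convert ctime(_time)
```

Caveat: values() dedupes and sorts its results, so a value and its count can still drift out of alignment in the multivalue columns; if the pairing matters, combine them first, e.g. `| eval DescPair = Description . " (" . DescriptionCount . ")"` and take values(DescPair) instead.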
Does anyone have any information about how to use the new alert actions? We created a simple alert with output greater than 0, added the account name, and pasted in a simple KQL query. Nothing happens.

Release Notes, Version 1.3.0 (May 21, 2021). Alert actions introduced:
Advanced Hunting alert action runs advanced hunting queries on entities to ingest additional detail.
Incident Update alert action updates the Microsoft 365 Defender portal from a Splunk search.
I'm working with a standalone Splunk 8.1.3 instance with Splunk CIM 4.20.2. I have several accelerated data models that are populating properly. I also have a couple of data sources, specifically an ISC DHCP server logging to a custom UDP port and a Palo Alto firewall logging to its own index, whose data I'm not finding in the data models. pan:traffic from the Palo Alto index should constitute Network Session data, and the ISC DHCP data likewise. Is there a way to find out why that data isn't being categorized that way, and how can I get it in there properly? Thanks,
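For context, the CIM data models select events by tags assigned through eventtypes (the Network Sessions model expects events tagged network and session), so a quick diagnostic is to check which eventtypes and tags the raw events actually carry. A sketch, with the index name being an assumption:

```
index=pan_logs sourcetype=pan:traffic earliest=-1h
| stats count by eventtype tag
```

If the expected tags are missing, the relevant TA's eventtypes are probably not matching the data. It can also help to run the model's own constraint search directly, e.g. `| datamodel Network_Sessions search | head 10`, to see whether anything is selected at all.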
How do you export all rules from Splunk for an internal audit request?
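Assuming "rules" here means saved searches and alerts, one hedged approach is the rest command against the saved/searches endpoint, then exporting the results from the UI (or appending | outputcsv):

```
| rest /servicesNS/-/-/saved/searches splunk_server=local
| search is_scheduled=1
| table title eai:acl.app eai:acl.owner cron_schedule disabled search
```

Drop the `is_scheduled=1` filter to include unscheduled saved searches as well.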
I am using Splunk to review logs from disconnected systems. We have the users export the .evtx files and send them to us; I then put them in a folder and Splunk indexes the new files. Is there an easy way to monitor the indexing process? Right now I just keep hitting refresh occasionally until nothing changes.
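One possible way to watch monitor-input progress, offered as a sketch: the inputstatus REST endpoint reports the tailing processor's per-file read position on that instance.

```
| rest /services/admin/inputstatus/TailingProcessor:FileStatus splunk_server=local
```

Alternatively, indexing throughput over time can be charted from the internal metrics, e.g. `index=_internal source=*metrics.log group=per_sourcetype_thruput | timechart span=1m sum(kb)`.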
Hi @jkat54, thank you for creating this wonderful app. I have a use case that requires executing remote searches from one independent search head to another, using auth tokens. I am able to do so with the Linux curl command, using the following syntax:

curl -k -H "Authorization: Bearer eyJraWQiOiJzcGx1bmsuc2VjcmV0IiwiYWxnIjoiSFM1MTIiLCJ2ZXIiOiJ2MiIsInR0eXAiOiJzdGF0aWMifQ.eyJpc" https://localhost:8089/services/search/jobs/export -d output_mode=csv -d search="search index=_internal | head 10"

I would like to know how to translate the above into a search command leveraging the webtools add-on. Thanks in advance for your help.
Hi, I have a table on a dashboard whose click.value is two numeric params joined like this: NumA_NumB. I have a second dashboard that I want to drill down to from that table, and it has form inputs for NumA and NumB. Is there any way I can split click.value and use it to populate the two form fields from the drilldown? I've tried using mvindex(split($click.value$,"_"),0) but it doesn't work. I can populate both form fields with NumA_NumB; I just can't work out how to split it.
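In Simple XML, drilldown <eval> elements accept eval expressions like mvindex(split(...)), and the tokens they set can be referenced in the <link>. A sketch, with the app path and dashboard name being placeholder assumptions:

```
<drilldown>
  <!-- split "NumA_NumB" into two tokens before following the link -->
  <eval token="tokNumA">mvindex(split($click.value$,"_"),0)</eval>
  <eval token="tokNumB">mvindex(split($click.value$,"_"),1)</eval>
  <link target="_blank">/app/search/second_dashboard?form.NumA=$tokNumA$&amp;form.NumB=$tokNumB$</link>
</drilldown>
```

Passing the values as form.NumA and form.NumB query parameters should prefill the matching inputs on the target dashboard.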
This morning all my dashboards across multiple apps stopped showing data. I rebooted all my Splunk servers and restarted the services. I am also unable to view any health information under the monitoring console; all of that data is blank as well.
Hi, could someone help me with the issue below? In Splunk Cloud I have 500+ events, and each event contains 100+ lines of data. When exporting to a CSV file, a single event is split across multiple rows, which should not happen. I need the data row-wise, matching the Splunk results, without splitting. Is there a limitation per single row when exporting to CSV? Here is the screenshot for reference: the 2nd and 3rd rows are a single event (split into 2 rows), as are rows 5 & 6 and rows 8 & 9; the data in rows 4 and 7 is fine.
I have a field, let's say the user field, that has some usernames without a domain and some with. I want the values that don't have a domain to have it added.

Example:
sparky1
sparky2@splunk.com

I want to be able to append @splunk.com to the sparky1 value, without adding it again to sparky2@splunk.com.
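A sketch with eval: match() tests for an @, and the domain is appended only when it is absent. The makeresults lines just fabricate the two sample values for a runnable demo:

```
| makeresults
| eval user = split("sparky1,sparky2@splunk.com", ",")
| mvexpand user
| eval user = if(match(user, "@"), user, user . "@splunk.com")
```

This should yield sparky1@splunk.com while leaving sparky2@splunk.com untouched; in the real search, only the final eval line is needed.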
How can I convert a timestamp in the format 20211005000000 to, for example, 2021/10/05, with the time in another field?
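A sketch using strptime/strftime; the source field name raw_time is an assumption, as are the exact output formats:

```
| eval ts = strptime(raw_time, "%Y%m%d%H%M%S")
| eval date = strftime(ts, "%Y/%m/%d")
| eval time_of_day = strftime(ts, "%H:%M:%S")
```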
Hello, I followed the documentation to export health rules from one app as follows:

curl -k --user admin@customer1:password https://controllerFQDN:8181/controller/healthrules/35 >> healthrules.xml

Then I tried importing the health rules into a different app using the following:

curl -k -X POST --user admin@customer1:password https://controllerFQDN:8181/controller/healthrules/52 -F file=@healthrule.xml

I get the following error: "Min triggers should be within 0 and 1." I am not sure what that means or whether I am doing anything wrong; I followed the documentation exactly as written. Thanks,