I want to use join, but the fields to be compared are called _time and b. However, when I run join _time, b [sub_search], the result only contains data for a single day. What should I do?
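A minimal sketch of the join syntax on both fields (the index names and the extra field are placeholders). Note that the join subsearch has its own result limits (commonly around 50,000 rows) and its own time range, either of which can cause only part of the expected date range to come back:

index=main_index
| join type=inner _time b
    [ search index=other_index
    | fields _time b value ]
| table _time b value

If the limits are the problem, a stats-based correlation on _time and b over both data sets often avoids join entirely.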
| chart values(Date_Policy) BY Volume,WeekRange,
In the above command I want to add host to the BY clause as well, but I am not getting results for it. Can anyone help me fix this?
| chart values(Date_Policy) BY Volume,WeekRange, host
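One thing to note: chart accepts at most two fields after BY (a row split and a column split), which may be why adding host returns nothing. A sketch of two workarounds, using the original field names (the combined field name vol_week is made up for illustration):

| eval vol_week = Volume . "/" . WeekRange
| chart values(Date_Policy) BY vol_week host

or switch to stats, which allows any number of BY fields:

| stats values(Date_Policy) AS Date_Policy BY Volume WeekRange host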
Thanks in advance. We have a scenario where we need to send alerts multiple times.
1. Lagging. E.g. let's put the threshold time at 1 hour:
20/02 10:00 AM NZT : Lagging encountered after the 1 hour threshold -> Alert #1 via email with subject: total lag time
20/02 11:00 AM NZT : Lagging still occurring -> Alert #2 via email with subject: total lag time 2 hours (accumulated lagging hours) + create an incident in ServiceNow
20/02 13:00 PM NZT : Lagging encountered after the 1 hour threshold -> Alert #3 via email with subject: total lag time
20/02 14:00 PM NZT : Lagging encountered after the 1 hour threshold -> Alert #4 via email with subject: total lag time --- got fixed
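A sketch of a scheduled search that could drive alerts like these, assuming "lagging" means no new events arriving and that the index name and threshold are placeholders; the repeated emails and the ServiceNow incident would be configured as the alert's actions:

index=my_index
| stats max(_time) AS last_event_time
| eval lag_hours = round((now() - last_event_time) / 3600, 1)
| where lag_hours >= 1

Scheduling this every hour means each run that still exceeds the threshold fires another alert, and lag_hours can be placed in the email subject as the accumulated lag.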
I have the Jira issue collector app in Splunk, the inputs are added, and we can see logs as well. It was originally set up with only one project, with index=jirarequest. Now I modified the JQL query in the inputs to get data from 2 projects; the JQL query I used is: project%20in(MMAAS%2CBBAT)%20AND%20updated%3E-2h But after updating the JQL query with 2 projects I am not able to see the data from the second project, i.e. CBBAT. Do I need to modify or update anything else? I am not sure.
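For reference, the encoded query above decodes to the following JQL (standard URL decoding):

project in(MMAAS,BBAT) AND updated>-2h

If the second project's key is actually CBBAT rather than BBAT, the key in the JQL has to match it exactly, and the account used by the input also needs browse permission on that project for JQL to return its issues.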
Hi, I have installed Splunk Universal Forwarder for Windows 9.0.4 to work with Splunk Cloud. I downloaded the credentials package and can see logs in Splunk Cloud. We need to send these logs to an index other than main. Do you know how I can do this? Also, which files would need to be edited if we want to add or remove a log source (right now we only have Windows Security logs, but we may want to add Windows Application logs as well)? Can the installation of the UF and these files be done via Microsoft SCCM when we need to do a mass deployment (instead of using a deployment server)? Thanks, Krunal Dave
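A minimal sketch of the inputs.conf changes on the forwarder, assuming win_events is a placeholder for an index that already exists in Splunk Cloud; these stanzas are also where event log sources are added or removed:

[WinEventLog://Security]
disabled = 0
index = win_events

[WinEventLog://Application]
disabled = 0
index = win_events

These settings typically live in an app on the forwarder (for example $SPLUNK_HOME\etc\apps\<your_app>\local\inputs.conf), and that app, the credentials package, and the UF MSI can be packaged together and pushed with SCCM instead of using a deployment server.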
On page 12 of 122 of the "Splunk Security Analyst Workflows 7.1.0" documentation, it says, and I quote:
"If you added notable events to investigations, or generated short IDs for notable events to share them with other analysts, you can filter by the Associations filter to quickly view the notable events associated with a specific investigation or the notable event represented by a short identifier.
However, the short ID filter dropdown lists all short IDs, including notable events that are suppressed."
I am confused by the statement "If the notable event is suppressed, you will not be able to see it on the Incident Review page when filtering on short ID," when just before it the documentation says: "However, the short ID filter dropdown lists all short IDs, including notable events that are suppressed."
Which one is correct? Does filtering by short ID show suppressed notable events on the Incident Review page, or not?
Thank You
Kind Regards,
I have 20+ data sources on a server and each data source is over 500MB, so Splunk is not indexing all of them. Please advise what config I should add to parse all 20+ data sources on the server.

[monitor:///var/opt/*/logs/*/mlogging/*/*/*/*/ProxyLog/Proxy.log]
sourcetype = middlewarelog
index = prd
initCrcLength = 512
ignoreOlderThan = 7d
_TCP_ROUTING=if
disabled = false
The above input is not picking up all the data sources; it only reads one and skips all the others on the server. Thanks.
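A sketch of one possible cause and fix: if the monitored files start with identical content, the forwarder's CRC check can treat them as the same file and read only one of them. Adding crcSalt makes the checksum unique per path (the rest of the stanza is unchanged; treat this as an assumption to verify, not a definitive fix):

[monitor:///var/opt/*/logs/*/mlogging/*/*/*/*/ProxyLog/Proxy.log]
sourcetype = middlewarelog
index = prd
crcSalt = <SOURCE>
ignoreOlderThan = 7d
disabled = false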
Hi all,
Has anyone integrated AppSense logs into a Splunk Cloud instance?
If yes, what is the best way to do it? Any idea or help would be appreciated.
Thank you
How come this doesn't work, given that indexers.csv is a list of Splunk servers with the indexer role?
| inputlookup indexers.csv
| rename splunk_server as Indxr
| foreach Indxr
    [search index=_introspection sourcetype=splunk_resource_usage component=IOStats host=Indxr
    | eval reads_ps = 'data.reads_ps'
    | eval writes_ps = 'data.writes_ps'
    | eval writes_ps=avg(write_ps)
    | eval reads_ps=avg(reads_ps)]
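A sketch of an alternative that avoids foreach by filtering with a subsearch and averaging with stats, assuming indexers.csv has a splunk_server column holding the indexer host names:

index=_introspection sourcetype=splunk_resource_usage component=IOStats
    [ | inputlookup indexers.csv | rename splunk_server AS host | fields host ]
| stats avg(data.reads_ps) AS reads_ps avg(data.writes_ps) AS writes_ps BY host

foreach only substitutes the listed field names into its subsearch template for each result row; it does not launch a separate search per indexer, which is why the original query does not behave as expected.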
Hi splunkers,
I am using the geostats SPL below, but I am unable to get a result. Can anyone please help me find where I am making a mistake? I have chosen a .csv file source type, and when I try to get the SPL result it says no data found.
index="main" | geostats latfield=vendorlatitude longfield=vendorlongtitude count by vendorcountry
I would appreciate your kind support. Thanks in advance.
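A quick check, using the same index and field names as above, to confirm the latitude/longitude fields are actually being extracted before geostats runs:

index="main"
| stats count AS total count(vendorlatitude) AS lat_count count(vendorlongtitude) AS long_count

If lat_count or long_count is 0 while total is not, the fields are not being extracted from the CSV (for example because of a field name mismatch or the chosen source type), and geostats will report no results.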
Is there a way to journal all Exchange 365 messages to Splunk for archiving/compliance purposes?
I mean the actual messages, so that if there were to be a lawsuit down the road, we could bring up the actual messages and read them.
Hi all, I have one app in Splunk and created inputs. Under the input I have a Jira Query Language (JQL) query: project=IIT%20AND%20updated%3E-6h But now I need to add another project to it, the project IIM, and I am not sure how to add it to this JQL query. Please help me with this.
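A sketch of the updated JQL, switching from project= to the in operator and URL-encoding it the same way as the existing value (IIM is assumed to be the exact project key):

project%20in%20(IIT%2CIIM)%20AND%20updated%3E-6h

which decodes to: project in (IIT,IIM) AND updated>-6h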
I don't want incidentid / checkpoint type
Hi Team,
I have created an application which has key, name, etc. But some extra fields like checkpoint type pop up whenever I create a new input for that app. How do I change these settings on the backend?
Please help!!!
Regards,
Samhitha.
I have tried the following to send the Windows event below to the null queue, but it does not work.
I have tried the props.conf and transforms.conf in system\local and in apps\"appname"\local.
raw event:
<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Sysmon' Guid='{5770385f-c22a-43e0-bf4c-06f5698ffbd9}'/><EventID>13</EventID><Version>2</Version><Level>4</Level><Task>13</Task><Opcode>0</Opcode><Keywords>0x8000000000000000</Keywords><TimeCreated SystemTime='2023-02-22T16:39:16.083750800Z'/><EventRecordID>18650882160</EventRecordID><Correlation/><Execution ProcessID='2496' ThreadID='3780'/><Channel>Microsoft-Windows-Sysmon/Operational</Channel><Computer>site-wec.site.lan</Computer><Security UserID='S-1-5-18'/></System><EventData><Data Name='RuleName'>-</Data><Data Name='EventType'>SetValue</Data><Data Name='UtcTime'>2023-02-22 16:39:16.081</Data><Data Name='ProcessGuid'>{4bf925e4-0d0b-63e5-4100-000000002000}</Data><Data Name='ProcessId'>2688</Data><Data Name='Image'>C:\Windows\system32\svchost.exe</Data><Data Name='TargetObject'>HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\EventCollector\Subscriptions\Sysmon\EventSources\site-wec.site.lan\Bookmark</Data><Data Name='Details'><BookmarkList><Bookmark Channel="Microsoft-Windows-Sysmon/Operational" RecordId="18650811531" IsCurrent="true"/></BookmarkList></Data><Data Name='User'>NT AUTHORITY\NETWORK SERVICE</Data></EventData></Event>
props.conf
[XmlWinEventLog:Microsoft-Windows-Sysmon/Operational]
TRANSFORMS-sysmon13Bookmark = sysmon13-Bookmark
transforms.conf
[sysmon13-Bookmark]
REGEX = (<EventID>13<\/EventID>).+Bookmark
DEST_KEY = queue
FORMAT = nullQueue
Try this search in Splunk:
| makeresults | eval redir="../../app"
My search is automatically transformed into
| makeresults | eval redir="app"
How can I work around this?
I have tried Chrome and Firefox, without success.
I am trying to create a report that will take a username (user) and look for the most recent IP address (src_ip) they used, then take that IP address and look back 7 days for all events where the src_ip is the same. I've attempted to use join in different ways but have been unsuccessful. Here is the latest try, where I get everything back, but I just need a complete list of users using that same IP.
Any help is appreciated, thanks!
index=firewall
|where isnotnull(src_ip)
|join type=left [
| search (index=firewall user=mrusername)
| eval ip=src_ip
| where isnotnull(ip)
| fields ip
]
| where ip=src_ip
|table user IP src_ip
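A sketch of one join-free alternative: a subsearch finds the user's most recent src_ip, the outer search then pulls 7 days of events with that IP, and stats lists the users (the user name and index are taken from the attempt above; head 1 returns the most recent event because results come back in reverse time order):

index=firewall earliest=-7d
    [ search index=firewall user=mrusername src_ip=*
    | head 1
    | fields src_ip ]
| stats values(user) AS users count BY src_ip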
Hi everybody,
I would like to duplicate data coming from my sourcetype in such a way:
- send the original data to Splunk for indexing.
- send the duplicated events to an external server with "<DNS>" prefix string.
How should I modify transforms.conf in order to do that?
Another question: is there a better way to forward logs to an external server while keeping the original source host (source IP), instead of adding prefixes like I'm trying to do?
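A minimal sketch of the cloning/routing approach, assuming the original sourcetype is my_sourcetype and that the cloned sourcetype, routing group, and server address are placeholders; the clone gets the literal <DNS> prefix written into _raw and is routed out via syslog, while the original keeps being indexed normally. This has to run on a full Splunk instance that parses the data (heavy forwarder or indexer), not on a universal forwarder:

props.conf
[my_sourcetype]
TRANSFORMS-clone = clone_for_external

[my_sourcetype_external]
TRANSFORMS-external = add_dns_prefix, route_to_external

transforms.conf
[clone_for_external]
REGEX = .
CLONE_SOURCETYPE = my_sourcetype_external

[add_dns_prefix]
REGEX = (.*)
DEST_KEY = _raw
FORMAT = <DNS> $1

[route_to_external]
REGEX = .
DEST_KEY = _SYSLOG_ROUTING
FORMAT = external_syslog

outputs.conf
[syslog:external_syslog]
server = <external_server_ip>:514

On the second question: any relay through Splunk will show the forwarding instance as the network source IP on the external server, so preserving the original source host usually means carrying it inside the event (as this prefix approach does) or sending from the original hosts directly.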
Thanks in advance,
Angelo