All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


index=mysql sourcetype=audit_log earliest=1
| rex field=source "\/home\/mysqld\/(?<Database1>.*)\/audit\/"
| rex field=source "\/mydata\/log\/(?<Database2>.*)\/audit\/"
| eval Database = coalesce(Database1,Database2)
| fields - Database1,Database2
| rex field=USER "(?<USER>[^\[]+)"
| rex mode=sed field=HOST "s/\.[a-z].*$//g"
| eval TIMESTAMP=strptime(TIMESTAMP, "%Y-%m-%dT%H:%M:%S UTC")
| where TIMESTAMP > now()-3600*24*90
| eval TIMESTAMP=strftime(TIMESTAMP, "%Y-%m-%d")
| eval COMMAND_CLASS=if(isnull(COMMAND_CLASS) OR COMMAND_CLASS="", "NA", COMMAND_CLASS)
| eval HOST=if(isnull(HOST) OR HOST="", "NA", HOST)
| eval IP=if(isnull(IP) OR IP="", "NA", IP)
| eval Action=if(isnull(NAME) OR NAME="", "NA", NAME)
| eval STATUS=if(isnull(STATUS) OR STATUS="", "NA", STATUS)
| eval Query=if(isnull(SQLTEXT) OR SQLTEXT="", "NA", SQLTEXT)
| eval USER=if(isnull(USER) OR USER="", "NA", USER)
| stats count as Events by Database USER HOST IP COMMAND_CLASS Action STATUS Query TIMESTAMP
| lookup mysql_databases.csv DATABASE as Database OUTPUT APP_NAME
| eval APP_NAME=if(isnull(APP_NAME) OR APP_NAME="", "NA", APP_NAME)

With this search I am getting no output in the Search & Reporting tab.
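In case a debugging sketch helps anyone with a similar pipeline: a common reason the final output is empty is that lookup matches field names case-sensitively against the CSV header, and the `where TIMESTAMP > now()-3600*24*90` filter drops every event whose TIMESTAMP failed to parse (strptime returns null on a format mismatch). A quick first check, assuming the lookup file name from the search above, is to inspect the CSV header directly:

```spl
| inputlookup mysql_databases.csv | head 5
```

If the header is not exactly DATABASE (case matters), the OUTPUT field never populates; running the main pipeline one pipe at a time will show where the rows disappear.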
I want to trigger custom actions from AppDynamics to Ansible. This requires creating a custom action on the controller, but we have a SaaS controller. Please advise how I would create the same.
Hello - I am trying to rename a column produced using xyseries for a Splunk dashboard. Can I do that, or do I need to update our raw Splunk log? The log event details:

data: {
  errors: [ ]
  failed: false
  failureStage: null
  event: GeneratePDF
  jobId: 144068b1-46d8-4e6f-b3a9-ead742641ffd
  pageCount: 1
  pdfSizeInMb: 7.250756
}
userId: user1@user.com

The current Splunk query I have is:

| stats count by data.userId, data.failed
| xyseries data.userId, data.failed count

Currently my data is returning as follows:

data.userId      false  true
User1@user.com   2
User2@user.com   3      1
User3@user.com   2      2

Can I rename false = Successful and true = Failed? Thank you in advance
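For reference, one way to get friendlier column names (a sketch, assuming data.failed holds the literal strings true/false) is to relabel the value before the xyseries; note that field names containing dots must be wrapped in single quotes inside eval:

```spl
| stats count by data.userId, data.failed
| eval label=if('data.failed'="true", "Failed", "Successful")
| xyseries data.userId, label count
```

Alternatively, a `| rename "false" as Successful, "true" as Failed` placed after the xyseries should relabel the generated columns without touching the raw log.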
This is my sample data; I need props for this so that events will break properly in Splunk. Can anyone help me with how the line breaker, time format, time prefix, etc. should be written, and with anything else required in props.conf?

quotation-events~~IM~. ABC~CA~Wed Jan 02 23:24:56 EST   2023~A~0.12~0...~2345.78~SM~quotation-events D0C5A044~~AB~DFR~Mon Jan 01 12:52:14 EST   2022~B~107.45~106.90~123.09~T~2345A1
quotation-events~~IS~;S. ABC~CA~Tue Jan 02 23:24:56 EST   2023~A~0.12~0...~2345.78~SM~quotation-events V0C5A044~~AB~DFR~Mon Jan 01 12:52:14 EST   2022~B~107.45~106.90~123.09~T~2345A1
quotation-events~~IM~. ADC~BA~Sat Jan 01 13:24:56 EST   2023~A~0.12~0...~2345.78~SM~quotation-events B0C5A044~~AB~DFR~Mon Jan 01 12:52:14 EST   2022~B~107.45~106.90~123.09~T~2345A1
quotation-events~~IM~. CCC~HA~Sun Jan 01 20:24:56 EST   2023~A~0.12~0...~2345.78~SM~quotation-events G0C5A044~~AB~DFR~Mon Jan 01 12:52:14 EST   2022~B~107.45~106.90~123.09~T~2345A1
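Not an answer from the data owner, but a starting-point sketch for props.conf, assuming each record begins with "quotation-events" at the start of a line and the first timestamp is the sixth ~-delimited field; the sourcetype name and field counts are assumptions to adapt:

```ini
[quotation_events]
SHOULD_LINEMERGE = false
# Break before each "quotation-events" at the start of a line;
# only the newline (the capture group) is consumed.
LINE_BREAKER = ([\r\n]+)quotation-events~~
# Skip the first five ~-delimited fields to reach the timestamp.
TIME_PREFIX = ^(?:[^~]*~){5}
TIME_FORMAT = %a %b %d %H:%M:%S %Z %Y
MAX_TIMESTAMP_LOOKAHEAD = 40
```

One caveat: the sample shows a run of spaces between "EST" and the year, so the %Z-to-%Y gap is worth testing against real data before rolling this out.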
I'm trying to add a condition in a playbook (version 5.2.1.78411) that will test the current day of the week. At the moment, I've been trying to get current_date:now.day_of_week to evaluate - choosing any of the 'day of week' values seems to insert a number (string?) between 01 and 07. As far as I can tell, the case in the screenshot should evaluate as 'true', but it isn't, and for good measure I've tried all of the other possible options in the "day of week" selector. Any idea what I'm doing wrong here?
I am trying to match results to ONLY the names in a list I have using a lookup. I can't figure out for the life of me what I am doing wrong; I've been trying every single variation on lookup and inputlookup I can think of or find online. Anyone have any idea what I am doing wrong?

index=epp "Device Control" AND ("USB Storage Device" OR "Internal CD or DVD RW" OR "Internal Floppy Drive" OR "Zip Drive") AND NOT ("file read" OR "Connected" OR "unblocked" OR "Disconnected")
| rex field=_raw "epp\.tusimple\.ai\s\-\s(?<LogSource>.*)\s\-\s"
| rex field=_raw "\[Event\sName\]\s(?<EventAction>.*)\s\|\s\[Client\sComputer"
| rex field=_raw "\[Client\sComputer\]\s(?<Hostname>.*)\s\|\s\[IP\sAddress"
| rex field=_raw "\[IP\sAddress\]\s(?<IPAddress>.*)\s\|\s\[MAC\sAddress"
| rex field=_raw "\[MAC\sAddress\]\s(?<MACAddress>.*)\s\|\s\[Serial\sNumber"
| rex field=_raw "\[Serial\sNumber\](?<SerialNumber>.*)\|\s\[Client\sUser"
| rex field=_raw "\[Client\sUser\](?<UserName>.*)\|\s\[Device\sType"
| rex field=_raw "\[Device\sType\](?<DeviceType>.*)\|\s\[Device\]"
| rex field=_raw "\|\s\[Device\](?<DeviceDescription>.*)\|\s\[Device\sVID\]"
| rex field=_raw "\|\s\[Device\sSerial\](?<DeviceSerial>.*)\|\s\[EPP\sClient\sVersion\]"
| rex field=_raw "\[File\s\Name\](?<FileName>.*)\|\s\[File\sHash\]"
| rex field=_raw "\|\s\[File\sType\](?<FileType>.*)\|\s\[File\sSize\]"
| rex field=_raw "\|\s\[File\sSize\](?<FileSize>.*)\|\s\[Justification\]"
| rex field=_raw "\[Date\/Time\(Client\)\](?<EventTimeStamp>.*)\|\s\[Date\/Time\(Server\sUTC\)\]"
[ | inputlookup R_Emp.csv | table EventTimeStamp LogSource EventAction UserName FileName FileType FileSize Hostname IPAddress MACAddress SerialNumber DeviceType DeviceDescription DeviceSerial ]
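In case a sketch helps: a bracketed subsearch hanging at the end of a search doesn't filter by itself, and a subsearch in the base search can't see fields that the rex commands extract later in the pipe. The usual pattern is `| search [...]` placed after the extraction, with the subsearch returning only the field to match. A minimal version, assuming R_Emp.csv has a UserName column whose values match the extracted ones (the rex captures surrounding spaces, hence the trim):

```spl
index=epp "Device Control"
| rex field=_raw "\[Client\sUser\](?<UserName>.*)\|\s\[Device\sType"
| eval UserName=trim(UserName)
| search [ | inputlookup R_Emp.csv | fields UserName ]
```

The subsearch expands to (UserName="a" OR UserName="b" OR ...), so only rows whose UserName appears in the CSV survive.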
I have an ASP.NET application that is currently set up to be monitored using Splunk OpenTelemetry (SignalFx) via the automated tracer installed in the VM host. I found a need to add custom traces or spans for some critical code paths to gather more instrumentation. What is the best way to integrate this into the .NET app while still using the automated tracer installed in the VM host?

These are the options I'm seeing:
- Splunk Observability via SignalFx auto instrumentation
- OpenTelemetry.io NuGet (manual or automatic instrumentation)
- System.Diagnostics.DiagnosticSource to manually instrument, then collect these using the OpenTelemetry.io NuGet

Which one would not interfere with the automated tracer from Splunk?
I would like to know if it is possible to inject an event to a heavy forwarder via HEC and then have it split into two events sent to different indexes. For example, I have the original log line:

ID=1 time="2022-12-29 16:57:41 UTC" name="person" address="abc" message="some note"

I want the event to be split, but the two new events can share similar fields. So index1 would be:

ID=1 time="2022-12-29 16:57:41 UTC" name="person" address="abc"

And index2 would be:

ID=1 time="2022-12-29 16:57:41 UTC" message="some note"
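For what it's worth, this pattern is usually built on CLONE_SOURCETYPE: a transform duplicates each event into a second sourcetype, and each sourcetype then gets its own field-stripping and index routing. A heavily hedged sketch; the sourcetype names, index name, and regexes are all assumptions to adapt, and transform ordering should be verified in a test environment:

```ini
# props.conf
[hec_original]
# Clone first, then strip message= from the original copy.
TRANSFORMS-split = clone_event, strip_message

[hec_clone]
# Route the clone to index2 and strip name=/address= from it.
TRANSFORMS-clone = route_clone, strip_name_address

# transforms.conf
[clone_event]
REGEX = (.*)
FORMAT = $1
DEST_KEY = _raw
CLONE_SOURCETYPE = hec_clone

[strip_message]
REGEX = ^(.*?)\s*message="[^"]*"(.*)$
FORMAT = $1$2
DEST_KEY = _raw

[route_clone]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = index2

[strip_name_address]
REGEX = ^(.*?)\s*name="[^"]*"\s*address="[^"]*"(.*)$
FORMAT = $1$2
DEST_KEY = _raw
```

The original event stays in whatever index the HEC token targets; only the clone is rerouted.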
I will be ingesting a JSON file daily that has a K/V field for the date as follows:   "Date": "2023-01-04"   I just want to verify the time format in the props.conf file should be set as follows:   TIME_FORMAT=%y-%m-%d    Thx
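For reference (standard strptime semantics, which Splunk's TIME_FORMAT follows): %y matches a two-digit year and %Y a four-digit one, so a value like "2023-01-04" needs %Y. A props.conf sketch, where the sourcetype name and lookahead are assumptions:

```ini
[my_json_sourcetype]
# Anchor timestamp extraction to the "Date" key in the JSON.
TIME_PREFIX = "Date":\s*"
TIME_FORMAT = %Y-%m-%d
MAX_TIMESTAMP_LOOKAHEAD = 12
```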
I am using the search depicted in the attached photo below to develop a viz in Dashboard Studio separating events by the field "bundleId". It appears to display events in the statistics table the way that I want them to. However, when I save them to a dashboard via Dashboard Studio, I get an "Invalid Date" where I want the break in events (Note - this does not happen in the "Classic" version)   How can the "Invalid date" be removed? I already attempted to eval _time=" " in the appendpipe with no success. Thank you.
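One thing that may be worth trying (a sketch, assuming the break row comes from an appendpipe in the search): setting _time to null rather than to a space string, since a non-timestamp string in _time is what typically renders as "Invalid date":

```spl
| appendpipe [ stats count | eval _time=null() ]
```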
Hi, I have been looking to see if Splunk has the capability of searching for logins outside of a specified time range on Windows and Linux systems. What I mean by this is that I am looking to see logins that only happen before, let's say, 0600 and after 1600. Any information I can get would be much appreciated.
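This kind of out-of-hours filter is commonly written against the hour of _time. A sketch, where the index, sourcetype, and search string are placeholders for your actual login data:

```spl
index=os_auth sourcetype=linux_secure "session opened"
| eval hour=tonumber(strftime(_time, "%H"))
| where hour < 6 OR hour >= 16
```

Note that strftime uses the search head's configured timezone, which matters if hosts log in different zones.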
I'm fairly new to Splunk and I am having some trouble grouping some things the way I want. I have some data which all have a certain ID and a multitude of other values. I want to be able to group this data if they have the same ID, but only group them within a maximum time interval of 24 hours. This I figured out pretty easily; however, the problem is I would also like to see the actual duration of events.

For example, say I have 10 or so events that all have the same ID and they occur within a 5 minute period; I'd want to group them together. I'd also like to be able to group 10 or so events that have the same ID and occur within a 23 hour period.

I've tried using bins, which groups them properly, but then it gives them all the exact same time, so I don't know how to find the exact duration. I've also tried using timecharts and transactions with poor results. Does anyone have any ideas?
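One sessionization pattern that keeps the real duration (a sketch; note the 86400-second threshold here means "start a new group when more than 24h separates consecutive events with the same ID", which may or may not be the exact 24-hour rule intended):

```spl
| sort 0 ID _time
| streamstats current=f last(_time) as prev_time by ID
| eval new_group=if(isnull(prev_time) OR _time - prev_time > 86400, 1, 0)
| streamstats sum(new_group) as group_id by ID
| stats min(_time) as start max(_time) as end count by ID group_id
| eval duration=end-start
```

Unlike bin, this keeps each group's first and last real timestamps, so duration is the actual span of the events.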
Hi, I'm trying to come up with a query to generate the count of strings in a JSON field in a log, across all events. For example, say I have a search that displays say, 100,000 logs, with each log containing some JSON structured string:

[{"First Name": "Bob", "DOB":"1/1/1900", ..."Vendor":"Walmart"}]

I want to generate a table that lists all the unique Vendor values, and the count of the values. Something like:

Vendor  | Count
Walmart | 5
Target  | 3
ToysRUs | 100
etc...

Is something like this possible?
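This is doable with spath plus stats. A sketch, assuming the JSON array is the raw event (the {}.Vendor path addresses the Vendor key inside each array element; index and sourcetype are placeholders):

```spl
index=my_index sourcetype=my_json
| spath output=Vendor path={}.Vendor
| stats count as Count by Vendor
| sort - Count
```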
I'm making a dashboard for a customer that contains vulnerability data and some of the vulnerability names are really long causing the text on the y axis of the bar chart to be very small. Is there any way I can increase this font size so it's easier to read? Not a big deal if it truncates, but need larger font.
Good morning, I have a question. I have an nginx proxy and I would like to monitor it with AppDynamics. We have it configured so that the agents connect to the proxy and the proxy connects to the controller, for client security. How can we integrate the controller? Which agents could handle this?
I mean, I don't even know where to start with this error, lol. Of course you cannot import something that does not exist; it's like me saying I cannot eat the cake that does not exist on my table. Anyway, how do I go about finding this application? It looks like it gave me a name for it. Does it look defaultish to you all, or is this something we rolled ourselves?

WARN ApplicationManager [0 MainThread] - Cannot import non-existent application: __globals__
Hello, please help me with the below requirement. I need to capture usernames from 90 days' worth of data from large datasets, spanning multiple source types and multiple indexes.

The search "index=* sourcetype=* earliest=-90d@d latest=now | eval LOGIN = lower(user) | stats count by LOGIN sourcetype" is taking forever.

Is there a better way to capture the 90 days' worth of usernames and source types without a timeout?

Note: I am able to schedule the search to capture them and append the results. However, I am not sure what time modifiers I should use if I want to capture all of them in a single day, and that should be a continuous process every day.
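The usual approach for this is a summary index: schedule a daily search over just the previous day and collect the results, then report over 90 days of the much smaller summary. A sketch, where the summary index name is an assumption (schedule it daily, e.g. shortly after midnight):

```spl
index=* sourcetype=* earliest=-1d@d latest=@d
| eval LOGIN=lower(user)
| stats count by LOGIN sourcetype
| collect index=summary_logins
```

The 90-day report then becomes something like: index=summary_logins earliest=-90d@d | stats sum(count) as count by LOGIN sourcetype, which scans only the pre-aggregated rows.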
I have created a dashboard in Dashboard Studio and have configured a "Link to dashboard" drilldown. It works fine when the token value does not have any whitespace in it but does not work when there are spaces. The URL that is generated from the drilldown has a format of "firstword%2Bsecondword" and when I show the token value in the second dashboard, it is translated to "firstword+secondword".  This causes the search to return 0 results due to the "+" in the value. How do I configure this so that there is a space in the value instead of a "+"? Thanks
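One workaround (a sketch; mytoken and fieldname are placeholder names for your token and the field being matched) is to undo the + substitution inside the target dashboard's search before using the value:

```spl
| eval fixed_value=replace("$mytoken$", "\+", " ")
| where fieldname=fixed_value
```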
Hello experts, I am trying to integrate Salesforce cloud modules into Splunk for security monitoring. Does anyone have any prior experience with this? I want to know whether we can use the Splunk Add-on for Salesforce for this, or whether any customisation is required. Also, what kind of logs would be relevant to collect from the modules, especially Marketing Cloud? Please guide. Thanks
Is it possible to set up a report that includes drilldown events? For example, if my search returns a field with 10 values, can the reporting feature include all 10 events in the CSV file instead of  the event statistics?