We are trying to install the AppDynamics Controller and noticed the error below:

Task failed: Check if the required data directories have sufficient space on host: ..: The destination directory has insufficient disk space. You need a minimum of 409600 MB for installation.

I selected the Small profile, which calls for 400 GB of disk storage; however, our host partition has only 355 GB. Is it possible to bypass the space check? We do understand the risk of using less than the suggested storage.

Small profile requirements:
Supported Platforms: Linux (64-bit) or Windows (64-bit)
Supported on VMs: Yes
CPU: 4 cores, 1.5 GHz minimum
RAM: 16 GB
Disk Storage: 400 GB
Are there any release notes available for the Thinkst Canary AddOn For Splunk? Any concerns in moving from 1.1.7 to 1.1.11?
Example: Say I have two lookups, A and B. Let's say they're both file-based lookups (even though I don't think it actually matters).

A takes an input field called "first" and outputs a field called "second".
B takes an input field called "second" and outputs a field called "third".

Let's say they are both "automatic" courtesy of this in props.conf:

LOOKUP-aaa = A first OUTPUT second
LOOKUP-bbb = B second OUTPUT third

We've also made the example with no inline renaming or anything - the external field names happen to be identical to the column names in the lookup files. It's a little artificial, but hey, it's an example.

More background:

1) The Splunk docs claim that this isn't supported ( https://docs.splunk.com/Documentation/Splunk/8.0.2/Knowledge/Makeyourlookupautomatic - see the statement "Splunk software does not support nested automatic lookups").

2) However, they clearly do work at least a little, because apps use them and there are Answers posts telling you how to do it:
https://answers.splunk.com/answers/94609/automatic-lookup-on-a-field-that-is-automatically-looked-up.html
https://answers.splunk.com/answers/209148/can-you-perform-an-automatic-lookup-based-on-the-o.html

Question 1 - how much of this actually works?
Question 2 - what changes have there been to that answer over time, if any?
My search is:

index="xxx" sourcetype="yyy" topic=IN*
| stats list(message_count) as message_count by _time topic
| xyseries _time topic message_count

RESULTS:

IN-D        IN-E        IN-F        IN-G
920699302   5140913432  7287016676  533221175
944835796   5149696236  7374961617  543221084
971821781   5157796684  7469880690  554235434
996644156   5166493227  7566048933  566376030
1021919011  5175093160  7660955334  577854421
1034750619  5183653994  7756249835  585835689
1043620281  5191941703  7840431124  593107481

Once I add the delta command, I get accurate results for all topics except IN-F - but when I run delta for topic IN-F on its own, Splunk returns accurate results.

index="historic_forensics" sourcetype="kafka_event_count" topic=IN*
| stats list(message_count) as message_count by _time topic
| xyseries _time topic message_count
| delta IN-DT as IN-D
| delta IN-E as IN-E
| delta IN-F as IN-F
| delta IN-G as IN-G
| eval date=relative_time(_time,"-1d")
| eval _time=strftime(date, "%F")
| table _time IN*

Is there something I'm doing wrong?
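A minimal sketch of one thing worth checking in the query above: the first delta references IN-DT, which looks like a typo for IN-D, and hyphenated field names produced by xyseries are often safer quoted so they are not parsed as arithmetic. A hedged rewrite of the delta section might look like this (index and sourcetype taken from the question):

```
index="historic_forensics" sourcetype="kafka_event_count" topic=IN*
| stats list(message_count) as message_count by _time topic
| xyseries _time topic message_count
| delta "IN-D" as "IN-D"
| delta "IN-E" as "IN-E"
| delta "IN-F" as "IN-F"
| delta "IN-G" as "IN-G"
```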
Is there a way to make an alert when an interesting field is no longer sending data?
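A minimal sketch of one common approach: count events that actually contain the field over a recent window and alert when that count is zero. The index and field names here (main, my_field) are placeholders, not from the question:

```
index=main earliest=-1h
| stats count(my_field) as field_count
| where field_count = 0
```

Saved as a scheduled alert that triggers when the search returns results, this fires whenever no event in the last hour carried the field.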
Hi, I am sorry, I am very new to Splunk and I am struggling with the results I want to get. I have a query that produces the desired result (kind of - in the visualization, the months are still not in chronological order) as a bar chart without any effort. When I convert it to a line chart, my grouping by month is removed and I get a result for each individual day instead.

Query:

QUERY
| eval Date=strftime('_time',"%Y-%m")
| timechart count by Click
| sort _time Desc

The visualization looks perfect as a bar chart, but as soon as I select Line chart, the X-axis displays the chart by individual date and is no longer grouped by month.

Can someone please tell me how to group my data by month in a line chart (or any other chart)? Also, how can I display my chart starting from the latest month? Thanks
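A minimal sketch of month-level grouping: timechart buckets on _time directly (the eval'd Date field is not actually used by timechart), so an explicit span controls the grouping regardless of chart type:

```
QUERY
| timechart span=1mon count by Click
```

With span=1mon the buckets are calendar months in chronological order for both bar and line charts.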
Hello All, is there a way in a Splunk search to iterate through a multivalue (multiline) field and do stats on each value/each line? The goal is to display only rows where Field2 has values of varying lengths. So in this case, the result should only display the rows for Apples and Bananas.

Example table (where \n is an actual newline and not the characters for a newline):

Field1, Field2
Apples, 12345 \n 123456 \n 1234
Bananas, 123 \n 12345
Oranges, 123456 \n 123456 \n 123456

Thanks!
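A minimal sketch under the assumption that Field2 is a multivalue field (mvmap requires Splunk 8.0+): compute the length of each value, then keep rows where more than one distinct length appears:

```
...
| eval lengths=mvmap(Field2, len(Field2))
| eval distinct_lengths=mvcount(mvdedup(lengths))
| where distinct_lengths > 1
```

mvmap evaluates len() once per value of Field2, mvdedup collapses duplicate lengths, and mvcount counts what remains - so Oranges (all values length 6) is filtered out.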
I have a dashboard that takes an email address in a Text input. Is there a way to supply an email address for the input and render the dashboard using the REST API? The dashboard is used by a group of internal investigators to gather artifacts for their investigations. Their process right now (obviously) is to log into Splunk, navigate to the dashboard, input an email address, hit the Submit button, and export the resulting PDF. This is a small part of a much broader workflow. We'd like to automate this Splunk portion to help streamline their investigation process. So I'm trying to perform the same dashboard operations a person would, but using REST APIs instead. The PDF export is important because it compiles information from multiple searches into a single artifact, as opposed to creating multiple searches and outputs with the search API.
I have a table like below (colon-delimited):

cola:colb:colc:cold
1::2:3:
::::
1:2:3:4

When I do a stats, I only get non-null values. Is it possible to show null values in the table and eval them to 0?
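A minimal sketch using fillnull, assuming the four columns are already extracted as fields:

```
...
| fillnull value=0 cola colb colc cold
| stats count by cola colb colc cold
```

fillnull replaces missing/empty values with 0 before stats runs, so rows with nulls are counted instead of being dropped from the group-by.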
Hi, I accidentally deleted a CSV file. Is there any way to restore or retrieve the CSV file?
In my props.conf I need a [source::] stanza to override some settings from a [sourcetype] stanza. The source is a file on a Windows server, so I take a look at the props.conf documentation, and follow these instructions:

"When you specify Windows-based file paths as part of a [source::<source>] stanza, you must escape any backslashes contained within the specified file path. Example: [source::c:\\path_to\\file.txt]"

But that seems to be flat-out wrong. If I use this:

[source::C:\\dir1\\dir2\\app.log]

it doesn't match. Change it to:

[source::C:\dir1\dir2\app.log]

and it matches. Am I missing something obvious, or is the documentation just wrong?
I have about 100 different metric_names and I'd like to be able to:

1. Get the metric_name
2. Plot that metric_name over time

I have about 100 hosts, and they all carry rack_number and serial_number on their metrics. The hosts themselves I don't care about, but I want to plot every single metric where rack_number=1 and serial_number=12345 and take a look at it.
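A minimal sketch with mstats, assuming rack_number and serial_number are dimensions on the metric data points (the index name metrics_idx is a placeholder, not from the question):

```
| mstats avg(_value) WHERE index=metrics_idx AND metric_name=*
    AND rack_number=1 AND serial_number=12345 span=5m BY metric_name
```

This produces one time series per metric_name for the matching host, which can then be charted directly.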
Below is my search, and I want to display only "Druv" failed logins. But I also see the user names 'None', 'Karla', and a few others. How do I avoid the unwanted users in the output?

index=wineventlog source="WinEventLog:Security" sourcetype=WinEventLog:Security EventCode=4625 where Account_Name=Druv
| stats count by _time, Workstation_Name, Account_Name, dest, dest_nt_domain, product, app, action, EventCode, EventCodeDescription, Failure_Reason, name
| sort _time
| where Account_Name!="-"

Thanks in advance
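A minimal sketch of one likely fix: in the base search, the bare word "where" is treated as just another search term rather than a filter, so the Account_Name condition may never apply. Filtering with a quoted field value in the base search might look like this (field names taken from the question):

```
index=wineventlog source="WinEventLog:Security" EventCode=4625 Account_Name="Druv"
| stats count by _time, Workstation_Name, Account_Name, dest, action, Failure_Reason
| sort _time
```

If the events carry the account in mixed case or with a domain prefix, a wildcard such as Account_Name="*Druv*" may be needed; that is an assumption about the data, not something confirmed by the question.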
Hello, I have logs from a syslog server. My goal is to have events from the same log indexed with different sourcetypes according to the IP in the event. Let's say a row in the log with IP X gets sourcetype ST1, for IP Y the sourcetype will be ST2, and so on.

Here are my configurations:

inputs.conf

[monitor://D:\Kiwi_DB]
disabled = false
index = db_it_syslog
sourcetype = transformsourcetype

props.conf

[transformsourcetype]
TRANSFORMS-transformsourcetype = change_st_by_IP1,change_st_by_IP2

transforms.conf

[change_st_by_IP1]
REGEX = "0.0.0.0"
FORMAT = sourcetype::cyberark:epv:cef
DEST_KEY = MetaData::Sourcetype

Thanks! Hen
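A minimal sketch of what a working sourcetype-override transform typically looks like, assuming the goal is to match the literal IP: the REGEX value is unquoted, dots are escaped so they match literally, and the destination key for sourcetype is MetaData:Sourcetype with a single colon:

```
[change_st_by_IP1]
REGEX = 0\.0\.0\.0
FORMAT = sourcetype::cyberark:epv:cef
DEST_KEY = MetaData:Sourcetype
```

Quoting the REGEX value makes the quotes part of the pattern, and a double colon in DEST_KEY (MetaData::Sourcetype) does not refer to a valid destination key, so either of those alone can silently prevent the transform from applying.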
Derp, nothing to see here - I used the generic data input and not the PCAP-app-specific input.
I am having trouble getting a result to appear for the query below. I am trying to produce a column showing time_diff: the latest timestamp result for each LANE_RFID subtracted from the time now. The table doesn't show a result for time_diff, but everything else shows properly. Hopefully it is something easy. Thank you.

index=* "RFID Message received for:"
| stats latest(date_time) by LANE_RFID
| eval time_now=now()
| eval time_now=strftime(time_now,"%Y/%m/%d %H:%M:%S")
| eval time_diff=strftime(time_diff,"%M:%S")
| eval time_diff=time_now-date_time
| table LANE_RFID time_now latest(date_time) time_diff
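A minimal sketch of one way this is commonly restructured. The query above formats time_diff before computing it, subtracts from a string-formatted time_now, and references date_time, which no longer exists after stats. Assuming date_time parses with the format string shown (that format is an assumption about the data):

```
index=* "RFID Message received for:"
| stats latest(date_time) as last_seen by LANE_RFID
| eval last_epoch=strptime(last_seen, "%Y/%m/%d %H:%M:%S")
| eval time_diff=now() - last_epoch
| eval time_diff=strftime(time_diff, "%M:%S")
| eval time_now=strftime(now(), "%Y/%m/%d %H:%M:%S")
| table LANE_RFID time_now last_seen time_diff
```

The key changes: stats renames latest(date_time) so later evals can reference it, the string timestamp is converted back to epoch with strptime before subtraction, and time_diff is computed in seconds before it is formatted for display.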
Hi team, message_id status time 2020-02-12T12:22:23.415248Z ERROR 2020-02-14T00:01:14.038498814Z 2020-02-12T12:22:23.415248Z ERROR 2020-02-14T00:00:34.034346477Z 2020-02-12T12:22:23.415248Z ERROR 2020-02-13T23:59:53.851851061Z 2020-02-12T12:22:23.415248Z ERROR 2020-02-13T23:57:12.663621081Z 2020-02-12T12:22:23.415248Z ERROR 2020-02-13T23:53:51.293506747Z 2020-01-21T13:09:14.416164Z PROCESSED 2020-02-19T01:50:05.55630875Z 2020-01-21T13:09:14.416164Z PROCESSING 2020-02-19T01:50:04.621606854Z 2020-01-21T13:09:44.586501Z ERROR 2020-02-19T01:50:04.305742277Z 2020-01-21T13:09:44.586501Z PROCESSING 2020-02-19T01:50:04.233225192Z 2020-01-21T13:09:44.586416Z PROCESSED 2020-02-19T01:50:04.142651435Z 2020-01-21T13:09:44.586416Z PROCESSING 2020-02-19T01:50:03.826457927Z 2020-01-21T13:09:44.586321Z PROCESSED 2020-02-19T01:50:03.745964666Z 2020-01-21T13:09:44.586321Z PROCESSING 2020-02-19T01:50:03.449583679Z 2020-01-21T13:09:44.586190Z PROCESSED 2020-02-19T01:50:03.337887858Z 2020-01-21T13:09:44.586190Z PROCESSING 2020-02-19T01:50:03.086329734Z 2020-01-21T13:09:44.586063Z PROCESSED 2020-02-19T01:50:03.00531639Z 2020-01-21T13:09:44.586063Z PROCESSING 2020-02-19T01:50:02.735821778Z 2020-01-21T13:09:44.585532Z PROCESSED 2020-02-19T01:50:02.677935722Z 2020-01-21T13:09:44.585532Z PROCESSING 2020-02-19T01:50:02.379874913Z 2020-01-21T13:09:44.585456Z PROCESSED 2020-02-19T01:50:02.320574471Z 2020-01-21T13:09:44.585456Z PROCESSING 2020-02-19T01:50:02.056969718Z 2020-01-21T13:09:44.585379Z PROCESSED 2020-02-19T01:50:01.993389933Z 2020-01-21T13:09:44.585379Z PROCESSING 2020-02-19T01:50:01.645723986Z 2020-01-21T13:09:44.585301Z PROCESSED 2020-02-19T01:50:01.573655793Z 2020-01-21T13:09:44.585301Z PROCESSING 2020-02-19T01:50:01.319969304Z 2020-01-21T13:09:44.585220Z PROCESSED 2020-02-19T01:50:01.256761569Z 2020-01-21T13:09:44.585220Z PROCESSING 2020-02-19T01:50:00.980754532Z 2020-01-21T13:09:44.585132Z PROCESSED 2020-02-19T01:50:00.920435406Z 2020-01-21T13:09:44.583423Z PROCESSING 
2020-02-19T01:49:54.709364124Z 2020-01-21T13:09:44.583342Z PROCESSED 2020-02-19T01:49:54.627564396Z 2020-01-21T13:09:44.583342Z PROCESSING 2020-02-19T01:49:54.379127471Z 2020-01-21T13:09:44.583255Z PROCESSED 2020-02-19T01:49:54.319034068Z 2020-01-21T13:09:44.583255Z PROCESSING 2020-02-19T01:49:54.028230252Z 2020-01-21T13:09:44.583171Z PROCESSED 2020-02-19T01:49:53.942640218Z 2020-01-21T13:09:44.583171Z PROCESSING 2020-02-19T01:49:53.689197493Z 2020-01-21T13:09:44.583085Z PROCESSED 2020-02-19T01:49:53.627728985Z 2020-01-21T13:09:44.583085Z PROCESSING 2020-02-19T01:49:53.389097603Z 2020-01-21T13:09:44.582989Z PROCESSED 2020-02-19T01:49:53.332868523Z 2020-01-21T13:09:44.582989Z PROCESSING 2020-02-19T01:49:53.085943873Z 2020-01-21T13:09:44.582905Z PROCESSED 2020-02-19T01:49:53.027980939Z 2020-01-21T13:09:44.582905Z PROCESSING 2020-02-19T01:49:52.757156504Z 2020-01-21T13:09:44.582821Z PROCESSED 2020-02-19T01:49:52.697941959Z 2020-01-21T13:09:44.582821Z PROCESSING 2020-02-19T01:49:52.463730556Z 2020-01-21T13:09:44.582727Z PROCESSED 2020-02-19T01:49:52.410138972Z 2020-01-21T13:09:44.582727Z PROCESSING 2020-02-19T01:49:52.169536808Z 2020-01-21T13:09:44.582639Z PROCESSED 2020-02-19T01:49:52.107720449Z 2020-01-21T13:09:44.582639Z PROCESSING 2020-02-19T01:49:51.84715461Z 2020-01-21T13:09:44.582555Z PROCESSED 2020-02-19T01:49:51.777011069Z 2020-01-21T13:09:44.582555Z PROCESSING 2020-02-19T01:49:51.488824085Z 2020-01-21T13:09:44.582467Z PROCESSED 2020-02-19T01:49:51.414304108Z 2020-01-21T13:09:44.582467Z PROCESSING 2020-02-19T01:49:51.146699571Z 2020-01-21T13:09:44.582370Z PROCESSED 2020-02-19T01:49:51.07314806Z 2020-01-21T13:09:44.582370Z PROCESSING 2020-02-19T01:49:50.803455506Z 2020-01-21T13:09:44.582288Z PROCESSED 2020-02-19T01:49:50.68563427Z 2020-01-21T13:09:44.582288Z PROCESSING 2020-02-19T01:49:50.418044177Z 2020-01-21T13:09:44.582211Z PROCESSED 2020-02-19T01:49:50.34967605Z I had three columns message_id, status, time and I want to print the 'message_ids' which are 
YetToBeProcessed, where:

YetToBeProcessed = ERROR + PROCESSED - PROCESSING

Example: ERROR appeared 50 times with corresponding message_ids, PROCESSED appeared 200 times with corresponding message_ids, PROCESSING appeared 100 times with corresponding message_ids.

Note: ERROR and PROCESSING message_ids might be the same, and PROCESSED and PROCESSING message_ids might be the same, but ERROR and PROCESSED should not be the same.

Thanks, Yamuna
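A minimal sketch of counting per-status occurrences by message_id and applying the formula above (field names taken from the sample table; the base search is a placeholder):

```
...
| stats count(eval(status="ERROR")) as error
        count(eval(status="PROCESSED")) as processed
        count(eval(status="PROCESSING")) as processing
        by message_id
| eval YetToBeProcessed = error + processed - processing
| where YetToBeProcessed > 0
| table message_id error processed processing YetToBeProcessed
```

Each count(eval(...)) counts only events matching that status, so the arithmetic is done per message_id rather than across the whole result set.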
I am using the DB Connect Add-on 3.1.3 and my results are not being enriched. When setting up the DB Connect Lookup wizard:

the "Set Reference Search" returns the suitable fields from the Splunk index
the "Set Lookup SQL" returns the requested fields from the DB

I set up the field mapping by selecting all the right fields, but when I run "Preview Results" it does not enrich the data. I then did a packet capture and extracted the SQL query being sent to the SQL server; the last line is as follows:

dbxlookup WHERE "serial" IN (null)

So I'm looking up a serial number, and the value should come from the Splunk search:

"serial" AS "SERIAL_NUM"

This should copy the value of SERIAL_NUM to serial and then be used in the SQL string to the server, but it's sending null by mistake.
How do I get the App for Web Analytics to look at data in a different index? I made sure to add all non-default indexes to my role and when I search for "index=blah and tag=web" I see all of the correct data with file, site etc fields populated but if I remove "index=blah" no data shows up so Web Analytics isn't seeing the other index.
Need help with parsing out some events from our Exchange data, where we want to track license changes on Exchange accounts. I'm able to pinpoint the events but am having a hard time pulling specific data out of multivalue fields.

Current search string:

index="o365data" dataset_name=account_management
| spath "ExtendedProperties{}.Name"
| search "ExtendedProperties{}.Value"="[\"AssignedLicense\",\"AssignedPlan\",\"TargetId.UserType\"]"
| spath "ModifiedProperties{}.Name"
| search "ModifiedProperties{}.Name"=AssignedLicense
| table _time, UserId, Operation, ObjectId, ModifiedProperties{}.NewValue, ModifiedProperties{}.OldValue
| sort _time desc

Example output:

_time: 2020-02-20 08:40:50
UserId: xxx@domain.ca
Operation: Update user.
ObjectId: xxxx@domain.ca
ModifiedProperties{}.NewValue:
[ "[SkuName=O365_BUSINESS_ESSENTIALS, AccountId=xxx SkuId=3b555118-da6a-4418-894f-7df1e2096870, DisabledPlans=[WHITEBOARD_PLAN1,MYANALYTICS_P2,KAIZALA_O365_P2,STREAM_O365_SMB,OFFICEMOBILE_SUBSCRIPTION,BPOS_S_TODO_1,FORMS_PLAN_E1,FLOW_O365_P1,POWERAPPS_O365_P1,PROJECTWORKMANAGEMENT,SWAY,SHAREPOINTWAC,YAMMER_ENTERPRISE,MCOSTANDARD,SHAREPOINTSTANDARD]]" ]
[ { "SubscribedPlanId": "9738cc87-6e16-4861-9486-32ebdda261d1", "ServiceInstance": "Exchange/namprd17-001-01", "CapabilityStatus": 0, "AssignedTimestamp": "2020-02-20T14:40:50.0728559Z", "InitialState": null, "Capability": null, "ServicePlanId": "9aaf7827-d63c-4b61-89c3-182f06f82e5c" }, { "SubscribedPlanId": "502e8918-06b9-453d-a44b-9fc4920b0f20", "ServiceInstance": "TeamspaceAPI/NA001", "CapabilityStatus": 0, "AssignedTimestamp": "2020-02-20T14:40:50.0728559Z", "InitialState": null, "Capability": null, "ServicePlanId": "57ff2da0-773e-42df-b2af-ffb7a2317929" } ]
AssignedLicense, AssignedPlan Member

We just want the "SkuName" ("O365_BUSINESS_ESSENTIALS" from the example), "ServiceInstance" ("Exchange/namprd17-001-01" / "TeamspaceAPI/NA001" from the example), and "DisabledPlans" ("WHITEBOARD_PLAN1" / "MYANALYTICS_P2" / etc. from the example).
Would someone be able to provide some assistance with this?
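A minimal sketch of pulling those three values out with rex, assuming the NewValue text follows the format shown in the example output (max_match=0 captures every repeated ServiceInstance entry into a multivalue field):

```
...
| rex field="ModifiedProperties{}.NewValue" "SkuName=(?<SkuName>[^,]+)"
| rex field="ModifiedProperties{}.NewValue" "DisabledPlans=\[(?<DisabledPlans>[^\]]+)\]"
| rex field="ModifiedProperties{}.NewValue" max_match=0 "\"ServiceInstance\":\s*\"(?<ServiceInstance>[^\"]+)\""
| eval DisabledPlans=split(DisabledPlans, ",")
| table _time, UserId, SkuName, ServiceInstance, DisabledPlans
```

Since ModifiedProperties{}.NewValue is multivalue, rex runs against each value; split then turns the comma-separated DisabledPlans list into its own multivalue field for display.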