All Posts
Stacked mode is not a valid option for line charts - try column charts
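For example, switching the panel to a column chart lets stacking take effect. A minimal Simple XML sketch (the search and index name here are illustrative placeholders, not the original dashboard):

<chart>
  <search>
    <query>index=your_index | timechart latest(Value) by System</query>
  </search>
  <!-- stackMode only applies to chart types that support stacking, such as column or area -->
  <option name="charting.chart">column</option>
  <option name="charting.chart.stackMode">stacked</option>
</chart>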
Try something like this

index=gc source=apps
| eval AMT=if(IND="DR", BASE_AMT*-1, BASE_AMT)
| eval GLBL1=if(FCR="DR", GLBL*-1, GLBL)
| eval DATE="20".REC_DATE
| where DATE = strftime(relative_time(now(), "-30d@d"),"%Y%m%d") OR DATE=strftime(relative_time(now(), "@d"),"%Y%m%d")
| stats sum(AMT) as w3AMT, sum(GLBL1) as w3FEE_AMT by DATE id
| eval w4AMT=if(DATE=strftime(relative_time(now(), "@d"),"%Y%m%d"),null(),w3AMT)
| eval w3AMT=if(DATE=strftime(relative_time(now(), "@d"),"%Y%m%d"),w3AMT,null())
| eval w4FEE_AMT=if(DATE=strftime(relative_time(now(), "@d"),"%Y%m%d"),null(),w3FEE_AMT)
| eval w3FEE_AMT=if(DATE=strftime(relative_time(now(), "@d"),"%Y%m%d"),w3FEE_AMT,null())
| eval DATE=strftime(relative_time(now(), "@d"),"%Y%m%d")
| stats values(*) as * by DATE id
I created a Splunk dashboard that has a lot of filters (multiple dropdowns), text inputs with different tokens, and dynamic tables too. I want to make it dynamic for each filter that I choose, but for now it still isn't dynamic for every existing output and filter. Here is my XML:

<form version="1.1" theme="dark">
  <label>Dashboard Overview</label>
  <fieldset submitButton="false">
    <input type="time" token="global_time" searchWhenChanged="true">
      <label>Select Time</label>
      <default>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </default>
    </input>
    <input type="dropdown" token="owner" searchWhenChanged="true">
      <label>Select Owner</label>
      <choice value="*">All</choice>
      <default>*</default>
      <initialValue>*</initialValue>
      <fieldForLabel>owner</fieldForLabel>
      <fieldForValue>owner</fieldForValue>
      <search>
        <query>index=db_warehouse | dedup owner | fields owner | table owner</query>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </search>
    </input>
    <input type="dropdown" token="hostname" searchWhenChanged="true">
      <label>Select Hostname</label>
      <choice value="*">All</choice>
      <default>*</default>
      <fieldForLabel>hostname</fieldForLabel>
      <fieldForValue>hostname</fieldForValue>
      <search>
        <query>index=db_warehouse hostname=$hostname$ owner=$owner$ ipaddress=$ipaddress$ cve=$cve$ cve=$cve$ | dedup hostname | fields hostname | table hostname</query>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </search>
      <initialValue>*</initialValue>
    </input>
    <input type="dropdown" token="ipaddress" searchWhenChanged="true">
      <label>Select by IP Address</label>
      <choice value="*">All</choice>
      <default>*</default>
      <fieldForLabel>ipaddress</fieldForLabel>
      <fieldForValue>dest</fieldForValue>
      <search>
        <query>index=db_warehouse | search hostname=$hostname$ owner=$owner$ ipaddress=$ipaddress$ cve=$cve$ | dedup dest | fields dest | table dest</query>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </search>
    </input>
    <input type="text" token="cve">
      <label>Search CVE</label>
      <default>*</default>
    </input>
  </fieldset>
  <table>
    <title>Detail Information Table</title>
    <search>
      <query>index=db_warehouse | fields _time, hostname, dest, mac_address, vulnerability_title, os_version, os_description, severity, cvss_score, last_assessed_for_vulnerabilities, solution_types, cve, owner, dest_category | search hostname=$hostname$ owner=$owner$ ipaddress=$ipaddress$ cve=$cve$ | rename dest as ip, dest_category as category | table _time, hostname, ip, mac_address, vulnerability_title, owner, category, cve, os_version, os_description, severity, cvss_score, last_assessed_for_vulnerabilities, solution_types | dedup hostname</query>
      <earliest>$global_time.earliest$</earliest>
      <latest>$global_time.latest$</latest>
    </search>

Is there any reference or solution for this?
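For reference, one common pattern for making every filter react to the others is to have each dropdown's populating search reference only the other tokens plus the shared time range, so each selection narrows the remaining choices. A minimal sketch for the hostname dropdown, assuming the same index and field names as above and that the IP address lives in the dest field (as the fieldForValue suggests); adjust to your data:

<input type="dropdown" token="hostname" searchWhenChanged="true">
  <label>Select Hostname</label>
  <choice value="*">All</choice>
  <default>*</default>
  <fieldForLabel>hostname</fieldForLabel>
  <fieldForValue>hostname</fieldForValue>
  <search>
    <!-- Populating search filters on the other tokens only, plus the global time range -->
    <query>index=db_warehouse owner=$owner$ dest=$ipaddress$ cve=$cve$ | dedup hostname | table hostname</query>
    <earliest>$global_time.earliest$</earliest>
    <latest>$global_time.latest$</latest>
  </search>
</input>

Note that referencing a dropdown's own token inside its own populating search (as the hostname and ipaddress inputs do above) can stop the list from repopulating when the other filters change, which may be why the dashboard does not feel dynamic.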
We have data similar to the below and are looking to create a stacked timechart; however, setting the stack mode does not seem to have any impact on the chart.

timestamp   System   Value
TIME1       SYS1     VALUE1.1
TIME1       SYS2     VALUE2.1
TIME1       SYS3     VALUE3.1
TIME1       SYS4     VALUE4.1
TIME2       SYS1     VALUE1.2
TIME2       SYS2     VALUE2.2
TIME2       SYS3     VALUE3.2
TIME2       SYS4     VALUE4.2

timechart latest(Value) by System

<option name="charting.chart.stackMode">stacked</option>
Hi, can anyone help me with a solution please? I have a wineventlog as below. By default it is considering the whitespace while parsing the field name. For example, it should extract the field name as "Provider Name", but instead it is extracting the field name as "Name". It is splitting on the whitespace and extracting only part of the field name. Similarly I have many fields as highlighted below. Please guide me on where I have to make such a change to get the correct field names.

Sample Log:

<Event xmlns='http://XXX.YYYY.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Security-Auditing' Guid='{12345-1111-2222-a5ba-XXX}'/><EventID>2222</EventID><Version>0</Version><Level>0</Level><Task>12345</Task><Opcode>0</Opcode><Keywords>1110000000000000</Keywords><TimeCreated SystemTime='2024-07-24T11:36:15.892441300Z'/><EventRecordID>0123456789</EventRecordID><Correlation ActivityID='{11aa2222-abc2-0001-0002-XXXX1122}'/><Execution ProcessID='111' ThreadID='111'/><Channel>Security</Channel><Computer>YYY.xxx.com</Computer><Security/></System><EventData><Data Name='MemberName'>-</Data><Data Name='MemberSid'>CORP\gpininfra-svcaccounts</Data><Data Name='TargetUserName'>Administrators</Data><Data Name='TargetDomainName'>Builtin</Data><Data Name='TargetSid'>BUILTIN\Administrators</Data><Data Name='SubjectUserSid'>NT AUTHORITY\SYSTEM</Data><Data Name='SubjectUserName'>xyz$</Data><Data Name='SubjectDomainName'>CORP</Data><Data Name='SubjectLogonId'>1A2B</Data><Data Name='PrivilegeList'>-</Data></EventData></Event>
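If the add-on's automatic extractions cannot be adjusted, one search-time workaround is to pull the attribute/value pairs out of the XML yourself with rex. A minimal sketch, assuming the raw event keeps the structure shown above (the index and sourcetype names are placeholders):

index=wineventlog sourcetype=XmlWinEventLog
| rex "Provider Name='(?<Provider_Name>[^']+)'"
| rex max_match=0 "<Data Name='(?<data_name>[^']+)'>(?<data_value>[^<]*)</Data>"
| eval pairs=mvzip(data_name, data_value, "=")
| table Provider_Name pairs

This only confirms what the raw data contains at search time; the longer-term fix usually belongs in the field-extraction configuration for the sourcetype.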
Hi Rajesh, It was the HTTP configuration on my controller; as soon as I changed it to HTTPS and re-deployed my cluster-agent, it started reporting to my controller. Thanks for the help and patience. Have a great day! Regards, Gustavo Marconi
OK so this size doesn't look like it should give you a problem, so it is possibly down to your actual data. Does it fail for all values of id? Are there other fields that you could try adding instead of count_err which might work? Can you break down the problem further to try and isolate the issue?
Are you sure your lookahead is big enough? I haven't counted exactly, but your event seems close to exceeding that 650-character mark before reaching the timestamp. Also, have you verified your TIMESTAMP_PREFIX? That capture group looks strange, and you have a very strange lookbehind which seems not to do what you think it should do. Verify it on regex101.com.
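For reference, the standard timestamp settings live in props.conf as TIME_PREFIX, TIME_FORMAT and MAX_TIMESTAMP_LOOKAHEAD. A minimal sketch (the sourcetype name and values are placeholders, not the original config):

[my_sourcetype]
# Regex matched immediately before the timestamp; the lookahead is counted from where this matches
TIME_PREFIX = timestamp=
# Strptime format of the timestamp itself
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N
# How many characters past the TIME_PREFIX match Splunk scans for the timestamp
MAX_TIMESTAMP_LOOKAHEAD = 650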
This looks awfully close to a part of a JSON structure inserted as a string field in another JSON structure. It is bad on at least two levels:
1) Embedding JSON as an escaped string prevents it from being properly parsed by Splunk.
2) Extracting from structured data with regexes is asking for trouble.
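If the source cannot be fixed, one search-time workaround is to let spath handle the un-escaping: extracting the string field from the outer JSON yields the inner JSON un-escaped, which can then be parsed in a second pass. A minimal sketch, assuming the outer event is valid JSON and the embedded document sits in a field named message (a hypothetical name):

| spath output=inner path=message
| spath input=inner

After that, the inner document's fields (e.g. SourceIp) become available as normal search-time fields.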
1. The CM does not manage the SHC; the CM manages the indexer cluster. The deployer (not the deployment server!) is used to push configuration to the SHC.
2. As @Tom_Lundie said, you don't add inputs using the GUI on the SHC. In fact, you shouldn't use the SHC to run inputs at all. Even in a smaller environment you shouldn't run inputs on a standalone SH; that's what HFs are for.
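For reference, the deployer workflow is roughly: stage the app under $SPLUNK_HOME/etc/shcluster/apps on the deployer, then push the bundle to the cluster. A sketch (host name and credentials are placeholders):

# On the deployer, after placing the app under $SPLUNK_HOME/etc/shcluster/apps/<app_name>/
splunk apply shcluster-bundle -target https://<any_shc_member>:8089 -auth admin:<password>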
Your recovery event doesn't seem to match the rex pattern you are applying to it. Are there other recovery events which do match? Do you want to ignore the recovery events which don't match the rex pattern? P.S. You can leave the transaction command in if you like but I don't see what value it is giving you because all the information for the event appears to be in the single event (and therefore the transaction command is just wasting time and resources?).
What have you tried so far?  What error do you get?  Are you trying to extract the field at index-time or search-time? Have you tried this rex command in your search? | rex "SourceIp\\\\\\":\\\\\\"(?<SourceIp>[\d\.]+)"
So you didn't "find something else that helped".  You used my answer.
I don't understand the reply.  Did my answer work or not?  If your problem is resolved, then please click the "Accept as Solution" button to help future readers.
Hi @Tom_Lundie, I am checking whether there is any way we can download the sandboxing result as a PDF. Regards, Harisha
Hi, If you are facing a specific error then please post it here. Otherwise if you just need general guidance then I would start with the documentation: Create a new playbook in Splunk SOAR (Cloud) - Splunk Documentation
Hi, This is by design. The problem with running modular inputs on the SHC layer is that if all of the nodes in the cluster attempt to run the input you would get duplicated data and all sorts of problems. Splunk seem to be actively developing a solution for this but do not officially support one at the time of writing. That being said, a handful of apps do have official support (e.g. Splunk DB Connect). These seem to rely on the run_only_one directive in inputs.conf to ensure they only run on the captain node, to prevent duplication. Unless your TA has official support for deployment on an SHC, I would recommend using a separate, dedicated instance for input collection, such as a Heavy Forwarder.
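For apps that do support it, the stanza tends to look something like the sketch below; whether your particular TA honours this setting is something to confirm in its own documentation (the input name and interval here are placeholders):

[script://./bin/my_modular_input.py]
# Ask the SHC to run this input on only one member (typically the captain), as described above
run_only_one = true
interval = 300
disabled = 0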
Hi, 5 columns and 79 rows.
Below is the search query for the Icinga Problem events.

Below is the search query for the Icinga Recovery events.

If you want me to get rid of the transaction command, that's fine. I would like to group multiple events into a single meta-event that represents a single physical event.
Hello, I want to extract an IP field from a log but I get an error. This is a part of my log: ",\"SourceIp\":\"10.10.6.0\",\"N and I want 10.10.6.0 as a field. Can you help me?