All Posts

| addcoltotals count_carmen inserts updates errors
| eval count_carmen=if(isnull(_time), count_carmen-inserts-updates-errors, count_carmen)
| eval inserts=if(isnull(_time), null(), inserts)
| eval updates=if(isnull(_time), null(), updates)
| eval errors=if(isnull(_time), null(), errors)
Thanks, worked perfectly.
Hi @Jeff.Arnold, Thanks for asking your question on Community. I did a quick search in the community and found this older post. Check it out as it provides two links for different resources. https://community.appdynamics.com/t5/General-Discussions/Is-there-any-sample-application-and-data-I-can-use-to-learn-AppD/m-p/33936
Hello all, could you help me with one question: is it possible to add a PNG image on top of a rectangle? As an example, the rectangle is set up like this. Is it possible to place an image in the corner of the rectangle?

<a href="">
  <g>
    <rect style="fill:color_grey" width="150" height="90" x="1200" y="200"/>
  </g>
</a>

Thank you for any help and answers.
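Assuming the snippet lives inside an SVG context, an <image> element can be layered as a sibling of the <rect> inside the same group; a minimal sketch (the icon file name and the 30x30 size are placeholders, and older SVG renderers may need xlink:href instead of href):

```xml
<a href="">
  <g>
    <rect style="fill:color_grey" width="150" height="90" x="1200" y="200"/>
    <!-- hypothetical icon, anchored to the rectangle's top-right corner:
         x = 1200 + 150 - 30 -->
    <image href="icon.png" x="1320" y="200" width="30" height="30"/>
  </g>
</a>
```

Elements that appear later in the document paint on top, so the image sits above the rectangle.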
I want to add the three fields inserts, updates, and errors, subtract that sum from count_carmen, and add the result as a new row.
Hi Experts, I would like to rename the sourcetype at index time with the config below.

props.conf

[source::test/source.txt]
TRANSFORMS-sourcetype = newsourcetype

transforms.conf

[newsourcetype]
SOURCE_KEY = MetaData:Sourcetype
REGEX = regex to match existing sourcetype
FORMAT = sourcetype::newsourcetype
DEST_KEY = MetaData:Sourcetype

Now I would like to apply the settings below to the new sourcetype:

[newsourcetype]
TZ =
LINE_BREAKER =
TRUNCATE =
etc.

Will it work this way? Please let me know.

Thanks,
Ram
We have about 14k events in an event index, and the data is unstructured. I'm trying to ingest this data into a metric index at search time using the mcollect command, and I was able to convert the event logs to metrics. The Splunk docs state that a metric index is optimized for the storage and retrieval of metric data. While there is an improvement in search time, the storage size drastically increased instead of decreasing. How is storage optimized in the case of a metric index? Is there any additional configuration that needs to be set up? I have set always_use_single_value_output for the mcollect command to false in limits.conf.
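For reference, a minimal single-measure mcollect pattern looks roughly like this; the index, metric name, and source field are placeholders, and any other fields left in the results become dimensions of the metric data points:

```
... your base search ...
| eval metric_name="app.response_time", _value=response_time
| mcollect index=my_metrics
```

One thing to watch: high-cardinality dimension fields multiply the number of distinct metric series, which can make the metric index larger than the original events, so it is worth stripping fields that are not needed as dimensions before the mcollect.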
Hi, I have set up an environment at home to learn. I have 2 instances: one serving as a Splunk forwarder where I have my data, and the other serving as deployment server + indexer + search head. I configured the serverclass and the app; however, I'm not getting data into the index from the forwarder, even though I checked the logs on the latter and the connection is successful. Is it because of the trial license? Any thoughts on why it is not working as expected? Any info would be appreciated. Thanks.
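One quick check, assuming the forwarder's default forwarding of its own internal logs is in place: search for the forwarder's splunkd logs on the indexer (the host name here is a placeholder):

```
index=_internal host="my-forwarder" sourcetype=splunkd
```

If events show up, the forwarding connection itself works and the problem is more likely in the deployed inputs or the serverclass mapping; if nothing shows up, the outputs/receiving side needs a look first. A trial license by itself should not block receiving data.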
How do I change the color of a single value with a trend? By default it is red for a negative trend and green for a positive one. I want my single value color to follow the ranges I define, whether the trend is negative or positive. Below is my source code.

      <earliest>-30d@d</earliest>
      <latest>now</latest>
      <sampleRatio>1</sampleRatio>
    </search>
    <option name="colorBy">trend</option>
    <option name="colorMode">none</option>
    <option name="drilldown">none</option>
    <option name="numberPrecision">0</option>
    <option name="rangeColors">["0xff1414","0xdc4e41","0x53a051","0xf1813f","0xdc4e41"]</option>
    <option name="rangeValues">[20,30,50,100]</option>
    <option name="showSparkline">1</option>
    <option name="showTrendIndicator">1</option>
    <option name="trellis.enabled">0</option>
    <option name="trellis.scales.shared">1</option>
    <option name="trellis.size">medium</option>
    <option name="trendColorInterpretation">standard</option>
    <option name="trendDisplayMode">absolute</option>
    <option name="trendInterval">-30d</option>
    <option name="unitPosition">after</option>
    <option name="useColors">1</option>
    <option name="useThousandSeparators">1</option>
  </single>
</panel>

Thank you in advance.
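If the intent is for the number's color to come from rangeValues/rangeColors rather than from the trend direction, the single value visualization's documented colorBy option can be switched from trend to value; a sketch of just the relevant options (not a full panel):

```xml
<option name="colorBy">value</option>
<option name="useColors">1</option>
<option name="rangeValues">[20,30,50,100]</option>
<option name="rangeColors">["0xff1414","0xdc4e41","0x53a051","0xf1813f","0xdc4e41"]</option>
```

Note that rangeColors takes one more color than rangeValues has thresholds (five colors for four boundaries, as above); the trend indicator arrow keeps its own coloring via trendColorInterpretation.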
Hello, how do I query a field in dbxquery that contains a colon? I ran the following query and got an error. Thank you.

| dbxquery connection=visibility query="select abc:def from tableCompany"

org.postgresql.util.PSQLException: ERROR: syntax error at or near ":" Position:

I tried to put single quotes around it:

| dbxquery connection=visibility query="select 'abc:def' from tableCompany"

but it gave me the following result (a literal string, repeated for each row):

?column?
abc:def
abc:def
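In PostgreSQL, single quotes create a string literal (which is why every row came back as the text abc:def), while identifiers containing special characters must be wrapped in double quotes. Assuming dbxquery passes backslash-escaped double quotes through to the database unchanged, a sketch:

```
| dbxquery connection=visibility query="select \"abc:def\" from tableCompany"
```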
Thank you very much for the quick help, that did the trick.
I'm facing a rather peculiar issue with dashboards. When non-admin users, or users without the admin_all_objects capability, access the dashboard, all panels display "Waiting for data..." indefinitely. The strangest part is that if the user clicks on a panel's search and is redirected to the search view, the results appear immediately.

Here's what I've tried so far:
- Searched through community questions and issues, but found nothing that matches this issue exactly.
- Experimented with different capabilities, but it seems only admin_all_objects solves the issue.
- Attempted to adjust the job limits to match those set for admin users.

Assigning the admin_all_objects capability to all users is not a viable solution for me due to security concerns. Has anyone encountered this issue before? I'm running out of ideas and would appreciate any help or insights.

Note: also reproduced on a local instance deployed via ansible-role-for-splunk.

Thank you in advance for your time and assistance.
Hello @splunkreal, thanks for replying; this had completely slipped my mind. I reached out to support about it, and the conclusion was that it is possible. The best way to increase the parameter is to do it gradually and monitor the effects on the platform. Regards
Hello, we are new to the Splunk environment and are using Enterprise v9.01. We have a complete driver package from CData that allows us to use 100+ different ODBC and JDBC drivers. I tried the Splunk DB Connect add-on and I can connect to a SQL DB. Can Splunk actually make connections to other JDBC/ODBC data sources from CData, such as MongoDB, Teams, OneNote, etc.? Please let us know.
Hello, we are from a software editor's integration team and we would like to help our customers easily integrate our logs into their Splunk. So we developed a Python script, using your samples plus our own code, to access our audit trail API. The script works well outside Splunk: it retrieves our logs as soon as there are new entries and forwards the JSON result to stdout. But as soon as we put it inside Splunk we get "ERROR ExecProcessor" errors which are not very self-explanatory. Reassembled from the log (each line was prefixed with 08-30-2023 06:33:05.632 -0700 ERROR ExecProcessor [4316 ExecProcessor] - message from ...), the traceback is:

  File "...bin\scripts\Final-2.py", line 57, in <module>
    response = requests.get(url, headers={'Content-Type': 'application/json'}, cert=cert_context, verify=False)
  File "C:\Program Files\Splunk\Python-3.7\lib\site-packages\requests\api.py", line 76, in get
    return request('get', url, params=params, **kwargs)
  File "C:\Program Files\Splunk\Python-3.7\lib\site-packages\requests\api.py", line 61, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\Program Files\Splunk\Python-3.7\lib\site-packages\requests\sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Program Files\Splunk\Python-3.7\lib\site-packages\requests\sessions.py", line 655, in send
    r = adapter.send(request, **kwargs)
  File "C:\Program Files\Splunk\Python-3.7\lib\site-packages\requests\adapters.py", line 416, in send
    self.cert_verify(conn, request.url, verify, cert)
  File "C:\Program Files\Splunk\Python-3.7\lib\site-packages\requests\adapters.py", line 250, in cert_verify

It seems our script is refused at the line:

response = requests.get(url, headers={'Content-Type': 'application/json'}, cert=cert_context, verify=False)

We tried with and without verify=False, with no clue why it is refused. Do you have any idea why it gets stuck inside Splunk? (We tried on Linux and on Windows with the same result.)

Best regards,
TrustBuilder team
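One observation on the traceback: it stops inside requests' cert_verify, which raises when the certificate file given via cert= cannot be found on disk. Scripts launched by Splunk do not run from the script's own directory, so a relative certificate path that works on the command line can fail inside Splunk. A sketch of one way to make the path robust (the file names are hypothetical placeholders, not from the original script):

```python
import os
import sys

# Anchor certificate paths to the script's own directory instead of the
# process working directory, which differs when Splunk launches the script.
SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))

# Hypothetical client certificate / key pair for requests' cert= parameter.
cert_context = (
    os.path.join(SCRIPT_DIR, "client.pem"),
    os.path.join(SCRIPT_DIR, "client.key"),
)

def log_error(message: str) -> None:
    """Write diagnostics to stderr: Splunk routes a scripted input's stderr
    into splunkd.log, while stdout is indexed as event data."""
    sys.stderr.write(message + "\n")
```

Printing the resolved paths (and os.path.exists() on them) via log_error before the requests.get call would confirm or rule this out quickly.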
Field names with special characters such as dots (.) need to be referenced in single quotes, and it looks like your time value is in milliseconds, not seconds (epoch time uses seconds). Try this:

| makeresults
| fields - _time
| eval alert.createdAt=1693398386408
| eval c_time=strftime('alert.createdAt'/1000, "%m-%d-%Y %H:%M:%S.%3N")
| table c_time
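As a cross-check outside Splunk, the same divide-by-1000 conversion can be sketched in plain Python (shown in UTC; Splunk's strftime renders the time in the searching user's time zone, so the clock value may differ):

```python
from datetime import datetime, timezone

# 1693398386408 is epoch *milliseconds*; divide by 1000 to get seconds.
ms = 1693398386408

dt = datetime.fromtimestamp(ms / 1000, tz=timezone.utc)
print(dt.strftime("%m-%d-%Y %H:%M:%S"))  # 08-30-2023 12:26:26
```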
Solution here: https://community.splunk.com/t5/Knowledge-Management/Kvstore-Status-failed/m-p/656113#M9664
Feels like I tried every suggestion here, from renaming the cert, generating my own, messing around with the windows cert store and the .conf files... And this solution finally worked! Thank you so much!!
Try like this <init> <set token="input">!@#$%^&amp;*(){}|\";:&lt;&gt;/\\[]</set> </init> <row> <panel depends="$alwayshide$"> <html> <style> #escaped table tbody td div.multivalue-subcell[data-mv-index="1"] { display: none; } </style> </html> </panel> <panel id="escaped"> <table> <title>$escaped$</title> <search> <query>| makeresults | fields - _time | eval param=$input|s$ | eval param=mvappend(param,replace(param,"([!@#$%^&amp;*\(\)\{\}\|\";:&lt;&gt;\/\\\[\]])","\\\\\1"))</query> <earliest>-24h@h</earliest> <latest>now</latest> </search> <option name="drilldown">cell</option> <option name="refresh.display">progressbar</option> <drilldown> <eval token="escaped">mvindex($click.value$,1)</eval> </drilldown> </table> </panel> </row>
Hi All, for those who are familiar with AWS CloudTrail logs: these have details about every API call and every event that occurs in your AWS account. Is there an equivalent in Azure that can be ingested into Splunk? We have the "Splunk Add-on for Microsoft Cloud Services" installed in our environment. What input or config is required to pull in the CloudTrail-equivalent logs? As of now, we are getting compute logs and Azure AD events via this add-on.