All Posts


I have created an addon with a few input parameters. One of them is a dropdown list box. When I add a data input from within the app created by the addon, the dropdown shows fine and I can select an item from it. However, when I create the same data input from the Settings -> Data Inputs menu item, the dropdown list box is shown as a textbox. Any ideas on what I might be doing wrong? Thanks in advance.
@PickleRick Thanks. I think I had run them before, but I tried again to verify: the splunk list monitor output matches the splunk list inputstatus output. I will try btool next.
In the data, there is an array of 5 commit IDs. For some reason, the query is only returning 3 values. Not sure why 2 values are missing. Would like a fresh set of eyes to take a look please.

Query:
index=XXXXX source="http:github-dev-token" eventtype="GitHub::Push" sourcetype="json_ae_git-webhook"
| spath output=commit_id path=commits.id

sourcetype definition:
[json_ae_git-webhook]
AUTO_KV_JSON = false
CHARSET = UTF-8
KV_MODE = json
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = true
TRUNCATE = 100000
category = Structured
description = JavaScript Object Notation format. For more information, visit http://json.org/
disabled = false
pulldown_type = true

Raw JSON data:
{ "ref":"refs/heads/Dev", "before":"d53e9b3cb6cde4253e05019295a840d394a7bcb0", "after":"34c07bcbf557413cf42b601c1794c87db8c321d1", "commits":[ { "id":"a5c816a817d06e592d2b70cd8a088d1519f2d720", "tree_id":"15e930e14d4c62aae47a3c02c47eb24c65d11807", "distinct":false, "message":"rrrrrrrrrrrrrrrrrrrrrr", "timestamp":"2024-08-12T12:00:04-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/aaaaaaaaaaaa", "author":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "committer":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "added":[ ], "removed":[ ], "modified":[ "asdafasdad.json" ] }, { "id":"a3b3b6f728ccc0eb9113e7db723fbfc4ad220882", "tree_id":"3586aeb0a33dc5e236cb266c948f83ff01320a9a", "distinct":false, "message":"xxxxxxxxxxxxxxxxxxx", "timestamp":"2024-08-12T12:05:40-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/a3b3b6f728ccc0eb9113e7db723fbfc4ad220882", "author":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "committer":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "added":[ ], "removed":[ ], "modified":[ "sddddddf.json" ] }, {
"id":"bdcd242d6854365ddfeae6b4f86cf7bc1766e028", "tree_id":"8286c537f7dee57395f44875ddb8b2cdb7dd48b2", "distinct":false, "message":"Updating pipeline: pl_gwp_file_landing_check. Adding Sylvan Performance", "timestamp":"2024-08-12T12:06:10-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/bdcd242d6854365ddfeae6b4f86cf7bc1766e028", "author":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "committer":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "added":[ ], "removed":[ ], "modified":[ "asadwefvdx.json" ] }, { "id":"108ebd4ff8ae9dd70e669e2ca49e293684d5c37a", "tree_id":"5a6d71393611718b8576f8a63cdd34ce619f17dd", "distinct":false, "message":"asdrwerwq", "timestamp":"2024-08-12T10:09:33-07:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/108ebd4ff8ae9dd70e669e2ca49e293684d5c37a", "author":{ "name":"dfsd", "email":"l.llllllllllll@aaaaaa.com", "username":"aaaaaa" }, "committer":{ "name":"lllllllllllll", "email":"l.llllllllllll@abc.com", "username":"aaaaaa" }, "added":[ ], "removed":[ ], "modified":[ "A.json", "A.json", "A.json" ] }, { "id":"34c07bcbf557413cf42b601c1794c87db8c321d1", "tree_id":"5a6d71393611718b8576f8a63cdd34ce619f17dd", "distinct":true, "message":"asadasd", "timestamp":"2024-08-12T13:32:45-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/34c07bcbf557413cf42b601c1794c87db8c321d1", "author":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "committer":{ "name":"GitasdjwqaikHubasdqw", "email":"noreply@gitskcaskadahuqwdqbqwdqaw.com", "username":"wdkcszjkcsebwdqwdfqwdawsldqodqw" }, "added":[ ], "removed":[ ], "modified":[ "a.json", "A1.json", "A1.json" ] } ], "head_commit":{ "id":"34c07bcbf557413cf42b601c1794c87db8c321d1", "tree_id":"5a6d71393611718b8576f8a63cdd34ce619f17dd", "distinct":true, "message":"sadwad from 
xxxxxxxxxxxxxxx/IH-5942-Pipeline-Change\n\nIh 5asdsazdapeline change", "timestamp":"2024-08-12T13:32:45-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/3weweeeeeeeee, "author":{ "name":"askjas", "email":"101218171+asfsfgwsrsd@users.noreply.github.com", "username":"asdwasdcqwasfdc-qwgbhvcfawdqxaiwdaszxc" }, "committer":{ "name":"GsdzvcweditHuscwsab", "email":"noreply@gitasdcwedhub.com", "username":"wefczeb-fwefvdszlow" }, "added":[ ], "removed":[ ], "modified":[ "zzzzzzz.json", "Azzzzz.json", "zzzz.json" ] } }
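Outside Splunk, a quick sanity check is to parse the raw event directly and count the ids: a plain JSON parse returns every element of the array, so if Splunk returns fewer, the issue is in how the event was ingested or how the path is addressed (note that spath addresses JSON array elements as commits{}.id rather than commits.id). The payload below is a simplified stand-in for the real webhook event, not the actual data:

```python
import json

# Simplified stand-in for the webhook payload: five commits, each with an "id"
payload = json.dumps({
    "ref": "refs/heads/Dev",
    "commits": [{"id": f"commit-{n}"} for n in range(5)],
})

# A straight JSON parse recovers all five ids from the array
event = json.loads(payload)
commit_ids = [c["id"] for c in event["commits"]]
print(len(commit_ids))  # 5
```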
Are you sure you want Splunk Enterprise? You listed Enterprise Security as the associated product. Anyway, 6.6 was released something like 7 or 8 years ago and has been EOL for at least 4 years now. Are you still running that?
The link you were pointed to is a very old thread. The same functionality is now implemented with a special command, so you can run (on your UFs, not on your SH!):

splunk list monitor
splunk list inputstatus

The first command will show you the effective configuration of your monitor inputs. The second one will give you the state of your inputs. Of course, you can additionally verify your combined config using:

splunk btool inputs list monitor --debug
These are two separate issues. One is that you're making a call to an https endpoint without verifying the server's certificate. That's not a very secure thing to do (especially since you're authenticating yourself against an unverified party), so you're getting a warning from the script. The other is that you're not properly authenticating to the server. That's why you're getting an error response from the server.
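For the first issue, the fix is to let the TLS layer verify the certificate rather than suppressing the warning. A minimal stdlib Python sketch (the endpoint URL and token are placeholders, not from the original post):

```python
import ssl
import urllib.request

# The default SSL context verifies the server certificate against the
# system CA store and checks the hostname -- the opposite of verify=False.
ctx = ssl.create_default_context()

# Hypothetical endpoint and credential; substitute your own.
req = urllib.request.Request(
    "https://api.example.com/v1/data",
    headers={"Authorization": "Bearer <your-token>"},
)

# urllib.request.urlopen(req, context=ctx) would now fail loudly on an
# untrusted certificate instead of silently skipping verification.
```

If the server uses an internal CA, load that CA bundle with ctx.load_verify_locations() instead of disabling verification.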
Read my notes and kept trying until I got it!

index=etims_na sourcetype=etims_prod platformId=5 bank_fiid=COST
| eval response_time=round(if(strftime(_time,"%Z") == "EDT",((j_timestamp-entry_timestamp)-14400000000)/1000000,((j_timestamp-entry_timestamp)-14400000000)/1000000-3600),3)
| stats count AS Transactions count(eval(response_time <= 1)) AS "Good" count(eval(response_time <= 2)) AS "Fair" count(eval(response_time > 2)) AS "Unacceptable" avg(response_time) AS "Average" BY bank_fiid
| eval "%Good"=(Good/Transactions)*100
| eval "%Fair"=(Fair/Transactions)*100
| eval "%Unacceptable"=(Unacceptable/Transactions)*100
| addinfo
| eval "Report Date"=strftime(info_min_time, "%m/%Y")
| table bank_fiid, "Transactions", "Good", "%Good", "Fair", "%Fair", "Unacceptable", "%Unacceptable", "Average", "Report Date"
| rename bank_fiid as "Vision ID"
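The timezone arithmetic in that eval can be mirrored in a few lines of Python as a sanity check. The timestamp values below are made up; the 14400000000 constant is the 4-hour EDT offset the search subtracts, expressed in microseconds:

```python
# Mirrors the search's eval: timestamps are in microseconds, and the
# search subtracts a fixed 4-hour offset (in microseconds), plus an
# extra hour when the event is outside daylight time.
UTC_OFFSET_US = 14_400_000_000

def response_time(j_us, entry_us, tz="EDT"):
    seconds = ((j_us - entry_us) - UTC_OFFSET_US) / 1_000_000
    if tz != "EDT":
        seconds -= 3600  # standard time: shift a further hour
    return round(seconds, 3)

# Hypothetical pair of timestamps 4h 1.5s apart
print(response_time(14_401_500_000, 0))  # 1.5
```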
Valid keys are documented in the related .spec file in $SPLUNK_HOME/etc/system/README. For instance, for the keys for props.conf, see $SPLUNK_HOME/etc/system/README/props.conf.spec. You can also find the information in the Admin Manual.
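If you want the key list programmatically, the .spec format is simple enough to scan with a few lines of Python. The stanza and keys below are illustrative only, not copied from a real spec file:

```python
import re

# A tiny excerpt in the .spec style used under $SPLUNK_HOME/etc/system/README
# (illustrative content, not the real file_integrity spec)
spec_text = """\
[file_integrity]
path = <string>
* Path to check.
interval = <integer>
* How often to check.

[other_stanza]
foo = <bool>
"""

def valid_keys(spec, stanza):
    """Return the keys documented under a given [stanza] in spec text."""
    keys, current = [], None
    for line in spec.splitlines():
        header = re.match(r"\[([^\]]+)\]", line)
        if header:
            current = header.group(1)
        elif current == stanza:
            key = re.match(r"(\w+)\s*=", line)
            if key:
                keys.append(key.group(1))
    return keys

print(valid_keys(spec_text, "file_integrity"))  # ['path', 'interval']
```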
Hi, we have a custom Python service being monitored by APM using the OpenTelemetry agent. We have been successful in tracing spans related to our unsupported database driver (clickhouse-driver), but are wondering if there is some tag we can use to get APM to recognize these calls as database calls for the purposes of the "Database Query Performance" screen. I had hoped we could just fill out a bunch of the `db.*` semantic conventions, but none have so far worked to get it to show as a database call (though the instrumented data do show up in the span details). Any tips?
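For reference, these are the attribute names from the OpenTelemetry database semantic conventions that are usually involved; the host value is a placeholder, and whether Splunk APM's Database Query Performance view keys off them for unsupported drivers is exactly the open question here. Typically the span also needs to be of kind CLIENT:

```python
# db.* span attributes per the OpenTelemetry database semantic conventions.
# Set these on the CLIENT span wrapping each query, e.g. via
# span.set_attribute(key, value) from opentelemetry-api.
db_attributes = {
    "db.system": "clickhouse",    # registered db.system value for ClickHouse
    "db.statement": "SELECT 1",   # the query text (sanitize if sensitive)
    "db.name": "default",         # logical database name
    "net.peer.name": "clickhouse.internal.example",  # hypothetical host
}
```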
Is there a way to get a list of valid keys for a stanza? For example, if you get "Invalid key in stanza" for something like:

[file_integrity]
exclude = /file/path

It doesn't like "exclude", but is there an alternative key to accomplish the same thing? Thanks in advance!
I'm trying to achieve the following output using the table command, but am hitting a snag.

Vision ID | Transactions | Good | % Good | Fair | % Fair | Unacceptable | % Unacceptable | Average Response Time | Report Date
ABC STORE (ABCD) | 159666494 | 159564563 | 99.9361601 | 101413 | 0.063515518 | 518 | 0.000324426 | 0.103864001 | Jul-24
Total | 159666494 | 159564563 | 99.9361601 | 101413 | 0.063515518 | 518 | 0.000324426 | 0.103864001 | Jul-24

Thresholds: Good = response <= 1s; Fair = 1s < response <= 3s; Unacceptable = 3s < response

Here is my broken query:

index=etims_na sourcetype=etims_prod platformId=5 bank_fiid = ABCD
| eval response_time=round(if(strftime(_time,"%Z") == "EDT",((j_timestamp-entry_timestamp)-14400000000)/1000000,((j_timestamp-entry_timestamp)-14400000000)/1000000-3600),3)
| stats count AS Total count(eval(response_time<=1)) AS "Good" count(eval(response_time<=2)) AS "Fair" count(eval(response_time>2)) AS "Unacceptable" avg(response_time) AS "Average" BY Vision_ID
| eval %Good= round((Good/total)*100,2), %Fair = round((Fair/total)*100,2), %Unacceptable = round((Unacceptable/total)*100,2)
| addinfo
| eval "Report Date"=strftime(info_min_time, "%m/%Y")
| table "Vision_ID", "Transactions", "Good", "%Good" "Fair", "%Fair", "Unacceptable", "%Unacceptable", "Average", "Report Date"

The help is always appreciated. Thanks!
Thanks @PaulPanther. I checked the link on my SH, but I'm not sure exactly what I am looking for. I did search for the missing logs (secure and audit.log) but didn't see anything; at the same time, I didn't see any mention of the logs that are being ingested, like messages and cron. Thanks for your help.
I have done many Splunk React apps (Enterprise Security, Mission Control, etc.). In my experience, the easiest way is to embed your React code in an app. One of the clearest ways to do this is to follow the instructions provided in the Splunk UI library. It can be somewhat daunting at first to use Splunk UI and its build scripting, but so many things will just work once you have your React code packaged in an app. Authentication won't be an issue for you at all anymore, and you can easily call Splunk endpoints and they will just work.
| savedsearch "Incident Review - Main" time_filter="" event_id_filter="" source_filter="" security_domain_filter="" status_filter="status=\"1\"" owner_filter="" urgency_filter="urgency=\"critical\" OR urgency=\"high\" OR urgency=\"medium\" OR urgency=\"low\" OR urgency=\"unknown\"" tag_filter=""
Similar error here, resolved: I had a correlation rule in ES calling a saved search; it required the addition of "type_filter".
Hi @fvincenzi, the easiest way is to ask Splunk Support. Otherwise, some weeks ago someone hinted at a site containing old versions. Ciao. Giuseppe
Hello, I need to download Splunk Enterprise 7.2.* in order to upgrade from version 6.6. Where can I find the older versions? Thank you
Looking to add a tooltip string of site names included in the same lookup file as the long/lat on a cluster map. Is this even possible?
I am looking to add text as well. I am trying to add a tooltip string but haven't had any luck.
Hi @MK3, sorry, but there's some confusion in your question. To forward data from Forwarders to Splunk Enterprise, you have to follow the instructions at:
https://docs.splunk.com/Documentation/SplunkCloud/latest/Forwarding
https://docs.splunk.com/Documentation/Splunk/9.3.0/Data/Forwarddata
To forward data you need outputs.conf, which can be in $SPLUNK_HOME/etc/system/local or in a dedicated app. To take in logs, you need inputs.conf, which is in the same folder. props.conf and transforms.conf are in the same folder, but usually aren't relevant on Forwarders (if Universal). $SPLUNK_HOME is the folder where you installed Splunk; by default it's C:\Program Files\splunk on Windows and /opt/splunk on Linux. You cannot send indexed data from a Heavy Forwarder, because it doesn't index data, but maybe you mean cooked data: you can send cooked (or uncooked) data to a third party using syslog. To send data to an external database you must use DB Connect on Search Heads, but that's a different thing. Ciao. Giuseppe
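For the forwarding side, a minimal outputs.conf sketch; the indexer hostname is a placeholder, and 9997 is the conventional Splunk receiving port:

```ini
# $SPLUNK_HOME/etc/system/local/outputs.conf (or in a dedicated app)
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997
```

The matching receiver must be enabled on the indexer (Settings -> Forwarding and receiving) for events to flow.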