All Posts

I know this thread is a few years old, but I hope you're still active. Splunk is not pulling the OID off of smartcards to handle the full login itself, so we set up Apache and I made the remoteUser and RequestHeader configurations you described. When Splunk receives the header, however, nothing happens. It logs an entry that ProxySSO is not configured. Have you seen this issue, and do you know how to get past it so we can still use LDAP authentication in Splunk while passing the user name from the proxy via the method you described?
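For reference, that "ProxySSO is not configured" log entry usually means Splunk itself was never told to trust the proxy. A minimal sketch of the settings involved, assuming a reverse proxy on the same host and the classic trusted-IP SSO approach that keeps LDAP as the back-end (IP addresses and the header name are placeholders; check the Splunk docs for your version before applying):

```
# server.conf -- tell splunkd which proxy address to trust
[general]
trustedIP = 127.0.0.1

# web.conf -- accept the forwarded user header from that trusted proxy
[settings]
SSOMode = permissive
trustedIP = 127.0.0.1
remoteUser = REMOTE_USER
```

Note that Splunk also has a separate ProxySSO authentication scheme (`authType = ProxySSO` in authentication.conf), which replaces rather than supplements LDAP login; if the goal is LDAP auth with the proxy only supplying the username, the trusted-IP settings above are the relevant ones.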
I have a query that counts totals for each day for the past 7 days and produces these results: 2, 0, 2, 0, 0, 0, 0. No matter what I do, the SINGLE visualization with timechart and trendlines enabled ignores the trailing zeros and displays a 2, with a trendline increasing to 2. It should display a zero with a trend line representing the last two segments (both zero). Before the main query (as recommended) I have used | makeresults earliest="-7d@d" count=0 to ensure the days with zero count are included. I have tried the suggested appendpipe option: | appendpipe [| stats count | where count=0 | addinfo | eval _time=info_min_time | table _time count] and the appendpipe with max(count) option: | appendpipe [| stats count | where count=0 | addinfo | eval time=info_min_time." ".info_max_time | table time count | makemv time | mvexpand time | rename time as _time | timechart span=1d max(count) as count]. Neither creates the correct timechart. From the dashboard in Edit UI mode, if I click the query's magnifying glass and open it in a new tab, the results do NOT display the trailing zeros. If I copy and paste the query into a search bar with the time picker set to All Time, I get the correct values: 2, 0, 2, 0, 0, 0, 0. Is there an option setting I may have wrong? How do I fix this?
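One pattern that usually forces the empty trailing days to appear is to pin the time range on the base search and zero-fill after the timechart, rather than seeding events up front. This is a sketch only; the index and sourcetype names are illustrative stand-ins for the actual base search:

```
index=my_index sourcetype=my_data earliest=-7d@d latest=@d
| timechart span=1d count
| fillnull value=0 count
```

With an explicit earliest/latest on the search itself, timechart emits a bucket for every day in the window even when no events landed there, and fillnull turns any remaining nulls into zeros. It is also worth checking the panel's own time input in the dashboard: a panel inheriting a shared time token can silently use a different range than the All Time picker that produced the correct results in the search bar.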
Hi @Gregory.Burkhead, Have you reported this one, or the others, to AppDynamics Support? How do I submit a Support ticket? An FAQ
Hi, Here is the query and the results. Visualization panels should get created/deleted automatically depending on the rows under the Page column.   index="*" appID="*" environment=* tags="*" stepName="*" status=FAILED | rex field=stepName "^(?&lt;Page&gt;[^\:]+)" | rex field=stepName "^\'(?&lt;Page&gt;[^\'\:]+)" | rex field=stepName "\:(?P&lt;action&gt;.*)" | eval Page=lower(Page) | stats count(scenario) as "Number of Scenarios" by Page | table Page, "Number of Scenarios"   I created the single value visualization panels manually based on the rows, but if the number of rows decreases dynamically, I see N/A in most of the visualization panels. So auto-scaling of the visualization panels is needed in this scenario.
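If the goal is one single-value tile per row of that result, the trellis layout can do the splitting automatically instead of hand-created panels, so tiles appear and disappear with the rows. A minimal Simple XML sketch under that assumption (option names as in recent Splunk Enterprise versions; the query is abbreviated):

```
<panel>
  <single>
    <search>
      <query>index="*" stepName="*" status=FAILED
| rex field=stepName "^(?&lt;Page&gt;[^\:]+)"
| eval Page=lower(Page)
| stats count(scenario) as "Number of Scenarios" by Page</query>
    </search>
    <option name="trellis.enabled">1</option>
    <option name="trellis.splitBy">Page</option>
  </single>
</panel>
```

With trellis enabled and split by Page, Splunk renders one single-value visualization per distinct Page value in the results, which sidesteps the N/A tiles left behind when a manually built panel's row no longer exists.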
I'm seeing this error from the _internal index in the web_service.log and the python.log: "startup:116 - Unable to read in product version information; isSessionKeyDefined=True error=[HTTP 401] Client is not authenticated" Does anyone have more information on this error?
Hi, I would like to have the Citrix Cloud add-on installed into Splunk Cloud. How can I achieve this?
@Mohd_Harahsheh9 Please find below the Tenable and Splunk integration documents.  Tenable and Splunk Integration Guide  Troubleshooting (tenable.com) Tenable Data in Splunk Dashboard  --- If this reply helps you, Karma would be appreciated.
@SCruz Follow the document below and go through the steps that match your requirement to install the add-on or app in Splunk Cloud.  Install an add-on in Splunk Cloud Platform - Splunk Documentation  --- If this reply helps you, Karma would be appreciated.
Hi, thanks. This is almost what I need. I think I need to expand on my requirements a bit more.   ```Example``` |makeresults |eval sample="100" |eval perc="45" |eval name=if(sample=100,"C","N/A") |timechart max(sample) as "The Sample yields $name$", avg(perc) as "percentage" ```Expected outcome would be a timechart with a column named "The Sample yields C" and another column titled "percentage"``` Using the BY clause appends that eval'd field to the column name, but not all columns need that field name. I'd think the easiest way of doing it would be some type of variable replacement, but it seems that the AS clause does not allow that.
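Since the AS clause only accepts a literal string (the $name$ syntax is dashboard-token substitution, not search-time substitution), one common workaround is to do the dynamic naming after the timechart with rename. A sketch under that assumption, with the value folded in by hand for the standalone-search case:

```
| makeresults
| eval sample=100, perc=45
| eval name=if(sample=100, "C", "N/A")
| timechart max(sample) as sample_max, avg(perc) as percentage
| rename sample_max as "The Sample yields C"
```

In a dashboard, the same idea can be made dynamic: have a secondary search set a token from the name field (via a `<done>` handler or an eval token), then use `as "The Sample yields $name_tok$"` in the panel's query, since token substitution does work inside dashboard searches. Note also that the original `if(sample=100,"C",N/A)` needs quotes around "N/A", or eval treats it as a field reference.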
Well I guess it is a bug, then.  There are quite a few bugs.
@abhi04 Hello Abhi, To onboard the logs you have to use Splunk add-ons, not apps. They handle tasks related to data ingestion, parsing, extraction, etc. Splunk-certified or custom-written TAs (technology add-ons) adhere to the Common Information Model (CIM) and are often used for data parsing. An app in Splunk provides a front-end interface for visualizing data. It's like a user-friendly dashboard that allows you to explore and analyze information. If this reply helps you, Karma would be appreciated.
Hi Team, how can I ingest Genesys Cloud logs into Splunk? I see two apps: 1. Genesys Pulse Add-on for Splunk https://splunkbase.splunk.com/app/5255 2. Genesys Cloud Operational Analytics App https://splunkbase.splunk.com/app/6552 For the Genesys Pulse Add-on for Splunk, I was able to see that we need to set up the configuration, but for the Genesys Cloud Operational Analytics App I can't find the setup configuration. Which app should be used for Genesys Cloud log ingestion into Splunk Cloud?
Hi @abroun, probably this is the only case where join could be the best solution:   some-search | join type=left id [ search some-search-index $id$ | eval epoch = _time | where epoch < $timestamp$ | sort BY _time | head 1 | fields id status type ] | table id time status type   Ciao. Giuseppe
Hi fellow Splunkers, I recently came across an authentication token created by splunk-system-user, and I had no clue where this token came from; my Splunk admin colleagues didn't create the token either. Is it a feature/normal behavior that Splunk will generate a token every single time you click "view on mobile" from the menu of an XML dashboard? Can we turn it off? We don't want users to be able to freely create an infinite number of authentication tokens, because it would make keeping an overview of tokens much harder, and we don't have the Secure Gateway configured.
I have smart card authentication enabled on my on-prem Enterprise system. I'm using the built-in capability that Splunk has now, not Apache. It had been working great, but when I upgraded my system from 9.0.3 to 9.2.1, I get an Unauthorized error when trying to log on. I changed requireClientCert to false so I could log on with username and password. I checked all my LDAP settings and everything looks the same. I even added another DNS to see if that would change anything, but no luck; I'm still getting the unauthorized error.
Thanks @danspav for your response. First of all, I didn't mention that I'm using Splunk Enterprise 9.0.6, if that makes a difference. The provided XML code is similar to the one originally posted except that it removes the &lt;set&gt; element. I tried it, and when a button was clicked it added the following "form.link_dash" parameter to the main dashboard's URL: /app/search/dash_main?form.link_dash=dash_a With this modified URL now in the URL bar, if the browser reload button is pressed it opens a new tab to dash_a after loading and rendering the main dashboard, as if the button had been clicked. It is like prefilling the button value from the URL parameter.
I'm trying to install "Cisco Networks App for Splunk Enterprise" and "Cisco Networks Add-on for Splunk Enterprise" in Splunk Cloud Version: 9.1.2308.203 Build: d153a0fad666, but it is not possible. When I search for them they do not appear, and if I try to upload them I receive a message informing me: "This app is available for installation directly from Splunkbase. To install this app, use the App Browser page in Splunk Web." But that page is nowhere to be found.
It looks like "epoch_password_last_modified" is a multivalue field. Assuming you want to continue processing this as a set of multivalue fields (although I think you might be better off expanding to individual events or not creating the multivalue fields in the first place), you could try something like this: | eval time_difference=mvmap(epoch_password_last_modified, epoch_current_time - epoch_password_last_modified)
I've found the solution. The problem was mine. If I put : "testcsv.csv" -> it doesn't work. But if I remove the ".csv", it works perfectly... Thanks for your reply.
@siemless This may be best to discuss in the Slack Users group, but I could not find you there, so I will respond here. In some cases, I have boxes where the /opt/splunk dir is mounted to a separate drive with mount point /opt/splunk. In that case you can just swap the disk; I learned this method from AWS support, but it takes preplanning. In other cases, I have boxes that are jacked up, with either volume issues across multiple disks or just not set up to swap disks. In that case you can use the Splunk docs >>> https://docs.splunk.com/Documentation/Splunk/9.2.1/Installation/MigrateaSplunkinstance I argued with Splunk about the documentation steps, but they claim the steps are correct, although I still find them confusing. FWIW, this is what I did: 1 > Create a new host with the new OS (in my case I rename/re-IP to the original afterward). 2 > Install the same version of Splunk on the new host (I used a .tar), set systemd, set the same admin password, then stop splunkd; maybe test a restart and reboot to verify. 3 > Stop splunkd on the old host, tar up /opt/splunk, copy the old .tar to the new box, untar it over the new install, then start splunkd. That worked for me, and going forward all new hosts will be configured for the disk-swappable process. Good luck