All Posts


Reviving the thread a year later: I have the same problem - I had it back in Splunk 6.6.2 and I'm still seeing it in Splunk 8.2.6 years later. No idea why, but I really needed to work around it. Here is what I came up with: I open a file, with a name derived from the search id, with exclusive access for creation in a special `lock` folder. If that succeeds, I proceed with the rest of the code. If it fails, I know it was already "caught" by the previous run of the same command and bail out. Of course, I need some way of tidying up that `lock` folder, which can be done with a scripted input, and not too frequently - once a day or even once a week should be plenty. In theory, I should be able to remove (unlink) that lock file right from the second instance, but that came back to bite me, so I abandoned the idea. Might want to revisit it now, after so many years...
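A minimal sketch of that locking idea in Python, assuming a custom search command and a writable `lock` folder under the app - the path, the lock-file name, and the weekly cleanup threshold are illustrative placeholders, not the original code:

import os
import time

LOCK_DIR = "/opt/splunk/etc/apps/my_app/lock"   # assumed location of the special lock folder

def try_acquire_lock(search_id):
    """Atomically create a lock file named after the search id.
    Returns True if this run got the lock, False if a previous run already created it."""
    path = os.path.join(LOCK_DIR, search_id + ".lock")
    try:
        # O_CREAT | O_EXCL makes the open fail if the file already exists
        fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True
    except FileExistsError:
        return False

def clean_stale_locks(max_age_seconds=7 * 24 * 3600):
    """Housekeeping for the scripted input: delete lock files older than a week."""
    now = time.time()
    for name in os.listdir(LOCK_DIR):
        path = os.path.join(LOCK_DIR, name)
        if now - os.path.getmtime(path) > max_age_seconds:
            os.remove(path)

# In the command itself: bail out if another instance already handled this search id.
if not try_acquire_lock("example_search_id"):
    raise SystemExit(0)
# ... rest of the command's work goes here ...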
The first row can easily be excluded because there is no Count. But the weird _raw points to something unusual about those events, and the failure to extract db_bulk_write_time suggests the same. You need to post more realistic/representative data.
Also, in this case:

2024-08-12 10:53:53.455   2.75    3        2000   s
2024-08-12 10:53:56.205   2.765

the 2nd row should be 2.75 instead (10:53:56.205 - 10:53:53.455 = 2.750 s, not 2.765).
Hi, I am new to Splunk and just got a free Cloud trial. I did the following:
1. Logged in to the Cloud trial instance.
2. Created an index named winpc.
3. Went to Apps > Universal Forwarder and downloaded it on a Windows PC.
4. Installed the forwarder on the Windows PC; during setup I selected the option to use this agent with a cloud instance.
5. Left the receiver index blank - I had no idea about my Splunk instance's FQDN/IP.
6. Checked services - the Splunk Universal Forwarder service is running, logging on as Local System.

Issues:
1. I see no logs in the winpc index I created, even after waiting an hour or so.
2. How can I tell the forwarder to forward Windows and Sysmon logs too? Should I edit the inputs.conf file?

Kindly guide and help so that I may get logs and learn further.

Regards
This is what I get - my first row's Count is empty, and _raw is weird too:

_time                     processing_time  Count  db_bulk_write_time  no_msg_wait_time  _raw
2024-08-12 10:55:41.200   1.226                                       1000              .
2024-08-12 10:55:40.872   0.312            1                          0                 s
2024-08-12 10:55:37.122   3.75             1                          3000              s
2024-08-12 10:55:36.809   0.313            1                          0                 s
2024-08-12 10:55:33.106   3.688            1                          3000              s
2024-08-12 10:55:32.778   0.313            1                          0                 s
2024-08-12 10:55:29.028   3.75             1                          3000              s
2024-08-12 10:55:28.700   0.328            1                          0                 s
2024-08-12 10:55:24.950   3.75             1                          3000              s
2024-08-12 10:55:24.622   0.312            1                          0                 s
2024-08-12 10:55:21.888   2.734            1                          2000              s
2024-08-12 10:55:20.122   1.766            1                          1000              s
I requested it again last week, yet there has been no reply.
Can you expand on how your team did it? Ideally with step-by-step methods.
Thank you. It worked.
Request it again at https://dev.splunk.com/enterprise/dev_license
Hi, I previously had a Splunk Dev license which I used for testing. As my license expired, I requested a new one. It's been more than 3 weeks, yet my request is still pending. Any help is appreciated. Thanks
Hi @PickleRick, a tcpdump on the log collector shows the syslog activity and information whenever we log on, log off, or send a test message from the iDRAC machine, but it is not ingested into Splunk and somehow gets lost on the log collector machine.
Also, could you please share an example dashboard that you have used?
Hi, I would like to display the count of each error code.
I think the approach should be adjusted. When a user selects 2023, you can always make any value out of it, e.g., "2022, 2023". Theoretically, you could even use a secondary token setter to calculate it if the input is free text rather than a selector. Then, your search can simply be

index=cls_prod_app appname=Lacerte applicationversion IN ($applicationversion$) message="featureperfmetrics" NOT(isinternal="*") taxmodule=$taxmodule$ $hostingprovider$ datapath=* operation=createclient $concurrentusers$ latest=-365d@d
| eval totaltimeinsec = totaltime/1000
| bin span=1m _time
| timechart p95(totaltimeinsec) as RecordedTime by applicationversion limit=0

Here is an example of the input in Simple XML:

<input type="dropdown" token="applicationversion">
  <label>Version</label>
  <choice value="2023,2024">2024</choice>
  <choice value="2022,2023">2023</choice>
  <prefix> </prefix>
  <suffix> </suffix>
</input>
Hi @jaibalaraman, You can use multiple Sankey visualizations to display a single source-target-value combination, or you can create mock visualizations using boxes, text, and a single-value visualization. In this Splunk 9.3 example, I've used three adjacent boxes, with the center box having 50% transparency. A markdown element is placed over the center box to provide the text, and a single-value element is placed to the right to provide a count. In your case, however, 403120 appears to be an event identifier and not a count. What are you trying to communicate with individual tiles that can't be represented by a Sankey diagram?
It should be set up such that:
1. A search in Splunk Enterprise has fields you find interesting.
2. This search is used in the "Splunk App for SOAR Export" to send data to SOAR.
3. Each result in your Splunk search should create an artifact in SOAR, and the artifacts are put into a SOAR container based on the field configured as the grouping field in the "Splunk App for SOAR Export".
4. The artifacts will have CEF fields containing the data from the fields of your Splunk search.

Then you can run the playbooks in SOAR on your containers with the artifacts, and the playbooks can run actions using the CEF fields in your artifacts as inputs. Can you confirm that you can view the artifact in SOAR and that it has CEF fields containing your data?
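If it helps, here is a minimal sketch of that check using the SOAR REST API from Python. The host, token, container id, and even the exact filter parameter are assumptions/placeholders and may differ by SOAR version; treat it as a starting point, not the documented procedure:

import requests

SOAR_BASE = "https://soar.example.com"      # placeholder SOAR host
TOKEN = "<automation-user-token>"           # placeholder ph-auth-token value
CONTAINER_ID = 1234                         # placeholder container id

# List the artifacts belonging to one container and print their CEF fields
resp = requests.get(
    f"{SOAR_BASE}/rest/artifact",
    params={"_filter_container_id": CONTAINER_ID, "page_size": 0},
    headers={"ph-auth-token": TOKEN},
    verify=False,  # only for lab instances with self-signed certificates
)
resp.raise_for_status()

for artifact in resp.json().get("data", []):
    print(artifact.get("name"), artifact.get("label"))
    for field, value in (artifact.get("cef") or {}).items():
        print(f"  cef.{field} = {value}")

If the CEF fields print as expected, the export side is working and any remaining issue is in the playbook inputs.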
Here is a .conf presentation about using TLS with Splunk: https://conf.splunk.com/files/2023/slides/SEC1936B.pdf
In the search query, I am trying to view a CSV dataset that shows clusters on a map. I managed to get a visualisation with different-sized bubbles based on the values - bigger bubbles for bigger values. However, once I add it to an existing dashboard, the bubbles disappear. When I set "Data Configurations" -> "Layer Type" to "Marker", the dashboard now shows the clusters, but they are markers of the same size instead of bubbles sized to the different values.

Here is the source code of my visualisation:

{
    "type": "splunk.map",
    "options": {
        "center": [
            1.339638489909646,
            103.82878183020011
        ],
        "zoom": 11,
        "baseLayerTileServer": "https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png",
        "baseLayerTileServerType": "raster",
        "layers": [
            {
                "type": "marker",
                "latitude": "> primary | seriesByName('latitude')",
                "longitude": "> primary | seriesByName('longitude')",
                "bubbleSize": "> primary | frameWithoutSeriesNames('_geo_bounds_east', '_geo_bounds_west', '_geo_bounds_north', '_geo_bounds_south', 'latitude', 'longitude') | frameBySeriesTypes('number')",
                "seriesColors": [
                    "#7b56db", "#cb2196", "#008c80", "#9d6300", "#f6540b",
                    "#ff969e", "#99b100", "#f4b649", "#ae8cff", "#8cbcff",
                    "#813193", "#0051b5", "#009ceb", "#00cdaf", "#00490a",
                    "#dd9900", "#465d00", "#ff677b", "#ff6ace", "#00689d"
                ]
            }
        ]
    },
    "dataSources": {
        "primary": "ds_TmJ6iHdE"
    },
    "title": "Dengue Clusters",
    "context": {},
    "containerOptions": {},
    "showProgressBar": false,
    "showLastUpdated": false
}
@ab73863 - Try this approach with a sessionKey. Specify the Splunk Cloud URL; the port number should be 8089, but you can confirm that with your Splunk Cloud representative or Splunk Cloud Support. I hope this helps!
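For reference, a minimal sketch of the sessionKey flow against the management port. The hostname and credentials are placeholders, and access to port 8089 on Splunk Cloud may need to be enabled for your stack, so adjust for your environment:

import requests
import xml.etree.ElementTree as ET

BASE = "https://yourstack.splunkcloud.com:8089"   # placeholder stack URL plus management port

# 1. Log in and read the sessionKey out of the XML response
login = requests.post(
    f"{BASE}/services/auth/login",
    data={"username": "your_user", "password": "your_password"},
)
login.raise_for_status()
session_key = ET.fromstring(login.text).findtext("sessionKey")

# 2. Reuse the sessionKey on subsequent REST calls via the Authorization header
resp = requests.get(
    f"{BASE}/services/server/info",
    headers={"Authorization": f"Splunk {session_key}"},
    params={"output_mode": "json"},
)
print(resp.status_code)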
Thanks for your answer, let me do that and check if it works. Also, why are you doing SSL in inputs.conf? As per the docs, it should be done in outputs.conf on the HF.