All Posts



Hello comrades, after some research I found that only the heavy forwarder supports props.conf, but the posts saying so were 5 or 6 years old. Does the UF now support props.conf? Also, how do I upgrade to a HF? Many thanks,
Hi @Harley ... I am 80% sure that these apps/add-ons will work smoothly on isolated networks, but this scenario needs to be tested.
I'm trying to extract all the CVEs and their associated CVSS scores from Shodan's API (JSON response). The response is typically in a format where the number after data depends on the number of services detected. Example data:

data: [
  0 22/tcp/OpenSSH: { … },
  1 80/tcp/Apache httpd: {
    vulns: {
      "CVE-2013-6501": {
        cvss: 4.6,
        references: [ … ],
        summary: "The default soap.wsdl_cache_dir setting in (1) php.ini-production and (2) php.ini-development in PHP through 5.6.7 specifies the /tmp directory, which makes it easier for local users to conduct WSDL injection attacks by creating a file under /tmp with a predictable filename that is used by the get_sdl function in ext/soap/php_sdl.c.",

Current search:

| curl method=get uri=https://api.shodan.io/shodan/host/"IP"?key=APIKEY
| spath input=curl_message path="data{0}.vulns" output=test_array
| mvexpand test_array
| spath input=test_array
| table CVE*.cvss

When using curl from WebTools, spath doesn't appear to be extracting all the fields (e.g. only 4 of the 15 CVEs are displayed in the table), likely because of the 5000-character limit for spath. Is there another method that would keep data like the CVE, cvss, and summary linked while splitting the data? Delimiting on commas doesn't seem possible, since the summaries also contain commas.
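If the truncation really is spath's extraction limit, one hedged option is to raise that limit on the search head. The stanza and setting below do exist in limits.conf (the default is 5000 characters); the value shown is only an illustrative example, not a tuned recommendation:

```ini
# limits.conf on the search head -- a sketch, not a tuned value
[spath]
# Maximum number of characters spath reads from an input field (default: 5000)
extraction_cutoff = 50000
```

Raising this lets spath see the whole JSON payload, at the cost of more work per event during extraction.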
On an isolated network with no internet access (considered a transport network), is there still a way to use add-ons/apps?
Thank you so much. It works for me!!!
Thanks Rick! What Rich Galloway stated was that "Splunk recently changed geo-ip providers and no longer ships with a MaxMind database." If that is the case, I was asking which company is the new geo-IP provider that has taken over from MaxMind. Also, in which version of SE did the switchover occur, which directory is the new geo-IP DB in, and what is the new mmdb file name? Best regards, Dennis
Hi @Amit79 ... as you say, it's working fine in Dev but not in Prod, so something is missing or wrong in the prod environment. Please confirm whether the lookup file in prod has yesterday's data or not, and compare dev and prod for the lookup file, search command, etc. Thanks.
Tomorrow is CX Day and we are so excited to be able to shine the spotlight on all things customer experience. At Splunk, we truly believe serving our customers is the bedrock of our business, and that building resilience helps transform the customer experience to drive better business outcomes. Learn more about CX Day and dig into some awesome content we’ve created just for the occasion here. Then join us over on LinkedIn Live tomorrow at 11 AM PT / 2 PM ET as Splunk's Chief Customer Officer, Toni Pavlovich, chats with CX influencer, Blake Morgan. It'll be a conversation full of learning, fun, and celebration.   What does Customer Experience mean to you? Have an example of a particularly amazing customer experience? Drop your thoughts below!
Hello All, I am calculating burn rate in Splunk and using addinfo for enrichment to display it on a dashboard. The burn rate is calculated, but the previous day's burn rate is not being stored anywhere Splunk could refer to. I run a report that pushes the values to a lookup with the outputlookup command, and the search below reads from it. In the Dev environment it works fine, but when I move it to production it breaks: the values are calculated but not saved, and the burn rate values are not connected in the graph below. Here is the search:

| inputlookup append=t lkp_add.csv
| addinfo
| eval timestamp=if(isnull(timestamp), round(strptime(date + " 23:59:59","%Y-%m-%d %H:%M:%S"), 0), timestamp), threshold=1
| where desc="add info" AND timestamp>=(now()-(info_max_time-info_min_time))
| stats max(burnrate) as burnrate max(threshold) as threshold by timestamp
| eval _time=strptime(timestamp,"%s")
| timechart max(burnrate) as burnrate max(threshold) as threshold
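For reference, a minimal sketch of what the report side writing the lookup might look like. The base search is a placeholder, and the field names (burnrate, date, desc) are assumptions inferred from the search above, not the poster's actual report:

```spl
index=my_index sourcetype=my_sourcetype
| stats max(burnrate) as burnrate
| eval date=strftime(now(), "%Y-%m-%d"), desc="add info"
| outputlookup append=true lkp_add.csv
```

With append=true the report adds one row per run instead of replacing the file, which is what lets the dashboard search read back the previous day's value.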
Hi Splunk Team, the documentation says the UF default installation port is 8989 (https://docs.splunk.com/Documentation/Forwarder/9.1.0/Forwarder/Installanixuniversalforwarder) (from 9.1.0; before 9.1.0, that default port information was not on that page at all). May I know if it's a typo and it should have been 8089? Please suggest, thanks.
I installed Splunk on Ubuntu along with the Splunk app called Cisco eStreamer Client, and configured Cisco Firepower Management Center and Splunk according to this video: https://www.youtube.com/watch?v=pEXM5PVkvH8&t=104s&ab_channel=CiscoSecureFirewall How can I fix the following issue? I got this error:

root@platform-dns:/opt/splunk/etc/apps/TA-eStreamer/bin/encore# ../splencore.sh test
Traceback (most recent call last):
  File "./estreamer/preflight.py", line 33, in <module>
    import estreamer.crossprocesslogging
  File "/opt/splunk/etc/apps/TA-eStreamer/bin/encore/estreamer/__init__.py", line 27, in <module>
    from estreamer.connection import Connection
  File "/opt/splunk/etc/apps/TA-eStreamer/bin/encore/estreamer/connection.py", line 23, in <module>
    import ssl
  File "/opt/splunk/lib/python3.7/ssl.py", line 98, in <module>
    import _ssl  # if we can't import it, let the error propagate
ImportError: /opt/splunk/lib/python3.7/lib-dynload/_ssl.cpython-37m-x86_64-linux-gnu.so: undefined symbol: SSL_state
Hi Community, I have been tasked with getting AWS CloudTrail logs into Splunk. I have spent some time not just reading about how to accomplish this but also testing it in my own AWS environment. The org I work for uses Control Tower (not on the current version) to provide landing zones. If you know anything about Control Tower, it basically provisions accounts on your behalf and sets up guardrails for ease of scalability. One account that is provisioned is named Log Archive, which is the one I am interested in. My question is: would I access this archiving account and set up a CloudWatch log group and Kinesis Firehose stream there? Or do I need to access the logs in this archive account from another account? Maybe I am not asking this correctly, but it seems like Control Tower makes log aggregation easier while also complicating how to access the logs. Let me know if clarification is needed. Thanks!
To chart over time you use the timechart command. It is the functional equivalent of charting over _time with a bin command applied beforehand; it's just shorter and more straightforward. But both timechart and chart work with only one category field. If you want to analyze time series over more than one category field, you need to combine them into a single artificial field. For example (yes, I know this particular search would be more efficient with tstats instead of stats, but it's just to show the general idea):

index=_internal earliest=-2h
| eval series=sourcetype."-".host
| timechart span=10m count by series
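The bin + chart equivalence mentioned above, spelled out, would look roughly like this (same idea as the timechart search, just the longhand form):

```spl
index=_internal earliest=-2h
| eval series=sourcetype."-".host
| bin _time span=10m
| chart count over _time by series
```

Here bin buckets _time into 10-minute spans, and chart then pivots the counts over those buckets, which is what timechart does in a single command.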
Hi @yackle_official! Thanks for checking in on Answers. Since this is an old post, I recommend starting a new thread with your question, so it can gain more current visibility.   Cheers! -Kara D, Splunk Community Manager  
Ok. That's interesting, because the SplunkUniversalForwarder app does indeed, as @gcusello pointed out, come with your UF installation, but it typically does not contain a local directory. As far as I remember, the configurations you make with CLI splunk commands (like splunk add monitor) land in the etc/system/local directory, so they should not be there either.

While you technically can make changes to the default apps, you shouldn't, because an upgrade will overwrite your changes to the apps that come with the installation package, which might be undesirable. So you should not touch the default apps.

So I'd try to see where those settings came from - either someone configured them manually (which is the "least bad" case here, because on upgrade the "default" directory should get overwritten but "local" should stay untouched) or your DS is serving this app (in which case you might want to check where it is being pushed to).

Anyway, if it's been done manually, you can always use your favourite configuration automation software (Ansible?) and just remove the file from your UFs. Or you can deploy an app with a higher precedence which will override the settings from the problematic config. See https://docs.splunk.com/Documentation/Splunk/9.1.1/Admin/Wheretofindtheconfigurationfiles
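One way to see where those settings actually come from: btool with --debug prints every effective setting together with the file it was loaded from. A sketch, run on the UF (the monitor stanza name is just an example):

```shell
# On the universal forwarder: show each effective inputs.conf setting
# together with the config file it came from
$SPLUNK_HOME/bin/splunk btool inputs list --debug

# Narrow the output to a single stanza, e.g. one monitor input
$SPLUNK_HOME/bin/splunk btool inputs list monitor:///var/log --debug
```

The file path in the left column tells you immediately whether a setting lives in an app's default or local directory, in etc/system/local, or in a deployment-server app.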
Did you ever get an answer to this?
Things don't usually happen by themselves. You must have done something as root so that the resulting files ended up being owned by root.
One additional remark about time manipulation - don't render a time to a string unless you are absolutely sure you won't be doing anything else with it. And even better - don't render the _value_ to a string at all: leave the value as an epoch timestamp and use the fieldformat command so it is only _displayed_ rendered as a string.
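A small sketch of the difference (the field name due is made up for illustration):

```spl
| makeresults
| eval due=relative_time(now(), "+1d")
| fieldformat due=strftime(due, "%Y-%m-%d %H:%M:%S")
```

With fieldformat, due stays a numeric epoch value, so sorting, comparisons, and arithmetic still work; only the displayed value is the human-readable string. An eval with strftime would replace the value itself with a string.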
Perfect, @chris_barrett. Thanks for the response.
I don't know about the provider, but the database is updated only on Splunk upgrades. You can update it manually, but your updates will be overwritten when you upgrade your Splunk installation unless you set a custom path to the database file.
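The custom path is set in limits.conf under the [iplocation] stanza via the db_path setting; the path below is only an example location, not a default:

```ini
# limits.conf
[iplocation]
# Point the iplocation command at your own database file
# so Splunk upgrades don't overwrite it
db_path = /opt/splunk/share/custom/GeoLite2-City.mmdb
```

Keeping the file outside the stock share directory is what protects it from being replaced on upgrade.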