All Posts

All commands with SEDCMD
That page gives a very good list of Splunk commands... which step are you stuck on, exactly?
Sir, frankly, all I'm trying to do is here: Configure the Splunk Add-on for Windows - Splunk Documentation. Unfortunately, there isn't any information about forwarders there.
Yes, an HF requires a license. For a heavy forwarder (HF), you should set up one of the following options:
1) Make the HF a slave of a license master. This gives the HF all of the enterprise capabilities, and the HF will consume no license as long as it does not index data.
2) Install the forwarder license. This gives the HF many enterprise capabilities, but not all. The HF will be able to parse and forward data; however, it will not be permitted to index, and it will not be able to act as a deployment server (as an example). This is the option I would usually choose. (Note that the Universal Forwarder has the forwarder license pre-installed.)
Answer from: https://community.splunk.com/t5/Getting-Data-In/Do-we-need-a-license-for-Heavy-forwarder/m-p/210451
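For option 1, pointing the HF at the license master is typically a small server.conf change (a sketch; the hostname is a placeholder, and newer Splunk versions use `manager_uri` under a `[license]` stanza instead of the older `master_uri` name):

```
# $SPLUNK_HOME/etc/system/local/server.conf on the HF
[license]
# Placeholder host - point at your license master/manager's management port
master_uri = https://license-master.example.com:8089
```

A restart of the HF is needed for the change to take effect.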
Hello comrade inventsekar, thank you for your help. Do I need another kind of license to use an HF?
Hi @BoldKnowsNothin ... the UF still supports only a very limited set of props.conf settings. https://docs.splunk.com/Documentation/Splunk/9.1.1/admin/Propsconf On that page, just do a Ctrl-F and search for "universal" -- you will get around 8 matches; only those settings are supported.  >>> Also how do I upgrade to HF? Generally you don't want to upgrade a UF to an HF. You install a new/fresh HF separately on a system: download the Splunk Enterprise package, install it, and then enable it as a heavy forwarder. Let us know if you have doubts. Thanks.
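As an illustration of a props.conf setting that is documented as supported on a UF, the event-breaker family used when forwarding to indexers (a sketch; the sourcetype name and regex are placeholders for your data):

```
# $SPLUNK_HOME/etc/system/local/props.conf on the UF
[my_sourcetype]
# Supported on the UF: break the forwarded stream on event boundaries
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2}
```

Transformations such as SEDCMD or TRANSFORMS, by contrast, still require a heavy forwarder or an indexer.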
Hello comrades, after my research, I found that only the heavy forwarder supports props.conf, but those posts were 5 or 6 years old. I wonder whether the UF now supports props.conf? Also, how do I upgrade to an HF? Many thanks,
Hi @Harley ... I am 80% sure that these apps/add-ons will work smoothly on isolated networks, but you would need to test this scenario.
I'm trying to extract all the CVEs and their associated CVSS scores from Shodan's API (JSON response). The response is typically in a format where the number after data depends on the number of services detected. Example data:

data: [
  0 "22/tcp/OpenSSH": { … },
  1 "80/tcp/Apache httpd": {
    vulns: {
      "CVE-2013-6501": {
        cvss: 4.6,
        references: [ … ],
        summary: "The default soap.wsdl_cache_dir setting in (1) php.ini-production and (2) php.ini-development in PHP through 5.6.7 specifies the /tmp directory, which makes it easier for local users to conduct WSDL injection attacks by creating a file under /tmp with a predictable filename that is used by the get_sdl function in ext/soap/php_sdl.c.",

Current search:

| curl method=get uri=https://api.shodan.io/shodan/host/"IP"?key=APIKEY
| spath input=curl_message path="data{0}.vulns" output=test_array
| mvexpand test_array
| spath input=test_array
| table CVE*.cvss

When using curl from WebTools, spath doesn't appear to be extracting all the fields (e.g. only 4 of the 15 CVEs are displayed in the table), likely because of the 5000-character limit for spath. Is there another method that would be able to keep data like the CVE, cvss, and summary linked while splitting the data? Delimiting by comma seems like it wouldn't work, since the summaries also include commas.
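One thing worth checking before restructuring the search: the 5000-character spath limit mentioned above is configurable in limits.conf (a sketch; the value here is an arbitrary example sized for large JSON responses):

```
# $SPLUNK_HOME/etc/system/local/limits.conf on the search head
[spath]
# Default is 5000 characters; raise it so large JSON responses
# are fully extracted instead of silently truncated
extraction_cutoff = 100000
```

A restart (or a configuration refresh) is needed for the new limit to apply.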
On an isolated network with no internet access (considered a transport network), is there a way to still use add-ons/apps?
Thank you so much. It works for me!!!
Thanks Rick! What Rich Galloway stated was that "Splunk recently changed geo-ip providers and no longer ships with a MaxMind database." If that is the case, I was asking: what company is the new geo-IP provider that has taken over from MaxMind? Also, in what version of Splunk Enterprise did the switchover occur, what directory is the new geo-IP DB in, and what is the new mmdb file name? Best regards, Dennis
Hi @Amit79 ... as you say it's working fine in Dev and not working in prod, something is missing/wrong in the prod environment. Please confirm whether the lookup file in prod has yesterday's data or not, and compare dev and prod for the lookup file, search command, etc. Thanks.
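A quick way to confirm whether the prod lookup actually contains yesterday's rows (a sketch; the lookup name and timestamp field come from the post above, and the timestamp is assumed to be epoch seconds):

```
| inputlookup lkp_add.csv
| eval day=strftime(timestamp, "%Y-%m-%d")
| stats count by day
| sort - day
```

Running the same search on both Dev and prod makes the comparison straightforward: if prod is missing the latest day, the outputlookup write is the part to investigate.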
Tomorrow is CX Day and we are so excited to be able to shine the spotlight on all things customer experience. At Splunk, we truly believe serving our customers is the bedrock of our business, and that building resilience helps transform the customer experience to drive better business outcomes. Learn more about CX Day and dig into some awesome content we’ve created just for the occasion here. Then join us over on LinkedIn Live tomorrow at 11 AM PT / 2 PM ET as Splunk's Chief Customer Officer, Toni Pavlovich, chats with CX influencer, Blake Morgan. It'll be a conversation full of learning, fun, and celebration.   What does Customer Experience mean to you? Have an example of a particularly amazing customer experience? Drop your thoughts below!
Hello All, I am calculating burn rate in Splunk, and using addinfo for enrichment to display it on the dashboard. The burn rate is getting calculated, but the previous day's burn rate is not getting stored where Splunk could refer to it. I am running the report and pushing the values to a lookup using the outputlookup command, and from there the script below reads it. In the Dev environment it works fine, but when I move it to production it breaks: the values are getting calculated but not getting saved, and the burn-rate values are not getting connected as per the graph below. Here is the script:

| inputlookup append=t lkp_add.csv
| addinfo
| eval timestamp=if(isnull(timestamp), round(strptime(date + " 23:59:59", "%Y-%m-%d %H:%M:%S"), 0), timestamp), threshold=1
| where desc="add info" and timestamp>=(now()-(info_max_time-info_min_time))
| stats max(burnrate) as burnrate max(threshold) as threshold by timestamp
| eval _time=strptime(timestamp, "%s")
| timechart max(burnrate) as burnrate max(threshold) as threshold
Hi Splunk Team, the documentation says the UF default installation port is 8989 (https://docs.splunk.com/Documentation/Forwarder/9.1.0/Forwarder/Installanixuniversalforwarder) (from 9.1.0; before 9.1.0, that default port information was not on that page at all). May I know if it's a typo and it should have been 8089? Please suggest, thanks.
I installed Splunk on Ubuntu and installed the Splunk app called Cisco eStreamer client. How can I fix this issue? I configured Cisco Firepower Management Center and Splunk according to this video: https://www.youtube.com/watch?v=pEXM5PVkvH8&t=104s&ab_channel=CiscoSecureFirewall I got an error:

root@platform-dns:/opt/splunk/etc/apps/TA-eStreamer/bin/encore# ../splencore.sh test
Traceback (most recent call last):
  File "./estreamer/preflight.py", line 33, in <module>
    import estreamer.crossprocesslogging
  File "/opt/splunk/etc/apps/TA-eStreamer/bin/encore/estreamer/__init__.py", line 27, in <module>
    from estreamer.connection import Connection
  File "/opt/splunk/etc/apps/TA-eStreamer/bin/encore/estreamer/connection.py", line 23, in <module>
    import ssl
  File "/opt/splunk/lib/python3.7/ssl.py", line 98, in <module>
    import _ssl  # if we can't import it, let the error propagate
ImportError: /opt/splunk/lib/python3.7/lib-dynload/_ssl.cpython-37m-x86_64-linux-gnu.so: undefined symbol: SSL_state
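The SSL_state symbol was removed in OpenSSL 1.1, so an error like this usually means Splunk's bundled Python _ssl module is resolving against the system OpenSSL instead of the libraries shipped under /opt/splunk/lib. A possible diagnostic (a sketch; paths assume a default /opt/splunk install, and your mileage may vary):

```
# Check which libssl the bundled _ssl module actually resolves to
ldd /opt/splunk/lib/python3.7/lib-dynload/_ssl.cpython-37m-x86_64-linux-gnu.so | grep ssl

# Running the script through Splunk's wrapper sets up the library
# path to prefer /opt/splunk/lib, which often avoids the mismatch
/opt/splunk/bin/splunk cmd python3 ./estreamer/preflight.py
```

If ldd shows a system libssl (e.g. under /usr/lib) rather than one under /opt/splunk/lib, setting LD_LIBRARY_PATH=/opt/splunk/lib before running the script is a commonly suggested workaround.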
Hi Community, I have been tasked with getting AWS CloudTrail logs into Splunk. I have spent some time not just reading how to accomplish this but also testing it in my own AWS environment. The org that I work for uses Control Tower (not on the current version) to provide landing zones. If you know anything about Control Tower, it basically provisions accounts on your behalf and sets up guardrails for ease of scalability. One account that is provisioned is named Log Archive, which I am interested in. My question is: would I access this archiving account and set up a CloudWatch group and a Kinesis Firehose stream? Or do I need to access the logs in this archive logging account from another account? Maybe I am not asking this question correctly, but it seems like Control Tower makes log aggregation easier while also complicating how to access the logs. Let me know if clarification is needed. Thanks!
To chart over time you use the timechart command. It is the functional equivalent of charting over _time with a bin command applied beforehand; it's just shorter and more straightforward. But both timechart and chart work over only one category field. If you want to analyze time series over more than one variable, you need to combine them into a single artificial field. For example (yes, I know this particular search would be more efficient with tstats instead of stats, but it's just to show the general idea):

index=_internal earliest=-2h
| eval series=sourcetype."-".host
| timechart span=10m count by series
Hi @yackle_official! Thanks for checking in on Answers. Since this is an old post, I recommend starting a new thread with your question, so it can gain more current visibility.   Cheers! -Kara D, Splunk Community Manager