August 2024 Edition

Hayyy Splunk Education Enthusiasts and the Eternally Curious! We're back with this month's edition of indexEducation, the newsletter that takes an untraditional twist on what's new with Splunk Education. We hope the updates about our courses, certification program, and self-paced training will feed your obsession to learn, grow, and advance your careers. Let's get started with an index for maximum readability: Training You Gotta Take | Things You Needa Know | Places You'll Wanna Go

Training You Gotta Take

Last Minute Learning | Grab a seat at the eleventh hour
Are you a serial procrastinator or obsessed with maximizing every opportunity? No matter your mindset, Splunk Last Minute Learning is designed just for you. It allows you to rapidly get more technical training under your belt or catch up on those classes you might have put off. And, since Splunk Training Units expire one year after purchase, taking a class at the last minute may be just what you need to ensure those training units don't go unused. Simply sign up using your Splunk.com account and pay with your company training units or a credit card. We see you – and we like the way you think.
Gotta get training in | Last minute instructor-led courses

The Latest | Exploring and Analyzing Data with Splunk
If you're a fan of Andrew Huberman and his four-hour podcasts about things like optimizing your sleep, then you probably have no problem with your attention span or your thirst for knowledge. If that's the case, this new 9-hour course should probably be next up on your playlist. Exploring and Analyzing Data with Splunk is for users who want to attain operational intelligence level 4 (business insights) and covers exploratory data analysis using statistical tools and custom visualizations. This course provides the opportunity to dip your feet into the deep (and complex) pool of data science before heading off to our more advanced Splunk for Analytics and Data Science course. And, hat tip to Andrew, you'll probably even sleep better knowing it creates the foundation for what's next in data science.
Gotta learn something new | Enroll today

Things You Needa Know

There's a new certification in town | Splunk Certified Cybersecurity Defense Engineer
When you hear the words SOC and SOAR and security, do you visualize some kind of superhero swooping in to save the day? Well, we do – and it's everyday people like you that we see. So, take our latest certification exam to validate your skills as a SOC engineer using Splunk Enterprise Security and Splunk SOAR, and earn your Splunk Certified Cybersecurity Defense Engineer certification. This exam establishes an intermediate-level standard for users of Splunk Enterprise, Enterprise Security, and Splunk SOAR who wish to be certified as cybersecurity professionals. We don't hand out capes after you pass, but we do give you a badge.
Needa show your stuff | Certifications are validation

Splunk Community Slack FTW | Become a valued member today
It's like hanging at the community pool, but with more like-minded tech folks – and no swimsuit required. The Splunk Community is your free way to learn, stay up to date, connect, share ideas, and find success. Here you'll find a collection of resources where you can ask questions, get answers, and connect with other Splunk enthusiasts – via Splunk Answers, User Groups, and Slack. If you're not already a member of the Splunk Community Slack, we'd love to welcome you aboard.

By joining and inviting others to sign up, you not only become part of a great group, but you also have a chance to win a $50 Splunk Store gift card! Unlike the community pool, which traditionally closes by September 1, this offer runs through September 20. Hurry, it's a real win-win.
Needa know more about using Splunk | Join Slack

Places You'll Wanna Go

SMARTNESS Series | Meet Tom
Sometimes it's hard to visualize how far we can take our careers – until we hear how others have done it. So, if you're looking for inspiration and best practices for growing your career with Splunk, maybe Episode 1 of our new Splunk Education SMARTNESS series featuring Tom Kopchak will help you imagine the possibilities. Say hi to Tom and let us know what you think about the interview. And, while you're at it, get to know Splunk Education and Splunk Certification – a few of the avenues Tom believes impacted his career trajectory.
Wanna be inspired | This interview series is the place

The Splunk Community | SplunkTrust fezzes in collab with Lantern
The SplunkTrust is a group of highly skilled and knowledgeable Splunk users who are trusted advisors to Splunk. These widely recognizable fez wearers are selected based on their exceptional technical skills and suggestions, which are often used to influence future Splunk features and applications. Today, these fine friends are also collaborating with Lantern, the Splunk customer success center that provides expert advice and tips for managing Splunk more efficiently. Have you ever been tempted to use Splunk to uncover some really outlandish insights? Well, so have we! With Lantern, you can learn about some unusual ways these incredible minds are using Splunk.
Wanna go to the horse stable | One Trust member Splunks his hobby

Find Your Way | Learning Bits and Breadcrumbs
Go Stream It | The Latest Course Releases (Some with Non-English Captions!)
Go Last Minute | Seats Still Available for ILT
Go Deep | Register for Security and Observability Tech Talks
Go to STEP | Get Upskilled
Go Discuss Stuff | Join the Community
Go Social | LinkedIn for News
Go Index It | Subscribe to our Newsletter

Thanks for sharing a few minutes of your day with us – whether you're looking to grow your mind, career, or spirit, you can bet your sweet SaaS, we got you. If you think of anything else we may have missed, please reach out to us at indexEducation@splunk.com.

Answer to Index This: 888 + 88 + 8 + 8 + 8 = 1000
This month brings a special collection of news! From Magic Quadrant updates to AppDynamics integrations to regional expansion and feature enhancements, Splunk delivers a new level of observability to ITOps and engineering teams to accelerate their troubleshooting workflows and effectively reduce their MTTx.

Observability News:
- Splunk Observability Cloud was named a Leader in the 2024 Gartner® Magic Quadrant™ for Observability Platforms.
- Splunk Cloud customers can connect their logs with their AppDynamics application data.
- Splunk Observability Cloud running on AWS is now available in UK and German realms.

Latest Splunk Observability Enhancements:
- New "Install" Action in Infrastructure Inventory
- Improved Search and Navigation with Global Search
- RUM Custom Indexed Tags
- APM Landing Page Improvements

Learn More About These Updates and Capabilities

New Log Observer Connect for AppDynamics: Combine the power of AppDynamics and Splunk Cloud Platform to pinpoint issues faster in traditional and hybrid applications. With a single click, zoom in on the relevant logs from AppDynamics in Splunk Cloud's search and reporting interface. Learn more

New Splunk Observability Realms Launch in Germany and the UK: On August 20, Splunk launched two new EMEA Observability realms on AWS in London and Frankfurt. Customers in these regions can now access IM, APM, RUM, and Synthetic Monitoring, so they have all the capabilities available in the current EU0 (Dublin) realm. These expansions help remove some regulatory compliance blockers for customers in the region and will help our EMEA teams better support our customers on their observability journeys. Learn more in English or German

New "Install" Action in Infrastructure Inventory: We are adding actions to the existing "view-only" Infrastructure Inventory page for three cloud providers (AWS, GCP, Azure) in Observability Cloud Platform's Data Management, so you can see the services discovered, get relevant recommendations for successful instrumentation, and see which services are instrumented or still need to be instrumented. Learn more

Improved Search and Navigation with Global Search: Splunk has made key UX improvements in Splunk Observability Cloud to optimize your search experience! Now, whenever you use the magnifying glass icon for your searches, results are split into distinct categories such as APM, Infrastructure, or Dashboards, making it easier and faster to access relevant resources. Learn more

RUM Custom Indexed Tags: Unlock the power of your session and span tags in RUM with Custom Indexed Tags! RUM customers can now leverage tags from their spans and sessions for greater troubleshooting and performance monitoring use cases by indexing them. These indexed tags will appear in Tag Spotlight tiles for advanced correlation and troubleshooting analysis, and will allow users to filter their metrics based on tag values. For example, customers can choose to slice and dice RUM metrics, such as error rate and web vitals, with business-specific tags like support tier, loyalty level, or custom location labels. Indexed tags will also appear in all filtering experiences in RUM. Admin users will be able to choose which tags they would like to index using the "MetricSets" configuration. Learn more

APM Landing Page Improvements: With landing page improvements, we're making it easier for engineers to search, explore, and navigate between services, making troubleshooting faster and easier. Learn more
Hello, I am currently working on a project that involves integrating Splunk with Azure Virtual Desktop (AVD). Could you please provide me with any available documentation or resources that detail the process or best practices for this integration? Any guidance or links to relevant materials would be greatly appreciated. Thank you in advance for your assistance. Best regards,
Hi! I am working as an IAM Specialist but I am looking to pivot to Splunk. I would like to set up a Splunk Enterprise environment using VMware where I can practice the basics and move on to more advanced functions, including building a solid base and understanding of networking. After watching many videos on all kinds of setups, I am not sure which setup would be best for me, and I was wondering if anyone can suggest one that works best given my laptop configuration. I would like to practice on VMs for both Windows and Linux.

Laptop config: Lenovo IdeaPad touchscreen - AMD Ryzen 7 7730U - WUXGA - 16GB - 1TB SSD - OS Windows 11

Any information would be highly appreciated.
I want to create a static field based on whether any status value = Issue.

host  m_nname  status
A     cpu      Ok
B     disk     Ok
C     memory   Issue
D     network  Ok
E     storage  Issue

If an Issue is found anywhere in the status column, a Health field should be created with the value Bad on every row, like below:

host  m_nname  status  Health
A     cpu      Ok      Bad
B     disk     Ok      Bad
C     memory   Issue   Bad
D     network  Ok      Bad
E     storage  Issue   Bad
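A minimal SPL sketch of one way to get this, assuming the events are already in Splunk with host, m_nname, and status extracted (the index and sourcetype names are placeholders):

index=my_index sourcetype=server_status
| eventstats count(eval(status="Issue")) as issue_count
| eval Health=if(issue_count > 0, "Bad", "Good")
| fields - issue_count
| table host m_nname status Health

Because eventstats computes the Issue count across all results rather than per row, every host gets the same Health value, which matches the desired output.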
So I am using multiselect inputs to take dynamic input from the user, and this works fine when I have individual searches running to populate the dynamic list for each input. But since the base search is the same for all those inputs, I thought I would use Splunk's base search feature to populate the lists. That works fine on the first submit, but once the panels are loaded and the user wants to change a value in a multiselect input, it no longer lists all the values that were available at first. So I wanted to know if there is something we can do to make this work the same way it does with individual dynamic searches, meaning the underlying values returned at first should remain intact, or at least the list should repopulate when the user selects the "All" option. I tried using token set/unset and such, but no luck. I also tried having different base searches for the multiselect dropdowns and the panel, but that didn't work either.

The following is the XML with the base search, which has the issue of re-selecting multiselect dropdown values after submission:

<form version="1.1" theme="light">
  <label>testing Clone</label>
  <search id="base_dropdown">
    <query>index=main sourcetype=access_combined_wcookie status IN ($status_tok$) file IN ($file_tok$) itemId IN ($itemId_tok$)</query>
    <earliest>$time_tok.earliest$</earliest>
    <latest>$time_tok.latest$</latest>
  </search>
  <search id="base_panel">
    <query>index=main sourcetype=access_combined_wcookie status IN ($status_tok$) file IN ($file_tok$) itemId IN ($itemId_tok$)</query>
    <earliest>$time_tok.earliest$</earliest>
    <latest>$time_tok.latest$</latest>
  </search>
  <fieldset submitButton="true" autoRun="true">
    <input type="time" token="time_tok">
      <label>Time</label>
      <default>
        <earliest>-7d@d</earliest>
        <latest>now</latest>
      </default>
    </input>
    <input type="multiselect" token="status_tok">
      <label>status</label>
      <choice value="*">All</choice>
      <default>*</default>
      <delimiter>,</delimiter>
      <fieldForLabel>status</fieldForLabel>
      <fieldForValue>status</fieldForValue>
      <search base="base_dropdown">
        <query>|stats count by status|sort 0 + status</query>
      </search>
      <valuePrefix>"</valuePrefix>
      <valueSuffix>"</valueSuffix>
    </input>
    <input type="multiselect" token="file_tok">
      <label>file</label>
      <choice value="*">All</choice>
      <default>*</default>
      <delimiter>,</delimiter>
      <fieldForLabel>file</fieldForLabel>
      <fieldForValue>file</fieldForValue>
      <search base="base_dropdown">
        <query>|stats count by file|sort 0 + file</query>
      </search>
      <valuePrefix>"</valuePrefix>
      <valueSuffix>"</valueSuffix>
    </input>
    <input type="multiselect" token="itemId_tok">
      <label>itemId</label>
      <choice value="*">All</choice>
      <default>*</default>
      <delimiter>,</delimiter>
      <fieldForLabel>itemId</fieldForLabel>
      <fieldForValue>itemId</fieldForValue>
      <search base="base_dropdown">
        <query>|stats count by itemId|sort 0 + itemId</query>
      </search>
      <valuePrefix>"</valuePrefix>
      <valueSuffix>"</valueSuffix>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <title>Count</title>
        <search base="base_panel">
          <query>| stats count</query>
          <!-- <earliest>$time_tok.earliest$</earliest>
               <latest>$time_tok.latest$</latest> -->
        </search>
        <option name="drilldown">none</option>
      </table>
    </panel>
  </row>
</form>

The following, without a base search for the multiselect dropdowns, works as expected:

<form version="1.1" theme="light">
  <label>testing</label>
  <!-- <search id="base_dropdown">
    <query>index=main sourcetype=access_combined_wcookie status IN ($status_tok$) file IN ($file_tok$) itemId IN ($itemId_tok$)</query>
    <earliest>$time_tok.earliest$</earliest>
    <latest>$time_tok.latest$</latest>
  </search> -->
  <search id="base_panel">
    <query>index=main sourcetype=access_combined_wcookie status IN ($status_tok$) file IN ($file_tok$) itemId IN ($itemId_tok$)</query>
    <earliest>$time_tok.earliest$</earliest>
    <latest>$time_tok.latest$</latest>
  </search>
  <fieldset submitButton="true" autoRun="true">
    <input type="time" token="time_tok">
      <label>Time</label>
      <default>
        <earliest>-7d@d</earliest>
        <latest>now</latest>
      </default>
    </input>
    <input type="multiselect" token="status_tok">
      <label>status</label>
      <choice value="*">All</choice>
      <default>*</default>
      <delimiter>,</delimiter>
      <fieldForLabel>status</fieldForLabel>
      <fieldForValue>status</fieldForValue>
      <search>
        <query>index=main sourcetype=access_combined_wcookie earliest="$time_tok.earliest$" latest="$time_tok.latest$" |stats count by status|sort 0 + status</query>
      </search>
      <valuePrefix>"</valuePrefix>
      <valueSuffix>"</valueSuffix>
    </input>
    <input type="multiselect" token="file_tok">
      <label>file</label>
      <choice value="*">All</choice>
      <default>*</default>
      <delimiter>,</delimiter>
      <fieldForLabel>file</fieldForLabel>
      <fieldForValue>file</fieldForValue>
      <search>
        <query>index=main sourcetype=access_combined_wcookie earliest=$time_tok.earliest$ latest="$time_tok.latest$"|stats count by file|sort 0 + file</query>
      </search>
      <valuePrefix>"</valuePrefix>
      <valueSuffix>"</valueSuffix>
    </input>
    <input type="multiselect" token="itemId_tok">
      <label>itemId</label>
      <choice value="*">All</choice>
      <default>*</default>
      <delimiter>,</delimiter>
      <fieldForLabel>itemId</fieldForLabel>
      <fieldForValue>itemId</fieldForValue>
      <search>
        <query>index=main sourcetype=access_combined_wcookie earliest=$time_tok.earliest$ latest="$time_tok.latest$"|stats count by itemId|sort 0 + itemId</query>
      </search>
      <valuePrefix>"</valuePrefix>
      <valueSuffix>"</valueSuffix>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <title>Count</title>
        <search base="base_panel">
          <query>| stats count</query>
          <!-- <earliest>$time_tok.earliest$</earliest>
               <latest>$time_tok.latest$</latest> -->
        </search>
        <option name="drilldown">none</option>
      </table>
    </panel>
  </row>
</form>
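One observation that may explain the behavior: base_dropdown itself references $status_tok$, $file_tok$, and $itemId_tok$, so every submit narrows the very search that feeds the dropdowns, and the option lists shrink to whatever was selected. A sketch of a possible workaround is to keep a token-independent base search just for the dropdowns (untested; the query text here is an assumption):

<search id="base_dropdown">
  <!-- no input tokens here, so the option lists stay complete across submits -->
  <query>index=main sourcetype=access_combined_wcookie</query>
  <earliest>$time_tok.earliest$</earliest>
  <latest>$time_tok.latest$</latest>
</search>

The panel's base_panel search keeps the token filters, so the table still honors the selections; only the option lists are decoupled from them.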
I have sample data pushed to Splunk as below. Help me with a Splunk query that returns only unique server names with a final status as the second column. Compare the second-column status both horizontally and vertically for each server: if any of the second-column values is No for that server, then consider No the final status for that server; if all the second-column values are Yes for a server, then consider that server's final status Yes.

sample.csv:
ServerName,Status
Server1,Yes
Server1,No
Server1,Yes
Server2,No
Server2,No
Server3,Yes
Server3,Yes
Server4,Yes
Server5,No
Server6,Yes
Server6,No
Server6,Yes
Server6,No
Server7,Yes
Server7,Yes
Server7,Yes
Server7,Yes
Server8,No
Server8,No
Server8,No
Server8,No

The output should look similar to below:
ServerName,FinalStatus
Server1,No
Server2,No
Server3,Yes
Server4,Yes
Server5,No
Server6,No
Server7,Yes
Server8,No
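A minimal sketch of one way to do this, assuming the file is available as a lookup (if the data is indexed instead, replace the inputlookup with your index/sourcetype search). It relies on "No" sorting alphabetically before "Yes", so min() per server yields No whenever any row is No:

| inputlookup sample.csv
| stats min(Status) as FinalStatus by ServerName

If the alphabetical trick feels too clever, an equivalent explicit form:

| inputlookup sample.csv
| stats values(Status) as statuses by ServerName
| eval FinalStatus=if(isnotnull(mvfind(statuses, "^No$")), "No", "Yes")
| fields ServerName FinalStatus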
All I have learned about prompts so far is that I need to open a browser and respond to the prompt in the SOAR GUI. Is there any REST API or link available for answering a prompt? I want to pass some variables in the mail, so that if somebody clicks a certain link, it accepts or rejects the prompt for event "4" automatically via the API. It would reduce IT's workload!
Hello Splunkers,
I have 7 files in JSON format (the JSON format is the same for each file), so I applied one parsing configuration for all of them.

On the UF:

[source::/opt/splunk/etc/apps/app_name/result/*.json]
INDEXED_EXTRACTIONS=json
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = ([\r\n]+)

On the IDX:

[sourcetype_name]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)
NO_BINARY_CHECK=true
CHARSET=UTF-8
TIME_PREFIX=\"timestamp\"\:\s\"
MAX_TIMESTAMP_LOOKAHEAD=19
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
TRUNCATE=999999

On the Search Head:

[sourcetype_name]
KV_MODE=none

Parsing works for all files except one. Here is an excerpt; the timestamp has a none value. Can you help me with this?
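If the odd file differs only in whitespace around the timestamp key (that layout is an assumption, since the excerpt isn't visible here), a more tolerant TIME_PREFIX may help; a minimal sketch:

[sourcetype_name]
# accept "timestamp":"..." with or without spaces around the colon
TIME_PREFIX = \"timestamp\"\s*:\s*\"
MAX_TIMESTAMP_LOOKAHEAD = 19
TIME_FORMAT = %Y-%m-%dT%H:%M:%S

If the field literally contains "none" in that file, no TIME_PREFIX will fix it: Splunk will fall back to other timestamp sources (previous event time, file modtime), so the producer of that file is the place to look.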
Hi Team,
When I try to exclude events for one field value by inserting the condition sessionId!=X, it's not working. Even though I used the "NOT" condition, the value I am trying to exclude still shows in the results. Could you please help with how I can exclude a particular field value? These are what I tried:

host="*" sessionId!=X
host="*" NOT sessionId!=X
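A note on semantics that may explain what you're seeing: NOT sessionId!=X double-negates back to sessionId=X, so your second attempt actually selects the value you want to exclude. A minimal sketch of the usual variants (X stands for your literal value; quoting it is deliberate):

host="*" NOT sessionId="X"

excludes events where sessionId is X and also keeps events that have no sessionId field at all, while

host="*" sessionId!="X"

performs the same exclusion but only keeps events that actually have a sessionId field. And if the goal is to drop the sessionId column from the results rather than filter events:

host="*" | fields - sessionId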
Hello, I've created a dashboard that shows 4 teams in a dropdown menu. When I choose one of the teams, I want to see only the panels for that specific team. I've created the dropdown input and given it a label called Team, with static options like Team 1, Team 2, Team 3, Team 4. So, my question is: how do I assign each panel chart to one of the teams in the dropdown? From some of the online searching I've done, it suggests using the tokenization concept. Could you please help me achieve this result?
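A minimal Simple XML sketch of the token approach, with all token and value names as placeholders: the change handler sets one token per team, and each panel's depends attribute hides it while its token is unset.

<input type="dropdown" token="team_tok">
  <label>Team</label>
  <choice value="team1">Team 1</choice>
  <choice value="team2">Team 2</choice>
  <change>
    <condition value="team1">
      <set token="show_team1">true</set>
      <unset token="show_team2"></unset>
    </condition>
    <condition value="team2">
      <set token="show_team2">true</set>
      <unset token="show_team1"></unset>
    </condition>
  </change>
</input>
<!-- repeat the choices and conditions for Team 3 and Team 4 -->
<row>
  <panel depends="$show_team1$">
    <!-- Team 1 chart here -->
  </panel>
  <panel depends="$show_team2$">
    <!-- Team 2 chart here -->
  </panel>
</row>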
Hi, I'm trying to get GuardDuty logs using the Splunk Add-on for AWS app. The input method is Generic S3; logs from CloudTrail or WAF come in fine, but the GuardDuty logs are not coming in. The data is definitely in the S3 bucket. I'm attaching the guard duty.log.

Thank you.
Hello everyone! How can we solve the problem of searching for secrets in all (or some) Splunk indexes without heavily loading Splunk? What approach would you take? Obviously the list of indexes needs to be limited. What else?
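A minimal sketch of one low-impact approach: a scheduled search over a restricted index list and a short, non-overlapping time window, with a simple pattern for common secret formats (the index names and the regex are placeholder assumptions you would tune):

index IN (app_logs, web_logs) earliest=-1h@h latest=@h
| regex _raw="(?i)(password|passwd|api[_-]?key|secret|token)\s*[:=]\s*\S{8,}"
| stats count by index, sourcetype, source

Scheduling this hourly (and optionally writing matches to a summary index with collect) spreads the load instead of scanning everything at once.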
We have the below data in JSON format. I need help with a custom JSON response handler so Splunk can break every event out separately. Each event starts with record_id.

{
  "eventData": [
    {
      "record_id": "19643",
      "eventID": "1179923",
      "loginID": "PLI",
      "userDN": "cn=564SD21FS8DF32A1D87FAD1F,cn=Users,dc=us,dc=oracle,dc=com",
      "type": "CredentialValidation",
      "ipAddress": "w.w.w.w",
      "status": "success",
      "accessTime": "2024-08-29T06:23:03.487Z",
      "oooppd": "5648sd1csd-952f-d630a41c87ed-000a3e2d",
      "attributekey": "User-Agent",
      "attributevalue": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36"
    },
    {
      "record_id": "19644",
      "eventID": "1179924",
      "loginID": "OKP",
      "userDN": "cn=54S6DF45S212XCV6S8DF7,cn=Users,dc=us,dc=CVGH,dc=com",
      "type": "Logout",
      "ipAddress": "X.X.X.X",
      "status": "success",
      "accessTime": "2024-08-29T06:24:05.040Z",
      "oooppd": "54678S3D2FS962SDFV3246S8DF",
      "attributekey": "User-Agent",
      "attributevalue": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36"
    }
  ]
}
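If a custom response handler turns out not to be required, one index-time alternative is to split the array with props.conf on the parsing tier. A minimal, untested sketch assuming exactly this payload shape (the sourcetype name is a placeholder):

[myapi:json]
SHOULD_LINEMERGE = false
# break before each {"record_id" object; the comma between records is discarded
LINE_BREAKER = \}(\s*,\s*)\{\s*"record_id"
# strip the {"eventData": [ wrapper from the first record and the ]} trailer from the last
SEDCMD-strip_header = s/^\{\s*"eventData":\s*\[\s*//
SEDCMD-strip_footer = s/\s*\]\s*\}\s*$//
TIME_PREFIX = "accessTime":\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3NZ
MAX_TIMESTAMP_LOOKAHEAD = 30

Each event then lands as a standalone {"record_id": ...} object, so KV_MODE=json on the search head gives you the fields. A true custom response handler would achieve the same thing upstream by iterating eventData and emitting one JSON object per record before the data reaches Splunk.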
Hi there,
I have a file monitoring stanza on a universal forwarder where I filter using transforms.conf to keep only the log entries I need, because the server writes log entries from multiple business processes into the same logfile. Now I need entries of another process, with a different ACL, in a different index from that same logfile, but in our QS cluster, while the first data input still ingests into our PROD cluster.

So I have my inputs.conf:

[monitor://<path_to_logfile>]
disabled = 0
index = <dataspecific index 1>
sourcetype = <dataspecific sourcetype 1>

a props.conf:

[<dataspecific sourcetype 1>]
SHOULD_LINEMERGE = true
BREAK_ONLY_BEFORE_DATE = true
TRUNCATE = 1500
TIME_PREFIX = ^
MAX_TIMESTAMP_LOOKAHEAD = 20
TIME_FORMAT = [%y/%m/%d %H:%M:%S]
TRANSFORMS-set = setnull, setparsing

and a transforms.conf:

[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[setparsing]
REGEX = (<specific regex>)
DEST_KEY = queue
FORMAT = indexQueue

As a standalone stanza I would need the new input like this, with its own setparsing transforms:

[monitor://<path_to_logfile>]
disabled = 0
index = <dataspecific index 2>
sourcetype = <dataspecific sourcetype 2>
_TCP_ROUTING = qs_cluster

To be honest, I could just create a second stanza that's slightly different and still reads the same file, but I don't want two tailreaders on the same file. What possibilities do I have? Thanks in advance
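One possibility that avoids a second tailreader: keep the single monitor stanza and clone the second process's events at parsing time with CLONE_SOURCETYPE, then retarget the clone's index and output group. A minimal sketch, under the assumption that these transforms run where your parsing happens (indexer or heavy forwarder, since a UF won't execute them; placeholders in angle brackets are yours):

transforms.conf:

[clone_process2]
REGEX = (<regex for process 2>)
CLONE_SOURCETYPE = <dataspecific sourcetype 2>

[set_index2]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = <dataspecific index 2>

[route_qs]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = qs_cluster

props.conf:

[<dataspecific sourcetype 1>]
# clone first, so it happens before the nullQueue filtering decides the original's fate
TRANSFORMS-set = clone_process2, setnull, setparsing

[<dataspecific sourcetype 2>]
TRANSFORMS-route = set_index2, route_qs

The cloned events re-enter the pipeline under the new sourcetype, so the original filtering is untouched; note that the _TCP_ROUTING retarget only works if the parsing instance has a qs_cluster target group defined in its outputs.conf.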
I have never been one to understand regex; however, I need to extract everything after the first entry (#172...) into its own field. Let's call it manual_entry. I'm getting tired of searching and randomly trying things.

#1724872356 exit
#1724872357 exit
#1724872463 cat .bashrc
#1724872485 sudo cat /etc/profile.d/join-timestamp-history.sh
#1724872512 exit
#1724877740 firefox

manual_entry
exit
exit
cat .bashrc
sudo cat /etc/profile.d/join-timestamp-history.sh
exit
firefox
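A minimal sketch, assuming each event contains the epoch marker followed by the command (it works whether the command sits on the same line or the next, since . does not match a newline and \s+ absorbs the separator):

| rex field=_raw "^#\d+\s+(?<manual_entry>.+)"
| table _time manual_entry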
Hello members, I'm struggling with something. I have configured data inputs and the index name on the HF, and pointed the app to Search & Reporting. I also forwarded logs from the other system as syslog data to the heavy forwarder. I have also configured the same index on the cluster master and pushed it to all indexers, but when I look for that index on the SH (search head) there are no results.

Can someone help me please?

Thanks
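A couple of quick checks, sketched with a placeholder index name, that usually narrow down where the data stops:

| tstats count where index=<your_index> by splunk_server, sourcetype

shows whether any indexer holds data for that index, and

index=_internal source=*metrics.log group=per_index_thruput series=<your_index>

shows whether events for it are flowing through the indexing pipeline. If both come back empty, the gap is usually upstream: check outputs.conf on the HF and confirm the input's index name matches the clustered index exactly.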
Hello, Splunk DB Connect is indexing only 10k events per hour at a time, no matter what settings I configure in the inputs. The DB Connect version is 3.1.0. The db_inputs.conf is:

[ABC]
connection = ABC_PROD
disabled = 0
host = 1.1.1.1
index = test
index_time_mode = dbColumn
interval = 900
mode = rising
query = SELECT *\
FROM "mytable"\
WHERE "ID" > ?\
ORDER BY "ID" ASC
source = XYZ
sourcetype = XYZ:lis
input_timestamp_column_number = 28
query_timeout = 60
tail_rising_column_number = 1
max_rows = 10000000
fetch_size = 100000

When I run the query using dbxquery in Splunk, I do get more than 10k events. I also tried max_rows = 0, which should basically ingest everything, but that's not working either. How can I ingest unlimited rows?
I'm working on a dashboard in which the user enters a list of hosts. The issue I'm running into is that they must add an asterisk to the host name or it isn't found in the search. This is what the SPL looks like:

index=os_* (`wineventlog_security` OR sourcetype=linux_secure) host IN ( host1*, host2*, host3*, host4*, host5*, host6*, host7*, host8* ) earliest=-7d@d
| dedup host
| eval sourcetype=if(sourcetype = "linux_secure", sourcetype, source)
| fillnull value=""
| table host, index, sourcetype, _raw

If there is no * then there are no results. What I would like is for them to enter the hostname or FQDN, in either upper or lower case, and have the SPL lowercase it, strip any FQDN parts, add the *, and then search. So far I haven't come up with SPL that works. Any thoughts?

TIA, Joe
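A minimal sketch of a post-filter approach, which trades some index-time efficiency for tolerant matching: keep the indexed search broad, normalize the host, and filter on the normalized value (host1 etc. are placeholders for the user's entries):

index=os_* (`wineventlog_security` OR sourcetype=linux_secure) earliest=-7d@d
| eval host_short=lower(mvindex(split(host, "."), 0))
| search host_short IN (host1, host2, host3)
| dedup host
| eval sourcetype=if(sourcetype = "linux_secure", sourcetype, source)
| fillnull value=""
| table host, index, sourcetype, _raw

split(host, ".") breaks an FQDN on dots and mvindex(..., 0) keeps the short name, so entries match with or without the domain suffix; the cost is that the initial search scans all hosts in those indexes for the window.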
Hi, please share the configuration documents for the Panorama side of integrating this app with Splunk SOAR.