All Posts

Good morning,

I am trying to monitor all files within the directory /var/log/syslog/<IP>.

Directory structure: /var/log/syslog/<IP>/2024/01 | 02 | 03 | 04 | 05 | 06 | 07/secure | cron | messages

Hopefully that makes sense: there are multiple subdirectories, and the end goal is to monitor the secure, cron, and messages files. I wrote these stanzas in inputs.conf, and the Universal Forwarder did pick up the configuration:

[monitor:///var/log/syslog/192.168.1.1/.../secure]
disabled = false
host_segment = 4
index = insght

[monitor:///var/log/syslog/192.168.1.1/.../cron]
disabled = false
host_segment = 4
index = insght

[monitor:///var/log/syslog/192.168.1.1/.../messages]
disabled = false
host_segment = 4
index = insght

I have also tried this to capture all subdirectories and files:

[monitor:///var/log/syslog/192.168.1.1]
disabled = false
host_segment = 4
recursive = true
index = insght

Also, within _internal I get this message:

INFO TaillingProcess [#### MainTailingThread] - Parsing configuration stanza: monitor:///var/log/syslog/<IP>

which seems to hang there, with no other messages logged for the particular stanza(s). The IP address used is notional. Thanks for the help!
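One approach worth trying (a sketch only, not tested against your environment): replace the three `...` stanzas with a single recursive monitor plus a whitelist. In inputs.conf, whitelist is a regex matched against the full path of each candidate file, and recursive defaults to true for monitor stanzas:

```ini
# Sketch: one recursive monitor stanza with a whitelist,
# instead of three separate '...' stanzas.
[monitor:///var/log/syslog/192.168.1.1]
disabled = false
host_segment = 4
index = insght
# keep only files whose path ends in secure, cron, or messages
whitelist = (secure|cron|messages)$
```

If the forwarder still appears to hang after parsing the stanza, checking splunkd.log for TailReader/WatchedFile messages about that path may show whether the files are being skipped (e.g. as already-indexed or binary).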
Does anyone have a template of capabilities you think are necessary for roles specific to CISOs, ISSMs/ISSOs, and Analysts? I know we can probably just use the built-in User and Power User roles as a baseline, but I was wondering if anyone had other input or had identified specific items they think those roles need.
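For reference, custom roles are defined in authorize.conf. A hypothetical sketch of an analyst role (the role name, capability choices, and index names here are placeholders, not a recommendation):

```ini
# Hypothetical analyst role inheriting from the built-in user role.
# Capability names below exist in core Splunk; the selection and the
# index names are illustrative only.
[role_soc_analyst]
importRoles = user
schedule_search = enabled
edit_search_schedule_window = enabled
srchIndexesAllowed = main;security
srchIndexesDefault = security
```

The same role can also be built in the UI under Settings > Roles, which is usually the safer place to audit the full capability list per role.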
Hmm, the documentation says map can use a subsearch:

"3. Use the map command with a subsearch — For complex ad hoc searches, use a subsearch for your map search."

https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Map#Basic_examples
I see another syntax error.  The map command expects its search string to be in quotation marks rather than as a subsearch.  The $earliest$ form doesn't work in subsearches (except in a dashboard).

index=event sourcetype=eventdat
| where like(details,"...")
| eval earliest=strftime(floor(_time), "%m/%d/%Y:%H:%M:%S"), latest=strftime(ceil(_time+2), "%m/%d/%Y:%H:%M:%S")
| table _time details earliest latest
| map maxsearches=10 search="index=sys_stats sourcetype=statdat device=\"...\" earliest=$earliest$ latest=$latest$ | stats count as counter | eval details=$details$, earliest=\"$earliest$\", latest=\"$latest$\" | table _time details counter earliest latest"
I've been asked to identify unused knowledge objects, and I'm honestly not sure of the best way to go about this request. So far I have checked the next scheduled time, but I'm not sure that's all I need to do before contacting the object owners. Any ideas or documentation to help me accomplish this task would be much appreciated. Thank you!
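As a starting point (a sketch; adjust apps/owners to your environment), the REST endpoint for saved searches gives you the inventory with scheduling status:

```
| rest /servicesNS/-/-/saved/searches splunk_server=local
| table title eai:acl.app eai:acl.owner disabled is_scheduled next_scheduled_time
```

and the _audit index can show which saved searches have actually been run recently, which helps flag candidates for retirement:

```
index=_audit action=search info=granted savedsearch_name=*
| stats latest(_time) as last_run by savedsearch_name
| eval last_run=strftime(last_run, "%F %T")
```

Note that _audit retention limits how far back "unused" can be measured, and ad hoc use of lookups, macros, and field extractions is harder to trace than saved searches.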
In Splunk ES and the platform, this error keeps appearing and I haven't been able to resolve it:

Could not load lookup=LOOKUP-useragentstrings
Hi @richgalloway

Thanks for the tip, I've updated my query:

index=event sourcetype=eventdat
| where like(details,"...")
| eval earliest=strftime(floor(_time), "%m/%d/%Y:%H:%M:%S"), latest=strftime(ceil(_time+2), "%m/%d/%Y:%H:%M:%S")
| table _time details earliest latest
| map [ search index=sys_stats sourcetype=statdat device="..." earliest=$earliest$ latest=$latest$
    | stats count as counter
    | eval details=$details$, earliest="$earliest$", latest="$latest$"
    | table _time details counter earliest latest] maxsearches=10

It's still throwing the error: Invalid value "$earliest$" for time term 'earliest'
Yes, this is possible. I can add the filter, but it will be one-to-one, i.e. one input per user to monitor. If you have hundreds of users to monitor, that will become cumbersome to manage. Did you have an idea of how multiple users would be configured in the UI?
Hi @yuanliu ,

Thanks a lot for your help on this matter. I apologize for the confusion earlier; it turns out that the building_from_index_search field did indeed contain multiple values per row, which was causing the matching issues. Adding the | mvexpand building_from_index_search line helped by creating separate rows, which resolved the problem and allowed me to obtain the expected output.

Now I'd like to address a similar requirement. While the current approach effectively identifies the unique values in building_from_index_search that are not present in the roomlookup_buildings.csv buildings column, I also need to find the unique values in roomlookup_buildings.csv buildings that do not appear in the building_from_index_search field. Could you help me with a query to achieve this? Thank you in advance for your assistance!
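For the reverse direction, one common pattern (sketched here; `<your base search>` is a placeholder for the index search you already have) is to start from the lookup and exclude any value that a subsearch finds in the index:

```
| inputlookup roomlookup_buildings.csv
| fields buildings
| dedup buildings
| search NOT
    [ search <your base search>
      | fields building_from_index_search
      | mvexpand building_from_index_search
      | dedup building_from_index_search
      | rename building_from_index_search as buildings
      | fields buildings ]
```

The subsearch returns the distinct buildings seen in the index, and the outer `search NOT [...]` keeps only lookup rows whose value never appeared there. Keep in mind subsearches have result and runtime limits, so dedup inside the subsearch matters.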
Hi @tomjb94 ,

Is there some word (e.g. "preprod") or string that you can add to your main search (not replacing a search by field, but adding to it)? This approach will give you more speed in your searches:

index=idem attrs.GW2_ENV_CLASS=preprod http_status=5* http_status!=503 NOT "mon-tx-" preprod

Then, can you reduce the time window? If you have too many events, you could accelerate your searches by scheduling a search that saves its results in a summary index, and then use that index for your searches.

Ciao.

Giuseppe
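To illustrate the summary-index suggestion (a sketch; the index name my_5xx_summary and the split-by fields are placeholders, and the summary index must already exist), a scheduled search could pre-aggregate the errors:

```
index=idem attrs.GW2_ENV_CLASS=preprod http_status=5* http_status!=503 NOT "mon-tx-"
| stats count by http_status, uri_path
| collect index=my_5xx_summary
```

Reports then run against the small summary instead of the raw events, e.g. `index=my_5xx_summary | stats sum(count) as errors by http_status`.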
Hi,

I am currently looking to optimise the search below, as it is using a lot of search head resource:

index=idem attrs.GW2_ENV_CLASS=preprod http_status=5* http_status!=503 NOT "mon-tx-"

Sample JSON result set:

@timestamp: 2024-07-31T12:41:20+00:00
attrs.AWS_AMI_ID:
attrs.AWS_AZ: eu-west-1c
attrs.AWS_INSTANCE_ID: i-0591d93b5e5881da9
attrs.AWS_REGION: eu-west-1
attrs.GW2_APP_VERSION:
attrs.GW2_ENV_CLASS: preprod
attrs.GW2_ENV_NUMBER: 0
attrs.GW2_SERVICE: idem
body_bytes: 1620
bytes_sent: 2060
client_cert_expire_in_days: 272
client_cert_expiry_date: Apr 30 10:11:07 2025 GMT
client_cert_issuer_dn: CN=******* PROD SUB CA2,O=Fidelity National Information Services,L=Jacksonville,ST=Florida,C=US
client_cert_verification: SUCCESS
client_dn: CN=idem-semantic-monitoring-preprod,OU=Gateway2Cloudops,O=Fidelity National Information Services,L=London,C=GB
container_id: 17b7167ec5f2d20ec10704550fc8f2c2b9daedc835ce5fe0828ac86651983517
container_name: /idem-kong-1
correlationId:
hostname: 17b7167ec5f2
http_content_type: application/vnd.*******.idempotency-v1.0+json
http_referer:
http_status: 200
http_user_agent: curl/8.5.0
log: {"@timestamp": "2024-07-31T12:41:20+00:00", "correlationId": "", "request_method": "POST", "hostname": "17b7167ec5f2", "http_status": 200, "bytes_sent": 2060, "body_bytes": 1620, "request_length": 1689, "request": "POST /idempotency/entries/update HTTP/2.0", "http_user_agent": "curl/8.5.0", "http_referer": "", "body_bytes": 1620, "remote_addr": "10.140.49.156", "remote_user": "", "response_time_s": 0.007, "client_dn": "CN=idem-semantic-monitoring-preprod,OU=Gateway2Cloudops,O=Fidelity National Information Services,L=London,C=GB", "client_cert_issuer_dn": "CN=******* RSA PROD SUB CA2,O=Fidelity National Information Services,L=Jacksonville,ST=Florida,C=US", "client_cert_expiry_date": "Apr 30 10:11:07 2025 GMT", "client_cert_expire_in_days": "272", "client_cert_verification": "SUCCESS", "wpg_correlation_id": "mon-tx-ecs-1722429678-idem-pp-2.preprod.euw1.gw2.*******.io", "http_content_type": "application/vnd.******.idempotency-v1.0+json", "uri_path": "/idempotency/entries/update"}
parser: json
remote_addr: 10.140.49.156
remote_user:
request: POST /idempotency/entries/update HTTP/2.0
request_length: 1689
request_method: POST
response_time_s: 0.007
source: stdout
uri_path: /idempotency/entries/update
wpg_correlation_id: mon-tx-ecs-1722429678-idem-pp-2.preprod.euw1.gw2.*******.io

I have tried adding additional filtering on particular fields, but it is not having the desired effect. Please note, the wildcards in the JSON are where I have masked values for the purposes of this community case.

Thanks,
Hi @nabeel652 , good for you, see next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated by all the contributors
It's not that $earliest$ is not being passed; it's that the value being passed is invalid.  The value for the earliest option must be a time modifier ("-1d", for example) or a timestamp in the format %m/%d/%Y:%H:%M:%S.  It cannot be an epoch timestamp, but you can use strftime to convert an epoch into the expected format:

| eval earliest = strftime(earliest, "%m/%d/%Y:%H:%M:%S")
If you are already receiving syslog on your rsyslog, it's better to send it to splunk using HEC input on Splunk's side and omhttp action on rsyslog's side.
Hello, I have been trying to send logs to a Splunk TCP input using rsyslog, but I cannot make it work. I know this is not related to your question, but is there any way you can share how you did it?
Hi Splunkers,

I'm looking to cross-compare some events with other system data, using an initial search for the event and then using map to load data from another index:

index=event sourcetype=eventdat
| where like(details,"...")
| eval earliest=floor(_time), latest=ceil(_time+2)
| table _time details earliest latest
| map [ search index=sys_stats sourcetype=statdat device="..." earliest=$earliest$ latest=$latest$
    | stats count as counter
    | eval details=$details$, earliest=$earliest$, latest=$latest$
    | table _time details counter earliest latest] maxsearches=10

When running it I get the error: Invalid value "$earliest$" for time term 'earliest'

I've tried $$ and "$...$" with no luck. I can't figure out why $earliest$ isn't being passed.

Any help would be appreciated (:

Notes: I've reviewed these posts but they don't seem relevant:
https://community.splunk.com/t5/Splunk-Search/Invalid-value-X-for-time-term-earliest-but-only-for-specific/m-p/624962#M217251
https://community.splunk.com/t5/Splunk-Search/Invalid-value-quot-week-quot-for-time-term-earliest/m-p/469491#M132104
Hi All,

I have a percentage value in a field in a Dashboard Studio dashboard. I need to apply the colours below:

0% - 5%: Green
5% - 10%: Yellow
Others: Red
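In Dashboard Studio's source editor, dynamic coloring is typically expressed with a rangeValue option over the value. A sketch along these lines (the option name majorColor applies to single value visualizations; the context name and hex codes are illustrative, and the exact option depends on your visualization type):

```json
{
  "options": {
    "majorColor": "> majorValue | rangeValue(majorColorConfig)"
  },
  "context": {
    "majorColorConfig": [
      { "to": 5, "value": "#118832" },
      { "from": 5, "to": 10, "value": "#CBA700" },
      { "from": 10, "value": "#D41F1F" }
    ]
  }
}
```

Ranges are evaluated in order, so the first block covers 0-5 (green), the second 5-10 (yellow), and the open-ended last block everything above 10 (red).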
Hi,

Can someone tell me how to use a CSV file as a lookup and extract the details from the file into a field that we can use for further calculations?

Example: a CSV file (dummy.csv) with the details below is saved in Splunk, and we need to extract the details present after the date into a new field, then use that new field for further calculations.

Data in the dummy.csv file:

"Monday,01/07/2024",T2S Live Timing,"[OTHER] BILL invoice for CSDs Billing period 10-30 June ",,,,,,
"Tuesday,02/07/2024",, ,,,,,,
"Wednesday,03/07/2024",,"[OTHER] BILL invoice for NCBs Billing period 10-30 June",,,,,,
"Thursday,04/07/2024",, ,[OTHER] DKK Service window between 19.35 - 23.59 ,,,,,
"Friday,05/07/2024",T2S Synchronised Release day,,,,,,,
"Saturday,06/07/2024",,[4CB] T2-T2S Site Recovery (internal technical test) ,[4CB] T2-T2S Site Recovery (internal technical test) ,,,,,
"Sunday,07/07/2024",,[4CB] T2-T2S Site Recovery (internal technical test) ,[4CB] T2-T2S Site Recovery (internal technical test) ,,,,,
"Monday,08/07/2024",T2S Live Timing, ,,,,,,

How can we use the lookup and eval commands to find the data present in the file after the date?

Example:
Date = 01/07/2024  Output = T2S Live Timing
Date = 02/07/2024  Output = Blank Space
Date = 03/07/2024  Output = Blank Space
Date = 04/07/2024  Output = Blank Space
Date = 05/07/2024  Output = T2S Synchronised Release day
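One possible sketch, assuming the file is uploaded as a lookup whose first two columns are given header names in the lookup definition (day_date and event here are hypothetical names; use whatever your definition actually assigns): since the first column holds both the weekday and the date, split it on the comma and take the second part:

```
| inputlookup dummy.csv
| eval Date=mvindex(split(day_date, ","), 1)
| eval Output=coalesce(event, "")
| table Date Output
```

split() returns a multivalue field, so mvindex(..., 1) picks the date half; the resulting Date and Output fields can then feed further eval or lookup calculations.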
A search can be longer than the URI allows for opening in a new tab, which causes the 414 Request-URI Too Long error.  There are multiple workarounds:

1. Refactor the search and/or move long portions of the query into an inputlookup command or search macro.
2. Edit the URL to remove the query and use only the SID (as long as the search ID hasn't expired).

For the second option, you can make a "bookmarklet" that removes all of the URL parameters except the SID:

javascript&colon; window.location.href = window.location.href.replace(/\?.*?(\bsid=[^&]+).*/, '?$1')

Note: Khoros is breaking the bookmarklet; replace &colon; with :

If you click that bookmarklet when you get the error, it will open the search.
Hi @tuts ,

In my opinion, only the knowledge of a security analyst can help you in this search. You could install apps for the technologies you have, but I think that only a deep knowledge of attack methods and techniques can support you.

Ciao.

Giuseppe