All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Please share your complete _raw event in a code block </>
It is trying to connect, but it fails with "name or service unknown".
Hi, I have an Excel file on a Linux server at a particular path. I created a monitor input for this file, but I'm not receiving any logs. Can anyone help me ingest that Excel file daily by creating an inputs.conf?
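A minimal monitor-input sketch, assuming a hypothetical file path. One caveat: .xlsx is a binary (zipped XML) format, so monitoring it directly produces garbled events; export or convert the spreadsheet to CSV first and monitor that instead.

```
# inputs.conf -- path, index, and sourcetype below are assumptions;
# adjust them for your environment
[monitor:///opt/reports/daily_report.csv]
index = main
sourcetype = csv
disabled = false
```

After deploying, check `splunk list inputstatus` or splunkd.log on the forwarder to confirm the file is being tailed.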
I modified the query, which somehow fetches the latest details from the index, but my event has multiple fields. Can you advise how I can achieve the same for the other fields? It's only taking groupByAction. The event has the fields groupByAction, groupByUser, lastOneMonth, lastOneWeek, lastOneDay, for example: "groupByUser": "[{\"requestedBy\": \"rdbmntp\", \"TotalRequests\": 38717}, {\"requestedBy\": \"pstapm\", \"TotalRequests\": 15126}, {\"requestedBy\": \"pirddb\", \"TotalRequests\": 13925}, {\"requestedBy\": \"fiddbtsp\", \"TotalRequests\": 8808}, {\"requestedBy\": \"bkpbs\", \"TotalRequests\": 6513}, {\"requestedBy\": \"arraymgr\", \"TotalRequests\": 5004}, {\"requestedBy\": \"zstapm\", \"TotalRequests\": 4758}, {\"requestedBy\": \"pdspadm\", \"TotalRequests\": 4313}, {\"requestedBy\": \"ptpsadm\", \"TotalRequests\": 3473}, {\"requestedBy\": \"glfinp\", \"TotalRequests\": 3450}]"
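A hedged SPL sketch for expanding such a field, assuming the event is valid JSON and each groupBy* field is a string holding an escaped JSON array (as in the sample above): the first spath pulls out the string, the second splits the array into multivalue entries, and mvexpand yields one row per element. The same pattern should work for groupByAction and the other fields; the index name is a placeholder.

```
index=your_index sourcetype=your_sourcetype
| spath path=groupByUser output=user_json
| spath input=user_json path={} output=user_entries
| mvexpand user_entries
| spath input=user_entries
| table requestedBy TotalRequests
```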
Hi @sekhar463, I suppose that "Node" from the second search is the hostname of the first and that you want to use the Node from the second as a key to filter the first search. If this is true, you can use the second search as a subsearch of the first, renaming the field, something like this: index=_internal sourcetype=splunkd source="/opt/splunk/var/log/splunk/metrics.log" group=tcpin_connections os=Windows [ search index=ivz_em_solarwinds source="solwarwinds_query://Test_unmanaged_Nodes_Data" | table Node Account Status From Until | dedup Node | rename Node AS hostname | fields hostname ] | dedup hostname | eval age=(now()-_time) | eval LastActiveTime=strftime(_time,"%y/%m/%d %H:%M:%S") | eval Status=if(age< 3600,"Running","DOWN") | rename age AS Age | eval Age=tostring(Age,"duration") | lookup 0010_Solarwinds_Nodes_Export Caption as hostname OUTPUT Application_Primary_Support_Group AS CMDB2_Application_Primary_Support_Group, Application_Primary AS CMDB2_Application_Primary, Support_Group AS CMDB2_Support_Group NodeID AS SW2_NodeID Enriched_SW AS Enriched_SW2 Environment AS CMDB2_Environment | eval Assign_To_Support_Group=if(Assign_To_Support_Group_Tag="CMDB_Support_Group", CMDB2_Support_Group, CMDB2_Application_Primary_Support_Group) | table _time, hostname, sourceIp, Status, LastActiveTime, Age, SW2_NodeID, Assign_To_Support_Group, CMDB2_Support_Group, CMDB2_Environment | where Status="DOWN" AND NOT isnull(SW2_NodeID) AND CMDB2_Environment="Production" | sort 0 hostname This solution has only one limitation: by default a subsearch can return at most 10,000 results. Ciao. Giuseppe
Thanks for your suggestions!!
Hi @aditsss, the easiest approach I can suggest is to use JS and CSS following the instructions in the Splunk Dashboard Examples app (https://splunkbase.splunk.com/app/1603). Otherwise, you could find a site on the internet with special characters (e.g. https://fsymbols.com/) and copy some symbols to use as ordinary characters. The visualization in the Splunk code isn't great (it's shifted a little), but the result is really close to your requirement. Then you can use them in your search: <row> <panel> <table> <search> <query> index="abc*" sourcetype=600000304_gg_abs_ipc2 source="/amex/app/gfp-settlement-raw/logs/gfp-settlement-raw.log" "ReadFileImpl - ebnc event balanced successfully" | eval keyword=if(searchmatch("ReadFileImpl - ebnc event balanced successfully")," ","") | eval phrase="ReadFileImpl - ebnc event balanced successfully" | table phrase keyword </query> <earliest>-1d@d</earliest> <latest>@d</latest> <sampleRatio>1</sampleRatio> </search> <option name="count">20</option> <option name="dataOverlayMode">none</option> <option name="drilldown">none</option> <option name="percentagesRow">false</option> <option name="rowNumbers">false</option> <option name="totalsRow">true</option> <option name="wrap">true</option> <format type="color" field="keyword"> <colorPalette type="list">[#118832,#1182F3,#CBA700,#D94E17,#D41F1F] </colorPalette> <scale type="threshold">0,30,70,100</scale> </format> </table> </panel> </row> Ciao. Giuseppe
Hi Team, I have 2 Splunk searches, and I want to exclude events from the first search whose hostname matches the Node field in the second search. How can I modify/join these 2 searches to exclude those hostnames? The common field is hostname in the first search, and it appears as Node in the second search.  index=_internal sourcetype=splunkd source="/opt/splunk/var/log/splunk/metrics.log" group=tcpin_connections os=Windows | dedup hostname | eval age=(now()-_time) | eval LastActiveTime=strftime(_time,"%y/%m/%d %H:%M:%S") | eval Status=if(age< 3600,"Running","DOWN") | rename age AS Age | eval Age=tostring(Age,"duration") | lookup 0010_Solarwinds_Nodes_Export Caption as hostname OUTPUT Application_Primary_Support_Group AS CMDB2_Application_Primary_Support_Group, Application_Primary AS CMDB2_Application_Primary, Support_Group AS CMDB2_Support_Group NodeID AS SW2_NodeID Enriched_SW AS Enriched_SW2 Environment AS CMDB2_Environment | eval Assign_To_Support_Group=if(Assign_To_Support_Group_Tag="CMDB_Support_Group", CMDB2_Support_Group, CMDB2_Application_Primary_Support_Group) | table _time, hostname,sourceIp, Status, LastActiveTime, Age, SW2_NodeID,Assign_To_Support_Group, CMDB2_Support_Group,CMDB2_Environment | where Status="DOWN" AND NOT isnull(SW2_NodeID) AND CMDB2_Environment="Production" | sort 0 hostname   index=ivz_em_solarwinds source="solwarwinds_query://Test_unmanaged_Nodes_Data" | table Node Account Status From Until | dedup Node
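Since the goal is to exclude matching hosts, one hedged sketch is to put the second search inside a NOT subsearch, renaming Node to hostname so the generated filter applies to the first search's field (subsearches return at most 10,000 results by default, so this assumes the node list fits under that limit):

```
index=_internal sourcetype=splunkd source="/opt/splunk/var/log/splunk/metrics.log" group=tcpin_connections os=Windows
    NOT [ search index=ivz_em_solarwinds source="solwarwinds_query://Test_unmanaged_Nodes_Data"
          | dedup Node
          | rename Node AS hostname
          | fields hostname ]
| dedup hostname
| eval age=(now()-_time)
| eval Status=if(age<3600,"Running","DOWN")
```

The subsearch expands to `NOT (hostname="a" OR hostname="b" OR ...)`, so the remaining pipeline (lookups, evals, table) can stay exactly as in your first search.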
Hi there, we have set up Splunk in an air-gapped environment. Windows forwards logs to the HF via a UF agent on port 9997. The HF then forwards the logs to the indexer's rsyslog via a data diode. The logs we receive on the indexer contain special characters. Does anyone know how to troubleshoot this? Thank you in advance @splunk
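One common cause worth checking: garbled "special characters" often mean the HF is sending cooked Splunk-to-Splunk binary data to rsyslog instead of plain text. A hedged outputs.conf sketch on the HF using a syslog output group (the server address and port below are assumptions for your diode endpoint):

```
# outputs.conf on the heavy forwarder -- host/port are hypothetical
[syslog]
defaultGroup = diode_syslog

[syslog:diode_syslog]
server = 10.0.0.5:514
type = udp
```

With a syslog output group, the HF emits plain syslog lines that rsyslog on the far side of the diode can parse normally.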
Right now I am getting an error when I try to ping splunkdeploy.customerscallnow.com: "name or service not known". I seem to have followed a pretty clear set of instructions, but I am not yet able to connect.
Hi @karthikm, I suppose that you're speaking of an on-premise installation. Which Add-On are you using for the data ingestion? If I remember correctly, it's possible to define the index for each data source via the GUI. Anyway, you could look at the inputs.conf in the Add-On in use and check whether the inputs (as they should be!) are in two different stanzas. If not, you can override the index value by finding a regex that identifies the Firewall Logs and following the configurations described in my previous answer https://community.splunk.com/t5/Splunk-Search/How-to-change-index-based-on-MetaData-Source/m-p/619936 or other answers in the Community. Ciao. Giuseppe
Hi @aditsss, good for you, see you next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated by all the contributors
Hi Team, I am using the below query: <row> <panel> <table> <search> <query>index="abc*" sourcetype=600000304_gg_abs_ipc2 source="/amex/app/gfp-settlement-raw/logs/gfp-settlement-raw.log" "ReadFileImpl - ebnc event balanced successfully" | eval keyword=if(searchmatch("ReadFileImpl - ebnc event balanced successfully"),"True","")| eval phrase="ReadFileImpl - ebnc event balanced successfully"|table phrase keyword</query> <earliest>-1d@d</earliest> <latest>@d</latest> <sampleRatio>1</sampleRatio> </search> <option name="count">20</option> <option name="dataOverlayMode">none</option> <option name="drilldown">none</option> <option name="percentagesRow">false</option> <option name="rowNumbers">false</option> <option name="totalsRow">true</option> <option name="wrap">true</option> <format type="color" field="keyword"> <colorPalette type="list">[#118832,#1182F3,#CBA700,#D94E17,#D41F1F]</colorPalette> <scale type="threshold">0,30,70,100</scale> </format> </table> </panel> </row> Along with the phrase and True columns, I would like a checkmark to also appear in another column. Can someone guide me? The current output is:
phrase | keyword
ReadFileImpl - ebnc event balanced successfully | True
ReadFileImpl - ebnc event balanced successfully | True
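One possible sketch: add a second eval that emits a Unicode checkmark character whenever the phrase matches, giving a third column alongside phrase and keyword (the field name `checkmark` is an arbitrary choice):

```
index="abc*" sourcetype=600000304_gg_abs_ipc2 source="/amex/app/gfp-settlement-raw/logs/gfp-settlement-raw.log" "ReadFileImpl - ebnc event balanced successfully"
| eval keyword=if(searchmatch("ReadFileImpl - ebnc event balanced successfully"),"True","")
| eval checkmark=if(keyword="True","✓","")
| eval phrase="ReadFileImpl - ebnc event balanced successfully"
| table phrase keyword checkmark
```

The `✓` is a plain Unicode character (U+2713), so it renders in a table cell without any extra JS or CSS.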
The eval _raw is just to set up sample data in line with your example and is not intended for use in your dashboards.
Hi, I would like to get a list of all users, with their roles and last login, via a Splunk query. I tried the following query with a time range of "All time", but it shows an incorrect date for some users:  index=_audit action="login attempt" | stats max(timestamp) by user Thank you, kind regards Marta
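A likely cause of the incorrect dates: `timestamp` in _audit is a string field, so `max(timestamp)` compares values lexicographically rather than chronologically. A hedged sketch using `_time` instead, joined with the REST users endpoint for roles (the `info=succeeded` filter is an assumption that only successful logins should count):

```
| rest /services/authentication/users splunk_server=local
| fields title roles
| rename title AS user
| join type=left user
    [ search index=_audit action="login attempt" info=succeeded
      | stats max(_time) AS last_login BY user ]
| eval last_login=strftime(last_login, "%Y-%m-%d %H:%M:%S")
| table user roles last_login
```

The left join keeps users who have never logged in (their last_login stays empty), which a plain _audit search would miss.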
I have a HEC and I am receiving logs from CloudWatch, and the default index is set to "aws". From the same HEC token I am also receiving Firewall logs from CloudWatch, and these logs are also going to the index "aws". How can I route the Firewall logs coming from the same HEC token but a different source to the index "palo_alto"? I tried using the below config, but it doesn't work: props.conf [source::syslogng:dev/syslogng/*] TRANSFORMS-hecpaloalto = hecpaloalto disabled = false transforms.conf [hecpaloalto] DEST_KEY = _MetaData:Index REGEX = (.*) FORMAT = palo_alto I created the index palo_alto in the cluster master indexes.conf, applied cluster bundles to the indexers, and also applied the above config using the deployment server to the Indexers. For some reason the logs are still going to the aws index.
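One thing to check, offered as a hypothesis: index-time transforms run only where the data is first parsed. If the HEC token lives on a heavy forwarder, the data arrives at the indexers already cooked, so props/transforms deployed to the indexers never fire; they need to be on the instance hosting the HEC input. A sketch of the same routing, placed there and keyed on the source metadata instead of a catch-all `(.*)`:

```
# Deploy on the instance hosting the HEC input (e.g. a heavy
# forwarder) -- parsed data is never re-parsed by the indexers.

# props.conf
[source::syslogng:dev/syslogng/*]
TRANSFORMS-hecpaloalto = hecpaloalto

# transforms.conf -- match on the source metadata; the target index
# must already exist on the indexers (it does here: palo_alto)
[hecpaloalto]
SOURCE_KEY = MetaData:Source
REGEX = syslogng
DEST_KEY = _MetaData:Index
FORMAT = palo_alto
```

Also note that data sent to the HEC /services/collector event endpoint skips parts of the parsing pipeline; if the transform still doesn't fire, sending via the /services/collector/raw endpoint is worth testing.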
Hi @GaetanVP, sincerely, it's the first time I've seen this command! Anyway, here you can find more info: https://community.splunk.com/t5/Security/Forgot-Pass4symmKey/m-p/378993 Ciao. Giuseppe
Hello Splunkers, I am used to using the following command to decrypt $7 Splunk configuration passwords such as pass4SymmKey or sslConfig:   splunk show-decrypted --value '<encrypted_value>'    I have several questions regarding this command:  1/ Have you ever found any official documentation about it? I looked here but found nothing: https://docs.splunk.com/Documentation/Splunk/9.1.0/Admin/CLIadmincommands 2/ Is it possible to use this command for $6 encrypted (hashed?) strings, like the admin password stored in $SPLUNK_HOME/etc/passwd? I suppose it's not possible, since it's a password and should not be "reversible" for security reasons. 3/ This question is related to the previous one. Is it right to say that the $7 value has been encrypted, since it's possible to revert it, and the $6 value has been hashed, because it's impossible to get the clear value back? Thanks for your help! GaetanVP
Thanks. I just refreshed, but it only has the predefined values from the search query, not from the event data: eval _raw="\"groupByAction\": \"[{\\\"totalCount\\\": 40591, \\\"action\\\": \\\"update_statistics table\\\"}, {\\\"totalCount\\\": 33724, \\\"action\\\": \\\"reorg index\\\"}, {\\\"totalCount\\\": 22015, \\\"action\\\": \\\"job report\\\"}, {\\\"totalCount\\\": 10236, \\\"action\\\": \\\"reorg table\\\"}, {\\\"totalCount\\\": 7389, \\\"action\\\": \\\"truncate table\\\"}, {\\\"totalCount\\\": 3291, \\\"action\\\": \\\"defrag table\\\"}, {\\\"totalCount\\\": 2291, \\\"action\\\": \\\"sp_recompile table\\\"}, {\\\"totalCount\\\": 2172, \\\"action\\\": \\\"add range partitions\\\"}, {\\\"totalCount\\\": 2088, \\\"action\\\": \\\"update_statistics index\\\"}, {\\\"totalCount\\\": 2069, \\\"action\\\": \\\"drop range partitions\\\"}]\""   The above data is only available in the dashboard, not the latest event data.
Hi, just to confirm/enquire more on this: do you mean that we would be creating a service/script to run on that particular server? Or is there already a default Splunk config file with the settings for us to edit?