All Topics

Hello, I have a problem with a custom app in Splunk. I've written a simple app that uses the Python requests library to query the Microsoft Graph API. It works perfectly for most queries, but when I try to use it to get all users in our AAD environment, it throws this error: ERROR ChunkedExternProcessor [111784 phase_1] - Failed to find newline while reading transport header. This always happens at the same page (I have to use pagination, since the API returns 100 lines per response). I've looked at that page and the one after, but nothing special caught my eye. This is a Splunk-specific issue: I can use the requests library to get all the results and the json library to dump them with no problems, but when I use them in conjunction with splunklib and yield the results as rows, I get the error above. The logs (with debug mode on) don't seem to hold any other clues. Could this be an encoding issue - could the results contain special characters that throw the Python code off somehow? Any help is greatly appreciated!
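For reference, the pagination the post describes can be sketched as below. This is a minimal sketch, not the poster's actual code: fetch_page is a hypothetical stub standing in for requests.get(url, headers=...).json(), and the @odata.nextLink handling follows the Graph API's paging convention. (Worth noting in passing: this particular ChunkedExternProcessor error is often triggered by anything extra being written to stdout, since Splunk's custom-command protocol shares that stream.)

```python
# Minimal sketch of Graph-style pagination. fetch_page is a hypothetical stub
# standing in for requests.get(url, headers=...).json() in the real app.
def fetch_page(url):
    pages = {
        "page/1": {"value": [{"id": i} for i in range(100)], "@odata.nextLink": "page/2"},
        "page/2": {"value": [{"id": i} for i in range(100, 200)], "@odata.nextLink": "page/3"},
        "page/3": {"value": [{"id": i} for i in range(200, 242)]},  # last page: no nextLink
    }
    return pages[url]

def all_users(start_url="page/1"):
    """Follow @odata.nextLink until it disappears, yielding one user per row."""
    url = start_url
    while url:
        data = fetch_page(url)
        for user in data["value"]:
            yield user
        url = data.get("@odata.nextLink")  # None on the last page ends the loop

users = list(all_users())
print(len(users))  # 242 users across three pages
```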
Does anyone else find that running the automagic app version updates is hit and miss? Sometimes it works and sometimes it doesn't. Error: An error occurred while installing the app: 500
Hi there, I have a search where I want to see where one date field is the same as or earlier than another, but my results only show events where both dates are the same. Can you help? I am trying to find events whose end date is before the created date. The dates aren't stored in a standard date format, so a typical entry is 12112022.

index=UAT sourcetype="Test_Txt_data"
| eval end_date_epoch = strptime(end_date,"%d%m%Y")
| eval created_date_epoch = strptime(created_date,"%d%m%Y")
| where end_date_epoch <= created_date_epoch
| eval end_date = strftime(end_date_epoch, "%d/%m/%Y"), created_date = strftime(created_date_epoch, "%d/%m/%Y")
| table proposal, created_date, end_date
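The epoch comparison that search relies on can be checked outside Splunk. Below is a minimal Python sketch using the same "%d%m%Y" format string the SPL strptime() calls use (the function names are mine, for illustration). One thing to keep in mind: if SPL's strptime() can't parse a value, it returns null, and the where clause then silently drops that event.

```python
from datetime import datetime

def parse_ddmmyyyy(s):
    # "12112022" -> 12 November 2022, same format string as the SPL strptime()
    return datetime.strptime(s, "%d%m%Y")

def ends_on_or_before_created(end_date, created_date):
    # Mirrors: where end_date_epoch <= created_date_epoch
    return parse_ddmmyyyy(end_date) <= parse_ddmmyyyy(created_date)

print(ends_on_or_before_created("12112022", "13112022"))  # True: ends a day earlier
print(ends_on_or_before_created("14112022", "13112022"))  # False: ends a day later
```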
Hi, I want to use the 'AND' keyword either in startswith or in endswith: <<search>> | transaction startswith="some text" AND "some other text" endswith="some text" AND "some other text" Is this possible?
Hi All, I am working on analyzing processing time across 10 devices and categorizing all the events into 3 categories for each device: "Max", "Avg" and "PR99" (the 99th-percentile processing time across all events). Raw data:

Category | Processing time (sec) | Device id
Max  | 121 | 1
PR99 | 106 | 1
Avg  | 70  | 1
Max  | 117 | 2
PR99 | 106 | 2
Avg  | 71  | 2
Max  | 78  | 3
PR99 | 77  | 3
Avg  | 62  | 3
...  | ... | ...

I want to display a category in a separate panel only if that category is selected. Does anyone have a suggestion on how to implement this in Splunk? For example, if "Max" and "Avg" are selected through some picker:

Panel 1: list "Max" only

Category | Processing time (sec) | Device id
Max | 121 | 1
Max | 117 | 2
Max | 78  | 3
... | ... | ...

Panel 2: list "Avg" only

Category | Processing time (sec) | Device id
Avg | 70 | 1
Avg | 71 | 2
Avg | 62 | 3
... | ... | ...

Thank you. Jounman
Hey, I'm trying to delete events that got into the system in a specific time range. I see the events when I use the Splunk time range picker, but when I try to use where to find those events without the time picker, I can't find them, and I'm too scared to just run a delete query without specifying exactly what I want to delete. I've also tried a single where clause with earliest and latest, but that didn't work either. What am I doing wrong?
Hello. How do I collect Microsoft Exchange 2019 audit logs into Splunk?
Despite having a local\outputs.conf file properly populated with 6 indexers, one of our non-clustered Search Heads does not show anything under Forward data as defined in the Web GUI. Any suggestions on where and how to check what is overriding this?
What is the difference between the standard and transparent federated search types in Splunk, with examples or use cases?
Hi. We index the accesses made on a filer. For each action on a file, events are generated and indexed in Splunk. Copying a file does not directly generate a "copy" event; instead, the "Event.System.EventName" field consecutively takes the three values "Open Object", "Get Object Attributes", "Read Object". This corresponds to three events in Splunk with no real common fields. How can I build a query that identifies this consecutive sequence of events to alert us of a file copy? Maybe the streamstats command could be used, but I can't figure out how.
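The detection logic the post is after — three specific event names in a row — can be prototyped outside SPL first. A minimal sketch (the names are mine); this is the windowed comparison that a streamstats-style search would perform, applied per file:

```python
# Flag any run of three consecutive events whose names match the copy signature.
SIGNATURE = ["Open Object", "Get Object Attributes", "Read Object"]

def find_copy_sequences(event_names):
    """event_names: EventName values for one file, in time order.
    Returns the starting index of every run matching the signature."""
    return [i for i in range(len(event_names) - 2)
            if event_names[i:i + 3] == SIGNATURE]

names = ["Read Object", "Open Object", "Get Object Attributes", "Read Object"]
print(find_copy_sequences(names))  # [1]
```

In a real search the events would first be grouped per file path and sorted by time; the list above stands in for one such group.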
I'm working on an inputs.conf from a universal forwarder and noticed the first stanza is missing a ], e.g. [WinEventLog://Application instead of [WinEventLog://Application]. Since I do not have direct access to the UF to check the logs: what effect, if any, would this have on log ingestion? Thanks
Hi. I would like to import the services defined in our CMDB as ITSI services. It seems you have to set a specific team when you do the import, and in my trial it created duplicate services if the service was already defined in a different team than the one I specified at load time. I could create an import per team, but that seems quite cumbersome when I can get the team in the data I pull from the CMDB. So I would really like to know if anybody has found a solution where you don't hard-code the team, but instead use values (or even the team id from ITSI) to create the services in one go? Kind regards, las
Hi, has anyone here tried to turn off the Export PDF option in certain dashboards, or in all dashboards?
I want to be able to drill down on a field if the value is an IP address. If it is not an IP address, it will be some string value like "N/A" or similar, and the field should not be clickable, or there should be some other way to handle the drilldown function for fields with that value.
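A drilldown condition like this usually comes down to "is this value a parseable IP address". A minimal sketch of that test using Python's standard ipaddress module (the dashboard wiring itself, e.g. a token condition, is a separate question):

```python
import ipaddress

def is_ip_address(value):
    """True for valid IPv4/IPv6 literals, False for placeholders like "N/A"."""
    try:
        ipaddress.ip_address(value)
        return True
    except ValueError:
        return False

print(is_ip_address("10.20.30.40"))  # True
print(is_ip_address("N/A"))          # False
```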
I have an all_ip field that contains all my IPs. Now I want to split it into public and private IPs: public_ip, private_ip, all_ip. And when private_ip is null, I want to put the value from all_ip into the public_ip field. First I did: | eval private_ip=if(like(all_ip,"XXXX.%") OR like(all_ip,"XXX.%"),all_ip,null()) and now I need to fill everything else into the public_ip field. Is this possible? Thanks!
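The prefix-based like() test works, but classifying by the actual private ranges is more robust. A minimal sketch with Python's ipaddress module (the function name is mine); the null-fallback the post asks about is the second branch — if the address isn't private, it lands in public_ip:

```python
import ipaddress

def split_ip(all_ip):
    """Return (public_ip, private_ip); exactly one side gets the address."""
    addr = ipaddress.ip_address(all_ip)
    # is_private covers RFC 1918 plus loopback/link-local — a superset of
    # matching a couple of hard-coded prefixes
    if addr.is_private:
        return (None, all_ip)
    return (all_ip, None)

print(split_ip("192.168.1.5"))  # (None, '192.168.1.5')
print(split_ip("8.8.8.8"))      # ('8.8.8.8', None)
```

In SPL the same fallback is typically expressed by filling public_ip from all_ip only when private_ip is null (isnull()/coalesce in an eval).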
Hi everyone, I am searching for a way to list every alert (that sends email) along with its schedule (cron), last run, and whether the email was sent. So far I can get most of this info, but I still haven't managed to get the last run and the send-email status:

| rest /servicesNS/-/App_name/saved/searches
| fields title disabled actions alert.severity cron_schedule action.email.to action.email.bcc is_scheduled max_concurrent next_scheduled_time run_n_times
| where disabled=0
| where actions="email"
| table title cron_schedule action.email.to action.email.bcc is_scheduled max_concurrent next_scheduled_time run_n_times

Does anyone have an idea? Thanks in advance!
Please help me show times on the bar chart below. I am using chart count over ... by description so the file name shows when I hover over the chart, but I haven't been able to get the timings onto the x-axis. Below is the query (the graph screenshot is omitted):

index=xxxxx sourcetype=xxxx source="xxxxxx_*.log"
| eval description=case(like(Suspend,"S"),"Suspended", like(Suspend,"P"),"Partially-Completed", like(Suspend,"C"),"Completed")
| eval File_Name=description."-".TC_File_Name
| table _time File_Name TC_File_Name description
| chart count(File_Name) over TC_File_Name by description
I am using Splunk 8.0.8. I have Python versions 2.7 and 3.7 installed in the $SPLUNK_HOME/bin folder, but all my Python scripts are executed with Python 2.7. I tried changing python.version=python3 in server.conf under ./etc/system/local, but the scripts still run using Python 2.7. I also tried python.version=forced_python3 in server.conf, with no luck. Can someone please let me know where I need to change the Python version so that all my scripts start using Python 3.7?
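For what it's worth, the server.conf spec puts this setting under the [general] stanza, and the valid values are python2, python3, and force_python3 — note force_python3, not forced_python3. A sketch of the stanza, assuming a default install path:

```ini
# $SPLUNK_HOME/etc/system/local/server.conf
[general]
# python3 sets the default interpreter; force_python3 additionally overrides
# any per-script python.version set in app configs (e.g. commands.conf)
python.version = force_python3
```

A splunkd restart is needed after changing it.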
I see that in our environment some of our Search Heads are set up as forwarders and some are not; I think this environment, like most, grew from one server to a multi-server environment, all before my time. Now we have Search Heads and dedicated deployment servers (aka forwarders), which leads me to believe we no longer need the Search Heads to forward anything. So is there a way I can see what is using the Search Heads as forwarders?
I have a search: (index=.... sourcetype=.... | stats count(transaction) as "Transaction") However, when I use this search for an ITSI KPI, the result is not what I expect (screenshot omitted). Does anyone know why, and how to fix it? Thank you for your help.
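One thing worth checking: in SPL, count(X) counts only the events in which a field named X actually exists, whereas a bare count counts every event — so if no field called transaction is extracted, the KPI comes back zero. A minimal sketch of that distinction (the sample data is made up):

```python
def count_field(events, field):
    # SPL's count(field): only events where the field is present and non-null
    return sum(1 for e in events if e.get(field) is not None)

events = [{"transaction": "t1"}, {"status": "ok"}, {"transaction": "t2"}]
print(count_field(events, "transaction"), "of", len(events))  # 2 of 3
```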