All Posts



The goal is to get Entra logs into Splunk Cloud and alert on non-domain-affiliated logins. Can't seem to find any documentation on this.
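For the alerting part, a possible starting point, assuming the sign-in events arrive via the Splunk Add-on for Microsoft Cloud Services (the index, sourcetype, and field names below are assumptions to verify against your actual data, and "yourdomain.com" is a placeholder):

```
index=azure sourcetype="azure:monitor:aad" category=SignInLogs
| search NOT properties.userPrincipalName="*@yourdomain.com"
| table _time, properties.userPrincipalName, properties.ipAddress, properties.appDisplayName
```

Saved as an alert, a search along these lines would fire on sign-ins from accounts outside your domain.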
@gcusello Thanks for replying! That pulls results, but it doesn't populate anything in the Statistics tab, which is our main issue (the old query shows 20; the alternate one that we don't really want to use shows 1700). The total results are around 38k as well, so we're not going over the limit, but I definitely think correcting the search issue is a good idea. Any ideas about the statistics piece?
Hello, I have the inputs stanza below monitoring the syslog feed coming into index=base. Now we need to filter out events with specific host names and re-route them to a new index.

[monitor:///usr/local/apps/logs/*/base_log/*/*/*/*.log]
disabled = 0
sourcetype = base:syslog
index = base
host_segment = 9

For example, I have hosts (serverxyz.myserver.com, myhostabc.myserver.com, myhostuvw.myserver.com), and I want to match *xyz* and *abc* and re-route them to a new index. Since the old config has /*/, which feeds everything to the old index, I wanted to add a blacklist to the old stanza to avoid ingesting into both indexes.

OLD stanza:

[monitor:///usr/local/apps/logs/*/base_log/*/*/*/*.log]
disabled = 0
sourcetype = base:syslog
index = base
host_segment = 9
blacklist = (*xyz*|.*\/*abc*\/)

NEW stanzas:

[monitor:///usr/local/apps/logs/*/base_log/*/*/*xyz*/*.log]
disabled = 0
sourcetype = base:syslog
index = mynewindex
host_segment = 9

[monitor:///usr/local/apps/logs/*/base_log/*/*/*abc*/*.log]
disabled = 0
sourcetype = base:syslog
index = mynewindex
host_segment = 9
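One caveat worth noting as a sketch (unverified against your directory layout): blacklist in inputs.conf is a regular expression matched against the full file path, not a shell-style glob, so a pattern like (*xyz*|.*\/*abc*\/) is not valid regex. Something like the following might be closer:

```ini
# OLD stanza: blacklist is a regex on the full path; (xyz|abc) matches any
# path containing "xyz" or "abc" -- anchor it to the host segment if those
# strings can occur elsewhere in the path
[monitor:///usr/local/apps/logs/*/base_log/*/*/*/*.log]
disabled = 0
sourcetype = base:syslog
index = base
host_segment = 9
blacklist = (xyz|abc)

# NEW stanzas: route the matching hosts to the new index
[monitor:///usr/local/apps/logs/*/base_log/*/*/*xyz*/*.log]
disabled = 0
sourcetype = base:syslog
index = mynewindex
host_segment = 9

[monitor:///usr/local/apps/logs/*/base_log/*/*/*abc*/*.log]
disabled = 0
sourcetype = base:syslog
index = mynewindex
host_segment = 9
```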
Hi @Rak, the issue is probably that the subsearch used in the join command returns more than 50,000 results, and this limit gives you incomplete results. In addition, there's an error in your search: it isn't useful to have a main search followed immediately by a search command. In any case, the join command is to be avoided, or used only when there's no other option, because it's very slow and resource-consuming. So please rethink your search following my approach:

(index=testindex OR index=testindex2 source="insertpath" ErrorCodesResponse=PlanInvalid TraceId=*) OR (index=test ("Test SKU"))
| eval type=if(index="test","2","1")
| stats earliest('@t') AS '@t' values('@m') AS '@m' values(RequestPath) AS RequestPath dc(type) AS type_count BY TraceId
| where type_count=2
| eval date=strftime(strptime('@t', "%Y-%m-%dT%H:%M:%S.%6N%Z"), "%Y-%m-%d"), time=strftime(strptime('@t', "%Y-%m-%dT%H:%M:%S.%6N%Z"), "%H:%M")
| table time, date, TraceId, @MT, RequestPath

I'm not sure whether the check on the number of types is relevant or not. Ciao. Giuseppe
Were you able to get this to work? I've been asked if we can use gMSA with DB Connect.
Tried it and it is not working for me
Hello, we have a query for an alert that was working before, but is no longer returning the correct results. We haven't changed anything on our instance, so I'm not sure what the cause would be. The query is below (I blanked out the index names, etc., of course). I tested with a different query which is returning the expected results, but I'd like to figure out what's going on with this one.

index=testindex OR index=testindex2 source="insertpath" ErrorCodesResponse=PlanInvalid
| search TraceId=*
| stats values(TraceId) as TraceId
| mvexpand TraceId
| join type=inner TraceId [search index=test ("Test SKU") | fields TraceId,@t,@mt,RequestPath]
| eval date=strftime(strptime('@t', "%Y-%m-%dT%H:%M:%S.%6N%Z"), "%Y-%m-%d"), time=strftime(strptime('@t', "%Y-%m-%dT%H:%M:%S.%6N%Z"), "%H:%M")
| table time, date, TraceId, @MT,RequestPath
Hi everyone, I am trying to create a multi-KPI alert. I have tens of services with 4-5 KPIs each. Using the multi-KPI alert I want to create a correlation search that can send me an email alert if any of the KPIs is at critical severity for more than 15 minutes. After selecting Status over time in the MultiKPI creation window, we have to set a trigger for each of the KPIs. Is there a way to set the same trigger for all the KPIs? For example: any KPI at Critical severity level >=50% of the last 30 minutes. It seems like I am missing something; surely I don't have to click and set a trigger for each KPI hundreds of times. Thanks!
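As an alternative to configuring a trigger per KPI in the multi-KPI alert UI, a single scheduled search over ITSI's summary index can apply one threshold across every service and KPI at once. This is a sketch, assuming the standard itsi_summary fields (alert_severity, serviceid, kpi); verify them in your environment:

```
index=itsi_summary earliest=-30m alert_severity=*
| stats count(eval(alert_severity="critical")) AS crit_count, count AS total BY serviceid, kpi
| eval pct_critical = round(crit_count / total * 100, 0)
| where pct_critical >= 50
```

Saved as a correlation search with an email alert action, this would fire for any KPI that was critical in at least half of its data points over the last 30 minutes.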
@klowk when you added passAuth = <admin user>, did you not have to specify the password anywhere? How does it authenticate the user?
I did not. As I said above in my post, I'm very new to the subject, and I asked how to check whether the conf was taken into account. Thanks for telling me how; I did check, and Splunk does seem to pick up the default conf as written.
@jlstanley did you find a fix for this? I'm running into the same error as well.
How do I create a search job through a REST API?

The tool I have to use is Azure Data Factory, calling a REST API.

I am doing a POST Search with url="https://edp.splunkcloud.com:8089/services/search/v2/jobs?output_mode=json" and body={\n \"search\": \"search%20index%3D\"oper_event_dynatrace_perf\" source=\"dynatrace_timeseries_metrics_v2://dynatrace_synthetic_browser_totalduration\"%20earliest%3D-96h}"

In the response to the POST, the API returns a scheduler SID that references a search which is not the one I put in the search field of the POST. I checked Activity > Jobs in Splunk, and no job was created for my search or for my user.

How can I build the POST so that a job for my search is created through the Splunk API?

Input:
{
  "method": "POST",
  "headers": { "Content-Type": "application/json; charset=UTF-8" },
  "url": "https://edp.splunkcloud.com:8089/services/search/v2/jobs?output_mode=json",
  "connectVia": { "referenceName": "integrationRuntime1", "type": "IntegrationRuntimeReference" },
  "body": "{\n \"search\": \"search%20index%3D\"oper_event_dynatrace_perf\" source=\"dynatrace_timeseries_metrics_v2://dynatrace_synthetic_browser_totalduration\"%20earliest%3D-96h}",
  "authentication": { "type": "Basic", "username": "saazrITAnalytD01", "password": { "type": "SecureString", "value": "***********" } }
}

Output:
{
  "links": {},
  "origin": "https://edp.splunkcloud.com:8089/services/search/v2/jobs",
  "updated": "2024-11-21T16:04:41Z",
  "generator": { "build": "be317eb3f944", "version": "9.2.2406.109" },
  "entry": [ {
    "name": "search ```Verifique se algum dos modelos ...,
    "id": "https://edp.splunkcloud.com:8089/services/search/v2/jobs/scheduler_dGlhZ28uZ29uY2FsdmVzQGJvcmRlci1pbm5vdmF0aW9uLmNvbQ_YWlvcHNfc3RvcmFnZV9wcm9qZWN0aW9u__RMD546f44b20564d9b63_at_1732179600_6116",
    "updated": "2024-11-21T09:00:30.684Z",
    "links": {
      "alternate": "/services/search/v2/jobs/scheduler_dGlhZ28uZ29uY2FsdmVzQGJvcmRlci1pbm5vdmF0aW9uLmNvbQ_YWlvcHNfc3RvcmFnZV9wcm9qZWN0aW9u__RMD546f44b20564d9b63_at_1732179600_6116",
      "search_telemetry.json": "/services/search/v2/jobs/scheduler_dGlhZ28uZ29uY2FsdmVzQGJvcmRlci1pbm5vdmF0aW9uLmNvbQ_YWlvcHNfc3RvcmFnZV9wcm9qZWN0aW9u__RMD546f44b20564d9b63_at_1732179600_6116/search_telemetry.json",
      "search.log": "/services/search/v2/jobs/scheduler_dGlhZ28uZ29uY2FsdmVzQGJvcmRlci1pbm5vdmF0aW9uLmNvbQ_YWlvcHNfc3RvcmFnZV9wcm9qZWN0aW9u__RMD546f44b20564d9b63_at_1732179600_6116/search.log",
      "events": "/services/search/v2/jobs/scheduler_dGlhZ28uZ29uY2FsdmVzQGJvcmRlci1pbm5vdmF0aW9uLmNvbQ_YWlvcHNfc3RvcmFnZV9wcm9qZWN0aW9u__RMD546f44b20564d9b63_at_1732179600_6116/events",
      "results": "/services/search/v2/jobs/scheduler_dGlhZ28uZ29uY2FsdmVzQGJvcmRlci1pbm5vdmF0aW9uLmNvbQ_YWlvcHNfc3RvcmFnZV9wcm9qZWN0aW9u__RMD546f44b20564d9b63_at_1732179600_6116/results",
      "results_preview": "/services/search/v2/jobs/scheduler_dGlhZ28uZ29uY2FsdmVzQGJvcmRlci1pbm5vdmF0aW9uLmNvbQ_YWlvcHNfc3RvcmFnZV9wcm9qZWN0aW9u__RMD546f44b20564d9b63_at_1732179600_6116/results_preview",
      "timeline": "/services/search/v2/jobs/scheduler_dGlhZ28uZ29uY2FsdmVzQGJvcmRlci1pbm5vdmF0aW9uLmNvbQ_YWlvcHNfc3RvcmFnZV9wcm9qZWN0aW9u__RMD546f44b20564d9b63_at_1732179600_6116/timeline",
      "summary": "/services/search/v2/jobs/scheduler_dGlhZ28uZ29uY2FsdmVzQGJvcmRlci1pbm5vdmF0aW9uLmNvbQ_YWlvcHNfc3RvcmFnZV9wcm9qZWN0aW9u__RMD546f44b20564d9b63_at_1732179600_6116/summary",
      "control": "/services/search/v2/jobs/scheduler_dGlhZ28uZ29uY2FsdmVzQGJvcmRlci1pbm5vdmF0aW9uLmNvbQ_YWlvcHNfc3RvcmFnZV9wcm9qZWN0aW9u__RMD546f44b20564d9b63_at_1732179600_6116/control"
    },
    "published": "2024-11-21T09:00:27Z",
    "author": "tiago.goncalves@border-innovation.com",
    "content": {
      "bundleVersion": "11289842698950824761",
      "canSummarize": false,
      "cursorTime": "1970-01-01T00:00:00Z",
      "defaultSaveTTL": "604800",
      "defaultTTL": "600",
      "delegate": "scheduler",
      "diskUsage": 593920,
      "dispatchState": "DONE",
      "doneProgress": 1,
      "dropCount": 0,
      "earliestTime": "2024-11-21T00:00:00Z",
      "eventAvailableCount": 0,
      "eventCount": 0,
      "eventFieldCount": 0,
      "eventIsStreaming": false,
      "eventIsTruncated": false,
      "eventSearch": "search (index=_internal ...",
      "eventSorting": "none",
      "isBatchModeSearch": true,
      "isDone": true,
      "isEventsPreviewEnabled": false,
      "isFailed": false,
      "isFinalized": false,
      "isPaused": false,
      "isPreviewEnabled": false,
      "isRealTimeSearch": false,
      "isRemoteTimeline": false,
      "isSaved": false,
      "isSavedSearch": true,
      "isTimeCursored": true,
      "isZombie": false,
      "is_prjob": true,
      "keywords": "app::aiops_storage_projection index::_internal result_count::0 \"savedsearch_name::edp aiops sp*\" search_type::scheduled source::*scheduler.log",
      "label": "EDP AIOPS - Falha no treino dos modelos de previsão",
      "latestTime": "2024-11-21T09:00:00Z",
      "normalizedSearch": "litsearch (index=_internal ...,
      "numPreviews": 0,
      "optimizedSearch": "| search (index=_internal app=...,
      "phase0": "litsearch (index=_internal ...,
      "phase1": "addinfo type=count label...,
      "pid": "3368900",
      "priority": 5,
      "provenance": "scheduler",
      "remoteSearch": "litsearch (index=_internal ...,
      "reportSearch": "table _time...,
      "resultCount": 0,
      "resultIsStreaming": false,
      "resultPreviewCount": 0,
      "runDuration": 3.304000000000000003,
      "sampleRatio": "1",
      "sampleSeed": "0",
      "savedSearchLabel": "{\"owner\":\"tiago.goncalves@border-innovation.com\",\"app\":\"aiops_storage_projection\",\"sharing\":\"app\"}",
      "scanCount": 10,
      "search": "search ```Verifique se ...,
      "searchCanBeEventType": false,
      "searchEarliestTime": 1732147200,
      "searchLatestTime": 1732179600,
      "searchTotalBucketsCount": 48,
      "searchTotalEliminatedBucketsCount": 14,
      "sid": "scheduler_dGlhZ28uZ29uY2FsdmVzQGJvcmRlci1pbm5vdmF0aW9uLmNvbQ_YWlvcHNfc3RvcmFnZV9wcm9qZWN0aW9u__RMD546f44b20564d9b63_at_1732179600_6116",
      "statusBuckets": 0,
      "ttl": 147349,
      ...
    }
  } ]
}
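A likely cause, offered as a sketch rather than a definitive fix: the /services/search/v2/jobs endpoint expects a form-encoded body (Content-Type: application/x-www-form-urlencoded) with a `search` parameter, not a JSON body, and the SPL string must begin with the `search` command. Pre-URL-encoding the query inside a JSON payload won't be parsed as a job request. A minimal Python sketch of building the correct body (the index and source names are taken from your post):

```python
from urllib.parse import urlencode

# Build the form-encoded body that /services/search/v2/jobs expects.
# The SPL string must start with the "search" command; urlencode handles
# all escaping, so don't pre-encode spaces or quotes yourself.
spl = ('search index="oper_event_dynatrace_perf" '
       'source="dynatrace_timeseries_metrics_v2://dynatrace_synthetic_browser_totalduration" '
       'earliest=-96h')
body = urlencode({"search": spl, "output_mode": "json"})

# In Azure Data Factory, send this string as the request body with the
# header Content-Type: application/x-www-form-urlencoded (not application/json).
print(body)
```

The POST response should then contain a plain `sid` (not a scheduler_ SID), which you can poll via /services/search/v2/jobs/<sid> and fetch results from /services/search/v2/jobs/<sid>/results.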
Fixed with:

.highcharts-series.highcharts-series-0.highcharts-column-series.highcharts-tracker rect { fill: #28a197; stroke: #28a197; }
.highcharts-series.highcharts-series-1.highcharts-column-series.highcharts-tracker rect { fill: #f47738; stroke: #f47738; }
.highcharts-series.highcharts-series-2.highcharts-column-series.highcharts-tracker rect { fill: #6f72af; stroke: #6f72af; }
Hi everyone, hope you are all doing well. I am trying to deploy the CIM in a Search Head Cluster environment, and I have some questions: 1- I found under /default two files (inputs.conf & indexes.conf) that seem to me to be related to an indexer cluster, not a search head cluster; is that correct? 2- What does "the cim_modactions index definition is used with the common action model alerts and auditing" mean? I don't understand the actual meaning. Splunk Common Information Model (CIM)
I do feel a bit stupid now: my cron was wrong. The method was perfectly sane. I did struggle to find any actual documentation saying that this was a valid way of doing it, so I hope this question helps future searchers determine that. Thanks for helping my grey matter along.
Hi @Crotyo , could you share your search? Ciao. Giuseppe
OK. Did you verify what Splunk actually sees?

| rest /data/indexes/myindex

Some of this info you can also see in Settings -> Indexes.
I am getting ready to attempt the Rapid7 Nexpose add-on. Did it end up working for you? I am wondering if there is a better approach, since the app only has two stars on Splunkbase and is not a Splunk-supported app.
I tried that and the search returned empty. I tried checking the inputlookup command, and it did list all the numbers.
I did try that, and the search results returned empty.