All Posts



The split function can break up the field for you.

index="okta" actor.alternateId="*mydomain*" outcome.reason=*CHALLENGE* client.geographicalContext.country!="" actor.displayName!="Okta System" AND NOT "okta_svc_acct"
``` The trim function removes the braces from the ends of the field ```
| eval behaviors=split(trim('debugContext.debugData.behaviors', "{}"), ",")
| mvexpand behaviors
| bin _time span=45d
| stats count by outcome.reason, behaviors
| sort -count
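If it helps to see what the trim-then-split step does to the field, here is the same logic mirrored in plain Python (the sample value is taken from the question; the function name and the whitespace stripping are just for readability, not part of the SPL):

```python
def split_behaviors(raw):
    """Mimic SPL's trim(X, "{}") followed by split(X, ","):
    strip braces from both ends, then split on commas.
    Leading/trailing spaces are stripped here for readability;
    SPL's split would keep them."""
    return [part.strip() for part in raw.strip("{}").split(",")]

sample = "{New Geo-Location=NEGATIVE, New Device=POSITIVE, New IP=NEGATIVE}"
print(split_behaviors(sample))
# → ['New Geo-Location=NEGATIVE', 'New Device=POSITIVE', 'New IP=NEGATIVE']
```

Each element of that list is what mvexpand then turns into its own event row.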
Hi @gcusello and @richgalloway,   One final question to get clarity about data models. Let's assume I have an index with a data retention time of 1 month and a data model acceleration summary of 3 months. How will the data model behave in this case? Will the data model retain accelerated data going back 3 months, or will it drop the data once the index drops it?   Regards, Pravin
Hi @Choi_Hyun, as I said, to my knowledge that app shouldn't be used for inputs, also because it cannot be managed by the Deployment Server. Ciao. Giuseppe
Hi @Splunk235, good for you, see you next time! Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated by all the contributors
@Yann.Buccellato, are you using the on-prem or SaaS version of the product and docs?
* On-prem doc: https://docs.appdynamics.com/appd/onprem/latest/en/extend-appdynamics/integration-modules/integrate-appdynamics-with-servicenow-cmdb-and-event-management
* SaaS doc: https://docs.appdynamics.com/appd/23.x/latest/en/extend-appdynamics/integration-modules/integrate-appdynamics-with-servicenow-cmdb-and-event-management
We're looking into the issue now, but it'll be helpful to understand what deployment you're using and what docs you're looking at. Thanks!
Hi Giuseppe, Thank you for your response. I also have no idea why an input.conf file was created, or how. I will test whether my deployment server can push an empty input.conf file to that folder; otherwise I might just have to use PowerShell to delete and replace that file on our hosts. To be clear, this behavior is unusual, right?
I'm trying to break out the comma-separated values in my results, but I'm drawing a blank. I want to break out the specific reasons - {New Geo-Location=NEGATIVE, New Device=POSITIVE, New IP=NEGATIVE, New State=NEGATIVE, New Country=NEGATIVE, Velocity=NEGATIVE, New City=NEGATIVE}

index="okta" actor.alternateId="*mydomain*" outcome.reason=*CHALLENGE* client.geographicalContext.country!="" actor.displayName!="Okta System" AND NOT "okta_svc_acct"
| bin _time span=45d
| stats count by outcome.reason, debugContext.debugData.behaviors
| sort -count

outcome.reason | debugContext.debugData.behaviors
Sign-on policy evaluation resulted in CHALLENGE | {New Geo-Location=NEGATIVE, New Device=POSITIVE, New IP=NEGATIVE, New State=NEGATIVE, New Country=NEGATIVE, Velocity=NEGATIVE, New City=NEGATIVE}
Sign-on policy evaluation resulted in CHALLENGE | {New Geo-Location=NEGATIVE, New Device=NEGATIVE, New IP=NEGATIVE, New State=NEGATIVE, New Country=NEGATIVE, Velocity=NEGATIVE, New City=NEGATIVE}
Sign-on policy evaluation resulted in CHALLENGE | {New Geo-Location=NEGATIVE, New Device=POSITIVE, New IP=POSITIVE, New State=NEGATIVE, New Country=NEGATIVE, Velocity=NEGATIVE, New City=NEGATIVE}
Sign-on policy evaluation resulted in CHALLENGE | {New Geo-Location=POSITIVE, New Device=POSITIVE, New IP=POSITIVE, New State=NEGATIVE, New Country=NEGATIVE, Velocity=NEGATIVE, New City=POSITIVE}
Sign-on policy evaluation resulted in CHALLENGE | {New Geo-Location=NEGATIVE, New Device=NEGATIVE, New IP=POSITIVE, New State=NEGATIVE, New Country=NEGATIVE, Velocity=NEGATIVE, New City=NEGATIVE}
This worked - thanks! lol, I was looking for the differences between the previous versions of sendemail. I was hoping that 9.1.2 would have been deployed sooner rather than later, but this will work until then.
Probably because you didn't say you wanted "*" and you are probably missing some backslashes - try this:

<input type="checkbox" token="checkbox" id="checkABC">
  <label></label>
  <choice value="*">All</choice>
  <choice value="AA">AA</choice>
  <choice value="BB">BB</choice>
  <choice value="CC">CC</choice>
  <change>
    <condition match="match($checkbox$,&quot;\\*&quot;)">
      <unset token="A"></unset>
      <unset token="B"></unset>
      <unset token="C"></unset>
      <set token="form.checkbox">*</set>
    </condition>
    <condition>
      <eval token="A">if(match($checkbox$,"AA"),"A",null())</eval>
      <eval token="B">if(match($checkbox$,"BB"),"B",null())</eval>
      <eval token="C">if(match($checkbox$,"CC"),"C",null())</eval>
    </condition>
  </change>
  <default>AA,BB,CC</default>
  <initialValue>AA,BB,CC</initialValue>
  <delimiter>,</delimiter>
</input>
Sorry, I meant to say that the sizes of the indexes (index1, index2, index3, and so on) all together sum up to 250 GB. But in the sizing case with data models it was 250 GB for one of them, 11 GB for another, some megabytes for the next one, and so on. Actually, the data model has only the requested fields accelerated, but the summary range is 1 year. This obviously makes sense for the growing size of data models.   Thanks, Pravin
But then it won't be by time as well, no?
Hi @sarit_s, the chart command will not work with multiple BY fields; try using stats.
Hi @sarit_s, in the chart command you can use only one field for the OVER option and one for the BY option; you cannot use two fields. The only way (if acceptable) is to concatenate the two fields into one:

| eval Column=UserAgent."|".LoginType
| chart values(SuccessRatioBE) AS SuccessRatioBE over _time BY Column

Ciao. Giuseppe
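Outside of SPL, the concatenate-then-group trick is the usual "composite key" pattern. This small Python sketch (the sample rows are hypothetical, just standing in for Splunk events) shows what the eval builds and why a single string key lets one grouping field carry two dimensions:

```python
from collections import defaultdict

# Hypothetical sample rows standing in for Splunk events.
rows = [
    {"UserAgent": "Chrome",  "LoginType": "SSO",      "SuccessRatioBE": 0.99},
    {"UserAgent": "Chrome",  "LoginType": "Password", "SuccessRatioBE": 0.95},
    {"UserAgent": "Firefox", "LoginType": "SSO",      "SuccessRatioBE": 0.97},
]

# Mirror of: | eval Column=UserAgent."|".LoginType
grouped = defaultdict(list)
for r in rows:
    key = r["UserAgent"] + "|" + r["LoginType"]
    grouped[key].append(r["SuccessRatioBE"])

print(dict(grouped))
# → {'Chrome|SSO': [0.99], 'Chrome|Password': [0.95], 'Firefox|SSO': [0.97]}
```

Pick a delimiter (here "|") that cannot appear inside either field, so the composite key can be split back apart later if needed.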
Hi to all, I'm a newbie in Splunk and I need to check if Splunk Cloud is receiving traffic from our network infrastructure. I thought of doing it via an API request, but I can't find the URL to send the request to. Could anybody point me to documentation on how to do this? Or tell me how I can do it? Thanks in advance! David.
Hello, I'm trying to run a chart command grouped by 2 fields, but I'm getting an error. This is my query:

| chart values(SuccessRatioBE) as SuccessRatioBE over _time by UserAgent LoginType

and I'm getting this error: "Error in 'chart' command: The argument 'LoginType' is invalid." I also tried separating the fields with a comma, and with ticks as well.
In the example, the lexicographic order will process the transformations in the same order, as TRANSFORMS-1 comes before TRANSFORMS-2 (and so on).  
Hi @cmlombardo, the order of transformations is relevant! For example, if you read https://docs.splunk.com/Documentation/Splunk/9.1.1/Forwarding/Routeandfilterdatad#Filter_event_data_and_send_to_queues , when you want to keep some events and discard the rest, you have to execute first the transformation that matches all the events (REGEX = .) and then the transformation on the subset of data; if you invert the order, the filtering doesn't work. Ciao. Giuseppe
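As a sketch of that keep-some/discard-the-rest pattern from the linked doc page (the stanza and transform names here are illustrative, not taken from this thread):

```ini
# transforms.conf
[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[setparsing]
REGEX = login
DEST_KEY = queue
FORMAT = indexQueue

# props.conf -- order matters: setnull runs first and routes
# everything to the nullQueue, then setparsing rescues the
# events matching "login" back to the indexQueue.
[mysourcetype]
TRANSFORMS-set = setnull, setparsing
```

If the two names in TRANSFORMS-set were reversed, the catch-all REGEX = . would run last and discard every event, including the ones you wanted to keep.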
It's not necessarily true that the data will be the same after each example. In some (many? most?) cases, the order of transformations is significant, which is why I recommend using the second format.
So, if they are processed in lexicographic order, then the result should be the same once the data passes through my 2 transformation examples. Best practice, as I understand it, is to list the transformations in the second form, TRANSFORMS = tr1,tr2,tr3, so that there is no doubt about the order in which they are processed.
Hi @_pravin, the disk space used for accelerated Data Models is usually estimated with this formula: disk_space = daily_used_license * 3.4. This formula is described in the Splunk Architecting training course. So it's very strange that you have 250 GB of index and 250 GB of Data Model. This is possible only if you also included the _raw field in your Data Model, and that isn't a best practice, because a Data Model should contain only the fields required by your searches, not the full _raw of every event. Ciao. Giuseppe
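As a quick sanity check of that rule of thumb (the 3.4 multiplier is the one quoted above from the training course; the input figures are only illustrative):

```python
DM_ACCEL_FACTOR = 3.4  # multiplier quoted from the Splunk Architecting course

def estimated_dm_disk_gb(daily_license_gb):
    """Rough disk estimate for accelerated Data Model summaries,
    given average daily licensed ingest in GB."""
    return daily_license_gb * DM_ACCEL_FACTOR

# e.g. for 10 GB/day of licensed ingest:
print(estimated_dm_disk_gb(10))  # → 34.0
```

By this estimate, ending up with Data Model summaries as large as the indexes themselves (250 GB vs 250 GB) would imply far more accelerated content per event than the fields-only best practice suggests.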