In addition to mistaken path notation ({} for array) as @PickleRick , you also do not need an extra spath if all you want is a multivalued field named commit_id.  Splunk should have taken care of extraction.   index=XXXXX source="http:github-dev-token" eventtype="GitHub::Push" sourcetype="json_ae_git-webhook" | rename commits{}.id as commit_id   This is a full emulation   | makeresults format=json data="[{ \"ref\":\"refs/heads/Dev\", \"before\":\"d53e9b3cb6cde4253e05019295a840d394a7bcb0\", \"after\":\"34c07bcbf557413cf42b601c1794c87db8c321d1\", \"commits\":[ { \"id\":\"a5c816a817d06e592d2b70cd8a088d1519f2d720\", \"tree_id\":\"15e930e14d4c62aae47a3c02c47eb24c65d11807\", \"distinct\":false, \"message\":\"rrrrrrrrrrrrrrrrrrrrrr\", \"timestamp\":\"2024-08-12T12:00:04-05:00\", \"url\":\"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/aaaaaaaaaaaa\", \"author\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"committer\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"added\":[], \"removed\":[], \"modified\":[\"asdafasdad.json\"]}, { \"id\":\"a3b3b6f728ccc0eb9113e7db723fbfc4ad220882\", \"tree_id\":\"3586aeb0a33dc5e236cb266c948f83ff01320a9a\", \"distinct\":false, \"message\":\"xxxxxxxxxxxxxxxxxxx\", \"timestamp\":\"2024-08-12T12:05:40-05:00\", \"url\":\"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/a3b3b6f728ccc0eb9113e7db723fbfc4ad220...\", \"author\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"committer\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"added\":[], \"removed\":[], \"modified\":[ \"sddddddf.json\"]}, { \"id\":\"bdcd242d6854365ddfeae6b4f86cf7bc1766e028\", \"tree_id\":\"8286c537f7dee57395f44875ddb8b2cdb7dd48b2\", \"distinct\":false, \"message\":\"Updating 
pipeline: pl_gwp_file_landing_check. Adding Sylvan Performance\", \"timestamp\":\"2024-08-12T12:06:10-05:00\", \"url\":\"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/bdcd242d6854365ddfeae6b4f86cf7bc1766e...\", \"author\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"committer\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"added\":[], \"removed\":[], \"modified\":[ \"asadwefvdx.json\"]}, { \"id\":\"108ebd4ff8ae9dd70e669e2ca49e293684d5c37a\", \"tree_id\":\"5a6d71393611718b8576f8a63cdd34ce619f17dd\", \"distinct\":false, \"message\":\"asdrwerwq\", \"timestamp\":\"2024-08-12T10:09:33-07:00\", \"url\":\"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/108ebd4ff8ae9dd70e669e2ca49e293684d5c...\", \"author\":{ \"name\":\"dfsd\", \"email\":\"l.llllllllllll@aaaaaa.com\", \"username\":\"aaaaaa\"}, \"committer\":{ \"name\":\"lllllllllllll\", \"email\":\"l.llllllllllll@abc.com\", \"username\":\"aaaaaa\"}, \"added\":[], \"removed\":[], \"modified\":[\"A.json\",\"A.json\",\"A.json\"]},{ \"id\":\"34c07bcbf557413cf42b601c1794c87db8c321d1\", \"tree_id\":\"5a6d71393611718b8576f8a63cdd34ce619f17dd\", \"distinct\":true, \"message\":\"asadasd\", \"timestamp\":\"2024-08-12T13:32:45-05:00\", \"url\":\"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/34c07bcbf557413cf42b601c1794c87db8c32...\", \"author\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"committer\":{ \"name\":\"GitasdjwqaikHubasdqw\", \"email\":\"noreply@gitskcaskadahuqwdqbqwdqaw.com\", \"username\":\"wdkcszjkcsebwdqwdfqwdawsldqodqw\"}, \"added\":[], \"removed\":[], \"modified\":[ \"a.json\", \"A1.json\", \"A1.json\"]}], \"head_commit\":{ \"id\":\"34c07bcbf557413cf42b601c1794c87db8c321d1\", \"tree_id\":\"5a6d71393611718b8576f8a63cdd34ce619f17dd\", \"distinct\":true, 
\"message\":\"sadwad from xxxxxxxxxxxxxxx/IH-5942-Pipeline-Change\n\nIh 5asdsazdapeline change\", \"timestamp\":\"2024-08-12T13:32:45-05:00\", \"url\":\"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/3weweeeeeeeee\", \"author\":{ \"name\":\"askjas\", \"email\":\"101218171+asfsfgwsrsd@users.noreply.github.com\", \"username\":\"asdwasdcqwasfdc-qwgbhvcfawdqxaiwdaszxc\" }, \"committer\":{ \"name\":\"GsdzvcweditHuscwsab\", \"email\":\"noreply@gitasdcwedhub.com\", \"username\":\"wefczeb-fwefvdszlow\"}, \"added\":[], \"removed\":[], \"modified\":[\"zzzzzzz.json\",\"Azzzzz.json\",\"zzzz.json\" ]}}]" | spath ``` the above emulates index=XXXXX source="http:github-dev-token" eventtype="GitHub::Push" sourcetype="json_ae_git-webhook" ``` | rename commits{}.id as commit_id | table commit_id   The output is commit_id a5c816a817d06e592d2b70cd8a088d1519f2d720 a3b3b6f728ccc0eb9113e7db723fbfc4ad220882 bdcd242d6854365ddfeae6b4f86cf7bc1766e028 108ebd4ff8ae9dd70e669e2ca49e293684d5c37a 34c07bcbf557413cf42b601c1794c87db8c321d1
Yes, the search is semantically equivalent. That was point one in my previous comment. If your index search already has a field named ip, there is no need to run the search command in a second pipe (command). You also do not need those source evaluations, because join does not care about them.

| inputlookup host.csv
| rename ip_address as ip
| join max=0 type=left ip
    [ search index=risk ip="10.1.0.0/16"
    | fields ip risk score contact ]
| join max=0 type=left ip
    [ search index=risk ip="10.2.0.0/16"
    | fields ip risk score contact ]
| join max=0 type=left ip
    [ search index=risk ip="10.3.0.0/16"
    | fields ip risk score contact ]
| table ip, host, risk, score, contact

I do not see any dedup in your mock code, but I assume you have customization that is not shown.
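[Editor's note, not from the original thread: if the repeated joins are only there to cover multiple subnets, they can often be folded into a single subsearch with OR'ed CIDR filters - a sketch, assuming the combined subsearch stays under the subsearch row limits:]

```spl
| inputlookup host.csv
| rename ip_address as ip
| join max=0 type=left ip
    [ search index=risk (ip="10.1.0.0/16" OR ip="10.2.0.0/16" OR ip="10.3.0.0/16")
    | fields ip risk score contact ]
| table ip, host, risk, score, contact
```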
I'm very new to Splunk. I have two tokens as input to a dashboard and want to change the query based on which one is entered.

My base query (with no dashboard):

eventtype=builder user_id IN (<value1>, <value2>, etc.) | eval .....

I created a dashboard and want to use tokens for the input:

token1=$id$
token2=$email$

If token1 has data, I want to execute

eventtype=builder user_id IN ($id$) | eval....

otherwise, I want to execute

eventtype=builder user_mail IN ($email$) | eval .....
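[Editor's note, not part of the original question: one common pattern for this - a sketch, assuming the tokens are named $id$ and $email$ as above - relies on the fact that a subsearch returning a single field named search has its value spliced into the outer query as search text, so the filter clause can be chosen at run time:]

```spl
eventtype=builder
    [| makeresults
     | eval search=if("$id$" != "", "user_id IN ($id$)", "user_mail IN ($email$)")
     | fields search]
| eval .....
```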
Well, you can try to make a compound regex containing some alternative branches. Also, you seem to have some XML-like structure there. If it's valid XML, why not just parse the XML into fields and check for the existence of specific fields? I'm also not sure about the rest of the search, but honestly speaking it's too late and I'm too tired at the moment to look into it.
Do you need to do this in SPL during search, or are you trying to define a field extraction? Anyway, the usual answer to "regex" and "json" in one sentence is "don't fiddle with regex on structured data". With SPL it's relatively easy - extract your fields either with KV_MODE=json or explicitly using spath, and then do

| rex field=Comment__c "with (?<failures_no>\d+) failures"

With field extraction it might not be that easy, because transforms which you could call on a json-extracted field are called before autoextractions. So you might actually need to define an extraction based on raw data with that regex, but that will be unintuitive to maintain, since your data seems to be well-formed json, and with json you'd expect explicitly named fields, not some funky stuff pulled from somewhere in the middle.
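[Editor's note: a self-contained illustration of the spath-plus-rex approach, with the Comment__c value abbreviated from the question's sample data:]

```spl
| makeresults
| eval Comment__c="Class Name: MOT_Date3DayPurgeBatch - deletion completed 7 batches with 7 failures.7"
| rex field=Comment__c "with (?<failures_no>\d+) failures"
| table failures_no
```

Run against real events, the eval would be unnecessary: KV_MODE=json or | spath would supply Comment__c directly.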
Requirements:

1. Find and save sensitive data fields from logs
2. Save log snippet around sensitive data field
3. Remove duplicates for mule apps and sensitive data field
4. Create table showing mule app name, sensitive data, and log snippet

Is there a way to improve the search query so I don't have to duplicate the rex commands every time I need to add a new sensitive data value? (app_name is an existing custom field)

index="prod"
| rex field=_raw "(?i)(?<birthDate>birthDate)"
| rex field=_raw "(?i)(?<dob>dob)"
| rex field=_raw "(?i)(?<birthday>birthday)"
| rex field=_raw "(?i)(?<birthDateLog>birthDate.*?\w\W)"
| rex field=_raw "(?i)(?<dobLog>dob.*?\w\W)"
| rex field=_raw "(?i)(?<birthdayLog>birthday.*?\w\W)"
| eval SENSITIVE_DATA=mvappend(birthDate,dob,birthday)
| eval SENSITIVE_DATA_LOWER=lower(SENSITIVE_DATA)
| dedup app_name SENSITIVE_DATA_LOWER
| eval SENSITIVE_DATA_LOG=mvappend(birthDateLog,dobLog,birthdayLog)
| stats list(SENSITIVE_DATA_LOG) as SENSITIVE_DATA_LOG list(SENSITIVE_DATA_LOWER) as SENSITIVE_DATA_LOWER by app_name
| table app_name SENSITIVE_DATA_LOWER SENSITIVE_DATA_LOG

Example output:

app_name  SENSITIVE_DATA_LOWER  SENSITIVE_DATA_LOG
s-api     dob birthdate         dob: 01/01/2024 birthdate: 09-09-1999
p-api     birthday              birthday: August 23, 2024
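[Editor's note, not part of the original post: one way to avoid duplicating a rex per term - a sketch - is a single alternation group with max_match=0, so adding a new sensitive value only means extending the alternatives:]

```spl
index="prod"
| rex field=_raw max_match=0 "(?i)(?<SENSITIVE_DATA>birthDate|dob|birthday)"
| rex field=_raw max_match=0 "(?i)(?<SENSITIVE_DATA_LOG>(?:birthDate|dob|birthday).*?\w\W)"
| eval SENSITIVE_DATA_LOWER=lower(SENSITIVE_DATA)
| dedup app_name SENSITIVE_DATA_LOWER
| stats list(SENSITIVE_DATA_LOG) as SENSITIVE_DATA_LOG list(SENSITIVE_DATA_LOWER) as SENSITIVE_DATA_LOWER by app_name
```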
Assuming you wanted to say path=commits{}.id it seems to work for me. | makeresults | eval _raw="{ \"ref\":\"refs/heads/Dev\", \"before\":\"d53e9b3cb6cde4253e05019295a840d394a7bcb0\", \"after... See more...
Assuming you wanted to say path=commits{}.id it seems to work for me. | makeresults | eval _raw="{ \"ref\":\"refs/heads/Dev\", \"before\":\"d53e9b3cb6cde4253e05019295a840d394a7bcb0\", \"after\":\"34c07bcbf557413cf42b601c1794c87db8c321d1\", \"commits\":[ { \"id\":\"a5c816a817d06e592d2b70cd8a088d1519f2d720\", \"tree_id\":\"15e930e14d4c62aae47a3c02c47eb24c65d11807\", \"distinct\":false, \"message\":\"rrrrrrrrrrrrrrrrrrrrrr\", \"timestamp\":\"2024-08-12T12:00:04-05:00\", \"url\":\"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/aaaaaaaaaaaa\", \"author\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"committer\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"added\":[], \"removed\":[], \"modified\":[\"asdafasdad.json\"]}, { \"id\":\"a3b3b6f728ccc0eb9113e7db723fbfc4ad220882\", \"tree_id\":\"3586aeb0a33dc5e236cb266c948f83ff01320a9a\", \"distinct\":false, \"message\":\"xxxxxxxxxxxxxxxxxxx\", \"timestamp\":\"2024-08-12T12:05:40-05:00\", \"url\":\"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/a3b3b6f728ccc0eb9113e7db723fbfc4ad220...\", \"author\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"committer\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"added\":[], \"removed\":[], \"modified\":[ \"sddddddf.json\"]}, { \"id\":\"bdcd242d6854365ddfeae6b4f86cf7bc1766e028\", \"tree_id\":\"8286c537f7dee57395f44875ddb8b2cdb7dd48b2\", \"distinct\":false, \"message\":\"Updating pipeline: pl_gwp_file_landing_check. 
Adding Sylvan Performance\", \"timestamp\":\"2024-08-12T12:06:10-05:00\", \"url\":\"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/bdcd242d6854365ddfeae6b4f86cf7bc1766e...\", \"author\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"committer\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"added\":[], \"removed\":[], \"modified\":[ \"asadwefvdx.json\"]}, { \"id\":\"108ebd4ff8ae9dd70e669e2ca49e293684d5c37a\", \"tree_id\":\"5a6d71393611718b8576f8a63cdd34ce619f17dd\", \"distinct\":false, \"message\":\"asdrwerwq\", \"timestamp\":\"2024-08-12T10:09:33-07:00\", \"url\":\"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/108ebd4ff8ae9dd70e669e2ca49e293684d5c...\", \"author\":{ \"name\":\"dfsd\", \"email\":\"l.llllllllllll@aaaaaa.com\", \"username\":\"aaaaaa\"}, \"committer\":{ \"name\":\"lllllllllllll\", \"email\":\"l.llllllllllll@abc.com\", \"username\":\"aaaaaa\"}, \"added\":[], \"removed\":[], \"modified\":[\"A.json\",\"A.json\",\"A.json\"]},{ \"id\":\"34c07bcbf557413cf42b601c1794c87db8c321d1\", \"tree_id\":\"5a6d71393611718b8576f8a63cdd34ce619f17dd\", \"distinct\":true, \"message\":\"asadasd\", \"timestamp\":\"2024-08-12T13:32:45-05:00\", \"url\":\"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/34c07bcbf557413cf42b601c1794c87db8c32...\", \"author\":{ \"name\":\"aaaaaa aaaaaa\", \"email\":\"101218171+aaaaaa@users.noreply.github.com\", \"username\":\"aaaaaa\"}, \"committer\":{ \"name\":\"GitasdjwqaikHubasdqw\", \"email\":\"noreply@gitskcaskadahuqwdqbqwdqaw.com\", \"username\":\"wdkcszjkcsebwdqwdfqwdawsldqodqw\"}, \"added\":[], \"removed\":[], \"modified\":[ \"a.json\", \"A1.json\", \"A1.json\"]}], \"head_commit\":{ \"id\":\"34c07bcbf557413cf42b601c1794c87db8c321d1\", \"tree_id\":\"5a6d71393611718b8576f8a63cdd34ce619f17dd\", \"distinct\":true, \"message\":\"sadwad from 
xxxxxxxxxxxxxxx/IH-5942-Pipeline-Change\n\nIh 5asdsazdapeline change\", \"timestamp\":\"2024-08-12T13:32:45-05:00\", \"url\":\"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/3weweeeeeeeee\", \"author\":{ \"name\":\"askjas\", \"email\":\"101218171+asfsfgwsrsd@users.noreply.github.com\", \"username\":\"asdwasdcqwasfdc-qwgbhvcfawdqxaiwdaszxc\" }, \"committer\":{ \"name\":\"GsdzvcweditHuscwsab\", \"email\":\"noreply@gitasdcwedhub.com\", \"username\":\"wefczeb-fwefvdszlow\"}, \"added\":[], \"removed\":[], \"modified\":[\"zzzzzzz.json\",\"Azzzzz.json\",\"zzzz.json\" ]}}" | spath output=commit_id path=commits{}.id | table commit_id  shows 5 values Splunk 9.3.0
Hi @yuanliu, Thank you again for your analysis and suggestion. The environment where I am working is very restrictive about making changes to limits.conf due to a resource problem. The index in the real data set I am working on has more than 1 million rows in a 1-day time frame; I got it down to 150k after filtering with specific subnets, fields, and dedups. If this is the case, is splitting the subsearch for the join the only way to do it? Does the following search for splitting look correct? Please let me know if you have a better idea or workaround. The real data has a lot more than 3 IP subnets. Thank you again.

| inputlookup host.csv
| rename ip_address as ip
| eval source="csv"
| join max=0 type=left ip
    [ search index=risk
    | fields ip risk score contact
    | search ip="10.1.0.0/16"
    | eval source="risk1" ]
| join max=0 type=left ip
    [ search index=risk
    | fields ip risk score contact
    | search ip="10.2.0.0/16"
    | eval source="risk2" ]
| join max=0 type=left ip
    [ search index=risk
    | fields ip risk score contact
    | search ip="10.3.0.0/16"
    | eval source="risk3" ]
| table ip, host, risk, score, contact
We have json logs, from the below logs we need to get the rex for the failures count which is mentioned in the logs like (7 failures) We need rex to get the count for failures  count. {"attributes": {"type" : "rar_Log__c", "url": "/data/log/v4.0/subject/rar"}, "Application_Id__c": "MOT-Branch", "Category__c": "MOT-Branch", "Comment__c": "Class Name: MOT_Date3DayPurgeBatch - LCT Declined or Not Funded applications deletion completed 3 batches with 3 failures.3", "Contact_Id__c": null, "CreatedById" : 657856MHQA, "CreatedDate": "2022-02-21T16:04:01.000+0000", "Description__c": null} {"attributes": {"type" : "rar_Log__c", "url": "/data/log/v4.0/subject/rar"}, "Application_Id__c": "MOT-Branch", "Category__c": "MOT-Branch", "Comment__c": "Class Name: MOT_Date3DayPurgeBatch - LCT Declined or Not Funded applications deletion completed 4 batches with 4 failures.4", "Contact_Id__c": null, "CreatedById" : 657856MHQA, "CreatedDate": "2022-02-21T16:04:01.000+0000", "Description__c": null} {"attributes": {"type" : "rar_Log__c", "url": "/data/log/v4.0/subject/rar"}, "Application_Id__c": "MOT-Branch", "Category__c": "MOT-Branch", "Comment__c": "Class Name: MOT_Date3DayPurgeBatch - LCT Declined or Not Funded applications deletion completed 5 batches with 5 failures.5", "Contact_Id__c": null, "CreatedById" : 657856MHQA, "CreatedDate": "2022-02-21T16:04:01.000+0000", "Description__c": null} {"attributes": {"type" : "rar_Log__c", "url": "/data/log/v4.0/subject/rar"}, "Application_Id__c": "MOT-Branch", "Category__c": "MOT-Branch", "Comment__c": "Class Name: MOT_Date3DayPurgeBatch - LCT Declined or Not Funded applications deletion completed 7 batches with 7 failures.7", "Contact_Id__c": null, "CreatedById" : 657856MHQA, "CreatedDate": "2022-02-21T16:04:01.000+0000", "Description__c": null} {"attributes": {"type" : "rar_Log__c", "url": "/data/log/v4.0/subject/rar"}, "Application_Id__c": "MOT-Branch", "Category__c": "MOT-Branch", "Comment__c": "Class Name: MOT_Date3DayPurgeBatch - LCT 
Declined or Not Funded applications deletion completed 10 batches with 10 failures.10", "Contact_Id__c": null, "CreatedById" : 657856MHQA, "CreatedDate": "2022-02-21T16:04:01.000+0000", "Description__c": null}  
source=lastlog corresponds with the source setting in inputs.conf for the lastlog.sh script, so this one checks out. It doesn't have anything to do with the lastlog file. Check the output of

splunk btool props list lastlog --debug | grep DATETIME_CONFIG

If it shows anything other than CURRENT as a value, it means you're overwriting this in the shown file.
Also be sure that you're not using any TLS-inspection solution if it's on-prem.
Hi, It's possible that the database you're using isn't supported for Database Query Performance. I suggest checking the supported list here: https://docs.splunk.com/observability/en/apm/db-query-perf/db-perf-reference.html#supported-dbs Also, you could check your APM MetricSets in settings->APM MetricSets and make sure that Database Query Performance is enabled and active.
Is it possible to perform custom attribute mapping when syncing user attributes using SAML2 authentication? I know we can map external attributes to first_name, last_name, etc. But we have a need to set first_name to a nickname attribute if it exists or has a value and if not fallback to the firstname attribute. The configuration doesn't allow us to map two external attributes to the same SOAR user attribute. Wasn't sure if there was a way to script this somewhere or if we are stuck performing this mapping on the IdP?
Well, it's not only whether they match but also whether they contain the stuff you want ingested. If the "list monitor" doesn't show the files you want to read, you're not gonna get them.
I have created an add-on with a few input parameters. One of them is a dropdown list box. When I add a data input from within the app created by the add-on, the dropdown shows fine and I can select an item from it. However, when I create the same data input from the Settings -> Data Inputs menu item, the dropdown list box is shown as a textbox. Any ideas on what I might be doing wrong? Thanks in advance.
@PickleRick Thanks. I think I had run them before, but I tried again to verify, and the splunk list monitor output matches the splunk list inputstatus output. I will try btool next.
In the data, there is an array of 5 commit IDs. For some reason, it is only returning 3 values. Not sure why  2 values are missing. Would like a fresh set of eyes to take a look please. Query index=XXXXX source="http:github-dev-token" eventtype="GitHub::Push" sourcetype="json_ae_git-webhook" | spath output=commit_id path=commits.id sourcetype definition [ json_ae_git-webhook ] AUTO_KV_JSON=false CHARSET=UTF-8 KV_MODE=json LINE_BREAKER=([\r\n]+) NO_BINARY_CHECK=true SHOULD_LINEMERGE=true TRUNCATE=100000 category=Structured description=JavaScript Object Notation format. For more information, visit http://json.org/ disabled=false pulldown_type=true Raw JSON data { "ref":"refs/heads/Dev", "before":"d53e9b3cb6cde4253e05019295a840d394a7bcb0", "after":"34c07bcbf557413cf42b601c1794c87db8c321d1", "commits":[ { "id":"a5c816a817d06e592d2b70cd8a088d1519f2d720", "tree_id":"15e930e14d4c62aae47a3c02c47eb24c65d11807", "distinct":false, "message":"rrrrrrrrrrrrrrrrrrrrrr", "timestamp":"2024-08-12T12:00:04-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/aaaaaaaaaaaa", "author":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "committer":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "added":[ ], "removed":[ ], "modified":[ "asdafasdad.json" ] }, { "id":"a3b3b6f728ccc0eb9113e7db723fbfc4ad220882", "tree_id":"3586aeb0a33dc5e236cb266c948f83ff01320a9a", "distinct":false, "message":"xxxxxxxxxxxxxxxxxxx", "timestamp":"2024-08-12T12:05:40-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/a3b3b6f728ccc0eb9113e7db723fbfc4ad220882", "author":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "committer":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "added":[ ], "removed":[ ], "modified":[ "sddddddf.json" ] }, { 
"id":"bdcd242d6854365ddfeae6b4f86cf7bc1766e028", "tree_id":"8286c537f7dee57395f44875ddb8b2cdb7dd48b2", "distinct":false, "message":"Updating pipeline: pl_gwp_file_landing_check. Adding Sylvan Performance", "timestamp":"2024-08-12T12:06:10-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/bdcd242d6854365ddfeae6b4f86cf7bc1766e028", "author":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "committer":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "added":[ ], "removed":[ ], "modified":[ "asadwefvdx.json" ] }, { "id":"108ebd4ff8ae9dd70e669e2ca49e293684d5c37a", "tree_id":"5a6d71393611718b8576f8a63cdd34ce619f17dd", "distinct":false, "message":"asdrwerwq", "timestamp":"2024-08-12T10:09:33-07:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/108ebd4ff8ae9dd70e669e2ca49e293684d5c37a", "author":{ "name":"dfsd", "email":"l.llllllllllll@aaaaaa.com", "username":"aaaaaa" }, "committer":{ "name":"lllllllllllll", "email":"l.llllllllllll@abc.com", "username":"aaaaaa" }, "added":[ ], "removed":[ ], "modified":[ "A.json", "A.json", "A.json" ] }, { "id":"34c07bcbf557413cf42b601c1794c87db8c321d1", "tree_id":"5a6d71393611718b8576f8a63cdd34ce619f17dd", "distinct":true, "message":"asadasd", "timestamp":"2024-08-12T13:32:45-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/34c07bcbf557413cf42b601c1794c87db8c321d1", "author":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "committer":{ "name":"GitasdjwqaikHubasdqw", "email":"noreply@gitskcaskadahuqwdqbqwdqaw.com", "username":"wdkcszjkcsebwdqwdfqwdawsldqodqw" }, "added":[ ], "removed":[ ], "modified":[ "a.json", "A1.json", "A1.json" ] } ], "head_commit":{ "id":"34c07bcbf557413cf42b601c1794c87db8c321d1", "tree_id":"5a6d71393611718b8576f8a63cdd34ce619f17dd", "distinct":true, "message":"sadwad from 
xxxxxxxxxxxxxxx/IH-5942-Pipeline-Change\n\nIh 5asdsazdapeline change", "timestamp":"2024-08-12T13:32:45-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/3weweeeeeeeee, "author":{ "name":"askjas", "email":"101218171+asfsfgwsrsd@users.noreply.github.com", "username":"asdwasdcqwasfdc-qwgbhvcfawdqxaiwdaszxc" }, "committer":{ "name":"GsdzvcweditHuscwsab", "email":"noreply@gitasdcwedhub.com", "username":"wefczeb-fwefvdszlow" }, "added":[ ], "removed":[ ], "modified":[ "zzzzzzz.json", "Azzzzz.json", "zzzz.json" ] } }
Are you sure you want Splunk Enterprise? You listed Enterprise Security as the associated product. Anyway, 6.6 was released something like 7 or 8 years ago and has been EOL for at least 4 years now. You're still running that?
The link you were pointed to is a very old thread. Now the same functionality is implemented with a special command, so you can do (on your UFs, not on your SH!)

splunk list monitor

and

splunk list inputstatus

The first command will show you the effective configuration of your monitor inputs. The second one will give you the state of your inputs. Of course you can additionally verify your combined config using

splunk btool inputs list monitor --debug
These are two separate issues. One is that you're doing a call to an https endpoint without verifying the server's certificate. That's not a very secure thing to do (especially as you're authenticating yourself against some unverified party), so you're getting a warning from the script. The other is that you're not properly authenticating to the server. That's why you're getting an error response from the server.