Solved!
The Problem: "Invalid Signature" due to incorrect r/s encoding
When using ES256 (ECDSA with the P-256 curve and SHA-256), the signature returned by Java's Signature.sign() method is DER-encoded (an ASN.1 SEQUENCE). It contains two integer components, which the sketch below pulls apart:
r: the first integer of the ECDSA signature
s: the second integer
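
A minimal ColdFusion sketch of that extraction, assuming BouncyCastle is on the classpath, that derSig holds the bytes returned by Signature.sign(), and that your CF engine lets you treat the returned Java byte arrays with the ordinary array functions (the code further down makes the same assumption):

<cfscript>
// Sketch: split a DER-encoded ECDSA signature into its r and s byte arrays.
// "derSig" is illustrative; use whatever variable holds the output of sign().
asn1 = createObject("java", "org.bouncycastle.asn1.ASN1InputStream").init(derSig);
seq  = asn1.readObject();                            // ASN1Sequence of two INTEGERs
r    = seq.getObjectAt(0).getValue().toByteArray();  // signed big-endian bytes (may be 33 long)
s    = seq.getObjectAt(1).getValue().toByteArray();
asn1.close();
</cfscript>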
🔧 Why it fails by default
Each of r and s must be exactly 32 bytes long, and the two must be concatenated into a raw 64-byte binary signature before being Base64URL-encoded for the JWT.
However:
When extracting r and s using a DER parser (e.g. BouncyCastle's ASN1InputStream), sometimes:
r or s may come back as 33 bytes: BigInteger.toByteArray() prepends a 0x00 sign byte whenever the high bit of the first value byte is set, so the (always positive) value is not misread as negative in two's-complement form. They can also come back shorter than 32 bytes when the value happens to start with zero bytes (see the short illustration after this list).
The extra byte causes the combined raw signature to be 65 bytes instead of 64.
JWT validators (such as Apple's servers or jwt.io) expect exactly 64 bytes for the ES256 signature: 32 bytes for r followed by 32 bytes for s.
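
The sign byte is easy to reproduce in isolation; the value below is purely illustrative:

<cfscript>
// A 256-bit value whose most significant bit is set picks up a 0x00 sign byte
// from toByteArray(), so 32 bytes of value become 33 bytes.
big = createObject("java", "java.math.BigInteger").init("FF" & repeatString("00", 31), javacast("int", 16));
writeOutput(arrayLen(big.toByteArray()));   // prints 33
</cfscript>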
The solution
You must:
Trim the leading zero byte (if it's present and length == 33).
Left-pad with zero bytes if the length is less than 32.
Throw an error if the array is longer than 32 bytes and the leading byte isn't the removable zero (that can never be a valid P-256 component).
<!--- Example logic in ColdFusion: normalise r (repeat for s) to exactly 32 bytes --->
<cfif arrayLen(r) EQ 33 AND r[1] EQ 0>
    <cfset r = arraySlice(r, 2, 32)> <!--- drop the leading sign byte --->
<cfelseif arrayLen(r) GT 32>
    <cfthrow message="Invalid ECDSA component length: #arrayLen(r)#">
</cfif>
<!--- Left-pad with zero bytes until the length is exactly 32 --->
<cfloop condition="arrayLen(r) LT 32">
    <cfset arrayPrepend(r, 0)>
</cfloop>
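
Once both r and s have been normalised to exactly 32 bytes the same way, concatenate them and Base64URL-encode the result. A sketch, assuming a Java 8+ JVM and that r and s hold signed byte values in the -128..127 range (as toByteArray() produces):

<cfscript>
// Build the 64-byte r||s signature and Base64URL-encode it without padding,
// which is the form the JWT's third segment must carry for ES256.
buf = createObject("java", "java.nio.ByteBuffer").allocate(javacast("int", 64));
buf.put(javacast("byte[]", r));
buf.put(javacast("byte[]", s));
rawSig  = buf.array();
sigPart = createObject("java", "java.util.Base64").getUrlEncoder().withoutPadding().encodeToString(rawSig);
</cfscript>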