arm64 regression fix

- Fix potential clobbering of user vector register state by AES ghash code
 -----BEGIN PGP SIGNATURE-----
 Version: GnuPG v1
 
 iQEcBAABAgAGBQJbYtldAAoJELescNyEwWM0aisH/ReBqfJb1kTr5ybUriSTsuTl
 Vjn8O17aH+177P3xlY47+cYWsvewJxiy7MldlLEiER0tsyjYvqTDuzy7+ffdJJ8R
 rs0MGhYh9WpD3OVjng7TmwXD71xefk/gLq4MPqmgn9e/DoAt4HO/7jC4c+mSwxRZ
 gHsAH5g6WUclRvU1zXaT8QKdZnmwudPFvy5O+bQYQrIJr++zBiyJ47qu1+TjJQuC
 kVmM7XV0c0L1fK9z7A18PcW+tMlIu15ITzJwEercXen/7XypdDOufgc4Y9odHCkC
 5ZWnV5wZtaJIuo/JWWPnoheWfTqIVF7ggXwsaXmZ0hKfQkBvbJ8fxhogCIWgt40=
 =rpVW
 -----END PGP SIGNATURE-----

Merge tag 'arm64-fixes' of git://git.kernel.org/pub/scm/linux/kernel/git/arm64/linux

Pull arm64 regression fix from Will Deacon:
 "Ard found a nasty arm64 regression in 4.18 where the AES ghash/gcm
  code doesn't notify the kernel about its use of the vector registers,
  thereby potentially corrupting live user state.

  The fix is straightforward and Herbert agreed for it to go via arm64"

* tag 'arm64-fixes' of git://git.kernel.org/pub/scm/linux/kernel/git/arm64/linux:
  crypto/arm64: aes-ce-gcm - add missing kernel_neon_begin/end pair
Linus Torvalds 2018-08-02 10:21:49 -07:00
commit 8cda548ffb
1 changed file with 6 additions and 2 deletions

@@ -488,9 +488,13 @@ static int gcm_decrypt(struct aead_request *req)
 			err = skcipher_walk_done(&walk,
 						 walk.nbytes % AES_BLOCK_SIZE);
 		}
-		if (walk.nbytes)
-			pmull_gcm_encrypt_block(iv, iv, NULL,
+		if (walk.nbytes) {
+			kernel_neon_begin();
+			pmull_gcm_encrypt_block(iv, iv, ctx->aes_key.key_enc,
 						num_rounds(&ctx->aes_key));
+			kernel_neon_end();
+		}
+
 	} else {
 		__aes_arm64_encrypt(ctx->aes_key.key_enc, tag, iv,
 				    num_rounds(&ctx->aes_key));
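
For reference, a minimal sketch of the pattern the hunk above restores: any kernel
code that uses the NEON/SIMD registers must bracket that use with
kernel_neon_begin()/kernel_neon_end() so the kernel preserves the current task's
vector state around it. The helpers my_neon_transform() and my_scalar_transform()
below are hypothetical stand-ins for illustration, not part of the actual driver.

#include <asm/neon.h>
#include <asm/simd.h>
#include <linux/types.h>

/* Hypothetical NEON-accelerated primitive, e.g. implemented in assembly. */
void my_neon_transform(u8 *dst, const u8 *src, int rounds);

/* Hypothetical scalar fallback for contexts where NEON must not be used. */
void my_scalar_transform(u8 *dst, const u8 *src, int rounds);

static void my_transform(u8 *dst, const u8 *src, int rounds)
{
        if (may_use_simd()) {
                kernel_neon_begin();    /* user FP/SIMD state is preserved */
                my_neon_transform(dst, src, rounds);
                kernel_neon_end();      /* user FP/SIMD state will be restored */
        } else {
                my_scalar_transform(dst, src, rounds);
        }
}

Note also that the call in the hunk now passes ctx->aes_key.key_enc instead of NULL:
once the call runs in its own kernel_neon_begin()/kernel_neon_end() section, round
keys left in the NEON registers by an earlier call can no longer be assumed to still
be there, so the key schedule is passed explicitly for the core routine to reload.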