Implement Try and Increment when converting hash to bigint #128

Merged · 5 commits · Aug 26, 2021
2 changes: 2 additions & 0 deletions Cargo.toml
@@ -19,6 +19,8 @@ blake2b_simd = "0.5.7"
cryptoxide = "0.1.2"
curve25519-dalek = "3"
digest = "0.9"
generic-array = "0.14"
typenum = "1.13"
ff-zeroize = "0.6.3"
hex = { version = "0.4", features = ["serde"] }
hmac = "0.11"
28 changes: 24 additions & 4 deletions src/cryptographic_primitives/hashing/blake2b512.rs
@@ -4,10 +4,12 @@
(https://github.com/KZen-networks/curv)
License MIT: https://github.com/KZen-networks/curv/blob/master/LICENSE
*/

use blake2b_simd::{Params, State};
use typenum::Unsigned;

use crate::arithmetic::traits::*;
use crate::elliptic::curves::{Curve, Point, Scalar};
use crate::elliptic::curves::{Curve, ECScalar, Point, Scalar};
use crate::BigInt;

/// Wrapper over [blake2b_simd](blake2b_simd::State) exposing facilities to hash bigints, elliptic points,
@@ -17,9 +19,13 @@ pub struct Blake {
}

impl Blake {
const HASH_LENGTH: usize = 64;
pub fn with_personal(persona: &[u8]) -> Self {
Self {
state: Params::new().hash_length(64).personal(persona).to_state(),
state: Params::new()
.hash_length(Self::HASH_LENGTH)
.personal(persona)
.to_state(),
}
}

@@ -38,8 +44,22 @@ impl Blake {
}

pub fn result_scalar<E: Curve>(&self) -> Scalar<E> {
let n = self.result_bigint();
Scalar::from_bigint(&n)
let scalar_len = <<E::Scalar as ECScalar>::ScalarLength as Unsigned>::to_usize();
assert!(
Self::HASH_LENGTH >= scalar_len,
"Output size of the hash({}) is smaller than the scalar length({})",
Self::HASH_LENGTH,
scalar_len
);
// Try and increment.
for i in 0u32.. {
let mut starting_state = self.state.clone();
let hash = starting_state.update(&i.to_be_bytes()).finalize();
if let Ok(scalar) = Scalar::from_bytes(&hash.as_bytes()[..scalar_len]) {
return scalar;
}
}
unreachable!("The probability of reaching this point is negligible: ((2^n - q) / 2^n)^(2^32)")
}

#[deprecated(
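The new result_scalar implements try-and-increment: hash the input together with a counter, accept the digest only if it encodes a value below the group order, otherwise bump the counter and retry. A minimal self-contained sketch of the same loop, using std's DefaultHasher as a toy 64-bit digest and an illustrative modulus q — both are stand-ins for the real Blake2b output and curve order, not the crate's API:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::Hasher;

/// Toy stand-in for the 64-byte Blake2b digest: hashes `data || counter`
/// into a single u64. The real code hashes into HASH_LENGTH bytes.
fn toy_hash(data: &[u8], counter: u32) -> u64 {
    let mut h = DefaultHasher::new();
    h.write(data);
    h.write(&counter.to_be_bytes());
    h.finish()
}

/// Try-and-increment: re-hash with an incrementing big-endian counter until
/// the candidate falls below the (toy) group order `q`. Accepted values are
/// uniform in [0, q) because out-of-range candidates are rejected rather
/// than reduced mod q.
fn hash_to_scalar(data: &[u8], q: u64) -> u64 {
    for i in 0u32.. {
        let candidate = toy_hash(data, i);
        if candidate < q {
            return candidate;
        }
    }
    unreachable!("probability of exhausting the counter is ((2^64 - q) / 2^64)^(2^32)")
}

fn main() {
    // q chosen well below 2^64 so rejections actually happen.
    let q = 1u64 << 32;
    let s = hash_to_scalar(b"example message", q);
    assert!(s < q);
    println!("scalar = {}", s);
}
```

As in the diff, each retry restarts from the same hash state and appends only the counter bytes, so the per-attempt cost is one short update plus a finalize.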
23 changes: 19 additions & 4 deletions src/cryptographic_primitives/hashing/ext.rs
@@ -1,9 +1,10 @@
use digest::Digest;
use hmac::crypto_mac::MacError;
use hmac::{Hmac, Mac, NewMac};
use typenum::Unsigned;

use crate::arithmetic::*;
use crate::elliptic::curves::{Curve, Point, Scalar};
use crate::elliptic::curves::{Curve, ECScalar, Point, Scalar};

/// [Digest] extension allowing to hash elliptic points, scalars, and bigints
///
@@ -82,7 +83,7 @@ pub trait DigestExt {

impl<D> DigestExt for D
where
D: Digest,
D: Digest + Clone,
{
fn input_bigint(&mut self, n: &BigInt) {
self.update(&n.to_bytes())
@@ -102,8 +103,22 @@
}

fn result_scalar<E: Curve>(self) -> Scalar<E> {
let n = self.result_bigint();
Scalar::from_bigint(&n)
let scalar_len = <<E::Scalar as ECScalar>::ScalarLength as Unsigned>::to_usize();
assert!(
Self::output_size() >= scalar_len,
"Output size of the hash({}) is smaller than the scalar length({})",
Self::output_size(),
scalar_len
);
// Try and increment.
for i in 0u32.. {
let starting_state = self.clone();
[Review thread on the .clone() above]

Contributor: I noticed that we have to add + Clone in a lot of places because of this .clone(). It's not really critical, but we could eliminate it by finding the first H(data || 0x1 || 0x2 || ... || i) instead of H(data || i). Are there any downsides to this approach?

Contributor (author): Good question, I'll try to think about whether this could introduce any attacks. (Small note: AFAIK all hash functions implement Clone.)

Contributor: Yeah, all hashes must implement Clone. My suggestion is only about saving a few keystrokes. If you think it might introduce any attacks, let's leave it as is.

Contributor (@survived, Aug 10, 2021): Btw, do we even need to specify the + Clone constraint? According to the documentation, the Digest trait is implemented for any D where D: Clone + FixedOutput + .... Can't Rust deduce that any Digest implements Clone? See https://docs.rs/digest/0.9.0/digest/trait.Digest.html#impl-Digest

Contributor (author): It's implemented for such types, but it doesn't inherit the bound :/ I could instead write D: FixedOutput + Update + Clone, but sadly I can't remove Clone.

let hash = starting_state.chain(i.to_be_bytes()).finalize();
if let Ok(scalar) = Scalar::from_bytes(&hash[..scalar_len]) {
return scalar;
}
}
unreachable!("The probability of reaching this point is negligible: ((2^n - q) / 2^n)^(2^32)")
}

fn digest_bigint(bytes: &[u8]) -> BigInt {
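The motivation for replacing Scalar::from_bigint with rejection sampling: reducing a digest mod q is biased whenever the digest range is not an exact multiple of q, because low residues are hit one extra time. A tiny illustration with an 8-bit "digest" space and q = 100 — the numbers are illustrative, not the real field:

```rust
fn main() {
    // Tiny analogue: "digests" are uniform in [0, 256), the "order" q = 100.
    let q: u32 = 100;

    // Naive reduction: scalar = digest mod q.
    let mut counts = vec![0u32; q as usize];
    for digest in 0u32..256 {
        counts[(digest % q) as usize] += 1;
    }
    // 256 = 2*100 + 56, so residues 0..=55 are hit 3 times (digest, digest+100,
    // digest+200 all fit below 256) while residues 56..=99 are hit only twice.
    assert_eq!(counts[0], 3);
    assert_eq!(counts[55], 3);
    assert_eq!(counts[56], 2);
    assert_eq!(counts[99], 2);

    // Rejection sampling (try-and-increment) keeps only digests < q,
    // so every accepted scalar is produced by exactly one digest.
    let mut counts = vec![0u32; q as usize];
    for digest in (0u32..256).filter(|&d| d < q) {
        counts[digest as usize] += 1;
    }
    assert!(counts.iter().all(|&c| c == 1));
    println!("mod-q reduction is biased; rejection sampling is uniform");
}
```

With a 512-bit Blake2b digest truncated to the scalar length the bias is far smaller than in this toy, but rejection sampling removes it entirely at the cost of an occasional retry.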
@@ -79,7 +79,7 @@ impl<E: Curve> LdeiProof<E> {
statement: &LdeiStatement<E>,
) -> Result<LdeiProof<E>, InvalidLdeiStatement>
where
H: Digest,
H: Digest + Clone,
{
if statement.alpha.len() != statement.g.len() {
return Err(InvalidLdeiStatement::AlphaLengthDoesntMatchG);
@@ -129,7 +129,7 @@ impl<E: Curve> LdeiProof<E> {
/// true, otherwise rejects.
pub fn verify<H>(&self, statement: &LdeiStatement<E>) -> Result<(), ProofError>
where
H: Digest,
H: Digest + Clone,
{
let e = H::new()
.chain_points(&statement.g)
16 changes: 9 additions & 7 deletions src/elliptic/curves/bls12_381/g1.rs
@@ -8,6 +8,7 @@
use std::fmt;

use ff_zeroize::PrimeField;
use generic_array::GenericArray;
use pairing_plus::bls12_381::{G1Compressed, G1Uncompressed, G1};
use pairing_plus::hash_to_curve::HashToCurve;
use pairing_plus::hash_to_field::ExpandMsgXmd;
@@ -72,8 +73,8 @@ impl ECPoint for G1Point {
type Scalar = FieldScalar;
type Underlying = PK;

type CompressedPoint = G1Compressed;
type UncompressedPoint = G1Uncompressed;
type CompressedPointLength = typenum::U48;
type UncompressedPointLength = typenum::U96;

fn zero() -> G1Point {
G1Point {
@@ -150,12 +151,12 @@
}
}

fn serialize_compressed(&self) -> Self::CompressedPoint {
G1Compressed::from_affine(self.ge)
fn serialize_compressed(&self) -> GenericArray<u8, Self::CompressedPointLength> {
*GenericArray::from_slice(G1Compressed::from_affine(self.ge).as_ref())
}

fn serialize_uncompressed(&self) -> Self::UncompressedPoint {
G1Uncompressed::from_affine(self.ge)
fn serialize_uncompressed(&self) -> GenericArray<u8, Self::UncompressedPointLength> {
*GenericArray::from_slice(G1Uncompressed::from_affine(self.ge).as_ref())
}

fn deserialize(bytes: &[u8]) -> Result<G1Point, DeserializationError> {
@@ -290,7 +291,8 @@ mod tests {
// Generate base_point2
let cs = &[1u8];
let msg = &[1u8];
let point = <G1 as HashToCurve<ExpandMsgXmd<old_sha2::Sha256>>>::hash_to_curve(msg, cs).into_affine();
let point = <G1 as HashToCurve<ExpandMsgXmd<old_sha2::Sha256>>>::hash_to_curve(msg, cs)
.into_affine();
assert!(point.in_subgroup());

// Print in uncompressed form
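The serialization changes across these files replace ad-hoc associated types (G1Compressed, [u8; 32], …) with GenericArray plus typenum lengths, so the encoded length is carried in the type rather than rediscovered at run time. A rough sketch of the same idea using const generics — a stand-in for GenericArray, which targets pre-const-generics Rust; the names here are illustrative, not the crate's API:

```rust
/// The compressed encoding's length is part of the type, mirroring
/// `type CompressedPointLength = typenum::U48` from the diff.
struct Compressed<const N: usize>([u8; N]);

impl<const N: usize> Compressed<N> {
    /// Panics if the slice has the wrong length, like GenericArray::from_slice.
    fn from_slice(bytes: &[u8]) -> Self {
        let mut buf = [0u8; N];
        buf.copy_from_slice(bytes);
        Compressed(buf)
    }

    /// The length is a compile-time constant, no run-time bookkeeping needed.
    fn len(&self) -> usize {
        N
    }
}

fn main() {
    // A compressed BLS12-381 G1 point is 48 bytes (typenum::U48 in the diff).
    let raw = [0u8; 48];
    let point: Compressed<48> = Compressed::from_slice(&raw);
    assert_eq!(point.len(), 48);
    println!("compressed length known at compile time: {}", point.len());
}
```

This is why result_scalar can take `&hash[..scalar_len]` safely: the scalar length comes from the type system (Unsigned::to_usize), not from inspecting a buffer.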
16 changes: 9 additions & 7 deletions src/elliptic/curves/bls12_381/g2.rs
@@ -8,6 +8,7 @@
use std::fmt;

use ff_zeroize::{PrimeField, ScalarEngine};
use generic_array::GenericArray;
use pairing_plus::bls12_381::{G2Compressed, G2Uncompressed, G2};
use pairing_plus::hash_to_curve::HashToCurve;
use pairing_plus::hash_to_field::ExpandMsgXmd;
@@ -79,8 +80,8 @@ impl ECPoint for G2Point {
type Scalar = FieldScalar;
type Underlying = PK;

type CompressedPoint = G2Compressed;
type UncompressedPoint = G2Uncompressed;
type CompressedPointLength = typenum::U96;
type UncompressedPointLength = typenum::U192;

fn zero() -> G2Point {
G2Point {
@@ -157,12 +158,12 @@
}
}

fn serialize_compressed(&self) -> Self::CompressedPoint {
G2Compressed::from_affine(self.ge)
fn serialize_compressed(&self) -> GenericArray<u8, Self::CompressedPointLength> {
*GenericArray::from_slice(G2Compressed::from_affine(self.ge).as_ref())
}

fn serialize_uncompressed(&self) -> Self::UncompressedPoint {
G2Uncompressed::from_affine(self.ge)
fn serialize_uncompressed(&self) -> GenericArray<u8, Self::UncompressedPointLength> {
*GenericArray::from_slice(G2Uncompressed::from_affine(self.ge).as_ref())
}

fn deserialize(bytes: &[u8]) -> Result<G2Point, DeserializationError> {
@@ -292,7 +293,8 @@ mod tests {
// Generate base_point2
let cs = &[1u8];
let msg = &[1u8];
let point = <G2 as HashToCurve<ExpandMsgXmd<old_sha2::Sha256>>>::hash_to_curve(msg, cs).into_affine();
let point = <G2 as HashToCurve<ExpandMsgXmd<old_sha2::Sha256>>>::hash_to_curve(msg, cs)
.into_affine();
assert!(point.in_subgroup());

// Print in uncompressed form
9 changes: 5 additions & 4 deletions src/elliptic/curves/bls12_381/scalar.rs
@@ -1,6 +1,7 @@
use std::fmt;

use ff_zeroize::{Field, PrimeField, PrimeFieldRepr, ScalarEngine};
use generic_array::GenericArray;
use pairing_plus::bls12_381::{Fr, FrRepr};
use rand::rngs::OsRng;
use zeroize::Zeroizing;
@@ -38,7 +39,7 @@ pub struct FieldScalar {
impl ECScalar for FieldScalar {
type Underlying = SK;

type ScalarBytes = [u8; 32];
type ScalarLength = typenum::U32;

fn random() -> FieldScalar {
FieldScalar {
@@ -75,11 +76,11 @@ impl ECScalar for FieldScalar {
BigInt::from_bytes(&bytes)
}

fn serialize(&self) -> Self::ScalarBytes {
fn serialize(&self) -> GenericArray<u8, Self::ScalarLength> {
let repr = self.fe.into_repr();
let mut bytes = [0u8; SECRET_KEY_SIZE];
repr.write_be(&mut bytes[..]).unwrap();
bytes
GenericArray::from(bytes)
}

fn deserialize(bytes: &[u8]) -> Result<Self, DeserializationError> {
@@ -90,7 +91,7 @@ impl ECScalar for FieldScalar {
repr.read_be(bytes.as_ref()).unwrap();
Ok(FieldScalar {
purpose: "deserialize",
fe: Fr::from_repr(repr).unwrap().into(),
fe: Fr::from_repr(repr).or(Err(DeserializationError))?.into(),
})
}

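The from_repr change above swaps an unwrap for error propagation: a malformed scalar encoding now surfaces as DeserializationError instead of a panic. A toy sketch of the pattern — from_repr and the bound here are made up for illustration, standing in for pairing_plus's Fr::from_repr:

```rust
#[derive(Debug, PartialEq)]
struct DeserializationError;

/// Stand-in for Fr::from_repr: accepts only values below a toy modulus,
/// returning the library's own (here unit) error otherwise.
fn from_repr(x: u64) -> Result<u64, ()> {
    const Q: u64 = 1000;
    if x < Q { Ok(x) } else { Err(()) }
}

/// Before: from_repr(x).unwrap() — panics on out-of-range input.
/// After: the library error is swapped for the crate's own error type and
/// propagated with `?`, as in `Fr::from_repr(repr).or(Err(DeserializationError))?`.
fn deserialize(x: u64) -> Result<u64, DeserializationError> {
    let fe = from_repr(x).or(Err(DeserializationError))?;
    Ok(fe)
}

fn main() {
    assert_eq!(deserialize(42), Ok(42));
    assert_eq!(deserialize(5000), Err(DeserializationError));
    println!("invalid encodings now return Err instead of panicking");
}
```

This matters for try-and-increment: deserialize is exactly the call result_scalar probes in a loop, so rejected candidates must come back as Err, not as a crash.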
19 changes: 10 additions & 9 deletions src/elliptic/curves/curve_ristretto.rs
@@ -13,6 +13,7 @@ use std::sync::atomic;
use curve25519_dalek::constants::{BASEPOINT_ORDER, RISTRETTO_BASEPOINT_POINT};
use curve25519_dalek::ristretto::CompressedRistretto;
use curve25519_dalek::traits::{Identity, IsIdentity};
use generic_array::GenericArray;
use rand::thread_rng;
use sha2::{Digest, Sha256};
use zeroize::{Zeroize, Zeroizing};
@@ -83,7 +84,7 @@ impl Curve for Ristretto {
impl ECScalar for RistrettoScalar {
type Underlying = SK;

type ScalarBytes = [u8; 32];
type ScalarLength = typenum::U32;

fn random() -> RistrettoScalar {
RistrettoScalar {
@@ -118,8 +119,8 @@ impl ECScalar for RistrettoScalar {
BigInt::from_bytes(&t)
}

fn serialize(&self) -> Self::ScalarBytes {
self.fe.to_bytes()
fn serialize(&self) -> GenericArray<u8, Self::ScalarLength> {
GenericArray::from(self.fe.to_bytes())
}

fn deserialize(bytes: &[u8]) -> Result<Self, DeserializationError> {
@@ -209,8 +210,8 @@ impl ECPoint for RistrettoPoint {
type Scalar = RistrettoScalar;
type Underlying = PK;

type CompressedPoint = [u8; 32];
type UncompressedPoint = [u8; 32];
type CompressedPointLength = typenum::U32;
type UncompressedPointLength = typenum::U32;

fn zero() -> RistrettoPoint {
RistrettoPoint {
@@ -253,12 +254,12 @@ impl ECPoint for RistrettoPoint {
None
}

fn serialize_compressed(&self) -> Self::CompressedPoint {
self.ge.compress().to_bytes()
fn serialize_compressed(&self) -> GenericArray<u8, Self::CompressedPointLength> {
GenericArray::from(self.ge.compress().to_bytes())
}

fn serialize_uncompressed(&self) -> Self::UncompressedPoint {
self.ge.compress().to_bytes()
fn serialize_uncompressed(&self) -> GenericArray<u8, Self::UncompressedPointLength> {
GenericArray::from(self.ge.compress().to_bytes())
}

fn deserialize(bytes: &[u8]) -> Result<RistrettoPoint, DeserializationError> {
29 changes: 13 additions & 16 deletions src/elliptic/curves/ed25519.rs
@@ -9,14 +9,11 @@
// based on https://docs.rs/cryptoxide/0.1.0/cryptoxide/curve25519/index.html
// https://cr.yp.to/ecdh/curve25519-20060209.pdf

use std::fmt;
use std::fmt::Debug;
use std::ops;
use std::ptr;
use std::str;
use std::sync::atomic;
use std::{fmt, ops, ptr, str};

use cryptoxide::curve25519::*;
use generic_array::GenericArray;
use zeroize::{Zeroize, Zeroizing};

use crate::arithmetic::traits::*;
@@ -126,7 +123,7 @@ impl Curve for Ed25519 {
impl ECScalar for Ed25519Scalar {
type Underlying = SK;

type ScalarBytes = [u8; 32];
type ScalarLength = typenum::U32;

// we chose to multiply by 8 (co-factor) all group elements to work in the prime order sub group.
// each random fe is having its 3 first bits zeroed
@@ -173,8 +170,8 @@ impl ECScalar for Ed25519Scalar {
BigInt::from_bytes(&t)
}

fn serialize(&self) -> Self::ScalarBytes {
self.fe.to_bytes()
fn serialize(&self) -> GenericArray<u8, Self::ScalarLength> {
GenericArray::from(self.fe.to_bytes())
}

fn deserialize(bytes: &[u8]) -> Result<Self, DeserializationError> {
@@ -264,7 +261,7 @@ impl ECScalar for Ed25519Scalar {
}
}

impl Debug for Ed25519Scalar {
impl fmt::Debug for Ed25519Scalar {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(
f,
@@ -281,7 +278,7 @@ impl PartialEq for Ed25519Scalar {
}
}

impl Debug for Ed25519Point {
impl fmt::Debug for Ed25519Point {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(
f,
@@ -309,8 +306,8 @@ impl ECPoint for Ed25519Point {
type Underlying = PK;
type Scalar = Ed25519Scalar;

type CompressedPoint = [u8; 32];
type UncompressedPoint = [u8; 32];
type CompressedPointLength = typenum::U32;
type UncompressedPointLength = typenum::U32;

fn zero() -> Ed25519Point {
*ZERO
@@ -365,12 +362,12 @@ impl ECPoint for Ed25519Point {
Some(PointCoords { x: xrecover(&y), y })
}

fn serialize_compressed(&self) -> Self::CompressedPoint {
self.ge.to_bytes()
fn serialize_compressed(&self) -> GenericArray<u8, Self::CompressedPointLength> {
GenericArray::from(self.ge.to_bytes())
}

fn serialize_uncompressed(&self) -> Self::UncompressedPoint {
self.ge.to_bytes()
fn serialize_uncompressed(&self) -> GenericArray<u8, Self::UncompressedPointLength> {
GenericArray::from(self.ge.to_bytes())
}

fn deserialize(bytes: &[u8]) -> Result<Ed25519Point, DeserializationError> {