Readers interested in DIZK can find the source code at https://github.com/scipr-lab/dizk. The repository has seen little recent activity; the last patch dates from the end of 2018:

```
commit 81d72a3406e2500561551cbf4d651f230146bb92
Merge: e98dd9c 7afd0be
Author: Howard Wu [emailprotected]
Date:   Wed Dec 12 11:17:57 2018 -0800

    Merge pull request #6 from gnosis/profiler_error

    #5 - Throw Error when profiler has wrong APP parameter.
```

1. Source Code Structure

DIZK is a distributed zero-knowledge proof system built in Java on top of the Spark framework. The source tree is organized as follows:

algebra - the arithmetic building blocks: elliptic curves, fields/groups, FFT, and multi-scalar multiplication (fixed-base MSM and variable-base MSM).
bace - the implementation related to batch computation ("bace").
relations - representations of circuits: R1CS and QAP.
reductions - the reduction from R1CS to QAP.
zk_proof_systems - the Groth16 proof system.
profiler - the benchmarking logic.

Readers familiar with libsnark should find these terms quite familiar.
2. DistributedSetup

DistributedSetup implements the distributed Setup logic: main/java/zk_proof_systems/zkSNARK/DistributedSetup.java. The generate function implements the generation of the proving key (Pk) and verification key (Vk):

```java
public static <FieldT extends AbstractFieldElementExpanded<FieldT>,
        G1T extends AbstractG1<G1T>,
        G2T extends AbstractG2<G2T>,
        GTT extends AbstractGT<GTT>,
        PairingT extends AbstractPairing<G1T, G2T, GTT>>
        CRS<FieldT, G1T, G2T, GTT> generate(
        final R1CSRelationRDD<FieldT> r1cs,
        final FieldT fieldFactory,
        final G1T g1Factory,
        final G2T g2Factory,
        final PairingT pairing,
        final Configuration config) {
```

2.1 R1CS-to-QAP Reduction

```java
final QAPRelationRDD<FieldT> qap = R1CStoQAPRDD.R1CStoQAPRelation(r1cs, t, config);
```

Note that the QAP is represented by the QAPRelationRDD class.
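To make the R1CS-to-QAP idea concrete, here is a minimal sketch over a toy prime field (modulus 97, chosen only for illustration; it is not related to DIZK's curves). The two constraints x*x = t1 and t1*t1 = out, interpolated over the domain {1, 2}, yield the polynomials shown in the comments (worked out by hand, not taken from DIZK); the check that a(t)*b(t) - c(t) = H(t)*Z(t) at a point t is exactly the divisibility property the QAP encodes.

```java
import java.math.BigInteger;

public class ToyQAP {
    static final BigInteger P = BigInteger.valueOf(97); // toy field modulus (illustration only)

    static BigInteger f(long v) { return BigInteger.valueOf(v).mod(P); }

    public static void main(String[] args) {
        // Constraints over the witness (one, x, t1, out) = (1, 2, 4, 16):
        //   x  * x  = t1
        //   t1 * t1 = out
        // Lagrange-interpolated over the domain {1, 2}, these give (by hand):
        //   a(X) = 2X, b(X) = 2X, c(X) = 12X - 8, Z(X) = (X-1)(X-2), H(X) = 4
        BigInteger t = f(5); // evaluation point t (the Setup's secret point in real Groth16)
        BigInteger a = f(2).multiply(t).mod(P);
        BigInteger b = f(2).multiply(t).mod(P);
        BigInteger c = f(12).multiply(t).subtract(f(8)).mod(P);
        BigInteger z = t.subtract(f(1)).multiply(t.subtract(f(2))).mod(P);
        BigInteger h = f(4);
        // QAP divisibility check: a(t)*b(t) - c(t) == H(t)*Z(t)
        BigInteger lhs = a.multiply(b).subtract(c).mod(P);
        BigInteger rhs = h.multiply(z).mod(P);
        System.out.println(lhs.equals(rhs));
    }
}
```

The real reduction does the interpolation with distributed FFTs over RDDs; the toy version only shows what identity the result must satisfy.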
2.2 Computing deltaABC/gammaABC

```java
final JavaPairRDD<Long, FieldT> betaAt = qap.At().mapValues(a -> a.mul(beta));
final JavaPairRDD<Long, FieldT> alphaBt = qap.Bt().mapValues(b -> b.mul(alpha));
final JavaPairRDD<Long, FieldT> ABC = betaAt.union(alphaBt).union(qap.Ct())
        .reduceByKey(FieldT::add).persist(config.storageLevel());
final JavaPairRDD<Long, FieldT> gammaABC = ABC.filter(e -> e._1 < numInputs)
        .mapValues(e -> e.mul(inverseGamma));
final JavaPairRDD<Long, FieldT> deltaABC = ABC.filter(e -> e._1 >= numInputs)
        .mapValues(e -> e.mul(inverseDelta));
```

2.3 Counting the Density of At/Bt

```java
final long numNonZeroAt = qap.At().filter(e -> !e._2.isZero()).count();
final long numNonZeroBt = qap.Bt().filter(e -> !e._2.isZero()).count();
```

2.4 Computing the Fixed-Base MSM (G1/G2)

```java
final G1T generatorG1 = g1Factory.random(config.seed(), config.secureSeed());
final int scalarSizeG1 = generatorG1.bitSize();
final long scalarCountG1 = numNonZeroAt + numNonZeroBt + numVariables;
final int windowSizeG1 = FixedBaseMSM.getWindowSize(scalarCountG1 / numPartitions, generatorG1);
final List<List<G1T>> windowTableG1 = FixedBaseMSM.getWindowTable(generatorG1, scalarSizeG1, windowSizeG1);
```

The above computes the fixed-base MSM window table for G1 (G2 is similar). Note that windowSizeG1 is chosen from the total number of "non-zero" coefficients divided by the number of partitions.

2.5 Generating the CRS (Pk/Vk)

```java
final ProvingKeyRDD<FieldT, G1T, G2T> provingKey = new ProvingKeyRDD<>(
        alphaG1, betaG1, betaG2, deltaG1, deltaG2,
        deltaABCG1, queryA, queryB, queryH, r1cs);
final VerificationKey<G1T, G2T, GTT> verificationKey = new VerificationKey<>(
        alphaG1betaG2, gammaG2, deltaG2, UVWGammaG1);
```

Note: the proving key is represented with RDDs (ProvingKeyRDD).

3. DistributedProver

DistributedProver implements the distributed Prover logic: main/java/zk_proof_systems/zkSNARK/DistributedProver.java. The prove function implements proof generation.
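The window-table trick in step 2.4 can be sketched in a few lines. This is a toy stand-in, not DIZK's FixedBaseMSM: the "group" is the integers mod Q under addition (a real implementation uses elliptic-curve points), and all names are hypothetical. The table stores j * 2^(i*w) * g for every window position i and window value j, so a scalar multiplication becomes one table lookup and one group addition per window.

```java
import java.util.ArrayList;
import java.util.List;

public class FixedBaseWindow {
    // Toy stand-in for a group: integers mod Q under addition, generator g.
    static final long Q = 1_000_003L;

    static long add(long a, long b) { return (a + b) % Q; }

    // table.get(i).get(j) == (j * 2^(i*w)) * g, like a FixedBaseMSM window table
    static List<List<Long>> windowTable(long g, int scalarBits, int w) {
        int numWindows = (scalarBits + w - 1) / w;
        List<List<Long>> table = new ArrayList<>();
        long windowBase = g % Q; // equals (2^(i*w)) * g for the current window i
        for (int i = 0; i < numWindows; i++) {
            List<Long> row = new ArrayList<>();
            long acc = 0;
            for (int j = 0; j < (1 << w); j++) { // all 2^w multiples for this window
                row.add(acc);
                acc = add(acc, windowBase);
            }
            table.add(row);
            for (int k = 0; k < w; k++) windowBase = add(windowBase, windowBase); // shift by w bits
        }
        return table;
    }

    // scalar * g using only table lookups and one addition per window
    static long mul(List<List<Long>> table, int w, long scalar) {
        long result = 0;
        for (int i = 0; i < table.size(); i++) {
            int chunk = (int) ((scalar >> (i * w)) & ((1 << w) - 1)); // w-bit window of scalar
            result = add(result, table.get(i).get(chunk));
        }
        return result;
    }

    public static void main(String[] args) {
        long g = 12345, scalar = 54321;
        int w = 4, bits = 16;
        List<List<Long>> table = windowTable(g, bits, w);
        System.out.println(mul(table, w, scalar) == (g * scalar) % Q);
    }
}
```

The precomputation cost is paid once per generator, which is why Setup sizes the window from the number of scalars each partition will multiply against the same base.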
```java
public static <FieldT extends AbstractFieldElementExpanded<FieldT>,
        G1T extends AbstractG1<G1T>,
        G2T extends AbstractG2<G2T>>
        Proof<G1T, G2T> prove(
```

3.1 Generating the Witness

```java
final QAPWitnessRDD<FieldT> qapWitness = R1CStoQAPRDD.R1CStoQAPWitness(
        provingKey.r1cs(), primary, oneFullAssignment, fieldFactory, config);
```

QAPWitnessRDD is defined in main/java/relations/qap/QAPWitnessRDD.java and includes the input assignment as well as the coefficients of the polynomial H (obtained via FFT).

3.2 Generating Randomness

```java
final FieldT r = fieldFactory.random(config.seed(), config.secureSeed());
final FieldT s = fieldFactory.random(config.seed(), config.secureSeed());
```

3.3 Computing the Evaluations

```java
final JavaRDD<Tuple2<FieldT, G1T>> computationA = ...
        .join(provingKey.queryA(), numPartitions).values();
final G1T evaluationAt = VariableBaseMSM.distributedMSM(computationA);
```

The evaluations of A/B/deltaABC and of H are all computed via VariableBaseMSM.
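What distributedMSM computes is a variable-base MSM: given (scalar, base) pairs, return sum_i scalar_i * base_i. A minimal sketch, again with integers mod Q standing in for curve points and a Java stream standing in for the Spark RDD (both are assumptions for illustration):

```java
import java.util.List;

public class VarMSMSketch {
    static final long Q = 1_000_003L; // toy additive group mod Q, stand-in for G1 points

    public static void main(String[] args) {
        // (scalar, base) pairs, like the joined RDD<Tuple2<FieldT, G1T>> fed to distributedMSM
        List<long[]> pairs = List.of(
                new long[]{3, 10},
                new long[]{5, 20},
                new long[]{7, 30});
        long sum = pairs.stream()
                .mapToLong(p -> (p[0] * p[1]) % Q) // map: per-pair scalar multiplication
                .reduce(0, (a, b) -> (a + b) % Q); // reduce: group addition, associative
        System.out.println(sum); // 3*10 + 5*20 + 7*30 = 340
    }
}
```

Because the reduce step is associative, Spark can evaluate each partition's partial sum independently and combine them, which is what makes the MSM distribute well.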
3.4 Generating the Proof

```java
// A = alpha + sum_i(a_i*A_i(t)) + r*delta
final G1T A = alphaG1.add(evaluationAt).add(deltaG1.mul(r));
// B = beta + sum_i(a_i*B_i(t)) + s*delta
final Tuple2<G1T, G2T> B = new Tuple2<>(
        betaG1.add(evaluationBt._1).add(deltaG1.mul(s)),
        betaG2.add(evaluationBt._2).add(deltaG2.mul(s)));
// C = (sum_i(a_i*(beta*A_i(t) + alpha*B_i(t) + C_i(t))) + H(t)*Z(t))/delta + A*s + B*r - r*s*delta
final G1T C = evaluationABC.add(A.mul(s)).add(B._1.mul(r)).sub(rsDelta);
```

4. Profiling

main/java/profiler/Profiler.java is the entry point for benchmarking. It provides performance tests for the various operators, in both distributed and single-machine versions. The scripts directory also provides scripts for running on Spark.
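The A/B/C formulas above can be checked "in the exponent": replacing every group element by its discrete log turns the pairing-based verification equation e(A,B) = e(alpha,beta) * e(C,delta) (simplified here to the case with no public inputs, an assumption for brevity) into plain field arithmetic. The following sketch uses a toy prime field and arbitrary toy values; only the algebraic structure mirrors DistributedProver.

```java
import java.math.BigInteger;

public class Groth16Exponents {
    static final BigInteger P = BigInteger.valueOf(7919); // toy prime field (illustration only)

    static BigInteger f(long v) { return BigInteger.valueOf(v).mod(P); }
    static BigInteger mul(BigInteger a, BigInteger b) { return a.multiply(b).mod(P); }
    static BigInteger add(BigInteger a, BigInteger b) { return a.add(b).mod(P); }
    static BigInteger sub(BigInteger a, BigInteger b) { return a.subtract(b).mod(P); }

    public static void main(String[] args) {
        // trapdoor values and prover randomness (arbitrary toy choices)
        BigInteger alpha = f(11), beta = f(13), delta = f(17), r = f(19), s = f(23);
        // QAP evaluations at t: aE = sum_i(a_i*A_i(t)), etc.
        BigInteger aE = f(29), bE = f(31), cE = f(37);
        BigInteger hz = sub(mul(aE, bE), cE); // QAP identity: aE*bE - cE = H(t)*Z(t)
        BigInteger deltaInv = delta.modInverse(P);

        // proof elements in the exponent, mirroring the comments in 3.4
        BigInteger A = add(add(alpha, aE), mul(r, delta));
        BigInteger B = add(add(beta, bE), mul(s, delta));
        BigInteger cNum = add(add(mul(beta, aE), mul(alpha, bE)), add(cE, hz));
        BigInteger C = sub(
                add(mul(cNum, deltaInv), add(mul(A, s), mul(B, r))),
                mul(mul(r, s), delta));

        // verifier equation in the exponent (no public inputs): A*B == alpha*beta + C*delta
        BigInteger lhs = mul(A, B);
        BigInteger rhs = add(mul(alpha, beta), mul(C, delta));
        System.out.println(lhs.equals(rhs));
    }
}
```

Expanding A*B shows why the r*s*delta correction term in C is needed: without it the cross terms of the two blinding factors would not cancel.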
As configured, DIZK is benchmarked on Amazon EC2 machines:

```shell
./spark-ec2/copy-dir /home/ec2-user/
export JAVA_HOME="/usr/lib/jvm/java-1.8.0"
for TOTAL_CORES in 8; do
  for SIZE in `seq 15 25`; do
    export APP=dizk-large
    export MEMORY=16G
    export MULTIPLIER=2
    export CORES=1
    export NUM_EXECUTORS=$((TOTAL_CORES / CORES))
    export NUM_PARTITIONS=$((TOTAL_CORES * MULTIPLIER))
    /root/spark/bin/spark-submit --conf spark.driver.memory=$MEMORY ... \
        --class "profiler.Profiler" /home/ec2-user/dizk-1.0.jar \
        $NUM_EXECUTORS $CORES $MEMORY $APP $SIZE $NUM_PARTITIONS
  done
done
```

Interested readers can adapt the script to run DIZK in a local environment.
Summary:

DIZK is a distributed zero-knowledge proof system built on the Spark big-data computation framework. The code is quite clear and the comments are fairly complete. DistributedSetup and DistributedProver implement the Setup and the Prover, respectively.
DIZK also provides complete profiling code.