Let $s = \log m,~ s' = \log n$.
where $\sigma_j,~ \theta_j$ are the checks from LCCCS and CCCS respectively, with $x = r_x'$.
\item V: $e_1 \leftarrow \widetilde{eq}(r_x, r_x')$, $e_2 \leftarrow \widetilde{eq}(\beta, r_x')$ \\
check:
$$ c = \left( \sum_{j \in [t]} \gamma^j \cdot e_1 \cdot \sigma_j \right) + \gamma^{t+1} \cdot e_2 \cdot \left( \sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \theta_j \right) $$
which should be equivalent to the $g(x)$ computed by $V, P$ in the sum-check protocol.
\item $V \rightarrow P$: $\rho \in_R \mathbb{F}$
\item $V, P$: output the folded LCCCS instance $(C', u', \mathsf{x}', r_x', v_1', \ldots, v_t')$, where $\forall i \in [t]$:
\begin{align*}
C' & \leftarrow C_1 + \rho \cdot C_2\\
u' & \leftarrow u + \rho \cdot 1\\
\mathsf{x}' & \leftarrow \mathsf{x}_1 + \rho \cdot \mathsf{x}_2\\
v_i' & \leftarrow \sigma_i + \rho \cdot \theta_i
\end{align*}
\item $P$: output the folded witness and the folded $r_w'$ (the randomness used for the witness commitment $C$):
\begin { align*}
\widetilde { w} ' & \leftarrow \widetilde { w} _ 1 + \rho \cdot \widetilde { w} _ 2\\
r_ w' & \leftarrow r_ { w_ 1} + \rho \cdot r_ { w_ 2}
\end { align*}
\end{enumerate}
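As a toy sanity check of the folds in steps 7 and 8, a minimal Python sketch over a small prime field; all concrete values (the field size, $\rho$, the $\sigma_j$, $\theta_j$) are illustrative stand-ins, not protocol data:

```python
# Toy fold of one LCCCS and one CCCS instance (steps 7-8), over F_p.
# Every folded quantity is the same random linear combination a + rho*b.
p = 101  # tiny prime, for illustration only

def fold(a, b, rho):
    """Random linear combination a + rho * b (mod p)."""
    return (a + rho * b) % p

rho = 7
u1 = 5                    # u of the LCCCS instance; the CCCS side has u = 1
sigma = [3, 9]            # sigma_j from the LCCCS side (t = 2)
theta = [4, 6]            # theta_j from the CCCS side

u_folded = fold(u1, 1, rho)
v_folded = [fold(s, th, rho) for s, th in zip(sigma, theta)]
print(u_folded, v_folded)  # 12 [31, 51]
```

The witness fold $\widetilde{w}' \leftarrow \widetilde{w}_1 + \rho \cdot \widetilde{w}_2$ and the commitment-randomness fold $r_w'$ are the same `fold`, applied coordinate-wise.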
where $e_1 = \widetilde{eq}(r_x, r_x')$ and $e_2 = \widetilde{eq}(\beta, r_x')$.
This is the check that $V$ performs at step $5$.
\subsection { Multifolding for multiple instances}
The multifolding of multiple LCCCS \& CCCS instances is not shown in the HyperNova paper, but Srinath Setty gave an overview in the PSE HyperNova presentation. This section unfolds it.
We're going to do this example with parameters \textcolor { orange} { LCCCS: $ \mu = 2 $ } , \textcolor { cyan} { CCCS: $ \nu = 2 $ } , which means that we have 2 LCCCS instances and 2 CCCS instances.
Assume we have 4 $ z $ vectors, $ z _ 1 ,~ \textcolor { orange } { z _ 2 } $ for the two LCCCS instances, and $ z _ 3 ,~ \textcolor { cyan } { z _ 4 } $ for the two CCCS instances, where $ z _ 1 ,~z _ 3 $ are the vectors that we already had in the example with $ \mu = 1 , \nu = 1 $ , and $ z _ 2 ,~z _ 4 $ are the extra ones that we're adding now.
In \emph { step 3} of the multifolding with more than one LCCCS and more than one CCCS instances, we have:
\begin { align*}
g(x) & := \left ( \sum _ { j \in [t]} \gamma ^ j \cdot L_ { 1,j} (x) + \textcolor { orange} { \gamma ^ { t+j} \cdot L_ { 2,j} (x)} \right )
+ \gamma ^ { 2t+1} \cdot Q_ 1(x) + \textcolor { cyan} { \gamma ^ { 2t+2} \cdot Q_ 2(x)} \\
& L_ { 1,j} (x) := \widetilde { eq} (r_ { 1,x} , x) \cdot \left (
\sum _ { y \in \{ 0,1\} ^ { s'} } \widetilde { M} _ j(x, y) \cdot \widetilde { z} _ 1(y)
\right )\\
& \textcolor { orange} { L_ { 2,j} (x)} := \widetilde { eq} (\textcolor { orange} { r_ { 2,x} } , x) \cdot \left (
\sum _ { y \in \{ 0,1\} ^ { s'} } \widetilde { M} _ j(x, y) \cdot \textcolor { orange} { \widetilde { z} _ 2(y)}
\right )\\
& Q_ 1(x) := \widetilde { eq} (\beta , x) \cdot \left (
\sum _ { i=1} ^ q c_ i \cdot \prod _ { j \in S_ i} \left ( \sum _ { y \in \{ 0, 1\} ^ { s'} } \widetilde { M} _ j(x, y) \cdot \widetilde { z} _ 3(y) \right )\right )\\
& \textcolor { cyan} { Q_ 2(x)} := \widetilde { eq} (\textcolor { cyan} { \beta '} , x) \cdot \left (
\sum _ { i=1} ^ q c_ i \cdot \prod _ { j \in S_ i} \left ( \sum _ { y \in \{ 0, 1\} ^ { s'} } \widetilde { M} _ j(x, y) \cdot \textcolor { cyan} { \widetilde { z} _ 4(y)} \right )\right )
\end { align*}
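The bookkeeping of the $\gamma$ powers is easy to get wrong; the following Python sketch (with arbitrary toy $\mu,~\nu,~t$) just enumerates which power of $\gamma$ multiplies each term. Note the $(i-1) \cdot t$ offset, so that the $i = 1$ row reproduces the $\gamma^j$ terms of the $\mu = 2,~\nu = 2$ example above:

```python
# Powers of gamma in g(x): L_{i,j} gets gamma^{(i-1)*t + j},
# Q_k gets gamma^{mu*t + k}; all powers must be pairwise distinct.
def gamma_powers(mu, nu, t):
    L = {(i, j): (i - 1) * t + j
         for i in range(1, mu + 1) for j in range(1, t + 1)}
    Q = {k: mu * t + k for k in range(1, nu + 1)}
    return L, Q

L, Q = gamma_powers(mu=2, nu=2, t=3)
print(L[(1, 1)], L[(2, 1)], Q[1], Q[2])  # 1 4 7 8
```

For $\mu = 2,~\nu = 2$ this yields $\gamma^j$, $\gamma^{t+j}$, $\gamma^{2t+1}$, $\gamma^{2t+2}$, matching the expression above.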
\framebox { \begin { minipage} { 4.3 in}
A generic definition of $g(x)$ for $\mu > 1,~ \nu > 1$ would be
$$
g(x) := \left( \sum_{i \in [\mu]} \left( \sum_{j \in [t]} \gamma^{(i-1) \cdot t + j} \cdot L_{i,j}(x) \right) \right)
+ \left( \sum_{i \in [\nu]} \gamma^{\mu \cdot t + i} \cdot Q_i(x) \right)
$$
\end { minipage} }
Recall, the original $ g ( x ) $ definition was
$$ \textcolor { gray } { g ( x ) : = \left ( \sum _ { j \in [ t ] } \gamma ^ j \cdot L _ j ( x ) \right ) + \gamma ^ { t + 1 } \cdot Q ( x ) } $$
\vspace { 0.5cm}
In \emph { step 4} , $ P \rightarrow V $ :
$ ( \{ \sigma _ { 1 ,j } \} , \textcolor { orange } { \{ \sigma _ { 2 ,j } \} } , \{ \theta _ { 1 ,j } \} , \textcolor { cyan } { \{ \theta _ { 2 ,j } \} } ) ,~ \text { where } ~ \forall j \in [ t ] $ ,
$$ \sigma _ { 1 ,j } = \sum _ { y \in \{ 0 , 1 \} ^ { s' } } \widetilde { M } _ j ( r _ x', y ) \cdot \widetilde { z } _ 1 ( y ) $$
$$ \textcolor { orange } { \sigma _ { 2 ,j } } = \sum _ { y \in \{ 0 , 1 \} ^ { s' } } \widetilde { M } _ j ( r _ x', y ) \cdot \textcolor { orange } { \widetilde { z } _ 2 ( y ) } $$
$$ \theta _ { 1 ,j } = \sum _ { y \in \{ 0 , 1 \} ^ { s' } } \widetilde { M } _ j ( r _ x', y ) \cdot \widetilde { z } _ 3 ( y ) $$
$$ \textcolor { cyan } { \theta _ { 2 ,j } } = \sum _ { y \in \{ 0 , 1 \} ^ { s' } } \widetilde { M } _ j ( r _ x', y ) \cdot \textcolor { cyan } { \widetilde { z } _ 4 ( y ) } $$
\framebox { \begin { minipage} { 4.3 in}
so, generically,\\
$P \rightarrow V$:
$(\{\sigma_{i,j}\}, \{\theta_{k,j}\}),~ \text{where} ~ \forall~ j \in [t],~ \forall~ i \in [\mu],~ \forall~ k \in [\nu]$:
$$ \sigma _ { i,j } = \sum _ { y \in \{ 0 , 1 \} ^ { s' } } \widetilde { M } _ j ( r _ x', y ) \cdot \widetilde { z } _ i ( y ) $$
$$ \theta _ { k,j } = \sum _ { y \in \{ 0 , 1 \} ^ { s' } } \widetilde { M } _ j ( r _ x', y ) \cdot \widetilde { z } _ { \mu + k } ( y ) $$
\end { minipage} }
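To make the $\sigma_{i,j},~\theta_{k,j}$ values concrete: since $\widetilde{M}_j(r_x', y)$ at a Boolean $y$ equals $\sum_{x \in \{0,1\}^s} \widetilde{eq}(r_x', x) \cdot M_j[x][y]$, each sum is just an $\widetilde{eq}$-weighted matrix-vector product. A Python sketch over a toy field (the matrix, vector, and field size are made up for illustration):

```python
# sigma_{i,j} = sum_{y in {0,1}^{s'}} M~_j(r_x', y) * z~_i(y)
#             = eq-vector(r_x')^T * M_j * z_i   over F_p.
p = 101

def eq_vector(r):
    """[eq(r, x) for all x in {0,1}^s], s = len(r)."""
    vec = [1]
    for ri in r:  # each extra variable doubles the table
        vec = [v * (1 - ri) % p for v in vec] + [v * ri % p for v in vec]
    return vec

def sigma(M, z, r):
    eqv = eq_vector(r)
    return sum(eqv[x] * sum(M[x][y] * z[y] for y in range(len(z)))
               for x in range(len(M))) % p

M = [[1, 0], [2, 3]]  # a toy 2x2 matrix M_j (m = n = 2, s = s' = 1)
z = [5, 7]
print(sigma(M, z, [0]))  # at a hypercube point: row 0 of M_j times z -> 5
print(sigma(M, z, [9]))  # at a random field point r_x' = [9] -> 37
```

The $\theta_{k,j}$ are computed identically, just with $\widetilde{z}_{\mu+k}$ in place of $\widetilde{z}_i$.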
\vspace { 1cm}
And in \emph { step 5} , $ V $ checks
\begin{align*}
c & = \left(\sum_{j \in [t]} \gamma^j \cdot e_1 \cdot \sigma_{1,j}
~\textcolor{orange}{+ \gamma^{t+j} \cdot e_2 \cdot \sigma_{2,j}} \right)\\
& + \gamma^{2t+1} \cdot e_3 \cdot \left( \sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \theta_{1,j} \right)
+ \textcolor{cyan}{\gamma^{2t+2} \cdot e_4 \cdot \left( \sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \theta_{2,j} \right)}
\end{align*}
where
$e_1 \leftarrow \widetilde{eq}(r_{1,x}, r_x'),~ e_2 \leftarrow \widetilde{eq}(r_{2,x}, r_x')$, $e_3 \leftarrow \widetilde{eq}(\beta, r_x'),~ e_4 \leftarrow \widetilde{eq}(\beta', r_x')$, with $\beta,~\beta'$ as used in \emph{step 3}.
\vspace { 0.5cm}
\framebox { \begin { minipage} { 4.3 in}
A generic definition of the check would be
$$
c = \sum_{i \in [\mu]} \left(\sum_{j \in [t]} \gamma^{(i-1) \cdot t + j} \cdot e_i \cdot \sigma_{i,j} \right)
+ \sum_{k \in [\nu]} \gamma^{\mu \cdot t + k} \cdot e_{\mu+k} \cdot \left( \sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \theta_{k,j} \right)
$$
\end { minipage} }
where the original check was\\
$ \textcolor { gray } { c = \left ( \sum _ { j \in [ t ] } \gamma ^ j \cdot e _ 1 \cdot \sigma _ j \right ) + \gamma ^ { t + 1 } \cdot e _ 2 \cdot \left ( \sum _ { i = 1 } ^ q c _ i \cdot \prod _ { j \in S _ i } \theta _ j \right ) } $
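As a numeric sanity check that a generic $\gamma$-power schedule with exponents $(i-1) \cdot t + j$ and $\mu \cdot t + k$ agrees with the explicit $\mu = 2,~\nu = 2$ check, a Python sketch with arbitrary toy field elements (a toy constraint structure with $q = 1,~ S_1 = \{1,2\}$ is assumed):

```python
# Compare the generic formula for c against the explicit mu = nu = 2 check.
# All inputs are toy field elements, not real protocol data.
p = 101
t, mu, nu = 2, 2, 2
gamma = 3
e = [11, 13, 17, 19]       # e_1, e_2 (LCCCS side), e_3, e_4 (CCCS side)
sigma = [[2, 4], [6, 8]]   # sigma[i-1][j-1]
theta = [[1, 3], [5, 9]]   # theta[k-1][j-1]
cs, Ss = [1], [[1, 2]]     # toy CCS structure: c_1 = 1, S_1 = {1, 2}

def Q(k):
    """sum_i c_i * prod_{j in S_i} theta_{k,j} (mod p)."""
    total = 0
    for ci, Si in zip(cs, Ss):
        prod = 1
        for j in Si:
            prod = prod * theta[k - 1][j - 1] % p
        total = (total + ci * prod) % p
    return total

generic = (sum(pow(gamma, (i - 1) * t + j, p) * e[i - 1] * sigma[i - 1][j - 1]
               for i in range(1, mu + 1) for j in range(1, t + 1))
           + sum(pow(gamma, mu * t + k, p) * e[mu + k - 1] * Q(k)
                 for k in range(1, nu + 1))) % p

explicit = (sum(pow(gamma, j, p) * e[0] * sigma[0][j - 1]
                + pow(gamma, t + j, p) * e[1] * sigma[1][j - 1]
                for j in range(1, t + 1))
            + pow(gamma, 2 * t + 1, p) * e[2] * Q(1)
            + pow(gamma, 2 * t + 2, p) * e[3] * Q(2)) % p

print(generic == explicit)  # True
```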
% TODO
% Pending questions:
% - \beta & \beta ' can be the same? or related somehow like \beta '=\beta ^ 2 ?
\vspace { 0.5cm}
And for \emph{step 7}, $\forall j \in [t]$:
\begin{align*}
C' & \leftarrow C_1 + \rho \cdot C_2 + \rho^2 \cdot C_3 + \rho^3 \cdot C_4 + \ldots = \sum_{i \in [\mu+\nu]} \rho^{i-1} \cdot C_i \\
u' & \leftarrow \sum_{i \in [\mu]} \rho^{i-1} \cdot u_i + \sum_{i \in [\nu]} \rho^{\mu + i - 1} \cdot 1\\
\mathsf{x}' & \leftarrow \sum_{i \in [\mu+\nu]} \rho^{i-1} \cdot \mathsf{x}_i\\
v_j' & \leftarrow \sum_{i \in [\mu]} \rho^{i-1} \cdot \sigma_{i,j} + \sum_{k \in [\nu]} \rho^{\mu + k - 1} \cdot \theta_{k,j}
\end{align*}
and \emph { step 8} ,
\begin{align*}
\widetilde{w}' & \leftarrow \sum_{i \in [\mu+\nu]} \rho^{i-1} \cdot \widetilde{w}_i\\
r_w' & \leftarrow \sum_{i \in [\mu+\nu]} \rho^{i-1} \cdot r_{w_i}
\end{align*}
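The folds of steps 7 and 8 are all the same $\rho$-power linear combination, with the first instance keeping coefficient $\rho^0 = 1$. A Python sketch with toy values (in the real protocol the $C_i$ fold happens homomorphically over commitment group elements; plain field elements stand in here):

```python
# rho-power fold for mu LCCCS + nu CCCS instances over a toy field F_p.
p = 101
mu, nu, rho = 2, 2, 7

def fold_all(values):
    """sum_i rho^{i-1} * values[i-1] (mod p)."""
    return sum(pow(rho, i, p) * v for i, v in enumerate(values)) % p

u = [5, 9]                           # u_i of the mu LCCCS instances
u_folded = fold_all(u + [1] * nu)    # each CCCS instance contributes u = 1
w = [[1, 2], [3, 4], [5, 6], [7, 8]]  # toy witness vectors, mu + nu of them
w_folded = [fold_all(col) for col in zip(*w)]
print(u_folded, w_folded)  # 56 [42, 38]
```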
Note that, across the whole multifolding for $\mu > 1$ and $\nu > 1$, most of the computation can easily be parallelized.
\vspace { 2cm}
% % % % % % APPENDIX
\appendix