\documentclass{article}
\usepackage[utf8]{inputenc}
\usepackage{amsfonts}
\usepackage{amsthm}
\usepackage{amsmath}
\usepackage{mathtools}
\usepackage{enumerate}
\usepackage{hyperref}
\usepackage{xcolor}
\usepackage{centernot}
\usepackage{algorithm}
\usepackage{algpseudocode}
\usepackage{pgf-umlsd} % diagrams
% message between threads. From https://tex.stackexchange.com/a/174765
% Example:
% \bloodymess[delay]{sender}{message content}{receiver}{DIR}{start note}{end note}
\newcommand{\bloodymess}[7][0]{
\stepcounter{seqlevel}
\path
(#2)+(0,-\theseqlevel*\unitfactor-0.7*\unitfactor) node (mess from) {};
\addtocounter{seqlevel}{#1}
\path
(#4)+(0,-\theseqlevel*\unitfactor-0.7*\unitfactor) node (mess to) {};
\draw[->,>=angle 60] (mess from) -- (mess to) node[midway, above]
{#3};
\if R#5
\node (\detokenize{#3} from) at (mess from) {\llap{#6~}};
\node (\detokenize{#3} to) at (mess to) {\rlap{~#7}};
\else\if L#5
\node (\detokenize{#3} from) at (mess from) {\rlap{~#6}};
\node (\detokenize{#3} to) at (mess to) {\llap{#7~}};
\else
\node (\detokenize{#3} from) at (mess from) {#6};
\node (\detokenize{#3} to) at (mess to) {#7};
\fi
\fi
}
% prevent warnings of underfull \hbox:
\usepackage{etoolbox}
\apptocmd{\sloppy}{\hbadness 4000\relax}{}{}
\theoremstyle{definition}
\newtheorem{definition}{Def}[section]
\newtheorem{theorem}[definition]{Thm}
% custom lemma environment to set custom numbers
\newtheorem{innerlemma}{Lemma}
\newenvironment{lemma}[1]
{\renewcommand\theinnerlemma{#1}\innerlemma}
{\endinnerlemma}
\title{Notes on HyperNova}
\author{arnaucube}
\date{May 2023}
\begin{document}
\maketitle
\begin{abstract}
Notes taken while reading about HyperNova \cite{cryptoeprint:2023/573} and CCS \cite{cryptoeprint:2023/552}.
Usually while reading papers I take handwritten notes; this document contains some of them rewritten in \LaTeX.
The notes are not complete: they do not include all the steps nor all the proofs.
Thanks to \href{https://twitter.com/asn_d6}{George Kadianakis} for clarifications, and to the authors \href{https://twitter.com/srinathtv}{Srinath Setty} and \href{https://twitter.com/abhiramko}{Abhiram Kothapalli} for answers on chats and Twitter.
\end{abstract}
\tableofcontents
\section{CCS}
\subsection{R1CS to CCS overview}
\begin{description}
\item[R1CS instance] $S_{R1CS} = (m, n, N, l, A, B, C)$\\
where $m, n$ are such that $A \in \mathbb{F}^{m \times n}$, and $l$ is such that the public inputs $x \in \mathbb{F}^l$. Also $z=(w, 1, x) \in \mathbb{F}^n$, thus $w \in \mathbb{F}^{n-l-1}$.
\item[CCS instance] $S_{CCS} = (m, n, N, l, t, q, d, M, S, c)$\\
where we have the same parameters as in $S_{R1CS}$, but additionally:\\
$t=|M|$, $q = |c| = |S|$, $d =$ max degree in each variable.
\item[R1CS-to-CCS parameters] $n=n,~ m=m,~ N=N,~ l=l,~ t=3,~ q=2,~ d=2$, $M=\{A,B,C\}$, $S=\{\{0,~1\},~ \{2\}\}$, $c=\{1,-1\}$
\end{description}
The CCS relation check:
$$\sum_{i=0}^{q-1} c_i \cdot \bigcirc_{j \in S_i} M_j \cdot z ==0$$
where $z=(w, 1, x) \in \mathbb{F}^n$.
With our R1CS-to-CCS parameters this is equivalent to
\begin{align*}
&c_0 \cdot ( (M_0 z) \circ (M_1 z) ) + c_1 \cdot (M_2 z) ==0\\
\Longrightarrow &1 \cdot ( (A z) \circ (B z) ) + (-1) \cdot (C z) ==0\\
\Longrightarrow &( (A z) \circ (B z) ) - (C z) ==0
\end{align*}
which is equivalent to the R1CS relation: $Az \circ Bz == Cz$.
An example of the conversion from R1CS to CCS implemented in SageMath can be found at\\
\href{https://github.com/arnaucube/math/blob/master/r1cs-ccs.sage}{https://github.com/arnaucube/math/blob/master/r1cs-ccs.sage}.
Similar conversions from Plonkish and AIR arithmetizations to CCS are shown in the CCS paper \cite{cryptoeprint:2023/552}, but for now the R1CS case is enough to see the CCS generalization idea and to use it in the HyperNova scheme.
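As a complement to the SageMath example linked above, the following is a minimal Python sketch of the CCS relation check instantiated with the R1CS-to-CCS parameters. It is not taken from the paper: the function names, the small prime field and the toy R1CS (a single constraint $x \cdot x = y$) are choices made here purely for illustration.
\begin{verbatim}
# Toy sketch of the CCS relation check for the R1CS-to-CCS parameters
# (t=3, q=2, d=2, M={A,B,C}, S={{0,1},{2}}, c={1,-1}).
# Arithmetic over a small prime field chosen only for illustration.
p = 101

def mat_vec(M, z):
    return [sum(M[i][j] * z[j] for j in range(len(z))) % p for i in range(len(M))]

def hadamard(a, b):
    return [(x * y) % p for x, y in zip(a, b)]

def ccs_check(M_list, S, c, z):
    # sum_i c_i * (Hadamard product over j in S_i of M_j * z) == 0  (vector equation)
    rows = len(M_list[0])
    acc = [0] * rows
    for c_i, S_i in zip(c, S):
        prod = [1] * rows
        for j in S_i:
            prod = hadamard(prod, mat_vec(M_list[j], z))
        acc = [(a + c_i * v) % p for a, v in zip(acc, prod)]
    return all(v == 0 for v in acc)

# toy R1CS for x*x = y, with z = (w, 1, x_pub) = (x, 1, y) -- illustration only
A = [[1, 0, 0]]
B = [[1, 0, 0]]
C = [[0, 0, 1]]
z = [3, 1, 9]
assert ccs_check([A, B, C], [[0, 1], [2]], [1, -1], z)
\end{verbatim}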
\subsection{Committed CCS}
$R_{CCCS}$ instance: $(C, \mathsf{x})$, where $C$ is a commitment to a multilinear polynomial in $s'-1$ variables.
Sat if:
\begin{enumerate}[i.]
\item $\text{Commit}(pp, \widetilde{w}) = C$
\item $\sum_{i=1}^q c_i \cdot \left( \prod_{j \in S_i} \left( \sum_{y \in \{0,1\}^{s'}} \widetilde{M}_j(x, y) \cdot \widetilde{z}(y) \right) \right) = 0 ~~\forall x \in \{0,1\}^s$\\
where $\widetilde{z}(y) = \widetilde{(w, 1, \mathsf{x})}(y) ~\forall y \in \{0, 1\}^{s'}$
\end{enumerate}
\subsection{Linearized Committed CCS}
$R_{LCCCS}$ instance: $(C, u, \mathsf{x}, r, v_1, \ldots, v_t)$, where $C$ is a commitment to a multilinear polynomial in $s'-1$ variables, and $u \in \mathbb{F},~ \mathsf{x} \in \mathbb{F}^l,~ r \in \mathbb{F}^s,~ v_i \in \mathbb{F} ~\forall i \in [t]$.
Sat if:
\begin{enumerate}[i.]
\item $\text{Commit}(pp, \widetilde{w}) = C$
\item $\forall i \in [t],~ v_i = \sum_{y \in \{0,1\}^{s'}} \widetilde{M}_i(r, y) \cdot \widetilde{z}(y)$\\
where $\widetilde{z}(y) = \widetilde{(w, u, \mathsf{x})}(y) ~\forall y \in \{0, 1\}^{s'}$
\end{enumerate}
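For instance, instantiating the LCCCS relation with the R1CS-to-CCS parameters from the previous section ($t=3$, $M=\{A,B,C\}$), condition ii.\ is just three claimed matrix evaluations at the point $r$:
\begin{align*}
v_1 &= \sum_{y \in \{0,1\}^{s'}} \widetilde{A}(r, y) \cdot \widetilde{z}(y),\\
v_2 &= \sum_{y \in \{0,1\}^{s'}} \widetilde{B}(r, y) \cdot \widetilde{z}(y),\\
v_3 &= \sum_{y \in \{0,1\}^{s'}} \widetilde{C}(r, y) \cdot \widetilde{z}(y)
\end{align*}
so the linearized relation only constrains these individual sums; the product (Hadamard) structure is only enforced by the CCCS relation.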
\section{Multifolding Scheme for CCS}
Recall sum-check protocol notation: \underline{$C \leftarrow \langle P, V(r) \rangle (g, l, d, T)$} means
$$T=\sum_{x_1 \in \{0,1\}} \sum_{x_2 \in \{0,1\}} \cdots \sum_{x_l \in \{0,1\}} g(x_1, x_2, \ldots, x_l)$$
where $g$ is an $l$-variate polynomial, with degree at most $d$ in each variable, and $T$ is the claimed value.
\vspace{1cm}
Let $s= \log m,~ s'= \log n$.
\begin{enumerate}
\item $V \rightarrow P: \gamma \in^R \mathbb{F},~ \beta \in^R \mathbb{F}^s$
\item $V: r_x' \in^R \mathbb{F}^s$
\item $V \leftrightarrow P$: sum-check protocol:
$$c \leftarrow \langle P, V(r_x') \rangle (g, s, d+1, \underbrace{\sum_{j \in [t]} \gamma^j \cdot v_j}_\text{T})$$
(in fact, $T = \left( \sum_{j \in [t]} \gamma^j \cdot v_j \right) \underbrace{+~ \gamma^{t+1} \cdot \sum_{x \in \{0,1\}^s} Q(x)}_{=0} = \sum_{j \in [t]} \gamma^j \cdot v_j$)\\
where:
\begin{align*}
g(x) &:= \underbrace{\left( \sum_{j \in [t]} \gamma^j \cdot L_j(x) \right)}_\text{LCCCS check} + \underbrace{\gamma^{t+1} \cdot Q(x)}_\text{CCCS check}\\
\text{for LCCCS:}~ L_j(x) &:= \widetilde{eq}(r_x, x) \cdot \left(
\underbrace{\sum_{y \in \{0,1\}^{s'}} \widetilde{M}_j(x, y) \cdot \widetilde{z}_1(y)}_\text{this is the check from LCCCS}
\right)\\
\text{for CCCS:}~ Q(x) := &\widetilde{eq}(\beta, x) \cdot \left(
\underbrace{ \sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \left( \sum_{y \in \{0, 1\}^{s'}} \widetilde{M}_j(x, y) \cdot \widetilde{z}_2(y) \right) }_\text{this is the check from CCCS}
\right)
\end{align*}
Notice that
$$v_j= \sum_{y\in \{0,1\}^{s'}} \widetilde{M}_j(r_x, y) \cdot \widetilde{z}_1(y) = \sum_{x\in \{0,1\}^s} L_j(x)$$
\item $P \rightarrow V$: $\left( (\sigma_1, \ldots, \sigma_t), (\theta_1, \ldots, \theta_t) \right)$, where $\forall j \in [t]$,
$$\sigma_j = \sum_{y \in \{0,1\}^{s'}} \widetilde{M}_j(r_x', y) \cdot \widetilde{z}_1(y)$$
$$\theta_j = \sum_{y \in \{0, 1\}^{s'}} \widetilde{M}_j(r_x', y) \cdot \widetilde{z}_2(y)$$
where $\sigma_j,~\theta_j$ are the checks from LCCCS and CCCS respectively, evaluated at $x=r_x'$.
\item V: $e_1 \leftarrow \widetilde{eq}(r_x, r_x')$, $e_2 \leftarrow \widetilde{eq}(\beta, r_x')$\\
check:
$$c = \left(\sum_{j \in [t]} \gamma^j \cdot e_1 \cdot \sigma_j \right) + \gamma^{t+1} \cdot e_2 \cdot \left( \sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \theta_j \right)$$
which should equal $g(r_x')$, the value to which the sum-check protocol reduces its claim.
\item $V \rightarrow P: \rho \in^R \mathbb{F}$
\item $V, P$: output the folded LCCCS instance $(C', u', \mathsf{x}', r_x', v_1', \ldots, v_t')$, where $\forall i \in [t]$ (see the code sketch after this protocol description):
\begin{align*}
C' &\leftarrow C_1 + \rho \cdot C_2\\
u' &\leftarrow u + \rho \cdot 1\\
\mathsf{x}' &\leftarrow \mathsf{x}_1 + \rho \cdot \mathsf{x}_2\\
v_i' &\leftarrow \sigma_i + \rho \cdot \theta_i
\end{align*}
\item $P$: output the folded witness and the folded $r_w'$ (the randomness used for the witness commitment $C$):
\begin{align*}
\widetilde{w}' &\leftarrow \widetilde{w}_1 + \rho \cdot \widetilde{w}_2\\
r_w' &\leftarrow r_{w_1} + \rho \cdot r_{w_2}
\end{align*}
\end{enumerate}
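The following is a minimal Python sketch of the folding performed in steps 7 and 8. It is illustration only: the field modulus, the integer stand-in for commitments and the function names are placeholders chosen here, assuming an additively homomorphic commitment scheme so that commitments can be combined with the same random linear combination as the rest of the instance.
\begin{verbatim}
# Sketch of folding an LCCCS instance with a CCCS instance (steps 7-8).
# rho is the verifier's random challenge; all arithmetic is mod a prime p.
p = 101

def fold_instances(lcccs, cccs, sigmas, thetas, rho):
    # lcccs = (C1, u, x1, v)  with v = (v_1, ..., v_t)
    # cccs  = (C2, x2)        a committed CCS instance (implicit u = 1)
    C1, u, x1, _v = lcccs
    C2, x2 = cccs
    C_folded = C1 + rho * C2                       # homomorphic commitment add (placeholder)
    u_folded = (u + rho * 1) % p
    x_folded = [(a + rho * b) % p for a, b in zip(x1, x2)]
    v_folded = [(s_j + rho * t_j) % p for s_j, t_j in zip(sigmas, thetas)]
    return (C_folded, u_folded, x_folded, v_folded)

def fold_witnesses(w1, w2, r_w1, r_w2, rho):
    w_folded = [(a + rho * b) % p for a, b in zip(w1, w2)]
    r_w_folded = (r_w1 + rho * r_w2) % p
    return w_folded, r_w_folded
\end{verbatim}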
\vspace{1cm}
Multifolding flow:
\begin{center}
\begin{sequencediagram}
\newinst[1]{p}{Prover}
\newinst[3]{v}{Verifier}
\bloodymess[1]{v}{$\gamma,~\beta,~r_x'$}{p}{L}{
\shortstack{
$\gamma \in \mathbb{F},~ \beta \in \mathbb{F}^s$\\
$r_x' \in \mathbb{F}^s$
}
}{}
\bloodymess[1]{p}{$c,~ \pi_{SC}$}{v}{R}{sum-check prove}{sum-check verify}
\bloodymess[1]{p}{$\{\sigma_j\},~\{\theta_j\}$}{v}{R}{compute $\{\sigma_j\}, \{\theta_j\}~ \forall j \in [t]$}{verify $c$ with $\{\sigma_j\}, \{\theta_j\}$ relation}
\bloodymess[1]{v}{$\rho$}{p}{L}{$\rho \in^R \mathbb{F}$}{}
\callself[0]{p}{fold LCCCS instance}{p}
\prelevel
\callself[0]{v}{fold LCCCS instance}{v}
\callself[0]{p}{fold $\widetilde{w}$}{p}
\end{sequencediagram}
\end{center}
\vspace{1cm}
Now, to see where the verifier check from step 5 comes from, observe that in LCCCS, since $\widetilde{w}_1$ is a satisfying witness,
\begin{align*}
v_j &= \sum_{y \in \{0,1\}^{s'}} \widetilde{M}_j(r_x, y) \cdot \widetilde{z}_1(y)\\
&= \sum_{x \in \{0,1\}^s}
\underbrace{
\widetilde{eq}(r_x, x) \cdot \left( \sum_{y \in \{0,1\}^{s'}} \widetilde{M}_j(x, y) \cdot \widetilde{z}_1(y) \right)
}_{L_j(x)}\\
&= \sum_{x \in \{0,1\}^s} L_j(x)
\end{align*}
Observe also that in CCCS, since $\widetilde{w}_2$ is a satisfying witness, for all $x \in \{0,1\}^s$
$$
0=\underbrace{\sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \left( \sum_{y \in \{0, 1\}^{s'}} \widetilde{M}_j(x, y) \cdot \widetilde{z}_2(y) \right)}_{q(x)}
$$
so we have that
$$
G(X) = \sum_{x \in \{0,1\}^s} \widetilde{eq}(X, x) \cdot q(x)
$$
is multilinear, and can be seen as the polynomial written in the multilinear Lagrange basis whose coefficients are the evaluations of $q(x)$ on the hypercube.
For an honest prover all these coefficients are zero, thus $G(X)$ must necessarily be the zero polynomial. Thus $G(\beta)=0$ for $\beta \in^R \mathbb{F}^s$.
\begin{align*}
% 0&=\sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \left( \sum_{y \in \{0, 1\}^{s'}} \widetilde{M}_j(\beta, y) \cdot \widetilde{z}_2(y) \right)\\
0&=G(\beta) = \sum_{x \in \{0,1\}^s} \widetilde{eq}(\beta, x) \cdot q(x)\\
&= \sum_{x \in \{0,1\}^s}
\underbrace{\widetilde{eq}(\beta , x) \cdot
\overbrace{
\sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \left( \sum_{y \in \{0, 1\}^{s'}} \widetilde{M}_j(x, y) \cdot \widetilde{z}_2(y) \right)
}^{q(x)}
}_{Q(x)}\\
&= \sum_{x \in \{0,1\}^s} Q(x)
\end{align*}
\framebox{\begin{minipage}{4.3 in}
\begin{footnotesize}
\textbf{Note}: notice that this last equation is related to the Spartan paper \cite{cryptoeprint:2019/550}, lemmas 4.2 and 4.3, where instead of
$$q(x) = \sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \left( \sum_{y \in \{0, 1\}^{s'}} \widetilde{M}_j(x, y) \cdot \widetilde{z}_2(y) \right)$$
for our R1CS example, we can restrict it to just $M_0,M_1,M_2$, which would be
$$=\left( \sum_{y \in \{0,1\}^s} \widetilde{M_0}(x, y) \cdot \widetilde{z}(y) \right) \cdot \left( \sum_{y \in \{0,1\}^s} \widetilde{M_1}(x, y) \cdot \widetilde{z}(y) \right) - \sum_{y \in \{0,1\}^s} \widetilde{M_2}(x, y) \cdot \widetilde{z}(y)$$
and we can see that $q(x)$ is the same equation as $\widetilde{F}_{io}(x)$ in Spartan:
$$
\widetilde{F}_{io}(x)=\left( \sum_{y \in \{0,1\}^s} \widetilde{A}(x, y) \cdot \widetilde{z}(y) \right) \cdot \left( \sum_{y \in \{0,1\}^s} \widetilde{B}(x, y) \cdot \widetilde{z}(y) \right) - \sum_{y \in \{0,1\}^s} \widetilde{C}(x, y) \cdot \widetilde{z}(y)
$$
where
$$Q_{io}(t) = \sum_{x \in \{0,1\}^s} \widetilde{F}_{io}(x) \cdot \widetilde{eq}(t,x)=0$$
and V checks $Q_{io}(\tau)=0$ for $\tau \in^R \mathbb{F}^s$, which in HyperNova is $G(\beta)=0$ for $\beta \in^R \mathbb{F}^s$.
$Q_{io}(\cdot)$ is a zero-polynomial ($G(\cdot)$ in HyperNova): it evaluates to zero at all points in its domain iff $\widetilde{F}_{io}(\cdot)$ evaluates to zero at all points in the $s$-dimensional boolean hypercube.
\begin{align*}
\text{Spartan} &\longleftrightarrow \text{HyperNova}\\
\tau &\longleftrightarrow \beta\\
\widetilde{F}_{io}(x) &\longleftrightarrow q(x)\\
Q_{io}(\tau) &\longleftrightarrow G(\beta)
\end{align*}
So, in HyperNova
$$0 = \sum_{x \in \{0,1\}^s} Q(x) = \sum_{x \in \{0,1\}^s} \widetilde{eq}(\beta,x) \cdot q(x)$$
\end{footnotesize}
\end{minipage}}
\vspace{1cm}
Coming back to the HyperNova equations, we can now see that
\begin{align*}
c &= g(r_x')\\
&= \left( \sum_{j \in [t]} \gamma^j \cdot L_j(r_x') \right) + \gamma^{t+1} \cdot Q(r_x')\\
&= \left( \sum_{j \in [t]} \gamma^j \cdot \overbrace{e_1 \cdot \sigma_j}^{L_j(r_x')} \right) + \gamma^{t+1} \cdot \overbrace{e_2 \cdot \sum_{i \in [q]} c_i \prod_{j \in S_i} \theta_j}^{Q(r_x')}
\end{align*}
where $e_1 = \widetilde{eq}(r_x, r_x')$ and $e_2=\widetilde{eq}(\beta, r_x')$,
which is exactly the check that $V$ performs at step $5$.
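As a small illustration, here is a Python sketch of the verifier's step-5 computation. It is not from the paper: the field modulus and the function name are placeholders chosen here, and the $\widetilde{eq}$ evaluations $e_1, e_2$ are assumed to be precomputed.
\begin{verbatim}
# Sketch of V's step-5 check: recompute g(r_x') from the sigmas and thetas
# sent by P and compare it with the sum-check claim c. Mod-p arithmetic,
# with e1 = eq~(r_x, r_x') and e2 = eq~(beta, r_x') given as inputs.
p = 101

def step5_check(c, gamma, e1, e2, sigmas, thetas, c_coeffs, S_sets):
    t = len(sigmas)
    # LCCCS part: sum_j gamma^j * e1 * sigma_j, for j = 1..t
    lhs = sum(pow(gamma, j + 1, p) * e1 * sigmas[j] for j in range(t)) % p
    # CCCS part: gamma^(t+1) * e2 * sum_i c_i * prod_{j in S_i} theta_j
    q_val = 0
    for c_i, S_i in zip(c_coeffs, S_sets):
        prod = 1
        for j in S_i:
            prod = (prod * thetas[j]) % p
        q_val = (q_val + c_i * prod) % p
    lhs = (lhs + pow(gamma, t + 1, p) * e2 * q_val) % p
    return lhs == c % p
\end{verbatim}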
\subsection{Multifolding for multiple instances}
The multifolding of multiple LCCCS \& CCCS instances is not shown in the HyperNova paper, but Srinath Setty gave an overview in the PSE HyperNova presentation. This section unfolds it.
We're going to go through this example with parameters \textcolor{orange}{LCCCS: $\mu = 2$}, \textcolor{cyan}{CCCS: $\nu = 2$}, which means that we have 2 LCCCS instances and 2 CCCS instances.
Assume we have 4 $z$ vectors: $z_1,~ \textcolor{orange}{z_2}$ for the two LCCCS instances, and $z_3,~ \textcolor{cyan}{z_4}$ for the two CCCS instances, where $z_1,~z_3$ are the vectors that we already had in the example with $\mu=1,\nu=1$, and $z_2,~z_4$ are the extra ones that we're adding now.
In \emph{step 3} of the multifolding with more than one LCCCS and more than one CCCS instance, we have:
\begin{align*}
g(x) &:= \left( \sum_{j \in [t]} \gamma^j \cdot L_{1,j}(x) + \textcolor{orange}{\gamma^{t+j} \cdot L_{2,j}(x)} \right)
+ \gamma^{2t+1} \cdot Q_1(x) + \textcolor{cyan}{\gamma^{2t+2} \cdot Q_2(x)} \\
&L_{1,j}(x) := \widetilde{eq}(r_{1,x}, x) \cdot \left(
\sum_{y \in \{0,1\}^{s'}} \widetilde{M}_j(x, y) \cdot \widetilde{z}_1(y)
\right)\\
&\textcolor{orange}{L_{2,j}(x)} := \widetilde{eq}(\textcolor{orange}{r_{2,x}}, x) \cdot \left(
\sum_{y \in \{0,1\}^{s'}} \widetilde{M}_j(x, y) \cdot \textcolor{orange}{\widetilde{z}_2(y)}
\right)\\
&Q_1(x) := \widetilde{eq}(\beta, x) \cdot \left(
\sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \left( \sum_{y \in \{0, 1\}^{s'}} \widetilde{M}_j(x, y) \cdot \widetilde{z}_3(y) \right)\right)\\
&\textcolor{cyan}{Q_2(x)} := \widetilde{eq}(\textcolor{cyan}{\beta'}, x) \cdot \left(
\sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \left( \sum_{y \in \{0, 1\}^{s'}} \widetilde{M}_j(x, y) \cdot \textcolor{cyan}{\widetilde{z}_4(y)} \right)\right)
\end{align*}
\framebox{\begin{minipage}{4.3 in}
A generic definition of $g(x)$ for $\mu>1,~\nu>1$ would be
$$
g(x) := \left( \sum_{i \in [\mu]} \left( \sum_{j \in [t]} \gamma^{(i-1) \cdot t+j} \cdot L_{i,j}(x) \right) \right)
+ \left( \sum_{i \in [\nu]} \gamma^{\mu \cdot t + i} \cdot Q_i(x) \right)
$$
\end{minipage}}
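As a sanity check, instantiating this generic definition with $\mu=\nu=2$ recovers the powers of $\gamma$ used above:
$$
g(x) = \sum_{j \in [t]} \left( \gamma^{j} \cdot L_{1,j}(x) + \gamma^{t+j} \cdot L_{2,j}(x) \right) + \gamma^{2t+1} \cdot Q_1(x) + \gamma^{2t+2} \cdot Q_2(x)
$$
with exponents $1,\ldots,t$ for the first LCCCS instance, $t+1,\ldots,2t$ for the second, and $2t+1,~2t+2$ for the two CCCS instances, so every term gets a distinct power of $\gamma$.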
Recall, the original $g(x)$ definition was
$$\textcolor{gray}{g(x) := \left( \sum_{j \in [t]} \gamma^j \cdot L_j(x) \right) + \gamma^{t+1} \cdot Q(x)}$$
\vspace{0.5cm}
In \emph{step 4}, $P \rightarrow V$:
$(\{\sigma_{1,j}\}, \textcolor{orange}{\{\sigma_{2,j}\}}, \{\theta_{1,j}\}, \textcolor{cyan}{\{\theta_{2,j}\}}),~ \text{where} ~\forall j \in [t]$,
$$\sigma_{1,j} = \sum_{y \in \{0,1\}^{s'}} \widetilde{M}_j(r_x', y) \cdot \widetilde{z}_1(y)$$
$$\textcolor{orange}{\sigma_{2,j}} = \sum_{y \in \{0,1\}^{s'}} \widetilde{M}_j(r_x', y) \cdot \textcolor{orange}{\widetilde{z}_2(y)}$$
$$\theta_{1,j} = \sum_{y \in \{0, 1\}^{s'}} \widetilde{M}_j(r_x', y) \cdot \widetilde{z}_3(y)$$
$$\textcolor{cyan}{\theta_{2,j}} = \sum_{y \in \{0, 1\}^{s'}} \widetilde{M}_j(r_x', y) \cdot \textcolor{cyan}{\widetilde{z}_4(y)}$$
\framebox{\begin{minipage}{4.3 in}
So, in a generic way, $P \rightarrow V$:
$(\{\sigma_{i,j}\}, \{\theta_{k,j}\}),~ \forall~ j \in [t],~ \forall~ i \in [\mu],~ \forall~ k \in [\nu]$,
where
$$\sigma_{i,j} = \sum_{y \in \{0,1\}^{s'}} \widetilde{M}_j(r_x', y) \cdot \widetilde{z}_i(y)$$
$$\theta_{k,j} = \sum_{y \in \{0, 1\}^{s'}} \widetilde{M}_j(r_x', y) \cdot \widetilde{z}_{\mu+k}(y)$$
\end{minipage}}
\vspace{1cm}
And in \emph{step 5}, $V$ checks (evaluating the $g(x)$ from step 3 at $r_x'$, term by term)
\begin{align*}
c &= \left(\sum_{j \in [t]} \gamma^j \cdot e_1 \cdot \sigma_{1,j}
~\textcolor{orange}{+ \gamma^{t+j} \cdot e_2 \cdot \sigma_{2,j}}\right)\\
&+ \gamma^{2t+1} \cdot e_3 \cdot \left( \sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \theta_{1,j} \right)
+ \textcolor{cyan}{\gamma^{2t+2} \cdot e_4 \cdot \left( \sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \theta_{2,j} \right)}
\end{align*}
where
$e_1 \leftarrow \widetilde{eq}(r_{1,x}, r_x'),~ e_2 \leftarrow \widetilde{eq}(r_{2,x}, r_x')$, $e_3 \leftarrow \widetilde{eq}(\beta, r_x'),~ e_4 \leftarrow \widetilde{eq}(\beta', r_x')$ (note: wip, pending check for the $\beta, \beta'$ used in step 3).
\vspace{0.5cm}
\framebox{\begin{minipage}{4.3 in}
A generic definition of the check would be
$$
c = \sum_{i \in [\mu]} \left(\sum_{j \in [t]} \gamma^{(i-1) \cdot t + j} \cdot e_i \cdot \sigma_{i,j} \right) \\
+ \sum_{k \in [\nu]} \gamma^{\mu \cdot t+k} \cdot e_{\mu+k} \cdot \left( \sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \theta_{k,j} \right)
$$
where $e_i \leftarrow \widetilde{eq}(r_{i,x}, r_x')$ for $i \in [\mu]$, and $e_{\mu+k}$ is the $\widetilde{eq}$ evaluation for the $k$-th CCCS instance.
\end{minipage}}
where the original check was\\
$\textcolor{gray}{c = \left(\sum_{j \in [t]} \gamma^j \cdot e_1 \cdot \sigma_j \right) + \gamma^{t+1} \cdot e_2 \cdot \left( \sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \theta_j \right)}$
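As a sanity check, for $\mu=\nu=1$ the generic check reduces to
$$
c = \sum_{j \in [t]} \gamma^{j} \cdot e_1 \cdot \sigma_{1,j} + \gamma^{t+1} \cdot e_2 \cdot \left( \sum_{i=1}^q c_i \cdot \prod_{j \in S_i} \theta_{1,j} \right)
$$
which matches the original single-instance check with $\sigma_j = \sigma_{1,j}$ and $\theta_j = \theta_{1,j}$.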
% TODO
% Pending questions:
% - \beta & \beta' can be the same? or related somehow like \beta'=\beta^2 ?
\vspace{0.5cm}
And for \emph{step 7}, $\forall j \in [t]$:
\begin{align*}
C' &\leftarrow C_1 + \rho \cdot C_2 + \rho^2 \cdot C_3 + \rho^3 \cdot C_4 = \sum_{i \in [\mu + \nu]} \rho^{i-1} \cdot C_i \\
u' &\leftarrow \sum_{i \in [\mu]} \rho^{i-1} \cdot u_i + \sum_{k \in [\nu]} \rho^{\mu + k-1} \cdot 1\\
\mathsf{x}' &\leftarrow \sum_{i \in [\mu+\nu]} \rho^{i-1} \cdot \mathsf{x}_i\\
v_j' &\leftarrow \sum_{i \in [\mu]} \rho^{i-1} \cdot \sigma_{i,j} + \sum_{k \in [\nu]} \rho^{\mu + k-1} \cdot \theta_{k,j}
\end{align*}
and \emph{step 8},
\begin{align*}
\widetilde{w}' &\leftarrow \sum_{i \in [\mu+\nu]} \rho^{i-1}\cdot \widetilde{w}_i\\
r_w' &\leftarrow \sum_{i \in [\mu+\nu]} \rho^{i-1} \cdot r_{w_i}
\end{align*}
Note that across the whole multifolding for $\mu >1$ and $\nu>1$, we can easily parallelize most of the computation.
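A minimal Python sketch of this generic fold (steps 7 and 8 of the multi-instance case) could look as follows; the field modulus, the integer stand-in for commitments and the function name are placeholders chosen here, assuming an additively homomorphic commitment scheme.
\begin{verbatim}
# Sketch of the generic multifolding output for mu LCCCS and nu CCCS instances.
# Powers of rho start at rho^0 for the first LCCCS instance, matching the
# single-instance fold C' = C_1 + rho*C_2.
p = 101

def multifold(Cs, us, xs, sigmas, thetas, rho):
    # Cs, us, xs: commitments, u values and public inputs of the mu+nu instances
    #             (u_i = 1 for the CCCS ones).
    # sigmas[i][j]: sigma_{i,j} for i in [mu];  thetas[k][j]: theta_{k,j} for k in [nu].
    n_inst = len(Cs)
    rhos = [pow(rho, i, p) for i in range(n_inst)]            # 1, rho, rho^2, ...
    C_folded = sum(r * C for r, C in zip(rhos, Cs))           # homomorphic combination
    u_folded = sum(r * u for r, u in zip(rhos, us)) % p
    x_folded = [sum(r * x[j] for r, x in zip(rhos, xs)) % p for j in range(len(xs[0]))]
    t = len(sigmas[0])
    vs = sigmas + thetas                                      # mu + nu lists of length t
    v_folded = [sum(r * v[j] for r, v in zip(rhos, vs)) % p for j in range(t)]
    return C_folded, u_folded, x_folded, v_folded
\end{verbatim}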
\vspace{2cm}
%%%%%% APPENDIX
\appendix
\section{Appendix: Some details}
This appendix contains some notes on things that don't specifically appear in the paper, but that would be needed in a practical implementation of the scheme.
\subsection{Matrix and Vector to Sparse Multilinear Extension}
Let $M \in \mathbb{F}^{m \times n}$ be a matrix. We want to compute its MLE
$$\widetilde{M}(x_1, \ldots, x_l) = \sum_{e \in \{0, 1 \}^l} M(e) \cdot \widetilde{eq}(x, e)$$
where $l = s + s'$.
We can view the matrix $M \in \mathbb{F}^{m \times n}$ as a function with the following signature:
$$M(\cdot): \{0,1\}^s \times \{0,1\}^{s'} \rightarrow \mathbb{F}$$
where $s = \lceil \log m \rceil,~ s' = \lceil \log n \rceil$.
An entry in $M$ can be accessed with an $(s+s')$-bit identifier.
E.g.:
$$
M = \begin{pmatrix}
1 & 2\\
3 & 4\\
5 & 6\\
\end{pmatrix}
\in \mathbb{F}^{3 \times 2}
$$
$m = 3,~ n = 2,~~~ s = \lceil \log 3 \rceil = 2,~ s' = \lceil \log 2 \rceil = 1$
So $M(x, y)$ takes $x \in \{0,1\}^s$ (the row index) and $y \in \{0,1\}^{s'}$ (the column index) and returns the corresponding entry in $\mathbb{F}$:
$$
M = \begin{pmatrix}
M(00,0) & M(00,1)\\
M(01,0) & M(01,1)\\
M(10,0) & M(10,1)\\
\end{pmatrix}
\in \mathbb{F}^{3 \times 2}
$$
This logic can be defined as follows:
\begin{algorithm}[H]
\caption{Generating a Sparse Multilinear Polynomial from a matrix}
\begin{algorithmic}
\State set empty list $v$ of pairs $(\text{index} \in \mathbb{Z},~ \text{value} \in \mathbb{F})$
\For {$i = 0$ to $m-1$}
\For {$j = 0$ to $n-1$}
\If {$M_{i,j} \neq 0$}
\State $v.\text{append}( \{ \text{index}: i \cdot n + j,~ \text{value}: M_{i,j} \} )$
\EndIf
\EndFor
\EndFor
\State return $v$ \Comment {$v$ represents the nonzero evaluations of the polynomial over the hypercube}
\end{algorithmic}
\end{algorithm}
Once we have the polynomial, its MLE comes from
$$\widetilde{M}(x_1, \ldots, x_{s+s'}) = \sum_{e \in \{0,1\}^{s+s'}} M(e) \cdot \widetilde{eq}(x, e)$$
with $\widetilde{M}(X) \in \mathbb{F}[X_1, \ldots, X_{s+s'}]$.
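The following Python sketch evaluates $\widetilde{M}$ at a point using the sparse representation produced by the algorithm above. It is illustrative only: the field modulus, the toy matrix and the most-significant-bit-first decomposition are choices made here, and the index $i \cdot n + j$ only matches a bit decomposition when $m, n$ are (padded to) powers of two.
\begin{verbatim}
# Sketch: evaluate the MLE of a matrix M at a point x in F^(s+s'),
# using only the nonzero entries (index, value) with index = i*n + j.
p = 101

def eq_tilde(x_bits, e_bits):
    # eq~(x, e) = prod_k (x_k*e_k + (1-x_k)*(1-e_k))
    r = 1
    for x_k, e_k in zip(x_bits, e_bits):
        r = (r * (x_k * e_k + (1 - x_k) * (1 - e_k))) % p
    return r

def sparse_matrix(M, n):
    return [(i * n + j, M[i][j]) for i in range(len(M)) for j in range(n) if M[i][j] != 0]

def mle_eval(sparse, num_vars, x):
    # M~(x) = sum over nonzero entries of M(e) * eq~(x, e),
    # where e is the index written as num_vars bits (most significant bit first,
    # assuming m and n are padded to powers of two).
    acc = 0
    for index, value in sparse:
        e = [(index >> (num_vars - 1 - k)) & 1 for k in range(num_vars)]
        acc = (acc + value * eq_tilde(x, e)) % p
    return acc

# toy 3x2 matrix (as in the example above), s = 2, s' = 1, so num_vars = 3
M = [[1, 2], [3, 4], [5, 6]]
sp = sparse_matrix(M, 2)
# evaluating at a boolean point recovers the matrix entry: x = (0,1,0) -> M[1][0] = 3
assert mle_eval(sp, 3, [0, 1, 0]) == 3
\end{verbatim}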
\paragraph{Multilinear extensions of vectors}
Given a vector $u \in \mathbb{F}^m$, the polynomial $\widetilde{u}$ is the MLE of $u$, and is obtained by viewing $u$ as a function mapping ($s=\log m$)
$$u(x): \{0,1\}^s \rightarrow \mathbb{F}$$
$\widetilde{u}(x)$ is the multilinear extension of the function $u(x)$:
$$\widetilde{u}(x_1, \ldots, x_s) = \sum_{e \in \{0,1\}^s} u(e) \cdot \widetilde{eq}(x, e)$$
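As a small worked example (with values chosen here just for illustration), take $u = (2, 3, 4) \in \mathbb{F}^3$, padded with a zero to length $4$, so $s = 2$ and $u(00)=2,~ u(01)=3,~ u(10)=4,~ u(11)=0$. Then
$$
\widetilde{u}(x_1, x_2) = 2 \cdot (1-x_1)(1-x_2) + 3 \cdot (1-x_1)\,x_2 + 4 \cdot x_1 (1-x_2)
$$
which agrees with $u$ on the hypercube, e.g.\ $\widetilde{u}(0,1)=3$.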
\bibliography{paper-notes.bib}
\bibliographystyle{unsrt}
\end{document}