\section{Encoding R1CS instances as low-degree polynomials}
\begin{definition}[R1CS]
$\exists w \in\mathbb{F}^{m - |io| -1}$ such that $(A \cdot z)\circ(B \cdot z)=(C \cdot z)$, where $z=(io, 1, w)$.
\end{definition}
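The definition above can be checked mechanically. Below is a toy sketch over the small prime field $\mathbb{F}_{97}$ (the field, the single constraint $x \cdot x = y$, and all concrete values are illustrative choices, not from the paper):

```python
# Toy R1CS satisfiability check over F_p, illustrating (A·z) ∘ (B·z) = C·z.
# Constraint encoded: x * x = y, with z = (io, 1, w), io = (x, y), w empty.
p = 97  # illustrative small prime

def r1cs_satisfied(A, B, C, z):
    """Check (A·z) ∘ (B·z) == C·z entry-wise over F_p."""
    def matvec(M, v):
        return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) % p for row in M]
    Az, Bz, Cz = matvec(A, z), matvec(B, z), matvec(C, z)
    return all((a * b - c) % p == 0 for a, b, c in zip(Az, Bz, Cz))

# One constraint row: A and B select x, C selects y  ->  x * x = y.
A = [[1, 0, 0]]
B = [[1, 0, 0]]
C = [[0, 1, 0]]
z = [3, 9, 1]  # z = (io, 1, w) with io = (3, 9) and empty witness
```

With these values `r1cs_satisfied(A, B, C, z)` holds since $3 \cdot 3 = 9$, while replacing $9$ by any other value violates the constraint.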
\textbf{Thm 4.1} $\forall$ R1CS instance $x =(\mathbb{F}, A, B, C, io, m, n)$, $\exists$ a degree-3, $\log m$-variate polynomial $G$ such that $\sum_{x \in\{0,1\}^{\log m}} G(x)=0$ iff $\exists$ a witness $w$ such that $Sat_{R1CS}(x, w)=1$.
\vspace{0.5cm}
For an R1CS instance $x$, let $s =\lceil\log m \rceil$.
We can view the matrices $A, B, C \in\mathbb{F}^{m \times m}$ as functions $\{0,1\}^s \times\{0,1\}^s \rightarrow\mathbb{F}$.
For a given witness $w$ to $x$, let $z=(io, 1, w)$.
View $z$ as a function $\{0,1\}^s \rightarrow\mathbb{F}$, so any entry in $z$ can be accessed with an $s$-bit identifier.
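This "vector as function" view amounts to interpreting the $s$-bit identifier as a binary index, with out-of-range indices mapping to $0$ (padding up to $2^s$). A minimal sketch (names and values are illustrative):

```python
# View a vector z in F^m as a function {0,1}^s -> F, with s = ceil(log2 m).
# Bit-strings indexing past the end of z map to 0 (zero-padding).
import math

def as_function(z, p=97):
    s = max(1, math.ceil(math.log2(len(z))))
    def f(bits):                                  # bits: tuple of s values in {0,1}
        idx = int("".join(map(str, bits)), 2)     # s-bit identifier -> integer index
        return z[idx] % p if idx < len(z) else 0  # pad with zeros beyond m
    return f, s

z = [3, 9, 1]              # m = 3, so s = 2
f, s = as_function(z)      # f((0,0)) = 3, f((0,1)) = 9, f((1,0)) = 1, f((1,1)) = 0
```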
Define $Q_{io}(t)=\sum_{x \in\{0,1\}^s}\widetilde{F}_{io}(x)\cdot\widetilde{eq}(t, x)$, where $\widetilde{eq}(t, x)=\prod_{i=1}^s (t_i \cdot x_i +(1- t_i)\cdot(1- x_i))$ is the MLE of $eq(x,e)=\{1 ~\text{if}~ x=e,~ 0 ~\text{otherwise}\}$.
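The product formula for $\widetilde{eq}$ is easy to verify directly: on Boolean inputs it is an indicator, and it extends to arbitrary field elements. A small sketch over an illustrative field $\mathbb{F}_{97}$:

```python
# eq~(t, x) = prod_i (t_i*x_i + (1-t_i)*(1-x_i)) over F_p.
# On Boolean x it equals 1 iff t == x; it is defined for any t in F^s.
p = 97  # illustrative small prime

def eq_tilde(t, x):
    acc = 1
    for t_i, x_i in zip(t, x):
        acc = acc * (t_i * x_i + (1 - t_i) * (1 - x_i)) % p
    return acc

# Indicator behaviour on the hypercube, and an evaluation off the cube:
eq_tilde((0, 1), (0, 1))   # -> 1
eq_tilde((0, 1), (1, 1))   # -> 0
eq_tilde((5, 7), (1, 1))   # -> 5*7 mod 97 = 35
```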
Note that $Q_{io}(\cdot)$ is a multivariate polynomial such that
$$Q_{io}(t)=\widetilde{F}_{io}(t) ~\forall t \in\{0,1\}^s.$$
Hence $Q_{io}(\cdot)$ is a zero-polynomial iff $\widetilde{F}_{io}(\cdot)$ encodes a witness $w$ such that $Sat_{R1CS}(x, w)=1$.
To check that $Q_{io}(\cdot)$ is a zero-polynomial, V checks that $Q_{io}(\tau)=0$ for $\tau\in^R \mathbb{F}^s$ (Schwartz-Zippel-DeMillo-Lipton lemma).
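The lemma gives a probabilistic zero-test: a nonzero $s$-variate polynomial of total degree $d$ vanishes at a random $\tau \in \mathbb{F}^s$ with probability at most $d/|\mathbb{F}|$. A toy sketch, with polynomials given as Python functions over a large prime field (all names here are illustrative):

```python
# Probabilistic zero-test via Schwartz-Zippel: evaluate at random points.
import random

p = 2**31 - 1   # a large prime, so the error bound d/p is tiny

def is_zero_poly(Q, s, trials=10):
    """Heuristic check that Q: F_p^s -> F_p is the zero polynomial."""
    return all(Q([random.randrange(p) for _ in range(s)]) == 0
               for _ in range(trials))

Z = lambda v: 0                        # the zero polynomial
N = lambda v: (v[0] * v[1] - 1) % p    # a nonzero polynomial of degree 2
# is_zero_poly(Z, s=2) passes; is_zero_poly(N, s=2) fails w.h.p. (>= 1 - (2/p)^10)
```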
\paragraph{Recap}
\begin{itemize}
\item[] We have that $Sat_{R1CS}(x,w)=1$ iff $F_{io}(x)=0 ~\forall x \in\{0,1\}^s$.
\item[] To be able to use sum-check, we use its polynomial extension $\widetilde{F}_{io}(x)$, using sum-check to prove that $\widetilde{F}_{io}(x)=0 ~\forall x \in\{0, 1\}^s$, which means that $Sat_{R1CS}(x,~w)=1$.
\item[] To prevent potential canceling terms, we combine $\widetilde{F}_{io}(x)$ with $\widetilde{eq}(\tau, x)$, obtaining $G_{io, \tau}(x)=\widetilde{F}_{io}(x)\cdot\widetilde{eq}(\tau, x)$.
\item[] Thus $Q_{io}(t)=\sum_{x \in\{0,1\}^s}\widetilde{F}_{io}(x)\cdot\widetilde{eq}(t, x)$, and then we prove that $Q_{io}(\tau)=0$, for $\tau\in^R \mathbb{F}^s$.
\end{itemize}
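The recap above can be exercised end-to-end on a toy scale: given the values of $\widetilde{F}_{io}$ on $\{0,1\}^s$, compute $Q_{io}(\tau)$ directly as the weighted sum over the hypercube. A satisfying witness (all values $0$) gives $Q_{io}(\tau)=0$; a single violated constraint does not (the concrete field and values are illustrative):

```python
# Q_io(tau) = sum over {0,1}^s of F~(x) * eq~(tau, x), computed naively.
from itertools import product

p = 97  # illustrative small prime

def eq_tilde(t, x):
    acc = 1
    for t_i, x_i in zip(t, x):
        acc = acc * (t_i * x_i + (1 - t_i) * (1 - x_i)) % p
    return acc

def Q(tau, F_vals, s):
    """F_vals: values of F~ on {0,1}^s in lexicographic order."""
    return sum(F_vals[i] * eq_tilde(tau, x)
               for i, x in enumerate(product((0, 1), repeat=s))) % p

s = 2
Q((13, 57), [0, 0, 0, 0], s)   # satisfying witness: all F~(x) = 0 -> Q = 0
Q((13, 57), [0, 1, 0, 0], s)   # one violated constraint -> Q != 0
```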
\section{NIZKs with succinct proofs for R1CS}
From Thm 4.1: to check an R1CS instance $(\mathbb{F}, A, B, C, io, m, n)$, V can check whether
$$\sum_{x \in\{0,1\}^s} G_{io, \tau}(x)=0,$$
which through the sum-check protocol can be reduced to checking $e_x = G_{io, \tau}(r_x)$, where $r_x \in\mathbb{F}^s$.
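To make the reduction concrete, here is a minimal sum-check for a \emph{multilinear} polynomial given by its values on $\{0,1\}^s$ (the actual protocol runs on the degree-3 $G_{io,\tau}$; the multilinear case is the simplest instructive sketch, and the code below is illustrative, not the paper's protocol):

```python
# Minimal sum-check: honest prover and verifier in one loop.
# The polynomial is the MLE of `vals` (indexed by {0,1}^s, first bit
# most significant); each round message g_i is linear in one variable.
import random

p = 2**31 - 1

def mle_eval(vals, r):
    """Evaluate the MLE of vals at r in F_p^s by successive folding."""
    cur = list(vals)
    for r_i in r:
        half = len(cur) // 2
        cur = [(cur[j] + r_i * (cur[half + j] - cur[j])) % p
               for j in range(half)]
    return cur[0]

def sumcheck(vals, s):
    """Run s rounds; return True iff V accepts the claimed sum."""
    claim = sum(vals) % p                   # T, the claimed result
    cur, r = list(vals), []
    for _ in range(s):
        half = len(cur) // 2
        g0, g1 = sum(cur[:half]) % p, sum(cur[half:]) % p  # g_i(0), g_i(1)
        if (g0 + g1) % p != claim:          # round consistency check
            return False
        r_i = random.randrange(p)           # V's challenge
        r.append(r_i)
        cur = [(cur[j] + r_i * (cur[half + j] - cur[j])) % p
               for j in range(half)]        # fix variable i to r_i
        claim = sum(cur) % p                # new claim: g_i(r_i)
    return claim == mle_eval(vals, r)       # final "oracle" query at r
```

For an honest prover, e.g. `sumcheck([1, 2, 3, 4, 5, 6, 7, 8], 3)`, V always accepts; the reduction leaves V with a single evaluation claim at the random point $r$, exactly as in the text above.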
Only one term in $M_{r_x}(r_y)$ depends on the prover's witness: $\widetilde{Z}(r_y)$; the other terms can be computed locally by V in $O(n)$ time (see Section 6 of the paper for costs sub-linear in $n$).
Instead of evaluating $\widetilde{Z}(r_y)$ with $O(|w|)$ communication, P sends a commitment to $\widetilde{w}(\cdot)$ (the MLE of the witness $w$) to V before the first instance of the sum-check protocol.
\paragraph{Recap}
\begin{itemize}
\item[] To check the R1CS instance, V can check $\sum_{x \in\{0,1\}^s} G_{io, \tau}(x)=0$, which through the sum-check is reduced to $e_x = G_{io, \tau}(r_x)$, for $r_x \in\mathbb{F}^s$.
\item[] Evaluating $G_{io, \tau}(r_x)$ ($G_{io, \tau}(x)=\widetilde{F}_{io}(x)\cdot\widetilde{eq}(\tau, x)$) is not cheap: evaluating $\widetilde{eq}(\tau, r_x)$ takes $O(\log m)$, but to evaluate $\widetilde{F}_{io}(r_x)$, V needs to evaluate $\widetilde{A}, \widetilde{B}, \widetilde{C}, \widetilde{Z}$ at all $y \in\{0,1\}^s$.
\item[] P makes 3 separate claims: $\overline{A}(r_x)=v_A,~ \overline{B}(r_x)=v_B,~ \overline{C}(r_x)=v_C$, so V can evaluate $G_{io, \tau}(r_x)=(v_A \cdot v_B - v_C)\cdot\widetilde{eq}(r_x, \tau)$
\item[] The previous claims are combined into a single claim (random linear combination) to use only a single sum-check protocol:
\item[]$c=L(r_x)=\sum_{y \in\{0,1\}^s} M_{r_x}(y)$, where $M_{r_x}(y)$ is an $s$-variate polynomial with degree $\leq 2$ in each variable ($\Longleftrightarrow\mu= s,~ l=2,~ T=c$). Only $\widetilde{Z}(r_y)$ depends on P's witness; the other terms can be computed locally by V.
\item[] Instead of evaluating $\widetilde{Z}(r_y)$ in $O(|w|)$ communications, P uses a commitment to $\widetilde{w}(\cdot)$ (= MLE of the witness $w$).
\end{itemize}
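The random-linear-combination step in the recap can be sketched on a toy scale: instead of three sum-checks for $\sum_y M_A(y)=v_A$, $\sum_y M_B(y)=v_B$, $\sum_y M_C(y)=v_C$, V samples $\rho_A, \rho_B, \rho_C$ and checks the single combined claim; if any individual claim is false, the combined one fails except with probability $1/|\mathbb{F}|$. The functions below are given by their hypercube values and are purely illustrative:

```python
# Random linear combination of three sum claims into one.
import random

p = 2**31 - 1

def combined_claim_ok(MA, MB, MC, vA, vB, vC):
    """One check for three claims sum(M_*) = v_*, using random rho's."""
    rA, rB, rC = (random.randrange(p) for _ in range(3))
    lhs = sum(rA * a + rB * b + rC * c for a, b, c in zip(MA, MB, MC)) % p
    rhs = (rA * vA + rB * vB + rC * vC) % p
    return lhs == rhs

MA, MB, MC = [1, 2], [3, 4], [5, 6]    # values on {0,1}^1; sums 3, 7, 11
combined_claim_ok(MA, MB, MC, 3, 7, 11)    # all three claims true -> accepted
combined_claim_ok(MA, MB, MC, 3, 7, 12)    # one false claim -> rejected w.h.p.
```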
\subsection{Full protocol}
\begin{footnotesize}
(Recall the sum-check params: $\mu$: number of variables (and of rounds), $l$: upper bound on the degree in each variable, $T$: claimed result.)
\end{enumerate}
\end{itemize}
\vspace{2cm}
Section 6 of the paper describes how, in step 16, instead of evaluating $\widetilde{A},~\widetilde{B},~\widetilde{C}$ at $r_x,~r_y$ with $O(n)$ cost, P commits to $\widetilde{A},~\widetilde{B},~\widetilde{C}$ and later provides proofs of openings.