Round-Efficient Secure Inference Based on Masked Secret Sharing for Quantized Neural Network

Bibliographic Details
Main Authors: Wei, Weiming; Tang, Chunming; Chen, Yucheng
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9955064/
https://www.ncbi.nlm.nih.gov/pubmed/36832755
http://dx.doi.org/10.3390/e25020389
Description
Summary: Existing secure multiparty computation protocols based on secret sharing usually assume a fast network, which limits their practicality on low-bandwidth, high-latency networks. A proven approach is to reduce the protocol's communication rounds as much as possible or to construct a constant-round protocol. In this work, we provide a series of constant-round secure protocols for quantized neural network (QNN) inference, built from masked secret sharing (MSS) in the three-party honest-majority setting. Our experiments show that our protocols are practical and well suited to low-bandwidth, high-latency networks. To the best of our knowledge, this is the first implementation of QNN inference based on masked secret sharing.
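
As a rough illustration only (not the paper's actual protocol), the following Python sketch shows the masked-secret-sharing idea commonly used in three-party honest-majority protocols: a value x is represented by a public masked value m = x + λ together with additive shares of the mask λ held by the parties, so additions are local and require no communication rounds. The ring modulus P and all function names here are illustrative assumptions.

import secrets

P = 2**64  # illustrative ring modulus; a QNN protocol would work over a quantized ring

def mss_share(x):
    # Hypothetical offline step: sample a random mask lam, additively share it
    # among three parties, and publish the masked value m = x + lam (mod P).
    lam = secrets.randbelow(P)
    s1, s2 = secrets.randbelow(P), secrets.randbelow(P)
    s3 = (lam - s1 - s2) % P
    return (x + lam) % P, (s1, s2, s3)

def mss_open(m, mask_shares):
    # Reconstruct: subtract the recombined mask from the public masked value.
    return (m - sum(mask_shares)) % P

def mss_add(mx, lx, my, ly):
    # Addition is local (round-free): masked values and mask shares add component-wise.
    return (mx + my) % P, tuple((a + b) % P for a, b in zip(lx, ly))

# Toy check of the sharing invariant
mx, lx = mss_share(5)
my, ly = mss_share(7)
mz, lz = mss_add(mx, lx, my, ly)
assert mss_open(mz, lz) == 12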