TY - GEN
T1 - Privacy-aware edge computing based on adaptive DNN partitioning
AU - Shi, Chengshuai
AU - Chen, Lixing
AU - Shen, Cong
AU - Song, Linqi
AU - Xu, Jie
PY - 2019/12
Y1 - 2019/12
N2 - Recent years have witnessed deep neural networks (DNNs) become the de facto tool in many applications such as image classification and speech recognition. But significant unmet needs remain in performing DNN inference tasks on mobile devices. Although edge computing enables complex DNN inference tasks to be performed in close proximity to the mobile device, performance optimization requires a carefully designed synergy between the edge and the mobile device. Moreover, the confidentiality of data uploaded to the possibly untrusted edge server is of great concern. In this paper, we investigate the impact of DNN partitioning on the inference latency performance and the privacy risks in edge computing. Based on the obtained insights, we design an offloading strategy that adaptively partitions the DNN in varying network environments to strike an optimal tradeoff between performance and privacy for battery-powered mobile devices. This strategy is designed under the learning-aided Lyapunov optimization framework and has a provable performance guarantee. Finally, we build a small-scale testbed to demonstrate the efficacy of the proposed offloading scheme.
AB - Recent years have witnessed deep neural networks (DNNs) become the de facto tool in many applications such as image classification and speech recognition. But significant unmet needs remain in performing DNN inference tasks on mobile devices. Although edge computing enables complex DNN inference tasks to be performed in close proximity to the mobile device, performance optimization requires a carefully designed synergy between the edge and the mobile device. Moreover, the confidentiality of data uploaded to the possibly untrusted edge server is of great concern. In this paper, we investigate the impact of DNN partitioning on the inference latency performance and the privacy risks in edge computing. Based on the obtained insights, we design an offloading strategy that adaptively partitions the DNN in varying network environments to strike an optimal tradeoff between performance and privacy for battery-powered mobile devices. This strategy is designed under the learning-aided Lyapunov optimization framework and has a provable performance guarantee. Finally, we build a small-scale testbed to demonstrate the efficacy of the proposed offloading scheme.
UR - http://www.scopus.com/inward/record.url?scp=85081984150&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85081984150&partnerID=8YFLogxK
U2 - 10.1109/GLOBECOM38437.2019.9013742
DO - 10.1109/GLOBECOM38437.2019.9013742
M3 - Conference contribution
AN - SCOPUS:85081984150
T3 - 2019 IEEE Global Communications Conference, GLOBECOM 2019 - Proceedings
BT - 2019 IEEE Global Communications Conference, GLOBECOM 2019 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 IEEE Global Communications Conference, GLOBECOM 2019
Y2 - 9 December 2019 through 13 December 2019
ER -