11/03/2018 / By Edsel Cook
A combination of two encryption techniques may one day allow neural networks to have their computing cake and eat it, too. An article in Science Daily stated that the new technique improves the security of cloud-based machine learning without incurring any penalties in speed.
Machine learning demands enormous computational power. Convolutional neural networks, for instance, burn through vast amounts of it just to classify images.
Not everyone can afford the supercomputers required to run neural networks. Instead, many send their data to cloud platforms that crunch the numbers for a fee. That information is often private and important, and it must be protected, whether from hackers or from the owners of the server.
Encryption offers a measure of protection for the data, but the price paid for that security is a drastic decrease in the neural network's speed.
An MIT research team came up with GAZELLE, an encryption system that combines homomorphic encryption and garbled circuits. The researchers believe this combined approach can greatly speed up neural networks while maintaining security. (Related: Quantum physics puzzle SOLVED: Researchers say that totally secure data transfer now possible.)
The MIT researchers put GAZELLE through a trial involving two-party image classification. In this test, a user sent encrypted images to an online server running the new encryption system. The user and the server then traded encrypted data so that the user's image could be classified.
GAZELLE reportedly prevented the server from uncovering anything about the uploaded data, while also stopping the user from learning anything about the parameters of the server's network. Furthermore, the experimental system ran anywhere from 20 to 30 times faster than current encryption systems, and it lowered the amount of network bandwidth the neural network needed.
The MIT-developed encryption system will greatly benefit machine learning that relies on cloud platforms for processing power. For example, hospitals use convolutional neural networks to quickly spot the signs of certain conditions and diseases in MRI scans. These networks are trained by showing them MRI imagery. However, the neural networks are often powered by cloud platforms operated by a third party, so the hospital would need to send the MRI images to that third party's server. At the same time, the hospital must protect the privacy of the patients whose MRI data is being used to train the networks. But encrypting the data slows down the neural network's evaluation, which could be dangerous when a diagnosis is time-sensitive.
Homomorphic encryption lets a server perform the necessary computations directly on encrypted data and send the encrypted result back to the user. The drawback is that it adds "noise" to the data at each layer of the network, which eats up processing power. Garbled circuit encryption takes an input from both parties, performs the computation, and then sends each participant only its own output, so neither side sees the other's data. However, it becomes inefficient if it is forced to perform many computations.
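For readers who want a concrete picture of the first idea, the toy Python sketch below shows the homomorphic property in miniature using a Paillier-style scheme: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so a server can do arithmetic on numbers it cannot read. The tiny primes and the scheme itself are illustrative assumptions only; GAZELLE's actual homomorphic encryption is a different, more efficient construction.

```python
# Toy Paillier-style additively homomorphic encryption.
# Insecure, tiny parameters chosen purely for illustration -- GAZELLE itself
# uses a different (lattice-based) homomorphic scheme, which is not shown here.
import math
import random

# Two small primes (real deployments use primes of 1024+ bits).
p, q = 293, 433
n = p * q
n_sq = n * n
g = n + 1
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)   # Carmichael function of n
mu = pow(lam, -1, n)                                # modular inverse of lambda mod n

def encrypt(m):
    """Encrypt integer m (0 <= m < n) with fresh randomness r."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    """Recover the plaintext from ciphertext c."""
    x = pow(c, lam, n_sq)
    L = (x - 1) // n
    return (L * mu) % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts,
# so the server can sum encrypted values without ever decrypting them.
c1, c2 = encrypt(20), encrypt(22)
c_sum = (c1 * c2) % n_sq
assert decrypt(c_sum) == 42

# Raising a ciphertext to a public constant multiplies the plaintext by it --
# the kind of linear operation a neural-network layer performs.
c_scaled = pow(encrypt(7), 6, n_sq)
assert decrypt(c_scaled) == 42
print("homomorphic addition and scaling verified")
```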
The MIT researchers combined these systems in GAZELLE. The two encryption techniques essentially split the workload, with each focusing on the jobs it does well while avoiding its weaknesses.
So, homomorphic encryption handles the heavy computations that slow down garbled circuits, while garbled circuits take care of sharing data between the user and the online network. Furthermore, both encryption methods use the same randomization scheme, allowing them to preserve the privacy of the data they process and share.
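To make that division of labor concrete, here is a schematic Python sketch of how such a protocol might alternate between the two tools: an encrypted linear layer on the server, then a small two-party step for each nonlinear activation. The cryptography is simulated in plaintext, and the layer sizes and function names are illustrative assumptions, not GAZELLE's actual implementation.

```python
# Schematic sketch of a GAZELLE-style split between the two techniques.
# The crypto is *simulated* here; the point is the structure of the protocol:
# homomorphic encryption handles the heavy linear layers on the server,
# while a garbled-circuit step handles the nonlinear activations between them.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-layer network held by the server (its weights stay private).
weights = [rng.standard_normal((16, 16)) for _ in range(3)]

def linear_layer_under_he(w, x):
    # In the real protocol this matrix-vector product is computed on
    # homomorphically encrypted data, so the server never sees x in the clear.
    return w @ x

def relu_via_garbled_circuit(x):
    # In the real protocol the two parties run a small garbled circuit that
    # computes max(0, x) on hidden values; neither side learns x itself.
    return np.maximum(0.0, x)

def classify(encrypted_input):
    activation = encrypted_input
    for w in weights:
        activation = linear_layer_under_he(w, activation)   # HE: heavy algebra
        activation = relu_via_garbled_circuit(activation)   # GC: cheap comparison
    return int(np.argmax(activation))                       # only the label is revealed

user_image = rng.standard_normal(16)   # stands in for the user's encrypted image
print("predicted class:", classify(user_image))
```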
Keep up with the latest news on encryption systems by visiting InformationTechnology.news.