Mo et al., 2020 - Google Patents
DarkneTZ: towards model privacy at the edge using trusted execution environments
- Document ID
- 14889554492878328609
- Author
- Mo F
- Shamsabadi A
- Katevas K
- Demetriou S
- Leontiadis I
- Cavallaro A
- Haddadi H
- Publication year
- 2020
- Publication venue
- Proceedings of the 18th International Conference on Mobile Systems, Applications, and Services
Snippet
We present DarkneTZ, a framework that uses an edge device's Trusted Execution Environment (TEE) in conjunction with model partitioning to limit the attack surface against Deep Neural Networks (DNNs). Increasingly, edge devices (smartphones and consumer IoT …
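The model-partitioning idea in the snippet can be sketched as follows. This is an illustrative toy under assumed names (`split_model`, `run_partitioned`, plain Python callables as "layers"), not DarkneTZ's actual implementation, which partitions Darknet models across ARM TrustZone's normal and secure worlds:

```python
# Hypothetical sketch of TEE-based model partitioning: a DNN is modeled as a
# list of layer functions, split so that only the later, more sensitive
# layers run inside the (simulated) TEE.

def split_model(layers, partition_point):
    """Split layer functions into an untrusted (normal-world) prefix
    and a trusted (TEE) suffix at `partition_point`."""
    return layers[:partition_point], layers[partition_point:]

def run_partitioned(x, normal_layers, tee_layers):
    # Early layers execute in the normal world, where an attacker could
    # observe intermediate activations.
    for layer in normal_layers:
        x = layer(x)
    # Later layers (and the final output) execute inside the TEE, shrinking
    # the attack surface against the model.
    for layer in tee_layers:
        x = layer(x)
    return x

# Toy 4-"layer" model: each layer is just an affine function here.
layers = [lambda x: 2 * x, lambda x: x + 1, lambda x: 3 * x, lambda x: x - 5]
normal, tee = split_model(layers, partition_point=2)
result = run_partitioned(1.0, normal, tee)  # ((2*1 + 1) * 3) - 5 = 4.0
```

The design choice the paper evaluates is where to place `partition_point`: keeping more layers in the TEE improves privacy but costs performance, since secure-world memory and compute are limited on edge devices.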
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/57—Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
- G06F21/577—Assessing vulnerabilities and evaluating computer system security
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/70—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
- G06F21/82—Protecting input, output or interconnection devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/70—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
- G06F21/71—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information
- G06F21/74—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information operating in dual or compartmented mode, i.e. at least one secure mode
Similar Documents
| Publication | Title |
|---|---|
| Mo et al. | DarkneTZ: towards model privacy at the edge using trusted execution environments |
| US11971980B2 (en) | Using trusted execution environments to perform a communal operation for mutually-untrusted devices |
| Ahmad et al. | OBLIVIATE: A Data Oblivious Filesystem for Intel SGX |
| Kok et al. | Ransomware, threat and detection techniques: A review |
| US12093371B2 (en) | Data distribution using a trusted execution environment in an untrusted device |
| Sun et al. | ShadowNet: A secure and efficient on-device model inference system for convolutional neural networks |
| Goldstein et al. | Preventing DNN model IP theft via hardware obfuscation |
| US11947659B2 (en) | Data distribution across multiple devices using a trusted execution environment in a mobile device |
| US20190058577A1 (en) | Techniques for key provisioning in a trusted execution environment |
| Sun et al. | Rearguard: Secure keyword search using trusted hardware |
| Li et al. | P3M: a PIM-based neural network model protection scheme for deep learning accelerator |
| Schlögl et al. | eNNclave: Offline inference with model confidentiality |
| Messaoud et al. | Shielding federated learning systems against inference attacks with ARM TrustZone |
| Nayan et al. | SoK: All you need to know about on-device ML model extraction - the gap between research and practice |
| Zhang et al. | SoftME: A software-based memory protection approach for TEE system to resist physical attacks |
| Li et al. | TEESlice: Protecting sensitive neural network models in trusted execution environments when attackers have pre-trained models |
| Babar et al. | Trusted deep neural execution: a survey |
| Kato et al. | Olive: Oblivious federated learning on trusted execution environment against the risk of sparsification |
| Li et al. | ENIGMA: Low-latency and privacy-preserving edge inference on heterogeneous neural network accelerators |
| Bai et al. | SecMDP: Towards privacy-preserving multimodal deep learning in end-edge-cloud |
| Chang et al. | Rig: A simple, secure and flexible design for password hashing |
| Costa et al. | SecureQNN: introducing a privacy-preserving framework for QNNs at the deep edge |
| Naghibijouybari et al. | Covert channels on GPGPUs |
| Yang et al. | Penetralium: Privacy-preserving and memory-efficient neural network inference at the edge |
| Casell et al. | A performance analysis for confidential federated learning |