Neural Acceleration for General-Purpose Approximate Programs

Published on: 2016-07-28 ● Video Link: https://www.youtube.com/watch?v=xIf6nckKwmw



Duration: 1:03:21


We are exploring a learning-based approach to accelerating approximate programs. We describe a program transformation, called the Parrot transformation, that selects and trains a neural network to mimic a region of imperative code. After this learning phase, the compiler replaces the original code with an invocation of a low-power accelerator called a neural processing unit (NPU). The NPU is tightly coupled to the processor's speculative pipeline, since many of the accelerated code regions are small. Because neural networks produce inherently approximate results, we define a programming model that allows programmers to identify approximable code regions: code that can produce imprecise but acceptable results. Mimicking approximable code regions with an NPU is both faster and more energy efficient than executing the original code. For a set of diverse applications, NPU acceleration provides significant speedups and energy savings with dedicated digital hardware. We also study how an efficient software implementation of neural networks, together with a few ISA extensions, can deliver application speedups under the Parrot transformation even without dedicated hardware for neural networks. We further explore the feasibility of using analog NPUs to accelerate imperative approximate code. I will present the results from these studies.
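The core idea of the Parrot transformation can be sketched in miniature: observe the input/output behavior of an approximable code region, train a small neural network to mimic it, and then invoke the network in place of the original code. The kernel function, network shape, and training loop below are illustrative assumptions for this sketch, not the toolchain described in the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel(x):
    """Original precise code region (hypothetical stand-in for an
    approximable region selected by the programmer)."""
    return np.sin(np.pi * x)

# Tiny MLP: 1 input -> 8 tanh hidden units -> 1 linear output.
# A real NPU model would be sized by the compiler's training phase.
W1 = rng.normal(0.0, 1.0, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)

# "Observation" phase: collect input/output pairs from the original code.
X = rng.uniform(0.0, 1.0, (256, 1))
Y = kernel(X)

# "Training" phase: full-batch gradient descent on mean-squared error.
lr = 0.1
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)           # hidden activations
    P = H @ W2 + b2                    # network predictions
    E = P - Y                          # prediction error
    gW2 = H.T @ E / len(X); gb2 = E.mean(axis=0)
    dH = (E @ W2.T) * (1.0 - H ** 2)   # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

def kernel_npu(x):
    """Approximate replacement: the compiler substitutes this network
    invocation for the original code region."""
    return np.tanh(x @ W1 + b1) @ W2 + b2

# The result is imprecise but acceptable for approximable code.
mse = float(np.mean((kernel_npu(X) - Y) ** 2))
```

The substitution trades exactness for efficiency: a fixed-topology network evaluation can be mapped onto low-power hardware, which is where the speedup and energy savings come from.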




Other Videos By Microsoft Research


2016-08-01  Programming Devices and Services with P - Lecture 2
2016-08-01  Practical Statically-checked Deterministic Parallelism
2016-08-01  Real-time Air Quality Monitoring Network Using Low-Cost Devices
2016-08-01  Experiments in Indoor Localization and Vehicle Classification
2016-08-01  Achieving Household Energy Breakdown at Scale
2016-08-01  Verifying Constant-Time Implementations
2016-07-28  Quantum Computation for Quantum Chemistry: Status, Challenges, and Prospects - Session 3
2016-07-28  Asymptotic behavior of the Cheeger constant of super-critical percolation in the square lattice
2016-07-28  Recovering Washington’s Wolves & Preserving the Critical Link
2016-07-28  The similarity distance on graphs and graphons
2016-07-28  Neural Acceleration for General-Purpose Approximate Programs
2016-07-28  Snow Hydrology at the Scale of Mountain Ranges
2016-07-28  Vote Privacy, Revisited: New Definitions, Tools and Constructions
2016-07-28  Dispelling an Old Myth about an Ancient Algorithm
2016-07-28  Behavior Based Authentication using Gestures and Signatures
2016-07-28  Approximating the Expansion Profile and Almost Optimal Local Graph Clustering
2016-07-28  Stochastic Dual Coordinate Ascent and its Proximal Extension for Regularized Loss Minimization
2016-07-28  A Practical Approach to Reduce the Power Consumption of LCD Displays
2016-07-28  CryptDB: Processing Queries on an Encrypted Database
2016-07-28  Performing Time, Space and Light
2016-07-28  Probabilistic Methods for Efficient Search & Statistical Learning in Extremely High-Dimensional Data



Tags:
microsoft research