
How about designing and deploying intelligent machine-learned models onto resource-constrained platforms and small single-board computers, like Raspberry Pi, Arduino, and micro:bit? How interesting would that be?
This is exactly what the open source Embedded Learning Library (ELL) project is about. The deployed models run locally, without requiring a network connection and without relying on servers in the cloud. ELL is an early preview of the embedded AI and machine learning technologies developed at Microsoft Research.
Chris Lovett from Microsoft Research gives us a fantastic demo of the project in this episode of the IoT Show.
Get the ELL code on GitHub: https://github.com/microsoft/ell
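To make "runs locally" concrete, here is a minimal Python sketch in the spirit of the ELL Raspberry Pi image-classification tutorial. The module name `model` and the `get_default_input_shape`/`predict` calls reflect how a compiled ELL model is typically wrapped for Python, but the exact names depend on your ELL version and on how the model was compiled, so treat them as assumptions and check the current tutorials.

```python
# Minimal sketch of local inference with a compiled ELL model on a Raspberry Pi.
# Assumptions: the model has already been compiled and wrapped for the Pi, so it
# can be imported as a plain Python module named `model`; the function names
# below (get_default_input_shape, predict) follow the ELL image-classification
# tutorials but may differ in your ELL version.
import numpy as np
import model  # the compiled ELL model wrapper built for the device

# Ask the model what input it expects (e.g. a small image for a classifier).
input_shape = model.get_default_input_shape()

# Fake input for illustration; a real app would fill this from a camera or mic.
input_data = np.random.rand(
    input_shape.rows * input_shape.columns * input_shape.channels
).astype(np.float32)

# Run inference entirely on the device: no network, no cloud round trip.
predictions = model.predict(input_data)
print("top prediction:", int(np.argmax(np.asarray(predictions))))
```

Everything above executes on the device itself; the cloud is never contacted.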
Embedded, tiny, and in Python :-) - very funny, in typical MS style.
@Anatoly: the Raspberry Pi demo uses Python just for convenience, but the audio keyword spotting demo is pure C++.
@ChrisLovett: thanks for the answer. I just meant that, say, using an Intel CPU with embedded neurons would not be a bad idea.
Sequential programming in Python or even C++ is, I'm afraid, many times slower than an FPGA or neurochip implementation, where thousands of operations are performed in parallel.
BTW, this Channel 9 site works in such a way that I have to post twice...
@Anatoly: oh, sure, many companies are working on hardware optimization of neural networks, including Intel. For large, complicated vision models it makes sense to use special hardware, including GPUs, TPUs, NPUs, FPGAs, and even custom ASICs. ELL can also target some of these hardware optimizations when they are provided by the LLVM back end (for example, LLVM can already target Qualcomm Hexagon DSP chips).
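For context, the target-specific code generation happens when an .ell model is compiled and wrapped; the ELL tutorials drive this with the wrap.py tool, roughly as sketched below. The flag spellings and target names shown are from memory and vary between ELL releases, so verify them against the repository documentation before relying on them.

```
# Compile an ELL model for a Raspberry Pi 3 and wrap it as a Python module.
# <ELL-root> is a placeholder for a clone of https://github.com/microsoft/ell;
# the flag names (--model_file, --language, --target) are approximate and
# should be checked against the wrap.py help in your ELL version.
python <ELL-root>/tools/wrap/wrap.py --model_file model.ell --language python --target pi3
```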