February 6, 2020

Building AI with computation and data distributed across the CPUs of autonomous vehicles

I've been trying to estimate how much compute power would be available for machine learning tasks if the entire fleet of ~1M Teslas were available (at some trickle of incremental battery use, or maybe only while driving), and compare that to what Amazon has with Echos (each far less powerful) or to spare capacity on the Amazon, Apple, and Google cloud platforms.
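A minimal sketch of the estimate, assuming placeholder figures: roughly 144 TOPS per vehicle (the ballpark for a current in-car inference computer) and 10% of the fleet idle and willing to contribute at any moment. Both numbers are illustrative assumptions, not measurements.

```python
# Back-of-envelope estimate of aggregate fleet compute.
# All inputs are illustrative assumptions, not measured figures:
#   n_cars       - fleet size
#   tops_per_car - per-vehicle accelerator throughput, in TOPS
#   availability - fraction of the fleet idle/plugged-in and contributing

def estimate_fleet_tops(n_cars: int, tops_per_car: float, availability: float) -> float:
    """Aggregate throughput (TOPS) available from the fleet."""
    return n_cars * tops_per_car * availability

if __name__ == "__main__":
    # ~1M cars, ~144 TOPS each, 10% available at any moment
    fleet = estimate_fleet_tops(1_000_000, 144, 0.10)
    print(f"~{fleet:,.0f} TOPS aggregate")
```

Even under these rough assumptions the aggregate is in the tens of millions of TOPS, which is why the comparison to idle cloud capacity is interesting at all; the real questions are bandwidth, latency, and coordination, not raw ops.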

It might be a unique opportunity to develop distributed AI on an interesting computational base.
I'll update as I find out more.

Restoring Medley Interlisp: running well on modern systems

At Interlisp.org and on GitHub (see the issue list). It's great to work with old friends, as if 30 years hadn't passed. We have a...