I've been trying to estimate how much compute would be available for machine learning tasks if the entire fleet of ~1M Teslas could be tapped (at some trickle of incremental battery use, or perhaps only while driving), and to compare that with what Amazon has in its fleet of Echos (each nothing like as powerful) or with the spare capacity on the Amazon, Apple, and Google cloud platforms.
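As a first pass, the estimate is just multiplication. Here's a sketch of the arithmetic; every figure in it is an assumption I've plugged in for illustration (fleet size, per-car throughput, and what fraction of cars are idle and reachable at any moment), not a measured number.

```python
# Back-of-envelope fleet compute estimate.
# All three inputs are assumptions, not measurements.
FLEET_SIZE = 1_000_000      # assumed number of vehicles in the fleet
TOPS_PER_CAR = 72           # assumed usable per-vehicle throughput, in TOPS
AVAILABILITY = 0.10         # assumed fraction of cars idle and reachable

# 1 TOPS = 1e12 operations/second, so total ops/s = fleet_tops * 1e12.
fleet_tops = FLEET_SIZE * TOPS_PER_CAR * AVAILABILITY
print(f"Usable fleet compute: ~{fleet_tops / 1e6:.1f} exa-ops/s")
```

Even with a pessimistic 10% availability, the aggregate comes out in the exa-ops range, which is the comparison I want to make against spare cloud capacity.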
It might be a unique opportunity to develop distributed AI on an unusual computational substrate.
I'll update as I find out more.