I've been trying to estimate how much compute would be available for machine learning if the entire fleet of ~1M Teslas could contribute (at some trickle of incremental battery use, or perhaps only while driving), and to compare that with what Amazon has in its Echo devices (far less powerful per unit) or with the spare capacity on the Amazon, Apple, and Google cloud platforms.
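A rough back-of-envelope sketch of the fleet side of that comparison. Every figure here is an assumption for illustration (per-car throughput, what fraction of cars are reachable at once), not a measurement:

```python
# Back-of-envelope estimate of aggregate Tesla-fleet compute.
# All numbers below are assumptions, not measured figures.

fleet_size = 1_000_000   # ~1M cars, per the estimate above
tops_per_car = 100       # assumed per-car inference throughput, in TOPS
availability = 0.10      # assume only 10% of cars are idle/reachable at a time

fleet_tops = fleet_size * tops_per_car * availability
print(f"Usable fleet compute: {fleet_tops:,.0f} TOPS")
```

With these made-up inputs the fleet works out to 10 million TOPS of usable capacity; the interesting exercise is plugging in real per-car and availability numbers as they become known.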
It might be a unique opportunity to develop distributed AI on an interesting computational substrate.
I'll update as I find out more.
Medley Interlisp Project, by Larry Masinter et al.
I haven't been blogging -- most of my focus has been on Medley Interlisp. Tell me what you think!