I've been trying to estimate how much compute would be available for machine learning if the entire fleet of ~1M Teslas could be tapped (at some trickle of incremental battery use, or perhaps only while driving), and to compare that against what Amazon has in Echos (nothing like as powerful per device) or against spare capacity on the Amazon / Apple / Google cloud platforms.
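The back-of-envelope arithmetic might look like the sketch below. Every number here is an illustrative assumption (fleet size, per-car throughput, availability), not a measured figure:

```python
# Rough estimate of usable fleet compute. All constants are assumptions
# for illustration only, not measured or published numbers.

FLEET_SIZE = 1_000_000      # assumed number of reachable cars
TOPS_PER_CAR = 70           # assumed onboard inference throughput, in TOPS
DUTY_CYCLE = 0.10           # assumed fraction of time a car can contribute

# 1 TOPS = 1e12 ops/s, so 1e6 TOPS = 1 exa-op/s.
fleet_tops = FLEET_SIZE * TOPS_PER_CAR * DUTY_CYCLE
print(f"Usable fleet throughput: {fleet_tops / 1e6:.1f} exa-ops/s")
```

Under those assumptions the fleet works out to a few exa-ops per second of low-precision inference compute; the interesting question is how that compares, per dollar and per watt, to idle datacenter capacity.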
It might be a unique opportunity to develop distributed AI on an interesting computational base.
I'll update as I find out more.
Medley Interlisp Project, by Larry Masinter et al.
I haven't been blogging -- most of my focus has been on Medley Interlisp. Tell me what you think!