Thoughts (Public Board)

by ,ndo, No refunds or exchanges! Fullstop!, Saturday, June 29, 2024, 20:52 (430 days ago) @ Cornpop Sutton

The question has multiple parts.

The code itself probably has issues. I haven't looked at it in any depth and am presuming that it is broken and needs fixing. I have a Python setup but haven't downloaded the packages mentioned. It was a sheer fluke that the machine wrote a Python script; OTOH Python is popular, so not a surprising choice of language.

The training has practical issues. Do you have multiple GPUs? :) But even a single CPU, run for 10x or 100x the number of hours, would presumably get the training done eventually (rough sketch below).
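
To make the single-CPU point concrete, here is a minimal sketch of how such a script could fall back to CPU when no GPU is present. I'm assuming PyTorch only because it's common; I haven't checked which packages the script actually imports, and the model and loop below are stand-ins, not the real thing.

import torch

# Pick whatever hardware is available; the same loop runs on a lone CPU,
# just 10x-100x slower.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"training on {device}")

model = torch.nn.Linear(128, 10).to(device)   # stand-in for the real model
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

for step in range(1000):                      # stand-in training loop
    x = torch.randn(32, 128, device=device)              # fake batch
    y = torch.randint(0, 10, (32,), device=device)        # fake labels
    loss = torch.nn.functional.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()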

Constructing the training data might raise its own questions.

As far as memory goes, it doesn't look to me like an inordinate amount is required. But who knows how much memory these Google people had available to them. In their paper they say they had 8 GPUs, each with its own RAM, of course. A single CPU, perhaps?
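
If it is PyTorch (again, an assumption on my part), checking how much memory the local hardware actually offers takes only a few lines, which at least tells you whether the paper's 8-GPU setup is strictly necessary or just a convenience.

import torch

# Report per-card memory, or note the CPU fallback.
if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 2**30:.1f} GiB")
else:
    print("no CUDA GPUs; training would use system RAM on the CPU")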

Probably doable by nobodies like us, were we sufficiently patient.

