Comments on: The Balancing Act Of Training Generative AI
https://www.nextplatform.com/2023/07/17/the-balancing-act-of-training-generative-ai/
In-depth coverage of high-end computing at large enterprises, supercomputing centers, hyperscale data centers, and public clouds.

By: HuMo https://www.nextplatform.com/2023/07/17/the-balancing-act-of-training-generative-ai/#comment-211333 Wed, 19 Jul 2023 04:11:23 +0000 In reply to 8^p.

Hmmm … might also consider the Encephalization Quotient (EQ) in this deepest and most insightful of AI analogies … roughly the ratio of thinking-power to brawn-power. There, the shrew ties the raven and chimp at 2.5, while the Lamb (Google's LaMDA?), Chinchilla (Google again?), and Llama (Meta's?) slightly disappoint at 0.6, 0.8, and 0.9, respectively; values that nominally match those of the MMLU table in this TNP article quite nicely (couldn't find the GPT, the PaLM, nor the Inflection-1 in contemporary published EQ tables … unfortunately).
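For anyone who wants to replay the napkin math: here is a minimal sketch of how such EQ figures are commonly derived, assuming Jerison's classic mammalian fit (expected brain mass ≈ 0.12 × body mass^(2/3)); the masses below are rough illustrative guesses, not published measurements.

```python
# Back-of-envelope EQ, assuming Jerison's classic mammalian fit:
# expected brain mass ~ 0.12 * body_mass**(2/3), masses in grams.
# The constant and exponent vary by study; treat results as ballpark.

def eq(brain_g: float, body_g: float) -> float:
    """Encephalization Quotient: actual / expected brain mass."""
    return brain_g / (0.12 * body_g ** (2.0 / 3.0))

# Illustrative, roughly Wikipedia-grade masses (assumptions):
print(f"human EQ ~ {eq(1350, 65_000):.1f}")  # ~7.0, near the 7.5 quoted
```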

The human still shines at 7.5 EQ, like a tasty mid-week cognitive cocktail of equal parts chimp, raven, and shrew, perfect for a Shakespearean Midsummer Night (substitute 6 parts lamb plus 4 parts llama if needed, in the afternoon or morning)!

By: 8^p https://www.nextplatform.com/2023/07/17/the-balancing-act-of-training-generative-ai/#comment-211317 Tue, 18 Jul 2023 18:46:48 +0000 In reply to q^8.

On the Shakespearean upside, they’ve surely Tamed the Shrew (Soricidae, 50M neurons; 10 shrews = 1 lamb)! 8^p

By: Timothy Prickett Morgan https://www.nextplatform.com/2023/07/17/the-balancing-act-of-training-generative-ai/#comment-211308 Tue, 18 Jul 2023 13:59:48 +0000 In reply to hoohoo.

It is a bunch of inflection points, from Inflection AI’s homepage. But yes. Indeed.

By: hoohoo https://www.nextplatform.com/2023/07/17/the-balancing-act-of-training-generative-ai/#comment-211306 Tue, 18 Jul 2023 13:19:28 +0000

An abstract picture for an abstract article about an abstract technology!

By: deadbeef https://www.nextplatform.com/2023/07/17/the-balancing-act-of-training-generative-ai/#comment-211294 Tue, 18 Jul 2023 06:21:22 +0000

One could think of LLMs as a nice recipe for applying vast amounts of classical computing power to very difficult problems. This kind of immortal computing has always had superhuman capabilities in many ways, and been glaringly deficient in others. So applying it at such truly vast scale is bound to produce superhuman results, in some areas.

Which is not to be sniffed at. But is it intelligence? That’s probably in the eye of the beholder …

Doing supercomputer-sized things really well, with a lot less work than a load of HPC programmers writing MPI code, is undoubtedly a new capability.

By: q^8 https://www.nextplatform.com/2023/07/17/the-balancing-act-of-training-generative-ai/#comment-211279 Tue, 18 Jul 2023 00:20:32 +0000

Food for thought (tasty from a gastronomic POV)! The local butcher (in southern France; not named Mary) did have little lamb brains for sale, which apparently people do cook up and eat down there (also kidneys and testicles … all of which probably have curative powers in local shamanism). The sheep may have 500 million neurons, each connected to 10,000 others, for a total of 5 trillion synapses (using the octopus as a proxy in Wikipedia’s “List of animals by number of neurons”; in honor of the Nobel Prize-winning “giant squid axon”).

From the one-synapse = one-parameter (weight) perspective, and assuming the lamb uses 10% of its brain, one concludes that a 500-billion-parameter model = one lamb (approximately). The question then naturally arises, considering GPT-4’s trillion parameters, of the degree to which one should expect that feeding 30 billion books and web pages to two lambs will result in superintelligence (granted that goats may be more appropriate, since they eat anything)?
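A minimal sketch of that napkin math, with every constant taken from the comment's own playful assumptions (one synapse per parameter, the folkloric 10% brain-utilization figure, and the rumored ~1-trillion-parameter count for GPT-4):

```python
# Napkin math: how many lambs per GPT-4? All figures below are the
# comment's playful assumptions, not neuroscience or OpenAI data.

NEURONS_PER_SHEEP = 500e6        # ~500 million neurons (octopus proxy)
SYNAPSES_PER_NEURON = 10_000     # assumed connectivity
BRAIN_UTILIZATION = 0.10         # the dubious "10% of the brain" folk figure

synapses = NEURONS_PER_SHEEP * SYNAPSES_PER_NEURON   # 5e12 = 5 trillion
params_per_lamb = synapses * BRAIN_UTILIZATION       # 5e11 = 500 billion

GPT4_PARAMS = 1e12               # the ~1 trillion figure cited above
print(f"lambs per GPT-4: {GPT4_PARAMS / params_per_lamb:.0f}")  # -> 2
```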

As common sense dictates and practical experience tells us, any large number of monkeys with typewriters will eventually produce the entire leather-bound edition of the complete works of William Shakespeare … whence we confidently conclude that there remains great promise for our Bovidae-scaled experiment in the development of artificial general intelligence (thereof?)! q^8
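For scale, the infinite-monkey joke comes with sobering arithmetic; a minimal sketch under the usual uniform-keystroke assumption (27 keys: 26 letters plus space):

```python
# Infinite-monkey napkin math: odds of randomly typing a target string
# on a uniform 27-key typewriter (26 letters + space).

KEYS = 27
target = "to be or not to be"             # 18 characters

p = (1 / KEYS) ** len(target)             # chance per exact attempt
print(f"p per attempt:     {p:.2e}")      # ~1.7e-26
print(f"expected attempts: {1 / p:.2e}")  # ~5.8e25
```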
