Highlights

Bill Gates says what’s been happening in AI in the last 12 months is “every bit as important as the PC or the internet”

The rise of Generative AI has been several years in the making. Depending on how you look at it, it traces its roots back to deep learning (which is several decades old but dramatically accelerated after 2012) and the advent of Generative Adversarial Networks (GANs) in 2014, led by Ian Goodfellow, under the supervision of his professor and Turing Award recipient, Yoshua Bengio.

the now-famous paper “Attention Is All You Need.”

Transformers brought a multimodal dimension to language models. There used to be separate architectures for computer vision, text and audio. With Transformers, one general architecture can now gobble up all sorts of data, leading to an overall convergence in AI.
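
A minimal sketch of that convergence, in PyTorch with made-up names and dimensions (none of this is from the article): text tokens and image patches each get a thin, modality-specific embedding into the same vector width, and then one shared Transformer encoder processes either modality, or both at once.

```python
# Illustrative only: both text tokens and image patches become sequences
# of d_model-sized vectors, so a single Transformer encoder can process
# either (or a concatenation of both). All names/dimensions are assumptions.
import torch
import torch.nn as nn

d_model = 256

# Text path: token ids -> d_model embeddings.
text_embed = nn.Embedding(num_embeddings=10_000, embedding_dim=d_model)

# Vision path: flattened 16x16 RGB patches (16*16*3 = 768) -> same width.
patch_embed = nn.Linear(16 * 16 * 3, d_model)

# One shared encoder stack, regardless of modality.
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=d_model, nhead=8, batch_first=True),
    num_layers=4,
)

tokens = torch.randint(0, 10_000, (1, 32))   # a 32-token sentence
patches = torch.randn(1, 196, 16 * 16 * 3)   # a 14x14 grid of image patches

# Each modality (or a concatenation of both) goes through the same
# stack of attention layers.
text_out = encoder(text_embed(tokens))                     # (1, 32, 256)
image_out = encoder(patch_embed(patches))                  # (1, 196, 256)
both = encoder(torch.cat([text_embed(tokens),
                          patch_embed(patches)], dim=1))   # (1, 228, 256)
```

The point of the sketch is that nothing inside the encoder is modality-specific; only the thin embedding layers differ, which is what makes the architectural convergence possible.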

coders are figuring out how to work alongside Copilot

Early research seems to indicate significant improvements in developer productivity and happiness

There’s already a marketplace for selling high-quality text prompts – PromptBase.

  • Note: Pfffffff…

Is AI content just… boring? Another strike against Generative AI is that it could be mostly underwhelming. Some commentators worry about an avalanche of uninteresting, formulaic content meant to help with SEO or demonstrate shallow expertise, not unlike what content farms (à la Demand Media) used to do (What are the new AI chatbots for? Nothing good).