3 Comments

You mention the pace of innovation in LLMs. What part of the logarithmic curve do you think we're in - the steep early part, or the end where we'll only see incremental gains?


Interesting, it sounds like you're assuming it's decelerating? I'd guess that zoomed out things will continue to be exponential, but locally we'll hit speed bumps like running out of public training data or compute, or hitting some limit of the transformer architecture, followed by a boost from synthetic data, multi-modality, training or compute efficiency, new/augmented architectures, etc.

There's exponentially more talent, investment dollars, attention, and compute going into AI, so I think that will yield a commensurate output.


Appreciate the thoughts. I'm genuinely not sure what part of the curve we're in. It's something I want to devote more time to if I end up doing something in AI.
