AI: post transformers
mcgrof
316 episodes
2 days ago
The transformer architecture revolutionized the world of neural networks and was a springboard for what we know today as modern artificial intelligence. This podcast reviews state-of-the-art research papers, starting with the transformer and moving forward.
Technology
DSPy and TextGrad: Compiling Language Model Systems
15 minutes 54 seconds
1 week ago

These two papers introduce programming models for systematically optimizing complex AI systems, particularly those built from Large Language Model (LLM) calls. The first presents **DSPy**, a framework that abstracts traditional, hard-coded LLM pipelines into parameterized, declarative modules that a compiler and **teleprompters** can optimize automatically; the compiled pipelines outperform hand-crafted prompts on tasks such as math word problems. The second introduces **TextGrad**, a general optimization framework in which LLMs generate and propagate **natural language gradients** (textual feedback) through computation graphs. This "textual differentiation" approach is applied successfully across diverse domains, including prompt optimization, code refinement, and scientific applications such as molecule and medical treatment plan design. Both works mark a shift from expert prompt engineering to systematic, programmatic optimization of compound AI systems.
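
As a rough sketch of the DSPy idea (not code from the episode or the paper; the model name, toy trainset, and metric are placeholder assumptions, and the exact API varies across `dspy` releases):

```python
import dspy
from dspy.teleprompt import BootstrapFewShot

# Assumed model name; any LM backend supported by dspy works here.
dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))

# Declarative module: the signature "question -> answer" stands in for a
# hand-written prompt; ChainOfThought adds an intermediate reasoning field.
math_qa = dspy.ChainOfThought("question -> answer")

# Hypothetical one-item trainset (the paper evaluates on math word problems).
trainset = [
    dspy.Example(question="I have 3 apples and buy 2 more. How many apples do I have?",
                 answer="5").with_inputs("question"),
]

# Metric the compiler optimizes against; exact match is a stand-in.
def exact_match(example, pred, trace=None):
    return example.answer == pred.answer

# The teleprompter "compiles" the program by bootstrapping few-shot
# demonstrations that score well under the metric, replacing manual prompt tuning.
compiled_qa = BootstrapFewShot(metric=exact_match).compile(math_qa, trainset=trainset)

print(compiled_qa(question="A pen costs 2 dollars. How much do 4 pens cost?").answer)
```

The division of labor is the point: the programmer declares module signatures and a metric, and the compiler searches over prompts and demonstrations.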

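And a minimal TextGrad-style optimization loop, adapted from the pattern in the project's public examples (the engine name and the toy variable are assumptions):

```python
import textgrad as tg

# Assumed engine; the "backward engine" is the LLM that writes the
# textual feedback serving as a natural-language gradient.
tg.set_backward_engine("gpt-4o", override=True)

# The parameter to optimize: a piece of text marked requires_grad=True.
solution = tg.Variable(
    "To solve 3x + 1 = 10: subtract 1, then divide by 3, so x = 3.",
    requires_grad=True,
    role_description="a candidate solution to a math problem",
)

# The loss is itself natural language: an LLM critiques the variable.
loss = tg.TextLoss("Check the reasoning step by step and point out any errors.")(solution)

# backward() propagates textual feedback through the computation graph;
# TGD ("Textual Gradient Descent") then rewrites the variable using that feedback.
loss.backward()
tg.TGD(parameters=[solution]).step()

print(solution.value)  # the revised solution text
```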

Sources:

DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines (October 5, 2023)
https://arxiv.org/pdf/2310.03714

TextGrad: Automatic “Differentiation” via Text (June 11, 2024)
https://arxiv.org/pdf/2406.07496
