AI: post transformers
mcgrof
340 episodes
18 hours ago
The transformer architecture revolutionized the world of neural networks and was a springboard for what we know today as modern artificial intelligence. This podcast reviews modern state-of-the-art research papers, starting from the transformer and moving onward.
Technology
NeurIPS 2025: Large Language Diffusion Models
AI: post transformers
12 minutes 39 seconds
1 month ago

This research paper introduces LLaDA, an 8-billion-parameter language model built on the masked diffusion model (MDM) framework, developed to challenge the assumption that core large language model (LLM) capabilities are exclusive to autoregressive models (ARMs). Unlike ARMs, which predict the next token sequentially, LLaDA defines a forward process that progressively masks tokens and a reverse process in which a Transformer predicts all masked tokens simultaneously. Trained from scratch, LLaDA demonstrates strong scalability and achieves performance comparable to advanced ARM baselines such as LLaMA 3 8B across benchmarks covering general knowledge, math, and code generation. Crucially, its non-autoregressive nature enables bidirectional modeling, which lets LLaDA address the reversal curse and outperform contemporary models, including GPT-4o, on complex reversal reasoning tasks. These findings suggest that essential LLM capabilities stem from fundamental generative modeling principles rather than from the autoregressive formulation itself, and that diffusion models offer a promising alternative paradigm for building robust, large-scale language models.


Source:

https://openreview.net/pdf?id=KnqiC0znVF
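The forward-masking and parallel-prediction loop described above can be sketched in a few lines. This is a toy illustration under stated assumptions, not the LLaDA implementation: `forward_mask`, `reverse_step`, and `toy_predictor` are hypothetical names, and in the real model the reverse process is a trained Transformer rather than a stand-in function.

```python
import random

MASK = "[MASK]"

def forward_mask(tokens, t, rng):
    """Forward process: independently mask each token with probability t,
    where t in [0, 1] plays the role of the diffusion timestep
    (t=1 masks everything, t=0 leaves the sequence untouched)."""
    return [MASK if rng.random() < t else tok for tok in tokens]

def reverse_step(masked, predictor):
    """Reverse process: predict ALL masked positions simultaneously,
    leaving unmasked tokens untouched. In LLaDA the predictor is a
    Transformer conditioned on the whole (bidirectional) context;
    here it is a hypothetical stand-in function."""
    return [predictor(masked, i) if tok == MASK else tok
            for i, tok in enumerate(masked)]

def toy_predictor(context, i):
    # Stand-in for the Transformer: just emits a placeholder per position.
    return f"<pred_{i}>"

rng = random.Random(0)
seq = ["the", "cat", "sat", "on", "the", "mat"]
noised = forward_mask(seq, t=0.5, rng=rng)       # some tokens become [MASK]
denoised = reverse_step(noised, toy_predictor)   # all masks filled in one pass
```

The key contrast with an ARM is visible in `reverse_step`: every masked position is filled in a single parallel pass over the sequence, rather than one token at a time from left to right.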
