Bringing Foundation Models to Small Data
This article explores TabPFN, a transformer-based foundation model designed for small tabular datasets. Trained on millions of synthetic datasets generated via structural causal models, TabPFN learns to predict labels through in-context learning. It outperforms traditional methods like CatBoost and XGBoost in both speed and accuracy, while offering robustness, interpretability, and fine-tuning capabilities. A breakthrough in tabular ML, it redefines what's possible on structured data.

Juan Manuel Ortiz de Zarate
Apr 11 · 11 min read


Can a Chatbot Make Us Feel Better (or Worse)?
Can AI chatbots comfort us—or make us dependent? A study explores ChatGPT's emotional impact and the ethics of affective design.

Juan Manuel Ortiz de Zarate
Apr 4 · 9 min read


Tech Titans Turn to Atomic Power to Fuel the Future
Tech giants turn to nuclear energy to power AI, tackling rising energy demands and environmental impact with bold new strategies.

Juan Manuel Ortiz de Zarate
Mar 29 · 10 min read


Diffusion Models: From Noise to Masterpiece
Explore how Diffusion Models are revolutionizing generative AI, from their mathematical foundations to applications in image and audio.

Juan Manuel Ortiz de Zarate
Mar 20 · 8 min read


The Brains Behind AI’s Evolution
Discover how neural networks power modern AI, from deep learning to generative models, shaping the future of technology and innovation.

Juan Manuel Ortiz de Zarate
Mar 14 · 9 min read


Diffusion LLM: Closer to Human Thought
SEDD redefines generative AI with human-like reasoning, enabling faster, high-quality text and code through discrete diffusion models.

Juan Manuel Ortiz de Zarate
Mar 7 · 9 min read