We’ve been feeling a nice jolt of energy in the past month, as many of our authors switched gears from summer mode into fall, with a renewed focus on learning, experimenting, and launching new projects.
We’ve published far more excellent posts in September than we could ever highlight here, but we still wanted to make sure you don’t miss some of our recent standouts. Below are ten articles that resonated strongly with our community—whether through the sheer number of readers they attracted, the lively conversations they inspired, or the cutting-edge topics they covered. We’re sure you’ll enjoy exploring them.
- New ChatGPT Prompt Engineering Technique: Program Simulation
It’s fairly rare for an author’s TDS debut to become one of the most popular articles of the month, but Giuseppe Scalamogna’s article pulled off this feat thanks to an accessible and timely explainer on program simulation: a prompt-engineering technique that “aims to make ChatGPT operate in a way that simulates a program,” and can lead to impressive results.
- How to Program a Neural Network
Tutorials on neural networks are easy to find. Less common? A step-by-step guide that helps readers gain both an intuitive understanding of how they work, and the practical know-how for coding them from scratch. Callum Bruce delivered precisely that in his latest contribution.
- Don’t Start Your Data Science Journey Without These 5 Must-Do Steps — A Spotify Data Scientist’s Full Guide
If you’ve already discovered Khouloud El Alami’s writing, you won’t be surprised to learn that her most recent post offers actionable insights presented in an accessible and engaging way. This one is geared towards data scientists at the earliest stages of their career: if you’re not sure how to set yourself on the right path, Khouloud’s advice will help you find your bearings.
- How to Design a Roadmap for a Machine Learning Project
For those of you who are already well into your ML journey, Heather Couture’s new article offers a helpful framework for streamlining the design of your next project. From a robust literature review to post-deployment maintenance, it covers all the bases for a successful, iterative workflow.
- Machine Learning’s Public Perception Problem
In a thought-provoking reflection, Stephanie Kirmer tackles a fundamental tension in the current debates around AI: “all our work in the service of building more and more advanced machine learning is limited in its possibility not by the number of GPUs we can get our hands on but by our capacity to explain what we build and educate the public on what it means and how to use it.”
- How to Build an LLM from Scratch
Taking a cue from the development process of models like GPT-3 and Falcon, Shawhin Talebi reviews the key aspects of creating a foundation LLM. Even if you’re not planning to train the next Llama anytime soon, it’s valuable to understand the practical considerations that go into such a massive undertaking.
- Your Own Personal ChatGPT
If you are in the mood for building and tinkering with language models, however, a great place to start is Robert A. Gonsalves’s detailed overview of what it takes to fine-tune OpenAI’s GPT-3.5 Turbo model to perform new tasks using your own custom data.
- How to Build a Multi-GPU System for Deep Learning in 2023
Don’t roll up your sleeves just yet—one of our most-read tutorials in September, by Antonis Makropoulos, focuses on deep-learning hardware and infrastructure, and walks us through the nitty-gritty details of choosing the right components for your project’s needs.
- Meta-Heuristics Explained: Ant Colony Optimization
For a more theoretical—but no less fascinating—topic, Hennie de Harder’s introduction to ant-colony optimization draws our attention to a “lesser-known gem” of an algorithm, explores how it took inspiration from the ingenious foraging behaviors of ants, and unpacks its inner workings. (In a follow-up post, Hennie also demonstrates how it can solve real-world problems.)
- Falcon 180B: Can It Run on Your Computer?
Closing on an ambitious note, Benjamin Marie sets out to determine whether you can run the (very, very large) Falcon 180B model on consumer-grade hardware. (Spoiler alert: yes, with a couple of caveats.) It’s a valuable resource for anyone who’s weighing the pros and cons of working on a local machine vs. using cloud services—especially now that more and more open-source LLMs are arriving on the scene.