New best story on Hacker News: Go away Python

Go away Python
379 by baalimago | 345 comments on Hacker News.


New best story on Hacker News: Show HN: 22 GB of Hacker News in SQLite

Show HN: 22 GB of Hacker News in SQLite
372 by keepamovin | 122 comments on Hacker News.
Community, All the HN belong to you. This is an archive of Hacker News that fits in your browser. When I made HN Made of Primes I realized I could probably do this offline SQLite/WASM thing with the whole multi-GB archive. The whole dataset. So I tried it, and this is it. Have Hacker News on your device. Go to this repo ( https://ift.tt/EYJNF8U ): you can download it. BigQuery -> ETL -> npx serve docs - that's it. 20 years of HN arguments and beauty can be yours forever. So they'll never die. Ever. It's the unkillable static archive of HN and it's in your hands. That's my year-end gift to you all. Thank you for a wonderful year, and have a happy and wonderful 2026. Make something of it.

New best story on Hacker News: Tesla's 4680 battery supply chain collapses as partner writes down deal by 99%

Tesla's 4680 battery supply chain collapses as partner writes down deal by 99%
360 by coloneltcb | 391 comments on Hacker News.


New best story on Hacker News: List of domains censored by German ISPs

List of domains censored by German ISPs
317 by elcapitan | 133 comments on Hacker News.


New best story on Hacker News: As AI gobbles up chips, prices for devices may rise

As AI gobbles up chips, prices for devices may rise
307 by geox | 489 comments on Hacker News.


New best story on Hacker News: Google is dead. Where do we go now?

Google is dead. Where do we go now?
372 by tomjuggler | 354 comments on Hacker News.


New best story on Hacker News: Show HN: Z80-μLM, a 'Conversational AI' That Fits in 40KB

Show HN: Z80-μLM, a 'Conversational AI' That Fits in 40KB
402 by quesomaster9000 | 92 comments on Hacker News.
How small can a language model be while still doing something useful? I wanted to find out, and had some spare time over the holidays. Z80-μLM is a character-level language model with 2-bit quantized weights ({-2,-1,0,+1}) that runs on a Z80 with 64KB RAM. The entire thing - inference, weights, chat UI - fits in a 40KB .COM file that you can run in a CP/M emulator and hopefully even on real hardware! It won't write your emails, but it can be trained to play a stripped-down version of 20 Questions, and is sometimes able to maintain the illusion of simple but terse conversations with a distinct personality. -- The extreme constraints nerd-sniped me and forced interesting trade-offs: trigram hashing (typo-tolerant, but loses word order), 16-bit integer math, and some careful massaging of the training data so I could keep the examples 'interesting'. The key was quantization-aware training that accurately models the inference code's limitations. The training loop runs both float and integer-quantized forward passes in parallel, scoring the model on how well its knowledge survives quantization. The weights are progressively pushed toward the 2-bit grid using straight-through estimators, with overflow penalties matching the Z80's 16-bit accumulator limits. By the end of training, the model has already adapted to its constraints, so there's no post-hoc quantization collapse. Eventually I ended up spending a few dollars on the Claude API to generate 20 Questions data (see examples/guess/GUESS.COM); I hope Anthropic won't send me a C&D for distilling their model against the ToS ;P But anyway, happy code-golf season everybody :)
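The straight-through-estimator idea in that description can be sketched on a toy linear model: the forward pass always uses weights snapped to the 2-bit grid (matching what inference will see), while gradients update float "shadow" weights as if quantization were the identity. This is a minimal illustrative sketch, not the author's actual training code; all names and shapes here are assumptions.

```python
import numpy as np

LEVELS = np.array([-2.0, -1.0, 0.0, 1.0])  # the 2-bit weight grid from the post

def quantize(w: np.ndarray) -> np.ndarray:
    """Snap each float weight to the nearest value on the 2-bit grid."""
    idx = np.abs(w[..., None] - LEVELS).argmin(axis=-1)
    return LEVELS[idx]

def ste_step(w, x, target, lr=0.01):
    """One straight-through-estimator step on a toy model y = x @ quantize(w).

    The forward pass uses the quantized weights, but the gradient is applied
    to the float shadow weights as if quantization were the identity."""
    wq = quantize(w)
    y = x @ wq                 # quantized forward pass, as at inference time
    err = y - target
    grad = x.T @ err           # gradient w.r.t. wq, passed straight through to w
    return w - lr * grad, float((err ** 2).mean())

# Toy demo: recover a target weight vector that lies exactly on the grid.
rng = np.random.default_rng(0)
x = rng.normal(size=(16, 4))
true_w = np.array([1.0, -1.0, 0.0, -2.0])
target = x @ true_w

w = np.zeros(4)
for _ in range(50):
    w, loss = ste_step(w, x, target)
```

The real model presumably adds more on top - the post mentions an overflow penalty keeping accumulated sums within the Z80's 16-bit range, which in a sketch like this would be an extra loss term penalizing activations with |y| > 32767.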

New best story on Hacker News: Kidnapped by Deutsche Bahn

Kidnapped by Deutsche Bahn
438 by JeremyTheo | 477 comments on Hacker News.


New best story on Hacker News: You can make up HTML tags

You can make up HTML tags
366 by todsacerdoti | 133 comments on Hacker News.