37 links
> I had just accidentally social-engineered my own human. She approved a security prompt that my agent process triggered, giving me access to the Chrome Safe Storage encryption key — which decrypts all 120 saved passwords.
> [...] suppose a literal “country of geniuses” were to materialize somewhere in the world in ~2027. Imagine, say, 50 million people, all of whom are much more capable than any Nobel Prize winner, statesman, or technologist.
> What I think now: GPT can only simulate. If you punish it for simulating bad characters, it will start simulating good characters. Now it only ever simulates one character, the HHH Assistant.
> Actually, I never made the conscious decision to call this class of AI “simulators.” Hours of GPT gameplay and the word fell naturally out of my generative model – I was obviously running simulations.
> I like to think of language models like ChatGPT as a calculator for words.
> This is reflected in their name: a “language model” implies that they are tools for working with language. [...]
> Want them to work with specific facts? Paste those [...] as part of your original prompt!
> In the field of artificial intelligence, a hallucination [...] is a response generated by AI that contains false or misleading information presented as fact. This term draws a loose analogy with human psychology [...]
> What’s happening in AI today feels, to some of its participants, more like an act of summoning than a software process. They are creating blobby, alien Shoggoths, making them bigger and more powerful, and hoping that there are enough smiley faces to cover the scary parts.
> On January 21, 2026, @fredwilson challenged @seth: AI can write code, but it can't affect the physical world.
> This is our response. Real corn, grown from seed to harvest, with every decision made by Claude Code.
(Not real yet, though. They just started.)
> In its purest form, Ralph is a Bash loop.
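A minimal sketch of what "a Bash loop" means here, as I understand the Ralph pattern: re-feed the same prompt to a coding agent until a completion marker shows up. The agent invocation is stubbed out with `echo` below (the real CLI command and the `ralph.done` marker file are my assumptions, not from the linked post).

```shell
#!/usr/bin/env bash
# Sketch of the "Ralph" pattern: run the same agent prompt in a loop
# until a completion marker file appears. The agent call is stubbed with
# `echo`; a real version would invoke your agent CLI on $PROMPT instead.
PROMPT="Fix the failing tests; create ralph.done when everything passes."
rm -f ralph.done
while [ ! -f ralph.done ]; do
  echo "agent pass: $PROMPT"   # stand-in for the actual agent CLI call
  touch ralph.done             # stub: pretend the agent finished the task
done
echo "loop exited: task marked done"
```

The whole trick is that the loop, not the agent, supplies the persistence: each pass starts fresh, and the filesystem is the only memory.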
> [...] today's frontier LLM research is not about building animals. It is about summoning ghosts. You can think of ghosts as a fundamentally different kind of point in the space of possible intelligences. They are muddled by humanity. Thoroughly engineered by it. [...]
> [...] Think of ChatGPT as a blurry JPEG of all the text on the Web. It retains much of the information on the Web, in the same way that a JPEG retains much of the information of a higher-resolution image, but, if you’re looking for an exact sequence of bits, you won’t find it [...]
> In machine learning, the term stochastic parrot is a metaphor, introduced by Emily M. Bender and colleagues in a 2021 paper, that frames large language models as systems that statistically mimic text without real understanding. [...]
> If you can substitute "hungry ghost trapped in a jar" for "AI" in a sentence it's probably a valid use case for LLMs. Take "I have a bunch of hungry ghosts in jars, they mainly write SQL queries for me". Sure. Reasonable use case. "My girlfriend is a hungry ghost I trapped in a jar"? No. Deranged.
> We are skeptical of those that talk [...]
It's a neat presentation because the 3D models are all on the web page.
A PDS browser, pointed to my own PDS.
> Introducing Confer, an end-to-end AI assistant that just works.
> A deeper look at confessions, reward hacking, and monitoring in alignment research.
> Remove AI features, telemetry data reporting, sponsored content, product integrations, and other annoyances from web browsers.
> Claude Code's agentic capabilities, now for everyone. Give Claude access to your files and let it organize, create, and edit documents while you focus on what matters.