> LLMs are kind of like sails: left free-flowing they're completely useless, but tightly bound and directed they can dramatically accelerate your progress
> [...] suppose a literal “country of geniuses” were to materialize somewhere in the world in ~2027. Imagine, say, 50 million people, all of whom are much more capable than any Nobel Prize winner, statesman, or technologist.
> When you're using [a coding agent] to clean up your codebase and improve code health, it's sort of like using a pressure washer. You can use it to clean your steps but you wouldn't use it to clean a painting.
> What I think now: GPT can only simulate. If you punish it for simulating bad characters, it will start simulating good characters. Now it only ever simulates one character, the HHH Assistant.
> Actually, I never made the conscious decision to call this class of AI “simulators.” Hours of GPT gameplay and the word fell naturally out of my generative model – I was obviously running simulations.
> I like to think of language models like ChatGPT as a calculator for words.
> This is reflected in their name: a “language model” implies that they are tools for working with language. [...]
> Want them to work with specific facts? Paste those [...] as part of your original prompt!
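That "paste the facts into your prompt" advice is easy to try from a shell. A minimal sketch, assuming Simon Willison's `llm` CLI is installed and configured with a model; the file name `meeting-notes.txt` is hypothetical:

```bash
# Pipe the specific facts in as the prompt; -s supplies the
# instruction as a system prompt, so the model works from your
# text rather than from whatever it half-remembers from training.
cat meeting-notes.txt | llm -s "List every decision recorded in these notes"
```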
> In the field of artificial intelligence, a hallucination [...] is a response generated by AI that contains false or misleading information presented as fact. This term draws a loose analogy with human psychology [...]
> Searle imagines himself alone in a room following a computer program for responding to Chinese characters slipped under the door. Searle understands nothing of Chinese, and yet, by following the program, [...] this leads those outside to mistakenly suppose there is a Chinese speaker in the room.
> What’s happening in AI today feels, to some of its participants, more like an act of summoning than a software process. They are creating blobby, alien Shoggoths, making them bigger and more powerful, and hoping that there are enough smiley faces to cover the scary parts.
> Gas Town is just Gas Town. It started with Mad Max theming, but none of it is super strong. None of the roles are proper names from the series, and I’m bringing in theming from other sources as well [...]
> In its purest form, Ralph is a Bash loop.
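Read literally, the loop really is that small. A sketch under assumptions: a non-interactive coding-agent CLI that accepts a prompt on stdin (Claude Code's `claude -p` print mode is one such), and a `PROMPT.md` file describing the task; for unattended runs you would also need to pre-approve the agent's tool permissions.

```bash
# Ralph, in its purest form: feed the same prompt to a coding
# agent forever, and let each pass pick up where the last stopped.
while :; do
  cat PROMPT.md | claude -p
done
```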
> [...] today's frontier LLM research is not about building animals. It is about summoning ghosts. You can think of ghosts as a fundamentally different kind of point in the space of possible intelligences. They are muddled by humanity. Thoroughly engineered by it. [...]
> [...] Think of ChatGPT as a blurry JPEG of all the text on the Web. It retains much of the information on the Web, in the same way that a JPEG retains much of the information of a higher-resolution image, but, if you’re looking for an exact sequence of bits, you won’t find it [...]
> In machine learning, the term stochastic parrot is a metaphor, introduced by Emily M. Bender and colleagues in a 2021 paper, that frames large language models as systems that statistically mimic text without real understanding. [...]
> If you can substitute "hungry ghost trapped in a jar" for "AI" in a sentence it's probably a valid use case for LLMs. Take "I have a bunch of hungry ghosts in jars, they mainly write SQL queries for me". Sure. Reasonable use case. "My girlfriend is a hungry ghost I trapped in a jar"? No. Deranged.
> We are skeptical of those that talk [...]
> [...] Now I have a full-time driver. I’m in the back seat, giving directions when asked, or perhaps browsing the Internet if the driver doesn’t need any help. I do look in now and then to make sure we’re going in the right direction. [...]