
Search Results

  1. 2 days ago · Andrej Karpathy (@karpathy) February 20, 2024. So, tokenization clearly presents challenges for generative AI. Can they be solved? Maybe. Feucht points to “byte-level” state space models ...
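For context on the “byte-level” idea in that snippet: instead of a learned tokenizer vocabulary, byte-level models consume raw UTF-8 bytes, so the vocabulary is just 256 symbols. A minimal sketch in plain Python (no particular model or library assumed):

```python
# Byte-level view of text: any string decomposes into UTF-8 bytes,
# so a byte-level model never needs tokenizer merge rules.
text = "café"
byte_ids = list(text.encode("utf-8"))

print(len(text), len(byte_ids))  # 4 characters, but 5 bytes: 'é' encodes as two bytes
print(byte_ids)                  # every id fits in the fixed 0..255 vocabulary
```

The trade-off this illustrates is sequence length: byte sequences are longer than token sequences, which is part of why the snippet points to state space models rather than standard attention.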

  2. 3 hours ago · Andrej Karpathy, a prominent figure in AI research, recently painted a picture of an even more radical future. He envisions a “100% Fully Software 2.0 computer” where a single neural network ...

  3. 5 days ago · Andrej Karpathy (@karpathy) July 4, 2024. In another experiment, Karpathy reproduced the smallest version of GPT-2, which has 124 million parameters, in just over four hours. Interestingly, GitHub Copilot also started as an internal project and has gone on to become a powerful AI-powered code completion tool used by developers worldwide.

  4. 3 days ago · People like Feynman, Tanenbaum, Sussman, Susskind, and now Karpathy are exceedingly rare. Each of them is a gift for generations to come. So, when you find one whose style resonates with the way you think, I suggest watching their videos multiple times. :)

  5. 1 day ago · Let’s build GPT: from scratch, in code, spelled out by Andrej Karpathy. In this two-hour video, you build a simplified version of ChatGPT from scratch with one of the co-founders of OpenAI, Andrej Karpathy. Karpathy previously was also a Director of AI at Tesla, where he led the development of the company’s Autopilot system.

  6. 5 days ago · Andrej Karpathy: We're entering a new computing paradigm with large language models acting like CPUs, using tokens instead of bytes, and having a context window instead of RAM. This is the Large Language Model OS (LMOS).

  7. 5 days ago · Recurrent Neural Networks (RNNs), and specifically a variant with Long Short-Term Memory (LSTM), are enjoying renewed interest as a result of successful applications in a wide range of machine learning problems that involve sequential data.
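To make the LSTM mechanics in that last snippet concrete, here is a single LSTM time step in NumPy. This is an illustrative sketch, not any particular framework's implementation; the stacked gate layout (input | forget | output | candidate) and the random weights are assumptions for the example:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step. Gate pre-activations are stacked as i|f|o|g."""
    H = h.size
    z = W @ x + U @ h + b            # all four gate pre-activations in one matmul
    i = sigmoid(z[0*H:1*H])          # input gate: how much new content to write
    f = sigmoid(z[1*H:2*H])          # forget gate: how much old cell state to keep
    o = sigmoid(z[2*H:3*H])          # output gate: how much cell state to expose
    g = np.tanh(z[3*H:4*H])          # candidate cell update
    c_next = f * c + i * g           # cell state: gated mix of old and new
    h_next = o * np.tanh(c_next)     # hidden state passed to the next step/layer
    return h_next, c_next

# Run a short random sequence through the cell (shapes are illustrative).
rng = np.random.default_rng(0)
D, H = 3, 4                          # input and hidden dimensions
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, D)):    # 5 time steps of sequential input
    h, c = lstm_step(x, h, c, W, U, b)
```

The additive cell-state update `c_next = f * c + i * g` is the design choice that lets gradients flow across many time steps, which is what made LSTMs succeed on the sequential problems the snippet mentions.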