GPT-2 (nonfiction)

GPT-2 (Generative Pretrained Transformer 2) is a language model that was trained on 40GB of text scraped from websites that Reddit users had linked to in submissions with a karma score of at least 3.
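The karma threshold acts as a crude quality filter: a page only enters the training corpus if the Reddit submission linking to it attracted a few upvotes. A minimal, hypothetical sketch of that filtering step, assuming the third-party praw Reddit client and placeholder credentials (OpenAI's actual scraping pipeline is not public):

<pre>
# Hypothetical sketch of the WebText-style link collection described
# above, using the third-party "praw" Reddit client. The credentials
# are placeholders; OpenAI's real pipeline is not public.
import praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="webtext-sketch/0.1",
)

outbound_urls = set()
for submission in reddit.subreddit("all").hot(limit=1000):
    # Keep outbound links only (skip self-posts), and apply the
    # karma-score quality filter.
    if not submission.is_self and submission.score >= 3:
        outbound_urls.add(submission.url)

print(f"Collected {len(outbound_urls)} candidate URLs")
</pre>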
The developers at OpenAI describe GPT-2 as "a large-scale unsupervised language model which generates coherent paragraphs of text, achieves state-of-the-art performance on many language modeling benchmarks, and performs rudimentary reading comprehension, machine translation, question answering, and summarization—all without task-specific training."

Because the model generates each token by sampling from a probability distribution over possible next tokens, it typically returns a different response every time you enter the same input.
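A minimal sketch of this behaviour, assuming the Hugging Face transformers port of GPT-2 (the article does not name any particular implementation). Running it prints two continuations of the same prompt, and they will usually differ:

<pre>
# Sampling demo: generate two continuations of the same prompt.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

input_ids = tokenizer.encode(
    "In a shocking finding, scientists discovered", return_tensors="pt"
)

for run in range(2):
    # do_sample=True draws each next token from the model's probability
    # distribution rather than always taking the single most likely
    # token, so repeated runs diverge.
    output = model.generate(
        input_ids,
        max_length=40,
        do_sample=True,
        top_k=40,
        pad_token_id=tokenizer.eos_token_id,
    )
    print(f"Run {run + 1}:", tokenizer.decode(output[0], skip_special_tokens=True))
</pre>

Setting do_sample=False instead decodes greedily, which makes the output deterministic for a given prompt.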


== In the News ==


<gallery>
</gallery>
== Fiction cross-reference ==
* [[Forbidden Ratio]]
* [[Gnomon algorithm]]
* [[Gnomon Chronicles]]
== Nonfiction cross-reference ==
* [[Computation (nonfiction)]]
* [[Machine learning (nonfiction)]]
== External links ==
* [https://www.youtube.com/watch?v=89A4jGvaaKk Unicorn AI - Computerphile] @ YouTube
* [https://boingboing.net/2019/07/05/computerphile-explains-the-fas.html Computerphile explains the fascinating AI storyteller, GPT-2] @ Boing Boing
* [https://blog.floydhub.com/gpt2/ How to Build OpenAI's GPT-2: "The AI That's Too Dangerous to Release"]
[[Category:Nonfiction (nonfiction)]]
[[Category:Computing (nonfiction)]]
