A GPT-3 tokenizer (encoding/decoding) implemented in Go

go-gpt-3-encoder

Go BPE tokenizer (Encoder+Decoder) for GPT2 and GPT3.

About

GPT2 and GPT3 use byte pair encoding to turn text into a series of integers to feed into the model. This is a Go implementation of OpenAI's original Python encoder/decoder.

This code was inspired by a JavaScript implementation and was partially generated by OpenAI itself!
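
As a rough illustration of the byte pair encoding described above: common words often map to a single token, while rarer words are split into several subword pieces, each with its own integer ID. The sketch below uses the NewEncoder and Encode calls shown in the Usage section; the exact splits depend on the vocab.bpe merge rules, and the chosen words are only examples.

encoder, err := tokenizer.NewEncoder()
if err != nil {
    log.Fatal(err)
}
for _, word := range []string{"the", "encyclopedia"} {
    // Rarer words typically yield more than one token ID.
    ids, _ := encoder.Encode(word)
    fmt.Printf("%q -> %d token(s): %v\n", word, len(ids), ids)
}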

Install

go get github.com/samber/go-gpt-3-encoder

Usage

package main

import (
    "fmt"
    "log"

    tokenizer "github.com/samber/go-gpt-3-encoder"
)

func main() {
    encoder, err := tokenizer.NewEncoder()
    if err != nil {
        log.Fatal(err)
    }

    str := "This is an example sentence to try encoding out on!"

    encoded, err := encoder.Encode(str)
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println("We can look at each token and what it represents:")
    for _, token := range encoded {
        fmt.Printf("%d -- %s\n", token, encoder.Decode([]int{token}))
    }

    decoded := encoder.Decode(encoded)
    fmt.Printf("We can decode it back into: %s\n", decoded)
}

Contribute

Some corner cases are not covered by this library. See the @TODO comments in the tests.