
# SHA256

To ensure the integrity of the model files, please make sure the following SHA256 values match before using them.

## Original LLaMA (by Meta AI)

### consolidated.*.pth format (original)

The following are SHA256 values for the original LLaMA weight files released by Facebook.

| Model | Part | SHA256 (consolidated.*.pth) |
| :---: | :---: | --- |
| 7B | 00 | 700df0d3013b703a806d2ae7f1bfb8e59814e3d06ae78be0c66368a50059f33d |
| 13B | 00 | 745bf4e29a4dd6f411e72976d92b452da1b49168a4f41c951cfcc8051823cf08 |
| 13B | 01 | d5ccbcc465c71c0de439a5aeffebe8344c68a519bce70bc7f9f92654ee567085 |
| 33B | 00 | e23294a58552d8cdec5b7e8abb87993b97ea6eced4178ff2697c02472539d067 |
| 33B | 01 | 4e077b7136c7ae2302e954860cf64930458d3076fcde9443f4d0e939e95903ff |
| 33B | 02 | 24a87f01028cbd3a12de551dcedb712346c0b5cbdeff1454e0ddf2df9b675378 |
| 33B | 03 | 1adfcef71420886119544949767f6a56cb6339b4d5fcde755d80fe68b49de93b |

### pytorch_model.bin format (HuggingFace)

The following are SHA256 values for the original LLaMA weights after conversion to the HuggingFace (HF) format. If you use the models from the HuggingFace Model Hub, make sure to compare against these values.

| Model | SHA256 (pytorch_model-*.bin) |
| :---: | --- |
| 7B | 0087155d6df07106c1d910bfeb6aab1be8e612dfbf2b56ddfb4ccbde7dbd50d0 |
| | 461bc5e50200db7813ff99cc0b9316c48ccbd6aaaa31bf8cf7bee0b64bc3eda3 |
| 13B | dd20cdee2637408c6ab88c13f5c27d153bacd0e99f2d55f6a66fbd0269944436 |
| | 1aba886c5f28d2e2e9b04ab3c4f5bc250c6b06efc1baa3f557677b3097f70e6a |
| | 2efc56cddb7877c830bc5b402ee72996aedb5694c9e8007bf1d52d72c0d97d26 |
| 33B | 9c2a7223ab5f9cf3d46913d2b776e99cbd6ed93f69991594b92a8cef0c681a78 |
| | 4984274738e52195f4b1a5b35d719cf0fade6df1f645507d92d61af4dd8dcdfe |
| | 64c73932562810c5c33b15bfec5921d3ced0e8cdb3766c214eda2f45fa3edd13 |
| | c7d72d11770f5b58eb45c2dd8e19aae2cbd5a03463b564de3945b21825ebacba |
| | 174128542031f4ad7ceb6c799e8e5461ec1ca91a72a01402c567e5f6a8b33d8c |
| | 80e2cfa18994385fa88f03d500017346dfd6dc1e58e957d046af39d9a7e254fa |
| | 065611608159615ced8d38473ee693129a1a0d872ced0ad8daf09290af7c7061 |
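Since the HF-format checkpoints are split into several `pytorch_model-*.bin` shards, it may be convenient to hash all of them at once and compare the output against the table above. The following is a minimal sketch using only the Python standard library; the directory path is a hypothetical example, not a fixed location.

```python
import glob
import hashlib
import os


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA256 of a (possibly very large) file by streaming it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


# Hypothetical directory holding an HF-format LLaMA model downloaded locally.
model_dir = "path/to/llama-7b-hf"
for shard in sorted(glob.glob(os.path.join(model_dir, "pytorch_model-*.bin"))):
    print(os.path.basename(shard), sha256_of(shard))
```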

## Our LLaMA/Alpaca Models

### tokenizer.model

The following are SHA256 values for tokenizer.model files. Note that the tokenizer.model for LLaMA and the one for Alpaca differ, while different sizes or versions of the same model type share the same tokenizer.model. For example, LLaMA-7B, LLaMA-13B, and LLaMA-Plus-7B use the same tokenizer.model.

| Model Type | SHA256 |
| :---: | --- |
| LLaMA (7B, 13B, 33B) | e2676d4ca29ca1750f6ff203328d73b189321dc5776ceede037cbd36541d70c0 |
| Alpaca (7B, 13B, 33B) | 2d967e855b1213a439df6c8ce2791f869c84b4f3b6cfacf22b86440b8192a2f8 |
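Because the expected tokenizer.model hash depends only on the model type (LLaMA vs. Alpaca), this check is easy to automate. The sketch below hard-codes the two values from the table above; the file path and the `check_tokenizer` helper are illustrative assumptions, not part of the official tooling.

```python
import hashlib

# Expected values copied from the table above.
EXPECTED = {
    "llama": "e2676d4ca29ca1750f6ff203328d73b189321dc5776ceede037cbd36541d70c0",
    "alpaca": "2d967e855b1213a439df6c8ce2791f869c84b4f3b6cfacf22b86440b8192a2f8",
}


def check_tokenizer(path: str, model_type: str) -> bool:
    """Return True if the tokenizer.model at `path` matches the published hash."""
    with open(path, "rb") as f:  # tokenizer.model is small, so reading it whole is fine
        digest = hashlib.sha256(f.read()).hexdigest()
    return digest == EXPECTED[model_type]


# Hypothetical path to a downloaded Alpaca-style model directory.
print(check_tokenizer("path/to/chinese-alpaca-7b/tokenizer.model", "alpaca"))
```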

### LoRA weight file: adapter_model.bin

The following are SHA256 values for the adapter_model.bin files, which contain the main LoRA weights.

| LoRA Model (adapter_model.bin) | SHA256 |
| :---: | --- |
| Chinese-LLaMA-7B | 2a2c24d096f5d509f24946fdbd8c25e1ce4a0acb955902f7436d74c0c0379d86 |
| Chinese-LLaMA-Plus-7B | 8c928db86b2a0cf73f019832f921eb7e1e069ca21441b4bfa12c4381c6cc46be |
| Chinese-LLaMA-13B | 6a4ce789d219bde122f8d9a20371937f2aa2ee86a2311d9f5e303df2e774f9fc |
| Chinese-LLaMA-Plus-13B | 784fcff9c4bdf4e77d442a01158e121caf8fcce0f97ffb32396fe7a3617ee7e8 |
| Chinese-LLaMA-33B | 93a449bafb71ff1bb74a4a21e64e102e5078e5c3898eb40d013790072a0fa3de |
| Chinese-LLaMA-Plus-33B | 16f2544f4b5be9840dbb1a8071a9bc42627ed4232be3b0b600b43f7b4b5f08a7 |
| Chinese-Alpaca-7B | 0d9b6ed8e4a7d1ae590a16c89a452a488d66ff07e45487972f61c2b6e46e36de |
| Chinese-Alpaca-Plus-7B | 4ee0bf805c312a9a771624d481fbdb4485e1b0a70cd2a8da9f96137f177b795d |
| Chinese-Alpaca-Pro-7B | 3cd2776908c3f5efe68bf6cf0248cb0e80fb7c55a52b8406325c9f0ca37b8594 |
| Chinese-Alpaca-13B | cb8dda3c005f3343a0740dcd7237fbb600cb14b6bff9b6f3d488c086a2f08ada |
| Chinese-Alpaca-Plus-13B | a1fcdcb6d7e1068f925fb36ec78632c76058ba12ba352bed4d44060b8e6f4706 |
| Chinese-Alpaca-Pro-13B | f076b20fc2390ddbc35fd56d580d46ea834b33bbae34a4bb3cb7b571e60602e0 |
| Chinese-Alpaca-33B | 6b39da4c682e715a9de30b247b7e9b812d2d54f7d320ec9b452000a5cd4d178d |
| Chinese-Alpaca-Plus-33B | 411f5b9351abcc33c13a82bdd97ddcff81ad7993a8ddb83085b7ea97fad92fc7 |
| Chinese-Alpaca-Pro-33B | 0e7ba4951f605d2c0a7f0bcb983d7f6ed075c8dd23fbbcbc8a8c9643247212a3 |

### Merged files (consolidated.*.pth)

The following are SHA256 values for the full model weights (PyTorch format, consolidated.*.pth) obtained after merging the LoRA weights. Note that the PyTorch version does not affect the actual weight data, but it does change the file's meta information and therefore the SHA256. It is recommended to merge the models with PyTorch >= 1.13.0 so that the values below can be used as a reference.

⚠️ Before merging, please first make sure that the SHA256 of the base model and of the LoRA weights match the values listed in the tables above.

| Model | SHA256 (PyTorch >= 1.13.0) |
| :---: | --- |
| Chinese-LLaMA-7B | 245427a306e3253db3f534e2a1d7548a8eb781ae8761f9e98979b4aced6b43d8 |
| Chinese-LLaMA-Plus-7B | f8d380d63f77a08b7f447f5ec63f0bb1cde9ddeae2207e9f86e6b5f0f95a7955 |
| Chinese-LLaMA-13B | aa7f4599487ea2b0d0aca2b522c39370897f9afd9839aac7d02155957f1f019f |
| | 3954f3e7f7264994f23800a04423e6563cc1959ac699d9eaaa6801b4f9392ebd |
| Chinese-LLaMA-Plus-13B | 4de7d188003c778f216342de2dc5c9a9c74278c701c63a7b6bcd7957f5ebfdf5 |
| | ff8046f9eb8b05dd86597c21edd07894aec00b31842a4c11996a4003091ea7c9 |
| Chinese-LLaMA-33B | 054e9b7dffa3b92a053ca32acac6e22b27c184ed2b8563f8e44e6570ba416357 |
| | a0fe86c45a0819f45a509776d82778b7de75fbff8d37afa97159b24de5448b7b |
| | 13df5f74dc7bc1204076b1febef818fb3cec978de27bf8fc85c70e7d62282df9 |
| | f4f28106c343c5804613faa9852f29fbc60764366bcb0d37ef2811a17be2d336 |
| Chinese-LLaMA-Plus-33B | 0da2fa92c054aa4010ffe1746afbc5bdd89c07b8e418d74debcbf0de74c00692 |
| | 35e25e7fe47f978d52da6cdae3272b3b4ea42a37ea8ece84a4d758350d188397 |
| | c20f80a7da24b4be72a7f617311e3e2e9346a476d85b3f1a28095497bb1857b0 |
| | ae5276fe06fd9d039ed295d627c0048d5d96c2390de413f610b093c151370d3c |
| Chinese-Alpaca-7B | fbfccc91183169842aac8d093379f0a449b5a26c5ee7a298baf0d556f1499b90 |
| Chinese-Alpaca-Plus-7B | 8b8f6551d0d83f93e378622b9f8dad0bec189da6c29d8a78de493e6aee9bd35f |
| Chinese-Alpaca-Pro-7B | TBA |
| Chinese-Alpaca-13B | 30cefb5be9091c3e17fbba5d91bf16266a2ddf86cde53370a9982b232ff8a2f4 |
| | ce946742b0f122f472e192c3f77d506e0c26578b4b881d07d919553333affecd |
| Chinese-Alpaca-Plus-13B | 1834558214c1dddc0d8b2826ece086908b9d2293241d0e12cecb48a035ec561b |
| | bf70001600ce166f6ca4ef59df5510f0582cdc119fb74e27d9cf3e4c7b142015 |
| Chinese-Alpaca-Pro-13B | TBA |
| Chinese-Alpaca-33B | 72bfe67481c0df1b8c3b536acd15ac42c1163b0727b1beb6409ee31d14cb2490 |
| | fd2151ea714a6e0706a60cca5ab7abf8558e665d4cb001481c6df616c0821c16 |
| | 4a7e3de6881769f9c2413f0867e67da20efdf4502602ab90483cb99c593e51ed |
| | 99c81a7a310802dcc579fe96288fbc18d4486f92020eaf925e1c33db8311378a |
| Chinese-Alpaca-Plus-33B | TBA |
| Chinese-Alpaca-Pro-33B | TBA |
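Because the reference hashes above assume the merge was performed under PyTorch >= 1.13.0, a quick sanity check of a merged output directory might look like the sketch below. It only prints the PyTorch version of the current environment (what matters is the version used at merge time) and the SHA256 of each merged part; the directory path is a hypothetical example.

```python
import glob
import hashlib
import os

import torch

# The reference hashes in the table above were produced with PyTorch >= 1.13.0;
# merging under an older version changes the saved file's meta information and
# therefore the SHA256, even though the weight values themselves are unchanged.
print("PyTorch version in this environment:", torch.__version__)


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so multi-gigabyte checkpoints fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


# Hypothetical directory holding the merged consolidated.*.pth parts.
merged_dir = "path/to/chinese-alpaca-13b-merged"
for part in sorted(glob.glob(os.path.join(merged_dir, "consolidated.*.pth"))):
    print(os.path.basename(part), sha256_of(part))
```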

## How To Check SHA256

In macOS,

> shasum -a 256 your-model-file

In Linux,

> sha256sum your-model-file

In Windows,

> certutil -hashfile your-model-file sha256
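
If none of the commands above is available, the same check can be done with Python's standard library. The sketch below (the script name sha256_check.py is only an example) streams the file so that large checkpoints do not need to fit in memory, and optionally compares the result against an expected value taken from the tables above.

```python
import hashlib
import sys


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA256 of a file by reading it in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


if __name__ == "__main__":
    # Usage: python sha256_check.py your-model-file [expected-sha256]
    digest = sha256_of(sys.argv[1])
    print(digest)
    if len(sys.argv) > 2:
        print("MATCH" if digest == sys.argv[2].lower() else "MISMATCH")
```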