BERT Usage Guide (Part 3)


python run_pretraining.py
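In practice, `run_pretraining.py` from the google-research/bert repository is configured through command-line flags rather than run bare. A typical invocation might look like the sketch below; the file paths are placeholders for your own environment, and the batch size, learning rate, and warmup steps are illustrative defaults (the 20000 training steps match the global step shown in the log that follows):

```shell
python run_pretraining.py \
  --input_file=./tf_examples.tfrecord \
  --output_dir=./pretraining_output \
  --do_train=True \
  --do_eval=True \
  --bert_config_file=./bert_config.json \
  --init_checkpoint=./bert_model.ckpt \
  --train_batch_size=32 \
  --max_seq_length=128 \
  --max_predictions_per_seq=20 \
  --num_train_steps=20000 \
  --num_warmup_steps=10 \
  --learning_rate=2e-5
```

`--max_seq_length` and `--max_predictions_per_seq` must match the values used when the TFRecord input file was generated by `create_pretraining_data.py`.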
3.3 The run produces the following output
INFO:tensorflow:Done running local_init_op.
INFO:tensorflow:Evaluation [10/100]
INFO:tensorflow:Evaluation [20/100]
INFO:tensorflow:Evaluation [30/100]
INFO:tensorflow:Evaluation [40/100]
INFO:tensorflow:Evaluation [50/100]
INFO:tensorflow:Evaluation [60/100]
INFO:tensorflow:Evaluation [70/100]
INFO:tensorflow:Evaluation [80/100]
INFO:tensorflow:Evaluation [90/100]
INFO:tensorflow:Evaluation [100/100]
INFO:tensorflow:Finished evaluation at 2020-10-21-11:44:22
INFO:tensorflow:Saving dict for global step 20000: global_step = 20000, loss = 4.2188535e-06, masked_lm_accuracy = 1.0, masked_lm_loss = 4.20574e-06, next_sentence_accuracy = 1.0, next_sentence_loss = 1.31130085e-08
INFO:tensorflow:Saving 'checkpoint_path' summary for global step 20000: /mnt/home/tongji/car-corpus-lmodels/Models/Bert_word2vec/pretraining_output/model.ckpt-20000
INFO:tensorflow:evaluation_loop marked as finished
INFO:tensorflow:***** Eval results *****
INFO:tensorflow:global_step = 20000
INFO:tensorflow:loss = 4.2188535e-06
INFO:tensorflow:masked_lm_accuracy = 1.0
INFO:tensorflow:masked_lm_loss = 4.20574e-06
INFO:tensorflow:next_sentence_accuracy = 1.0
INFO:tensorflow:next_sentence_loss = 1.31130085e-08
INFO:tensorflow:*** all time ***
INFO:tensorflow:Wed Oct 21 19:44:23 2020
INFO:tensorflow:Seconds: 6214.46689581871
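The final log line reports the total wall-clock time in seconds. A quick sketch for converting that figure into a more readable hours/minutes/seconds form (the value is copied from the log above):

```python
# Convert the logged wall-clock time (in seconds) into h:m:s
# to make the GPU-vs-CPU comparison easier to read.
elapsed = 6214.46689581871  # total seconds reported at the end of the run

hours, rem = divmod(elapsed, 3600)
minutes, seconds = divmod(rem, 60)

print(f"{int(hours)}h {int(minutes)}m {seconds:.1f}s")  # → 1h 43m 34.5s
```

So the GPU run finished in roughly an hour and three quarters.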
Compared with running on a CPU, the training time is much shorter.
4 Summary
This article is a beginner's walkthrough for running the project code: the input corpus is pre-trained with BERT so that the downstream recognition task achieves higher accuracy. As the results show, training on a GPU takes far less time than training on a CPU.