I want to share one small piece of "LLM from the ground up" each day, keeping every article under a three-minute read, so there's no pressure, but you still grow a little every day.
!pip install transformers xformers shap
import transformers
import shap

# Load a DistilBERT sentiment classifier fine-tuned on SST-2.
model = transformers.pipeline(
    'sentiment-analysis',
    model='distilbert-base-uncased-finetuned-sst-2-english',
)

sentence = 'SHAP is a useful explainer'
result = model(sentence)[0]
print(result)

# Wrap the pipeline in a SHAP explainer and explain the sentence.
explainer = shap.Explainer(model)
shap_values = explainer([sentence])

# Visualize each token's contribution toward the predicted class.
predicted_class = result['label']
shap.plots.text(shap_values[0, :, predicted_class])
The result looks like this:
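Behind that plot, SHAP assigns each token a Shapley-style attribution: its average marginal contribution to the prediction over all orders in which features could be added. To make that idea concrete, here is a minimal sketch (not the library's actual algorithm, which approximates this) computing exact Shapley values for a toy two-feature "model" with hypothetical payoff numbers:

```python
from itertools import permutations

# Toy "model": the value of each feature coalition. The numbers are
# hypothetical, chosen only to illustrate the computation; a real
# model's outputs on masked inputs would stand in here.
payoff = {
    frozenset(): 0.0,
    frozenset({'A'}): 10.0,
    frozenset({'B'}): 20.0,
    frozenset({'A', 'B'}): 50.0,
}

def shapley_values(features, payoff):
    """Exact Shapley values: average each feature's marginal
    contribution over all orderings in which features are added."""
    phi = {f: 0.0 for f in features}
    orders = list(permutations(features))
    for order in orders:
        seen = frozenset()
        for f in order:
            phi[f] += payoff[seen | {f}] - payoff[seen]
            seen = seen | {f}
    return {f: total / len(orders) for f, total in phi.items()}

phi = shapley_values(['A', 'B'], payoff)
print(phi)  # {'A': 20.0, 'B': 30.0}

# Additivity: the attributions sum to f(all features) - f(no features),
# which is why SHAP text plots "account for" the full prediction.
assert sum(phi.values()) == payoff[frozenset({'A', 'B'})] - payoff[frozenset()]
```

The library does the same accounting over tokens, but estimates the average rather than enumerating every ordering, since the number of orderings grows factorially with the number of tokens.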