NLP: Contextualized word embeddings from BERT

Extract contextualized word embeddings from BERT using Keras and TF

Word Embeddings
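Classic word embeddings are static: each word maps to one fixed vector, no matter what sentence it appears in. A minimal sketch with a made-up toy vocabulary and hand-picked vectors (real embeddings such as word2vec or GloVe are learned and hundreds of dimensions wide) makes the limitation concrete:

```python
# Toy static embedding table: one fixed vector per word (values are made up).
emb = {
    "the":   [0.1, 0.3],
    "bank":  [0.7, -0.2],
    "river": [0.5, 0.9],
    "money": [-0.4, 0.6],
}

def embed(sentence):
    """Look up one fixed vector per token; the surrounding context is ignored."""
    return [emb[w] for w in sentence.split()]

# "bank" gets the identical vector in both sentences, even though it means
# a riverbank in one and a financial institution in the other.
assert embed("the river bank")[-1] == embed("the money bank")[-1]
```

Contextualized embeddings, the subject of this post, remove exactly this restriction: the vector for "bank" depends on the sentence it sits in.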

Transformers
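The core operation of the Transformer is scaled dot-product attention: Attention(Q, K, V) = softmax(QKᵀ/√d_k)V, which mixes each token's value vector with those of every other token, weighted by query-key similarity. A minimal NumPy sketch (random inputs, single head, no masking) of that formula:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V  (Vaswani et al., 2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V                                # context-weighted mix of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))   # 3 tokens, d_k = 8
K = rng.normal(size=(3, 8))
V = rng.normal(size=(3, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one context-mixed vector per token
```

Multi-head attention runs several of these in parallel on learned projections of Q, K, and V; BERT stacks many such layers.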

Source: http://mlexplained.com/2017/12/29/attention-is-all-you-need-explained/

Bidirectional Encoder Representations from Transformers (BERT)

Source: https://arxiv.org/pdf/1810.04805.pdf
Source: http://jalammar.github.io/illustrated-bert/
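Before BERT can produce embeddings, each sentence is tokenized into WordPiece units, wrapped with the special [CLS] and [SEP] tokens, mapped to vocabulary ids, and padded to a fixed length alongside an attention mask. A minimal sketch of that input packing, using a made-up toy vocabulary (real BERT ships a ~30k-entry WordPiece vocab; only [PAD]=0, [CLS]=101, [SEP]=102 match the released ids):

```python
# Toy stand-in for BERT's WordPiece vocabulary; word ids here are invented.
vocab = {"[PAD]": 0, "[CLS]": 101, "[SEP]": 102,
         "this": 1, "is": 2, "a": 3, "test": 4}

def pack(tokens, max_len=8):
    """Wrap tokens with [CLS]/[SEP], map to ids, pad, and build the attention mask."""
    toks = ["[CLS]"] + tokens + ["[SEP]"]
    ids = [vocab[t] for t in toks]
    mask = [1] * len(ids)           # 1 = real token, 0 = padding
    pad = max_len - len(ids)
    return ids + [0] * pad, mask + [0] * pad

input_ids, input_mask = pack(["this", "is", "a", "test"])
print(input_ids)   # [101, 1, 2, 3, 4, 102, 0, 0]
print(input_mask)  # [1, 1, 1, 1, 1, 1, 0, 0]
```

The `tokenization` module cloned below performs the real WordPiece splitting and id lookup against the released vocabulary file.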

BERT Word Embedding Extraction

!rm -rf bert
!git clone https://github.com/google-research/bert
import sys
sys.path.append('bert/')

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import codecs
import collections
import json
import re
import os
import pprint
import numpy as np
import tensorflow as tf
import modeling
import tokenization
embeddings = get_features(["This is a test sentence"], dim=50)
print(embeddings)
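Once you have contextual vectors, the same surface word can be compared across sentences: unlike the static case, "bank" near "river" and "bank" near "money" should yield different vectors, and cosine similarity quantifies how different. A small self-contained sketch (the two three-dimensional vectors are invented for illustration, not actual BERT outputs):

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Hypothetical contextual vectors for "bank" in two different sentences.
bank_river = [0.7, 0.1, -0.3]
bank_money = [0.2, 0.9, 0.4]
print(cosine(bank_river, bank_money))  # well below 1.0: the contexts differ
```

With real BERT outputs you would index into `embeddings` for the token of interest in each sentence and compare those vectors the same way.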

Future Work

Conclusion

References:

PhD Candidate @ UoG ● Combining Cyber Security with Data Science ● Writing to Understand
