BertEntityLinking

Your interests/skills: Deep Learning, Natural Language Processing, Knowledge Bases

Step 1: Choose a meaningful example (in the context of Entity Linking) and visualize BERT word embeddings for it. We hope to find that contextual BERT word embeddings can differentiate between identical names that refer to different entities; this observation would motivate our BERT-based approach to Entity Linking.
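
As a rough starting point, a comparison like the following could be used (a minimal sketch assuming PyTorch and the Hugging Face transformers library; the model name, the example sentences, and the choice of "Paris" as the ambiguous name are illustrative, not prescribed by the project). It embeds the same surface form in two contexts and compares the contextual embeddings; for an actual visualization, one would collect many such occurrences and project them to 2D, e.g. with PCA or t-SNE.

{{{#!python
# Minimal sketch: embed the ambiguous name "Paris" in two contexts and
# compare the contextual BERT embeddings of the mention token.
# Assumes: pip install torch transformers
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

sentences = [
    "Paris is the capital of France.",           # Paris, the city
    "Paris Hilton attended the film premiere.",  # Paris, the person
]

embeddings = []
for sentence in sentences:
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Locate the token "paris" and take its contextual embedding.
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    idx = tokens.index("paris")
    embeddings.append(outputs.last_hidden_state[0, idx])

# Same surface form, different entities -> the two embeddings should differ
# noticeably (unlike static Word2Vec/GloVe vectors, which are identical).
similarity = torch.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"Cosine similarity of the two 'Paris' embeddings: {similarity:.3f}")
}}}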

Step 2: Implement a neural network that uses BERT word embeddings to link entity mentions to their entries in the knowledge base WikiData. For example, the network could take the BERT word embeddings of a recognized entity mention in a text, together with word embeddings that represent a candidate entity in the knowledge base, and decide whether the mention should be linked to that entity. Candidate entities can be extracted from the knowledge base via the entity's label and its aliases.
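
A minimal sketch of such a mention-candidate scorer, assuming PyTorch (the dimensions, the architecture, and the way the candidate embedding is built are illustrative choices, not fixed by the project description):

{{{#!python
# Minimal sketch of a binary mention-candidate scorer in PyTorch.
# Inputs: a BERT embedding of the mention and an embedding representing a
# candidate entity (e.g. averaged BERT embeddings of the entity's WikiData
# label and aliases). Output: probability that the mention links to it.
import torch
import torch.nn as nn

class CandidateScorer(nn.Module):
    def __init__(self, embedding_dim=768, hidden_dim=256):
        super().__init__()
        # Concatenate mention and candidate embeddings, then score the
        # pair with a small feed-forward network.
        self.scorer = nn.Sequential(
            nn.Linear(2 * embedding_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, mention_emb, candidate_emb):
        pair = torch.cat([mention_emb, candidate_emb], dim=-1)
        return torch.sigmoid(self.scorer(pair)).squeeze(-1)

# Dummy usage: score 4 (mention, candidate) pairs. At prediction time, the
# mention is linked to its highest-scoring candidate (or to nothing if all
# scores fall below a threshold).
scorer = CandidateScorer()
mention_embs = torch.randn(4, 768)
candidate_embs = torch.randn(4, 768)
print(scorer(mention_embs, candidate_embs))  # 4 probabilities in [0, 1]
}}}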

Step 3: Compare your approach against state-of-the-art approaches and at least one baseline. What happens if you use Word2Vec or GloVe instead of BERT word embeddings? Evaluate your approach on at least two benchmarks (e.g. AIDA-CoNLL, Wikipedia with links, our benchmark, ...).
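
For a fair comparison, all systems should be scored with the same metric on the same annotations. A minimal sketch of micro-averaged precision, recall, and F1 over linked mentions follows; the triple format and the WikiData IDs in the dummy data are illustrative assumptions, not a prescribed format.

{{{#!python
# Minimal sketch of micro-averaged evaluation for entity linking.
# Assumes gold and predicted annotations are given as sets of
# (document_id, mention_span, entity_id) triples; benchmark files
# (e.g. AIDA-CoNLL) would first be converted into this format.
def evaluate(gold, predicted):
    true_positives = len(gold & predicted)
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall > 0 else 0.0)
    return {"precision": precision, "recall": recall, "f1": f1}

# Dummy data: 2 of 3 predictions match the gold annotations.
gold = {("doc1", (0, 5), "Q90"), ("doc1", (24, 30), "Q142"),
        ("doc2", (3, 8), "Q64")}
predicted = {("doc1", (0, 5), "Q90"), ("doc1", (24, 30), "Q142"),
             ("doc2", (3, 8), "Q30")}
print(evaluate(gold, predicted))  # precision = recall = f1 = 2/3
}}}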
