
Pretraining and Finetuning a Transformer Model for Location Resolution

Offered By: Databricks via YouTube

Tags

Data Science Courses, Deep Learning Courses, Databricks Courses, Transformer Models Courses, Hugging Face Courses

Course Description

Overview

Discover how to extract and link location entities from unstructured text data in this 20-minute conference talk by Evelyn Wang, Data Scientist at Balyasny Asset Management. Learn how Databricks and the Hugging Face Transformers library were used to pretrain and finetune a custom model tailored for location resolution. Explore how the training datasets were built, what deployment considerations arose, and lessons learned from processing locations in text data at scale. Gain insight into adapting entity linking methods to domain-specific knowledge bases, particularly for fine-grained location resolution. The talk offers practical guidance for anyone working on location intelligence, risk management, or knowledge graph construction.
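For readers who want a concrete starting point before watching, the sketch below shows one common way to finetune a Hugging Face token-classification model to tag location entities. It is illustrative only: the base checkpoint (bert-base-cased), the stand-in conll2003 corpus, the label handling, and the hyperparameters are assumptions, not the approach presented in the talk.

```python
# Illustrative sketch only: fine-tuning a token-classification model with
# Hugging Face Transformers to tag location entities. Checkpoint, dataset,
# and hyperparameters are assumptions, not the speaker's actual setup.
from datasets import load_dataset
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

# Stand-in public NER corpus with B-LOC/I-LOC tags; a domain-specific
# location dataset would be substituted in practice.
dataset = load_dataset("conll2003")
label_list = dataset["train"].features["ner_tags"].feature.names

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(label_list)
)

def tokenize_and_align(batch):
    # Tokenize pre-split words and align word-level NER tags to subword
    # tokens, masking special tokens and continuation pieces with -100.
    tokenized = tokenizer(batch["tokens"], truncation=True, is_split_into_words=True)
    all_labels = []
    for i, tags in enumerate(batch["ner_tags"]):
        word_ids = tokenized.word_ids(batch_index=i)
        previous = None
        labels = []
        for word_id in word_ids:
            if word_id is None or word_id == previous:
                labels.append(-100)
            else:
                labels.append(tags[word_id])
            previous = word_id
        all_labels.append(labels)
    tokenized["labels"] = all_labels
    return tokenized

tokenized_dataset = dataset.map(tokenize_and_align, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="location-ner", num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=tokenized_dataset["train"],
    eval_dataset=tokenized_dataset["validation"],
    data_collator=DataCollatorForTokenClassification(tokenizer),
)
trainer.train()
```

The talk goes further than this generic setup, covering domain-specific pretraining, dataset construction, and linking extracted mentions to a fine-grained location knowledge base.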

Syllabus

Pretraining and Finetuning a Transformer Model for Location Resolution


Taught by

Databricks

Related Courses

Hugging Face on Azure - Partnership and Solutions Announcement
Microsoft via YouTube
Question Answering in Azure AI - Custom and Prebuilt Solutions - Episode 49
Microsoft via YouTube
Open Source Platforms for MLOps
Duke University via Coursera
Masked Language Modelling - Retraining BERT with Hugging Face Trainer - Coding Tutorial
rupert ai via YouTube
Masked Language Modelling with Hugging Face - Microsoft Sentence Completion - Coding Tutorial
rupert ai via YouTube