Session: Transfer Learning With BERT: Building a Text Classification Model
The domain of Natural Language Processing has seen a tremendous amount of research and innovation in the past couple of years aimed at building high-quality machine learning and AI solutions on natural text. Text classification is one such area that is extremely important across sectors like finance, media, and product development. Building a text classification system from scratch for every use case can be challenging in terms of cost as well as resources, and it assumes a sufficiently large dataset is available to begin training with.
This is where transfer learning comes in. Taking a model that has been pre-trained on terabytes of data and fine-tuning it for the problem at hand is an efficient way to implement machine learning solutions without spending months on a data-cleaning pipeline.
This talk will highlight ways of implementing the recently released BERT and fine-tuning the base model to build an efficient text classification model. A basic understanding of Python is desirable. Code can be made available via GitHub for everyone to examine after the talk.
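The fine-tuning pattern the talk describes, reusing a frozen pre-trained encoder and training only a lightweight classification head on top, can be sketched in plain Python. This is a toy stand-in for illustration only (the "encoder" here is a fixed random projection, not BERT, and every name in it is hypothetical); the talk's actual examples use the real pre-trained model.

```python
import math
import random

random.seed(0)

# Toy stand-in for a pretrained encoder: a fixed (frozen) random projection.
# In a real BERT workflow this would be the pretrained transformer, whose
# weights stay frozen (or are updated with a very small learning rate).
DIM_IN, DIM_FEAT = 4, 8
ENCODER_W = [[random.gauss(0, 1) for _ in range(DIM_IN)]
             for _ in range(DIM_FEAT)]

def encode(x):
    """Frozen 'pretrained' features: tanh of a fixed linear projection."""
    return [math.tanh(sum(w * xi for w, xi in zip(row, x)))
            for row in ENCODER_W]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_head(data, epochs=200, lr=0.5):
    """Train only the classification head; encoder weights are never touched."""
    w, b = [0.0] * DIM_FEAT, 0.0
    for _ in range(epochs):
        for x, y in data:
            h = encode(x)
            p = sigmoid(sum(wi * hi for wi, hi in zip(w, h)) + b)
            g = p - y  # gradient of the log loss w.r.t. the logit
            w = [wi - lr * g * hi for wi, hi in zip(w, h)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    h = encode(x)
    return 1 if sigmoid(sum(wi * hi for wi, hi in zip(w, h)) + b) >= 0.5 else 0

# Tiny synthetic labelled dataset: label is 1 iff the first input is positive.
inputs = [[random.gauss(0, 1) for _ in range(DIM_IN)] for _ in range(50)]
data = [(x, 1 if x[0] > 0 else 0) for x in inputs]

w, b = train_head(data)
accuracy = sum(predict(w, b, x) == y for x, y in data) / len(data)
```

The design point the sketch illustrates is that only the small head is trained, so the labelled dataset can be modest; the heavy lifting was already done during pre-training.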
Bio: Jayeeta Putatunda
Jayeeta is a Data Scientist specializing in Natural Language Processing and holds a Master's in Quantitative Methods and Modeling. She is passionate about exploring new concepts in the data science domain and firmly believes that data is the best storyteller. Currently, she is working on machine learning and big data projects at Indellient Inc., a leading software development and IT professional services company that works with Fortune 100 companies. Prior to this, she worked as a research analyst at Deloitte. Jayeeta is passionate about her work and is actively engaged with some amazing organizations to promote and inspire more women to take up STEM. She lives in New York, loves to cook, and spends her summers hiking and exploring the city with her husband.