Introduction to BERT | DLMC43

Bidirectional Encoder Representations from Transformers (BERT) is a Transformer-based machine learning technique for natural language processing (NLP) pre-training, developed by Google. BERT was created and published in 2018 by Jacob Devlin and his colleagues at Google.
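BERT's core pre-training objective, masked language modeling (described in the original BERT paper, not in the event description above), can be sketched in plain Python: a fraction of the input tokens is hidden, and the model must predict them from context on both the left and the right, which is what makes the representations bidirectional. This is a simplified illustration, assuming the paper's 15% masking rate and always substituting `[MASK]` (the paper additionally replaces some selected tokens with random or unchanged tokens); `mask_tokens` is a hypothetical helper, not part of any library.

```python
import random

def mask_tokens(tokens, mask_prob=0.15, seed=1):
    """BERT-style masked language modeling setup (simplified):
    hide ~mask_prob of the tokens so a model must predict them
    from surrounding context on both sides."""
    rng = random.Random(seed)  # seeded for a reproducible example
    masked = []
    targets = {}  # position -> original token the model must predict
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok
            masked.append("[MASK]")
        else:
            masked.append(tok)
    return masked, targets

sentence = "the quick brown fox jumps over the lazy dog".split()
masked, targets = mask_tokens(sentence)
print(masked)   # some tokens replaced by [MASK]
print(targets)  # the hidden tokens a model would be trained to recover
```

During actual pre-training, a Transformer encoder reads the masked sequence and is trained to output the original token at each masked position; the second objective in the paper, next-sentence prediction, is omitted here.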

Mar 15, 2021, 3:00 – 4:00 PM


Key Themes

Machine Learning


Speaker

  • Muhammad Huzaifa Shahbaz

    Lenaar Digital

    Co-founder

Partners

Office of Research, Innovation & Commercialization - ORIC

IEEE SSUET - Student Branch

IEEE Computer Society SSUET

Amal4Ajar

Organizers

  • Laiba Rafiq

    GDSC Lead

  • Maaz Farman

    SPARK⚑BIZ

    Community Mentor

  • Shayan Faiz

    techrics

    Outreach Coordinator

  • Muhammad Ahmer Zubair

    Sharp Edge

    Media Creative Lead

  • Ehtisham Ul Haq

    Tech Lead

  • Mohammad Nabeel Sohail

    AI and Chatbot Developer | Full Stack Web | PAFLA Ambassador | Public Speaker | Trainer

    Communications Lead

  • Kashan Khan Ghori

    Softseek International

    Operations Lead

  • Daniyal Jamil

    Technology Links

    Marketing Lead

  • Sami Faiz Qureshi

    ConciSafe

    Event Management Lead

  • Maham Amjad

    Content Writing Lead

  • Syed Affan Hussain

    HnH Soft Tech Solutions Pvt Ltd

    Host
