Are you curious about how chatbots hold conversations or how ChatGPT generates human-like responses? This course in Natural Language Processing (NLP) is your gateway into the fascinating world where language meets AI. Designed for students and professionals alike, the course blends essential theory with hands-on experience to equip you with the skills needed to build intelligent language systems.

Recommended experience
Intermediate
Linear Algebra and Optimisation, Probability and Statistics, Introduction to Programming, and Introduction to Data Analytics
What you'll learn
Understand and recall core concepts and techniques in Natural Language Processing (NLP).
Analyse and evaluate NLP methods for varied tasks, considering performance, context, and suitability.
Design and develop real-world NLP applications by integrating multiple techniques.
Skills you'll gain
Tools you'll learn
Details to know

Add to your LinkedIn profile
140 assignments
January 2026
This course has 12 modules
This module introduces the course and its syllabus, setting the foundation for the learning journey. The introductory video offers insights into the skills and knowledge learners can expect to gain, while the syllabus reading outlines the essential course components: course values, assessment criteria, the grading system, the schedule, details of live sessions, and a recommended reading list to deepen understanding of the course concepts. The module also gives learners the opportunity to connect with fellow learners through a discussion prompt designed to facilitate introductions within the course community.
What's included
2 videos, 1 reading, 1 discussion prompt
2 videos • Total 5 minutes
- Course Introduction • 3 minutes
- Meet Your Instructor: Prof. Dr. Chetana Gavankar • 2 minutes
1 reading • Total 10 minutes
- Course Overview • 10 minutes
1 discussion prompt • Total 10 minutes
- Meet Your Peers • 10 minutes
This module introduces the fundamental concepts of Natural Language Processing (NLP). It begins with the definition of NLP and explores a variety of real-world applications. You will gain an understanding of Natural Language Understanding (NLU) and Natural Language Generation (NLG). The module also covers key evaluation metrics used to assess NLP systems. Additionally, a hands-on lab session will guide you through the implementation of basic NLP preprocessing techniques.
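To make the evaluation-metrics topic concrete, here is a minimal sketch of precision, recall, and F1, the metrics most commonly used to score NLP system output. The gold and predicted sets below are invented for illustration, not taken from the course materials:

```python
def precision_recall_f1(gold, predicted):
    """Compute precision, recall, and F1 over sets of labelled items."""
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)  # true positives: predicted items that are correct
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy example: the system found 3 entities, 2 of which are in the gold standard.
p, r, f = precision_recall_f1(gold={"a", "b", "c", "d"}, predicted={"a", "b", "x"})
```

F1 is the harmonic mean of precision and recall, so it rewards systems that balance the two rather than maximising one at the expense of the other.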
What's included
13 videos, 4 readings, 12 assignments, 1 discussion prompt
13 videos • Total 76 minutes
- NLP Definition • 3 minutes
- NLP Applications • 5 minutes
- Why Is NLP Hard? • 10 minutes
- Natural Language Understanding • 4 minutes
- Levels of Language Understanding • 5 minutes
- Natural Language Generation • 4 minutes
- Organisation of NLP System • 6 minutes
- Intrinsic vs. Extrinsic Evaluation • 4 minutes
- Challenges in Evaluation • 4 minutes
- NLP Tools Overview • 7 minutes
- Demo of NLP Tools • 6 minutes
- Basic NLP Application Development Using NLP Tools • 13 minutes
- Module Wrap-Up • 6 minutes
4 readings • Total 60 minutes
- Recommended Reading: What is NLP? • 15 minutes
- Recommended Reading: NLP Fundamentals • 15 minutes
- Recommended Reading: Evaluation of NLP Systems • 15 minutes
- Recommended Reading: NLP Tools Introduction • 15 minutes
12 assignments • Total 45 minutes
- NLP Definition • 6 minutes
- NLP Applications • 3 minutes
- Why NLP is a Hard Problem • 3 minutes
- Natural Language Understanding • 3 minutes
- Levels of Language Understanding • 3 minutes
- Natural Language Generation • 3 minutes
- Organisation of NLP System • 3 minutes
- Intrinsic vs. Extrinsic Evaluation • 6 minutes
- Challenges in Evaluation • 3 minutes
- NLP Tools Overview • 6 minutes
- Demo of NLP Tools • 3 minutes
- Basic NLP Application Development Using NLP Tools • 3 minutes
1 discussion prompt • Total 30 minutes
- Real-World Challenges and Tools in Natural Language Processing • 30 minutes
This module introduces essential NLP preprocessing techniques. It begins with regular expressions for text pattern matching, followed by an overview of words and corpora as foundational data sources. Sentence segmentation and tokenization are then covered through practical demonstrations. Finally, the module explores normalization, lemmatization, and stemming as methods to standardise text, with a demo highlighting their differences and effects.
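As a taste of what the demos cover, here is a minimal, library-free sketch of sentence segmentation, tokenization, and normalization built on regular expressions. The splitting rules and example text are illustrative only; the course demos use fuller toolkits:

```python
import re

def sentence_segment(text):
    # Naive rule: split after ., !, or ? when followed by whitespace
    # and a capital letter. Real segmenters handle abbreviations etc.
    return re.split(r'(?<=[.!?])\s+(?=[A-Z])', text.strip())

def tokenize(sentence):
    # Word tokens are runs of word characters; punctuation marks
    # become separate tokens.
    return re.findall(r"\w+|[^\w\s]", sentence)

def normalize(tokens):
    # Case-folding is the simplest normalization step.
    return [t.lower() for t in tokens]

text = "NLP is fun. Tokenization splits text into units!"
sents = sentence_segment(text)
tokens = normalize(tokenize(sents[0]))
```

Stemming and lemmatization would follow as further steps in the same pipeline, reducing each token to a stem or dictionary form.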
What's included
14 videos, 5 readings, 14 assignments, 1 discussion prompt
14 videos • Total 79 minutes
- Regular Expressions • 8 minutes
- Words and Corpora • 5 minutes
- Sentence Segmentation • 3 minutes
- Code Demo: Segmentation • 5 minutes
- Tokenization • 5 minutes
- Tokenization Methods • 7 minutes
- Code Demo: Tokenization • 14 minutes
- Normalization • 4 minutes
- Code Demo: Normalization • 4 minutes
- Stemming • 6 minutes
- Code Demo: Stemming • 5 minutes
- Lemmatization • 3 minutes
- Code Demo: Lemmatization • 6 minutes
- Module Wrap-Up • 4 minutes
5 readings • Total 130 minutes
- Recommended Reading: Basic Text Preprocessing • 35 minutes
- Recommended Reading: Segmentation and Tokenization • 30 minutes
- Recommended Reading: Normalization • 20 minutes
- Recommended Reading: Stemming and Lemmatization • 30 minutes
- Instructional Document: Staff-Graded Assignment-1 • 15 minutes
14 assignments • Total 99 minutes
- Graded Quiz: Modules 1 and 2 • 60 minutes
- Regular Expressions • 3 minutes
- Words and Corpora • 3 minutes
- Sentence Segmentation • 3 minutes
- Code Demo: Segmentation • 3 minutes
- Tokenization • 3 minutes
- Tokenization Methods • 3 minutes
- Code Demo: Tokenization • 3 minutes
- Normalization • 3 minutes
- Code Demo: Normalization • 3 minutes
- Stemming • 3 minutes
- Code Demo: Stemming • 3 minutes
- Lemmatization • 3 minutes
- Code Demo: Lemmatization • 3 minutes
1 discussion prompt • Total 30 minutes
- Building a Preprocessing Pipeline: Challenges and Solutions • 30 minutes
This module explores lexical and vector semantics, focusing on computational representations of word meaning. It covers word vectors, Bag of Words, and co-occurrence matrices to capture contextual relationships. Techniques such as TF-IDF are introduced to measure word importance, along with methods for computing word similarity. Practical examples and mathematical exercises on TF-IDF help reinforce these core NLP concepts.
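The core pipeline of this module, TF-IDF vectors compared by cosine similarity, fits in a short sketch. The three-document corpus below is invented for illustration:

```python
import math
from collections import Counter

docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are pets",
]
tokenized = [d.split() for d in docs]
vocab = sorted({w for d in tokenized for w in d})
N = len(tokenized)

def tf_idf_vector(tokens):
    # tf = relative frequency in the document;
    # idf = log(N / document frequency) down-weights common terms.
    counts = Counter(tokens)
    vec = []
    for w in vocab:
        tf = counts[w] / len(tokens)
        df = sum(1 for d in tokenized if w in d)
        idf = math.log(N / df)
        vec.append(tf * idf)
    return vec

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

v0, v1, v2 = (tf_idf_vector(d) for d in tokenized)
```

Documents 1 and 2 share several terms ("sat", "on"), so their cosine similarity is higher than between document 1 and the third document, which shares none.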
What's included
13 videos, 3 readings, 10 assignments, 1 discussion prompt
13 videos • Total 72 minutes
- Lexical Semantics • 3 minutes
- Why Vectors? • 7 minutes
- Words and Vectors • 8 minutes
- Bag of Words • 4 minutes
- Computing Word Similarity • 3 minutes
- Cosine Similarity • 4 minutes
- Cosine Similarity Example • 7 minutes
- Term Frequency • 4 minutes
- Inverse Document Frequency • 11 minutes
- TF-IDF • 7 minutes
- Demo of Words as Vectors • 4 minutes
- Demo of TF-IDF • 8 minutes
- Module Wrap-Up • 4 minutes
3 readings • Total 45 minutes
- Recommended Reading: Foundations of Lexical and Vector Semantics • 15 minutes
- Recommended Reading: Representing Text Using Vectors • 15 minutes
- Recommended Reading: Term and Inverse Document Frequency • 15 minutes
10 assignments • Total 30 minutes
- Lexical Semantics • 3 minutes
- Why Vectors? • 3 minutes
- Words and Vectors • 3 minutes
- Bag of Words • 3 minutes
- Computing Word Similarity • 3 minutes
- Cosine Similarity • 3 minutes
- Cosine Similarity Example • 3 minutes
- Term Frequency • 3 minutes
- Inverse Document Frequency • 3 minutes
- TF-IDF • 3 minutes
1 discussion prompt • Total 20 minutes
- Applying Vector Semantics in a Real-World Scenario • 20 minutes
This module introduces Word Embeddings, focusing on the transition from sparse to dense vector representations of words. It covers Word2Vec models, including Skip-gram and CBOW, explained with simple, intuitive examples. The module also explores GloVe embeddings, which capture global word co-occurrence statistics for improved semantic understanding. Learners will visualise word embeddings to gain insights into how words relate in vector space. Finally, the module highlights real-world applications of word embeddings in NLP tasks like sentiment analysis, machine translation, and question answering.
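The skip-gram setup the module builds on can be sketched quickly: from a token stream, generate (center, context) training pairs, and, for SGNS, pair each positive example with randomly drawn "noise" words. The four-word sentence and sampling scheme below are illustrative only:

```python
import random

random.seed(0)  # reproducible negative draws

def skipgram_pairs(tokens, window=2):
    """(center, context) positive pairs within a symmetric window."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

def negative_samples(center, vocab, k=2):
    """SGNS pairs each positive example with k random noise words."""
    return [(center, random.choice(vocab)) for _ in range(k)]

toks = "the quick brown fox".split()
pairs = skipgram_pairs(toks, window=1)
negs = negative_samples("quick", toks, k=2)
```

Training then adjusts the embedding vectors so that positive pairs score high under a sigmoid of their dot product and negative pairs score low, which is exactly the log-loss objective covered in the SGNS videos.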
What's included
13 videos, 3 readings, 14 assignments, 1 discussion prompt
13 videos • Total 79 minutes
- Word2Vec • 4 minutes
- Basic One-Hot Word Representation • 4 minutes
- Feature-Based Word Representations • 3 minutes
- Skip-Gram Algorithm Introduction • 6 minutes
- Skip-Gram Probabilities • 8 minutes
- Skip-Gram Negative Sampling (SGNS) Approach • 7 minutes
- Skip-Gram Negative Sampling Training Data Example • 7 minutes
- SGNS Log Loss Function • 7 minutes
- Derivative of SGNS Loss Function • 6 minutes
- SGNS Example Part 1 • 12 minutes
- SGNS Example Part 2 • 8 minutes
- Continuous Bag of Words (CBOW) • 5 minutes
- Module Wrap-Up • 4 minutes
3 readings • Total 45 minutes
- Recommended Reading: Basics of Word2Vec • 15 minutes
- Recommended Reading: Skip-Gram Word Embedding • 15 minutes
- Recommended Reading: Other Word2Vec Approaches – CBOW and GloVe • 15 minutes
14 assignments • Total 396 minutes
- Graded Quiz: Modules 3 and 4 • 60 minutes
- SGA-1 Submission: Word Embedding • 300 minutes
- Word2Vec • 3 minutes
- Basic One-Hot Word Representation • 3 minutes
- Feature-Based Word Representations • 3 minutes
- Skip-Gram Algorithm Introduction • 3 minutes
- Skip-Gram Probabilities • 3 minutes
- Skip-Gram Negative Sampling (SGNS) Approach • 3 minutes
- Skip-Gram Negative Sampling Training Data Example • 3 minutes
- SGNS Log Loss Function • 3 minutes
- Derivative of SGNS Loss Function • 3 minutes
- SGNS Example Part 1 • 3 minutes
- SGNS Example Part 2 • 3 minutes
- Continuous Bag of Words (CBOW) • 3 minutes
1 discussion prompt • Total 20 minutes
- The Power of Dense Vectors: Choosing an Embedding Model • 20 minutes
This module introduces Language Modeling (LM) and its role in predicting word sequences in natural language. It explores practical applications of LMs and explains N-gram models, including challenges like generalization and handling zero probabilities. Techniques such as smoothing and stupid backoff are covered to improve model robustness. The module concludes with methods for evaluating language models using standard metrics.
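The n-gram ideas above fit in a few lines. Here is a minimal bigram model with add-one (Laplace) smoothing and perplexity, over a toy two-sentence corpus invented for illustration:

```python
import math
from collections import Counter

corpus = [["<s>", "i", "like", "nlp", "</s>"],
          ["<s>", "i", "like", "cats", "</s>"]]
unigrams = Counter(w for s in corpus for w in s)
bigrams = Counter((s[i], s[i + 1]) for s in corpus for i in range(len(s) - 1))
V = len(unigrams)  # vocabulary size, used by add-one smoothing

def p_laplace(w_prev, w):
    # Add-one smoothing: no bigram gets zero probability.
    return (bigrams[(w_prev, w)] + 1) / (unigrams[w_prev] + V)

def perplexity(sent):
    # Lower perplexity means the model finds the sentence less surprising.
    logp = sum(math.log(p_laplace(sent[i], sent[i + 1]))
               for i in range(len(sent) - 1))
    return math.exp(-logp / (len(sent) - 1))
```

A sentence built from seen bigrams gets a lower perplexity than a scrambled version of the same words, which is the intuition behind using perplexity as an intrinsic evaluation metric.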
What's included
15 videos, 4 readings, 13 assignments, 1 discussion prompt
15 videos • Total 96 minutes
- What is Language Modelling? • 3 minutes
- Language Modelling Applications • 3 minutes
- How to Build a Language Model • 5 minutes
- Markov Assumption • 2 minutes
- N-gram Language Models • 4 minutes
- Bi-gram Computation • 10 minutes
- Raw Probabilities • 10 minutes
- Perils of Overfitting • 3 minutes
- Laplace Smoothing • 14 minutes
- Interpolation & Backoff • 10 minutes
- How Good is the Model? • 3 minutes
- Extrinsic Evaluation • 5 minutes
- Perplexity & Its Example • 9 minutes
- Module Demo • 10 minutes
- Module Wrap-Up • 5 minutes
4 readings • Total 60 minutes
- Recommended Reading: Language Modelling Introduction • 15 minutes
- Recommended Reading: N-grams • 15 minutes
- Recommended Reading: Smoothing • 15 minutes
- Recommended Reading: Language Modelling Evaluation • 15 minutes
13 assignments • Total 39 minutes
- What is Language Modelling? • 3 minutes
- Language Modelling Applications • 3 minutes
- How to Build a Language Model • 3 minutes
- Markov Assumption • 3 minutes
- N-gram Language Models • 3 minutes
- Bi-gram Computation • 3 minutes
- Raw Probabilities • 3 minutes
- Perils of Overfitting • 3 minutes
- Laplace Smoothing • 3 minutes
- Interpolation & Backoff • 3 minutes
- How Good is the Model? • 3 minutes
- Extrinsic Evaluation • 3 minutes
- Perplexity & Its Example • 3 minutes
1 discussion prompt • Total 20 minutes
- Balancing Simplicity and Performance in Language Modelling • 20 minutes
This module explores the use of Neural Networks in Language Modelling, starting with the fundamentals of Feed-Forward Neural Networks and their training process for language tasks. It introduces Neural Language Models, which capture complex patterns in text beyond traditional statistical methods. The module also provides a foundational understanding of Large Language Models (LLMs) and their capabilities. Finally, it introduces Prompt Engineering as a technique to effectively interact with and guide LLMs for various NLP applications.
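The two building blocks the module starts from, a single neural unit and the softmax output layer, can be sketched without any framework. The weights and inputs below are arbitrary illustrative values:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    """A single unit: weighted sum of inputs, then a non-linear activation."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

def softmax(scores):
    """Turn raw scores into a probability distribution over classes."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

h = neuron([0.5, -1.0], [0.8, 0.2], 0.1)  # one hidden-unit activation
probs = softmax([2.0, 1.0, 0.1])          # e.g. scores over 3 next-word candidates
```

A neural language model stacks such layers and ends with a softmax over the vocabulary, so the highest-scoring word gets the highest probability while all probabilities still sum to one.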
What's included
17 videos, 6 readings, 16 assignments, 1 discussion prompt
17 videos • Total 98 minutes
- Neural Network Unit • 3 minutes
- Non-Linear Activation Functions • 5 minutes
- Perceptron with Examples • 4 minutes
- Multi-Layer Perceptron • 8 minutes
- Softmax Function with Example • 4 minutes
- Fully Connected Neural Network • 4 minutes
- Feedforward Network • 5 minutes
- Forward Algorithm • 4 minutes
- Backpropagation Algorithm • 5 minutes
- Training Neural Networks • 12 minutes
- Neural Language Modelling • 6 minutes
- Training Neural Language Model • 9 minutes
- N-gram Versus Neural Language Model • 4 minutes
- Neural LM Demo • 10 minutes
- What is an LLM? • 6 minutes
- LLM Use Cases • 5 minutes
- Module Wrap-Up • 3 minutes
6 readings • Total 105 minutes
- Recommended Reading: Introduction to Neural Networks • 15 minutes
- Recommended Reading: Feedforward Neural Networks • 15 minutes
- Recommended Reading: Training Neural Networks • 15 minutes
- Recommended Reading: Neural Language Models • 15 minutes
- Recommended Reading: Introduction to Large Language Models • 30 minutes
- Instructional Document: Staff-Graded Assignment-2 • 15 minutes
16 assignments • Total 105 minutes
- Graded Quiz: Modules 5 and 6 • 60 minutes
- Neural Network Unit • 3 minutes
- Non-Linear Activation Functions • 3 minutes
- Perceptron with Examples • 3 minutes
- Multi-Layer Perceptron • 3 minutes
- Softmax Function with Example • 3 minutes
- Fully Connected Neural Network • 3 minutes
- Feedforward Network • 3 minutes
- Forward Algorithm • 3 minutes
- Backpropagation Algorithm • 3 minutes
- Training Neural Networks • 3 minutes
- Neural Language Modelling • 3 minutes
- Training Neural Language Model • 3 minutes
- N-gram Versus Neural Language Model • 3 minutes
- What is an LLM? • 3 minutes
- LLM Use Cases • 3 minutes
1 discussion prompt • Total 20 minutes
- The Next Generation of Language Modelling: From N-grams to LLMs • 20 minutes
This module introduces Part-of-Speech (POS) tagging, a fundamental NLP task that assigns grammatical categories (such as noun, verb, or adjective) to words in text. Starting from linguistic foundations and real-world applications, the module traces the evolution of POS tagging techniques, from statistical models such as Hidden Markov Models (HMMs) and Maximum Entropy classifiers to modern deep learning approaches using Recurrent Neural Networks (RNNs). Learners will gain a solid theoretical understanding and see how POS tagging supports downstream tasks such as parsing, named entity recognition, and machine translation. The module includes a hands-on coding demonstration of POS tagging.
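The HMM tagging approach the module covers comes down to the Viterbi algorithm: find the tag sequence with the highest joint probability of transitions and emissions. Here is a compact sketch; the two-tag tagset and all probabilities below are made up for illustration:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely state (tag) sequence for obs under an HMM."""
    # V[t][s]: best probability of any path ending in state s at time t.
    V = [{s: start_p[s] * emit_p[s].get(obs[0], 0.0) for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][ps] * trans_p[ps][s] * emit_p[s].get(obs[t], 0.0), ps)
                for ps in states)
            V[t][s] = prob
            back[t][s] = prev
    # Follow backpointers from the best final state.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

states = ["NOUN", "VERB"]
start_p = {"NOUN": 0.6, "VERB": 0.4}
trans_p = {"NOUN": {"NOUN": 0.3, "VERB": 0.7},
           "VERB": {"NOUN": 0.6, "VERB": 0.4}}
emit_p = {"NOUN": {"dogs": 0.5, "bark": 0.1},
          "VERB": {"dogs": 0.1, "bark": 0.6}}
tags = viterbi(["dogs", "bark"], states, start_p, trans_p, emit_p)
```

Dynamic programming keeps this tractable: instead of scoring every possible tag sequence, each cell only stores the single best path reaching it.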
What's included
13 videos, 5 readings, 11 assignments, 1 discussion prompt
13 videos • Total 74 minutes
- Outline of the Module • 2 minutes
- What is POS Tagging? • 6 minutes
- Challenges in POS Tagging • 4 minutes
- POS Tagsets • 6 minutes
- Markov Chain • 5 minutes
- Hidden Markov Model • 5 minutes
- Hidden Markov Model as POS Tagger • 6 minutes
- Viterbi Algorithm • 8 minutes
- Viterbi Algorithm - Example • 8 minutes
- Logistic Regression - Overview • 9 minutes
- Multinomial Logistic Regression - Overview • 6 minutes
- Maximum Entropy Markov Models (MEMM) • 7 minutes
- Module Wrap-Up • 2 minutes
5 readings • Total 110 minutes
- Code Document: POS Tagging Using NLTK / spaCy • 10 minutes
- Recommended Reading: Introduction to POS Tagging and Applications • 30 minutes
- Code Document: Demonstrating an HMM-Based POS Tagger • 10 minutes
- Recommended Reading: HMM for POS Tagging • 30 minutes
- Recommended Reading: Maximum Entropy Markov Models • 30 minutes
11 assignments • Total 33 minutes
- What is POS Tagging? • 3 minutes
- Challenges in POS Tagging • 3 minutes
- POS Tagsets • 3 minutes
- Markov Chain • 3 minutes
- Hidden Markov Model • 3 minutes
- Hidden Markov Model as POS Tagger • 3 minutes
- Viterbi Algorithm • 3 minutes
- Viterbi Algorithm - Example • 3 minutes
- Logistic Regression - Overview • 3 minutes
- Multinomial Logistic Regression - Overview • 3 minutes
- Maximum Entropy Markov Models (MEMM) • 3 minutes
1 discussion prompt • Total 30 minutes
- POS Tagging: The Right Tool for the Job • 30 minutes
This module introduces students to the syntactic structure of natural language and its critical role in Natural Language Processing (NLP) applications. Parsing is the task of assigning a structured representation—typically a tree—to a sentence, revealing the grammatical relationships between its components. The module begins by revisiting Context-Free Grammars (CFGs) and how they form the foundation for syntactic parsing. We explore Constituent Parsing, introducing classical parsing techniques such as the CKY (Cocke-Kasami-Younger) algorithm. The module then transitions to modern span-based neural parsing approaches that use neural networks to score and predict parse trees. A significant portion of the module is dedicated to Dependency Parsing, where syntactic structure is represented through direct relationships between words rather than phrases. Students will study both transition-based and graph-based dependency parsers, gaining insight into their strengths, algorithmic designs, and practical performance. Throughout the module, we emphasise real-world NLP applications.
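The CKY algorithm described above can be sketched as a recognizer: fill a triangular table bottom-up with the non-terminals that can span each substring. The toy grammar below is in Chomsky Normal Form and is invented for illustration:

```python
from itertools import product

# Toy CNF grammar (hypothetical, for illustration only).
binary = {("NP", "VP"): {"S"}, ("Det", "N"): {"NP"}, ("V", "NP"): {"VP"}}
lexical = {"the": {"Det"}, "dog": {"N"}, "cat": {"N"}, "saw": {"V"}}

def cky_recognize(words):
    n = len(words)
    # table[i][j] holds the non-terminals that can derive words[i:j].
    table = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        table[i][i + 1] = set(lexical.get(w, set()))
    for span in range(2, n + 1):            # widen spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):       # try every split point
                for B, C in product(table[i][k], table[k][j]):
                    table[i][j] |= binary.get((B, C), set())
    return "S" in table[0][n]
```

A full parser would also store backpointers in each cell so the parse tree can be recovered from the table, which is exactly what the "Parse Tree Recovery From CKY Table" video covers.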
What's included
18 videos, 4 readings, 18 assignments, 1 discussion prompt
18 videos • Total 88 minutes
- Outline of the Module • 2 minutes
- Introduction to Context-Free Grammars (CFGs) • 8 minutes
- Constituency and Phrase Structure • 5 minutes
- Ambiguity in Grammar • 4 minutes
- Chomsky Normal Form (CNF) and Grammar Normalisation • 5 minutes
- Treebanks and Empirical Grammar • 3 minutes
- CKY Algorithm • 7 minutes
- CKY Algorithm - Walkthrough • 8 minutes
- Parse Tree Recovery From CKY Table • 5 minutes
- Neural Span-Based Constituency Parsing • 5 minutes
- What is Dependency Parsing? • 5 minutes
- Dependency Formalism • 5 minutes
- Universal Dependency Relations • 4 minutes
- Transition-Based Dependency Parsing • 6 minutes
- Transition-Based Dependency Parsing - Walkthrough • 5 minutes
- Creating an Oracle • 4 minutes
- Graph-Based Dependency Parsing • 5 minutes
- Module Wrap-Up • 2 minutes
4 readings • Total 120 minutes
- Recommended Reading: Review of Context-Free Grammars and Parsing in NLP • 30 minutes
- Recommended Reading: Constituency Parsing and the CKY Algorithm • 30 minutes
- Recommended Reading: Dependency Parsing – Theory and Representations • 30 minutes
- Recommended Reading: Dependency Parsing Algorithms and Modern Applications • 30 minutes
18 assignments • Total 411 minutes
- Graded Quiz: Modules 7 and 8 • 60 minutes
- SGA-2: POS Tagging and Parsing • 300 minutes
- Introduction to Context-Free Grammars (CFGs) • 3 minutes
- Constituency and Phrase Structure • 3 minutes
- Ambiguity in Grammar • 3 minutes
- Chomsky Normal Form (CNF) and Grammar Normalisation • 3 minutes
- Treebanks and Empirical Grammar • 3 minutes
- CKY Algorithm • 3 minutes
- CKY Algorithm - Walkthrough • 3 minutes
- Parse Tree Recovery From CKY Table • 3 minutes
- Neural Span-Based Constituency Parsing • 3 minutes
- What is Dependency Parsing? • 3 minutes
- Dependency Formalism • 3 minutes
- Universal Dependency Relations • 3 minutes
- Transition-Based Dependency Parsing • 3 minutes
- Transition-Based Dependency Parsing - Walkthrough • 6 minutes
- Creating an Oracle • 3 minutes
- Graph-Based Dependency Parsing • 3 minutes
1 discussion prompt • Total 30 minutes
- Parsing Frameworks: Constituent vs. Dependency • 30 minutes
This module explores the semantic dimension of natural language by covering both lexical semantics—including word senses, ambiguity, and disambiguation techniques—and the semantic web—a framework for enabling machine-readable, structured understanding of web data. The module starts with foundational concepts in lexical semantics and WordNet, then proceeds to classical and modern word sense disambiguation (WSD) methods. The second part focuses on Semantic Web technologies, covering ontologies, knowledge graphs, RDF/OWL, and their role in enabling intelligent systems and knowledge-driven NLP applications.
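The Lesk algorithm mentioned above has a particularly simple core: choose the sense whose dictionary gloss shares the most words with the ambiguous word's context. The glosses and stopword list below are hypothetical, not taken from WordNet:

```python
# Hypothetical sense inventory for "bank" (glosses invented for illustration).
senses = {
    "bank_1": "sloping land beside a body of water such as a river",
    "bank_2": "a financial institution that accepts deposits and lends money",
}

STOPWORDS = {"a", "the", "of", "and", "that", "such", "as", "on", "my"}

def simplified_lesk(context, senses):
    """Pick the sense whose gloss overlaps most with the context words."""
    ctx = {w for w in context.lower().split() if w not in STOPWORDS}
    best, best_overlap = None, -1
    for sense, gloss in senses.items():
        overlap = len(ctx & {w for w in gloss.split() if w not in STOPWORDS})
        if overlap > best_overlap:
            best, best_overlap = sense, overlap
    return best
```

In practice the glosses would come from a lexicon such as WordNet (e.g. via `nltk.corpus.wordnet`), and modern WSD systems extend this overlap idea with sense-annotated corpora and embeddings.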
What's included
17 videos, 5 readings, 14 assignments, 1 discussion prompt
17 videos • Total 85 minutes
- Outline of the Module • 1 minute
- What is a Word Sense? • 3 minutes
- Homonymy vs. Polysemy • 7 minutes
- Sense Relations • 7 minutes
- Introduction to WordNet and Synsets • 7 minutes
- Relations in WordNet • 5 minutes
- Navigating WordNet Hierarchies and Graph Structures • 5 minutes
- What is Word Sense Disambiguation? • 4 minutes
- Supervised WSD • 8 minutes
- Knowledge-Based WSD: Lesk Algorithm • 5 minutes
- From Syntactic Web to Semantic Web: What's the Problem? • 6 minutes
- Semantic Web Vision: Data Integration and Automation • 3 minutes
- Ontologies • 4 minutes
- Ontology Languages and Their Layers • 9 minutes
- What is a Knowledge Graph? • 3 minutes
- Applications in NLP • 6 minutes
- Module Wrap-Up • 1 minute
5 readings • Total 130 minutes
- Recommended Reading: Word Senses and Lexical Semantics • 30 minutes
- Code Document: Querying WordNet in Python (using nltk.corpus.wordnet) • 10 minutes
- Recommended Reading: WordNet and Semantic Lexicons • 30 minutes
- Recommended Reading: Word Sense Disambiguation (WSD) • 30 minutes
- Recommended Reading: Introduction to the Semantic Web and Ontologies • 30 minutes
14 assignments • Total 42 minutes
- What is a Word Sense? • 3 minutes
- Homonymy vs. Polysemy • 3 minutes
- Sense Relations • 3 minutes
- Introduction to WordNet and Synsets • 3 minutes
- Relations in WordNet • 3 minutes
- Navigating WordNet Hierarchies and Graph Structures • 3 minutes
- What is Word Sense Disambiguation? • 3 minutes
- Supervised WSD • 3 minutes
- Knowledge-Based WSD: Lesk Algorithm • 3 minutes
- Semantic Web Vision: Data Integration and Automation • 3 minutes
- Ontologies • 3 minutes
- Ontology Languages and Their Layers • 3 minutes
- What is a Knowledge Graph? • 3 minutes
- Applications in NLP • 3 minutes
1 discussion prompt • Total 30 minutes
- Disambiguating the Future: WSD and the Semantic Web • 30 minutes
This module introduces students to the evolution of neural network architectures in NLP, beginning with recurrent models (RNNs), progressing through attention mechanisms, and culminating in Transformer-based models that have revolutionised natural language processing. Through hands-on coding and application-driven lessons, students will explore how Transformers power state-of-the-art systems in sentiment analysis (text classification), machine translation, and question answering. The module emphasises both theoretical foundations and practical implementation using modern deep learning frameworks.
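The attention mechanism at the heart of the Transformer can be shown in miniature. Here is scaled dot-product self-attention over plain Python lists; the two-token, two-dimensional input is invented for illustration (real implementations use tensor libraries and learned query/key/value projections):

```python
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Output = attention-weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Self-attention: the sequence attends to itself (Q = K = V).
X = [[1.0, 0.0], [0.0, 1.0]]
Y = attention(X, X, X)
```

Multi-head attention runs several such computations in parallel over different learned projections, and positional encodings are added to the inputs so the model can distinguish token order.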
What's included
16 videos, 5 readings, 17 assignments, 1 discussion prompt
16 videos • Total 97 minutes
- What RNNs Are and Why They Fall Short • 7 minutes
- Why Do We Need Attention? • 5 minutes
- The Attention Mechanism Explained • 6 minutes
- From Attention to Transformer Architecture • 6 minutes
- High-Level Structure of the Transformer • 4 minutes
- Self-Attention in Detail • 6 minutes
- Multi-Head Attention • 4 minutes
- Positional Encodings • 4 minutes
- Popular Transformer Variants • 5 minutes
- What Text Summarisation Is and Its Uses • 2 minutes
- Types of Text Summarisation • 5 minutes
- Natural Text Summarisation • 11 minutes
- Stages of Text Summarisation • 6 minutes
- Demo of Text Summarisation • 9 minutes
- Ethical Issues in NLP • 10 minutes
- Ethical Design of NLP Applications • 6 minutes
5 readings • Total 130 minutes
- Recommended Reading: From RNNs to Attention • 30 minutes
- Recommended Reading: Transformer Architecture • 30 minutes
- Code Document: Transformer Demonstration with Classification • 10 minutes
- NLP Application - Text Summarisation • 30 minutes
- Recommended Reading: Ethics in NLP • 30 minutes
17 assignments • Total 108 minutes
- Graded Quiz: Modules 9 and 10 • 60 minutes
- What RNNs Are and Why They Fall Short • 3 minutes
- Why Do We Need Attention? • 3 minutes
- The Attention Mechanism Explained • 3 minutes
- From Attention to Transformer Architecture • 3 minutes
- High-Level Structure of the Transformer • 3 minutes
- Self-Attention in Detail • 3 minutes
- Multi-Head Attention • 3 minutes
- Positional Encodings • 3 minutes
- Popular Transformer Variants • 3 minutes
- What Text Summarisation Is and Its Uses • 3 minutes
- Types of Text Summarisation • 3 minutes
- Natural Text Summarisation • 3 minutes
- Stages of Text Summarisation • 3 minutes
- Demo of Text Summarisation • 3 minutes
- Ethical Issues in NLP • 3 minutes
- Ethical Design of NLP Applications • 3 minutes
1 discussion prompt • Total 30 minutes
- The Power and Peril of Large Language Models • 30 minutes
End Term Examination
What's included
1 assignment
1 assignment • Total 30 minutes
- End Term Examination • 30 minutes
Instructor

Offered by
Birla Institute of Technology & Science, Pilani (BITS Pilani) is one of only ten private universities in India to be recognised as an Institute of Eminence by the Ministry of Human Resource Development, Government of India. It has been consistently ranked highly by both governmental and private ranking agencies for the innovative processes and capabilities that have enabled it to impart quality education and emerge as the best private science and engineering institute in India. BITS Pilani has four campuses, in Pilani, Goa, Hyderabad, and Dubai, and has been offering bachelor's, master's, and certificate programmes for over 58 years, helping to launch the careers of over 100,000 professionals.