Coursera: Natural Language Processing (Dan Jurafsky, Christopher Manning) €10 buy download

Year: 2016
Publisher: Coursera
Duration: 15:40:24
Language: English

This course covers a broad range of topics in natural language processing, including word and sentence tokenization, text classification and sentiment analysis, spelling correction, information extraction, parsing, meaning extraction, and question answering. We will also introduce the underlying theory from probability, statistics, and machine learning that is crucial for the field, and cover fundamental algorithms such as n-gram language modeling, Naive Bayes and MaxEnt classifiers, sequence models like Hidden Markov Models, probabilistic dependency and constituent parsing, and vector-space models of meaning.
We are offering this course on Natural Language Processing free and online to students worldwide, continuing Stanford's exciting forays into large-scale online instruction. Students have access to screencast lecture videos, receive quiz questions, assignments, and exams, get regular feedback on progress, and can participate in a discussion forum. Those who successfully complete the course will receive a statement of accomplishment. Taught by Professors Jurafsky and Manning, the curriculum draws from Stanford's courses in Natural Language Processing. You will need a decent internet connection for accessing course materials, but should be able to watch the videos on your smartphone.
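As a rough sketch of the n-gram language modeling mentioned above, here is a minimal bigram model with add-one (Laplace) smoothing. The toy corpus, function names, and smoothing choice are illustrative assumptions, not code from the course:

```python
from collections import Counter

def train_bigram(corpus):
    """Count unigrams and bigrams over tokenized sentences,
    padding each sentence with <s> and </s> markers."""
    unigrams, bigrams = Counter(), Counter()
    for sent in corpus:
        tokens = ["<s>"] + sent + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def bigram_prob(w_prev, w, unigrams, bigrams, vocab_size):
    # Add-one smoothing gives unseen bigrams nonzero probability mass.
    return (bigrams[(w_prev, w)] + 1) / (unigrams[w_prev] + vocab_size)

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
uni, bi = train_bigram(corpus)
V = len(uni)  # vocabulary size, including the padding markers
p = bigram_prob("the", "cat", uni, bi, V)  # (1 + 1) / (2 + 6) = 0.25
```

The lectures cover more principled smoothing (linear interpolation, discounting) than this add-one baseline, which is known to perform poorly in practice but keeps the example short.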

├── week10-01
│ ├── Natural Language Processing 18.0 Introduction (102).mp4
│ ├── Natural Language Processing 18.1 Recap of GLMs (740).mp4
│ ├── Natural Language Processing 18.2 GLMs for Tagging (Part 1) (526).mp4
│ ├── Natural Language Processing 18.3 GLMs for Tagging (Part 2) (735).mp4
│ ├── Natural Language Processing 18.4 GLMs for Tagging (Part 3) (706).mp4
│ └── Natural Language Processing 18.5 GLMs for Tagging (Part 4) (600).mp4
├── week10-02
│ ├── Natural Language Processing 19.0 Introduction (037).mp4
│ ├── Natural Language Processing 19.1 The Dependency Parsing Problem (Part 1) (521).mp4
│ ├── Natural Language Processing 19.2 The Dependency Parsing Problem (Part 2) (1353).mp4
│ ├── Natural Language Processing 19.3 GLMs for Dependency Parsing (Part 1) (1159).mp4
│ ├── Natural Language Processing 19.4 GLMs for Dependency Parsing (Part 2) (828).mp4
│ ├── Natural Language Processing 19.5 Experiments with GLMs for Dep. Parsing (538).mp4
│ └── Natural Language Processing 19.6 Summary (250).mp4
├── week1-01
│ ├── Natural Language Processing 0.0 Introduction (Part 1) (1117).mp4
│ └── Natural Language Processing 0.1 Introduction (Part 2) (1028).mp4
├── week1-02
│ ├── Natural Language Processing 1.0 Introduction to the Language Modeling Problem (Part 1) (617).mp4
│ ├── Natural Language Processing 1.1 Introduction to the Language Modeling Problem (Part 2) (712).mp4
│ ├── Natural Language Processing 1.2 Markov Processes (Part 1) (856).mp4
│ ├── Natural Language Processing 1.3 Markov Processes (Part 2) (628).mp4
│ ├── Natural Language Processing 1.4 Trigram Language Models (940).mp4
│ └── Natural Language Processing 1.5 Evaluating Language Models Perplexity (1236).mp4
├── week1-03
│ ├── Natural Language Processing 2.0 Linear Interpolation (Part 1) (746).mp4
│ ├── Natural Language Processing 2.1 Linear Interpolation (Part 2) (1135).mp4
│ ├── Natural Language Processing 2.2 Discounting Methods (Part 1) (926).mp4
│ └── Natural Language Processing 2.3 Discounting Methods (Part 2) (334).mp4
├── week1-04
│ └── Natural Language Processing 3.0 Summary (231).mp4
├── week2-01
│ ├── Natural Language Processing 4.0 The Tagging Problem (1001).mp4
│ ├── Natural Language Processing 4.1 Generative Models for Supervised Learning (857).mp4
│ ├── Natural Language Processing 4.2 Hidden Markov Models (HMMs) Basic Definitions (1200).mp4
│ ├── Natural Language Processing 4.3 Parameter Estimation in HMMs (1316).mp4
│ ├── Natural Language Processing 4.4 The Viterbi Algorithm for HMMs (Part 1) (1407).mp4
│ ├── Natural Language Processing 4.5 The Viterbi Algorithm for HMMs (Part 2) (331).mp4
│ ├── Natural Language Processing 4.6 The Viterbi Algorithm for HMMs (Part 3) (733).mp4
│ └── Natural Language Processing 4.7 Summary (150).mp4
├── week3-01
│ ├── Natural Language Processing 5.0 Introduction (028).mp4
│ ├── Natural Language Processing 5.1 Introduction to the Parsing Problem (Part 1) (1037).mp4
│ ├── Natural Language Processing 5.2 Introduction to the Parsing Problem (Part 2) (420).mp4
│ ├── Natural Language Processing 5.3 Context-Free Grammars (Part 1) (1211).mp4
│ ├── Natural Language Processing 5.4 Context-Free Grammars (Part 2) (222).mp4
│ ├── Natural Language Processing 5.5 A Simple Grammar for English (Part 1) (1032).mp4
│ ├── Natural Language Processing 5.6 A Simple Grammar for English (Part 2) (530).mp4
│ ├── Natural Language Processing 5.7 A Simple Grammar for English (Part 3) (1121).mp4
│ ├── Natural Language Processing 5.8 A Simple Grammar for English (Part 4) (220).mp4
│ └── Natural Language Processing 5.9 Examples of Ambiguity (556).mp4
├── week3-02
│ ├── Natural Language Processing 6.0 Introduction (112).mp4
│ ├── Natural Language Processing 6.1 Basics of PCFGs (Part 1) (943).mp4
│ ├── Natural Language Processing 6.2 Basics of PCFGs (Part 2) (826).mp4
│ ├── Natural Language Processing 6.3 The CKY Parsing Algorithm (Part 1) (731).mp4
│ ├── Natural Language Processing 6.4 The CKY Parsing Algorithm (Part 2) (1322).mp4
│ └── Natural Language Processing 6.5 The CKY Parsing Algorithm (Part 3) (1007).mp4
├── week4-01
│ └── Natural Language Processing 7.0 Weaknesses of PCFGs (1459).mp4
├── week4-02
│ ├── Natural Language Processing 8.0 Introduction (0017).mp4
│ ├── Natural Language Processing 8.1 Lexicalization of a Treebank (1044).mp4
│ ├── Natural Language Processing 8.2 Lexicalized PCFGs Basic Definitions (1240).mp4
│ ├── Natural Language Processing 8.3 Parameter Estimation in Lexicalized PCFGs (Part 1) (528).mp4
│ ├── Natural Language Processing 8.4 Parameter Estimation in Lexicalized PCFGs (Part 2) (908).mp4
│ ├── Natural Language Processing 8.5 Evaluation of Lexicalized PCFGs (Part 1) (932).mp4
│ └── Natural Language Processing 8.6 Evaluation of Lexicalized PCFGs (Part 2) (1128).mp4
├── week5-01
│ ├── Natural Language Processing 9.0 Opening Comments (025).mp4
│ ├── Natural Language Processing 9.1 Introduction (203).mp4
│ ├── Natural Language Processing 9.2 Challenges in MT (806).mp4
│ ├── Natural Language Processing 9.3 Classical Approaches to MT (Part 1) (802).mp4
│ ├── Natural Language Processing 9.4 Classical Approaches to MT (Part 2) (556).mp4
│ └── Natural Language Processing 9.5 Introduction to Statistical MT (1231).mp4
├── week5-02
│ ├── Natural Language Processing 10.0 Introduction (324).mp4
│ ├── Natural Language Processing 10.1 IBM Model 1 (Part 1) (1306).mp4
│ ├── Natural Language Processing 10.2 IBM Model 1 (Part 2) (901).mp4
│ ├── Natural Language Processing 10.3 IBM Model 2 (1127).mp4
│ ├── Natural Language Processing 10.4 The EM Algorithm for IBM Model 2 (Part 1) (509).mp4
│ ├── Natural Language Processing 10.5 The EM Algorithm for IBM Model 2 (Part 2) (837).mp4
│ ├── Natural Language Processing 10.6 The EM Algorithm for IBM Model 2 (Part 3) (928).mp4
│ ├── Natural Language Processing 10.7 The EM Algorithm for IBM Model 2 (Part 4) (452).mp4
│ └── Natural Language Processing 10.8 Summary (148).mp4
├── week6-01
│ ├── Natural Language Processing 11.0 Introduction (041).mp4
│ ├── Natural Language Processing 11.1 Learning Phrases from Alignments (Part 1) (918).mp4
│ ├── Natural Language Processing 11.2 Learning Phrases from Alignments (Part 2) (701).mp4
│ ├── Natural Language Processing 11.3 Learning Phrases from Alignments (Part 3) (847).mp4
│ └── Natural Language Processing 11.4 A Sketch of Phrase-based Translation (817).mp4
├── week6-02
│ ├── Natural Language Processing 12.0 Definition of the Decoding Problem (Part 1) (912).mp4
│ ├── Natural Language Processing 12.1 Definition of the Decoding Problem (Part 2) (1300).mp4
│ ├── Natural Language Processing 12.2 Definition of the Decoding Problem (Part 3) (1043).mp4
│ ├── Natural Language Processing 12.3 The Decoding Algorithm (Part 1) (1439).mp4
│ ├── Natural Language Processing 12.4 The Decoding Algorithm (Part 2) (623).mp4
│ └── Natural Language Processing 12.5 The Decoding Algorithm (Part 3) (1229).mp4
├── week7-01
│ ├── Natural Language Processing 13.0 Introduction (047).mp4
│ ├── Natural Language Processing 13.1 Two Example Problems (1119).mp4
│ ├── Natural Language Processing 13.2 Features in Log-Linear Models (Part 1) (1356).mp4
│ ├── Natural Language Processing 13.3 Features in Log-Linear Models (Part 2) (1013).mp4
│ ├── Natural Language Processing 13.4 Definition of Log-linear Models (Part 1) (1150).mp4
│ ├── Natural Language Processing 13.5 Definition of Log-linear Models (Part 2) (345).mp4
│ ├── Natural Language Processing 13.6 Parameter Estimation in Log-linear Models (Part 1) (1244).mp4
│ ├── Natural Language Processing 13.7 Parameter Estimation in Log-linear Models (Part 2) (413).mp4
│ └── Natural Language Processing 13.8 SmoothingRegularization in Log-linear Models (1512).mp4
├── week8-01
│ ├── Natural Language Processing 14.0 Introduction (141).mp4
│ ├── Natural Language Processing 14.1 Recap of the Tagging Problem (315).mp4
│ ├── Natural Language Processing 14.2 Independence Assumptions in Log-linear Taggers (832).mp4
│ ├── Natural Language Processing 14.3 Features in Log-Linear Taggers (1321).mp4
│ ├── Natural Language Processing 14.4 Parameters in Log-linear Models (359).mp4
│ ├── Natural Language Processing 14.5 The Viterbi Algorithm for Log-linear Taggers (937).mp4
│ ├── Natural Language Processing 14.6 An Example Application (928).mp4
│ └── Natural Language Processing 14.7 Summary (245).mp4
├── week8-02
│ ├── Natural Language Processing 15.0 Introduction (047).mp4
│ ├── Natural Language Processing 15.1 Conditional History-based Models (714).mp4
│ ├── Natural Language Processing 15.2 Representing Trees as Decision Sequences (Part 1) (723).mp4
│ ├── Natural Language Processing 15.3 Representing Trees as Decision Sequences (Part 2) (1020).mp4
│ ├── Natural Language Processing 15.4 Features and Beam Search (1210).mp4
│ └── Natural Language Processing 15.5 Summary (112).mp4
├── week9-01
│ ├── Natural Language Processing 16.0 Introduction (036).mp4
│ ├── Natural Language Processing 16.1 Word Cluster Representations (836).mp4
│ ├── Natural Language Processing 16.2 The Brown Clustering Algorithm (Part 1) (1150).mp4
│ ├── Natural Language Processing 16.3 The Brown Clustering Algorithm (Part 2) (830).mp4
│ ├── Natural Language Processing 16.4 The Brown Clustering Algorithm (Part 3) (918).mp4
│ ├── Natural Language Processing 16.5 Clusters in NE Recognition (Part 1) (1133).mp4
│ └── Natural Language Processing 16.6 Clusters in NE Recognition (Part 2) (728).mp4
└── week9-02
├── Natural Language Processing 17.0 Introduction (030).mp4
├── Natural Language Processing 17.1 Recap of History-based Models (711).mp4
├── Natural Language Processing 17.2 Motivation for GLMs (634).mp4
├── Natural Language Processing 17.3 Three Components of GLMs (1439).mp4
├── Natural Language Processing 17.4 GLMs for Parse Reranking (1036).mp4
├── Natural Language Processing 17.5 Parameter Estimation with the Perceptron Algorithm (611).mp4
└── Natural Language Processing 17.6 Summary (301).mp4

www.coursera.org



Download File Size: 1.05 GB


Customers who bought this program also bought:


Adobe Acrobat Pro DC 2022 €70
Mathworks MATLAB R2022 €105
Adobe Acrobat Pro DC 2022 for Mac €70
DxO PhotoLab 5 ELITE €25
SketchUp Pro 2022 €30
Corel Painter 2023 €40
Autodesk 3DS MAX 2023 €75
Autodesk Inventor Professional 2023 €95