NLP - N-Grams and Smoothing

N-gram - answer Literally a sequence of n tokens. It is used as a method of predicting
the next word in a sequence based on the previous n-1 tokens.
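
A minimal sketch of extracting n-grams from a token list (the function and example sentence are illustrative, not from the original notes):

    def ngrams(tokens, n):
        """Return all n-grams: tuples of n consecutive tokens."""
        return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

    tokens = "the cat sat on the mat".split()
    print(ngrams(tokens, 2))
    # [('the', 'cat'), ('cat', 'sat'), ('sat', 'on'), ('on', 'the'), ('the', 'mat')]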

Markov Assumption - answer The probability of a word occurring depends only on the
previous word (or, more generally, only on the last few words), not on the entire
preceding history.
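
In symbols, for the bigram case, the assumption approximates the full history by the single previous word:

    P(w_n | w_1 w_2 ... w_{n-1}) ≈ P(w_n | w_{n-1})

More generally, an N-gram model conditions on only the previous N-1 words.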

Extrinsic evaluation - answer Evaluation of the performance of a language model by
embedding it in an application and measuring how much the application improves. This
is often the only way to know whether a particular improvement in a component will
really help the task at hand.

Intrinsic evaluation - answer Evaluation of the performance of a language model that
measures the quality of a model independent of any application.

Test / training / dev set - answer Train the model on the training set, use the dev set to
tune it and make sure you don't overfit, and evaluate final performance on the held-out
test set.
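
A minimal sketch of such a split (the 80/10/10 proportions and names are illustrative assumptions, not from the notes):

    import random

    def split_corpus(sentences, train_frac=0.8, dev_frac=0.1, seed=0):
        """Shuffle sentences and split them into train / dev / test portions."""
        sentences = list(sentences)
        random.Random(seed).shuffle(sentences)
        n_train = int(train_frac * len(sentences))
        n_dev = int(dev_frac * len(sentences))
        return (sentences[:n_train],                 # training set
                sentences[n_train:n_train + n_dev],  # dev set
                sentences[n_train + n_dev:])         # test set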

Maximum Likelihood Estimation - answer Estimates probabilities by counting the number
of times a feature (such as an n-gram) appears in the corpus and normalizing the count so
that the estimate lies between 0 and 1; for an n-gram, the count is divided by the count of
its prefix.
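
A minimal sketch of this count-and-normalize estimate for bigrams (function and variable names are illustrative):

    from collections import Counter

    def mle_bigram_probs(tokens):
        """Estimate P(w2 | w1) as count(w1 w2) / count(w1)."""
        unigram_counts = Counter(tokens)
        bigram_counts = Counter(zip(tokens, tokens[1:]))
        return {(w1, w2): c / unigram_counts[w1]
                for (w1, w2), c in bigram_counts.items()}

    tokens = "the cat sat on the mat".split()
    print(mle_bigram_probs(tokens)[("the", "cat")])  # 0.5: "the" appears twice, "the cat" once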

Perplexity - answer A metric that describes how surprised a language model is by a test
sequence; also a measure of how well the probability distribution predicts a sample.
Lower perplexity is generally better. Mathematically, it is the N-th root of the inverse
probability of a test set, where N is the number of tokens:

    PP(W) = (1 / P(w_1 w_2 ... w_N))^(1/N)
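
A minimal sketch of this computation for a bigram model, assuming a placeholder function prob(w1, w2) that returns P(w2 | w1) (the names here are illustrative, not from the notes):

    import math

    def perplexity(tokens, prob):
        """Compute PP = P(w1..wN)^(-1/N), working in log space to avoid underflow."""
        log_prob = 0.0
        for w1, w2 in zip(tokens, tokens[1:]):
            log_prob += math.log(prob(w1, w2))  # log P(w2 | w1)
        n = len(tokens) - 1                     # number of predictions scored
        return math.exp(-log_prob / n)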

Relative Frequency - answer The number of times a sequence appears divided by the
number of times its prefix appears.
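
For a bigram, this is the same count-and-normalize estimate written out:

    P(w_n | w_{n-1}) = C(w_{n-1} w_n) / C(w_{n-1})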

Smoothing - answer Since we multiply probabilities, one zero-probability event will zero
out an entire chain. To avoid this, we perform smoothing: a slight adjustment to the
probability estimates that prevents a language model from assigning a probability of zero
to an unseen event. The simplest technique is Laplace smoothing, which merely adds one
to each count of token appearances.
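
A minimal sketch of Laplace (add-one) smoothing for a bigram estimate; adding the vocabulary size V to the denominator keeps the distribution normalized (names are illustrative):

    from collections import Counter

    def laplace_bigram_prob(w1, w2, tokens):
        """Add-one estimate: (C(w1 w2) + 1) / (C(w1) + V)."""
        unigram_counts = Counter(tokens)
        bigram_counts = Counter(zip(tokens, tokens[1:]))
        vocab_size = len(unigram_counts)
        return (bigram_counts[(w1, w2)] + 1) / (unigram_counts[w1] + vocab_size)

    tokens = "the cat sat on the mat".split()
    print(laplace_bigram_prob("the", "sat", tokens))  # nonzero even though "the sat" never occurs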

Backoff - answer The process used to decide which order of n-gram to use in a
prediction: use the largest n-gram with sufficient evidence. If an n-gram has zero
evidence (a zero count), we back off to the next-largest (n-1)-gram and try again.
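
A minimal sketch of the control flow only, following the "use the longest context with a nonzero count" idea; a full scheme such as Katz backoff also discounts and redistributes probability mass, which is omitted here (the counts mapping is an assumed input keyed by token tuples):

    def backoff_prob(context, word, counts):
        """Back off from the longest context to shorter ones until counts exist."""
        for start in range(len(context)):
            history = tuple(context[start:])  # drop the oldest word each step
            if counts.get(history, 0) > 0 and counts.get(history + (word,), 0) > 0:
                return counts[history + (word,)] / counts[history]
        # Last resort: unigram relative frequency.
        total = sum(c for ngram, c in counts.items() if len(ngram) == 1)
        return counts.get((word,), 0) / total if total else 0.0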

Interpolation - answer An alternative to backoff that combines the estimates from each
n-gram order via a linear combination whose weights sum to 1.
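
A minimal sketch of linear interpolation over unigram, bigram, and trigram estimates; the lambda weights shown are an illustrative choice and in practice are tuned (for example on the dev set), and p_uni, p_bi, p_tri are assumed estimator functions:

    def interpolated_prob(w, context, p_uni, p_bi, p_tri, lambdas=(0.1, 0.3, 0.6)):
        """Combine estimates of P(w | context) at three orders; the lambdas sum to 1."""
        l1, l2, l3 = lambdas
        w1, w2 = context  # the two preceding words, oldest first
        return (l1 * p_uni(w)
                + l2 * p_bi(w, w2)
                + l3 * p_tri(w, w1, w2))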
