
Sanskar

Keen Learner and Exp... • 2m

Day 10 of learning AI/ML as a beginner. Topic: N-Grams in Bag of Words (BOW).

Yesterday I talked about an amazing text-to-vector converter in machine learning, Bag of Words (BOW). N-grams are an extension of BOW. Plain BOW sees sentences with different meanings as similar, which can be a big issue: it can make a positive and a negative sentence look alike, which should not happen. N-grams let us overcome this limitation by grouping each word with the words next to it, so the program can give more accurate results. For example, in the sentence "The food is good" it will group "food" and "good" together (assuming we have applied stopwords), and comparing these word groups across sentences helps the program distinguish between two different sentences and understand what the user is saying. You can understand this better from my notes, attached at the end.

I have also done the practical for this. Since n-grams are a part of BOW, I decided to reuse my code and imported it into my BOW file (I also used if __name__ == "__main__": so that the results of the previous code did not run in the new file). To use n-grams you just need to add ngram_range=(1, 2) to the CountVectorizer. You can also change the range to get bigrams, trigrams, etc. based on your need. I then used a for loop to print all the groups of words. Here's my code, its result, and the notes I made on n-grams.

