Amit Bhatnagar on May 31, 2012

AWA does not matter much. I have never heard anybody from the adcom of any school say that they factor the AWA into admission decisions (and I have read a lot of admissions officers’ interviews, blogs, etc., and met some personally when I was applying to B-schools). Most adcoms focus just on your overall score, Quant score, and Verbal score. In any case, they have your application essays to judge your writing abilities.

However, if your AWA score is terrible (anything below 4) and your essays look very polished, things may get questionable! Hence, I won’t recommend ignoring this section completely. Remember, even a 5 on a scale of 6 is only around the 56th percentile, which means as many as about 44% of test-takers manage a 5 or more. Spend some time on this section towards the end of your GMAT preparation, and you should be good.

I wrote a detailed blog post on this topic some time ago:…

Hope this helps.


Continue reading about Does the Analytical Writing Assessment section on the GMAT matter?

Amit Bhatnagar on May 31, 2012

I don’t have the exact numbers to support or refute their claim, but I have some reasons to believe Amazon. Analysts estimate Kindle Fire sales to be on the order of 6MM units in its first quarter. (Source:…) This is a big number, and I think Amazon’s own store captures a big part of it. Here are my reasons:

  • Dual role: manufacturer & online store: For most other electronic goods, you can expect online sales to be split across manufacturers’ websites and other online stores. In the case of the Kindle Fire, Amazon is both the manufacturer and a leading online store. Also, since Amazon offers free shipping for the Kindle Fire even for non-Prime members and no other online store offers a lower price, there is no reason to buy from anywhere else if you are ordering online.
  • Relatively low offline sales: I know that the Kindle Fire is available in physical stores like Best Buy, but there is no major official offline channel like Apple Stores for iPods/iPads, or AT&T/Verizon stores for Samsung phones. So I expect the majority of sales to be from online channels, and Amazon should grab the lion’s share of those (see the previous point).
  • There is just one Kindle Fire: Best-selling products are reported at an SKU level. A variation in color or size makes a product a different product for the purposes of best-seller rankings. For example, the Kindle Fire is not competing with the iPod touch, not even with the iPod touch 4G. The competition is against the iPod touch 4G 8GB (Black), iPod touch 4G 8GB (White), and iPod touch 4G 32GB (Black), all of which rank in the top 20 electronics goods. As for the Kindle Fire, there is just one variation of it: one size, one storage capacity (8GB), one color (black). (Other products in the Kindle family are simply e-book readers, and hence I am not clubbing them together as the same product.)
  • Very high number of reviews: The number of reviews may not be an accurate representation of actual sales, but you can easily expect the two figures to have at least a moderate correlation, and it can at least give you some idea of actual units sold. Since the Fire was launched in mid-November 2011, it has amassed close to 18,000 reviews. Most other products would be lucky to get 2,000 reviews. (Sample the top sellers across a few categories: Books => Fifty Shades of Grey: 1,500 reviews; Apps => Angry Birds: 965 reviews; Cameras => Canon PowerShot D10: 585 reviews.) The only product I could locate with a higher number of reviews was the Kindle Keyboard, and that was released way back in the summer of 2010! This does indicate a possibility of the Kindle Fire being their top-selling product.

Again, as I said, I have no numbers, but the reasons above are convincing enough for me to believe Amazon. I will wait for Amazon to share official sales figures, which may give us a clearer idea.


Continue reading about Is the Kindle Fire really Amazon’s best selling product?

As I wrote in What efforts has Yelp made to educate the public on its review ranking algorithm? and On Yelp, why aren’t reviews that receive the most votes near the top? Does Yelp have a competitor that does do this? Could Quora eventually compete?, Yelp Sort does take the number of votes into account in its default sorting algorithm.

The precise sentence from Yelp confirming this:

The order is determined by recency, user voting, and other review quality factors.


Continue reading about Do average Yelp rankings incorporate the fact that some reviews get a lot of votes and some don’t?

Not sure what would qualify as enough, but if you are like me, chances are that you would at least hover over “Yelp Sort” when you see that it is the default sort, as compared to other, more easily understandable options like votes, date, rating, etc. And if you do hover over it, here’s what you will see:

    Yelp Sort attempts to show reviews that help consumers make informed decisions. The order is determined by recency, user voting, and other review quality factors. This method is applied to all businesses, sponsors or not.

If you go to their FAQ, you will find another question that clearly addresses it:

  • Are reviews displayed in any particular order?
    Users can decide for themselves how best to order reviews by clicking one of the links just above the reviews (e.g., date, rating, voting, etc.). Yelp’s default sort order takes a number of factors into account and reflects our own attempt to present reviews in a meaningful order. For example, we’ll favor reviews from your friends and the users you follow. The sort algorithm does not take into account whether the business is an advertiser or not.

In my opinion, this is just enough information, in just the right places. Most users won’t even bother about the sorting algorithm and will make their decision based on the top 3-4 reviews plus the overall rating. Those of us who do care about exactly how the reviews are ordered will look for it (in the FAQ or in the hover-over tooltip). And even these users would probably not want to see this information every time they search.


Continue reading about What efforts has Yelp made to educate the public on its review ranking algorithm?

Whichever way you go, don’t forget the alt attribute if you are focusing on SEO:

<img src="Red-rose.jpg" alt="Red Rose">


Continue reading about What is the correct way to name images for SEO purposes: “Red Rose.jpg” or “red-rose.jpg”?

Amit Bhatnagar on May 30, 2012

First, why do we need review filters? So that people like the PHB from Dilbert can’t execute their evil strategies.

About the mechanics of the Yelp review filter: you won’t get a formal answer here, as Yelp keeps the review-filter mechanism a secret. Yelp says (and I agree) that it is easier to game the system once you know the mechanics. So any answer here is simply a guess based on a combination of common sense and an analysis of the patterns in which reviews usually get filtered.

Here is my guess at what goes into the review-filter. (Everything is centered around Yelp’s tagline: “Real people, real reviews”):

  • Number of reviews: You don’t even have a Yelp account. Then, suddenly, you create one, write a 1* review for a local business, and never log in again. In the eyes of the review filter, you may be a scared competitor! (Or if it is a 5* review, you may be the owner’s nephew!) From what I have observed, perhaps more than 80% of filtered reviews are from reviewers with fewer than 5 reviews.
  • Number of friends: Most real people like to make friends on different social media channels, Yelp included. If you don’t want your Facebook friends to see what you are writing, the Yelp filter may put you in the category of review factories, which churn out one review after another for money. This may even override the number of reviews: at least twice, I have seen a filtered review from someone with 50+ Yelp reviews. In both cases, the reviewer had no Yelp friends.
  • Uniqueness of content: Real people write reviews reflecting their real experiences. If your review of a Vietnamese restaurant in Atlanta matches another reviewer’s review of a Thai restaurant in NYC word for word (or even 70-80%, allowing for some changed proper nouns), chances are high that at least the review published later will be filtered (if not both).
  • Other trustworthiness factors: Do you have a profile pic? Do you check in using Yelp mobile or leave tips for other Yelpers at restaurants? Is your profile complete, with all the details about your hometown, things you love, etc.? Paid reviewers may not be interested in “wasting” their time on these.
  • Location tracking: You are consistently reviewing businesses all over the US, but your IP shows that you always log in from a different country. Or you, as a reviewer, log in from the same IP that the business owner uses to log in to his business-owner account, or the same IP has 25+ registered users, all of whom have just 1-2 reviews. In all these cases, your review may be flagged.

Remember that none of the factors above can be taken as a standalone signal for deciding whether a review should be flagged. Your review may be totally genuine even if one or more factors indicate otherwise. My guess is that the filter works with a filter score (similar to the spam score used by anti-spam filters), assigning scores to the factors above (and many more). Once your review crosses a certain threshold, it gets filtered!
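The threshold model described above can be sketched as a toy scoring function. Everything below — the factor names, weights, and cutoff — is invented for illustration; Yelp’s actual filter is not public.

```python
# Hypothetical sketch of a spam-score-style review filter.
# All factor names, weights, and the threshold are made up for
# illustration; Yelp's real algorithm is secret.

def filter_score(review):
    """Accumulate suspicion points; higher = more likely to be filtered."""
    score = 0.0
    if review["reviewer_review_count"] < 5:
        score += 3.0   # brand-new accounts are less trusted
    if review["reviewer_friend_count"] == 0:
        score += 2.0   # no social graph at all
    if review["content_similarity"] > 0.7:
        score += 4.0   # near-duplicate of an earlier review
    if not review["has_profile_pic"]:
        score += 1.0
    if review["ip_mismatch"]:
        score += 2.5   # login location inconsistent with review locations
    return score

FILTER_THRESHOLD = 5.0  # invented cutoff

review = {
    "reviewer_review_count": 1,
    "reviewer_friend_count": 0,
    "content_similarity": 0.1,
    "has_profile_pic": False,
    "ip_mismatch": False,
}
print(filter_score(review) > FILTER_THRESHOLD)  # True: 3 + 2 + 1 = 6.0
```

Note that, just as the answer says, no single factor is decisive here: the new-account review only crosses the threshold because several weak signals stack up.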


Continue reading about How do the mechanics of Yelp’s review filter work?

Okay, these are quite a few questions combined into one. I will focus mostly on the Yelp sorting algorithm, or rather, my understanding of it, as the actual algorithm is not publicly disclosed.

You do have the option of sorting by number of votes, but I like their algorithm better. They call it “Yelp Sort”. Think of the factors you would weigh when deciding which review to believe:

  • Reviews written by friends: You are more likely to believe your friends as compared to random strangers.
  • Reviews written by trusted Yelpers: Whose review would you trust more: a Yelp Elite with 254 reviews, or somebody whose only review is for the business that you are viewing?
  • Recency: A business may have learned its lessons from negative Yelp reviews, and now the reviews are all 5* and 4*. Won’t it make more sense to give more weight to recent reviews?
  • Higher number of votes: Of course, as you mentioned, votes can help crowdsource the decision of which reviews matter most.

From what I have observed, Yelp Sort incorporates at least these four factors in ordering the reviews. I believe the highest weight goes to reviews by your Yelp friends, which makes sense! So, if one of your Yelp friends has reviewed a business, his/her review will likely be at the top. Recency also appears to be weighted heavily.
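As a rough illustration, the four factors above could be combined into a single sort key like this. Every weight and formula here is my own guess, not Yelp’s actual algorithm:

```python
import math
from datetime import date

# Toy "Yelp Sort"-style key combining the four observed factors:
# friendship, reviewer trust, recency, and votes. All weights invented.
def sort_key(review, today=date(2012, 5, 30)):
    days_old = (today - review["date"]).days
    recency = math.exp(-days_old / 180.0)                # decays over ~6 months
    friend_bonus = 5.0 if review["by_friend"] else 0.0   # friends boosted
    trust = math.log1p(review["reviewer_review_count"])  # diminishing returns
    votes = math.log1p(review["votes"])
    return friend_bonus + 2.0 * recency + trust + votes

reviews = [
    {"date": date(2012, 5, 1), "by_friend": False,
     "reviewer_review_count": 254, "votes": 40},   # recent Elite-style reviewer
    {"date": date(2011, 6, 1), "by_friend": True,
     "reviewer_review_count": 3, "votes": 1},      # older review by a friend
]
ranked = sorted(reviews, key=sort_key, reverse=True)
```

With these particular made-up weights, a prolific and heavily voted stranger can still outrank a friend’s much older review; cranking the friend bonus up would reproduce the “friends first” behavior described above.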

Yelp competitors: Urbanspoon comes to mind, but even they sort by “Relevance”, however you may want to define it.

Finally, while I love both Yelp and Quora, I am sure Quora cannot be a good substitute for Yelp. Yes, you can expect to get answers to questions like the ones you included as examples, but if it’s already noon, you feel like having Thai for lunch, and you need to choose among the 12 Thai places within a 5-mile radius, it may not be a smart idea to post the question on Quora. If you are like me, you’d likely turn to Yelp in cases like these!


Continue reading about On Yelp, why aren’t reviews that receive the most votes near the top? Does Yelp have a competitor that does do this? Could Quora eventually compete?