This is not a post about SEO. It is, however, a post about the future of search. This surprised even me – when I started writing this piece, it really was just an idea about building a better review. I realized, though, that finding relevant reviews is a useful microcosm of the broader challenge search engines face. Specifically, I want to talk about three S’s – Social, Sentiment, and Semantics – and how each of these pieces fits into the search puzzle. Along the way, I might just try to build a better mousetrap.
The Core Problem
Product reviews are great, but on a site as big and popular as Amazon.com, filtering reviews isn’t much easier than filtering Google search results. Here’s the review section for the Kindle Fire:
That’s right – 10,859 reviews to sort through. Even if I just decide to look at the 5-star and 1-star reviews, that’s still 7,208. If I could click and skim each one of those 7,208 in about 5 seconds, I’ve got roughly 10 hours of enjoyment ahead of me (if I don’t eat or take bathroom breaks). So, how can we make this system better?
(1) The Social Graph
These days our first answer is usually: “SOCIAL!” Social is sexy, and it will solve all our problems with its sexy sexiness. The problem is that we tend to oversimplify. Here’s how we think about Search + Social, in our perfect world:
Unfortunately, it’s not quite so magical. There are two big problems, whether we’re talking about product reviews or organic search results. The first problem is a delicate one. Some of the people that you associate with are – how shall I put it – stupid.
Ok, maybe stupid is a bit harsh, but just because you’re connected to someone doesn’t mean you have a lot in common or share the same tastes. So, we really want to weed out some of the intersection, like Crazy Cousin Larry…
It’s surprisingly hard to figure out who actually sits at the Crazy-Larry table. Computationally, this is a huge challenge. There’s a bigger problem, though. In most cases, especially once we start weeding people out, the picture actually looks more like this:
Even with relatively large social circles, the actual overlap of your network and any given search result or product is often so small as to be useless. We can extend our circles to 2nd- and 3rd-degree relationships, but then relevance quickly suffers.
To be fair to Amazon, they’ve found one solution – they elicit user feedback on the reviews themselves as a proxy social signal:
This approach certainly helps, but it mostly weeds out the lowest-quality offerings. Reviews of reviews help control quality, but they don't do much to help us find the most relevant information.
(2) Sentiment Analysis
Reviews are a simple form of sentiment analysis – they help us determine if people view a product positively or negatively. More advanced sentiment analysis uses natural-language processing (NLP) to try to extract the emotional tone of the text.
You may be wondering why we need more advanced sentiment analysis when someone has already told us how they feel on a 1-5 scale. Welcome to what I call “The Cupholder Problem”, something I’ve experienced frequently as a parent trying to buy high-end products on Amazon. Consider this fictional review which is all-too-based in reality:
I’m exaggerating, of course, but the core problem is that reviews are entirely subjective, and sometimes just one feature or problem can ruin a product for someone. Once that text is reduced to a single data point (one star), though, the rest of the information in the content is lost.
Sentiment analysis probably wouldn’t have a dramatic impact on Amazon reviews, but it’s a hot topic in search in general because it can help extract emotional data that’s sometimes lost in a summary (whether it’s a snippet or a star rating). It might be nice to see Amazon institute some kind of sentiment correction process, warning people if the tone of their review doesn’t seem to match the star rating.
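Just to make that concrete, here's a minimal sketch of what a sentiment-vs-rating sanity check might look like, using NLTK's off-the-shelf VADER analyzer. The 0.3 threshold and the warning copy are my own invented assumptions, not anything Amazon actually does:

```python
# A minimal sketch of a sentiment-vs-star-rating check. Assumes NLTK with the
# VADER lexicon downloaded via nltk.download("vader_lexicon"); the threshold
# and warning text below are invented for illustration.
from nltk.sentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

def warn_on_mismatch(review_text, star_rating):
    """Return a warning string if the review's tone disagrees with its stars."""
    # VADER's compound score runs from -1 (very negative) to +1 (very positive).
    compound = analyzer.polarity_scores(review_text)["compound"]
    text_is_positive = compound > 0.3
    stars_are_positive = star_rating >= 4
    if text_is_positive != stars_are_positive:
        return (f"Heads up: the tone of your review doesn't seem to match "
                f"your {star_rating}-star rating. Are you sure?")
    return None

print(warn_on_mismatch(
    "Great stroller - sturdy, safe, and my kids love it. But no cupholder!", 1))
```

Even something this crude would catch the "glowing review, one star" cases, though the threshold would obviously need tuning against real data.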
(3) Semantic Search
This is where things get interesting (and I promise I’ll get back to sentiment so that the previous section has a point). The phrase “semantic search” has been abused, unfortunately, but the core idea is to get at the meaning and conceptual frameworks behind information. Google’s Knowledge Graph is probably the most visible recent attempt to build a system that extracts concepts and even answers, instead of just returning a list of relevant documents.
How does this help our review problem? Let’s look at the “Thirsty” example again. It’s not a dishonest review or even useless – the problem is that I fundamentally don’t care about cupholders. There are certain features that matter a lot to me (safety, weight, durability), others that I’m only marginally sensitive to (price, color), and some that I don’t care about at all (beverage dispensing capability).
So, what if we could use a relatively simple form of semantic analysis to extract the salient features from reviews for any given product? We might end up with something like this:
Pardon the uninspired UI, but even the addition of a few relevant features could help customers drill down to what really matters to them, and this could be done with relatively simple semantic analysis. This basic idea also illustrates some of the direction I think search is heading. Semantic search isn’t just about retrieving concepts; it’s also about understanding the context of our questions.
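(As an aside, even a crude first pass at that kind of feature extraction doesn't require heavy machinery. Here's a toy sketch that surfaces candidate features by counting how many reviews mention each word – a real system would use part-of-speech tagging or noun-phrase extraction, and the stopword list and threshold here are purely illustrative.)

```python
# A crude sketch of surfacing candidate "features" from a pile of reviews:
# count how many reviews mention each non-stopword, one vote per review.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "but", "it", "this", "that", "i", "my",
             "is", "was", "for", "of", "to", "in", "on", "with", "very", "than"}

def candidate_features(reviews, min_mentions=2):
    counts = Counter()
    for review in reviews:
        words = set(re.findall(r"[a-z]+", review.lower()))  # one vote per review
        counts.update(w for w in words if w not in STOPWORDS)
    return [(word, n) for word, n in counts.most_common() if n >= min_mentions]

reviews = [
    "Great stroller - light, safe, and easy to fold.",
    "Safe and sturdy, but heavier than I expected.",
    "No cupholder! Otherwise light and easy to push.",
]
print(candidate_features(reviews))  # e.g. [('safe', 2), ('light', 2), ('easy', 2)]
```

Counting one vote per review, rather than raw word frequency, keeps a single cupholder-obsessed reviewer from dominating the feature list.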
Here’s an interesting example from Google Australia (Google.com.au). Search for “Broncos colors” and you’ll get this answer widget (hat tip to Brian Whalley for spotting these):
It’s hardly a thing of beauty, but it gets the job done and probably answers the query for 80-90% of searches. This alone is an example of search returning concepts and not just documents, but it gets even more interesting. Now search for “Broncos colours”, using the British spelling (still in Google.com.au). You should get this answer:
The combination of Google.com.au and the Queen’s English now has Google assuming that you meant Australia’s own Brisbane Broncos. This is just one tiny taste of the beginning of search using concepts to both deliver answers and better understand the questions.
(4) Semantics + Sentiment
Let’s bring this back around to my original idea. What if we could combine semantic analysis (feature extraction) and sentiment in Amazon reviews? We could easily envision a system like this:
I’ve made one small addition – a positive or negative (+/-) sentiment choice next to each feature. Maybe I only want to see products where people spoke highly of the value, or rule out the ones where they bashed the safety. Even a few simple combinations could completely change the way you digest this information.
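To tie the two ideas together, here's a minimal sketch of per-feature sentiment: score each review sentence that mentions a feature and roll the scores up by feature. The feature-to-term mapping is assumed to come from the extraction step above, and none of this is how Amazon actually works:

```python
# A minimal sketch of per-feature (aspect-level) sentiment. Assumes NLTK with
# the VADER lexicon downloaded; the feature/term mapping is invented.
from collections import defaultdict
from nltk.sentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()
FEATURES = {"safety": ["safe", "safety"],
            "weight": ["light", "heavy", "weight"],
            "cupholder": ["cupholder", "cup holder"]}

def feature_sentiment(reviews):
    scores = defaultdict(list)
    for review in reviews:
        for sentence in review.split("."):  # crude sentence split
            compound = analyzer.polarity_scores(sentence)["compound"]
            for feature, terms in FEATURES.items():
                if any(term in sentence.lower() for term in terms):
                    scores[feature].append(compound)
    # Average the per-sentence scores: > 0 reads as "+", < 0 as "-".
    return {f: sum(vals) / len(vals) for f, vals in scores.items()}

reviews = ["Very safe and light. I love it.",
           "Feels safe, but there is no cupholder and that ruins it for me."]
print(feature_sentiment(reviews))
```

Flip the sign into a simple +/- next to each feature and you have roughly the filter mocked up above – the hard part, of course, is doing it reliably across millions of messy reviews.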
The Tip of the Penguin
This isn’t the tip of the iceberg – it’s the flea on the wart on the end of the penguin’s nose on the tip of the iceberg. We still think of Knowledge Graph and other semantic search efforts as little more than toys, but they’re building a framework that will revolutionize the way we extract information from the internet over the next five years. I hope this thought exercise has given you a glimpse into how powerful even a few sources of information can be, and why they’re more powerful together than alone. Social doesn’t hold all of the answers, but it is one more essential piece of a richer puzzle.
I’d also like to thank you for humoring my Amazon reviews insanity. To be fair to Amazon, they’ve invested a lot into building better systems, and I’m sure they have fascinating ideas in the pipe. If they’d like to use any of these ideas, I’m happy to sell them for the very reasonable price of ONE MILL-I-ON DOLLARS.
Hi Pete!
I can't help asking your opinion about Facebook Graph Search... because here you are talking about reviews, but - really - the connection between Social and Search is exactly where Facebook Graph Search lives, so all the logic about the dumb sexiness of the results applies in that case too.
That's why, in my opinion, Social + Search as a combo (or either one used alone) isn't really useful for the relevance of the results in many cases, and why an "authority on topic" metric - in some respects what many of us (me included) have said should be author rank or something similar - should enter the game.
And actually, that's what happens in real life. If I'm searching for a car, I'll certainly take into account the opinions of some of my friends, as well as the reviews in specialized magazines. But I'd give even more weight to what a recognized car expert (recognized as such also thanks to social proof, maybe from my friends too) is saying in a specialized magazine.
Then, we know that Google filed a patent about reviews and sentiment analysis (here's the link to Bill Slawski's post about it: https://www.seobythesea.com/2009/06/googles-new-review-search-option-and-sentiment-analysis/ ). The problem, as always with patents, is: is Google really using it?
Finally, if you want to dig more into this topic, last January Searchmetrics published a very interesting post exactly about the potential interaction between sentiment analysis and the social graph as they could be used by Google in search and advertising (https://blog.searchmetrics.com/us/2013/01/17/sentiment-and-google-more-than-a-feeling/)
I think the biggest challenge for "pure" social search (like Facebook Graph) is just the limited domain itself. Facebook isn't an index of the web, like Google, Bing, etc. - it's a collection of social activities. There's value in that content, to a point, but there's a lot of information missing in the data Facebook has. We're going to need both pieces of the puzzle (and more) to make this work, and right now, even if Google+ is way behind Facebook, only Google really has both.
I think it's important to note that from a business perspective, Google and Facebook are attempting to converge on the same end result.
They each want both pieces of the search and social pie. But, instead of joining forces (for obvious competitive reasons), they are attempting to build out the infrastructure that each already has to complete their circle. (Facebook needs web content search, Google needs social data.)
My comment is not concerning SEO either; it's strictly about Amazon.com and other places where companies try to generate reviews as their content, for good reasons. What Amazon and these other sites fail to do is verify that the reviewers actually purchased the product. If you look at Amazon's reviews of A Memory of Light, you will see about 353 one-star reviews. And if you browse through them, those reviews do not talk about the story or plot - they talk about the fact that the book was not released in digital format the same day it was released in hardcover. Obviously those reviewers hadn't bought the book and were still able to post reviews. In the race to get more UGC, sites like Amazon let people review without verifying that the products were actually bought from them, which is a little sad!!!
That's an interesting point. Traditionally, I think it's tough because most sites want reviews to be open to people who may have bought the product elsewhere. Amazon has a unique opportunity in that regard, and it would be interesting to see how that impacted review volume.
I see this a lot with movie reviews. The movie is scheduled to be released Friday, and by Tuesday there are 500 reviews on Yahoo (for example) by people who haven't seen it but have already decided they love/hate it. Now, I'm not sure how you'd prove someone saw the movie, but it's the same core problem - these reviews are pretty useless. Even if they occasionally have valuable content, they aren't "reviews" in the sense most of us mean. We expect a review, even if it's based on opinion, to still be grounded in direct experience.
I sell ebooks on Amazon, and the reviews can make or break someone. Many reviews are spurious at best, and I know that many are simply bought and paid for to drive that particular book up in the rankings.
At least with many paid reviews it will say something about the book. The worst is what we're talking about here, when someone comments on whether the shrink wrapping was up to par or not. A review like that helps no one except whoever's competing product gets sold instead, often at the expense of other products which, in many cases, could be of superior quality but were unfortunately just shipped to specification.
Alas, I don't think things will change soon, unless Amazon comes up with, perhaps, yellow-star and blue-star reviews.
Penguins don't have noses.
Some great insights Dr. Pete. Another issue with Social is the number of fake accounts (Twitter has gotten the most attention lately) and the ability for social signals (tweets, retweets, clicks, etc.) to be manipulated.
Your points about semantic search and its potential future impact on search results (extracting data, not just presenting a list of links) are interesting.
You have some great suggestions for Amazon regarding breaking out reviews by feature and then applying a mechanism to share sentiment, and at only $1M, Jeff B. is getting a bargain :-)
This is a great article. You definitely highlight some major flaws in Amazon's review system, and you offer some intriguing solutions.
Besides offering praise, the reason I'm commenting is that this article reminded me of another article I read a couple of months back. It appeared in the NYTimes, and it was entitled "A Casualty on the Battlefield of Amazon's Partisan Reviews." It was about the organized effort to undermine the sales of a Michael Jackson biography by some of Jackson's most devoted fans. Now the burying of a celebrity biography is not going to destroy freedom in this country, but it is definitely a negative development, especially if the tactics involved are adopted by other groups who have more overt political purposes.
The article spoke of the flaws in Amazon's review system, much like you have here. Although your purpose was to propose a way to find better quality reviews to improve our shopping experience on the web, I believe your article has bigger implications in light of the NYTimes article.
While some of your proposals may help people find more relevant reviews, I don't think they address the issue of credibility (of reviewers) enough. And credibility is one way to counter organized campaigns advanced by people with ulterior motives. Another commenter above, RKArchaya, suggested Amazon should reveal whether a reviewer actually bought the book. I agree. Of course, Amazon can't know if the reviewer bought the book somewhere else, but it can tell if he or she bought it from Amazon. Not only would this lend greater credibility to the review, it would also undermine organized smear campaigns, because at least the author would be getting some compensation from book sales. Amazon could also give higher rankings to reviews written by people who actually purchased the book (from Amazon).
From your comment response, you assume that Amazon wants to keep its reviews open to all. To deal with this, I would suggest that Amazon set up two tiers of reviewers, people who have purchased the book (from Amazon) and those who haven't. This would also help verify credible reviews and counter organized smear campaigns.
Anyways, that's my two cents.
That's an interesting thought - instead of filtering down to reviews from people who bought the product on Amazon, let the end-user decide. Practically, end-users might not understand or use it, but it would be an interesting option.
Funny, Dr. Pete - seems like we have been thinking about the exact same problem :) From my perspective, I believe the issue is with the star rating system itself. I believe that people have a hard time choosing between 2, 3, or 4 stars (assuming there are 5 total). 1 and 5 stars are easy, as they're basically the love/hate ends of the sentiment. However, it's the middle that makes it hard to assess. For instance, what exactly does a 2 mean? Are we saying we don't hate it, but we don't actually like it enough to be indifferent? Huh?
Therefore, I think a spectrum is a much better solution, as it gives a rater a chance to select exactly where they are in the "emotional" spectrum - especially if the spectrum has indicators of where you are (https://imgur.com/CqZPIuR) in the range of sentiment. Here's what it would look like using your Amazon mockup example (https://imgur.com/cO3pAYm). It's certainly not pretty, but I would personally find more value in that than in a 2/5 star rating.
Also, let's not forget the "authority" of the rater. A review from you about cup holders would likely hold much more weight for me (as I would almost bet you've analyzed the heck out of that decision), rather than someone I don't know. In fact, imagine if the rating could somehow integrate the people I most closely follow on Twitter or Facebook and scan through any mentions or ratings of the product from them.
Lastly, check out Hotels.com and how they are beginning to filter the ratings by cleanliness, service, comfort, and condition. https://goo.gl/4jm1m
Thanks for the post, and good to know someone else cares about this topic as much as I do :)
Yeah, there's always a subjective aspect to rankings or even something like a Likert scale. I haven't looked into the research, but I'm not sure if making it an open-ended spectrum removes subjectivity. I may be very stingy about the high-end while someone else may give out 5s all day, for example. Once you average it, however the measurement is made, then you lose something in translation.
A few people mentioned hotel/travel sites, and they have gotten pretty sophisticated. Amazon's big challenge is that they sell everything. Hotels have a core set of features that can be hard-coded; Amazon would have to extract features from the reviews, because manually coding them would be nearly impossible. That's where they have an interesting challenge, IMO.
Dr. Pete - you just verbalized how I have always used reviews. Frankly, I may ignore the star ratings, but I read the comments like crazy on Amazon and use them to parse out features.
Here is the thing: I may not know what features to look for in a product. I often use the reviews as a way to discover additional semantics (features) around a product - the "I never thought about that" phenomenon.
Going back to the cupholder example, there may be someone who writes in a review, "I love the extra cupholder so that I can hold a latte for me and the sippy cup for junior." Why this is important isn't just whether a cupholder is available - it takes me to another level of thinking: the number of cupholders, or the location of those cupholders. When I started, I was just looking for a stroller to push junior around in; now I have multiple other levels of information to think about - thanks to the information in the reviews (and not just the overall rating). It allows for a (potentially) higher level of satisfaction when I buy something, since what I purchase is hopefully more in line with my expectations because I had more information. The fact that satisfaction is relative, well, that is a whole other blog post. :-)
The reviews help me broaden the semantic horizons around things to think about; I can then refocus my search around that additional information, whether I want to exclude it or include it. Either way, because I am online and cannot "test drive" these items, the additional information in the text is very valuable to me.
Great point about features to research. Sometimes when someone is looking for things they don't understand, the reviews help you "get it." And find keywords. lol
Reviews are a great way to get info - but yes, it's a lot of reading.
We had the same issues when trying to find baby stuff on Amazon. I think by nature we want to be even more careful when it comes to the little ones, hence the deeper analysis of their reviews. One thing that I tried to do was to find what I thought was a good review and then read some of that reviewer's other reviews on other products. This gave me a better idea of the review system they used, so I could decide whether to trust the review or not. This also led us to other products that the writers of those reviews had found useful and reviewed, which we may not have otherwise considered. I think this sort of "Top Reviewer" or maybe "Favorite Reviewer" feature will also come into play at some point. If you were to see a stupid cupholder review you could mark the reviewer, and their reviews would be filtered out in the future. Then maybe you find a really amazing review and add them to your Favorite list, and their reviews show up more prominently for you in the future.
That's a good point - it's kind of like RSS, which, when you think about it, is "social" in a sense. You decide who you trust and you subscribe to those people. They may not be friends or even in your extended social circles, but you've decided to accept them as a trusted source. Of course, this applies to institutions, too, in which case I don't think we'd call it social, but it does show that there are gray areas.
It would be interesting to see some sort of Automatic Reviewer Matching System. Maybe they would feature my reviews for you because we had both searched for, purchased, and reviewed baby/kids products, SEO & marketing books, and camping gear for example. This way it would be using a semantic social system based on data rather than direct connections.
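To make that concrete, a toy version of that kind of matching could be as simple as cosine similarity over purchase-category counts. The categories, profiles, and similarity measure below are made up purely for illustration:

```python
# A toy sketch of matching reviewers by overlapping purchase history rather
# than by direct social connections: cosine similarity over category counts.
import math

def cosine(a, b):
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0) * b.get(k, 0) for k in keys)
    norm = (math.sqrt(sum(v * v for v in a.values())) *
            math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

me = {"baby": 5, "seo_books": 3, "camping": 2}
reviewers = {
    "reviewer_a": {"baby": 4, "camping": 3, "seo_books": 1},
    "reviewer_b": {"video_games": 6, "audio": 2},
}
# Rank reviewers by how similar their purchase mix is to mine.
ranked = sorted(reviewers.items(), key=lambda kv: cosine(me, kv[1]), reverse=True)
print(ranked[0][0])  # reviewer_a
```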
Powered by Search, on your question "how would you recommend Amazon and similar companies go about implementing your suggestion?": that's the purpose of semantic technologies or text analytics, and more specifically of sentiment analysis technologies. The task Dr. Pete is proposing is feasible with the current state of the art.
Best regards,
Antonio
CEO & Founder
www.bitext.com
Thank you for your response, Antonio!
You are right! What Dr. Pete suggested is absolutely feasible. For some reason, text analytics slipped my mind while I was reading the article, and the image of people manually reading through reviews took its place instead.
Cheers,
Trang Lam
Great article. About Amazon: I am wondering why they don't change the review form. They only let you write some words about the product, but it's not possible to give a score for, let's say, its functionality, price/quality, packaging etc. I really like the way Booking.com collects their reviews. See for example the reviews for a resort in Las Vegas: https://www.booking.com/hotel/us/the-venetian-resort-casino.en-us.html#hash-blockdisplay4
I am able to filter the reviews in a couple of seconds.
I used to buy furniture from a site (they got bought out and are a different site now) that rated everything by quality and value separately. I really liked that, because you could basically say "Ok, this is just a 3/5 for quality, but it's $79, and I don't need anything great" or you could opt for something more expensive that was made better. You could also avoid the all-too-prevalent category of overpriced junk that just has the right label on it.
Tutan -- totally agree. The usefulness of reviews comes from the context of the reviewer and their relevance to you. I want to read reviews from people like me, which the Score Filter of booking.com does to some extent (best I've seen so far, and thanks for the tip!).
I hate travel site reviews where the reviewer is glowing about some destination and then they say "best place I've ever been" and "our honeymoon" and you realize it's a newly married 21 year old who is (1) on high from just getting married, and more importantly (2) a relatively inexperienced traveller and doesn't have the context to compare that destination with 50 others they've been to. As an experienced traveller (without kids) I want reviews from other experienced non-family travellers who don't place any value on nightclubs but prefer the outdoors (for example).
Same thing with movie reviews -- You'd get wildly different reviews based on age and gender, and I want to read reviews from people like me. I don't care that teenage girls like Twilight; as a mature male I would probably hate it (haven't seen it). I wish I could limit the reviews I see to reviewers of similar age and gender.
Reviews will never capture the opinions of those who don't take the time to write them.
That aside, sounds like you'd be a big fan of TripAdvisor Pete :)
1. Plans to incorporate social graph? Check.
2. Sentiment analysis? Check.
3. Semantic search? In development - Check.
This is a great article. I believe the sentiment/interpretation is key to reviews, and it doesn't just apply to Amazon.
I was staying in a hotel recently and checked the reviews, and one said 'terrible, worst hotel ever, 1 star, etc...' Skimming it, you would think that doesn't bode well for your stay, but actually reading the review, the reason they were so upset was that they had been fined for smoking in the room! In the UK it is illegal to smoke in a hotel room, there are signs everywhere, and there is a clear warning on the collateral that you will be fined.
So was the hotel terrible? No, it was brilliant, but this customer was upset at being caught breaking the law and fined accordingly - like a child that can't take any responsibility, or 'crazy cousin Larry'.
Nick
Tripadvisor is the bane of our life, with reviews often bearing no relation to the actual hotel in question but more to do with long flights, missed taxis or often a range of other criteria. And of course there are always the serial complainers who aim to find something to criticise....
A more comprehensive way of reviewing the reviewers would hopefully lead to more credible reviews in a number of areas.
Great article, Dr. Pete.
My personal biggest review-related pet peeve is when people leave a review against a product that isn't actually about the product itself. For example, if delivery was delayed and they give it 1 out of 5. That's not the product's fault. I've also seen it on LOVEFiLM, with people giving 1-out-of-5 reviews to films that aren't available to watch.
In addition to being mind-numbingly stupid, it risks bringing down the average as a whole, and making an otherwise great product look average.
If there were a way Amazon et al could dismiss these types of reviews based on semantics, sentiment and whatever else, I'd be a happier man. It'd also make for a more accurate reviewing experience, too.
Haha - yeah, I love that - "I ordered the Juice-o-matic 9000 and then the mail carrier dropped it in a mud puddle and my dog ate it. AMAZON IS THE WORST!"
I will tell you how I wish they worked. You have to login to leave a review on an Amazon product, right? Based on emails and contextual ads I see from Amazon, we know they follow your history pretty closely. So why not only allow users to leave reviews on products that they actually ordered through Amazon? I imagine something like eBay. When you buy a bunch of things there is a list of people you bought from reminding you to leave feedback on each one. You can get to it as you feel like it, but you can't leave feedback on sellers you didn't buy from. That is essentially what Amazon allows. Anyone can review anything. If nothing else this cuts back on the many fake reviews that have permeated the site. Well, all review sites for that matter, but still, Amazon has the data to prevent this. The sentiment analysis and semantic search, all that is good too, don't get me wrong, just another thing I would add to it.
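The gating itself would be trivial - something like this sketch, which is purely illustrative and obviously not how Amazon or eBay actually implement it:

```python
# A deliberately simple sketch of "verified purchase" gating for reviews:
# only accept a review if the product appears in that user's order history.
order_history = {
    "user_42": {"kindle-fire", "thirsty-stroller"},
}

def can_review(user_id, product_id):
    return product_id in order_history.get(user_id, set())

print(can_review("user_42", "kindle-fire"))    # True
print(can_review("user_42", "juice-o-matic"))  # False
```

The hard part isn't the check, it's the policy question discussed above: whether you want to shut out people who bought the product somewhere else.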
Dr. Pete, you already know I'm a fan, so 'nuff said about that. Kudos to you for knowing your fee structure for Amazon. I like that!
You said: "We still think of Knowledge Graph and other semantic search efforts as little more than toys, but they’re building a framework that will revolutionize the way we extract information from the internet over the next five years" - to me that's the crux of your whole post, and it's extremely well said, heady, and significant.
I should know better than to read your posts after work when I'm home at night because you are brief with words and huge on substance. I am often left going "wait....whaaatttt????" Then I have to go re-read the whole thing to truly appreciate the full brilliance. lol
Okay, now, I have a followup question. Is there a way to apply filters, sorting and/or faceted navigation to reviews, the same way those things are applied to products?
Dana
Hi Pete, excellent post. Reviews are definitely much more useful and informative if, on top of the score, they have categories associated with them.
At Bitext we are working along these lines: for example, we analyze Twitter comments on iPad to extract scoring (+10/-10) and category (Product, Price, Image...).
I'd love it if you could take a look and give us some feedback.
Positive and negative comments on iPad with category:
https://demos.bitext.com/naturalopinions_en/RelatedConc.aspx?Entidad=iPad
Topics of positive comments in category PRODUCT
https://demos.bitext.com/naturalopinions_en/RelatedConc_Detail.aspx?Entity=iPad&attribute=PRODUCT&polaridad=Positiva
Actual tweets that exemplify category PRODUCT and positive polarity
https://demos.bitext.com/naturalopinions_en/EntryList.aspx?Entity=iPad&entity_component=case&polaridad=Positiva&Component=PRODUCT&filter=Yes
We also work on semantic search.
You can find more information on our semantic technology at www.bitext.com
Best regards,
Antonio
CEO & Founder
www.bitext.com
Products that you need to review before you buy. Get top product reviews from Amazon, eBay & AliExpress. Read other users' opinions and make your decision.
Very interesting post Dr. Pete!
You are right. Product reviews on Amazon, as on any other platform that collects evaluations, can be very subjective. Some reviews go on praising many good things about the product, and then one small disappointment can take away all of the stars - which does not make much sense.
Your suggestion to categorize reviews by features would be very helpful. However, as you mentioned, the number of reviews is huge. Even in your Thirsty example, there would be 739 evaluations. Skimming through all the reviews and categorizing them would be a great deal of work, and that's not to mention the number of products Amazon offers. Furthermore, some reviews can discuss several features at once. Therefore, how would you recommend Amazon and similar companies go about implementing your suggestion?
Trang Lam from Powered by Search
Interesting that you used Amazon as your case in point, Dr. Pete... and I too would like to see different ratings for quality, price, workmanship (or something like that). Putting the emotional part into what Google is doing - and will, I believe, ultimately do - is fascinating... just the progression in how the bots look at content and quality now fascinates me. Thank you for bringing this forward to a small business owner who keeps trying to stay with the curve!
Today I tried the same search on Google Australia for "Broncos Colors", and at the bottom right of the answer box there's a grey text link that says Feedback / More info. If you click on this link you can give feedback on the answer by reporting wrong facts, or click on facts to locate them on the web. Interesting.
That's what I call wishful thinking :-)
This is an extraordinary post. About Amazon:
I imagine this kind of "Top Reviewer" or possibly "Favorite Reviewer" attribute will also come into play at some point.
Thanks for sharing this extraordinary stuff, Dr. Pete.
Great post Dr. Pete. I believe that all of us suffer from the "Crazy Cousin Larry" effect on social media - those people we have added to our social connections, who we are friends with but don't really consider influences. It will be interesting to see how social continues to grow and develop, as it is affecting everything we do, from search to the ads we see. Will we be able to create a "Crazy Cousin Larry" circle/effect, to distinguish who our true influences are or not? Only time will tell.
As someone who runs a product review blog/website, I found this article very interesting. I make it a point to have my testers, who also do the writing, be objective and add a little bit of subjectivity. I feel this is most useful, as consumers want to find out the facts, but they also like a little real-life reference to make sure the person knows what they're talking about. Most reviews on retail websites are good, but not always great. They end up speaking to one or two features they really like or hate and expanding on that. This can be extremely helpful if it is a feature you are looking for, but as you mentioned, if they go on talking about a cupholder you don't care about, it has just wasted a minute of your life. The problem with many product review blogs, including mine, is that you usually have a few people testing out the product, not hundreds or thousands, which means you can get a slanted view depending on which product blog you are reading.
A potentially interesting concept would be for retailers to work with trusted experts/bloggers who can help provide validation. One of the conflicting issues here is that the retailer would want to keep the focus on their own website and brand vs. the blog's website and brand. I would also assume the retailer would want to keep the reviews positive, as they want to sell products.
I'd be interested to see if there would be a good way to implement the ideas you spoke about above on a website or blog, rather than a retailer site. If anyone has a good example of a current blog/website doing this, I would love to see it.
As a consumer, I try to look at a number of different sources such as retailers websites, product blog or website, and talk to people I trust. I think this is where Google Authorship could potentially add a lot of value in the future, but different features would have to be built out in the future.
My website is ActiveGearReview.com, in case anyone wants to see how I have my testers write their reviews. It's by no means a state-of-the-art product review website, but I take pride in the fact that the reviews are honest and comprehensive.
Doc - I love your care and concern - and I just delivered the same message to another platform/site that seems to be limited to some really cool tweaks.
I asked them about maybe launching a beta version or something - always trying to stay proactive and open for a sale ;o)
Your pal,
Chenzo
This is a fantastic post - one more time, Dr. Pete at his best. Semantics + Sentiment is a fabulous method. What I like most about it is the simple two-way feedback: either you like it or you don't. You can still have more specific features, though, so I can look at the specific features that are most important to me. One more thing: by using this we can compare products more easily - pick the important features you like, compare, and choose what suits you best.
This is a problem that has plagued Amazon for years. It becomes much more difficult to tackle as Google SERPs take on more user feedback (Google+ Local reviews being the current example), no matter how arbitrary that user sentiment is.
Social signals are very good at driving traffic and sales, and small businesses find it a lot easier to benefit from this than big business does. This is an area where small business can stand up and win!
Nice post!
Some really interesting and good thoughts.
As I was reading the post, the first thing that struck me was breaking down the reviews into sub-categories as you then proposed. That's certainly a good step in making reviews more semantically relevant/digestible. But, it leaves a lot to be desired in terms of tapping into the power of "big" social data.
My big questions:
As a consumer, what does a really, really great review system look like in the ideal world? What data is actually valuable to me? Do I, in all reality, care how many of my friends own or like Product X, or is that just as irrelevant as a database of en masse public reviews?
I don't think that we, as consumers, even know the answer to these yet. That means it's up to someone brilliant to not just figure out how to build something great, but figure out what that something great is, how it functions, and what it looks like.
Honestly, I have a big concern about review quality and filtering out dumb reviews on products, because I can't understand how different users can have 100% different reviews of the same specific feature of a product. Sometimes I get bored reading long reviews that don't make sense before a purchase. You can visit the following product page to see more of what I mean. I use a similar phone, but I can't understand users' excitement about posting such long reviews on it.
https://www.flipkart.com/micromax-ninja-a89/p/itmdjjguxchngshz
If I am talking about social signals and understanding products via a social network, then a Facebook fan page is quite a good way to drill down further into reviews.
https://www.facebook.com/flipkart
I like to see products shared by the admin, shares by users, comments by users on the fan page, and replies by the admin.
Thanks for a great post Dr. Pete. The idea of integrating social and semantic analysis into reviews is something which is already being experimented with by companies like Yotpo and Commercesciences which are using consumer behavior and a lot of A/B testing to understand how to get the highest validated consumers to take specific actions. In Yotpo's case, to write reviews.
Your post doesn't touch on the idea of mavens and being able to sort through reviews from experts and other "fan boys", which I think is also an interesting concept to think about in this space.
Out of all of the different things Amazon tests for, you really don't see enough testing of their reviews. Great insights.
Love it, but I would ask for a share per usage fee.
Also - search engines are MUCH more complex than we SEOs often assume in our daily work, and this is a great example of that. Why is this ranking and not that? One factor's influence might make something tip, but in many cases multiple factors will have played a role in ranking / deranking a page.
I think social, though, is even more complicated. It's not about direct connections, but about profiling based on direct connections (and many other elements), and then getting reviews / recommendations / search results based on that profiling. With this, search engines / social media can avoid the 'small circle' gap.
Good post, Dr Pete.
Here at alaTest.com, we try to collect, analyze and summarize millions of product reviews from around the world to help consumers find the best products for them, based on review data.
We calculate an overall product rating out of 100 (alaScore) and use NLP and sentiment analysis to rate key product features (value, design, usability, etc.) based on review content from over 3,000 sites. We also rank review sources by type (professional or consumer) and editorial quality, as you suggest.
The hard parts are matching reviews to the correct product, as well as merging reviews for similar products (e.g. hardcover vs. paperback book, LP vs. CD albums, etc) together...
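As a rough illustration of that kind of roll-up - the weights and scale here are invented, not the actual alaScore formula - weighting professional sources above consumer ones might look like this:

```python
# A rough sketch of rolling many per-source ratings up into one 0-100 score,
# weighting professional sources more heavily. Weights are invented.
SOURCE_WEIGHTS = {"professional": 2.0, "consumer": 1.0}

def overall_score(ratings):
    """ratings: list of (rating_0_to_100, source_type) tuples."""
    weighted = [(r * SOURCE_WEIGHTS[t], SOURCE_WEIGHTS[t]) for r, t in ratings]
    total_weight = sum(w for _, w in weighted)
    return sum(v for v, _ in weighted) / total_weight if total_weight else 0.0

print(overall_score([(90, "professional"), (70, "consumer"), (80, "consumer")]))  # 82.5
```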
We would like to comment on how the future of SEO is changing with respect to Latin America.
Greetings.