Five star dining or burger joint from hell?
At Vicky's Diner in Upper Manhattan, New York, Peter the waiter is a much-loved character in the community and the small diner is a popular fixture with families.
Yet online he is occasionally called "grumpy" and even "rude".
"But he has been here for years and knows almost everybody who comes in," says his boss Vicky. "He knows their sense of humour and has a gruff voice, and jokes around with a lot of people.
"Some people think he's being rude to them, because they don't know him!"
This is just one of the problems with online reviews: a lack of context. Others are dishonesty and paid-for puffery - annoyances that review sites such as Yelp, Amazon and Tripadvisor have been struggling to weed out for years.
Such fake reviews and blackmail attempts threaten to undermine trust in the "wisdom of crowds" - reviews that many of us have come to rely on before we part with our hard-earned cash.
Could better technology help restore confidence in the review system?
The developers of a new mobile app from the UK, called Twizoo, say they have found a way of weeding out fake, paid-for and out-of-date reviews.
The start-up, which is concentrating on restaurants initially, scrapes comments from Twitter, then collects and analyses them to provide what it claims is a more reliable stream of review data.
Twizoo co-founder and algorithm writer Madeline Parra is clearly chasing industry leader Yelp.
"We left 24 fake reviews on Yelp and all got through within a couple of hours," she says. "But Twizoo expects users to have a full social media profile, tweeting about lots of different stuff, so it's a lot harder for a fake review to get through to us."
Twizoo gives users a "Twitter credibility score" that has to be earned over time. Tweets that come from a brand new Twitter account are automatically discarded and several tweets coming in at the same time about one particular restaurant are also considered suspicious.
That stops friends and family of the chef or restaurant owner influencing reviews, for example.
"After three months, tweets count half as much as they did before," says Ms Parra. "On Yelp, if a restaurant got a one star review five years ago, it still counts against that restaurant.
"Because our volume [of tweets] is so high we want to give our users not what was awful five years ago, but what is great right now."
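Heuristics like these are simple to sketch in code. The following is an illustrative sketch only, not Twizoo's actual algorithm - the account-age threshold, burst window and the exact decay factor are assumptions based on the description above:

```python
from datetime import datetime, timedelta

MIN_ACCOUNT_AGE = timedelta(days=30)   # assumed: brand-new accounts are discarded
BURST_WINDOW = timedelta(minutes=10)   # assumed window for "several tweets at once"
BURST_LIMIT = 3                        # assumed burst threshold
DECAY_AFTER = timedelta(days=90)       # "after three months, tweets count half"

def tweet_weight(tweet, now, venue_tweets):
    """Return 0.0 to discard a tweet, otherwise its credibility weight.

    `tweet` and `venue_tweets` entries are dicts with "account_created"
    and "sent" datetimes; `venue_tweets` holds tweets about the same venue.
    """
    # Discard tweets from brand-new Twitter accounts.
    if now - tweet["account_created"] < MIN_ACCOUNT_AGE:
        return 0.0
    # Several tweets arriving at once about one venue look suspicious.
    burst = [t for t in venue_tweets
             if abs(t["sent"] - tweet["sent"]) < BURST_WINDOW]
    if len(burst) > BURST_LIMIT:
        return 0.0
    # Older tweets count half as much as recent ones.
    return 0.5 if now - tweet["sent"] > DECAY_AFTER else 1.0
```

A real system would of course combine many more signals (profile completeness, topic diversity of the account's tweets), but the shape - hard filters followed by time-decayed weights - is the same.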
Ms Parra also points out that tweets are more likely to be authentic because they are often sent between friends as recommendations.
'Quality over quantity'
Yelp is incredibly protective of its "secret sauce" or more accurately its "secret source code" that makes up the algorithm used to weed out potentially false or over-enthusiastic five-star reviews.
In fact, Yelp PR director Marco Bilello says only a very few people in the company know the exact formula.
"If everybody knew the ins and outs it would allow them to game the system," he tells the BBC, "so we can't share that information.
"We believe in quality over quantity. That's why only 71% of our reviews that go through the recommendation software are eventually recommended."
He did reveal that multiple Yelp reviews for the same restaurant coming from the same IP address are given extra scrutiny. And those reviews that seem biased - from the likes of competitors and disgruntled employees - are flagged.
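The same-IP check, at least, is straightforward to picture. A simplified sketch, assuming nothing about Yelp's real code beyond what Mr Bilello describes (field names here are invented):

```python
from collections import defaultdict

def flag_same_ip(reviews):
    """Return ids of reviews that share an IP address with another
    review of the same restaurant, marking them for extra scrutiny.

    Each review is a dict with "id", "restaurant" and "ip" keys.
    """
    by_key = defaultdict(list)
    for r in reviews:
        by_key[(r["restaurant"], r["ip"])].append(r["id"])
    flagged = set()
    for ids in by_key.values():
        if len(ids) > 1:          # two or more reviews from one address
            flagged.update(ids)
    return flagged
```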
Yelp also has people on the ground in many locations and an investigative unit looking out for restaurants offering incentives to customers in return for five-star reviews.
Amazon is also constantly tweaking its review system - star ratings now reflect how useful readers think a new review is, and whether it follows a verified purchase.
It also favours reviews written by customers who paid the standard price over those from deeply discounted sales - promotions that may have lasted only days or even hours, and may have prompted over-enthusiastic reviews.
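In effect this turns a star rating into a weighted average. A minimal sketch of the idea - the specific weights are illustrative assumptions, not Amazon's formula:

```python
def weighted_rating(reviews):
    """Average star rating where verified, full-price and helpful
    reviews count for more.

    Each review is a dict with "stars", "verified" (bool),
    "discounted" (bool) and "helpful_votes" (int) keys.
    """
    total = weight_sum = 0.0
    for r in reviews:
        w = 1.0
        if r["verified"]:
            w *= 2.0                          # verified purchases count double
        if r["discounted"]:
            w *= 0.5                          # deep-discount buys count less
        w *= 1.0 + 0.1 * r["helpful_votes"]   # small boost per helpful vote
        total += w * r["stars"]
        weight_sum += w
    return total / weight_sum if weight_sum else 0.0
```

A glowing five-star review from a heavily discounted flash sale is thus pulled towards the middle by a sober three-star review from a verified full-price buyer.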
Amazon has also cracked down on dodgy reviews using the full force of the law.
"Since the beginning of 2015, we have brought lawsuits against over 1,000 defendants for reviews abuse," says Amazon spokesman Tom Cook, "including both dishonest sellers and manufacturers who attempt to purchase fraudulent reviews, and the parties who provide and post those reviews."
Jobs reviews site Glassdoor rejects only about 5-10% of its submissions, suggesting that faking reviews about occupations and specific workplaces is harder, and much less lucrative.
Tripadvisor chief executive Stephen Kaufer recently sent out a letter warning hotels and other businesses to be wary of "optimisation companies" promising to manipulate travel reviews for a fee, and then attempting to blackmail clients if they try to back out later.
The more we know about the person leaving the review, the easier it should be to weed out the fakers.
Walt Disney World gives customers a wristband that can be used to open hotel doors, pay for meals and souvenirs, and get into attractions.
Disney knows exactly where you ate, what you ate, where you went, what you bought, who your favourite characters are, what rides you went on, and even what you look like. That's a goldmine of data against which reviews could be checked.
But Georgios Zervas, assistant professor of marketing at Boston University, says: "You could imagine doing that, but I think it would be quite creepy.
"And most people would find it annoying and an intrusion of their privacy rather than anything else."
So despite being decades old, the concept of online reviews still has much further to go.
One of the next big shifts is likely to be towards personalised reviews, delivered to you according to your age, location and, perhaps, experience.
After all, a 16-year-old from Kansas and a 65-year-old from Maine might have very different opinions of a new lobster joint opening on the Las Vegas strip.
Follow Technology of Business editor @matthew_wall on Twitter.