About

Critic-led. Jersey-approved.
And honest about the data.

Shorefoodies is one critic's honest take on every restaurant in New Jersey worth talking about — and a quiet protest against a 5-star world where every place is a "4.2."

Meet

Nadia.

Born and raised in New Jersey. Has eaten her way through every county from the Shore to the Skylands and back. The kind of person who reads the entire menu before ordering, asks what's in everything, and remembers what the bread tasted like at a place she went to seven years ago.

"My job is to tell you the truth — even when 1,400 people on Yelp gave it four stars."

Every restaurant on Shorefoodies has been visited (often three or more times), eaten through, paid for out-of-pocket, and scored by Nadia personally. No comped meals. No press dinners. No paid placements. Just an opinion from someone who actually showed up.

The Yelp Problem

The 5-star scale lies. Here's the math.

Yelp uses a 5-point scale, but in practice most restaurants cluster in a narrow band: nearly half sit between 4.0 and 4.5, and two-thirds land within half a star of the average. The spread is so compressed that a 4.2 is meaningless. It could be a transcendent dinner or a forgettable strip-mall night, and Yelp can't tell you which.

3.96
Average Yelp star rating
Across all 343 NJ restaurants in our database. The Yelp universe is centered on "fine."
±0.46
Standard deviation
68% of restaurants land between 3.5 and 4.4. The whole 5-star scale collapses into one star of usable signal.
45%
In the 4.0-4.5 cluster
Almost half of all restaurants get the same vague "good" rating. Differentiation is impossible.

How NJ Yelp ratings actually distribute

Below 4.0
43%
4.0 – 4.3
31%
4.3 – 4.5
14%
Above 4.5
7%
No rating
5%
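The summary stats above (mean, standard deviation, cluster share) fall out of any list of star ratings. Here's a minimal sketch in Python; the ratings list is made up for illustration, not the actual 343-restaurant database:

```python
import statistics

# Hypothetical sample of Yelp star ratings -- illustrative only,
# not the real Shorefoodies data.
ratings = [3.5, 4.0, 4.2, 4.3, 4.1, 3.8, 4.4, 4.0, 4.2, 4.5, 3.9, 4.1]

mean = statistics.mean(ratings)
stdev = statistics.pstdev(ratings)  # population standard deviation

# Share of restaurants inside the 4.0-4.5 cluster
cluster = sum(1 for r in ratings if 4.0 <= r <= 4.5) / len(ratings)

print(f"mean:    {mean:.2f}")
print(f"stdev:   {stdev:.2f}")
print(f"cluster: {cluster:.0%}")
```

Run the real database through this and you get the numbers above: a mean near 4, a standard deviation under half a star, and almost half of everything stuck in one cluster.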

There's a deeper problem: the people scoring on Yelp aren't critics. They're people having dinner. They mix up bad service with bad food. They mark a place down because the parking was tight. They confuse "I had a great date" with "the restaurant was great." Atmosphere, value, food, and service all collapse into one star rating, and you can't tell which one tanked the score.

Our Method

Four scores. One number. Real signal.

Nadia scores every restaurant on four dimensions independently, then we combine them with food weighted double — because the food is what you came for. Sub-scores are visible to Nadia internally so she can audit her own consistency, but the public sees one calibrated score that actually means something.

Sub-score 01

Food

How good is the actual food? Flavor, technique, ingredient quality, kitchen consistency, the dishes that should be hits actually being hits.

Weighted 2×
Sub-score 02

Value

Does what arrived match what you paid? An $18 burger that's worth $30 scores high. A $90 omakase that should've been $40 scores low.

Weighted 1×
Sub-score 03

Atmosphere

Lighting, noise, comfort, design, the way the room feels. Whether it actually delivers on the experience it's selling.

Weighted 1×
Sub-score 04

Service

Were they on time, attentive, knowledgeable about the menu, professional under pressure, present without hovering? Service is craft, too.

Weighted 1×

The math

final = ((food × 2) + value + atmosphere + service) ÷ 5 + 0.15 (calibration)

The +0.15 calibration reflects the fact that a "perfect" restaurant doesn't exist: even the best places have one flaw on any given night. It pulls the top of the scale into reachable territory while still demanding real excellence to clear the 80% Nadia Approved threshold. Sub-scores are admin-only; what you see publicly is the calibrated score and the tier it earned.
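In code, the blend is a one-liner. Here's a sketch with sub-scores on a 0–1 scale; the formula, the +0.15 calibration, and the 0.80 threshold come from the description above, while the function name and the cap at 1.0 are our assumptions, not the site's actual implementation:

```python
def final_score(food: float, value: float,
                atmosphere: float, service: float) -> float:
    """Blend four 0-1 sub-scores into one calibrated score.

    Food is weighted double; the flat +0.15 calibration follows the
    method description. Capping at 1.0 is our assumption -- the page
    doesn't say how a score above 100% would be handled.
    """
    raw = (food * 2 + value + atmosphere + service) / 5
    return min(raw + 0.15, 1.0)

NADIA_APPROVED = 0.80  # threshold named in the text

score = final_score(food=0.85, value=0.7, atmosphere=0.75, service=0.8)
print(f"{score:.0%}, approved: {score >= NADIA_APPROVED}")
```

Note how the double weight on food does the work: a kitchen scoring 0.85 on food clears the approval bar even with merely good value, atmosphere, and service.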

The result

Across 343 NJ restaurants, Nadia's scores have a standard deviation of 9.13 percentage points, spread across the full 0–100 range instead of crammed into Yelp's one usable star. The bottom 9% (Skip It) and the top 11% (Nadia Approved) are clearly separated. You can actually tell where to spend your money.

Then why show Yelp?

So you can see exactly how broken it is.

Every restaurant card on Shorefoodies surfaces the Yelp rating right next to Nadia's score. On purpose. We're not hiding the comparison — we're making it visible.

Look at any Nadia Approved (80%+) place that has a Yelp 4.0. Or a bomb from the Skip It list that somehow holds a Yelp 4.4. The mismatches tell the story better than any essay we could write. Aggregated public ratings don't differentiate quality — they differentiate who showed up to vote.

See the rankings →