Blackboxx?

Grayboxx is a new local search site that shares many of Loladex’s goals, but goes about things in a different way.

Specifically — as I understand it, anyway — they scour a bunch of sources, both online and offline, for “neighbor recommendations” of local businesses.

Many of these recommendations are implicit, as opposed to, say, the explicit endorsement of a favorable review on Yelp. The example Grayboxx cites is a repeat reservation at a restaurant: It’s a sign someone likes the place.

Grayboxx gathers a zillion such data points from sources it doesn’t disclose, runs them through a secret algorithm, and comes up with recommendations that extend even to obscure service providers in small towns — far wider and deeper coverage, in other words, than any competitor.

The company, which has been brewing for several years now, has large ambitions. It’s positioning its “PreferenceScoring” algorithm as the local equivalent of PageRank, the secret sauce that propelled Google into the stratosphere. And it has lined up a credible advisory board.

The site launched last week, sort of: It kicked off a national “tour” of the smaller towns where its coverage supposedly excels, starting with Burlington, VT. Since I don’t live in Burlington, I can’t judge what it’s doing there. Indeed, I’m not even sure what it means to be “on tour.”

However, the company also has a (non-public?) beta site that’s not limited to certain ZIPs. It’s very interesting. Grayboxx’s future shouldn’t be judged by it, I guess, but a few things are immediately clear:

  • It looks nice. Simple, clean.
  • It definitely has a lot of “neighbor recommendations,” even in small towns like mine, as promised.
  • It doesn’t demand much of its users, which is good.
  • It doesn’t tell users how it makes the sausage.

This last point, I think, may be crucial. Grayboxx is creating a mystique around its “patent-pending” methodology that may come back to bite it. Its claimed value — Find what your neighbors think — is a lofty one, but vulnerable to skeptics.

Grayboxx may be the site’s name, but the beta behaves more like a black box. It doesn’t explain the nature of its “neighbor recommendations” for any listing, nor does it provide much extra information or link to many user reviews. We’re left with the raw rankings.

The thing about black boxes, of course, is that they must work. Judgment is swift, and you don’t get to explain away bad results. Google aced this test in its early days, which is why it’s on top today. I’m not sure whether Grayboxx can follow.

Certainly I wasn’t bowled over by the results on the beta site. The algorithm doesn’t seem to capture character or local flavor, leaning toward bland businesses and chains. And some results were just weird.

As an example, I believe most of my neighbors here in Leesburg, Va., would recommend Lightfoot and Tuscarora Mill as two of the top five restaurants in town. I’d rank Tuskie’s first myself.

Grayboxx “ranked” them at #103 and #104 today, behind the hot-dog place in the food court (#38), Domino’s (#42, #89, #93), Subway (#46), Starbucks (#61, #74), a grocery store (#77), Taco Bell (#82), and many more, including several places that are closed.

Luckily for Tuskie’s — which has gotten heavy praise in Wine Spectator, Washingtonian and elsewhere — it still ranks as a better option than Ashburn Eye Care.

By one place.

Meanwhile the top-ranked restaurant near Leesburg, according to Grayboxx, is the Rail Stop in nearby Ashburn. I had never heard of it, so I looked it up. It’s a good restaurant but it’s actually in The Plains, a town almost 40 miles from Ashburn.

True, I can force Lightfoot and Tuskie’s to the top of the results with two rather non-obvious clicks. Grayboxx seems to have the ingredients for a good ranking system, but is outsmarting itself.

Who knows whether such observations are fair? None of the towns I tested have truly launched, so it’s too early to say. Still, Peter Krasilovsky points at a review from a Burlington resident who had a similar reaction: Grayboxx results are too “obvious,” providing little insight beyond popularity.

Certainly the black box needs some tweaking during Grayboxx’s rollout period, and the data needs scrubbing. (Ashburn Eye Care?) I believe the idea itself is workable, although the blandness factor may never be stamped out entirely — and threatens to stop Grayboxx from being any more helpful than, say, the Yellow Pages.

The underlying issue, I think, is that “real world” word of mouth involves a particular person (me) getting advice from particular people (my friends). It’s not as easy as watching to see where most locals go, or we’d all end up at the food court.

I’m sure that Grayboxx can re-weight its sources, rejigger its algorithm, and come up with more characterful recommendations. But local is above all personal, which means that emulating a non-personalized measure such as PageRank isn’t the best approach, no matter how well it’s done.

As long as every Grayboxx user is getting the same recommendations, something important is being lost.

5 Comments

  1. >The underlying issue, I think, is that “real world” word of mouth involves a particular person (me) getting advice from particular people (my friends).

    Laurence,

    To your point about restaurants in Leesburg, it turns out that Lightfoot is in fact the top restaurant, with 373 neighbor recommendations. A data issue is responsible for this: it was classified as a ‘banquet hall.’ We’re sorting that out; within two weeks it will be resolved, and Lightfoot will be right at the top, given its volume of neighbor recommendations. And Tuscarora is second with 217 neighbor recommendations – a large number.

    There is no doubting the value of personal recommendations. We all rely on them to a greater or lesser extent. However, there have been times when I’ve found it easier, and sometimes necessary, to consult another source for recommendations. Millions of people are looking for recommendations online; hence the traffic volume on sites like Yahoo Local, CitySearch, and Yelp. User reviews are helpful, but the problem is that they are overwhelmingly found in major metros like New York and the Bay Area. grayboxx results aren’t perfect, but they shed light on geographies and categories (i.e., plumbers, patio furniture stores) that user reviews largely don’t touch.

    Our algorithm is a work in progress. There are cases where it works like a charm and others where it doesn’t work any better than existing local search sites that display yellow pages data. To this end, we are actively soliciting feedback from people at the local level so we can improve the service. We have an advisory board through Facebook; we’d like to offer you an invite to join it. We benefit from any and all feedback (http://www.facebook.com/group.php?gid=2401529379).

    Best,
    Bob


    Bob Chandra
    Chief grayboxx Officer, grayboxx

  2. This approach has potential, but the current implementation does not seem to add much value to my search results. The chief shortcoming is that there is no context (i.e. relative weighting) provided for the “neighbor recommendations.” What does a count of “131” recommendations mean, particularly when viewed in results ranked by another variable (e.g., distance)? Ideally the algorithm would make comparative assessments that better inform my decision.

    – Drew

  3. Hi Drew,

    There in fact is a relative order: more neighbor recommendations suggest a business with more community approval than one with fewer. If you want to decide which business to patronize by other criteria (be it distance, relevance, or recommendations), the search interface allows you to do so.

    Best,
    Bob

  4. Hi Bob —

    Thanks for visiting! I believe in the value of your general approach, I’m excited to see you launch, and I wish you the best. I think there are three broad issues:

    –> The ranking algorithm. There seem to be some issues with data and false matches, and probably some with the algorithm itself. I am sure these are all fixable. As an FYI, I used the beta yesterday to find a restaurant, and found the “order by recommendations – all” option to be most useful. These are the same two clicks that would put Lightfoot & Tuskie’s at the top of the Leesburg restaurant list. This is what I mean by “outsmarting” yourself: perhaps ranking strictly by number of recommendations would suffice, especially if accompanied by some filters? Users probably won’t mind the extra work, and I suspect they’ll be happier than if Grayboxx (essentially) does it for them. I don’t know whether I’m typical, but in my limited usage of Grayboxx I very much feel the absence of filters.

    –> Transparency of the “recommendations.” I think users (at least the influencers) likely want more than the generalized explanation on the site. Personally, I’d like to be assured that when one place has 76 recommendations and another has 42 recommendations, that those are comparable numbers. I think you should have a deeper explanation available somewhere on the site. Its absence makes it harder to cut the algorithm some slack — the black box effect.

    –> Overall approach. I agree with everything you say about review sites, and am not advocating a review-based solution. (I’ve written about this previously.) Mainly I believe that word of mouth is most useful below the level of the entire community. Not that community-level recommendations aren’t useful — they are, particularly in small towns and/or certain categories — but some level of personalization would help. I want to hear from people like me, and ideally from my friends. But then I’m biased, because that’s the path I’m pursuing myself. :)

    Laurence

  5. Bob – Thanks for the reply. What I mean is: since you can only view a limited number of establishments in any given search result, the accompanying numerical ratings are only comparable within that batch of results. It would be helpful to know where “133” ranks in the overall scheme of things, e.g., “Top 10% establishment” or some sort of uber scale.

    Nice work overall, very impressive.

    Drew