Kadin2048's Weblog

Tue, 20 Oct 2009

I was flipping through the channels on TV earlier and came across a new addition to the local lineup — something called The Research Channel. Apparently it broadcasts recordings of presentations by various notable people on a variety of subjects. The recording that caught my eye was Behind the Code with Jim Gray. Gray, who at the time of the interview (2005) was with Microsoft Research but had formerly worked at IBM, Tandem, and DEC, had some interesting comments about databases, parallel processing, and the future of hardware.

At one point (about two thirds of the way through the video), he describes future processors as probably being “smoking, hairy golfballs.” The ‘smoking’ part is because they’ll be hot, consuming and dissipating large amounts of power in order to run at high clock speeds; hairy because they’ll need as many I/O pins as possible, on all sides; golfballs, because that’s about the maximum size you can achieve before, at very fast clock speeds, you start to run into the “event horizon” (in his words) of the speed of light and lose the ability to propagate information from one side of the processor to the other in one clock cycle.
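The speed-of-light limit is easy to check on the back of an envelope. Here's a quick sketch (the 0.5 propagation fraction is my own rough assumption — real on-chip signals travel well below c):

```python
# How far can a signal travel in one clock cycle at a given frequency?
C = 299_792_458  # speed of light in vacuum, m/s

def max_distance_cm(clock_hz, propagation_fraction=0.5):
    """Distance a signal can cover in one cycle, in centimeters.

    On-chip signals propagate well below c; 0.5 is a rough assumed fraction.
    """
    return (C / clock_hz) * propagation_fraction * 100

print(round(max_distance_cm(3e9), 1))   # ~5.0 cm at 3 GHz
print(round(max_distance_cm(10e9), 1))  # ~1.5 cm at 10 GHz
```

Even at today's clock speeds the one-cycle radius is only a few centimeters; push the clock much higher and a golf ball really is about as big as a synchronous chip can get.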

He didn’t give a timeline on this prediction, so I’m not sure it’s fair to call it either correct or incorrect just yet, but it’s interesting. The ‘smoking’ part actually seems to have gone in the opposite direction since 2005; power dissipation has come down from the highs of the Pentium 4 and IBM G5, but it could creep back up again if something stalls the current trend. He seems to have been right, at least in a limited sense, about ‘hairy’: a look at new processor sockets shows a definite upward trend, with Intel’s newest at more than 1500 pins — common sockets in 2005 had fewer than half as many. They’re still all on the bottom of the package, though. The ‘golf ball’ maximum on size is more theoretical, but I don’t think anything has happened recently that provides cause to dismiss it.

After watching the segment, I pulled up the Wikipedia page on Gray, curious to see what he was up to today. Unfortunately, it was at that point that I remembered why his name seemed so familiar: he disappeared while solo sailing off the coast near San Francisco, and despite a massive crowdsourced search effort, he was never found. A sad and unfortunate end for a very interesting guy.


0 Comments, 0 Trackbacks

[/technology] permalink

Sun, 18 Oct 2009

I really like Yelp, which is probably why I’ve bothered to spend time typing up reviews for it, despite it being a commercial service that could theoretically pull a CDDB at any time. Via Yelp I’ve found a lot of neat little restaurants that I wouldn’t otherwise have found, particularly while traveling, and in general I’ve found the ratings and reviews there to be of very high quality.

However, I’ve noticed that as Yelp’s userbase has grown and expanded beyond the computer-savvy foodie demographic that seemed to have been some of its first users, the average ratings for a particular business are no longer as useful as they once were. It used to be, if a restaurant had five stars and more than a handful of ratings, it was almost certainly phenomenal. Similarly, if a place was languishing at one or two stars, it was probably best avoided — after all, if a place is bad enough to actually get someone (who isn’t being paid) to spend the time to write a negative review, something must be pretty wrong. And if something was in the middle, chances are it was pretty much just average for whatever cuisine it was trying to represent.

Lately, though, I’ve noticed that many places — and this is especially true of eclectic or “acquired taste” restaurants — are getting pushed towards middling reviews not because anyone is actually rating them that way, but because very good and very bad reviews are being averaged out into two or three stars. This isn’t really surprising: reviewing restaurants is a “matter of taste” practically by definition. But that doesn’t make the result very useful. When I’m looking down the search results in Yelp, I want to know what I am likely to enjoy, not what some hypothetical “average user” is going to like. (I’m not the first to notice this problem, either.)
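The averaging problem is easy to illustrate with made-up numbers. A polarizing restaurant and a genuinely mediocre one can end up with the exact same star rating, even though the underlying distributions couldn't be more different:

```python
# Two hypothetical restaurants with identical 3-star averages but very
# different rating distributions. All data invented for illustration.
from statistics import mean, stdev

polarizing = [5, 5, 5, 1, 1, 1]   # "acquired taste" place: love it or hate it
mediocre   = [3, 3, 3, 3, 3, 3]   # genuinely average place

print(mean(polarizing), mean(mediocre))  # both average out to 3
print(round(stdev(polarizing), 2))       # ~2.19 — the spread tells the real story
print(round(stdev(mediocre), 2))         # 0.0
```

The mean throws away exactly the information — the spread — that would tell you whether a place is safe-but-dull or a gamble worth taking.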

As more and more users join Yelp and start writing reviews, the average review will naturally start to approach what you’d get from reading the AAA guide, or any other travel or dining guide aimed at a general audience. That’s not necessarily bad, and when you’re writing a travel book or dining guide it’s pretty much exactly what you want: try to give an idea of what most people will think of a particular restaurant.

But that’s certainly not the best that an interactive system can do, not by a long shot. The benefit of a website, as opposed to a book, is that the website doesn’t necessarily have to show the same exact thing to everyone. This is why the front page of Netflix is more useful than the top-ten display down at your local Blockbuster, or why Amazon’s recommendations are typically more interesting than whatever happens to be on the aisle-end display at Borders. It’s not that Blockbuster or Borders aren’t trying — they’re doing the best they can to please everyone. The beauty of a dynamic website is that you don’t have to try to please everyone with the same content; you can produce the content in a way that’s maximally useful to each user.

If Yelp took this approach, ratings from users who tend to like the same things that I do would be weighted more heavily when computing an establishment’s overall score; if you brought up the same restaurant (or, more importantly, if it came up in your search results), it might have a different score, if your preferences — as expressed via your reviews — are significantly different from mine. This makes perfect sense, and provided that there’s still some way to see the overall, unweighted average review (Netflix shows it in small print below the weighted average), it’s a no-lose situation from the user’s perspective.
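A minimal sketch of that kind of similarity-weighted score might look like the following — all of the names, the similarity formula, and the data are invented for illustration; this is not how Yelp or Netflix actually compute anything:

```python
# Hypothetical personalized score: weight each reviewer's rating by how
# closely their past ratings agree with mine.

def similarity(mine, theirs):
    """Map average rating gap on co-rated places to a weight in (0, 1]."""
    common = set(mine) & set(theirs)
    if not common:
        return 0.1  # small default weight for reviewers with no overlap
    avg_gap = sum(abs(mine[p] - theirs[p]) for p in common) / len(common)
    return 1.0 / (1.0 + avg_gap)

def personalized_score(my_ratings, reviewers, place):
    """Similarity-weighted average of all reviews of `place`."""
    num = den = 0.0
    for their_ratings in reviewers:
        if place not in their_ratings:
            continue
        w = similarity(my_ratings, their_ratings)
        num += w * their_ratings[place]
        den += w
    return num / den if den else None

me = {"taqueria": 5, "diner": 2}
others = [
    {"taqueria": 5, "diner": 1, "noodle_bar": 5},  # tastes like mine
    {"taqueria": 1, "diner": 5, "noodle_bar": 1},  # opposite tastes
]
plain_average = sum(o["noodle_bar"] for o in others) / 2
print(plain_average)                                        # 3.0
print(personalized_score(me, others, "noodle_bar"))         # 4.0
```

The unweighted average calls the noodle bar a middling 3; weighting by taste-similarity pulls it up toward the rating given by the reviewer whose palate matches mine.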

I’m sure that Yelp’s engineers are aware of the Netflix model and how it could be applied to Yelp, so this isn’t a suggestion so much as a hope that it’ll get implemented someday.

0 Comments, 0 Trackbacks

[/technology/web] permalink

Wed, 07 Oct 2009

Earlier today I read a blog entry by Ben Metcalfe that really hit home. The entry is called “My GMail password scares me with its power,” and I’d like to say that he’s not the only one. Particularly in light of the widespread (and apparently quite successful) phishing attacks going around, it’s a good idea to think about how much of your life and personal information are stored behind that one password, and whether that password is really up to snuff.

Metcalfe puts forward what I think is a very modest proposal, which boils down to two main points. Neither is trivial, but neither is a real stretch on technical grounds:

  1. Google ought to allow you to enforce some sort of privilege separation: rather than having just one password for everything, more sensitive services (GMail, Google Checkout, Search History) could be configured to use a separate password. This would ensure that the cached password saved in the chat program you use at work couldn’t be used to log into your mail, or to make purchases against the credit card associated with your Google Checkout account.

  2. Users who are security-conscious could buy a two-factor authentication token, like an RSA SecurID, to use with some or all Google services. This wouldn’t be mandatory and it wouldn’t be free — so it wouldn’t help the clueless or the broke — but it would let those people who are honestly concerned about security but who lack the ability to replicate Google’s services themselves (and, let’s face it, just about nobody can replicate Google’s services at this point) get that security on top of Google’s offerings.

Perhaps neither is economically feasible right now; too few users may care about security — and be willing to pay for it — to cover what either would cost Google to implement. But as users put more and more of their data in the hands of managed services like Google’s, and security breaches start having more serious consequences, the demand will come.

In the meantime, what’s a concerned user to do? The best thing you can do is to choose a more secure password. If you don’t mind potentially creating something that you can’t memorize, use a random-password generator and either write the results down, or store it in a ‘password keeper’ program that encrypts its data file with one (good!) master password. I take this latter approach, and use the open-source Password Safe on Windows and Linux, and Password Gorilla (which opens Password Safe database files) on Mac OS X. And, of course, take all the usual precautions against potential phishing attacks.
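If you want to roll your own generator rather than trust someone else’s, something like this sketch will do — the length and alphabet here are arbitrary choices of mine, not a recommendation. The important part is drawing from the operating system’s entropy source rather than an ordinary seeded generator:

```python
# Minimal random-password generator drawing from the OS entropy source.
import string
from random import SystemRandom

def generate_password(length=16):
    """Return `length` characters chosen uniformly from a mixed alphabet."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    rng = SystemRandom()  # backed by os.urandom, suitable for passwords
    return "".join(rng.choice(alphabet) for _ in range(length))

print(generate_password())  # 16 random characters — write them down or store them
```

Then put the result in your password keeper, since nobody is going to memorize it.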

Until Google sees fit to improve on the one-username/one-password architecture for all its services, that’s about the best you can do.

0 Comments, 0 Trackbacks

[/technology/web] permalink