Archive: September 2006

Saturday, September 30, 2006

New YSM Features: Ad Testing and Quality Index

Nothing particularly new here, and they haven’t launched yet, but in their September “On Target” advertiser newsletter, Yahoo outlines forthcoming features for their ad platform: ad testing and the quality index. For experienced advertisers, the descriptions will sound almost identical to similar Google features (although we’d guess that Google’s quality measures are more sophisticated). Note in the ad testing explanation that you’ll be able to run multiple ads against all the keywords in your “ad group”. Ad group? Hmmm…. The option to display the “best” (read: most profitable for Yahoo) ad sooner, or run them all equally to maintain control over the testing process at your end, also sounds identical to Google’s approach. Well, for this AdWords addict, the familiar functionality will be welcome.

Perhaps the publicity is a sign that the rollout is on schedule and that many accounts will indeed be upgraded by year-end?

Posted by Andrew Goodman

 

Friday, September 29, 2006

New Traffick Article Posted: “Paid To Read” On FBI’s Radar

Today we have an interesting article by guest writer Detlev Johnson about the “paid to read” crowd, click fraud, and the FBI. Are the feds finally zeroing in on this hard-to-prove type of fraud?

Read the article, “Paid To Read” On FBI’s Radar, but please — don’t click the AdSense ads! 😉

Posted by Cory Kleinschmidt

 

Local Search Surging; Google, Yahoo Winning

A comScore report (via Battelle) shows that local search usage is enjoying strong organic growth, up 43% over the same period last year. Clearly this is because local search is a natural extension of what users are already interested in; they didn’t need any convincing. The strength of their mapping products (in Google’s case) or their verticals (in Yahoo’s case) has contributed to user loyalty here. But in general, these companies’ superiority in information aggregation and their ability to innovate and generate rapid feedback cycles on a dime are driving the growth.

As we explained here 18 months ago, “info aggregators like Google and Yahoo” are better placed than the yellow pages companies to take advantage of the surging user interest in local search. They have now done so, and should continue to lead the pack.

Google and Yahoo are neck and neck for the lead, each garnering about 30% of the market. Apart from Microsoft, the rest find themselves far behind — even those who purport to specialize in this area. Particularly disappointing is the “Ask Network” — which presumably includes everything in IAC’s stable, including CitySearch — at 2.7%.

My prediction going forward? As a portal guy from way back, I say the portals continue to pull away, and also-ran listings players will continue to be marginal. There is clearly going to be underlying strength in content-rich verticals — especially user-generated reviews and the like — which provide the meat on the bones of navigation, so that will continue to be an area for innovation and rapid growth (and potential merger and acquisition activity). Locally-oriented blogs (of high quality) also seem likely to see additional waves of growth.

Posted by Andrew Goodman

 

Thursday, September 28, 2006

Supplemental Index Gone Wild?

“Disorganizing the world’s information and making it difficult to access.” That wouldn’t be a very good mission statement for a search engine, would it?

We use Google’s own Blogger product for this (PR 7, in business since 1999) website. We publish only actual content and make about a dollar fifty a year in ad revenue. In short, we’re not spamming the engine just because we post frequently, so WTH? (That’s “polite” for WTF.)

A bunch of pages from this website (the top couple of SERPs here, for example) now seem to be in Google’s supplemental index and are thus harder to find in search results. Why? I’ve heard other bloggers mention similar problems. Apparently blogs make it easier for spammers to publish, so the rest of us are obviously suspect. Grrrr….

Or is it that blogs create too many “orphaned” pages, with no links? This shouldn’t be true, as the linkage is built right into the archiving, and again, since it’s Google’s own blog product, they should have a pretty good idea of how that works. If you post 30 times a week, what are the odds that someone external to you will always link to everything and say “great post!”? Is that the only measure of relevance? Should we be forced to engage in stealth linking campaigns for every third post just to keep them out of supplemental?

Honestly, two-year-old posts from this blog should *never* go into supplemental. Why would they? Did something change? They were good enough to index before, so what’s wrong with ’em now?

What’s maddening is that when we contrive to publish certain posts as if they are “articles,” they tend to rank better. Anytime a search engine’s policy makes it useful to come up with such contrivances, it’s really not doing its job properly. A bunch of “well linked short articles” shouldn’t rank any better than blog entries. Again, the idea of blogs is a good one – they help people publish without hassle.

The decision to publish something as an “article” rather than just lying back and letting the blog do its job as a superior content management system (well, I’m using Blogger, so I wouldn’t quite say “superior,” but convenient and adequate) is not a “relevancy affecting” issue; it’s merely a content management decision. Is blog software so terrible as a content management solution? Of course not! It was invented precisely as a more accessible form of content management.

One reason content finds its way into supplemental is “duplicate content”. Sometimes we allow others to republish our stuff (though rarely). But that’s not the only issue. I wish I knew what the real issue was. Likely, it comes down to the sheer volume of spam, link-farm-that-isn’t-a-link-farm–honest!, and scraped crap that gets thrown at Google on a daily basis, which means a lot of stuff is getting routed into Supplemental. I just fail to see how a single post on an older, trusted site, using Blogger, would meet that fate. There’s a 50% chance those posts might be useful to at least one searcher in the future, possibly even the President of the United States. There are many sites where that chance is closer to 0%… as in, well below 0.01%.

On a related note, the hack published over at SEOmoz that can help you discover how many pages you have in Supplemental doesn’t seem to be working anymore.

Bin Laden. Viagra. Hot Russian Brides. Peace out.

Posted by Andrew Goodman

 

Tuesday, September 26, 2006

Google Advertising Platform: Winning Battles to Win the War… (or, Why My Pain is Google’s Gain)

The formula for Google AdWords ad rank changed significantly in August 2005. As I’m discovering, there are still many advertisers who simply aren’t aware of this. Many have taken a leave of absence from paying attention to the details of paid search, or from spending on it at all. I regularly encounter new clients who shut down their accounts 1-2 years ago because they weren’t profitable, but who are now realizing they can no longer get by with “all organic,” especially as the holiday season hits full bore.

For those who still haven’t combed over the extensive updates I’ve provided to paid subscribers, including this July 2006 newsletter update with the article “New Quality Score Formula & Google’s Ulterior Motives,” [subscription required – to be a subscriber you need to purchase the Google AdWords Handbook package], now might be a good time.

The ad rank formula in the past (CTR × Max CPC = ad rank, subject to additional editorial rules and a minimum CTR threshold) was a minor black box because you couldn’t see competitors’ bids and didn’t know what their clickthrough rates were.
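
As a back-of-the-envelope illustration – my sketch of the old math, not Google’s actual code, and the 0.5% disabling threshold below is an assumed placeholder – the old formula meant a strong CTR could beat a much bigger bid, and a chronically weak CTR got a keyword disabled outright:

```python
# Purely illustrative sketch of the pre-August-2005 AdWords ranking math
# (not Google's actual code). Ad rank was roughly CTR x Max CPC, and keywords
# below a minimum CTR threshold were disabled; 0.5% here is an assumed figure.

def old_ad_rank(ctr, max_cpc, min_ctr=0.005):
    """Return a rank score, or None if the keyword would be disabled."""
    if ctr < min_ctr:
        return None  # under the old regime, chronically low-CTR keywords got shut off
    return ctr * max_cpc

# A cheap bid with a strong CTR could outrank an expensive bid with a weak one:
print(old_ad_rank(0.040, 0.50))  # 0.02
print(old_ad_rank(0.008, 2.00))  # 0.016
```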

The new regime – dubbed Quality-Based Bidding (QBB) – is essentially a giant hopper that Google can use to (a) throw in all sorts of priorities in different combinations to achieve various business objectives; (b) optimize their revenues; and (c) make search engine users happier. (b) and (c) are distinct enough to be worth highlighting, but they are just subsets of (a). The black box got significantly blacker.

One of Google’s business objectives is to increase advertiser confidence, especially among deep-pocketed, big business advertisers, many of whom haven’t gone “all in” yet. That’s still secondary to preserving the user experience. But in many cases, “quality-based” initiatives have the exact same effect on big potential advertisers as they do on ordinary users — they increase confidence in the medium. Google has taken a systematic approach to removing “chintzy-looking” ads that anger or worry both of these major stakeholders. They do it largely with algorithms that are being refined all the time, and for those heavily impacted, it takes savvy to get around those algorithms, if it can be done at all.

That creates very high minimum bids on some kinds of keywords in some kinds of accounts. Sometimes, an advertiser will give up in the face of this – but other times, we’ll actually raise our bids. It’s not a pure cash grab for Google, since the long term goal (increased big-advertiser community confidence) is the main reason for all this, but it doesn’t hurt their revenues either. One thing that’s being overlooked is the ongoing revenue bonus that comes from the new “keywords are never disabled, they can always be activated with a higher bid” regime. It sucks to bid $2.00 when you would love to keep a keyword “alive” at 20 cents, as you could try to do under the old system. Then again, all those attempts to keep low-CTR keywords alive, ultimately failing, were a huge waste of everyone’s time. I’m running a successful lead generation campaign right now where some cost-effective keywords are garnering clickthrough rates below 0.1%. Try doing that under the old system. Sure, it doesn’t feel like a bargain because I’m bidding near $2.00 for keywords, with an overall average CPC of $0.68. But it’s a campaign that works – that’s the bottom line.
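
To put the “never disabled, just a higher minimum bid” mechanic in concrete terms – this is purely my sketch of the idea, with invented numbers, since the real Quality Score formula is a black box – the effect is roughly an inverse relationship between a keyword’s quality score and the minimum bid Google will accept for it:

```python
# Hedged sketch of the quality-based regime described above. Google's actual
# Quality Score inputs and weights are opaque; the function shape and every
# number here are invented purely to illustrate the mechanic.

def minimum_bid(quality_score, base_bid=0.05):
    """A low-quality keyword is never disabled; it just demands a higher minimum bid."""
    return base_bid / quality_score

# A well-scored keyword stays cheap; a poorly scored one stays "active",
# but only at a price -- the "bid $2.00 or leave it sitting" situation above.
print(round(minimum_bid(1.0), 2))    # 0.05
print(round(minimum_bid(0.025), 2))  # 2.0
```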

That’s why I’m a bit disappointed in Henry Blodget’s recent analysis that boils down Yahoo’s woes to an overall ad slowdown; he puts forth the claim that this will affect Google equally. Google not only has a lot of breadth in its revenue base across wide swaths of keywords, but also robust international diversification. Most of all, the systematic plan to raise the confidence of deep-pocketed advertisers – not by “wooing” or “selling” them, but by focusing on fundamentals and quality – needs to be understood by analysts.

I don’t know or care what will happen quarter to quarter, but this long term plan is part of what Google has secretly tossed into the Quality-Based Hopper (QBH). They’ll continue to tweak that formula behind analysts’ backs. And from here it looks like their lead on Yahoo is now 2-3 years wide. Ad slowdowns affect everyone equally? Maybe in 2000, when nearly nothing made sense. We’ve iterated a lot since then.

Posted by Andrew Goodman

 

Sad, and Sadder

According to Alexa.com “Movers and Shakers,” traffic to annanicole.com is up 7,000% since the last report, and traffic to VampireFreaks.com is up 340%. There’s some corroboration on the first, as last week’s Google Zeitgeist has “anna nicole smith” as the top gaining query, but no mention of the second at Zeitgeist. “Steve Irwin” was the top gaining query two weeks in a row on Zeitgeist.

Posted by Andrew Goodman

 

Monday, September 25, 2006

This Week on the Airwaves

Just a wee plug for a couple of episodes this week:

I’ll be on the “Shoemoney” Net Income show (Shoemoney is Jeremy Shoemaker, and he has a co-host named Andrea) on webmasterradiofm, tomorrow (Tuesday), live at 5:00 Central Time. For a little background on Jeremy, check out this interview. As a former appliance salesman, I suppose he’d rather “get his money for nothing and his chicks for free” than “moving those microwave ovens”… is a new theme song for his show in order? … Anyway, tune in live or catch up later by downloading the podcast.

On Thursday, I’ll be with Michael McDerment of Freshbooks for round 2 of a 2-parter on PPC advertising. In last week’s teleseminar, Freshbooks customers and other interested listeners fired away with a variety of relatively basic questions. We’ll be back for more Thursday at 1:00 Eastern, for the advanced version of same. Join us. (Scroll down to the bottom of the same page to download the podcast of the first one, done last week.) Mike has assembled quite a lineup here, and he’s Mr. Smooth behind the mike as the interviewer.

Posted by Andrew Goodman

 

More on Third-Party Tagging: Next Stop, Google Groups?

I wrote earlier about Google’s effort to enlist users to tag images, in part through a competitive game that offers an incentive system. Notwithstanding the likelihood that participation in the mind-numbing guessing game will slow (I’m happy to report I just placed 53rd overall in my first round of play today, and can’t for the life of me think why I’d go back for another), it makes you think about all the stuff out there that still needs to be tagged… and how that’s going to happen.

You can see how this would have wider applications, even just within Google’s stable of properties.

Newsgroup postings on Google Groups, for example, aren’t tagged in that Web 2.0 fashion yet. Many useful posts don’t always come up in a search, or would at least be easier to find if users tagged them with a variety of related tags. The less onus on the user to come up with the perfect search query, the better… and when it comes to particularly useful newsgroup posts, the community can help. To reiterate, that model is not too much different from how the community (sometimes unwittingly) helped to create a pretty accurate third-party tagging and evaluating system by linking to “useful or related pages and sites” in the hyperlinked environment, often by including relevant anchor text or relevant nearby text. The usefulness of this has diminished somewhat due to “gaming” (search engine optimization tactics), but it’s still considered foundational to what makes the web tick.

The newer forms of third-party classification are, to a large extent, simply extending that age-old hyperlinks-with-relevant-anchor-text model. It was actually pretty clever for search engines, notably Google and Teoma, to glom onto this “world of third party opinion via linkage” as the dominant approach to “metadata,” replacing the old kinds of keyword metadata (which were and are, ironically, called keyword meta tags). The key differences between third-party hyperlinking behavior and first-party metatag decisions were (1) the “party” – third is more credible than first; (2) the ability to classify stuff even when the creators are too lazy to do so themselves… and to do so without having to rely on formal editors or gatekeepers, but rather the wider community. As we all know, that led to contrived activity (such as linking “campaigns”) that carries on to this day, but that’s another chapter.

The scope of this activity seems to rest heavily on an incentive system, but it needn’t. The particularly useful stuff can get tagged (doesn’t the notable content already always get that extra oomph from the user community, whether that means it gets linked to, bookmarked, or dugg?). The rest needn’t be. Everyone wins.

Posted by Andrew Goodman

 

Sunday, September 24, 2006

Competitive Blogosphere Research Gone Wild!

Let’s start the week off with a bang. Here’s what I think, in general:

  • Best Buy sucks.
  • General Motors sucks. I hate General Motors!
  • Apple Corp. is evil, rotten, hypocritical, and boy, do I ever hate that they have those funny ads right now, because I really like that guy who’s the PC guy. That Mac dude is insipid.
  • Google – worst company in the world!
  • Ben & Jerry’s and Häagen-Dazs both have unacceptably high rat-hair content in their Rocky Road formulations. Plus, the product is preserved with formaldehyde.
  • Toledo, Ohio is for losers.
  • Barry Schwartz has been married seven times, and murdered his third, fifth, and sixth wives.
  • Barry Diller owns several weapons of mass destruction
  • Burger King sucks.
  • Tim Horton’s sucks.
  • The Canadian Football League sucks.
  • Hewlett-Packard is governed by out-of-touch executives. And their printers suck.

These days, it seems that the best way to get curious onlookers to visit your website – and sometimes, to leave a comment – is to post something about a large company. Sometimes with lightning quickness, the public relations departments of such companies (or their suppliers) will light upon your blog if it says anything controversial (or wrong) about them. This must be because they’re all using state-of-the-art, proprietary competitive intelligence blogosphere research software.

Of course, I didn’t really mean any of the above. Here’s what I really think:

  • Best Buy doesn’t suck! I’ve always been pleased by Best Buy’s service. Their people are always helpful, and unlike the Canadian company Future Shop (now merged with Best Buy), they aren’t commission-happy doofuses trying to upsell you on useless warranties. You know who really does somewhat suck, though? The Source by Circuit City (which, I believe, is the Canadian version of Circuit City that replaced the old Radio Shack in Canada, which was owned by different folks than the US Radio Shack). The product selection is terrific, as you’d expect from a modern-day version of Radio Shack. Unfortunately, the clever alarm clock I bought there flat-out didn’t work. Also, on nearly every product worth more than $20, they wasted my time trying to sell me the warranty. Spare me the actuarial contortions and let me get my products and be on my way.
  • GM – don’t care much for them, but I like the looks of the new Saturn Aura. Two years ago, my wife would have agreed, but years of harping on the high-end car purchase I plan to make have apparently sunk in, and she refuses to consider this as an option now, should we someday buy a new vehicle. Not classy enough. You see, GM? This is what you’re fighting… at least Chrysler can claim they have cars with Mercedes guts (isn’t that why they have Dr. Z in the ads now?). Let’s leave it at this, GM: you don’t suck as much as Ford.
  • I think about as much of Apple as the next non-Apple-using person. Great company, but I hear they aren’t as environmentally friendly as they could be. I’m typing this on a wafer-thin Toshiba Portege – I hope they’re better. The Toshiba is great, and doesn’t suck, but it did ship to me with two dead pixels. Don’t you hate that?
  • Google is, in fact, the best company in the world.
  • I made up the ice cream stories. Also, be aware that no major beer brands are preserved with formaldehyde, although a self-appointed beer wonk knowingly told me that Molson’s products are! Dude at the Charlotte House, you know who you are. Quit making stuff up! I did the research!
  • I’ve never been to Toledo, but in corresponding with a local publication, was surprised to discover how large it is.
  • No, not *that* Barry Schwartz! The other one – that murderer guy!
  • Barry Diller did kill a butler, but as far as I know, is otherwise peaceful.
  • Burger King does suck. It’s just my opinion.
  • Tim Horton’s does not. But their coffee is overrated, and about 5% of franchisees make it watery.
  • The CFL is an acquired taste. Also, you can lose that taste. I lost it. 🙂
  • Hewlett Packard is on the mend. But that last HP printer I bought sucked. More recently I’ve been using Brother products for printing.

To the Haloscan!

Posted by Andrew Goodman

 

Friday, September 22, 2006

Froogle to Be De-Emphasized, Google Base to Be Featured?

Google tells some eBay powersellers about its plans. Look out, eBay?

If you’re a retailer of any stripe, it looks like it’s time to start uploading your content (free) to Google Base.

Although Google Base is much different from Froogle, the rollout subterfuge seems to be similar. You can upload whatever you like, and reap new customers, at no cost whatsoever. So where’s the business model? Is this merely anticompetitive, or a more complicated long term strategy?

We started discussing this over at SEM 2.0 in a thread called What is Google Base?. Got a comment? Join in! (You need to join to read & comment).

Posted by Andrew Goodman

 

Sentenced to Be Belgium’s Butler… For Now

Google lost its appeal of the decision forcing it to post a court ruling on its Google.be site. Google now seems interested in spending mucho cash to overturn the whole case, and is gathering allies in the form of news media outlets that wish to develop mutually agreeable protocols for distributing their content.

Posted by Andrew Goodman

 

Thursday, September 21, 2006

Yahoo + Facebook – When?

Wall Street analysts are telling Yahoo it should acquire Facebook. Coincidentally, reports say that Yahoo (along with Viacom) is indeed interested in Facebook, at a purchase price in the $1 billion range.

Is this a better or worse deal than acquiring Geocities for billions?

It’s hard to evaluate all those past acquisitions. I’m most comfortable in evaluating what I know about: the Overture deal. Where would Yahoo be now without that revenue, in spite of the huge lag in upgrading the platform? That deal was great. Most of the other acquisitions are much harder to evaluate.

It’s interesting to note we’re watching essentially the same game as we’ve been watching for 6-8 years: portals acquiring “users.” It’s just that today, users of some of these services are much more embedded and entrenched than some of those old acquisitions. A greeting card user (remember: Excite – Blue Mountain – huge sum – now-dead portal) isn’t a “user” at all. A member of Facebook, for now at least, is a relatively embedded, recurring member.

Posted by Andrew Goodman

 

Wednesday, September 20, 2006

Belgian Case: Goldman’s take

Google’s troubles with the Belgian litigant are tough to interpret unless you know the nuances of Belgian law, reports law professor Eric Goldman. The closest thing you’ll get to his opinion, he says, is to read his overview of the Agence France-Presse case. Goldman says that the rough takeaway from the Belgian case is that “this is not a surprising result” and that it “may not bode well for Google’s US case.”

Posted by Andrew Goodman

 

What’s Eating Yahoo

In the wake of Yahoo’s announcement of an impending softening of ad revenue, various assessments are flying around.

One huge piece of the puzzle that’s being underemphasized is international markets. Sure, Yahoo cares about these in a general sense. It’s just that from the standpoint of the ad platform, that caring isn’t translated into functionality for advertisers.

Translation: you’re damn right it matters that Panama was delayed… again.

With the current Overture (Yahoo Search Marketing) platform, you can’t target most Scandinavian countries individually. You can’t show your ads just to Canada (unless you’re advertising in French). Just to South Africa? Nope.

(What if you want to geotarget to a 300-mile radius around Toronto, an area with a population of roughly 10 million? You can do it with Google AdWords. You can’t do it with Yahoo.)
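
Radius targeting, incidentally, boils down to a simple distance test. Here’s a toy sketch for the curious – the haversine math is standard, while the show_ad wrapper and the sample coordinates are just for illustration:

```python
from math import radians, sin, cos, asin, sqrt

# Toy illustration of radius geotargeting: show the ad only if the user's
# (estimated) location falls within 300 miles of downtown Toronto.
def miles_between(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(a))  # 3959 = Earth's radius in miles

TORONTO = (43.65, -79.38)

def show_ad(user_lat, user_lon, radius_miles=300):
    return miles_between(user_lat, user_lon, *TORONTO) <= radius_miles

print(show_ad(45.42, -75.70))  # Ottawa, roughly 220 miles away -> True
print(show_ad(40.71, -74.01))  # New York City, roughly 340 miles away -> False
```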

There are 21 markets listed on the Overture home page. To be sure, they’re big markets, but in the next 100 markets, there’s a chunk of money left on the table.

Even doing US + UK is simply harder because of the platform and customer service silos involved.

It’s become glaringly obvious to all that Yahoo’s “good enough” isn’t good enough anymore – so much so that it’s hurting the company’s capacity to thrive and grow.

I guess the question is: when Yahoo (and MSN) hit their targets for platform and customer service rollouts sometime in 2007, will they have fallen even farther behind? And what kind of click volumes will they be generating by then (will search market shares stay static)?

On another topic, Yahoo keeps losing people to other “environments” like MySpace and Facebook. Wall Street seems to believe Yahoo needs to acquire Facebook. Should they? Probably. But then won’t a startup just come along to tap the market for a social-networking-site-not-hosted-by-a-big-company? 🙂

Posted by Andrew Goodman

 

Tuesday, September 19, 2006

On the Belgian News Search Refuseniks

Chris Boggs and his co-host on an online radio show tonight asked my opinion, so I replied with one. Going on limited information, on balance I believe the Belgian court’s judgment against Google has a “rather medieval” flavor to it.

Also – has no one heard of pursuing all available remedies, and/or settling out of court? There’s something petulant about both sides’ behavior here… but when you’re the much bigger guy, the right word is “arrogant.” The little guys (and the judge) appear to be the petulant ones.

I posted this as a lengthy comment over at Barry’s Search Engine Roundtable. I’m looking forward to Eric Goldman’s analysis of this case.

Posted by Andrew Goodman

 

Monday, September 18, 2006

So I Dated an Axe Murderer

PC World has a rather uptight little feature on the 25 Worst Websites, including the gem InmatesforYou.com. What, no required field for “what are you in for?” You’d think that would be of passing interest.

Speaking of verklempt, Mark Cuban’s predicting the imminent demise of YouTube. Correct me if I’m wrong, but doesn’t Cuban have a direct stake in any debate about copyright material, given recent investments? He’s got a point in there somewhere, to be sure, but I’ll wait until someone else makes it more disinterestedly.

Posted by Andrew Goodman

 

When Does the #2 Player Start to Make Sense?

Reports that Google has climbed above 60% in US search referral market share (according to Hitwise’s latest report) have buoyed Google stock.

The also-rans:

Yahoo: 22.6%
MSN: 11.6%
Ask: 3.6%

So when does this plateau? Do the others fight for the remaining share for awhile? Does #2 continue to lose share, or does a new #2 emerge? Or does #1 go to 90%, becoming a de facto standard?

It’s just an impression based on a couple of industries with dominant players/standards (no, let’s not talk Coke/Pepsi), but I tend to believe that at a certain point, the #2 option starts to make sense again, often when everyone had given up on competition.

Case #1: men’s razors/blades. Gillette, through as many iterations as one can count (Trac II, Sensor, Sensor Excel, Mach III, Mach III turbo, Mach III turbo with iTunes and universal remote, plus, now, the new ludicrous five-blade Fusion, and I’m sure I’m forgetting some), has been the clear leader. But those poor saps, Schick and whoever, never gave up. In business story after business story, we were fed the wisdom that because Gillette invested so much in R&D and in their brand, the lead was safe. But is it really necessary to invest *that* much in a shaver? Isn’t it possible to just rip off the main ideas, differentiate somewhat, get more aggressive in your marketing, and grab a good 25% of that market? And if so, isn’t it good to be #2? You’ll still get shelf space. You’ll still make outrageous profits. For the first time ever, I thought about getting a Schick razor, partly due to the ads, and the fact that the four blade concept doesn’t make me laugh. This must be because there’s no need to be loyal to one razor brand — if you’re being frugal, you wouldn’t have two different systems, but then again, if you were being frugal you’d probably find a bunch of Trac II’s in a warehouse somewhere. So… why not have a couple different ones? People have two cars and seven phones. From what I’m reading, Gillette still has 75% of the North American market for blade systems. But does it feel like maybe the tide is turning?

Case #2: Chips. The computer kind. Again, remember the incessant hype about the industry standard, Intel, being able to invest so much in R&D that it would crush rivals like AMD, reducing their market share to 1% or less. “AMD in trouble” was a common theme for many years. Then Intel made a misstep. AMD actually got a lead on them in some areas and Intel made way too many dual-core products which seem not to be that great a deal for many users (but they have to unload them, which is why Intel propaganda is creeping into the distribution channel so they can dump the higher-end chips when the lower-priced ones are just as good for most purposes). Today, AMD’s market share is above 20%, and climbing. There are worse fates.

So to search. Yahoo’s 22% seems like it’ll need to fall further, perhaps below 15%, if the “temporary long-term dominance scenario” of Google reaching 80%+ is to happen. If and when that happens, will Yahoo or someone else be there at #2? And how will recent long-term trends of Google increases be reversed?

As things stand, I don’t see it happening, for the same reason people stuck with Gillette and Intel for so long: demonstrable superiority of product in an area deemed to be mission-critical by the consumer. Google’s product is superior on many levels, when push comes to shove. We can debate this, and certainly it’s actually worse than some competitors in areas like news search or blog search, but by and large, it’s got so much that is faster and cooler, it’s way ahead. (It’s *not* ahead in many verticals, however, which is one way the others might win — if the verticals themselves turn out to be fundamentally as important as “search” in some larger sense.)

Longer term, how a #2 player really solidifies their position seems to depend on similar factors to my above two cases. (1) Affluence and interest in novelty. Search, like razors, isn’t mutually exclusive. Or, if you prefer, like chips, it’s possible to re-evaluate every few years and make a new choice, getting rid of the incumbent because it’s fallen behind. (2) Disruption. There are so many devices, distributors, and choke points emerging that new “default search tools” could easily arise. Me, I’m looking forward to a new Blackberry device, and it happens to be very Google-compatible. But that’s how the competition can come in – by exploiting niches, coming up with partnerships and new distribution methods, etc. That’s why, if we’re looking far down the road, say 5-7 years, the #2-search-engine-of-choice could well turn out to be neither Yahoo nor MSN. I’m not going to count page-view-based juggernauts like MySpace, but it is interesting to see how quickly new environments and trends do arise, so when I say 5-7 years, it could be 2-3.

At the end of the day, the argument here is essentially one that mirrors or marvels at the power of business-building technologies and reverse engineering to create knockoff businesses at much lower cost than the leaders spend. The salesman at a men’s store won me over with a charming joke that they have “architects” working on today’s garments (to the tune of $170 trousers). One soon finds that they have the same “architects” working on very similar pants that run $35, albeit with cheaper fabric.

Yep, come to think of it, given how much you can accomplish with even 5% of Google’s R&D budget, there’s no reason to think the #2 search player in five years will be any of the names on today’s leaderboard. And people will opt for them because they’re no longer convinced the leader is so great.

Posted by Andrew Goodman

 

Thursday, September 14, 2006

MySpace for Pets

Via Greg Linden, I learn that Dogster just received $1 million in funding. They also have Catster. I disagree that it means we’re in a bubble, though. When they release Ratster… then, we’ll be in a bubble.

Posted by Andrew Goodman

 

Tuesday, September 12, 2006

Paypal Cart Icon on Yahoo Search Ads

Not sure what to make of this! Google shows their Google Checkout item next to the ads of participating retailers, so Yahoo starts doing that in concert with Paypal? Google promotes Google, so Yahoo decides to be best friends with eBay? Oh, the permutations.

Posted by Andrew Goodman

 

Monday, September 11, 2006

Aeron Founder Dies, But Chairs Live On


(via NYT, via Danny) Bill Stumpf, the creator of the Aeron chair, has died.

How silly, in retrospect, that this wonderful, creative bit of ergonomics was hailed as a symbol of dot com excess. Chairs matter, man! I suppose the idea of overspending on furniture for “mere workers” is seen as crazy spending that isn’t what startups should be about. True indeed, but most of the people doing the critiquing don’t sit and code, or write, in the exact same spot, for 12 hours a day. Maybe no one should. But if you’re going to do that, chairs matter.

Like Danny, I’d always been saddled with crapola office furniture. Back in grad school I picked up an Ikea (I think) turquoise reclining desk chair, with all the padding, and prickly upholstery. I still have that sucker, actually. The plastic on the arms is flaking off, that fabric is so freakin’ hot on your back, and the ergonomics suck, so you run out of gas faster.

But that doesn’t matter too much… because….

So anyone who’s read my book knows I use chairs as an example of a search query (“Aeron-like”). I promised myself in the book that as soon as I “got rich off it” (ha) I would invest in a beautiful Aeron-like Mirra chair (pictured above), the slightly less expensive, even cooler, highly ergonomic new version of the Aeron.

I took longer than planned deciding, and no that book didn’t make me rich, but I did eventually find my way to Backs Etc., enjoyed the fine customer service, and ordered a Mirra, customized with tangerine fabric and dark grey plastic.

The coolness on the back is something to behold. The firm ergonomics (or as Herman Miller puts it: “AireWeave suspension. This elastomeric suspension seat follows the contours of the body, distributes pressure evenly, and provides aeration.” ) allow me to work considerably longer without getting tired. Looks like my “employer” (me) should have figured out the ROI on that chair a long time ago. Dot com bubble, indeed.

Slight problem: like better monitors and better desktop computers, chairs can’t be taken home, to your home office, so if you want the stuff in both places, you need the stuff in both places. That means soon I’ll need another Mirra. Cough, well maybe having two Mirra chairs is a bit of dot com excess. 🙂

Posted by Andrew Goodman

 

Site Clinic: What Our Heroes, TripAdVisor, Craigslist, Google, Facebook, and PlentyOfFish Did

It’s one thing to talk about site design for companies with a specific sales purpose in mind. But what if you’ve got a startup that is more of a “B2C web property” that is supposed to scale up fast? Is it useful at all to look at the companies you admire, and to learn from the way they deal with the user experience? I think it is, but it’s not always clear how you should follow them.

Because (big caveat here): some companies on our dot-com hero list — maybe even most — caught lightning in a bottle and just got big because of circumstances, and may have redesigned their sites after the fact.

However, looking at most of my examples, I see pretty clear precedents. Most of them designed their home pages and navigational experiences in such a way that they made harsh decisions to relegate secondary goals to the trashbin so the user would be forced to understand how to *primarily* interact with the site. The loudest and best shouting about that phenomenon still comes from Seth Godin’s little red book – the Big Red Fez.

Remember the failures, too. If you watched companies like AltaVista and Excite fail in the dot com bubble, you’re not only aware of their awful, cluttered portal layouts, but you can twig to the psychology that built them. Essentially: too many corporate goals competing for the user’s attention.

So… the biggest example everyone’s familiar with is the Google home page. No need for a screen shot to show you this one. They kicked the other search engines’ butt in part because of that clean layout and the speed of the search. They did search while others forgot to focus on it. The founders stumbled on that because they didn’t do design or HTML. It’s a cute story. But is it a relevant one? YES! What’s central to this user experience? Search, natch.

Next example: Craigslist (it recognizes my IP so I get toronto.craigslist.com in my browser). I don’t know what this is supposed to be. Yahoo, circa 1981? 😉 Eccch, right? Well, it certainly hasn’t hurt them. They became one of the largest classifieds sites in the world. You can go on archive.org to check out past versions, but here’s a screen shot of today’s Craigslist. I would argue that what is absolutely central to this user experience is: search and navigation – either you click or you type in a search to find what you need. For example, if you want a chair within a certain price range, you can type “chair” and narrow down the price range.

Movin’ on: Facebook, a social networking site. I don’t use it, obviously. I hear it’s for college students. But given the number of mentions, you have to assume they’re doing something right. What is that something? Phew! Maybe I shouldn’t randomly select these examples! That home page is very, very clean! In fact it appears that if you don’t want to be a member, they don’t want you there. Interesting. Facebook’s demographic, the 18-25 crowd, is very net savvy, so perhaps we can see this one as an exception. But notice how they didn’t feel the need to cover the home page in all kinds of come-ons. They recognize that it’s a viral service, so they go for a totally different – minimalist – look.

A site for the same demographic, RateMyProfessors.com, is reportedly growing fast. The home page is fairly clean but could be cleaner. The explanation of what the site is about is clear. Again, because of the demographic, this service is more likely to go viral, creating a virtuous circle. They can boast 5.7 million professor reviews. And they’re using sex now. They have a “hotness rating” along with other factors (yikes), with comments like “this is the best looking professor in North America” not uncommon (and I thought my plan to tour 2,000 golf courses to find the best-looking beer cart girl in Ontario was ambitious). This site’s home page certainly doesn’t give away the reasons for its success, but if there is anything you can say, it’s that the overall purpose is clear and that the file size of the page is small because of the very simple design, so it loads fast. This site has convinced me of the need for academic tenure, peer-reviewed journals, and other quality measures in higher education that do not emanate from the student “body.” Let’s face it, most professors are not “hot,” they’re smart. And the ones that give tough grades do so because they are enforcing standards. Let’s keep it that way, for the good of society. 🙂

PlentyOfFish, a dating service reviewed here previously, doesn’t seem to be all that innovative or restrained in its interface design. So what explains the success? Likely the business model in this case. The case is being made for the advantages of a completely free, ad-supported personals site. I get the feeling Markus knows his audience pretty well in that the home page highlights “women seeking men” only, and in that sense it’s a come-on that resembles ubiquitous banner ads from (“for pay”) sites like AdultFriendFinder.com. In spite of some clutter, the idea of what you’re expected to do on the site is clear, because the category is well traveled. And the searchability factor isn’t lacking, although it could be done much better. But for this site, the owner’s main problem has been keeping the site up in the face of massive growth. A problem everyone would like to have.

Finally, consider TripAdvisor.com. Again, this home page is not radically clean like that of Flickr, but it is attractive and quite focused. What I find interesting is that they have no fewer than three different ways of searching, right there on the home page. Instead of highlighting specific bits of content, or hammering you with explanatory text and offers, they show you, well, a lot of white boxes that seem to say — this is your experience and there’s a lot of information inside: go nuts.

That being said, there is a ton of information below the fold. I think Tripadvisor gets away with this because they already have a large, loyal audience. Would lab testing uphold this model as far as how a new user engages with the site? I’m not sure. That’s an awful lot of information. TripAdvisor gets traffic driven to its internal pages from a variety of sources, given that they sold out to Barry Diller’s IAC Interactive for over $100 million (so would get traffic flows from Ask, CitySearch, and so forth). The business model is intelligent in that advertisers buy high cost listings on internal pages. In essence, my argument here is that traffic needs to come from somewhere, and if you’re venture-backed and form partnerships, you may be better placed to generate visits to your relevant content at relatively low cost while selling sponsorships on the site at relatively high cost. Without “sweet deals” or organic search traffic to drive huge volumes of undervalued visits to the site, it’s more likely that the “clean metaphor” will delight users and lead to faster growth, IMHO.

Verdict: mixed. Clearly some of the best sites in the public’s mind have search and navigation at their core when you arrive at the home page. From there, the user continues to use the site to the point of it being addictive. These home pages are less about presenting information in a particular way than they are about offering clues as to how you’re going to access that information in the particular vertical category the site is about. So a home page is not about persuasion, necessarily, in this B2C consumer property realm – it’s not about making a sale. It might be in a different kind of business (long copy might work). And it’s not about multiple corporate priorities and all kinds of do this, do that, look at this ad, join this, etc. messages, though one contest or one big ad might be OK. Bottom line: great viral growth stories tend to embrace minimalism, and they tend to play up the metaphor of search and navigation almost to the point of what some users might think is obsession. Some well-funded companies with strong business development plans are able to negotiate means of driving underpriced traffic to a site, while selling listings at a higher price (this is why all the kerfuffle about “click arbitrage” seems to be overblown: many businesses have grown through “click arbitrage” and continue to be built around it).

In the past, quite a few companies were built up quickly simply through the grace of free mass organic Google referrals. As spaces get cluttered and large media companies spend in multiple channels in order to indirectly maintain their organic lead, this gets harder to achieve for a startup unless something goes a bit viral.

But the bottom line for startups in consumer content must be: life’s a search. Especially online. Why fight that?

Posted by Andrew Goodman

 

Thursday, September 07, 2006

Top Digger Freaks Out, Leaves

As I mentioned recently, the output pattern on many community-built content sites and recommendation engines appears to skew heavily towards a cadre of obsessive contributors.

Via Threadwatch, I learned that Digg revamped its algorithm so it doesn’t skew towards the ‘take’ of top Diggers.

Next thing you know, the #1 Digger goes ballistic and insults the founder of the company.

You give your heart and soul to something someone else profits from – it leads to heartache. Moral? Maybe, don’t get “married” to something like a social bookmarking site. Moral 2: incentives still matter, as varied and non-pecuniary as they may be in some cases. Moral 3: some people’s incentives for obsessively contributing to something for “free” are not honorable — hence, Digg’s algorithmic shift, no doubt.

Posted by Andrew Goodman

 

Lazy, Eh?

Ken Schafer over at One Degree could spend half his life chronicling maddening Canadian corporate website gaffes. Luckily though, I’m pitching in, so he’ll have time for his day job.

Check out www.kraft.ca. It’s not that they don’t know about, and can’t redirect you to, the actual site, www.kraftcanada.com; it’s just that they haven’t bothered. Hey, you can cut and paste that URL, right? Unless you are like 33.8% of visitors to that page, who will simply leave thinking the site is broken.

No, it never redirects in any of the major browsers. 🙂
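
For what it’s worth, the missing behavior is trivial to supply. Here’s a minimal sketch in Python of what kraft.ca ought to do – answer every request with a permanent (301) redirect to www.kraftcanada.com. The port and handler name are just for the example; in practice this would be one line of web server configuration rather than an application.

```python
# Minimal sketch of the missing redirect: send every request for this host to
# the real site with a permanent (301) redirect. Port 8080 is used here only
# so the example runs without special privileges.
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectToKraftCanada(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", "http://www.kraftcanada.com" + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectToKraftCanada).serve_forever()
```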

Incidentally, this nearly-blank domain/page/site has a PageRank of 5! Sweet!

Posted by Andrew Goodman

 

Wednesday, September 06, 2006

Toronto Search Marketing Seminar, Oct. 5

Under the auspices of the Canadian Marketing Association: a search marketing seminar coming up at the Massey Mansion on Jarvis St. I’ll be joining instructors Steve Mast and Kevin Jackson to contribute about an hour on – you guessed it – the latest, greatest info on paid search strategies.

Posted by Andrew Goodman

 

Tuesday, September 05, 2006

Lead Generation Conundrums

Our client list over at Page Zero is varied. One of the ways we can most consistently add value is in custom work driving paid search traffic and helping with site design and copywriting for “complex” sales, such as B2B campaigns with long sales cycles. The question is, when designing the website, planning the sales strategy, and tweaking landing pages, how *exactly* should one go about it? The debates can be endless, and it’s good to have principles in hand rather than simply falling back on the “just test it” mantra (which does make sense too).

One approach to getting prospects to trust you (and to offer contact info), of course, is to offer a white paper. Again, though — how to produce it, what tone do you take, how to promote it? I remember when I produced an ebook (not exactly a white paper, because I charge for it and it doesn’t offer some of the things that white papers do) I was so thankful I could fall back on a resource from someone who’d done it before (in particular, Marcia Yudkin).

Now, I’m thankful again! In the midst of some of these B2B conundrums, I recently read Writing White Papers, by enterprise “B2B” marketing expert Michael Stelzner. The book is amazingly comprehensive, covering every aspect of producing and marketing white papers. I particularly like the stipulations as to tone; he explains that today’s sophisticated enterprise customer wants you to sell to them without being “salesy.” No one minds an intelligent latent sales pitch. But that means paying attention to how much you offer in return for the leads you seek.

Anyway, back to our website design and testing conundrums, I’m looking forward to tapping Mike for ongoing tips to augment our own expertise… expertise he demonstrates in this timely post on his blog, comparing white papers to a “demo” in the world of gaming. Give interested prospects enough to “play with,” and they’ll give up their contact info.

It goes without saying that being extra forthright about how much email contact they’ll receive, in what form, is a big part of the mix. Disclose your intentions fully, and don’t mislead prospects, in order to avoid a bad rap in the industry.

It’s perhaps not coincidental that this type of thinking has found its way into Google’s assessments of Landing Page Quality for AdWords ranking purposes… not all “users” are created equal, but the kind of respect accorded to high-end business customers is also worth offering to B2C customers too.

Posted by Andrew Goodman

 

Whaddya Know, Tucows Bought Kiko

Remember Kiko, the online calendar startup? Once the tagline to an obscure Dennis Miller joke, today, a part of Tucows’ offering to corporate email customers.

Kiko got a lot of attention — some of it negative — for putting the company up on eBay so the founders could wind down and move on to other projects. Many saw it as an example of a non-business being funded: a feature, not a company. Then, a buyer came along, paying $285,000 to acquire the code for the calendar app. Turns out it was Tucows, a publicly traded tech company we post about from time to time here as they’re just down the street. My good friend Elliot Noss blogged at length about the reasons this Ajaxy app was a great deal for his company right now. What he conspicuously left out was the added bonus of free PR. “Hey, we bought our calendar app on eBay for $285,000” is way faster, more fun, and better for publicizing your product than hiring, hunkering down, and building the confounded thing from scratch.

The most fun I’ve had this week is finding another parking spot on Craigslist. Hopefully I’ll be able to improve on that.

Posted by Andrew Goodman

 

Monday, September 04, 2006

Taxonomy for Fun and (Google’s) Profit? Community Image Tagging

Google Image Labeler is eliciting intelligent commentary around the virtual campfire, as one might expect.

It seems Google needs to improve the quality of its Image Search by tagging the images. What better way to go about it than luring an army of volunteer taggers? Hey, where have we heard this story before? Remember ODP?

Accurately describing elements of an image in few words isn’t as complex as editing directory categories.

Today, sites like Flickr and Youtube thrive on tagging. First, contributors of uploaded images, and later, other members of the community, tag their material as well as they can. It’s a rough and ready form of classification that’s attracted much interest and much pro & con, parallel with general debate over whether Web 2.0 is really anything, let alone an advance over what came before.

Well, it is an advance, or Google wouldn’t be doing this. Tags help users find images, there’s no doubt about that.

And now begins the great experiment with different incentive systems and value systems. It looks as if properties like Flickr and Youtube have pretty accurate taggers, perhaps because those engaged in tagging genuinely get it and are genuinely trying to be helpful. At this juncture, by contrast, Google seems to be running into the odd problem with insincere and malicious taggers, at least if the “editorial comment” type tags I’m seeing on Google Video are any indication. But the random “double-verification” approach to tagging is ingenious compared to hierarchical command-and-control systems. Where editors and their “bosses” know one another and can rig up a corruption scheme, this system seems to pair editors up with people they don’t know and cannot know. That isolates cheaters, Panopticon-like. I’m going to give it a try, just to check it out.

If accurate tagging requires the equivalent of professional editorial staff, but you’re running it like a kind of community effort involving nebulous rewards, because professional staff could never get to everything… it seems likely that odd usage/contribution patterns will arise, as they have before. In ODP, there were “meta” editors and high-output editors who developed expertise and did much more work than most of the rest, but also ran the risk of developing blinders of sorts. *Why* did they do so much more than others?

When it comes to Wikipedia, the same phenomenon has occurred. The “spontaneous outpouring of community input” is driven by a cadre of prolific editors, followed by a long tail of occasional helpers. What does it all mean? I’m not sure, except that it speaks to the competitiveness of some people, even when trying to win at something that doesn’t really benefit them, and benefits a “community” in a way that is yet unproven.

In this case the mega-taggers probably can’t wreck anything — especially with the random competitive tagging method tied to points — so the end result is better search. If Google Video tags currently stink, they can perhaps assign “points” to those folks who want to go in and clean up all those tags too. Google, of course, profits, but there is a certain inherent fascination with watching something work better as taggers get involved. Then again, I’m not 100% sure it’s worth anyone’s time to accurately tag a Japanese teenager singing karaoke Barbie Girl.

We debated this subject here way back with the ODP case. To get truly professional editorial results consistently, in some cases you have to pay people; in other cases, you don’t. With a poorly-thought-out incentive system (quality depends on commitment and skill level as well as incentives and sanctions for bad behavior), alternative (corrupt) compensation schemes can arise.

So, some thinking had to go into it. Google doesn’t have a real “vertical” or “spontaneous face to face society” feel to it, but it does of course have the advantage of a lot of money and a willingness to experiment with various filtering and incentive systems. So – it looks like a sawoff. They can find a way to overcome the shortcoming of their bigness.

Either way, tagging is moving search forward. Probably the most intriguing nascent tagging experiment, for me, is Amazon’s. Books are being tagged as we speak, first by authors, then by prolific reviewers… and… later, everyone else? Or not? Regardless, the result seems to be a parallel form of taxonomy that arises spontaneously out of community effort (assuming reasonable expertise in the community), as opposed to getting the Library of Congress category right, or some other method that might have existed in the past. From a tag, bringing up all known books about “beanstalks” *tagged as such* is only a click away. That’s not the same as doing a raw keyword search for beanstalks. Tagging is shades of past information science efforts, obviously, but it’s happening here and now in a specific kind of way, and it would be a mistake to dismiss its impact.
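
To see that difference in miniature, here’s a toy sketch contrasting a tag lookup with a raw keyword search; the two “books” below are invented for illustration.

```python
# Toy contrast between tag lookup and raw keyword search (invented data).
books = [
    {"title": "Jack and the Beanstalk",
     "text": "A boy climbs a giant beanstalk grown from magic beans.",
     "tags": {"beanstalks", "fairy tales"}},
    {"title": "Climbing Plants for Small Gardens",
     "text": "Training beanstalks, vines, and sweet peas up a trellis.",
     "tags": {"gardening"}},
]

# Tag lookup: only what someone deliberately classified under the tag.
tagged = [b["title"] for b in books if "beanstalks" in b["tags"]]

# Raw keyword search: anything whose text happens to mention the word.
keyword = [b["title"] for b in books if "beanstalk" in b["text"].lower()]

print(tagged)   # ['Jack and the Beanstalk']
print(keyword)  # ['Jack and the Beanstalk', 'Climbing Plants for Small Gardens']
```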

One more thought: vis-a-vis PageRank and anchor text… hasn’t linking always been like tagging? It’s a mistake to say that Google eschewed metadata because they didn’t look at meta keyword tags. They were just looking at different tags, and still do. 🙂 For a long, long time, a high proportion of website publishers voluntarily “tagged” their links with something a little more informative than “click here”… just because the web gurus said it was a good thing to do.

Edit: after playing the “game,” I ran across this excellent post on O’Reilly Radar, which explains that Image Labeler is based on Prof. Luis von Ahn’s “ESP Game”. On Search Engine Watch, Danny Sullivan confirms this in a Postscript, having heard back directly from Prof. von Ahn. As an aid to tagging images, it’s clear to me as a player that the type of “ESP” involved in playing the game optimally is not going to lead all by itself to the kind of thorough tagging we see on other sites. The best way to get the most points is to match your partner’s labels as many times as possible in a timed session. And the only way to do that is to quickly type in the least complex words possible. Sure, Google might tuck away your unmatched, more complex words, but to get the most points, you and your partners will soon learn that you should aim for the least complicated word possible to describe some part of the photo: e.g., ocean, sky, people, woman, man, office, desk, etc. Screen shots of something complicated, such as a spreadsheet, are most easily matched when partners type in the heading of a column or any prominent word in the screenshot. A complex (but known) type of logo will be best matched with your partner if you both type in “logo.” And so on.
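
Here’s a toy rendering of that dynamic – my reading of the game’s rules, not the real Image Labeler code – in which only the labels both players typed independently earn points, which is exactly why generic words win:

```python
# Toy version of the ESP-style matching described above (not the real Image
# Labeler code): only labels that both players typed independently count.
def play_round(labels_a, labels_b, points_per_match=100):
    matches = sorted(set(labels_a) & set(labels_b))
    return matches, points_per_match * len(matches)

# Simple, generic words are far more likely to coincide than precise ones:
player_a = {"woman", "office", "desk", "Herman Miller Mirra chair"}
player_b = {"woman", "desk", "computer", "chair"}
print(play_round(player_a, player_b))  # (['desk', 'woman'], 200)
```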

On a final, final note: I suppose “tagging” is slang for “graffiti.” This kind of tagging is something like the opposite of graffiti, especially when the sober, straight taggers are assigned to clean up the “Google Video Graffiti.”

Posted by Andrew Goodman

 

Friday, September 01, 2006

Thanks, Mitch, for a Great Geek Dinner

Barcamps, unconferences, pubcons… and now, geek dinners!

Mitch Joel is a sought-after speaker, online impresario extraordinaire, and founder of Twist Image, an online marketing powerhouse in Montreal. As a result of his boundless energy, Mitch knows a lot of people! Who else could fire out a quick email over the weekend and have 17 world-class geeks show up for a steakhouse dinner on Tuesday night in August?

A couple of the main takeaways for me, from the raucous chatter:

1. Authenticity in filtering. My tablemates, Leesa Barnes, Michael Leblanc, Goody Gibson, and others, got embroiled in a deep discussion about the credibility of the crowd – the various bumps in the road towards developing good online filters and giving voice to consumers. No simple regurgitation of homilies here. There are many barriers to the full realization of crowd wisdom. What if some of the main consumer opinion sites get acquired and subsumed? Are the various user rating systems *really* doing a good job of sorting out lies from truth? This conversation reminded me a bit of ongoing and nuanced themes emanating from the likes of Steve Rubel (recall, Rubel, the PR expert, doesn’t believe that fictional characters can credibly blog, and is constantly pushing the issue of blogger authenticity).

2. Multi-pronged marketing. While the big agency folk at the table conceded that the economy and mutual back-scratching that have grown up around the “30-second spot” are largely unhealthy, and that advertisers are doing something about it (Pontiac is launching one product entirely online – no TV), the mainstream thinking is that the change isn’t as fast as people think, and that the schmoozing and conservatism of the traditional agency world aren’t done with just yet. You can’t rule any channel out. Marketers need to come at cluttered spaces from all angles because, well, where there is so much clutter, what else are you going to do?

3. The incredibly modest, reclusive unknown founder of the mesh conference (it’s like a Web 2.0 unconference for Toronto natives), Stuart MacDonald, turns out to live in my neighbourhood, so I got to snare a ride home. To go from hearing war stories about his past days with Expedia (do you have *any* idea how much a company like that spends on search advertising in a year?) to trading notes on the Bloor West Village Ukrainian Festival … truly sublime. The mesh conference is going to rise again next spring.

Posted by Andrew Goodman

 
