June 6, 2007
After a break and a yummy cupcake, we’re back for our first Organic Search Engine Optimization session. Danny Sullivan is our moderator for today’s Penalty Box Summit. Speakers today are Peter Linsley (Ask.com), Aaswath Raman (Microsoft Live Search), Tim Mayer (Yahoo) and Matt Cutts (Google). Danny and the panel put on hockey masks (because it’s about penalties, geddit?) I cheer for the Ducks and inadvertently set off Danny’s rant about Orange County sports teams. Whoops. (Go, Ducks!)
Okay, back on topic–Danny explains that we’re going to be focused on improving the general understanding of how penalties are applied, not on dealing with single-site issues.
Tim Mayer is up first.
Tim comments first on an unfortunate misquoting of Yahoo’s commitment to search. Guess what? They’re really committed to it and they feel personalization is an important part. It was “unfortunate and [they’re] obviously very committed to search”. It’s okay to be shocked.
He emphasizes that spam is about INTENT with which you use techniques and the EXTENT to which you use a technique rather than the specific technique you use. There are legitimate uses for almost every technique. IP cloaking for geographic targeting, for example. The important thing is be smart about it. Use it to help, not hurt, user experience.
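Tim’s geographic-targeting example boils down to serving localized versions of the same content based on where the visitor is coming from. Here’s a minimal Python sketch of that idea; the IP lookup is stubbed out (a real site would use a geolocation database), and all names here are hypothetical:

```python
# Hypothetical sketch of IP-based geographic targeting -- the kind of
# "legitimate cloaking" Tim describes. The country lookup is a stub;
# a real implementation would query a geolocation database.

def country_for_ip(ip):
    # Stub: pretend a real geolocation lookup happened here.
    return "FR" if ip.startswith("90.") else "US"

def page_for_visitor(ip):
    # Serve the same content, localized per region -- helping,
    # not hurting, the user experience.
    templates = {"US": "index_en.html", "FR": "index_fr.html"}
    return templates.get(country_for_ip(ip), "index_en.html")
```

The point of Tim’s distinction: the technique (varying a response by IP) is identical to deceptive cloaking; the intent (localization vs. showing engines different content than users) is what separates them.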
The acceptable line varies by industry, some categories are more competitive. If you’re doing optimization, you should be appropriate for your industry. [Susan’s Hint: You can use the SEOToolset and Free Tools to get a baseline on your competitors.]
Tim says that he has a link internally that can report a quality problem in the index. Webmasters can report spam through Site Explorer. About 70% of the spam reports they receive through the tool are legitimate spam, the rest is just noise. The tool also allows you to report spammy in-links so that you don’t get penalized by association.
Tim refers people to the Webmaster Resources. Come on, Tim. This is the advanced crowd. We already have those memorized. He goes on to discuss what to do for a re-inclusion request. Tell them “this is what I think I was penalized for, this is what I’ve done to clean it up.” They’ll review it pretty quickly.
Peter Linsley is up next. They just launched the new Ask.com last night. “For you livebloggers, go check it out”. Not right now, Peter. I’m blogging. 😉
Peter talks fast. Here we go:
Candidates for penalty are: hurting the user experience and gaming the search engine. Areas for penalty are links and content.
Gaming includes: cloaking, keyword stuffing, hidden text, link farms, scraper sites–basically all SEO Spam 101.
Hurting the user experience includes: Dead pages, no content, dynamic content. Pages that are different every time damage the user experience. Pages with no utility at all are hurting the user experience.
Warning signs of a penalty: drops in traffic, drops in rankings (duh.)
Don’t let the spammers leverage your site. If you have a blog, moderate your comments, don’t publish your access logs, etc.
Re-inclusion requests are looked at case by case.
…wow, you should see the CAPTCHA he just put up. It’s a math equation and there is not enough caffeine in the WORLD. I’m sure that Matt looked at it and was all like ‘Oh, that’s EASY.’ Whatever, Matt. The point is, make sure you have a way to keep out the spammers. Don’t let them abuse you.
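Peter’s math CAPTCHA is simpler than it sounds. A bare-bones sketch in Python (function names are my own, not anything Ask actually runs):

```python
# Minimal sketch of a math-equation CAPTCHA like the one Peter showed:
# generate a small arithmetic challenge, then gate the form submission
# on the visitor answering it correctly.
import random

def make_math_captcha():
    # Produce a challenge string and the expected answer.
    a, b = random.randint(1, 9), random.randint(1, 9)
    return f"What is {a} + {b}?", a + b

def check_captcha(submitted, expected):
    # Accept the comment/form post only if the challenge was solved.
    try:
        return int(submitted) == expected
    except (ValueError, TypeError):
        return False
```

Any challenge that a bot can’t trivially script accomplishes the goal Peter is after: raising the cost of leveraging your site for spam.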
Aaswath Raman is up next. He’s going to review guidelines, why they use penalties, how they handle them and blah
He repeats what Tim said about spam being about intent and targeting the search engine for gaming.
Example: starwarsactionfigures.com links to starwars.com (okay) but also to cheapcasinohandbags.org (suspicious). So they evaluate to see if it’s just an affiliate or if this is a case of trying to game the engines.
On the page level — being useless to users (strings of keywords, etc) is cause for penalty
On the link level — in-links from bad neighborhoods and out-links to suspicious pages
On a general level — Deceiving users through redirects and misleading information
Suspicious or spammy behavior may cause a ranking penalty. More blatant or harmful spamming could be cause for de-listing. It’s like a venial or mortal sin–one means a single Hail Mary, the other sends you to hell. That’s my analogy, not Aaswath’s.
firstname.lastname@example.org is their email. They’re working on better ways to gather feedback. All de-listed sites are automatically reviewed.
Matt Cutts is up last. He’s going to read a list of spammers. This should be FUN!
He says he was going to do a “you might be a spammer” joke. “If you’ve left 10000 comments…” but he thinks that we probably already know what spam is. Yes, Matt, we do.
If you can keep someone from spamming because there are better ways to make money, that’s the way to go but really, there will always be spam. So they try to counter it. They want to make the user experience as good as it can possibly be.
They alert webmasters that they might be having issues so that they can take action and correct the problem. When that happens, they can do a re-inclusion request and everyone’s happy.
Yesterday, Pat asked why the guidelines were so brief (You&A with Matt Cutts). As a result, the webmaster guidelines were beefed up LAST NIGHT. Good Lord, Googlers, don’t you ever sleep? Oh wait, it’s been in the works for a while. Wow, they’re all linked to a specific page on the actual issue. Yay, time for more dissection.
Matt likes that the new pages are written so they don’t automatically assume that you’re a bad guy. They explain how things can be interpreted as spam. The whole webmaster guidelines silo is much bigger and deeper now. Cool.
There are a lot of Googlers whose job it is to get feedback. Matt mentions Vanessa Fox and Adam Lasnik, specifically. These are the people you should mob if Matt is busy.
If someone reports off-topic porn, they want action now, so Google reserves the right to do manual edits, but by and large they want to take care of things algorithmically. Matt doesn’t say that he thinks there needs to be a scalable and robust solution but I know he’s thinking it.
They send out emails in 10 languages to try to help keep people in the loop if their site is in trouble. They distinguish between types of webmasters (as he said yesterday) and they do treat them differently. Mom and Pop shops that have trouble will get alerts so that they can fix it. They’re not really spending too much time on the serial spammers who know full well what they’re doing.
There’s a leveling off of keyword importance. Once is good, twice is better, 900 times you’re past that importance threshold and you’re probably looking spammy.
Matt says that as a search engine, they want a clean index, not just clean scoring (of pages).
Matt discusses the trouble a while back with people sending out fake emails spoofed to look like they were from Google. Email is not authenticated. Someone was sending out the fake emails with an .exe. Google will not do that. He wants to look into how they can authenticate their communications to prevent this from happening again.
They’re going to change the name of the re-inclusion request to something like a reconsideration request. They’ll take into account the kind of webmaster you are. If you’re clearly a novice, you’re going to get a little more leeway than if you’re a hardcore SEO. This whole crowd is screwed.
Danny promises to show us how Search Engine Land is spamming. There’s some CSS hidden text or something. This whole thing was confusing to me. The SELand logo is actually their background. He tries to get the text cache from Live Search, but they don’t have one. He tries to get it from Google but they have a bug. He finally gets a text cache somehow and tries to show us the hidden text…but it’s hidden. This is SO AWESOME. They have a note in their CSS file that says ‘we’re not spamming, well we are but suck it up for another week until we fix it’.
Enough about Danny’s spamming, let’s move on to other spamming. What would people like to see? Should it be a free for all? Should it be tighter? Should it be like that new search engine with the Hawaiian name and be manual?
If I mention your question without mentioning your name, comment and I’ll edit. I can’t see everyone from here.
Why is Penn State being ranked for Buy Viagra (because they’ve been hacked, not because Penn State wants to sell Viagra)? How come Google isn’t better about manually banning this kind of spamming?
Heh, Matt just said scalable and robust again. It’s his new phrase. They typically prefer to take an algorithmic approach to removing spam but he thinks there’s room for using humans in a ‘scalable’ way.
Tim chimes in to agree that humans can be ‘scalable’. Aaswath says that ultimately the fix for Viagra spam will have to be algorithmic. Peter agrees and says that people could work every day for the next 10000 years and not even touch the amount of cleaning they’d have to do to fix the Viagra problem.
Danny does a quick tour of the SEs. Ask does well, Live doesn’t have any .edu links (Tim: But they’re not relevant! Danny: At least on the others I can buy Viagra.) Yahoo has an .edu 404 page as their second result. Oh, now we’re going to that Hawaiian named engine. AHAHAHA. They’re not relevant and the ones that are come from .edus. I love this session!
Real estate, travel, home schooling… Different industries have different levels of normal. Link rules in real estate are different because reciprocal links are more common so they’re looking at that.
The search engines look at a larger set of queries so their ‘we’re doing well’ isn’t always the webmaster’s ‘doing well’.
Someone wants penalties to be announced. Don’t just penalize them, announce it and say they did something wrong. Lots of the audience agrees. Even more want a way to look up a way to see if their site has a penalty or not. Matt says ‘yeah, that’d be great but then you also tell the spammers which techniques are still working’. Tim reiterates, they don’t want to clue people in to what tricks are being missed by the engines.
Pat (feedthebot.com) wants the ability to know that his site is definitively under a penalty. How do I do a re-inclusion request that actually gets results?
Matt: We take them very seriously. Some people think you have to admit guilt but they’re trying to soften the message a little. You don’t have to grovel but you do have to try to be honest. If you don’t know but you fixed some stuff, tell them.
Open-sourcing the resolution process, says Michael Martinez (I think?). Danny calls it ‘wikispamia’. Matt says having other people vouch might be interesting but there would always be someone out there trying to game the system. He mentions that Neil Patel has 30 Digg aliases, by way of an example.
Danny says there needs to be a way to say ‘this wasn’t an intent to deceive but I’m getting penalized anyway.’
In a discussion about DUI laws and how to build pages to explain them, Matt calls Wyoming unimportant. Then he has to bribe people with Google Webmaster t-shirts to not tell Wyoming about it. I didn’t get a t-shirt so I’m blogging it anyway. (I wouldn’t except that I’m bitter because they’re supercute long sleeved and black! There’s a heart on the sleeve!)
Danny says rather than monopolizing the engine reps for twenty minutes, give them a business card with the site on it so they can check it later. Matt chimes in and says that you can write it up ahead of time too. They like that.
Quick requests from the audience:
–We should be able to report spam in the SERP not in a separate form.
–Please let us know you at least got the message. It feels like sending messages in to a void.
–Full list of the actual penalties.
–Get more input from the webmaster about the site.
–Trusted webmasters (Lots of applause for that)
–Clean site badge.
–Trust API through Webmaster Tools
–Max Keywords per tag (Danny: Don’t use the tags. Et tu, Danny?)
–Stop being afraid of spammers and provide more transparency.
–Better train your ad reps.
–Ban Viagra (Danny: just give it away! You can afford it!)
–Details on time penalty
–Negative rank on the toolbar
–Caution about banning entire servers
–Bad neighbor API (Danny calls that silly talk)
Aaswath: We love you.
Tim: Give us feedback on features in site explorer. It’s digg-like.
Matt: We love you infinity times infinity, especially Wyoming. We’re trying to improve communication. Keep talking to us.
Peter: We love the feedback. At the end of the day, we just want to provide good results.
Author: Susan Esparza is a Sr. Editor and writer at Bruce Clay Inc.