SEOThursday

Blog covering a variety of Search Engine Optimization topics.

Thursday, August 31, 2006

Breaking News - Nathaniel Broughton Leaving SES/SEW . . . and Yahoo BL's

I can't believe how many people are posting a version of the "Danny Sullivan is Leaving" story - it's on 50+ blogs I've seen over the past 48 hours. I have seen the guy speak and have personally learned a lot over the years from his blog and other contributions, but it's not like he's leaving the planet. The SEO world is acting like a bunch of high schoolers, perhaps because 95% of them were dorks in high school and no one talked to them. Or maybe they're just sad to see one of their gods (I saw Matt Cutts wearing army pants and flip flops, so I went out and bought army pants and flip flops) leave a place he's been for quite some time. Quite humorous really. Regardless, I'm sure we'll be hearing from Mr. Sullivan in the near future and I look forward to finding out where I can go and indulge myself with his knowledge on the web or at a new conference. Onto other things . . .

I'm not too pleased with the change Yahoo made (about a month back now) to their backlink searches. Google has long been f'd up for checking backlinks, choosing to show only a small portion of the backlinks to a given site. They like to shroud us in uncertainty. And I can applaud them for doing so in some respects, because the first big SEO campaign I did was in a highly competitive industry and I could easily see that our competitors were blatantly "stealing" our links, a.k.a. running BL's on us and then setting up link deals with the same sites. Now there are companies making millions of dollars a year that more or less followed the path I laid out for them, and I'm sitting here with quite a bit less than that. Yahoo's new tool / update ("Site Explorer") isn't as limiting as Google's, but it definitely isn't as comprehensive as what they showed over the past few years. It's not terrible, but I personally think it was a regression. While it may in fact keep would-be link predators from finding all of a domain's BL's, it hurts those of us trying to assign a value to a site for a link deal. There are definitely many factors that go into such a valuation, but the number and quality of backlinks have always been among the big ones, and these days it's getting harder and harder to find trustworthy data to supply anyone with that information. You nerds out there will be quick to point out 20 other backlink tools; I have seen probably 3/4ths of them and not been impressed. We're left in a tough position on the SEO side of things, especially when it comes to training new people on the discipline.

My solution for now has been to run BL's in Google and Yahoo (like I've done for many years) to get a rough idea of what I'm looking at, and then to go to some secondary tools if I feel it necessary. It's become more a task of determining which sites are good link partners based on look, feel, domain, search ranking and up-to-datedness (my word, my blog). The only upside to the situation is that the natural progression of things has led us to a point where it's harder and harder to buy your way to the top of the SERPs by purchasing links on high-PR sites or sites with great BL's. Relevancy, age of domain, and the authority level of a site matter more and more. So maybe I don't need you so much after all, you crappy Yahoo BL tool, but I'll always miss the old days when you were friendlier to me.
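For the rough-idea step, a small script can merge the backlink exports from two engines into one deduplicated list of linking domains. This is just a sketch - the function name and the URLs are invented for the example, and in practice you'd feed it whatever lists your BL tools spit out:

```python
# Sketch: combine backlink exports from two sources (say, Google's link:
# results and Yahoo Site Explorer) into one deduplicated picture.
# All URLs below are made-up examples, not real data.
from urllib.parse import urlparse

def unique_linking_domains(*backlink_lists):
    """Merge several lists of backlink URLs and return the set of
    unique domains that link to you (www. prefix stripped)."""
    domains = set()
    for urls in backlink_lists:
        for url in urls:
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if host:
                domains.add(host)
    return domains

google_bls = ["http://www.example-blog.com/post1", "http://directory.example.org/seo"]
yahoo_bls = ["http://example-blog.com/post1", "http://www.example-forum.net/thread/9"]

print(sorted(unique_linking_domains(google_bls, yahoo_bls)))
# ['directory.example.org', 'example-blog.com', 'example-forum.net']
```

Counting unique linking domains (rather than raw link totals) also gives a slightly saner number for valuing a link deal, since one site linking to you 500 times isn't 500 times the vote.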

Thursday, August 24, 2006

Indexing an https:// - That Sounds Like Something

We had an issue pop up this week where one of our sites that ranks at the top of the search engines for some competitive terms (vague enough for ya? At least I didn't say widget) began having one of its https pages also indexing in the top 20. So basically we had our home page and an internal page indexing near the top of the SERPs, plus an additional result showing up down the SERP for the https page. In a related search, 2 https pages (these are submission forms) actually replaced the http pages that normally appear atop the rankings. Why was this happening? Why was it a bad thing?

In the world of SEO, where being overly cautious is the only smart way to play the game, having a search engine index an https page is undesirable because it could ultimately get you hit for duplicate content. Now, I've seen some sites with multiple results for a given search, one of which was an https form, that have seemingly avoided any dupe content penalties for over 2 years. So it's not an absolute concept, although SEO is such an inexact science that if you don't cover yourself in every way you have control over, you're being foolish. To combat it, we went through our database and also did some site:www.domain.com searches to check up on all the links to the site that we could find, to see if someone was in fact linking to our https page - a leading candidate for causing such an occurrence in the SERPs. It's quite common for someone to make simple typos when establishing a link for you, and while letting some of them slide can help make your backlinks look more "natural" to the engines, it's still important to keep an eye out for links that could come back to haunt you in the end.
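If you have the raw HTML of pages that link to you (pulled from your link database, say), a short script can flag any anchors pointing at the https version of your domain. A sketch, with www.domain.com standing in for the real site and the snippet of HTML invented for the example:

```python
# Hedged sketch: scan the HTML of a page that links to you and flag any
# anchors pointing at the https:// version of your domain.
from html.parser import HTMLParser

class HttpsLinkFinder(HTMLParser):
    """Collects hrefs that point at https://<your domain>."""
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.https_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and value.startswith("https://" + self.domain):
                    self.https_links.append(value)

page_html = '<a href="https://www.domain.com/form.html">Sign up</a> <a href="http://www.domain.com/">Home</a>'
finder = HttpsLinkFinder("www.domain.com")
finder.feed(page_html)
print(finder.https_links)  # ['https://www.domain.com/form.html']
```

It only catches the typo'd links you can actually fetch, of course, but it beats eyeballing every partner page by hand.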

Of course, the other main thing that can cause pages you don't want indexed to show up is your crawling and indexing setup. Keep in mind that robots.txt only tells spiders not to crawl a page; to actually keep a page out of the index, you want a "noindex" robots meta tag on it. Set that on all the pages you don't want a search engine to show, and check up on it to make sure it's still set that way if you see some irregularities in the SERPs.
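As a sketch (the paths here are made-up examples, not from any real site), a robots.txt handles the don't-crawl side:

```text
# robots.txt at http://www.domain.com/robots.txt -- paths are hypothetical
User-agent: *
Disallow: /secure/
Disallow: /print-versions/
```

Pair it with `<meta name="robots" content="noindex,follow">` in the head of any page that must stay out of the index, since a Disallow line alone only stops crawling.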

Your friend Richie,
End of letter

Thursday, August 17, 2006

SES Conference Part Deux

A few other things that caught my attention while attending the SES conference in San Jose last week were the renewed emphasis on cleaning up your code to make your sites more crawl-friendly and the growing importance of online reputation management. Obviously it isn't much of a new idea to say that the code of any website should be clean, well organized and designed with search engine spiders in mind. One thing I think a lot of people (myself included) get caught up in is continually adding new, fancy features to their websites that, over the course of a year or two, can really clutter up their code. We're all out to one-up our competition, to make websites that have cooler functions and incorporate the newest programming languages and their various bells and whistles. While I'm not encouraging people to abandon trying to improve their sites, I think it is definitely important to take a day every few months to reassess the layout of your code and make sure that it is clean, organized, and necessary. All it took was one reminder from a speaker at SES that search engines love a 100-line site much more than they do a 1,000-line site to convince me to make that reassessment as often as I can.
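To make the 100-line-vs-1,000-line point concrete, here's a hedged before/after sketch (the file names are hypothetical) of the most common cleanup: moving inline styles and scripts into external files so the page source a spider sees stays lean:

```html
<!-- Before: presentation and behavior inlined, bloating every single page -->
<head>
  <style>/* hundreds of lines of CSS for nav, tables, etc. */</style>
  <script>/* hundreds of lines of menu JavaScript */</script>
</head>

<!-- After: moved to external files; the page itself is mostly content -->
<head>
  <link rel="stylesheet" href="/css/site.css">
  <script src="/js/menu.js"></script>
</head>
```

Same look and behavior for visitors, far less junk between a spider and the meat of the page.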

Online reputation management (ORM) also struck a chord with me and led to some serious brainstorming sessions. The term basically means exactly what it appears to mean, but the cool thing about managing an online reputation is that it is in fact closely tied to SEO. Online reputations generally stem from the top 10 or so search results - whether you search "pepsi" or "dmx", the top 10 organic results and their contents have a far-reaching effect on potential customers, employees, voters, etc. In a world that is growing more and more content with taking every new idea or new name it hears and plugging it into a search engine for more information, one's online reputation is becoming just as important as one's public reputation. I personally think there is huge potential for those with solid SEO skills to apply them to this idea, to help people or companies effectively acknowledge and manage their online reputation. Certain "flame" sites (those that trash or speak negatively about a particular person or company) can do severe damage to a business, but combining the proper software tools to monitor the SERPs (or doing it manually) with a strong network of optimized websites and link deals can help push those down in the rankings. It is almost inevitable that a given business will have someone speak negatively about them online, and whether the negative information is borne out of truth or not, it is still quite important to acknowledge this fact and develop a plan to manage your online rep accordingly. At this point in time there actually isn't much out there in the way of companies or firms helping people not versed in SEO make these plans, but based on what I see, ORM will be a booming industry before long. The good news is that people who have, like I said, a solid knowledge base in SEO can likely do a sufficient job of it themselves.
For more information on the ins and outs of what ORM is, check out Jennifer Laycock's Search Engine Guide.

Friday, August 11, 2006

SES Conference - SJ 2006 What I Learned, What I Saw Part I

San Jose 2006 was actually the first SES conference I had a chance to participate in, and after attending 4 or 5 WebmasterWorld conferences, I give some props to SES and the quality sessions they offered. While it’s truly difficult to not be outright bored by the repetitive, basic information at any session 50% of the time, I was impressed with the speakers and the collective minds gathered in San Jose. If you’ve been through the wars with SEO, SES is the place where you can pick up some new tricks and spend some quality time contemplating / discussing the future direction of search marketing. Onto what I learned . . .

I focused on attending sessions that pertained to organic search, particularly site architecture, link baiting, and ones that would lend themselves to talking about the “direction” of organic search. Site architecture and page layout have been of particular interest to me lately because I recently acquired a domain and am working on finding the right balance between aesthetically pleasing pages and engine-optimized layouts under my own direction (instead of letting the nerds run with it). I’ve preached on this blog already about the importance of quality, link-laden content, but not so much about where to put that content with respect to your page layout. SES reiterated the importance of sticking content that explains exactly what your page is about, and what keywords should be bringing people into it, right at the top of your page. Keep in mind that in this day and age search engines have evolved to the point where they quickly “skip over” the areas of web pages generally reserved for headers, main navigation, banners, etc. to go right to the meat of a page and find out what it is about. If your page is about purchasing homes in California, then you should have a brief, informative, keyword-rich block of text right in the middle of the top of the page. If you’re running an ecommerce site, or driving your visitors to complete some sort of action (like filling out a form), it can definitely be tempting to place product images or big “Get Started” buttons at the top of your page - after all, it’s a highly visible location. But don’t forget to include some plain old, easily spiderable text up there first. Your SERP rankings will love you back.
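A rough sketch of that layout idea, using the California-homes example from above (all of the markup, copy, and file names here are invented for illustration):

```html
<body>
  <div id="header">logo, main navigation, banner ads</div>

  <!-- Plain, spiderable text first: says exactly what the page is about -->
  <h1>Buy a Home in California</h1>
  <p>Browse California homes for sale, compare mortgage rates, and get
     started with a free consultation from a local real estate agent.</p>

  <!-- Then the eye candy and calls to action -->
  <img src="/images/get-started-button.gif" alt="Get Started">
</body>
```

The visitor barely notices the text block, but the spider hits your keywords before it ever reaches an image tag.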

Another thing that has been pretty hot lately (“Hansel, so hot right now, Hansel”) is the idea of link baiting. There was a solid session on this Wednesday afternoon at SES that brought up some good points and tips on link baiting, which basically means obtaining links from people/sites simply by having good/informative/controversial/newsworthy content running on your site. It doesn’t fall into the same bucket as black hat SEO tactics - it can in fact be accomplished by doing something of note that people out there find helpful or remarkable. They link to you because they feel like you and/or your business deserves it. Certain websites definitely lend themselves to this method of obtaining links, whether they are controversial in nature or quite innovative (or owned or backed by a major corporation), but that doesn’t mean the “little guy” can’t enjoy the love of unsolicited, editorial links just by doing a proverbial good deed. I was left wanting more from the presenters, however, with regards to tips on how to give a small-budget site or a generally “non-controversial” website some link bait qualities. They kept reasserting that if you built a good website and maintained a business that was helpful and courteous to customers, things would all work out. Thanks for the tip! Oh well, back to the drawing board. But seriously, no matter what your site is or is about, there is some intelligent way to join the link bait parade - it just may take a few beers to find it.

I’ll write more on two other topics/tips that caught my eye at SES next week – cleaning up your code for search purposes and the almighty importance of reputation management with respect to the SERPs.

Thursday, August 10, 2006

Content in Action

The best thing you can do to ensure that the content of your website is always progressing is to have at least one person solely dedicated to it at all times. Most websites will enjoy steady growth by simply adding a page of fresh content per day. Steady growth appears “natural” to the search engines and is a key indicator that you are seriously invested in the website. Of course, placing bad content on the site doesn’t do much - always be concerned with quality. As long as the person in charge of writing new content is competent, does adequate research on their topic before writing, and has a good understanding of how to write “to the search engines,” so to speak, then you are set. To briefly elaborate, writing “to the search engines” just means keeping a few things in mind when writing content for a website that you would like to see optimized for search results: include as many keywords as possible related to your product or service, place outgoing links to authoritative websites in your field, and keep the content brief yet informative for the reader.

I want to stress again the point of adding content at a steady pace. In general, uploading an unusually large amount of pages to a site at one time is seen as a poor practice that can be red-flagged by search engines. Does that mean that adding 10 pages to a site in a day is bad? No, especially if you already have 150. But taking a site from 1 page to 25 pages in a day may not be the best idea. Now, there definitely have been tales of it being fine to add even 10 times that many – but you have to weigh the risks. It’s best to play it safe in the SEO game.

One good method we have employed for years for content writing is to outsource it. Countless freelancers can be found on sites like www.elance.com. Set up an agreement where you pay per page or per week for a certain amount of work, set an editorial calendar based on your needs, and let the content flow in. It’s a good way to go because it keeps the daily grind of ensuring new content gets added from wearing on other aspects of your business. We have also tried requiring each employee to write one article per week during down time, but we have seen mixed results. Essentially, as long as it gets done, it gets done. Buckle down, stay consistent, and a site of 10 pages will grow into 1,000 in 2 years. From a search marketing perspective, the benefits of having 1,000 relevant pages of content will never stop presenting themselves. Your site will see more and more traffic from “long-tail” searches, and over time those can come to account for 30-50% of the traffic you enjoy, even if you’re pursuing some very prime keywords. I'm off to San Jose next week to check out my first Search Engine Strategies conference; I'll post what I learn from it when I return. Victory!
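As a quick sanity check on that 10-to-1,000 figure, the arithmetic says the page-a-day pace needs to run a little hotter, closer to a page and a half per day:

```python
# How many pages per day does it take to grow a 10-page site
# to 1,000 pages in two years?
start_pages = 10
target_pages = 1000
days = 2 * 365

pages_per_day = (target_pages - start_pages) / days
print(round(pages_per_day, 2))  # 1.36
```

Still a very doable editorial calendar for one dedicated writer, and slow enough to look like the "natural" steady growth the engines want to see.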

Thursday, August 03, 2006

Get Your Quill and Burn the Midnight Oil

There is a common saying in the search industry that “Content is King”. While it’s definitely overused by people who are turned off by the competition and somewhat underhanded techniques of SEO, it still rings true every time. Good, relevant, unique content is crucial to making both a search-friendly website and a customer-friendly website. Good, plentiful content can separate a good site from a great site, a site that ranks #1 from a site that ranks #3. If you think about it, writing is one place where many people struggle, either because they never learned how to do it properly, are uneducated, can’t speak English, or are (the most likely case) too lazy. Not many of us want to sit down and spew off 4 pages about our products or our services, offer customer tips, help sections, etc. (at least not every day for years and years). Content is something you can never have enough of. It increases the size / breadth of your website, making it more visible to the search engines. There’s nothing a search engine likes more than crawling some newly created, relevant content. It’s your ticket to success.

One thing that has been a large point of discussion ever since people began optimizing websites for the search engines is the problem of duplicate content. Dupe content is exactly what it appears to be - the same sentence, phrase, or page hosted on one site also being hosted on another. The widespread prevalence of dupe content, and the reason it has become such a hotly debated topic in recent years, goes back to the fact that most people are lazy. It is much easier to copy and paste the content off a related site’s pages and place it on yours - like I said, the more content you have the better, and search engines love it. Not only that, but maybe you take the time to do a quick Find/Replace on a few keywords within the text - that’s brand new, unique content, right? Probably not in the eyes of Google or Yahoo. Search engines have become increasingly adept at catching and penalizing sites that run any sort of duplicate content on their pages. Questions like “Whose content is it really? Who put it up first?” and “What % of a page has to be unique before a search engine counts it as such?” have been widely debated, but the bottom line is that anyone who wants a lasting presence in the SERPs and a good website overall should just stick to it, buckle down, and write quality, unique content all themselves.