Thursday, September 30, 2010

Understanding How PPC Affects SEO

There’s a lot of debate about how an organization should use organic keyword marketing versus
the way those same organizations should use PPC marketing. And there seem to be two (and possibly
three) distinct camps about what should and shouldn’t happen with these different types of
marketing.
The first position is that PPC programs can hurt organic keyword programs. According to subscribers
to this position, PPC programs damage organic rankings because the act of paying for a listing automatically
reduces the rank of your organic keyword efforts. Those who follow this theory believe that
there is no place for PPC programs.
Another position in this argument is that PPC has no effect at all on SEO. It’s a tough concept to
swallow, because one would naturally believe that any organization paying for a specific rank in
search returns would automatically push organic keyword returns into a lower slot (which supports
the first theory). Those who follow this theory believe that there is no need to invest in PPC,
because you can achieve the same results with organic keywords, though it takes much longer for
those results to become apparent.
The most widely held belief, however, is that a combination of PPC and organic keywords is the best
approach. This theory would seem to have a lot of validity. According to some researchers, PPC programs
tend to be much more effective if an organization also has organic keywords that rank in the
same area as the PPC ranks. For example, if you’ve bid on a keyword that’s consistently placed number
two or three in search engine returns, and you have organic keywords that fall in the next few
slots, you’re likely to find better conversion numbers than either organic keywords or PPC programs
can bring on their own.
It’s important to note here that all search engines make a distinction between PPC and organic SEO.
PPC doesn’t help your organic rankings. Only those activities like tagging your web site properly,
using keyword placement properly, and including great content on your site will help you on the
organic side. PPC is a search marketing strategy.
PPC Is Not Paid Inclusion!
One distinction that is important for you to understand is the difference between PPC and paid-inclusion
(PI) services. Many people believe that PPC and PI services are the same type of marketing,
but there can be some subtle differences. For starters, paid-inclusion services are used by
some search engines to allow web-site owners to pay a one-year subscription fee to ensure that their
site is indexed with that search engine all the time. This fee doesn’t guarantee any specific rank in
search engine results; it only guarantees that the site is indexed by the search engine.
Yahoo! is one company that uses paid inclusion to populate its search index. Not all of the listings
in Yahoo! are paid listings, however. Yahoo! combines both normally spidered sites and paid sites.
Many other search engines have staunchly avoided using paid-inclusion services — Ask.com and
Google are two of the most notable — because most users feel that paid inclusion can skew the
search results. In fact, search engines that allow only paid-inclusion listings are not likely to survive
very long, because users won’t use them.
There is a bit of a gray line between paid inclusion and PPC. That line begins at about the point where
both services are paid for. Detractors of these types of programs claim that paying for a listing — any
listing — is likely to make search returns invalid because the theory is that search engines give higher
ranking to paid-inclusion services, just as they do to PPC advertisements.

NOTE Throughout this book, you’ll likely see the terms SEO and search marketing used interchangeably.
Very strictly speaking, search marketing and SEO are quite different activities.
Search marketing includes any activity that improves your search engine rankings — paid or organic.
SEO, on the other hand, usually refers to strictly the free, organic methods used to improve search rankings.
Very often, the two terms are used interchangeably by people using SEO and search engine marketing
techniques. SEO and SEM experts, however, will always clearly differentiate the activities.

Wednesday, September 29, 2010

Pay-per-Click Categories

Pay-per-click programs are not all created equal. When you think of PPC programs, you probably
think of keyword marketing — bidding on a keyword to determine where your site will be placed
in search results. And that’s an accurate description of PPC marketing programs as they apply to
keywords. However, there are two other types of PPC programs, as well. And you may find that
targeting a different category of PPC marketing is more effective than simply targeting keyword
PPC programs.
Keyword pay-per-click programs
Keyword PPC programs are the most common type of PPC programs. They are also the type this
book focuses on most often. By now, you know that keyword PPC programs are about bidding on
keywords associated with your site. The amount that you’re willing to bid determines the placement
of your site in search engine results.
In keyword PPC, the keywords used can be any word or phrase that might apply to your site.
However, remember that some of the most common keywords have the highest competition for
top spot, so it’s not always advisable to assume that the broadest term is the best one. If you’re in
a specialized type of business, a broader term might be more effective, but as a rule of thumb, the
more narrowly focused your keywords are, the better results you are likely to have with them. (And
PPC will cost much less if you’re not using a word that requires a $50 per click bid.)
Did you know that Google and Yahoo! have $100 caps on their keyword bids? They do.
Imagine paying $100 per click for a keyword. Those are the kinds of keywords that will
likely cost you far more money than they will generate for you. It’s best if you stick with keywords
and phrases that are more targeted and less expensive.
The major search engines are usually the ones you think of when you think keyword PPC programs,
and that’s fairly accurate. Search PPC marketing programs such as those offered by vendors like
Google, Yahoo! Search Marketing, and MSN are some of the most well-known PPC programs.
Product pay-per-click programs
You can think of product pay-per-click programs as online comparison shopping engines or price
comparison engines. A product PPC program focuses specifically on products, so you bid on placement
for your product advertisements.
The requirements for using a product PPC program are a little different from keyword PPC programs,
however. With a product PPC, you must provide a feed — think of it as a regularly updated price list
for your products — to the search engine. Then, when users search for a product, your links are given
prominence, depending on the amount you have bid for placement. However, users can freely sort the
product listings returned by the search engine in order of price from lowest to highest if
that is their preference. This means that your product may get good placement initially, but if it’s not
the lowest-priced product in that category, it’s not guaranteed that your placement results will stay in
front of potential visitors.
Some of these product PPC programs include Shopping.com, NexTag, Pricegrabber, and Shopzilla.
Although product PPC programs are popular for controlling the placement of your
product listings, there are some services, like Google Base, that allow you to list your
products in their search engine for free. These product PPC programs still require a product feed,
however, to keep product listings current.
Implementing a product feed for your products isn’t terribly difficult, although, depending on the
number of products you have, it can be time-consuming. Most of the different product PPC programs
have different requirements for the product attributes that must be included in the product
feed. For example, the basic information included for all products is an item title, the direct link
for the item, and a brief description of the item.
Some of the additional attributes that you may need to include in your product PPC listings include:
title
description
link
image_link
product_type
upc
price
mpn (Manufacturer’s Part Number)
isbn
id
Some product PPC programs will require XML-formatted feeds; however, most will allow text-delimited
files (simple CSV files). This means you can create your product lists in an Excel
spreadsheet, and then save that spreadsheet as delimited text by selecting File ➪ Save As and
ensuring the file type selected is a text-delimited format.
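As a rough sketch, a tab-delimited feed built from a few of the attributes listed above might be generated like this. (The field names follow the list above, but every product PPC program publishes its own requirements; the file name and the sample product here are made up for illustration.)

```python
import csv

# Hypothetical sample product. The attribute names follow the list in the
# text, but each product PPC program specifies its own required fields.
products = [
    {
        "id": "TF-1001",
        "title": "Cargo Pants",
        "description": "Lightweight cotton cargo pants for teens.",
        "link": "http://www.example.com/cargo-pants",
        "price": "24.99",
    },
]

fields = ["id", "title", "description", "link", "price"]

# Write a tab-delimited text feed -- the "text delimited" alternative
# to an XML feed that most programs accept.
with open("product_feed.txt", "w", newline="") as feed:
    writer = csv.DictWriter(feed, fieldnames=fields, delimiter="\t")
    writer.writeheader()
    writer.writerows(products)
```

Regenerating and re-uploading a file like this on a schedule is what keeps the product listings current.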
Service pay-per-click programs
When users search for a service of any type, such as travel reservations, they are likely to use search
engines related specifically to that type of service. For example, a user searching for the best price
for hotel reservations in Orlando, Florida, might go to TripAdvisor.com. Advertisers, in this case
hotel chains, can choose to pay for their rank in the search results using a service PPC program.
Service PPC programs are similar to product PPC programs with the only difference being the type
of product or service that is offered. Product PPC programs are more focused on e-commerce products,
whereas service PPC programs are focused on businesses that have a specific service to offer.
Service PPC programs also require an RSS feed, and even some of the same attribute listings as product
PPC programs. Some of the service PPC programs you might be familiar with are SideStep.com
and TripAdvisor.com. In addition, however, many product PPC programs have expanded to include
services. One such vendor is NexTag.
In addition to the three categories of PPC programs discussed in this text, there is an
additional one. Pay-per-call is a type of keyword advertising in which search results
include a phone number. Each time a call is connected through that phone number, the company
that owns the number is charged for the advertising, just as if it were paying for a traditional
pay-per-click service.

Tuesday, September 28, 2010

How Pay-per-Click Works
Pay-per-click marketing is an advertising method that allows you to buy search engine placement
by bidding on keywords or phrases. There are two different types of PPC marketing.
In the first, you pay a fee for an actual SERP ranking; in some cases, you also pay a per-click fee,
meaning that the more you pay, the higher your page will rank in the returned results.
The second type is more along true advertising lines. This type of PPC marketing involves bidding
on keywords or phrases that appear in, or are associated with, text advertisements. Google is probably
the most notable provider of this service. Google’s AdWords service is an excellent example of
how PPC advertisements work, as is shown in Figure 5-1.



Determining visitor value
So the first thing that you need to do when you begin considering PPC strategies is to determine
how much each web-site visitor is worth to you. It’s important to know this number, because otherwise
you could find yourself paying far too much for keyword advertising that doesn’t bring the traffic
or conversions that you’d expect. For example, if it costs you $25 to gain a conversion (or sale)
but the value of that conversion is only $15, then you’re losing a lot of money. You can’t afford that
kind of expenditure for very long.
To determine the value of each web-site visitor, you’ll need to have some historical data about the
number of visitors to your site in a given amount of time (say a month) and the actual sales numbers
(or profit) for that same time period. This is where it’s good to have some kind of web metrics
program to keep track of your site statistics. Divide the profit by the number of visitors for the same
time frame, and the result should tell you (approximately) what each visitor is worth.
Say that during December, your site cleared $2,500. (In this admittedly simplified example, we’re
ignoring various things you might have to figure into an actual profit and loss statement.) Let’s also
say that during the same month, 15,000 visitors came to your site. Note that this number is for all
the visitors to your site, not just the ones who made a purchase. You divide your $2,500 profit by
all visitors, purchasers or not, because this gives you an accurate average value of every visitor to
your site. Not every visitor is going to make a purchase, but you have to go through a number of
non-purchasing visitors to get to those who will purchase.
Back to the formula for the value of a visitor. Divide the site profit for December ($2,500) by the
number of visitors (15,000) and the value of your visitors is approximately $.17 per visitor. Note
that I’ve said approximately, because during any given month (or whatever time frame you choose)
the number of visitors and the amount of profit will vary. The way you slice the time can change
your average visitor value by a few cents to a few dollars, depending on your site traffic. (Also note
that the example is based on the value of all visitors, not of conversions, which might be a more valid
real-life way of calculating the value of individual visitors. But this example is simply to demonstrate
the principle.)
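The arithmetic above can be sketched in a few lines, using the December figures from the example:

```python
def visitor_value(profit, visitors):
    """Approximate worth of each visitor: profit divided by ALL visitors
    (purchasers or not) over the same time frame."""
    return profit / visitors

# December example from the text: $2,500 profit, 15,000 visitors.
value = visitor_value(2500, 15000)
print(f"${value:.2f} per visitor")  # prints "$0.17 per visitor"
```

As the text notes, the result is only an approximation; rerunning it over a different time slice will shift the average by a few cents or more.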
The number you got for visitor value is a sort of breakeven point. It means you can spend up to
$.17 per visitor on keywords or other promotions without losing money. But if you’re spending
more than that without increasing sales and profits, you’re going in the hole. It’s not good business
to spend everything you make (or more) to draw visitors to the site. But note the preceding italicized
words. If a $.25 keyword can raise your sales and profits dramatically, then it may be worth
buying that word. In this oversimplified example, you need to decide how much you can realistically
spend on keywords or other promotions. Maybe you feel a particular keyword is powerful
enough that you can spend $.12 per click for it, and raise your sales and visitor value substantially.
You have to decide what profit margin you want and what promotions are likely to provide it. As
you can see, there are a number of variables. Real life is dynamic, and eludes static examples. But
whatever you decide, you shouldn’t spend everything you make on PPC programs. There are far
too many other things that you need to invest in.

Popular keyword phrases can often run much more than $.12 per click. In fact, some of the most
popular keywords can run as much as $50 (yes, that’s Fifty Dollars) per click. To stretch your PPC
budget, you can choose less popular terms that are much less expensive, but that provide good
results for the investment that you do make.
Putting pay-per-click to work
Now that you have your average visitor value, you can begin to look at the different keywords on
which you might bid. Before you do, however, you need to look at a few more things. One of the
top mistakes made with PPC programs is that users don’t take the time to clarify what it is they
hope to gain from using a PPC service. It’s not enough for your PPC program just to have a goal
of increasing your ROI (return on investment). You need something more quantifiable than just
the desire to increase profit. How much would you like to increase your profit? How many visitors
will it take to reach the desired increase?
Let’s say that right now each visit to your site is worth $.50, using our simplified example, and your
average monthly profit is $5,000. That means that your site receives 10,000 visits per month. Now
you need to decide how much you’d like to increase your profit. For this example, let’s say that you
want to increase it to $7,500. To do that, if each visitor is worth $.50, you would need to increase
the number of visits to your site to 15,000 per month. So, the goal for your PPC program should be
“To increase profit $2,500 by driving an additional 5,000 visits per month.” This gives you a concrete,
quantifiable measurement by which you can track your PPC campaigns.
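Working that goal backward from the numbers in the example takes only a couple of divisions:

```python
def visits_needed(profit_target, per_visitor_value):
    """Visits required to produce a given profit at a given visitor value."""
    return profit_target / per_visitor_value

# Figures from the example: $.50 per visitor, $5,000 profit now, $7,500 target.
per_visitor_value = 0.50
current_profit = 5000
target_profit = 7500

current_visits = visits_needed(current_profit, per_visitor_value)
target_visits = visits_needed(target_profit, per_visitor_value)
additional = target_visits - current_visits

print(int(current_visits), int(target_visits), int(additional))
# prints "10000 15000 5000"
```

The last number is the concrete target: 5,000 additional visits per month to add $2,500 in profit.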
Once you know what you want to spend, and what your goals are, you can begin to look at the different
types of PPC programs that might work for you. Although keywords are the main PPC element
associated with PPC marketing, there are other types of PPC programs to consider as well.

Monday, September 27, 2010

Pay-per-Click and SEO

Pay-per-click (PPC) is one of those terms that you hear connected to
keywords so often you might think they were the conjoined twins of
SEO. They’re not, really. Keywords and PPC do go hand in hand, but it
is possible to have keywords without PPC. It’s not always advisable, however.
Hundreds of PPC services are available, but they are not all created equal.
Some PPC services work with actual search rankings, whereas others are
more about text advertisements. Then there are the category-specific PPC
programs, such as those for keywords, products, and services.
The main goal of a PPC program is to drive traffic to your site, but ideally
you want more out of PPC results than just visits. What’s most important is
traffic that reaches some conversion goal that you’ve set for your web site.
To achieve these goal conversions, you may have to experiment with different
techniques, keywords, and even PPC services.
PPC programs have numerous advantages over traditional search engine
optimization:
No changes to a current site design are required. You don’t have
to change any code or add any other elements to your site. All you
have to do is bid on and pay for the keywords you’d like to target.
PPC implementation is quick and easy. After signing up for a PPC
program, it might only take a few minutes to start getting targeted
traffic to your web site. With SEO campaigns that don’t include
PPC, it could take months for you to build the traffic levels that PPC
can build in hours (assuming your PPC campaign is well targeted).
PPC requires no specialized knowledge. You don’t need any particular expertise to get
started, although your PPC campaigns will be much better targeted
if you understand keywords and how they work.
As with any SEO strategy, there are limitations. Bidding for keywords can be fierce, with each competitor
bidding higher and higher to reach and maintain the top search results position. Many organizations
have a person or team that’s responsible for monitoring the company’s position in search
results and amending bids accordingly. Monitoring positions is crucial to maintaining good placement,
because you do have to fight for your ranking, and PPC programs can become prohibitively
expensive. The competitiveness of the keywords or phrases and the aggressiveness of the competition
determine how much you’ll ultimately end up spending to rank well.
One issue with PPC programs is that many search engines recognize PPC ads as just
that — paid advertisements. Therefore, although your ranking with the search engine
for which you’re purchasing placement might be good, that doesn’t mean your ranking in other
search engines will be good. Sometimes, it’s necessary to run multiple PPC campaigns if you want
to rank high in multiple search engines.
Before you even begin to use a PPC program, you should consider some basics. A very important
point to keep in mind is that just because you’re paying for placement or advertising space associated
with your keywords, you’re not necessarily going to get the best results with all of the keywords
or phrases that you choose. With PPC services you must test, test, and test some more. Begin small,
with a minimum number of keywords, to see how the search engine you’ve selected performs in
terms of the amount of traffic it delivers and how well that traffic converts into paying customers.
An essential part of your testing is having a method in place that allows you to track your return on
investment. For example, if your goal is to bring new subscribers to your newsletter, you’ll want to
track conversions, perhaps by directing the visitors funneled to your site by your PPC link to a subscription
page set up just for them. You can then monitor how many clicks actually result in a goal
conversion (in this case a new subscription). This helps you to quickly track your return on investment
and to determine how much you’re paying for each new subscriber.
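To make that ROI tracking concrete, a minimal sketch of the cost-per-subscriber arithmetic might look like this (all of the figures below are hypothetical, invented for illustration):

```python
def cost_per_conversion(clicks, cost_per_click, conversions):
    """What you are paying for each new subscriber (or other goal
    conversion): total ad spend divided by conversions achieved."""
    return (clicks * cost_per_click) / conversions

# Hypothetical month: 2,000 PPC clicks at $.12 each, 150 new subscribers.
print(f"${cost_per_conversion(2000, 0.12, 150):.2f} per subscriber")
# prints "$1.60 per subscriber"
```

Comparing that figure against what a subscriber is worth to you tells you quickly whether the campaign is paying for itself.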
Before investing in a PPC service, you may want to review a few different services to determine
which is the best one for you. When doing your preliminary research, take the time to ask the
following questions:
How many searches are conducted each month through the search engine for which
you’re considering a PPC program?
Does the search engine have major partners or affiliates that could contribute to the
volume of traffic you’re likely to receive through the PPC program?
How many searches are generated each month by those partners and affiliates?
What exactly are the terms of service for search partners or affiliates?
How does the search engine or PPC program prevent fraudulent activity?
How difficult is it to file a report about fraudulent activity and how quickly is the issue
addressed (and by whom)?
What recourse do you have if fraudulent activity is discovered?
Do you have control over where your listing appears? For example, can you choose
not to have your listing appear in search results for other countries where your site is not
relevant? Or can you choose to have your listing withheld from affiliate searches?
When you’re looking at different PPC programs, look for those that have strict guidelines about
how sites appear in listings, how partners and affiliates calculate visits, and how fraud is handled.
These are important issues, because in each case, you could be stuck paying for clicks that didn’t
actually happen. Be sure to monitor any service that you decide to use to ensure that your PPC
advertisements are both working properly and that they seem to be targeted well.
It often takes a lot of testing, monitoring, and redirection to find the PPC program that works well
for you. Don’t be discouraged or surprised if you find that you must try several different programs
or many different keywords before you find the right combination. But through a system of trial
and error and diligent effort, you’ll find that PPC programs can help build your site traffic and goal
conversions faster than you could without the program.

Sunday, September 26, 2010

More About Keyword Optimization

There is much more to learn about keywords and pay-per-click programs. In the next chapter,
you learn more about how to conduct keyword research, what pay-per-click programs are, and
how to select the right keywords for your web site. But remember, keywords are just tools to
help you improve your search rankings. When designing your site, the site should be designed
to inform, enlighten, or persuade your site visitor to reach a goal conversion.
And that’s truly what SEO is all about. Keywords may be a major component to your SEO strategies,
but SEO is about bringing in more visitors who reach more goal conversions. Without those
conversions, all of the site visitors in the world don’t really mean anything more than that people
dropped by.

Saturday, September 25, 2010

Avoid Keyword Stuffing

Keyword stuffing, mentioned earlier in this chapter, is the practice of loading your web pages with
keywords in an effort to artificially improve your ranking in search engine results. Depending on
the page that you’re trying to stuff, this could mean that you use a specific keyword or keyphrase a
dozen times or hundreds of times.
Temporarily, this might improve your page ranking. However, if it does, the improvement won’t last,
because when the search engine crawler examines your site, it will find the multiple keyword uses.
Search engine crawlers use algorithms to determine whether a keyword is used a reasonable number
of times on a site, so the crawler will discover very quickly that your content can’t support the number
of times you’ve used that keyword or keyphrase. The result will be that your site is either dropped
deeper in the rankings or (and this is what happens in most cases) removed completely
from search engine rankings.
Keyword stuffing is one more black-hat SEO technique that you should avoid. To keep from
inadvertently crossing the keyword stuffing line, you’ll need to do your “due diligence” when
researching your keywords. And you’ll have to use caution when placing keywords on your web
site or in the meta tagging of your site. Use your keywords only the number of times that it’s
absolutely essential. And if it’s not essential, don’t use the word or phrase simply as a tactic to
increase your rankings. Don’t be tempted. The result of that temptation could be the exact opposite
of what you’re trying to achieve.

Friday, September 24, 2010

Taking Advantage of Organic Keywords

We’ve already covered brief information about organic keywords. As you may remember, organic keywords
are those that appear naturally on your web site and contribute to the search engine ranking of
the page. By taking advantage of those organic keywords, you can improve your site rankings without
putting out additional budget dollars. The problem, however, is that gaining organic ranking alone
can take four to six months or longer. To help speed the time it takes to achieve good rankings, many
organizations (or individuals) will use organic keywords in addition to some type of PPC
(pay-per-click) or paid-inclusion service.
To take advantage of organic keywords, you first need to know what those keywords are. One way
to find out is to use a web-site metric application, like the one that Google provides. Some of these
services track the keywords that push users to your site. When viewing the reports associated with
keywords, you can quickly see how your PPC keywords draw traffic, and also which keywords
you’re not investing in that still draw traffic.
Another way to discover what could possibly be organic keywords is to consider the words that
would be associated with your web site, product, or business name. For example, a writer might
include various keywords about the area in which she specializes, but one keyword she won’t necessarily
want to purchase is the word “writer,” which should be naturally occurring on the site.
The word won’t necessarily garner high traffic for you, but when that word is combined with more
specific keywords, perhaps keywords that you acquire through a PPC service, the organic words
can help to push traffic to your site. Going back to our writer example, if the writer specializes in
writing about AJAX, the word writer might be an organic keyword, and AJAX might be a keyword
that the writer bids for in a PPC service.
Now, when potential visitors use a search engine to search for AJAX writer, the writer’s site has a
better chance of being listed higher in the results rankings. Of course, by using more specific terms
related to AJAX in addition to “writer,” the chance is pretty good that the organic keyword combined
with the PPC keywords will improve search rankings. So when you consider organic keywords, think
of words that you might not be willing to spend your budget on, but that could help improve your
search rankings, either alone or when combined with keywords that you are willing to invest in.

Thursday, September 23, 2010

What’s the Right Keyword Density?

Keyword density is hard to quantify. It’s a measurement of the number of times that your keywords
appear on the page, versus the number of words on a page — a ratio, in other words. So if you have a
single web page that has 1,000 words of text and your keyword appears on that page 10 times (assuming
a single keyword, not a keyword phrase), then your keyword density is 1 percent.
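That ratio can be computed with a minimal sketch like the following (single-word keywords only, with naive whitespace tokenization; real pages would need HTML stripped first):

```python
def keyword_density(text, keyword):
    """Occurrences of a single keyword divided by total words on the page."""
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words)

# Mirror the example above: a 1,000-word page on which the keyword
# appears 10 times gives a density of 1 percent.
page = "filler " * 990 + "widget " * 10
print(f"{keyword_density(page, 'widget'):.0%}")  # prints "1%"
```

The same function run over a competitor's page text gives you the comparison point discussed below.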
What’s the right keyword density? That’s a question that no one has been able to answer definitively.
Some experts say that your keyword density should be around five to seven percent, and others suggest
that it should be a higher or lower percentage. But no one seems to agree on exactly where it
should be.
Because there’s no hard and fast rule, or even a good rule of thumb, to dictate keyword density, site
owners are flying on their own. What is certain is that using a keyword or set of keywords or phrases
too often begins to look like keyword stuffing to a search engine, and that can negatively impact the
ranking of your site.
See, there you are. Not enough keyword density and your site ranking suffers. Too much keyword
density and your site ranking suffers. But you can at least find out what keyword density your competition
is using by looking at the source code for their pages.
Looking at your competition’s source code is also a good way to find out which keywords
they’re using. The listed keywords should appear in the first few lines of code
(indicated in Figures 4-3 and 4-5).
To view the source code of a page if you’re using Internet Explorer, follow these steps:
1. Open Internet Explorer and navigate to the page for which you would like to view the
source code.
2. Click View in the standard toolbar. (In Internet Explorer 7.0, select Page.) The View (or
Page) menu appears, as shown in Figure 4-2.



3. Select View Source and a separate window opens to display the source code from the
web page you’re viewing, as shown in Figure 4-3.
If you’re using the Firefox browser, the menus are slightly different, and the source code looks a
little different. These are the steps for Firefox:
1. Open Firefox and navigate to the page for which you would like to view the source code.
2. Click View in the standard toolbar. The View menu appears, as shown in Figure 4-4.
3. Select Page Source to open a separate window that displays the source code for the
web page, as shown in Figure 4-5. Alternatively, you can use the keyboard combination
Ctrl + U to open the source code window.
You may notice that the source code looks a little different in Internet Explorer than it does in
Firefox. The differences are noticeable, but the basic information is all there. That said, it’s not
easy to get through this information. All the text of the page is jumbled in with the page encoding.
It may take some time to decipher, but ultimately, this is the best way to find out not only
what keywords the competition is using, but also how they’re using them, and how often the
keywords appear on their pages.





Wednesday, September 22, 2010

Picking the Right Keywords

Keywords are really the cornerstone of any SEO program. Your keywords play a large part in determining
where in search rankings you’ll land, and they also mean the difference between a user’s finding
your page and not. So when you’re picking keywords, you want to be sure that you’ve selected the
right ones. The only problem is, how do you know what’s right and what’s not?

Selecting the right keywords for your site means the difference between being a nobody on the Web
and being the first site users click to when they perform a search. You’ll be looking at two types of
keywords. The first is brand keywords. These are keywords associated with your brand. It seems
pretty obvious that these keywords are important; however, many people don’t think they need to
pay attention to them, because they’re already tied to the site. Not true. Brand keywords are just
as essential as the second type, generic keywords.

Generic keywords are all the other keywords that are not directly associated with your company
brand. So if your web site, TeenFashions.com, sells teen clothing, then keywords such as clothing,
tank tops, cargo pants, and bathing suits might be generic keywords that you could use on your site.

Going back to brand keywords for a moment, if you don’t use the keywords contained in your business
name, business description, and specific category of business, you’re missing out. It seems as if
it wouldn’t be necessary to use these keywords, because they’re already associated with your site, but
if you don’t own them, who will? And if someone else owns them, how will they use those keywords
to direct traffic away from your site?
Before we go too much further in this description of keywords and how to choose the right one,
you should know that keywords fall into two other categories: those keywords you pay a fee to
use (called pay-per-click), and those naturally occurring keywords that just seem to work for you
without the need to pay someone to ensure they appear in search results (these are called organic
keywords).
When you think about purchasing keywords, these fall into the pay-per-click category. When you
stumble upon a keyword that works, that falls into the organic category.
When you begin considering the keywords that you’ll use on your site, the best place to start brainstorming
is with keywords that apply to your business. Every business has its own set of buzzwords
that people think about when they think about that business or the products or services related to
the business. Start your brainstorming session with words and phrases that are broad in scope, but
may not bring great search results. Then narrow down your selections to more specific words and
phrases, which will bring highly targeted traffic. Table 4-1 shows how broad keywords compare to
specific key phrases. Chapter 5 contains more specifics about choosing the right keywords and key phrases. The
principle for choosing keywords is the same, whether the words you're using are in PPC programs
or they occur organically, so all of the elements of keyword selection for both categories
are covered there.
When you’re considering the words that you’ll use for keywords, also consider phrases
of two to three words. Because key phrases can be much more specific than single words,
it’s easier to rank high with a key phrase than with a keyword. Key phrases are used in the same ways
as keywords, they’re just longer.

Tuesday, September 21, 2010

Using Anchor Text

Using Anchor Text
Anchor text — the linked text that is often included on web sites — is another of those keyword anomalies that you should understand. Anchor text, shown in Figure 4-1, usually appears as an underlined or alternately colored word (usually blue) on a web page that links to another page, either inside the same web site or on a different web site.

What’s important about anchor text is that it allows you to get double mileage from your keywords.
When a search engine crawler reads the anchor text on your site, it sees the links that are embedded
in the text. Those links tell the crawler what your site is all about. So, if you’re using your
keywords in your anchor text (and you should be), you’re going to be hitting both the keyword
ranking and the anchor text ranking for the keywords that you’ve selected.
Of course, there are always exceptions to the rule. In fact, everything in SEO has these, and with
anchor text the exception is that you can over-optimize your site, which might cause search engines to reduce your ranking or even block you from the search results altogether. Over-optimization
occurs when all the anchor text on your web site is exactly the same as your keywords, but there is
no variation or use of related terminology in the anchor text.
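One rough way to spot this problem on your own site is to measure how much your anchor texts vary. The metric below is an invented illustration, not anything a search engine publishes, but a value near zero is a warning sign.

```python
from collections import Counter

def anchor_text_diversity(anchors):
    """Fraction of anchor texts that differ from the single most common one.
    A value near 0 means nearly every link uses identical text."""
    if not anchors:
        return 0.0
    counts = Counter(a.lower() for a in anchors)
    top = counts.most_common(1)[0][1]
    return 1 - top / len(anchors)

# Hypothetical anchor texts gathered from two sites
risky = ["cheap soap"] * 9 + ["chamomile soap"]
varied = ["cheap soap", "chamomile soap", "bath products",
          "luxury soap", "our soap shop"]

print(anchor_text_diversity(risky))    # ~0.1, almost all identical
print(anchor_text_diversity(varied))   # 0.8, healthy variation
```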
Sometimes, web-site owners will intentionally include only a word or a phrase in all their anchor
text with the specific intent of ranking high on a Google search. It’s usually an obscure word or phrase
that not everyone is using, and ranking highly gives them the ability to say they rank number one for
whatever topic their site covers. It’s not really true, but it’s also not really a lie. This is called Google
bombing. However, Google has caught on to this practice and has introduced a new algorithm that
reduces the number of false rankings that are accomplished by using anchor text in this way.
The other half of anchor text is the links that are actually embedded in the keywords and phrases
used on the web page. Those links are equally as important as the text to which they are anchored.
The crawler will follow the links as part of crawling your site. If they lead to related web sites, your
ranking will be higher than if the links lead to completely unrelated web sites.
These links can also lead to other pages within your own web site, as you may have seen anchor text
in blog entries do. The blog writer uses anchor text, containing keywords, to link back to previous
posts or articles elsewhere on the site. And one other place that you may find anchor text is in your
site map. This was covered in Chapter 3, but as a brief reminder, naming your pages using keywords
when possible will help improve your site rankings. Then to have those page names (which are keywords)
on your site map is another way to boost your rankings and thus your traffic — remember
that a site map is a representation of your site with each page listed as a name, linked to that page.
Anchor text seems completely unrelated to keywords, but in truth, it’s very closely related. When
used properly in combination with your keywords, your anchor text can help you achieve a much
higher search engine ranking.

Monday, September 20, 2010

The Ever-Elusive Algorithm

The Ever-Elusive Algorithm
One element of search marketing that has many people scratching their heads in confusion is the
algorithm that actually determines what the rank of a page should be. These algorithms are proprietary
in nature, and so few people outside the search engine companies have seen them in their
entirety. Even if you were to see an algorithm, you'd have to be a math wizard to understand it. And
that’s what makes figuring out the whole concept of optimizing for search engines so difficult.
To put it as plainly as possible, the algorithm that a search engine uses establishes a baseline to
which all web pages are compared. The baseline varies from search engine to search engine. For
example, more than 200 factors are used to establish a baseline in the Google algorithm. And
though people have figured out some of the primary parts of the algorithm, there’s just no way to
know all of the parts, especially when you realize that Google makes about half a dozen changes to
that algorithm each week. Some of those changes are major, others are minor. But all make the algorithm
a dynamic force to be reckoned with.
Knowing that, when creating your web site (or updating it for SEO), you can keep a few design principles
in mind. And the most important of those principles is to design your web site for people, not
for search engines. So if you’re building a site about springtime vacations, you’ll want to include
information and links to help users plan their springtime vacations.
Then if a crawler examines your site and finds links to airfare sites, festival sites, garden shows,
and other related sites, it can follow those links, using the algorithm to determine whether they are
related, and your site will rank higher than if all the links lead to completely unrelated sites. (If they do, that
tells the crawler you've set up a bogus link farm, and it will either rank your site very low or not at all.)
The magic number of how many links must be related and how many can be unrelated is just that, a
magic number. Presumably, however, if you design a web page about springtime vacations and it’s
legitimate, all the links from that page (or to the page) will be related in some way or another. The
exception might be advertisements, which are clearly marked as advertisements. Another exception
is if all your links are advertisements that lead to someplace unrelated to the topic (represented by
keywords) at hand. You probably wouldn’t want to have a site that only had links from advertisements,
though, because this would likely decrease your search engine ranking.
The same is true of keywords. Some search engines prefer that you use a higher keyword density
than others. For all search engines, content is important, but the factors that determine whether or
not the content helps or hurts your ranking differ from one search engine to another. And then there
are meta tags, which are also weighted differently by search engines.
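Keyword density is simple to measure even if no engine publishes its preferred value. The sketch below is only a measuring stick, counting how many of a page's words belong to occurrences of a given phrase; the sample text is invented.

```python
import re

def keyword_density(text, keyword):
    """Percent of the page's words taken up by occurrences of a keyword or phrase."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    n = len(phrase)
    hits = sum(words[i:i + n] == phrase for i in range(len(words) - n + 1))
    return 100 * hits * n / len(words)

body = ("Plan springtime vacations early. Our springtime vacations guide "
        "covers festivals, gardens, and travel deals for every budget.")
print(round(keyword_density(body, "springtime vacations"), 1))   # 23.5
```

A density that high on a real page would likely read as keyword stuffing; the point is only that the number is easy to track as you edit your copy.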
So this mysterious baseline that we’re talking about will vary from search engine to search engine.
Some search engines look more closely at links than others do, some look at keywords and context,
some look at meta data, but most combine more than one of those elements in some magic ratio that
is completely proprietary.
What that means for you is that if you design your web site for search engines, you’ll always be playing
a vicious game of cat and mouse. But if you design your web site for people, and make the site
as useful as possible for the people who will visit the site, you’ll probably always remain in all of the
search engines’ good graces.
Of course, most of these heuristics apply more specifically to web-site design and less specifically to
keywords and SEO. However, because SEO really should be part of overall site usability, these are
important principles to keep in mind when you’re designing your web site and implementing your
keyword strategies. As mentioned previously, don’t design your web site for SEO. Instead, build it
for users, with SEO as an added strategy for gaining exposure. Always keep the user in mind first,
though. Because if users won’t come to your site, or won’t stay on your site once they’re there, there’s
no point in all the SEO efforts you’re putting into it.

Sunday, September 19, 2010

Understanding Heuristics

Understanding Heuristics
If you’re going to maintain a web site with the best search engine optimization possible, you will
have to be familiar with “heuristics.” This is simply a term for recognizing a pattern and being able
to solve a problem or come to a conclusion quickly and efficiently by consulting what you already
know about that particular pattern.
In other words, using heuristics is a way to solve a problem, although it’s not always the most accurate
way. Heuristics are important in search engine optimization because they allow for variations
in the way that users search for a particular keyword or keyphrase. Because a combination of factors
must come together to create a ranking for your web site, heuristics make it possible for some,
but not all, of those factors to be present.
The Greeks had a word for it. The root of the adjective “heuristic” comes from their
term for “invent” or “discover.” “Heuristics” has come to mean a way of education or
computer programming that proceeds by experiment or observation, rather than theory, and sometimes
employs “rules of thumb” to find solutions or answers. We all act “heuristically” every day.
An example: Let’s say you run a travel-planning web site. If a web user is searching for “springtime
vacations,” a search engine crawler will visit many sites, with varying keywords, keyword placement,
and keyword density. In effect, it will give each a score, calculated on a baseline for relevance. It may
find one site with the phrase “some writers get their best ideas in springtime or while on vacation.”
But it won’t score that site high, because it doesn’t meet baseline criteria very well. The keywords are
separated and the context is wrong. Also, links from that site are unlikely to support the idea of planning
a springtime vacation. The search engine likes your travel-planning web site better, because it
has a lot to say about “springtime vacations.”
But the crawler doesn’t stop with your site, and it doesn’t look just at the words in your links,
although it helps if those say “springtime” and “vacation,” not something vague like “trips.” But the
crawler will actually go to your links to see if they’re really helpful for the user who wants something
about “springtime vacations.” If your links are irrelevant to that, the crawler may decide you’re running
a “link farm,” designed to catch its attention without really delivering. But if a high percentage
of your links really are related to springtime vacationing — travel information, garden shows, trips to
tulip festivals — then the crawler may score you high and put your site high on the list it compiles
for the user. That user, after all, is the crawler’s customer — and also, you hope, yours.
The crawler has operated “heuristically,” making its best judgments at each stage of the process.
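In miniature, that judgment might be sketched as a scoring function. The weights below are invented for illustration; real engines combine hundreds of factors and publish none of them. The point is simply that an intact, in-context phrase outscores the same words scattered through unrelated text.

```python
def relevance_score(text, phrase):
    """Toy score: an intact phrase match counts heavily; the same words
    scattered out of context count only a little. Weights are invented."""
    t = text.lower()
    score = 10 * t.count(phrase.lower())           # phrase intact, right context
    score += sum(t.count(w) for w in phrase.lower().split())
    return score

travel = "Springtime vacations made easy: book your springtime vacations here."
writer = "Some writers get their best ideas in springtime or while on vacation."

print(relevance_score(travel, "springtime vacations"))   # 24
print(relevance_score(writer, "springtime vacations"))   # 1
```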
Keywords apply to heuristics because they provide the pattern by which a problem (that is, the
search) is solved. Why do you need to know all of this? Because understanding the pattern by
which your site is ranked will help you to understand just how important it is to properly choose
and place keywords that will improve your search engine ranking. Think of it (as in the preceding Tip) as a rule of thumb. Heuristics provides a working guideline
by which a search term is ranked. However, it’s important to remember that rankings are achieved
through a complex combination of factors, not all of which are completely predictable. So, these
guidelines are just that — but they help you set a standard for how you plan to use keywords.
Heuristics for web-site usability were first established by Jakob Nielsen in 1990. At the time, he developed
a list of ten items that, when included in web-site design, would make the site more usable for
individuals. In 1994, Nielsen updated that list of heuristics so that it now includes the following items:
Visibility of system status: This principle says that the user should always know what’s
going on through feedback from the system that’s provided in a timely manner.
Match between the system and the real world: According to this, the system should
speak the user’s language. This means that keywords, phrases, and concepts should be
used in a way that is familiar to the user and not be just technical or marketing buzzwords.
User control and freedom: This principle says that users often mistakenly make choices
they don’t really want. For that reason, it’s essential to have the ability to undo or redo an
action. A good example of this is having back and forward buttons in a web browser.
Consistency and standards: Each time users click a button or see a word, they should
not have to wonder what that action or word means. Consistency and standards apply to
both languages and actions, and should be predictable across the Internet.
Error prevention: Users are frustrated by errors. Therefore, you should design your
site with the prevention of errors in mind. However, if there is a place where users might
encounter an error, using a confirmation system is recommended.
Recognition rather than recall: Don’t make users remember things from one screen or
dialog to another. Instead, create your pages with clearly visible instructions, actions, and
objects. If you must create an element that requires additional instructions, make those
instructions easy to access and clearly mark them as instructions.
Flexibility and efficiency of use: This principle applies to both novice users and experienced
users of your site. According to this rule, your site should apply to both groups of
users by providing customizable actions.
Aesthetic and minimalist design: Remember the adage KISS (Keep It Simple, Stupid)?
Well, your users may not be stupid, but they still want you to keep your site as simple as
possible. If your products, services, or information are complicated to locate, you’ll lose
site visitors very quickly. They’ll go to a site where it’s easy to find what they’re looking for.
Help users recognize, diagnose, and recover from errors: Users want error messages
that help them navigate through and correct the error as quickly as possible. Make sure that
error messages aren’t cryptic, and provide clear, easy-to-follow instructions.
Help and documentation: It’s always best not to have to refer users to help and documentation
files. But there are some instances when you must. If that’s the case for your site, be
sure your help and documentation files are easy to navigate and written in a clear, understandable
language.

Saturday, September 18, 2010

Keywords and Your Web Site

Keywords. That’s a term you hear associated with search engine optimization
all the time. In fact, it’s very rare that you hear anything
about SEO in which keywords aren’t involved some way. So, what’s
so special about keywords?
Simply put, keywords are those words used to catalog, index, and find your
web site. But of course, it’s not nearly as simple as it sounds. There is a fine
science to finding and using the right keywords on your web site to improve
your site’s ranking. In fact, an entire industry has been built around keywords
and their usage. Consultants spend countless hours finding and applying the
right keywords for their customers, and those who design web sites with SEO
in mind also agonize over finding just the right ones.
Using popular — and effective — keywords on your web site will help to assure
that it will be visible in the search engine results instead of being buried under
thousands of other web-site results. There are keyword research tools that can
help you find the exact keywords to use for your site and therefore for your
search engine optimization. Understanding the use of keywords, where to find
them, which ones to use, and the best ways to use them allows you to have a
highly visible and successful web site.
The Importance of Keywords
Basically, keywords capture the essence of your web site. Keywords are what a
potential visitor to your site puts into a search engine to find web sites related
to a specific subject, and the keywords that you choose will be used throughout
your optimization process. As a small-business owner, you will want your
web site to be readily visible when those search engine results come back.
Using the correct keywords in your web-site content can mean the difference between coming
back in search engine results as one of the first 20 web sites (which is optimal) and being buried under
other web sites several pages into the results (meaning hundreds of results were returned before
your site). Studies show that searchers rarely go past the second page of search results when looking
for something online.
Take into consideration for a moment the telephone book Yellow Pages. Say you’re looking for a
restaurant. The first thing you’re going to do is find the heading restaurant, which would be your
keyword. Unfortunately, even in a smaller city, there might be a page or more of restaurants to
look through. However, if you narrow your search to Chinese restaurants, that’s going to cut
in half your time searching for just the right one. Basically, that’s how keywords work in search
engines and search engine optimization. Choosing the appropriate keywords for your web site
will improve your search engine rankings and lead more search engine users to your site.
How do you know which keywords to use? Where do you find them? How do you use them? The
answer to these questions will save you a great deal of time when creating a web site. Where you rank
in search engine results will be determined by what keywords are used and how they are positioned
on your web site. It’s critical to choose appropriate keywords, include variations of those keywords,
avoid common (or “stop”) words, and know where and how many times to place them throughout
your web site.
Used correctly, keywords will allow you to be placed in the first page or two of the most popular
search engines. This tremendously increases the traffic that visits your web site. Keep in mind, the
majority of Internet users find new web sites through use of a search engine. High search engine
rankings can be as effective, if not more effective, than paid ads for publicity of your business. The
business you receive from search engine rankings will also be more targeted to your services than it
would be with a blanket ad. By using the right keywords, your customer base will consist of people
who set out to find exactly what your site has to offer, and those customers will be more likely to
visit you repeatedly in the future.
To decide which keywords should be used on your web site, you can start by asking yourself the
most simple, but relevant, question. Who needs the services that you offer? It’s an elementary
question, but one that will be most important in searching for the correct keywords and having
the best search engine optimization. If you’re marketing specialty soaps, you will want to use
words such as soap (which really is too broad a term), specialty soap, bath products, luxury bath
products, or other such words that come to mind when you think of your product. It’s also
important to remember to use words that real people use when talking about your products.
For example, using the term “cleaning supplies” as a keyword will probably not result in a good
ranking because people thinking of personal cleanliness don’t search for “cleaning supplies.”
They search for “soap” or something even more specific, like “chamomile soap.”
In addition to the terms that you think of, people also will look for web sites using variations of words
and phrases — including misspellings. It might help to have friends and family members make suggestions
of what wording they would use to find a similar product and include those words in your keyword
research as well as misspellings of those words. An example might be “chamomile.” Some people
may incorrectly spell it “chammomile,” so including that spelling in your keywords can increase your
chance of reaching those searchers. Also remember to use capitalized and plural keywords. The more specific the words are, the better the chance will be that your web site is targeted. Just remember that
words such as “a,” “an,” “the,” “and,” “or,” and “but” are called stop words. These words are so common
they are of no use as keywords.
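Gathering those variations by hand gets tedious, so a small helper can expand a seed list for you. The misspelling table below is a stand-in for the suggestions you would actually collect from friends, family, and customers.

```python
STOP_WORDS = {"a", "an", "the", "and", "or", "but"}

# Stand-in for the misspellings you might gather from friends and family
COMMON_MISSPELLINGS = {"chamomile": ["chammomile", "camomile"]}

def expand_keywords(seeds):
    """Build a candidate keyword list with naive plurals and known
    misspellings, dropping stop words, which are too common to help."""
    out = []
    for word in seeds:
        if word.lower() in STOP_WORDS:
            continue
        out.append(word)
        out.append(word + "s")                       # naive pluralization
        out.extend(COMMON_MISSPELLINGS.get(word, []))
    return out

print(expand_keywords(["chamomile", "soap", "and"]))
# ['chamomile', 'chamomiles', 'chammomile', 'camomile', 'soap', 'soaps']
```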

Friday, August 27, 2010

SEO Strategies

SEO is hard work. It takes much effort to optimize just the
right elements of your web site so search engines will
not only find you, but will also index your site so that it
appears high in search query results. And all of that effort must
be attended to by you. There are currently no tools that will
put all of the elements of SEO in place for you.
Instead, you have to build your web site with SEO in mind,
choose all the right keywords, and use them in the right places
and balance on your site, determine if pay-per-click and paid-inclusion
programs are for you, use the right meta tags in the
right places, create great content, and add all the right links.
Sounds like a lot of work, doesn’t it?
It is. But don’t let the amount of work overwhelm you. Consistent
effort and the strategies included in this part of the book will have
you working toward your SEO goals in no time. Each of the chapters
in this section contains an explanation of how these elements
affect SEO, and how you can create and implement strategies to
help you leverage that element to reach your SEO goals.
Search engine optimization is a collection of strategies that improve the
level at which your web site is ranked in the results returned when a
user searches for a key word or phrase.
By now, that’s a definition you should be pretty familiar with. What you probably
don’t know (yet) is how to achieve SEO. You can’t do it all at once. Instead,
SEO has to happen in stages. If you try to implement too many strategies at
one time, two things are going to happen.
First, you won’t be able to tell which of your efforts are successful. Imple -
menting one strategy at a time makes it possible for you to pinpoint which
strategies are working and which are not.
Second, when you try to implement too many strategies at one time, your
efforts — even the successful ones — could be lost in the shuffle. It’s like having
too many children running around the house on the weekend. If you’re
not paying complete attention to all of them (and that’s virtually impossible),
at least one is bound to get into something.
SEO is most successful when you concentrate on one effort at a time. A great
place to start concentrating is on the way your site is built. One of the first
things that attracts a search engine crawler is the actual design of your site.
Tags, links, navigational structure, and content are just a few of the elements
that catch crawlers’ attention.
Before You Build Your Site
One of the most common misconceptions about SEO is that it should be implemented after a web
site has been built. It can be, but it’s much harder. A better option is to consider SEO even before
you begin to build your web site, if that’s at all possible. It may not be. But if that’s the case, you
can still implement SEO strategies in the design of your site; it will just require a lot more work
than building it in at the beginning.
Know your target
Before you even start contemplating how to build your web site, you should know in what types
of search engines it’s most important for your site to be ranked. Search engines are divided into
several types, beyond the primary, secondary, and targeted search engines that you learned about
in Chapter 2. In addition, search engine types are determined by how information is entered into
the index or catalog that’s used to return search results. The three types of search engines are:
Crawler-based engines: To this point, the search engines discussed fall largely into this
category. A crawler-based search engine (like Google) uses an automated software agent
(called a crawler) to visit, read, and index web sites. All the information collected by the
crawler is returned to a central repository. This is called indexing. It is from this index
that search engine results are pulled. Crawler-based search engines revisit web pages
periodically in a time frame determined by the search engine administrator.
Human-powered engines: Human-powered search engines rely on people to submit
the information that is indexed and later returned as search results. Sometimes, human-powered
search engines are called directories. Yahoo! is a good example of what, at one
time, was a human-powered search engine. Yahoo! started as a favorites list belonging to
two people who needed an easier way to share their favorite web sites. Over time, Yahoo!
took on a life of its own. It’s no longer completely human-controlled. A newer search
engine called Mahalo (www.mahalo.com) is entirely human-powered, however, and it’s
creating a buzz on the Web.
Hybrid engines: A hybrid search engine is not entirely populated by a web crawler, nor
entirely by human submission. A hybrid is a combination of the two. In a hybrid engine,
people can manually submit their web sites for inclusion in search results, but there is also
a web crawler that monitors the Web for sites to include. Most search engines today fall
into the hybrid category to at least some degree. Although many are mostly populated by
crawlers, others have some method by which people can enter their web site information.
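The crawler-based model described above can be sketched in a few lines: visit pages, record their words in a central index, and follow their links to find more pages. This is a toy illustration only; the three-page "web" is invented, and real crawlers handle politeness, scheduling, and scale that this ignores.

```python
# A tiny in-memory "web": page -> (body text, outbound links). All invented.
WEB = {
    "home":  ("teen clothing and tank tops", ["tops", "pants"]),
    "tops":  ("tank tops for teens", ["home"]),
    "pants": ("cargo pants catalog", ["home"]),
}

def crawl(start):
    """Visit pages breadth-first from `start` and build an inverted index
    (word -> pages containing it), the crawler's central repository."""
    index, queue, seen = {}, [start], set()
    while queue:
        page = queue.pop(0)
        if page in seen:
            continue
        seen.add(page)
        text, links = WEB[page]
        for word in text.split():
            index.setdefault(word, set()).add(page)
        queue.extend(links)
    return index

index = crawl("home")
print(sorted(index["tops"]))   # ['home', 'tops']
```

Search results are then pulled from that index rather than from the live Web, which is why a page must be crawled before it can rank.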
It’s important to understand these distinctions, because how your site ends up indexed by a search
engine may have some bearing on when it’s indexed. For example, fully automated search engines
that use web crawlers might index your site weeks (or even months) before a human-powered search
engine. The reason is simple. The web crawler is an automated application. The human-powered
search engine may actually require that all entries be reviewed for accuracy before a site is included
in search results. In all cases, the accuracy of search engine results will vary according to the search query that is used.
For example, entries in a human-powered search engine might be more technically accurate, but the
search query that is used will determine if the desired results are returned.
Page elements
Another facet of SEO to consider before you build your web site is the elements needed to ensure
that your site is properly indexed by a search engine. Each search engine places differing importance
on different page elements. For example, Google is a very keyword-driven search engine; however, it
also looks at site popularity and at the tags and links on any given page.
How well your site performs in a search engine is determined by how the elements of your page
meet the engine’s search criteria. The main criteria that every search engine looks for are the site
text (meaning keywords), tags — both HTML and meta tags — site links, and the site popularity.
Text
Text is one of the most important elements of any web site. Of particular importance are the keywords
within the text on a page, where those keywords appear, and how often they appear. This
is why keyword marketing has become such a large industry in a relatively short time. Your keywords
make all the difference when a search engine indexes your site and then serves it up in
search results.
Keywords must match the words and phrases that potential visitors will use when searching for
your site (or for the topic or product that’s listed on your site). To ensure that your keywords are
effective, you’ll need to spend some time learning which keywords work best for your site. That
means doing keyword research (which you learn more about in a later chapter).
Tags
In search engine optimization, two kinds of tags are important on your web site: meta tags and
HTML tags. Technically, meta tags are HTML tags; they just appear in very specific places. The
two most important meta tags are the keyword tag and the description tag.
The keyword tag occurs at the point where you list the keywords that apply to your web site. A
keyword tag on a search engine optimization page might look something like this:
<meta name="keywords" content="search engine optimization, SEO, keywords, meta tags">
The description tag gives a short description of your page. Such a tag for the search engine optimization
page might look like this:
<meta name="description" content="Strategies for optimizing your web site for search engines.">
Not all search engines take meta tags into consideration. For that reason, your site should use both
meta tags and other HTML tags. Some of the other HTML tags that you should include on your web
site are the title tag, the top (or H1) heading tags, and the anchor tags.
The title tag is the tag that’s used in the title of your web site. This tag will appear like this:
<title>Your Title Here</title>
Once you’ve tagged your site with a title tag, when a user pulls the site up, the title that you entered
will appear at the very top of the page if the user is using an Internet Explorer browser (IE) earlier
than IE7, as shown in Figure 3-1. In IE7 and the Firefox browser, the title will appear on the browser
tab, shown in Figures 3-2 and 3-3.
High-level headings (H1s) are also important when a crawler examines your web site. Your keywords
should appear in your H1 headings, and in the HTML tags you use to create those headings. An H1
tag might look like this:
<h1>High-Level Heading</h1>

Anchor tags are used to create links to other pages. An anchor tag can point users to another web
page, a file on the Web, or even an image or sound file. You’re probably most familiar with the anchor
tags used to create links to other web sites. Here’s what an anchor tag might look like:
<a href="http://www.example.com/">Text for link</a>
Other criteria to consider
In addition to the four main elements you should plan to include on your site, there are a few others.
For example, the body text on your web site will be examined by the crawler that indexes your site.
Body text should contain enough keywords to gain the attention of the crawler, but not so many that
it seems the site is being "stuffed" with such words.
Alternative tags for pictures and links are also important. These are the tags that might appear as a
brief description of a picture or graphic on a web site that fails to display properly. The alternative
tags — called alt tags — display a text description of the graphic or picture, so that even if the actual
image doesn’t appear, there’s some explanation of what should be there. Alt tags are a good place to
include additional keywords.
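An alt tag of the kind described might look like this (the file name and wording are invented):

```html
<!-- If the image fails to load, the alt text displays in its place, and
     it gives the crawler a keyword-bearing description to index -->
<img src="grater-photo.jpg" alt="Stainless-steel nutmeg grater" />
```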
Understanding Web-Site Optimization
Web-site optimization is all about creating a site that is discoverable by search engines and search
directories. It sounds simple enough, but there are many aspects of site optimization to consider, and
not all of them are about the keywords, links, or HTML tagging of your site.
Does hosting matter?
That question comes up frequently when a company or individual is designing a web site. Does it
matter who hosts your site? Not in itself, but that's not to say that domain hosting is unimportant.
Elements of the hosting arrangement have a major impact on how your site ranks in search results.
One of the biggest issues that you’ll face with domain hosting is the location of your hosting company.
If you’re in the United States and you purchase a domain that is hosted on a server in England,
your search engine rankings will suffer. Crawlers will read your site's geographic location as contradicting
your actual location. Because many search engines serve up results with some element
of geographical location included, this contradiction could be enough to affect your ranking.
The length of time for which you register your domain name could also affect your search engine
ranking. Many hackers use throwaway domains, or domain names that are registered for no more
than a year, because they usually don’t even get to use the domain for a full year before they are shut
down. For this reason some search engines have implemented ranking criteria that give priority to
domains registered for longer periods. A longer registration also shows a commitment to maintaining
the web site.
Domain-naming tips
The question of what to name a web site is always a big one. When selecting a name, most people
think in terms of their business name, personal name, or a word or phrase that has meaning for
them. What they don’t think about is how that name will work for the site’s SEO. Does the name
have anything at all to do with the site, or is it completely unrelated?
Have you ever wondered why a company might be willing to pay millions of dollars for a domain
name? The domain name business.com was purchased for $7.5 million in 1999, and was
recently thought to be valued at more than $300 million. Casino.com went for $5.5 million and
worldwideweb.com sold for $3.5 million. What’s so important about a name?
Where SEO is concerned, the name of your web site is as important as many of the other SEO elements
that you concentrate on. Try this test. Use your favorite search engine to search for a topic,
perhaps “asphalt-paving business.” When your search results are returned, look at the top five results.
Most of the time, a web site containing those words will be returned in those top five results, and it
will often be in the number one slot.
So, if your company name is ABC Company, but your business is selling nutmeg graters, consider
purchasing the domain name NutmegGraters.com instead of ABCCompany.com — ABC Company
may not get you to the top of search rankings, but the very specific nature of your product probably
will. And both the content of your site and your domain name will attract crawlers in the way you
want. Using a domain name containing a keyword from your content usually improves your site
ranking.
A few more things that you should keep in mind when you’re determining your domain name
include:
Keep the name as short as possible. Too many characters in a name mean increased potential
for misspellings. It also means that your site address will be much harder for users to
remember unless it’s something really startling.
Avoid dashes, underscores, and other meaningless characters. If the domain name that
you’re looking for is taken, don’t just add a random piece of punctuation or numerology
to the name to “get close.” Close doesn’t count here. Instead, try to find another word
that’s relevant, and possibly included in the list of keywords you’ll be using. For
example, instead of purchasing www.yourwebsite2.com, try to find something
like www.yoursitesubject.com.
Opt for a .com name whenever possible. There are lots of domain extensions to choose
from: info, biz, us, tv, names, jobs. However, if the .com version of your chosen domain
name is available, that’s always the best choice. Users tend to think in terms of .com, and
any other extension will be hard for them to remember. Dot-com names also tend to receive
higher rankings in search engines than web sites using other extensions. So if your competition
has www.yoursite.com and you choose to use www.yoursite.biz, chances
are the competition will rank higher in search results than you.
Again, it’s important to realize that domain naming is only one facet of SEO strategy. It won’t make
or break your SEO, but it can have some effect. So take the time to think about the name you plan
to register for your site. If you can use a name that not only reaches your audience but also lands you
a little higher in search results, then by all means purchase it. But if no name really seems to work in the SEO strategy for your site, don’t get discouraged. You can make up for any domain-naming issues
by implementing solid keyword strategies, tagging strategies, and other elements of SEO.
Understanding usability
Usability. It means different things to different web site designers. It’s also been at the top of every
user’s requirements list since the Web became part of daily life. When users click through to your
web site from a search results page, they want the site to work for them. That means they want to be
able to find what they’re looking for, to navigate from place to place, and to be able to load pages
quickly, without any difficulties.
Web-site users are impatient. They don’t like to wait for pages to load, they don’t want to deal with
Flash graphics or JavaScript, and they don’t want to be lost. These are all elements of usability —
how the user navigates through and uses your web site. And yes, usability has an impact on SEO,
especially through your site links and loading times.
When a search engine crawler comes to your site, it crawls through the site, looking at keywords,
links, contextual clues, meta and HTML tags, and a whole host of other elements. The crawler will
move from page to page, indexing what it finds for inclusion in search results. But if that crawler
reaches the first page and can’t get past the fancy Flash you’ve created, or if it gets into the site and
finds links that don’t work or that lead to unexpected locations, it will recognize this and make note
of it in the indexed site data. That can damage your search engine rankings.
Navigation knowledge
When you consider web-site navigation, there are two types: internal navigation and external navigation.
Internal navigation involves the links that move users from one page to another on your site.
External navigation refers to links that take users away from your page. For your navigation to be
SEO-friendly, you have to use both types of navigation carefully.
Look at a number of different high-ranking web sites. How is the navigation of those sites designed?
In most cases, you’ll find that the top sites have a left-hand navigation bar that’s often text-based, and
some have a button-based navigation bar across the top of the page. Few have just buttons down the
left side, and all of them have text links somewhere in the landing page.
The navigation for many sites looks the same, because this plan works. Having a text-based navigation
bar on the left works for SEO because it allows you to use anchor tags with the keywords you’re
using for the site. It also allows crawlers to move from one page to another with ease.
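A left-hand, text-based navigation bar of the kind described can be nothing more than a short list of anchor tags whose link text carries the site's keywords. A minimal sketch, with invented page names:

```html
<!-- Plain text links are easy for a crawler to follow, and the anchor
     text puts keywords exactly where the crawler reads them -->
<div id="navigation">
  <a href="index.html">Nutmeg Graters Home</a>
  <a href="handmade-graters.html">Handmade Nutmeg Graters</a>
  <a href="antique-graters.html">Antique Nutmeg Graters</a>
  <a href="contact.html">Contact Us</a>
</div>
```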
Buttons are harder for crawlers to navigate, and depending on the code in which those buttons are
designed, they might be completely invisible to the crawler. That’s why many companies that put
button-based links at the top of the page also usually include a text-based navigation bar on the
left. The crawler can still move from page to page, but the user is happy with the design of the site.
The other element you see on nearly every page is text-based links within the content of the page.
Again, those links are usually created with anchor tags that include the keywords the site is using
to build site ranking. This is an effective way to gain site ranking. The crawler comes into the site,
examines the linking system, examines the content of the page, compares these items, and finds
that the links are relevant to the content, which is relevant to the keywords. That’s how your ranking
is determined. Every element works together.
Take the time to design a navigational structure that’s not only comfortable for your users, but is also
crawler-friendly. If it can’t always be perfect for the crawlers, make sure it’s perfect for users. Again,
SEO is influenced by many different things, but return visits from users are the ultimate goal. This
may mean that you have to test your site structure and navigation with a user group and change it
a few times before you find a method that works both for returning users and for the crawlers that
help to bring you new users. Do those tests. That’s the only way you’ll learn what works.
Usability considerations
It’s not always possible to please both your site users and the crawlers that determine your page
ranking. It is possible, however, to work around problems. Of course, the needs of users come first
because once you get them to your site you want them to come back. On the Internet, it’s extremely
easy for users to surf away from your site and never look back. And returning visits can make or
break your site.
But the catch is that in order to build returning visitors, you have to build new visitors, which is the
purpose of SEO. That means you need search engines to take notice of your site.
When it seems that users’ preferences are contrary to crawlers’ preferences, there is a solution.
It’s a site map. And there are two types of which you should be aware. A basic site map is an
overview of the navigational structure of your web site. It’s usually text-based, and it’s nothing
more than an overview that includes links to all of the pages in your web site. Crawlers love site
maps. You should, too.
A site map allows you to outline the navigational structure of your web site, down to the second or
third level of depth, using text-based links that should include anchors and keywords. An example
of a site map for the Work.com web site is shown in Figure 3-5.
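In HTML, a basic site map of this kind is little more than a nested set of text links. A minimal sketch, with page names invented for a hypothetical pet store:

```html
<!-- Every page is reachable through a plain text link; second-level
     pages are nested under the hub page they belong to -->
<h1>Site Map</h1>
<ul>
  <li><a href="dogs.html">Dogs</a>
    <ul>
      <li><a href="dog-training.html">Dog Training</a></li>
      <li><a href="dog-food.html">Dog Food</a></li>
    </ul>
  </li>
  <li><a href="cats.html">Cats</a></li>
  <li><a href="birds.html">Birds</a></li>
</ul>
```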
When a site map exists on your web page, a search engine crawler can locate the map and then
crawl all of the pages that are linked from it. All of those pages are then included in the search
engine index and will appear on search engine results pages. Where they appear on those SERPs
is determined by how well the SEO is done for each individual page.
A second type of site map, the XML site map, is different from what you think of as a site map in
both form and function. An XML site map is a file that lists all of the URLs for a web site. This file
is usually not seen by site visitors, only by the crawlers that index your site. There are more specifics
on XML site maps in Chapter 16.
Components of an SEO-Friendly Page
Building an SEO-friendly web site doesn’t happen by accident. It requires an understanding of what
elements search engines examine and how those elements affect your ranking. It also requires including
as many of those elements as possible on your site. It does little good to have all the right meta
tags in place if you have no content and no links on your page.
It’s easy to get caught up in the details of SEO and forget the simplest web-design principles — principles
that play a large part in your search engine rankings. Having all the right keywords in the right
places in your tags and titles won’t do you much good if the content on your page is non-existent or
completely unreachable by a search engine crawler.
Understanding entry and exit pages
Entry and exit pages are the first and last pages that a user sees of your web site. It’s important to
understand that an entry page isn’t necessarily the home page on your web site. It can be any other
page where a user lands, either by clicking through search engine results, by clicking a link from
another web site or a piece of marketing material, or by bookmarking or typing directly into the
address bar of a browser.
Entry pages are important in SEO, because they are the first page users see as they come onto the
web site. The typical web site is actually several small connected sites. Your company web site might
contain hubs, or central points, for several different topics. Say you’re a pet store. Then you’ll have
hubs within your sites for dogs, cats, birds, fish, and maybe exotic animals. Each hub will have a
main page — which will likely be your entry page for that section — and several additional pages
leading from that central page to other pages containing relevant content, products, or information
about specific topics.
Understanding which of your pages are likely entry pages helps you to optimize those pages for
search engine crawlers. Using the pet-store example, if your home page and all the hub pages are
properly SEO’ed, you potentially could be ranked at or near the top of five different sets of search
results. When you add additional entry pages deeper in your web site structure (that is, a dog-training
section to the hub for dogs), you've increased the number of times you can potentially
end up at the top of search engine rankings.
Because entry pages are important in the structure of your web site, you want to monitor those pages
using a web-site analytics program to ensure they are working the way you expect them to work. A
good analytics program, like Google Analytics, will show you your top entry and exit pages.
Exit pages are those from which users leave your site, either by clicking through an exiting link,
selecting a bookmark, or typing a different web address into their browser address bar. But why
are exit pages important? They serve two purposes; the first is to be the destination toward which
you drive users from their entry pages. The route from entry page to exit page is called the path
that users travel through your site. A typical path might look something like this:
SERP ➪ Home ➪ Women’s Clothing ➪ Product Pages ➪ Shopping Cart ➪
Checkout ➪ Receipt
In this example, Home is the entry page and Receipt is the exit page. By looking at this navigational
path, you can tell how users travel through your site and where they fall off the path. But there's an
added benefit to understanding the navigational path of your users. When you know how users
travel through your site, you can leave what’s called a bread-crumb trail for them. That’s a navigational
indicator on the web site that allows them to quickly see where they are on your site, as shown
in Figure 3-6. This is the navigation path shown on the Wal-Mart web site. You can quickly see
where in the navigational structure of the site you're located.
The bread-crumb trail not only helps users return to a previous page in the navigational path; it also
makes it easier for a web crawler to fully examine your site. Because crawlers follow every link on
your page, this is an internal link structure that leads crawlers to individual pages that you want to
have included in search engine results.
Choosing an Analytics Program
An important element in any SEO plan is analytics — the method by which you monitor the effectiveness
of your web site. Analytics are the metrics that show you how pages, links, keywords,
and other elements of your web site are performing. If your web host hasn’t provided you with an
analytics program, find one. Not having an analytics program is like walking around in the dark,
hoping you won’t bump into a wall.
Many web-site owners shy away from analytics packages because they appear to be complicated as
well as expensive. However, they don’t always have to be. You can find a good analytics program
that’s not only easy to use but is also inexpensive or even free. But use caution about making ease
and low cost the deciding factors when selecting an analytics program.
The program will give you the power to see and control how your web site performs against your
goals and expectations. You want it to show you everything you need to know, so here are some considerations
when you’re evaluating analytics programs:
What reports are included in the tools you’re examining, and how will you use those
reports?
How do you gather the information used to create the metrics you need?
How often are your reports updated?
How much training is necessary to understand your application and the reports provided?
Is the product installed software, or is it provided strictly as a web-based service?
What is the total cost of ownership?
What types of support are available?
What is the typical contract length?
Many analytics programs are available to you. Google Analytics, AWStats, JayFlowers, ClickTracks,
and dozens of others all offer something different at a different price tag. If “free” is what you can
afford, don’t assume you’ll get a terrible package. Google Analytics is one of the free packages available;
it’s an excellent program and is based on what used to be the Urchin Analytics package (which
was quite costly). Other programs cost anywhere from $30 to $300 a month, depending on the
capabilities you’re purchasing.
The cost is not the most important factor, however. Ultimately, your consideration should be how the
analytics package can help you improve your business.
Using powerful titles
Page titles are one of the most important elements of site optimization. When a crawler examines
your site, the first elements it looks at are the page titles. And when your site is ranked in search
results, page titles are again one of the top elements considered. So when you create your web site,
you need to have great page titles.
There are several considerations when coming up with your page titles. Here are some of the key
factors to consider:
Unless you’re Microsoft, don’t use your company name in the page title. A better choice
is to use a descriptive keyword or phrase that tells users exactly what’s on the page. This
helps ensure that your search engine rankings are accurate.
Try to keep page titles to less than 50 characters, including spaces. Some search engines
will index only up to 50 characters; others might index as many as 150. However, maintaining
shorter page titles forces you to be precise in the titles that you choose and ensures
that your page title will never be cut off in the search results.

TIP
The World Wide Web Consortium (W3C) has determined that the outside length of a
page title should be no more than 64 characters. Search engines will vary in the size of
title that's indexed. Using 64 characters or less is an accepted practice; however, that still leaves your
page titles cut off in search engines that index only up to 40 or 50 characters. For this reason, staying
at or below the 40-character length is a smarter strategy within your SEO efforts.
Don’t repeat keywords in your title tags. Repetition can occasionally come across as spam
when a crawler is examining your site, so avoid repeating keywords in your title if possible,
and never duplicate words just to gain a crawler’s attention. It could well get your site
excluded from search engine listings.
Consider adding special characters at the beginning and end of your title to improve
noticeability. Parentheses (()), arrows (<<>>), asterisks (****), and special symbols like
££££ can help draw a user’s attention to your page title. These special characters and
symbols don’t usually add to or distract from your SEO efforts, but they do serve to call
attention to your site title.
Include a call to action in your title. There’s an adage that goes something like, “You’ll
never sell a thing if you don’t ask for the sale.” That’s one thing that doesn’t change with
the Web. Even on the Internet, if you want your users to do something you have to ask
them to.
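Putting these guidelines together, a short, descriptive title that ends with a call to action might read like this (the product and wording are invented):

```html
<!-- 45 characters including spaces: keyword phrase first, call to action last -->
<title>Nutmeg Graters - Order Handmade Graters Today</title>
```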
All your page titles should be indicated with the title tag when coding your web site. The title tag
isn’t difficult to use. Here’s an example of such a tag:
<title>A Descriptive Web Site Title</title>
If your page titles aren’t tagged properly, you might as well not be using those titles, so take the
time to ensure that your page titles are short, descriptive, and tagged into your web-site code. By
using title tags, you’re increasing the possibility that your web site will be ranked high within search
engine results.
Creating great content
Web-site content is another element of an SEO-friendly site that you should spend plenty of time
contemplating and completing. Fortunately, there are some ways to create web-site content that will
make search crawlers love you.
Great content starts with the right keywords and phrases. Select no more than three keywords or
phrases to include in the content on any one of your web pages. But why only three? Wouldn’t
more keywords and phrases ensure that search engines take notice of your site?
When you use too many keywords in your content, you face two problems. The first is that the effectiveness
of your keywords will be reduced by the number of different ones you’re using. Choose two
or three for each page of your site and stick with those.
The other problem you face is being delisted or ignored because a search engine sees your SEO
efforts as keyword stuffing. It’s a serious problem, and search engine crawlers will exclude your site
or pages from indexes if there are too many keywords on those pages.
Once you have the two or three keywords or phrases that you plan to focus on, you need to actually
use those keywords in the content of your page. Many people think the more frequently you use the
words, the higher your search engine ranking will be. Again, that’s not necessarily true. Just as using
too many different keywords can cause a crawler to exclude you from a search engine index, overusing
the same word will also cause crawlers to consider your attempts as keyword stuffing. Again,
you run the risk of having your site excluded from search indexes.
The term used to describe the number of times a keyword is used on a page is keyword density. For
most search engines, the keyword density is relatively low. Google is very strict about ranking sites
that have a keyword density of 5 to 7 percent; much lower or much higher and your ranking is
seriously affected or completely lost.
Yahoo!, MSN, and other search engines allow keyword densities of about 5 percent. Going over that
mark could cause your site to be excluded from search results.
Keyword density is an important factor in your web-site design, and is covered in more depth in
Chapter 4. But there are other content concerns, too. Did you know that the freshness and focus
of your content is also important in how high your web site ranks? One reason many companies
began using blogs on their web sites was that blogs are updated frequently and they’re highly focused
on a specific topic. This gives search engines new, relevant content to crawl, and crawlers love that.
Consider implementing a content strategy that includes regularly adding more focused content or
expanding your content offerings. It doesn’t have to be a blog, but news links on the front page of the
site, regularly changing articles, or some other type of changing content will help gain the attention of
a search engine crawler. Don’t just set these elements up and leave them, however. You also have to
carry through with regular updates and keep the links included in the content active. Broken links
are another crawler pet peeve. Unfortunately, with dynamic content, links will occasionally break. Be
sure you're checking this element of your content on a regular basis, and set up some kind of a user-feedback
loop so broken links can be reported to your webmaster.
Finally, when you’re creating your web-site content, consider interactive forums. If you’re adding
articles to your site, give users a forum in which they can respond to the article, or a comments section.
This leads to more frequent updates of your content, which search crawlers love. The result?
An interactive relationship with your web-site users will keep them coming back, and give an extra
boost to your search engine ranking.
Maximizing graphics
Images or graphics on your web site are essential. They’re also basically ignored by search engines,
so what’s the point of putting them on your site? There’s a good reason that has nothing to do with
SEO. Without images, your page is just boring text. You’re not going to be happy with using plain
text instead of that cool new logo you had designed for your company, and neither are your users.
They want to see pictures.
If images are a must on a web site, then there should be a way to use those images to increase your
web-site traffic or at least to improve your site ranking. And there is.
One technique that will help your SEO make use of graphics on your site is to tag those graphics
with alt tags inside the img tags.
Alt tags are the HTML tags used to display alternative text when there is a graphic present. Your alt
tags should be a short, descriptive phrase about the image that includes the keywords used on that
page when possible.
Img tags are the tags used to code the images that will appear on your web site. Here’s an example
of what an img tag, with an included alt tag, should look like:
<img src="yourimage.jpg" alt="alternative text" />
Here's how that tag breaks down: src="yourimage.jpg" points to the image file, and alt="alternative
text" is your alternative text tag. The alternative text tag is where your keywords should be
included if at all possible.
You want to tag your images as part of your SEO strategy for two reasons. First, crawlers cannot index
images for a search engine (with an exception, which is covered shortly). The crawler “sees” the image
and moves on to the text on the page. Therefore, something needs to take the place of that image, so
the crawler can index it. That’s what the alternative text does. If this text includes your keywords, and
the image is near text that also includes the keywords, then you add credibility to your site in the
logic of the crawler.
The second reason you want to tag your images as part of your SEO strategy is to take advantage of
image-based search engines, like Google Images. These image-based search engines are relatively
new, but they shouldn’t be undervalued. Just as a search engine can find and index your site for
users searching the Web, image-based search engines find and index your images. Then, when
users perform a search for a specific keyword or phrase, your image is also ranked, along with the
text on the pages.
Image searches are gaining popularity. So crawlers like the one Google uses for its Google Images
search engine will gain momentum, and image searches will add to the amount of web-site traffic
that your SEO strategies help to build. But while not discounting the value of images, don’t overuse
them on your web pages either. As with any element of a web page, too much of a good thing
is just not good.
Problem Pages and Work-Arounds
No matter how much time and consideration you put into your SEO strategy, there are going to
be elements of your web site that require special consideration. Some sites — like portals — need
a different approach than a standard web site might require. How you deal with these issues will
impact the effectiveness of your SEO efforts.
Painful portals
The use of portals — those web sites that are designed to funnel users to other web sites and
content — as a search engine placement tool is a hotly debated topic. Many experts will start
throwing around the word “spam” when the subject of SEO and portals comes up. And there
have been serious problems with portals that are nothing more than search engine spam. In the
past, portals have certainly been used as an easy link-building tool offering nothing more than
regurgitated information. Sometimes the information is vaguely reworded, but it's still the
same information.
Search engine operators have long been aware of this tactic and have made every effort to hinder
its usefulness by looking for duplicate content, interlinking strategies, and other similar indicators.
Using these techniques, search engines have managed to reduce the usefulness of portal web sites
as SEO spam mechanisms.
However, because search engine operators need to be cautious about portals that are nothing more
than SEO spam, your job in optimizing your site if it’s a portal is a little harder. As with all web-site
design, the best objective for your site, even for a portal, is to help your visitors achieve a desired
result, whether that’s purchasing a product, signing up for a newsletter, or finding the desired information.
If you make using your site easy and relevant, your site visitors will stay on your site longer,
view more pages, and return to your site in the future. Portals help you reach these goals by acting
as excellent tools for consolidating information into smaller, more manageable sources of information
that users find easier to use and digest.
Too often people optimizing web sites focus on the spiders and forget about the visitors. The sites you
are developing have to appeal to the visitors and provide them with the information that they’re looking
for, or all you’ll get at the end of the day is hosting bills and low conversion rates. Portal web sites
enable you to create a series of information resources giving full information on any given topic while
structuring a network of information covering a much larger scope.
Though the visitor is of significant importance when building a web site, the site's visibility is of primary
significance, too. There's no point in creating a beautiful web site if no one's going to see it, and portals
are a fantastic tool for increasing your online visibility and search engine exposure, for a wide
variety of reasons.
Perhaps the most significant of these reasons is the increase in keywords that you can use in portal
promotion. Rather than having one web site with which to target a broad range of keywords, portals
allow you to have many web sites, each of which can have its own set of keywords. For example,
instead of trying to put “deer hunting” and “salt-water fishing” on the same page, you can create a
hunting portal that allows you to have separate sites for deer hunting, salt-water fishing, and any
other type of hunting activity that you would like to include.
On one page it is much easier to target the two keyphrases “deer season” and “Mississippi hunting
license” than it is to target two keyphrases like “deer season” and “marlin fishing.” Targeting
incompatible keywords or phrases — that is, keywords or phrases that aren't related to a larger
topic — makes it harder both to keep your content readable and relevant and to reach the keywords that
you need to use.
There are other advantages to creating web portals, as well. Having a portal allows you to have
multiple home pages, which can give you the opportunity to create sites that consistently appear in
top ranking. You also have more sites to include in your other SEO strategies, and more places to
include keywords. However, there is a fine line between a useful portal and one that causes search
engines to turn away without listing your portal on SERPs.
Don’t link all your sites to all of the others within your portal using some link-farm footer
at the bottom of every page. You may not even want to link all of them to the others on a
site map or links page. Instead, interlink them in an intelligent way. When you want to lead visitors to
another site in the portal, or when you want those users to be able to choose which site is most useful
to them, you can create intelligent links that have value for the site user. This value translates into better
rankings for your web site.
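As a sketch of what that intelligent interlinking might look like (the site names and URLs here are purely hypothetical, not from any real portal):

```html
<!-- Avoid: the same link-farm footer pasted onto every page of every site -->

<!-- Better: a contextual link, placed where it genuinely helps the visitor -->
<p>Planning a fall trip? See our companion site’s guide to
  <a href="http://deer.example.com/season-dates.html">deer season dates</a>.</p>
```

The link earns its place because a reader of this page plausibly wants that destination, which is exactly the kind of value search engines try to reward.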
As with most issues in web design, keep it user-friendly and attractive. If you have any doubt that the
actions you’re taking with your site or the design methods that you’re using could lead to negative
results for the SEO of your site, don’t use them. If you’re feeling that a strategy won’t work, it probably
won’t, and you’re wasting your time if you insist on using a design you’re not comfortable with.
Fussy frames
Some web-site designs require the use of frames. Frames are sections of a web site, with each section
a separate entity from the other portions of the page. Because the frames on a site represent separate
URLs, they often create display issues for users whose browsers don’t support frames, and for search
crawlers, which encounter the frames and can’t index the site when frames provide its navigational
structure.
You have a couple of alternatives when frames are essential to the design of your web site. The first is
to include an alternative to the framed site. This requires the use of the noframes tag. The tag directs
the user’s browser to display the site without the framed navigational system. Users may see a stripped-down
version of your site, but at least they can still see it. When a search crawler encounters a site
made with frames, the noframes tag allows it to index the alternative site. It’s important to realize,
however, that when you use the noframes tag, you should load the code for an entire web page
between the opening and closing tags.
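As a sketch (the file names here are hypothetical), a framed page with a noframes alternative might look like this:

```html
<html>
<head><title>Example Site</title></head>
<frameset cols="25%,75%">
  <frame src="nav.html" name="nav">
  <frame src="content.html" name="main">
  <noframes>
    <body>
      <!-- A complete, crawlable version of the page goes here -->
      <h1>Example Site</h1>
      <p>Welcome. Use the links below to browse the site.</p>
      <a href="content.html">Main content</a>
    </body>
  </noframes>
</frameset>
</html>
```

Browsers that support frames ignore the noframes section; browsers and crawlers that don’t render the body inside it instead.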
When you’re creating a noframes tag for a framed site, the content of the noframes tags
should be identical to that of the frame set. If it’s not, a search crawler could consider it
spam, and your site could be penalized or even delisted.
Another problem with frames is that search engines often display an internal page on your site in
response to a search query. If this internal page does not contain a link to your home page or some
form of navigation menu, the user is stuck on that page and is unable to navigate through your site.
That means the search crawler is also stuck in that same spot. As a result, the crawler might not index
your site.
The solution is to place a link on the page that leads to your home page. In this link, include the
attribute TARGET = “_top”. This prevents your site from becoming nested within your own
frames, which locks the user on the page they landed on from the search results. It also makes it
possible for crawlers to crawl your site efficiently without getting stuck.
That link back to your home page will probably look something like this:
<a href="/" target="_top">Return to Home Page</a>
Frames are difficult to get around when you’re putting SEO strategies into place, but doing so is
not entirely impossible. It’s a good idea to avoid frames, but they won’t keep you completely out of
search engine rankings. You just have to use a different approach to reaching the rankings that you
desire.
Cranky cookies
Cookies are one of those irritating facts of life on the Internet. Users want web sites tailored to them,
and cookies are one way companies have found to do that. When users enter the site and customize
some feature of it, a small piece of code — the cookie — is placed on the user’s hard drive. Then,
when the user returns to the site in the future, that cookie can be accessed, and the user’s preferences
executed.
When cookies work properly, they’re an excellent tool for web designers. When they don’t work
as they should, the problems begin. So what constitutes a problem? The main issue with cookies
is that some browsers allow users to control how cookies are delivered to them, and some sites
are coded to prompt the visitor before a cookie is accepted. When this happens, the search
engine crawler is effectively stopped in its tracks, and it doesn’t pick back up where it left off
once the cookie is delivered. In addition, any navigation that requires cookies will leave the crawler
unable to index those pages.
How do you overcome this issue? Code your site so that it never prompts the visitor before a
cookie is delivered, and so that no page depends on cookies in order to be reached.
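One way to follow that advice is to read cookies defensively: fall back to a default whenever the cookie is absent, so that neither a cookie-blocking visitor nor a crawler (which sends no cookies at all) is ever prompted or blocked. The helper below is a hypothetical sketch, not code from any particular site; the cookie names are made up for illustration.

```javascript
// Hypothetical helper: look up one preference in a cookie string
// (such as the value of document.cookie) and fall back to a default
// when it's missing, rather than prompting or blocking the visitor.
function getPreference(cookieString, name, fallback) {
  for (const pair of (cookieString || "").split(";")) {
    const [key, ...rest] = pair.trim().split("=");
    if (key === name) {
      return decodeURIComponent(rest.join("="));
    }
  }
  return fallback; // no cookie: degrade gracefully, never block
}

// A returning visitor with a saved theme gets it; everyone else,
// including a crawler that sends no cookies, gets the default.
const theme = getPreference("sessionid=abc123; theme=dark", "theme", "light");
```

Because the page renders sensibly with or without the cookie, navigation never depends on cookie delivery, which is exactly the property the crawler needs.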
Programming Languages and SEO
One aspect of web-site design you might not think of when planning your SEO strategy is the programming
language used in developing the site. Programming languages all behave a little differently.
For example, HTML uses one set of protocols to accomplish the visuals you see when you
open a web page, whereas PHP uses a completely different set of protocols. And when most people
think of web-site programming, they think in terms of HTML.
But the truth is that many other languages also are used for coding web pages. And those languages
may require differing SEO strategies.
JavaScript
JavaScript is a programming language that allows web designers to create dynamic content. However,
it’s also not necessarily SEO-friendly. In fact, JavaScript often completely halts a crawler from indexing
a web site, and when that happens the result is lower search engine rankings or complete exclusion
from ranking.
To overcome this, many web designers externalize any JavaScript that’s included on the web site.
Externalizing the JavaScript creates a situation where it is actually run from an external location,
such as a file on your web server. To externalize your JavaScript:
1. Copy the code, beginning with the opening script tag, and paste it into a Notepad file.
2. Save the Notepad file as filename.js.
3. Upload the file to your web server.
4. Create a reference on your web page to the external JavaScript code. The reference
should be placed where the JavaScript will appear and might look like this:
<script type="text/javascript" src="filename.js"></script>
This is just one of the solutions you can use to prevent JavaScript from becoming a problem for
your SEO efforts. There are many others, and depending on your needs you should explore some
of those.
Sometimes, people use JavaScript as a way to hide content or links from a search engine.
However, search crawlers can read JavaScript and most can even follow the links that are
in JavaScript. So if you try to hide content or links behind JavaScript, you run the risk of having your
site labeled as search engine spam. There’s more about search engine spam in Chapter 17.
Flash
Flash is another of those technologies that some users absolutely hate. That’s because Flash, though
very cool, is resource intensive. It causes pages to load slower, and users often get stuck on an opening
Flash page and can’t move forward until the Flash has finished executing. If the user is in a hurry,
it’s a frustrating thing to deal with.
Flash is also a nightmare when it comes to SEO. A Flash page can stop a web crawler in its tracks,
and once stopped, the crawler won’t resume indexing the site. Instead, it will simply move on to
the next web site on its list.
The easiest way to overcome Flash problems is simply not to use it. But despite the difficulties with
search rankings, some organizations need to use Flash. If yours is one of them, you can provide an
HTML alternative and add a test for Flash support, so that the Flash executes only for visitors who
can see it. However, there’s some debate over whether or not this is an “acceptable” SEO practice,
so before you implement this type of strategy in an effort to improve your SEO effectiveness, take
the time to research the method.
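One common pattern along these lines (a sketch only; the file names and text are assumptions, not the book’s code) is to rely on the object element’s built-in fallback: browsers that can play Flash render the movie, while everything else, including search crawlers, sees the HTML alternative inside it.

```html
<object type="application/x-shockwave-flash" data="intro.swf"
        width="550" height="400">
  <param name="movie" value="intro.swf">
  <!-- Shown when Flash isn't available: crawlable text and a way forward -->
  <h1>Example Company</h1>
  <p>Our animated introduction requires Flash.
     <a href="home.html">Skip straight to the home page</a>.</p>
</object>
```

Keeping the fallback content an honest summary of the movie, rather than a keyword dump, is what keeps this approach on the right side of the debate.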
Dynamic ASP
Most of the sites you’ll encounter on the Web are static web pages. These sites don’t change beyond the
regular updates by a webmaster. On the other hand, dynamic web pages are web pages that are created
on the fly according to preferences that users specify in a form or menu. The sites can be created
using a variety of different programming technologies, including dynamic ASP.
The problem with these sites is that they don’t technically exist until the user creates them. Because a web crawler can’t make
the selections that “build” these pages, most dynamic web pages aren’t indexed in search engines.
There are ways around this, however. Dynamic URLs can be converted to static URLs with the right
coding. It’s also possible to use paid inclusion services to index dynamic pages down to a predefined
number of levels (or number of selections, if you’re considering the site from the user’s point of view).
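For instance, on an Apache server with mod_rewrite enabled, a dynamic URL can be presented under a static-looking address that crawlers handle more readily. The page and parameter names below are hypothetical:

```apache
# Serve the dynamic page product.asp?id=42 at the crawler-friendly,
# static-looking URL /products/42
RewriteEngine On
RewriteRule ^products/([0-9]+)$ /product.asp?id=$1 [L]
```

The page is still generated on the fly; only the address the crawler sees has changed.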
Dynamic ASP, like many of the other languages used to create web sites, carries with it a unique set
of characteristics. But that doesn’t mean SEO is impossible for those pages. It does mean that the
approach used for the SEO of static pages needs to be modified. It’s an easy enough task, and a quick
search of the Internet will almost always provide the programming code you need to achieve SEO.
PHP
Search engine crawlers being what they are — preprogrammed applications — there’s a limit
to what they can index. PHP is another of those programming languages that falls outside the
boundaries of normal web-site coding. Search engine crawlers see PHP as another obstacle if
it’s not properly executed.
Properly executed means that PHP needs to be used with search engines in mind. For example,
PHP naturally stops or slows search engine crawlers. But with some attention and a solid understanding
of PHP and SEO, it’s possible to code pages that work, even in PHP.
One thing that works well with PHP is designing the code to look like HTML. It requires an experienced
code jockey, but it can be done. And once the code has been disguised, the PHP site can be
crawled and indexed so that it’s returned in search results.
Other Design Concerns
You’re likely to encounter numerous problems with SEO when designing your web site. Some are
easy to overcome. Others can be quite difficult. And still others aren’t problems you have to overcome;
rather, you just need to beware of them or risk being ignored by search engine crawlers.
Among tactics that might seem okay to some, but really aren’t, are the so-called black-hat SEO
techniques. These are practices implemented with a single thought in mind — increasing search
engine rankings, no matter how inappropriate those rankings might be. Some companies deliberately
use such techniques when creating web sites, even if the results that show up have absolutely
nothing to do with the search terms users entered.
Domain cloaking
On the surface, domain cloaking sounds like a great idea. The concept is to show users a pretty web
site that meets their needs, while at the same time showing search engines a highly optimized page that probably would be almost useless to users. In other words, it’s a slimy trick to gain search
engine ranking while providing users with a nice site to look at.
It starts with content cloaking, which is accomplished by creating web-site code that can detect and
differentiate a crawler from a site user. When the crawler enters the site, it is re-directed to another
web site that has been optimized for high search engine results. The problem with trying to gain
higher search results this way is that many search engines can now spot it. As soon as they find
that a web page uses such a cloaking method, the page is delisted from the search index and not
included in the results.
Many less-than-savory SEO administrators will use this tactic on throw-away sites. They know the
site won’t be around for long anyway (usually because of some illegal activity), so they use domain
cloaking to garner as much web site traffic as possible before the site is taken down or delisted.
Duplicate content
When you’re putting together a web site, the content for that site often presents one of the greatest
challenges, especially if it’s a site that includes hundreds of pages. Many people opt to purchase bits
of content, or even scrape content from other web sites to help populate their own. These shortcuts
can cause real issues with search engines.
Say your web site is about some form of marketing. It’s very easy to surf around the Web and find
hundreds (or even thousands) of web sites from which you can pull free, permission-granted content
to include on your web site. The problem is that every other person or company creating a web
site could be doing the same thing. And the result? A single article on a topic appears on hundreds
of web sites — and users aren’t finding anything new if they search for the topic and every site has
the same article.
To help combat this type of content generation, some search engines now include as part of their
search algorithm a method to measure how fresh site content is. If the crawler examines your site
and finds that much of your content is also on hundreds of other web sites, you run the risk of
either ranking low or being delisted from the search engine’s indexing database.
Some search engines now look for four types of duplicate content:
Highly distributed articles. These are the free articles that seem to appear on every single
web site about a given topic. This content has usually been provided by a marketing-savvy
entrepreneur as a way to gain attention for his or her project or passion. But no matter
how valuable the information, if it appears on hundreds of sites, it will be deemed duplicate
and that will reduce your chances of being listed high in the search result rankings.
Product descriptions for e-commerce stores. The product descriptions found on
nearly all e-commerce pages are rarely included in search engine results. Product descriptions
can be very small, and depending on how many products you’re offering, there could be thousands
of them. Crawlers are designed to skip over most product descriptions; otherwise,
a crawler might never be able to work completely through your site.
Duplicate web pages. It does no good whatever for a user to click through a search result
only to find that your web pages are the same ones everyone else has. These duplicate pages
gum up the works and lower the level at which your pages appear in the search results.
Content that has been scraped from numerous other sites. Content scraping is the
practice of pulling content from other web sites and repackaging it so that it looks like
your own content. Although scraped content may look different from the original, it is
still duplicate content, and many search engines will leave you completely out of the
search index and the search results.
Hidden pages
One last SEO issue concerns the damage to your SEO strategy that hidden pages can inflict. These
are pages in your web site that are visible only to a search crawler. Hidden pages can also lead to
issues like hidden keywords and hidden links. Keywords and links help to boost your search rankings,
so many people try to capitalize on these requirements by hiding them within the body of a
web page, sometimes in a font color that perfectly matches the site background.
There’s no way around the issue of hidden pages. If you have a web site and it contains hidden pages,
it’s just a matter of time before the crawler figures out that the content is part of a hidden SEO strategy.
Once that’s determined by the crawler, your site ranking will drop drastically.
After Your Site Is Built
Building the right site to help maximize your SEO efforts is a difficult task. And when you’re finished,
the work doesn’t end. SEO is an ongoing strategy, not a technology that you can implement and forget.
Time needs to be spent reviewing your practices, examining results, and making adjustments
where necessary. If this ongoing maintenance is ignored, your SEO efforts to this point will quickly
become time that would have been better spent standing out on the street with a sign around your
neck advertising your web site. That might be more effective than outdated SEO.
Beware of content thieves
Maintenance of your SEO strategies is also essential in helping you find problems that might be completely
unrelated to SEO. For example, SEO strategies can help you locate content thieves. One such
strategy is tagging your web site. Some people (including black-hat SEOs) take snippets of content
from your site to use on their own. If you tag your content properly, you can use some very distinctive
tags, which will help you quickly locate content that has been stolen.
Another way in which SEO helps you to locate stolen content is through tracking. Presumably, if
you’re executing SEO strategies, you’re monitoring your site metrics with a program like Google
Analytics. Watching the metrics used by one of those analytics programs can help you locate content
thieves. For example, if you look at your incoming links on one of these programs, you might
find that people are coming to your site from a completely unexpected location. If that’s the case, you can follow the link back to that site to find out why. A site using stolen content is easy to find
using this method. There are also many services available that will help you track your web-site
content. Those services are covered in more depth in Chapter 12.
Tagging works well for finding content thieves, and there’s another tactic you can use to thwart automatic
content scrapers — domain cloaking. This is a process by which your web site appears to be
located somewhere other than where it is. This is accomplished using an HTML frame set that redirects
traffic from one URL to another. For example, if your web site address is www.you.somewhere.com,
you can use domain cloaking to have your site appear to be www.yourbusiness.com.
The problem with using domain cloaking is that it can confuse a search engine crawler, because the
same content appears to be on two pages, although it’s only one page and one that redirects. And
another problem is that some search engine crawlers can’t read the frame set that’s used to redirect
the user, which means your site may end up not being ranked at all. This is a tactic that should only
be used in special cases where content is truly unique and could possibly affect your SEO rankings
(or that of someone who might steal it) in a dramatic way.
Dealing with updates and site changes
One last problem you may encounter after you’ve initially set up your SEO strategies is the updates
and changes that your site will go through. Often, people feel that once the SEO is in place, then it’s
always in place, and they don’t have to think about it again. But believing this can lead to a very
unpleasant surprise.
When your site changes, especially if there are content updates or changes to the site structure, links
can be broken, tags may be changed, and any number of other small details may be overlooked.
When this happens, the result can be a reduced ranking for your site. Site crawlers look at everything,
from your tags to your links, and based on what they see, your ranking could fluctuate from
day to day. If what the crawler sees indicates that your site has changed in a negative way, the site’s
ranking will be negatively affected.
Many things affect the way your site ranks in a search engine. You’ve seen an overview of a lot of
them in this chapter, and you’ll see them all again in more depth in future chapters. Realize that
SEO is not a simple undertaking. It is a complex, time-consuming strategy for improving your
business. And without attention to all of the details, you could just be wasting your time. So plan
to invest the time needed to ensure that your search engine optimization efforts aren’t wasted.