Friday, August 27, 2010

SEO Strategies

SEO is hard work. It takes much effort to optimize just the
right elements of your web site so search engines will
not only find you, but will also index your site so that it
appears high in search query results. And all of that effort
falls to you. There are currently no tools that will
put all of the elements of SEO in place for you.
Instead, you have to build your web site with SEO in mind,
choose all the right keywords and use them in the right places
and in the right balance on your site, determine whether pay-per-click and paid-inclusion
programs are for you, use the right meta tags in the
right places, create great content, and add all the right links.
Sounds like a lot of work, doesn’t it?
It is. But don’t let the amount of work overwhelm you. Consistent
effort and the strategies included in this part of the book will have
you working toward your SEO goals in no time. Each of the chapters
in this section contains an explanation of how these elements
affect SEO, and how you can create and implement strategies to
help you leverage that element to reach your SEO goals.
Search engine optimization is a collection of strategies that improve the
level at which your web site is ranked in the results returned when a
user searches for a keyword or phrase.
By now, that’s a definition you should be pretty familiar with. What you probably
don’t know (yet) is how to achieve SEO. You can’t do it all at once. Instead,
SEO has to happen in stages. If you try to implement too many strategies at
one time, two things are going to happen.
First, you won’t be able to tell which of your efforts are successful. Implementing
one strategy at a time makes it possible for you to pinpoint which
strategies are working and which are not.
Second, when you try to implement too many strategies at one time, your
efforts — even the successful ones — could be lost in the shuffle. It’s like having
too many children running around the house on the weekend. If you’re
not paying complete attention to all of them (and that’s virtually impossible),
at least one is bound to get into something.
SEO is most successful when you concentrate on one effort at a time. A great
place to start concentrating is on the way your site is built. One of the first
things that attracts a search engine crawler is the actual design of your site.
Tags, links, navigational structure, and content are just a few of the elements
that catch crawlers’ attention.
Before You Build Your Site
One of the most common misconceptions about SEO is that it should be implemented after a web
site has been built. It can be, but it’s much harder. A better option is to consider SEO even before
you begin to build your web site, if that’s at all possible. It may not be. But if that’s the case, you
can still implement SEO strategies in the design of your site; it will just require a lot more work
than building it in at the beginning.
Know your target
Before you even start contemplating how to build your web site, you should know in what types
of search engines it’s most important for your site to be ranked. Search engines are divided into
several types, beyond the primary, secondary, and targeted search engines that you learned about
in Chapter 2. In addition, search engine types are determined by how information is entered into
the index or catalog that’s used to return search results. The three types of search engines are:
Crawler-based engines: To this point, the search engines discussed fall largely into this
category. A crawler-based search engine (like Google) uses an automated software agent
(called a crawler) to visit, read, and index web sites. All the information collected by the
crawler is returned to a central repository. This is called indexing. It is from this index
that search engine results are pulled. Crawler-based search engines revisit web pages
periodically in a time frame determined by the search engine administrator.
Human-powered engines: Human-powered search engines rely on people to submit
the information that is indexed and later returned as search results. Sometimes, human-powered
search engines are called directories. Yahoo! is a good example of what, at one
time, was a human-powered search engine. Yahoo! started as a favorites list belonging to
two people who needed an easier way to share their favorite web sites. Over time, Yahoo!
took on a life of its own. It’s no longer completely human-controlled. A newer search
engine called Mahalo (www.mahalo.com) is entirely human-powered, however, and it’s
creating a buzz on the Web.
Hybrid engine: A hybrid search engine is not entirely populated by a web crawler, nor
entirely by human submission. A hybrid is a combination of the two. In a hybrid engine,
people can manually submit their web sites for inclusion in search results, but there is also
a web crawler that monitors the Web for sites to include. Most search engines today fall
into the hybrid category to at least some degree. Although many are mostly populated by
crawlers, others have some method by which people can enter their web site information.
It’s important to understand these distinctions, because how your site ends up indexed by a search
engine may have some bearing on when it’s indexed. For example, fully automated search engines
that use web crawlers might index your site weeks (or even months) before a human-powered search
engine. The reason is simple. The web crawler is an automated application. The human-powered
search engine may actually require that all entries be reviewed for accuracy before a site is included
in search results. In all cases, the accuracy of search engine results will vary according to the search query that is used.
For example, entries in a human-powered search engine might be more technically accurate, but the
search query that is used will determine if the desired results are returned.
Page elements
Another facet of SEO to consider before you build your web site is the elements needed to ensure
that your site is properly indexed by a search engine. Each search engine places differing importance
on different page elements. For example, Google is a very keyword-driven search engine; however, it
also looks at site popularity and at the tags and links on any given page.
How well your site performs in a search engine is determined by how the elements of your page
meet the engine’s search criteria. The main criteria that every search engine looks for are the site
text (meaning keywords), tags — both HTML and meta tags — site links, and the site popularity.
Text
Text is one of the most important elements of any web site. Of particular importance are the keywords
within the text on a page, where those keywords appear, and how often they appear. This
is why keyword marketing has become such a large industry in a relatively short time. Your keywords
make all the difference when a search engine indexes your site and then serves it up in
search results.
Keywords must match the words and phrases that potential visitors will use when searching for
your site (or for the topic or product that’s listed on your site). To ensure that your keywords are
effective, you’ll need to spend some time learning which keywords work best for your site. That
means doing keyword research (which you learn more about later in this book).
Tags
In search engine optimization, two kinds of tags are important on your web site: meta tags and
HTML tags. Technically, meta tags are HTML tags; they just appear in very specific places. The
two most important meta tags are the keyword tag and the description tag.
The keyword tag is where you list the keywords that apply to your web site. A
keyword tag on a search engine optimization page might look something like this:
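<meta name="keywords" content="search engine optimization, SEO, meta tags, keywords" />

(The keywords shown here are placeholders; yours should come from your own keyword research.)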

The description tag gives a short description of your page. Such a tag for the search engine optimization
page might look like this:
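<meta name="description" content="Learn how to optimize your web site so that search engines find it and rank it high in search results." />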
Not all search engines take meta tags into consideration. For that reason, your site should use both
meta tags and other HTML tags. Some of the other HTML tags that you should include on your web
site are the title tag, the top (or H1) heading tags, and the anchor tags.
The title tag is the tag that’s used in the title of your web site. This tag will appear like this:
<title>Your Title Here</title>
Once you’ve tagged your site with a title tag, when a user pulls the site up, the title that you entered
will appear at the very top of the page if the user is using an Internet Explorer browser (IE) earlier
than IE7, as shown in Figure 3-1. In IE7 and the Firefox browser, the title will appear on the browser
tab, shown in Figures 3-2 and 3-3.
High-level headings (H1s) are also important when a crawler examines your web site. Your keywords
should appear in your H1 headings, and in the HTML tags you use to create those headings. An H1
tag might look like this:

<h1>High-Level Heading</h1>
Anchor tags are used to create links to other pages. An anchor tag can point users to another web
page, a file on the Web, or even an image or sound file. You’re probably most familiar with the anchor
tags used to create links to other web sites. Here’s what an anchor tag might look like:
<a href="http://www.yoursite.com">Text for link</a>
Other criteria to consider
In addition to the four main elements you should plan to include on your site, there are a few others.
For example, the body text on your web site will be examined by the crawler that indexes your site.
Body text should contain enough keywords to gain the attention of the crawler, but not so many that
it seems the site is being “stuffed” with such words.
Alternative tags for pictures and links are also important. These are the tags that might appear as a
brief description of a picture or graphic on a web site that fails to display properly. The alternative
tags — called alt tags — display a text description of the graphic or picture, so that even if the actual
image doesn’t appear, there’s some explanation of what should be there. Alt tags are a good place to
include additional keywords.
Understanding Web-Site Optimization
Web-site optimization is all about creating a site that is discoverable by search engines and search
directories. It sounds simple enough, but there are many aspects of site optimization to consider, and
not all of them are about the keywords, links, or HTML tagging of your site.
Does hosting matter?
That question comes up frequently when a company or individual is designing a web site. Does it
matter who hosts your site? The answer is no, but that’s not to say that domain hosting is unimportant.
While the company itself doesn’t matter, certain elements of your hosting arrangement can have a major impact on how your site ranks in search results.
One of the biggest issues that you’ll face with domain hosting is the location of your hosting company.
If you’re in the United States and you purchase a domain that is hosted on a server in England,
your search engine rankings will suffer. Search engine crawlers will read your site’s geographic
location as contradicting your actual location. Because many search engines serve up results with some element
of geographical location included, this contradiction could be enough to affect your ranking.
The length of time for which you register your domain name could also affect your search engine
ranking. Many hackers use throw-away domains, or domain names that are registered for no more
than a year, because they usually don’t even get to use the domain for a full year before they are shut
down. For this reason some search engines have implemented ranking criteria that give priority to
domains registered for longer periods. A longer registration also shows a commitment to maintaining
the web site.
Domain-naming tips
The question of what to name a web site is always a big one. When selecting a name, most people
think in terms of their business name, personal name, or a word or phrase that has meaning for
them. What they don’t think about is how that name will work for the site’s SEO. Does the name
have anything at all to do with the site, or is it completely unrelated?
Have you ever wondered why a company might be willing to pay millions of dollars for a domain
name? The domain name business.com was purchased for $7.5 million in 1999, and was
recently thought to be valued at more than $300 million. Casino.com went for $5.5 million and
worldwideweb.com sold for $3.5 million. What’s so important about a name?
Where SEO is concerned, the name of your web site is as important as many of the other SEO elements
that you concentrate on. Try this test. Use your favorite search engine to search for a topic,
perhaps “asphalt-paving business.” When your search results are returned, look at the top five results.
Most of the time, a web site containing those words will be returned in those top five results, and it
will often be in the number one slot.
So, if your company name is ABC Company, but your business is selling nutmeg graters, consider
purchasing the domain name NutmegGraters.com, instead of ABCCompany.com — ABC Company
may not get you in the top of search rankings, but the very specific nature of your product probably
will. And both the content of your site and your domain name will attract crawlers in the way you
want. Using a domain name containing a keyword from your content usually improves your site
ranking.
A few more things that you should keep in mind when you’re determining your domain name
include:
Keep the name as short as possible. Too many characters in a name mean increased potential
for misspellings. It also means that your site address will be much harder for users to
remember unless it’s something really startling.
Avoid dashes, underscores, and other meaningless characters. If the domain name that
you’re looking for is taken, don’t just add a random piece of punctuation or a number
to the name to “get close.” Close doesn’t count here. Instead, try to find another word
that’s relevant, and possibly included in the list of keywords you’ll be using. For
example, instead of purchasing www.yourwebsite2.com, try to find something
like www.yoursitesubject.com.
Opt for a .com name whenever possible. There are lots of domain extensions to choose
from: info, biz, us, tv, names, jobs. However, if the .com version of your chosen domain
name is available, that’s always the best choice. Users tend to think in terms of .com, and
any other extension will be hard for them to remember. Sites with .com names also tend to receive
higher rankings in search engines than web sites using other extensions. So if your competition
has www.yoursite.com and you choose to use www.yoursite.biz, chances
are the competition will rank higher in search results than you.
Again, it’s important to realize that domain naming is only one facet of SEO strategy. It won’t make
or break your SEO, but it can have some effect. So take the time to think about the name you plan
to register for your site. If you can use a name that not only reaches your audience but also lands you
a little higher in search results, then by all means purchase it. But if no name really seems to work in the SEO strategy for your site, don’t get discouraged. You can make up for any domain-naming issues
by implementing solid keyword strategies, tagging strategies, and other elements of SEO.
Understanding usability
Usability. It means different things to different web site designers. It’s also been at the top of every
user’s requirements list since the Web became part of daily life. When users click through to your
web site from a search results page, they want the site to work for them. That means they want to be
able to find what they’re looking for, to navigate from place to place, and to be able to load pages
quickly, without any difficulties.
Web-site users are impatient. They don’t like to wait for pages to load, they don’t want to deal with
Flash graphics or JavaScript, and they don’t want to be lost. These are all elements of usability —
how the user navigates through and uses your web site. And yes, usability has an impact on SEO,
especially from the perspective of your site links and loading times.
When a search engine crawler comes to your site, it crawls through the site, looking at keywords,
links, contextual clues, meta and HTML tags, and a whole host of other elements. The crawler will
move from page to page, indexing what it finds for inclusion in search results. But if that crawler
reaches the first page and can’t get past the fancy Flash you’ve created, or if it gets into the site and
finds links that don’t work or that lead to unexpected locations, it will recognize this and make note
of it in the indexed site data. That can damage your search engine rankings.
Navigation knowledge
When you consider web-site navigation, there are two types: internal navigation and external navigation.
Internal navigation involves the links that move users from one page to another on your site.
External navigation refers to links that take users away from your page. For your navigation to be
SEO-friendly, you have to use both types of navigation carefully.
Look at a number of different high-ranking web sites. How is the navigation of those sites designed?
In most cases, you’ll find that the top sites have a left-hand navigation bar that’s often text-based, and
some have a button-based navigation bar across the top of the page. Few have just buttons down the
left side, and all of them have text links somewhere in the landing page.
The navigation for many sites looks the same, because this plan works. Having a text-based navigation
bar on the left works for SEO because it allows you to use anchor tags with the keywords you’re
using for the site. It also allows crawlers to move from one page to another with ease.
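For example, a simple text-based navigation bar might be coded with nothing more than keyword-bearing anchor tags (the page names and keywords here are only illustrative):

<div id="navigation">
  <a href="dog-training.html">Dog Training</a>
  <a href="dog-grooming.html">Dog Grooming</a>
  <a href="dog-supplies.html">Dog Supplies</a>
</div>

Because each link is plain HTML text, a crawler can follow it and associate the keywords in the anchor with the page it points to.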
Buttons are harder for crawlers to navigate, and depending on the code in which those buttons are
designed, they might be completely invisible to the crawler. That’s why many companies that put
button-based links at the top of the page also usually include a text-based navigation bar on the
left. The crawler can still move from page to page, but the user is happy with the design of the site.

The other element you see on nearly every page is text-based links within the content of the page.
Again, those links are usually created with anchor tags that include the keywords the site is using
to build site ranking. This is an effective way to gain site ranking. The crawler comes into the site,
examines the linking system, examines the content of the page, compares these items, and finds
that the links are relevant to the content, which is relevant to the keywords. That’s how your ranking
is determined. Every element works together.
Take the time to design a navigational structure that’s not only comfortable for your users, but is also
crawler-friendly. If it can’t always be perfect for the crawlers, make sure it’s perfect for users. Again,
SEO is influenced by many different things, but return visits from users are the ultimate goal. This
may mean that you have to test your site structure and navigation with a user group and change it
a few times before you find a method that works both for returning users and for the crawlers that
help to bring you new users. Do those tests. That’s the only way you’ll learn what works.
Usability considerations
It’s not always possible to please both your site users and the crawlers that determine your page
ranking. It is possible, however, to work around problems. Of course, the needs of users come first
because once you get them to your site you want them to come back. On the Internet, it’s extremely
easy for users to surf away from your site and never look back. And returning visits can make or
break your site.
But the catch is that in order to build returning visitors, you have to build new visitors, which is the
purpose of SEO. That means you need search engines to take notice of your site.
When it seems that users’ preferences are contrary to crawlers’ preferences, there is a solution.
It’s a site map. And there are two types of which you should be aware. A basic site map is an
overview of the navigational structure of your web site. It’s usually text-based, and it’s nothing
more than an overview that includes links to all of the pages in your web site. Crawlers love site
maps. You should, too.
A site map allows you to outline the navigational structure of your web site, down to the second or
third level of depth, using text-based links that should include anchors and keywords. An example
of a site map for the Work.com web site is shown in Figure 3-5.
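In its simplest form, such a site map is nothing more than a page of keyword-rich anchor tags organized by section, along these lines (the sections and page names are illustrative):

<h1>Site Map</h1>
<h2>Dogs</h2>
<a href="dog-training.html">Dog Training</a> | <a href="dog-grooming.html">Dog Grooming</a>
<h2>Cats</h2>
<a href="cat-food.html">Cat Food</a> | <a href="cat-toys.html">Cat Toys</a>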
When a site map exists on your web page, a search engine crawler can locate the map and then
crawl all of the pages that are linked from it. All of those pages are then included in the search
engine index and will appear on search engine results pages. Where they appear on those SERPs
is determined by how well the SEO is done for each individual page.
A second type of site map, the XML site map, is different from what you think of as a site map in
both form and function. An XML site map is a file that lists all of the URLs for a web site. This file
is usually not seen by site visitors, only by the crawlers that index your site. There are more specifics
on XML site maps in Chapter 16.

Components of an SEO-Friendly Page
Building an SEO-friendly web site doesn’t happen by accident. It requires an understanding of what
elements search engines examine and how those elements affect your ranking. It also requires including
as many of those elements as possible on your site. It does little good to have all the right meta
tags in place if you have no content and no links on your page.
It’s easy to get caught up in the details of SEO and forget the simplest web-design principles — principles
that play a large part in your search engine rankings. Having all the right keywords in the right
places in your tags and titles won’t do you much good if the content on your page is non-existent or
completely unreachable by a search engine crawler.

Understanding entry and exit pages
Entry and exit pages are the first and last pages that a user sees of your web site. It’s important to
understand that an entry page isn’t necessarily the home page on your web site. It can be any other
page where a user lands, either by clicking through search engine results, by clicking a link from
another web site or a piece of marketing material, or by bookmarking or typing directly into the
address bar of a browser.
Entry pages are important in SEO, because they are the first page users see as they come onto the
web site. The typical web site is actually several small connected sites. Your company web site might
contain hubs, or central points, for several different topics. Say you’re a pet store. Then you’ll have
hubs within your site for dogs, cats, birds, fish, and maybe exotic animals. Each hub will have a
main page — which will likely be your entry page for that section — and several additional pages
leading from that central page to other pages containing relevant content, products, or information
about specific topics.
Understanding which of your pages are likely entry pages helps you to optimize those pages for
search engine crawlers. Using the pet-store example, if your home page and all the hub pages are
properly SEO’ed, you potentially could be ranked at or near the top of five different sets of search
results. When you add additional entry pages deeper in your web site structure (for example, a dog-training
section added to the hub for dogs), you’ve increased the number of times you can potentially
end up at the top of search engine rankings.
Because entry pages are important in the structure of your web site, you want to monitor those pages
using a web-site analytics program to ensure they are working the way you expect them to work. A
good analytics program, like Google Analytics, will show you your top entry and exit pages.
Exit pages are those from which users leave your site, either by clicking through an exiting link,
selecting a bookmark, or typing a different web address into their browser address bar. But why
are exit pages important? They serve two purposes: the first is to drive users from their entry
pages to a desired exit page. This is called the path that users travel through your site. A typical
path might look something like this:
SERP ➪ Home ➪ Women’s Clothing ➪ Product Pages ➪ Shopping Cart ➪
Checkout ➪ Receipt
In this example, Home is the entry page and Receipt is the exit page. By looking at this navigational
path, you can tell how users travel through your site and where they drop off. But there’s an
added benefit to understanding the navigational path of your users. When you know how users
travel through your site, you can leave what’s called a bread-crumb trail for them. That’s a navigational
indicator on the web site that allows them to quickly see where they are on your site, as shown
in Figure 3-6. This is the navigation path shown on the Wal-Mart web site. You can quickly see
where in the navigational structure of the site you’re located.

The bread-crumb trail not only helps users return to a previous page in the navigational path; it also
makes it easier for a web crawler to fully examine your site. Because crawlers follow every link on
your page, this is an internal link structure that leads crawlers to individual pages that you want to
have included in search engine results.

Choosing an Analytics Program
An important element in any SEO plan is analytics — the method by which you monitor the effectiveness
of your web site. Analytics are the metrics that show you how pages, links, keywords,
and other elements of your web site are performing. If your web host hasn’t provided you with an
analytics program, find one. Not having an analytics program is like walking around in the dark,
hoping you won’t bump into a wall.
Many web-site owners shy away from analytics packages because they appear to be complicated as
well as expensive. However, they don’t always have to be. You can find a good analytics program
that’s not only easy to use but is also inexpensive or even free. But use caution about making ease
and low cost the deciding factors when selecting an analytics program.
The program will give you the power to see and control how your web site performs against your
goals and expectations. You want it to show you everything you need to know, so here are some considerations
when you’re evaluating analytics programs:
What reports are included in the tools you’re examining, and how will you use those
reports?
How do you gather the information used to create the metrics you need?
How often are your reports updated?
How much training is necessary to understand your application and the reports provided?
Is the software installed locally, or is the product provided strictly as a web-based service?
What is the total cost of ownership?
What types of support are available?
What is the typical contract length?
Many analytics programs are available to you. Google Analytics, AWStats, JayFlowers, ClickTracks,
and dozens of others all offer something different at a different price tag. If “free” is what you can
afford, don’t assume you’ll get a terrible package. Google Analytics is one of the free packages available;
it’s an excellent program and is based on what used to be the Urchin Analytics package (which
was quite costly). Other programs cost anywhere from $30 to $300 a month, depending on the
capabilities you’re purchasing.
The cost is not the most important factor, however. Ultimately, your consideration should be how the
analytics package can help you improve your business.

Using powerful titles
Page titles are one of the most important elements of site optimization. When a crawler examines
your site, the first elements it looks at are the page titles. And when your site is ranked in search
results, page titles are again one of the top elements considered. So when you create your web site,
you need to have great page titles.
There are several considerations when coming up with your page titles. Here are some of the key
factors to consider:
Unless you’re Microsoft, don’t use your company name in the page title. A better choice
is to use a descriptive keyword or phrase that tells users exactly what’s on the page. This
helps ensure that your search engine rankings are accurate.
Try to keep page titles to less than 50 characters, including spaces. Some search engines
will index only up to 50 characters; others might index as many as 150. However, maintaining
shorter page titles forces you to be precise in the titles that you choose and ensures
that your page title will never be cut off in the search results.

TIP
The World Wide Web Consortium (W3C) has determined that the outside length of a
page title should be no more than 64 characters. Search engines will vary in the size of
title that’s indexed. Using 64 characters or fewer is an accepted practice; however, that can still leave your
page titles cut off in search engines that index only up to 40 or 50 characters. For this reason, staying
at or below the 40-character length is a smarter strategy within your SEO efforts.
Don’t repeat keywords in your title tags. Repetition can occasionally come across as spam
when a crawler is examining your site, so avoid repeating keywords in your title if possible,
and never duplicate words just to gain a crawler’s attention. It could well get your site
excluded from search engine listings.
Consider adding special characters at the beginning and end of your title to improve
noticeability. Parentheses (()), arrows (<<>>), asterisks (****), and special symbols like
££££ can help draw a user’s attention to your page title. These special characters and
symbols don’t usually add to or distract from your SEO efforts, but they do serve to call
attention to your site title.
Include a call to action in your title. There’s an adage that goes something like, “You’ll
never sell a thing if you don’t ask for the sale.” That’s one thing that doesn’t change with
the Web. Even on the Internet, if you want your users to do something you have to ask
them to.
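Putting several of these guidelines together, a short, keyword-led title with a call to action might read something like this (the wording is purely illustrative):

<title>Nutmeg Graters: Compare Styles and Order Today</title>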
All your page titles should be indicated with the title tag when coding your web site. The title tag
isn’t difficult to use. Here’s an example of such a tag:
<title>A Descriptive Web Site Title</title>
If your page titles aren’t tagged properly, you might as well not be using those titles, so take the
time to ensure that your page titles are short, descriptive, and tagged into your web-site code. By
using title tags, you’re increasing the possibility that your web site will be ranked high within search
engine results.
Creating great content
Web-site content is another element of an SEO-friendly site that you should spend plenty of time
contemplating and completing. Fortunately, there are some ways to create web-site content that will
make search crawlers love you.
Great content starts with the right keywords and phrases. Select no more than three keywords or
phrases to include in the content on any one of your web pages. But why only three? Wouldn’t
more keywords and phrases ensure that search engines take notice of your site?
When you use too many keywords in your content, you face two problems. The first is that the effectiveness
of your keywords will be reduced by the number of different ones you’re using. Choose two
or three for each page of your site and stick with those.
The other problem you face is being delisted or ignored because a search engine sees your SEO
efforts as keyword stuffing. It’s a serious problem, and search engine crawlers will exclude your site
or pages from indexes if there are too many keywords on those pages.
Once you have the two or three keywords or phrases that you plan to focus on, you need to actually
use those keywords in the content of your page. Many people think the more frequently you use the
words, the higher your search engine ranking will be. Again, that’s not necessarily true. Just as using
too many different keywords can cause a crawler to exclude you from a search engine index, overusing
the same word will also cause crawlers to consider your attempts as keyword stuffing. Again,
you run the risk of having your site excluded from search indexes.
The term used to describe the number of times a keyword is used on a page is keyword density. For
most search engines, the acceptable keyword density is relatively low. Google is very strict: it expects
a keyword density of roughly 5 to 7 percent; much lower or much higher and your ranking is
seriously affected or completely lost.
Yahoo!, MSN, and other search engines allow keyword densities of about 5 percent. Going over that
mark could cause your site to be excluded from search results.
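To put those percentages in concrete terms: a page with 400 words of body text and 20 occurrences of a keyword has a keyword density of 20 ÷ 400 = 5 percent. Double the occurrences to 40 and the density jumps to 10 percent, well into the range that search engines treat as keyword stuffing.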
Keyword density is an important factor in your web-site design, and is covered in more depth in
Chapter 4. But there are other content concerns, too. Did you know that the freshness and focus
of your content is also important in how high your web site ranks? One reason many companies
began using blogs on their web sites was that blogs are updated frequently and they’re highly focused
on a specific topic. This gives search engines new, relevant content to crawl, and crawlers love that.
Consider implementing a content strategy that includes regularly adding more focused content or
expanding your content offerings. It doesn’t have to be a blog, but news links on the front page of the
site, regularly changing articles, or some other type of changing content will help gain the attention of
a search engine crawler. Don’t just set these elements up and leave them, however. You also have to
carry through with regular updates and keep the links included in the content active. Broken links
are another crawler pet peeve. Unfortunately, with dynamic content links will occasionally break. Be
sure you’re checking this element of your content on a regular basis, and set up some kind of user-feedback
loop so broken links can be reported to your webmaster.
Finally, when you’re creating your web-site content, consider interactive forums. If you’re adding
articles to your site, give users a forum in which they can respond to the article, or a comments section.
This leads to more frequent updates of your content, which search crawlers love. The result?
An interactive relationship with your web-site users will keep them coming back, and give an extra
boost to your search engine ranking.
Maximizing graphics
Images or graphics on your web site are essential. They’re also basically ignored by search engines,
so what’s the point of putting them on your site? There’s a good reason that has nothing to do with
SEO. Without images, your page is just boring text. You’re not going to be happy with using plain
text instead of that cool new logo you had designed for your company, and neither are your users.
They want to see pictures.

If images are a must on a web site, then there should be a way to use those images to increase your
web-site traffic or at least to improve your site ranking. And there is.
One technique that will help your SEO make use of graphics on your site is to tag those graphics
with alt tags inside the img tags.
Alt tags are the HTML tags used to display alternative text when there is a graphic present. Your alt
tags should be a short, descriptive phrase about the image that includes the keywords used on that
page when possible.
Img tags are the tags used to code the images that will appear on your web site. Here’s an example
of what an img tag, with an included alt tag, should look like:
<img src="yourimage.jpg" alt="alternative text" />

Here’s how that tag breaks down: <img src="yourimage.jpg" is the image tag itself, and
alt="alternative text" /> is your alternative text tag. The alternative text tag is where your keywords should be
included if at all possible.
You want to tag your images as part of your SEO strategy for two reasons. First, crawlers cannot index
images for a search engine (with an exception, which is covered shortly). The crawler “sees” the image
and moves on to the text on the page. Therefore, something needs to take the place of that image, so
the crawler can index it. That’s what the alternative text does. If this text includes your keywords, and
the image is near text that also includes the keywords, then you add credibility to your site in the
logic of the crawler.
The second reason you want to tag your images as part of your SEO strategy is to take advantage of
image-based search engines, like Google Images. These image-based search engines are relatively
new, but they shouldn’t be undervalued. Just as a search engine can find and index your site for
users searching the Web, image-based search engines find and index your images. Then, when
users perform a search for a specific keyword or phrase, your image is also ranked, along with the
text on the pages.
Image searches are gaining popularity. So crawlers like the one Google uses for its Google Images
search engine will gain momentum, and image searches will add to the amount of web-site traffic
that your SEO strategies help to build. But while not discounting the value of images, don’t overuse
them on your web pages either. As with any element of a web page, too much of a good thing
is just not good.
Problem Pages and Work-Arounds
No matter how much time and consideration you put into your SEO strategy, there are going to
be elements of your web site that require special consideration. Some sites — like portals — need
a different approach than a standard web site might require. How you deal with these issues will
impact the effectiveness of your SEO efforts.

Painful portals
The use of portals — those web sites that are designed to funnel users to other web sites and
content — as a search engine placement tool is a hotly debated topic. Many experts will start
throwing around the word “spam” when the subject of SEO and portals comes up. And there
have been serious problems with portals that are nothing more than search engine spam. In the
past, portals have certainly been used as an easy link-building tool offering nothing more than
regurgitated information. Sometimes the information is vaguely reworded, but it’s still the
same information.
Search engine operators have long been aware of this tactic and have made every effort to hinder
its usefulness by looking for duplicate content, interlinking strategies, and other similar indicators.
Using these techniques, search engines have managed to reduce the usefulness of portal web sites
as SEO spam mechanisms.
However, because search engine operators need to be cautious about portals that are nothing more
than SEO spam, your job in optimizing your site if it’s a portal is a little harder. As with all web-site
design, the best objective for your site, even for a portal, is to help your visitors achieve a desired
result, whether that’s purchasing a product, signing up for a newsletter, or finding the desired information.
If you make using your site easy and relevant, your site visitors will stay on your site longer,
view more pages, and return to your site in the future. Portals help you reach these goals by acting
as excellent tools for consolidating information into smaller, more manageable sources of information
that users find easier to use and digest.
Too often people optimizing web sites focus on the spiders and forget about the visitors. The sites you
are developing have to appeal to the visitors and provide them with the information that they’re looking
for, or all you’ll get at the end of the day is hosting bills and low conversion rates. Portal web sites
enable you to create a series of information resources giving full information on any given topic while
structuring a network of information covering a much larger scope.
Though the visitor is of significant importance when building a web site, the site’s visibility is of primary
significance, too. There’s no point in creating a beautiful web site if no one’s going to see it, and portals
are a fantastic tool for increasing your online visibility and search engine exposure, for a wide
variety of reasons.
Perhaps the most significant of these reasons is the increase in keywords that you can use in portal
promotion. Rather than having one web site with which to target a broad range of keywords, portals
allow you to have many web sites, each of which can have its own set of keywords. For example,
instead of trying to put “deer hunting” and “salt-water fishing” on the same page, you can create a
hunting portal that allows you to have separate sites for deer hunting, salt-water fishing, and any
other type of hunting activity that you would like to include.
On one page it is much easier to target the two keyphrases “deer season” and “Mississippi hunting
license” than it is to target two keyphrases like “deer season” and “marlin fishing.” Targeting
incompatible keywords or phrases — that is, keywords or phrases that aren’t related to a larger
topic — makes it harder both to write readable, relevant content and to rank for the keywords that
you need to use.

There are other advantages to creating web portals, as well. Having a portal allows you to have
multiple home pages, which can give you the opportunity to create sites that consistently appear in
top rankings. You also have more sites to include in your other SEO strategies, and more places to
include keywords. However, there is a fine line between a useful portal and one that causes search
engines to turn away without listing your portal on SERPs.
Don’t link all your sites to all of the others within your portal using some link-farm footer
at the bottom of every page. You may not even want to link all of them to the others on a
site map or links page. Instead, interlink them in an intelligent way. When you want to lead visitors to
another site in the portal, or when you want those users to be able to choose which site is most useful
to them, you can create intelligent links that have value for the site user. This value translates into better
rankings for your web site.
As with most issues in web design, keep it user-friendly and attractive. If you have any doubt that the
actions you’re taking with your site or the design methods that you’re using could lead to negative
results for the SEO of your site, don’t use them. If you’re feeling that a strategy won’t work, it probably
won’t, and you’re wasting your time if you insist on using a design you’re not comfortable with.
Fussy frames
Some web-site designs require the use of frames. Frames are sections of a web site, with each section
a separate entity from the other portions of the page. Because the frames on a site represent separate
URLs, they often create display issues for users whose browsers don’t support frames, and for search
crawlers, which encounter the frames and can’t index the site where the frame is the navigational
structure.
You have a couple of alternatives when frames are essential to the design of your web site. The first is
to include an alternative to the framed site. This requires the use of the noframes tag. The tag directs
the user’s browser to display the site without the framed navigational system. Users may see a stripped-down
version of your site, but at least they can still see it. When a search crawler encounters a site
made with frames, the noframes tag allows it to index the alternative site. It’s important to realize,
however, that when you use the noframes tag, you should load the code for an entire web page
between the opening tag and closing tag.
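A bare-bones sketch of that structure follows; the file names are placeholders, and in practice the body would hold the complete page content, not the single placeholder line shown here:

<frameset cols="25%,75%">
  <frame src="navigation.html" />
  <frame src="content.html" />
  <noframes>
    <body>
      <h1>Your Site Name</h1>
      <p>The full page content and text-based navigation links go here.</p>
    </body>
  </noframes>
</frameset>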
When you’re creating a noframes tag for a framed site, the content of the noframes tags
should be identical to the content of the frame set. If it’s not, a search crawler could consider it
spam, and then your site would be penalized or even delisted.
Another problem with frames is that search engines often display an internal page on your site in
response to a search query. If this internal page does not contain a link to your home page or some
form of navigation menu, the user is stuck on that page and is unable to navigate through your site.
That means the search crawler is also stuck in that same spot. As a result, the crawler might not index
your site.
The solution is to place a link on the page that leads to your home page. In this link, include the
attribute target="_top". This prevents your site from becoming nested within your own
frames, which locks the user on the page they landed on from the search results. It also makes it
possible for crawlers to efficiently crawl your site without getting stuck.

That link back to your home page will probably look something like this:
<a href="http://www.yoursite.com/" target="_top">Return to Home Page</a>
Frames are difficult to get around when you’re putting SEO strategies into place, but doing so is
not entirely impossible. It’s a good idea to avoid frames, but they won’t keep you completely out of
search engine rankings. You just have to use a different approach to reaching the rankings that you
desire.
Cranky cookies
Cookies are one of those irritating facts of life on the Internet. Users want web sites tailored to them,
and cookies are one way companies have found to do that. When users enter the site and customize
some feature of it, a small piece of code — the cookie — is placed on the user’s hard drive. Then,
when the user returns to the site in the future, that cookie can be accessed, and the user’s preferences
executed.
When cookies work properly, they’re an excellent tool for web designers. When they don’t work
as they should, the problems begin. So what constitutes a problem? The main issue with cookies
is that some browsers allow users to set how cookies will be delivered to them. And some source
code prompts the user for permission before a cookie is accepted. When this happens, the search
engine crawler is effectively stopped in its tracks, and it doesn’t pick back up where it stopped
once the cookies are delivered. Also, any navigation that requires cookies will cause the crawler
to be unable to index the pages.
How do you overcome this issue? The only answer is to code cookies to ensure that the source code
is not designed to query the user before the cookie is delivered.
Programming Languages and SEO
One aspect of web-site design you might not think of when planning your SEO strategy is the programming
language used in developing the site. Programming languages all behave a little differently.
For example, HTML uses one set of protocols to accomplish the visuals you see when you
open a web page, whereas PHP uses a completely different set of protocols. And when most people
think of web-site programming, they think in terms of HTML.
But the truth is that many other languages also are used for coding web pages. And those languages
may require differing SEO strategies.
JavaScript
JavaScript is a programming language that allows web designers to create dynamic content. However,
it’s also not necessarily SEO-friendly. In fact, JavaScript often completely halts a crawler from indexing
a web site, and when that happens the result is lower search engine rankings or complete exclusion
from ranking.

To overcome this, many web designers externalize any JavaScript that’s included on the web site.
Externalizing the JavaScript creates a situation where it is actually run from an external location,
such as a file on your web server. To externalize your JavaScript:
1. Copy the JavaScript code that appears between the opening and closing script tags, and paste it into a Notepad file.
2. Save the Notepad file as filename.js.
3. Upload the file to your web server.
4. Create a reference on your web page to the external JavaScript code. The reference
should be placed where the JavaScript will appear and might look like this:
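<script type="text/javascript" src="filename.js"></script>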

This is just one of the solutions you can use to prevent JavaScript from becoming a problem for
your SEO efforts. There are many others, and depending on your needs you should explore some
of those.
Sometimes, people use JavaScript as a way to hide content or links from a search engine.
However, search crawlers can read JavaScript and most can even follow the links that are
in JavaScript. So if you try to hide content or links behind JavaScript, you run the risk of having your
site labeled as search engine spam. There’s more about search engine spam in Chapter 17.
Flash
Flash is another of those technologies that some users absolutely hate. That’s because Flash, though
very cool, is resource intensive. It causes pages to load slower, and users often get stuck on an opening
Flash page and can’t move forward until the Flash has finished executing. If the user is in a hurry,
it’s a frustrating thing to deal with.
Flash is also a nightmare when it comes to SEO. A Flash page can stop a web crawler in its tracks,
and once stopped, the crawler won’t resume indexing the site. Instead, it will simply move on to
the next web site on its list.
The easiest way to overcome Flash problems is simply not to use it. But despite the difficulties with
search rankings, some organizations need to use Flash. If yours is one of them, the Flash can be
coded in HTML and an option can be added to test for the ability to see Flash before the Flash is
executed. However, there’s some debate over whether or not this is an “acceptable” SEO practice,
so before you implement this type of strategy in an effort to improve your SEO effectiveness, take
the time to research the method.
Dynamic ASP
Most of the sites you’ll encounter on the Web are static web pages. These sites don’t change beyond the
regular updates by a webmaster. On the other hand, dynamic web pages are web pages that are created
on the fly according to preferences that users specify in a form or menu. The sites can be created
using a variety of different programming technologies, including dynamic ASP.

The problem with these sites is that they don’t technically exist until the user creates them. Because a web crawler can’t make
the selections that “build” these pages, most dynamic web pages aren’t indexed in search engines.
There are ways around this, however. Dynamic URLs can be converted to static URLs with the right
coding. It’s also possible to use paid inclusion services to index dynamic pages down to a predefined
number of levels (or number of selections, if you’re considering the site from the user’s point of view).
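How that conversion is handled depends on your server. As one illustration only, assuming an Apache server and a hypothetical product.asp page (neither of which this chapter prescribes), a rewrite rule can map a static-looking URL to the underlying dynamic page:

# .htaccess — assumed Apache setup; pattern and page name are placeholders
RewriteEngine On
RewriteRule ^products/([0-9]+)$ /product.asp?id=$1 [L]

The crawler sees and indexes a static-looking URL such as /products/42, while the server quietly serves the dynamic page behind it.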
Dynamic ASP, like many of the other languages used to create web sites, carries with it a unique set
of characteristics. But that doesn’t mean SEO is impossible for those pages. It does mean that the
approach used for the SEO of static pages needs to be modified. It’s an easy enough task, and a quick
search of the Internet will almost always provide the programming code you need to achieve SEO.
PHP
Search engine crawlers being what they are — preprogrammed applications — there’s a limit
to what they can index. PHP is another of those programming languages that falls outside the
boundaries of normal web-site coding. Search engine crawlers see PHP as another obstacle if
it’s not properly executed.
Properly executed means that PHP needs to be used with search engines in mind. For example,
PHP naturally stops or slows search engine crawlers. But with some attention and a solid understanding
of PHP and SEO, it’s possible to code pages that work, even in PHP.
One thing that works well with PHP is designing the code to look like HTML. It requires an experienced
code jockey, but it can be done. And once the code has been disguised, the PHP site can be
crawled and indexed so that it’s returned in search results.
Other Design Concerns
You’re likely to encounter numerous problems with SEO when designing your web site. Some are
easy to overcome. Others can be quite difficult. And still others aren’t problems you have to overcome;
rather, you just need to beware of them or risk being ignored by search engine crawlers.
Among tactics that might seem okay to some, but really aren’t, are the so-called black-hat SEO
techniques. These are practices implemented with a single thought in mind — increasing search
engine rankings, no matter how inappropriate those rankings might be. Some companies deliberately
use such techniques when creating web sites, even if the results that show up have absolutely
nothing to do with the search terms users entered.
Domain cloaking
On the surface, domain cloaking sounds like a great idea. The concept is to show users a pretty web
site that meets their needs, while at the same time showing search engines a highly optimized page that probably would be almost useless to users. In other words, it’s a slimy trick to gain search
engine ranking while providing users with a nice site to look at.
It starts with content cloaking, which is accomplished by creating web-site code that can detect and
differentiate a crawler from a site user. When the crawler enters the site, it is re-directed to another
web site that has been optimized for high search engine results. The problem with trying to gain
higher search results this way is that many search engines can now spot it. As soon as they find
that a web page uses such a cloaking method, the page is delisted from the search index and not
included in the results.
Many less-than-savory SEO administrators will use this tactic on throw-away sites. They know the
site won’t be around for long anyway (usually because of some illegal activity), so they use domain
cloaking to garner as much web site traffic as possible before the site is taken down or delisted.
Duplicate content
When you’re putting together a web site, the content for that site often presents one of the greatest
challenges, especially if it’s a site that includes hundreds of pages. Many people opt to purchase bits
of content, or even scrape content from other web sites to help populate their own. These shortcuts
can cause real issues with search engines.
Say your web site is about some form of marketing. It’s very easy to surf around the Web and find
hundreds (or even thousands) of web sites from which you can pull free, permission-granted content
to include on your web site. The problem is that every other person or company creating a web
site could be doing the same thing. And the result? A single article on a topic appears on hundreds
of web sites — and users aren’t finding anything new if they search for the topic and every site has
the same article.
To help combat this type of content generation, some search engines now include as part of their
search algorithm a method to measure how fresh site content is. If the crawler examines your site
and finds that much of your content is also on hundreds of other web sites, you run the risk of
either ranking low or being delisted from the search engine’s indexing database.
Some search engines now look for four types of duplicate content:
Highly distributed articles. These are the free articles that seem to appear on every single
web site about a given topic. This content has usually been provided by a marketing-savvy
entrepreneur as a way to gain attention for his or her project or passion. But no matter
how valuable the information, if it appears on hundreds of sites, it will be deemed duplicate
and that will reduce your chances of being listed high in the search result rankings.
Product descriptions for e-commerce stores. The product descriptions included on
nearly all web pages are not included in search engine results. Product descriptions can
be very small and depending on how many products you’re offering, there could be thousands
of them. Crawlers are designed to skip over most product descriptions. Otherwise,
a crawler might never be able to work completely through your site.

Duplicate web pages. It does no good whatever for a user to click through a search result
only to find that your web pages have been shared with everyone else. These duplicate pages
gum up the works and reduce the level at which your pages end up in the search results.
Content that has been scraped from numerous other sites. Content scraping is the
practice of pulling content from other web sites and repackaging it so that it looks like
your own content. Although scraped content may look different from the original, it is
still duplicate content, and many search engines will leave you completely out of the
search index and the search results.
Hidden pages
One last SEO issue concerns the damage to your SEO strategy that hidden pages can inflict. These
are pages in your web site that are visible only to a search crawler. Hidden pages can also lead to
issues like hidden keywords and hidden links. Keywords and links help to boost your search rankings,
so many people try to capitalize on these requirements by hiding them within the body of a
web page, sometimes in a font color that perfectly matches the site background.
There’s no way around the issue of hidden pages. If you have a web site and it contains hidden pages,
it’s just a matter of time before the crawler figures out that the content is part of a hidden SEO strategy.
Once that’s determined by the crawler, your site ranking will drop drastically.
After Your Site Is Built
Building the right site to help maximize your SEO efforts is a difficult task. And when you’re finished,
the work doesn’t end. SEO is an ongoing strategy, not a technology that you can implement and forget.
Time needs to be spent reviewing your practices, examining results, and making adjustments
where necessary. If this ongoing maintenance is ignored, your SEO efforts to this point will quickly
become time that would have been better spent standing out on the street with a sign around your
neck advertising your web site. That might be more effective than outdated SEO.
Beware of content thieves
Maintenance of your SEO strategies is also essential in helping you find problems that might be completely
unrelated to SEO. For example, SEO strategies can help you locate content thieves. One such
strategy is tagging your web site. Some people (including black-hat SEOs) take snippets of content
from your site to use on their own. If you tag your content properly, you can use some very distinctive
tags, which will help you quickly locate content that has been stolen.
Another way in which SEO helps you to locate stolen content is through tracking. Presumably, if
you’re executing SEO strategies, you’re monitoring your site metrics with a program like Google
Analytics. Watching the metrics used by one of those analytics programs can help you locate content
thieves. For example, if you look at your incoming links on one of these programs, you might
find that people are coming to your site from a completely unexpected location. If that’s the case, you
can follow the link back to that site to find out why. A site using stolen content is easy to find
using this method. There are also many services available that will help you track your web-site
content. Those services are covered in more depth in Chapter 12.
Tagging works well for finding content thieves, and there’s another tactic you can use to thwart automatic
content scrapers — domain cloaking. This is a process by which your web site appears to be
located somewhere other than where it is. This is accomplished using an HTML frame set that redirects
traffic from one URL to another. For example, if your web site address is www.you.somewhere.com,
you can use domain cloaking to have your site appear to be www.yourbusiness.com.
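Using the addresses from that example, the frame set on www.yourbusiness.com might amount to nothing more than this (a sketch, not a recommended pattern):

<frameset rows="100%">
  <frame src="http://www.you.somewhere.com/" />
</frameset>

The visitor sees www.yourbusiness.com in the address bar, while the content actually loads from the original address.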
The problem with using domain cloaking is that it can confuse a search engine crawler, because the
same content appears to be on two pages, although it’s only one page and one that redirects. And
another problem is that some search engine crawlers can’t read the frame set that’s used to redirect
the user, which means your site may end up not being ranked at all. This is a tactic that should only
be used in special cases where content is truly unique and could possibly affect your SEO rankings
(or that of someone who might steal it) in a dramatic way.
Dealing with updates and site changes
One last problem you may encounter after you’ve initially set up your SEO strategies is the updates
and changes that your site will go through. Often, people feel that once the SEO is in place, then it’s
always in place, and they don’t have to think about it again. But believing this can lead to a very
unpleasant surprise.
When your site changes, especially if there are content updates or changes to the site structure, links
can be broken, tags may be changed, and any number of other small details may be overlooked.
When this happens, the result can be a reduced ranking for your site. Site crawlers look at everything,
from your tags to your links, and based on what they see, your ranking could fluctuate from
day to day. If what the crawler sees indicates that your site has changed in a negative way, the site’s
ranking will be negatively affected.
Many things affect the way your site ranks in a search engine. You’ve seen an overview of a lot of
them in this chapter, and you’ll see them all again in more depth in future chapters. Realize that
SEO is not a simple undertaking. It is a complex, time-consuming strategy for improving your
business. And without attention to all of the details, you could just be wasting your time. So plan
to invest the time needed to ensure that your search engine optimization efforts aren’t wasted.
