Coach Tote Bags ErrorFix the Best Solution for a W 
Author: shoes0d8y (ORANGE EKSTRAKLASA; joined 24 Dec 2010; 730 posts; from England)
Posted: Fri 3:56, 04 Mar 2011
ErrorFix is an easy-to-install and easy-to-use software program which enables you to clean your own computer's system registry. The reason you need to do this is because the computer registry is constantly being updated with files which keep track of end-user data and applications you install, uninstall or make changes to.
Any time you do this, a tiny file is created and saved in the PC registry so that your computer knows what to do the next time you run a program. So as you can imagine, over a long period of time every one of these files can take up a lot of space. In addition, some files may very easily be mistaken for others or even become corrupted. When this happens you will notice your personal computer starts showing errors regularly.
How Does ErrorFix Function?
ErrorFix will search within your system registry and look for files which are redundant and aren't required any more. It will then get rid of them, which generally recovers hard disk space and helps eliminate any kind of errors that you may have been experiencing.
On top of this it has other benefits that various other computer registry cleaning programs don't have. One of them is a disk defragmenter that lets you optimize your hard drive and is claimed to be a lot better than the standard one that comes with Windows. It also features a complete backup facility to help you back up your registry before making any kind of changes to it. That is important just in case something goes wrong or you accidentally delete the wrong files while cleaning the PC registry.
User Friendly
ErrorFix has an interface that is very simple to use. Merely open up the software and follow the instructions. It will then execute a complete scan of the registry and tell you exactly which data files it believes to be unnecessary. One more click of your mouse button and everything will be removed.
Your computer should then run a lot faster and also be error free. It is all completed in just a couple of minutes and lets you use your personal computer exactly the way you should be able to, minus the headaches of constant problems and lock-ups.
ErrorFix is free to download and install and can increase your computer's speed by about 70%, so it is definitely worth doing right now!


Author: Saly986336 (ORANGE EKSTRAKLASA; joined 26 Jan 2011; 347 posts; from England)
Posted: Mon 4:25, 07 Mar 2011
SEO Tutorial / FAQ
The essentials of SEO
1. Introduction
1.1. What is SEO?
1.2. Do I need SEO?
1.3. Should I hire someone or make it all myself?
2. Basic concepts
2.1. Search engines
2.2. Terminology
3. Ranking factors
3.1. On-Page ranking factors
3.1.1. Important stuff
3.1.2. Helpful stuff
3.1.3. Useless stuff
3.1.4. Stuff that hurts your rankings
3.1.5. On-page factors summary
3.2. Off-Page ranking factors
3.2.1. What is it?
3.2.2. PageRank
3.2.3. Important stuff
3.2.4. Helpful stuff
3.2.5. Useless stuff
3.2.6. Stuff that hurts your rankings
3.2.7. Off-page factors summary
4. SEO strategies
4.1. On-Page SEO strategies
4.2. Off-Page SEO strategies
4.3. Tracking the results
4.4. Help! I've lost my rankings!
5. SEO software and tools
6. SEO resources that are worth visiting
Write for humans, not for search engines! Remember: you offer your products to humans. It is a human who reads the texts on your website and decides whether or not to purchase anything from you. Yes, technically speaking, search engines read your site too, but I have never heard of a search engine that would buy something.
So you should create content that is interesting and useful for your human visitors in the first place!
First of all, let me assure you that this is NOT the usual SEO tutorial made up mostly of techniques that worked well in 2003, written by guys who never tried to apply those techniques themselves in 2010. This tutorial contains some basic SEO info as well as some more advanced tips and tricks. Many of them are actually rather obvious, but that doesn't make them less important; in fact, it seems to be their very obviousness that makes some people think these tips don't work.
This SEO tutorial is not about "how-to-trick-google-best-of-all" stuff. We present only legitimate, whitehat, working methods and recommendations here. Read along!

1. Introduction
One of the buzzwords of the last 10 years in internet marketing is SEO. Everyone talks about SEO, and everyone tries to apply it more or less successfully. If you are experienced in this area you may skip the first chapters; otherwise read along to learn the very basics.
1.1. What is SEO?
The term "SEO" is an abbreviation of "Search Engine Optimization". It is not about optimizing search engines, though; it is about optimizing websites for search engines. But why would one need to optimize a website? To answer this question we need to understand what a search engine is.
Search engines as a way to find information on the web appeared in the mid-90's. They crawled websites and indexed them in their own databases, marking each page as containing one keyword or another. Thus, when someone typed a query into the search box, the engine quickly searched its database and found which indexed pages corresponded to that query.
So, the more keywords of a query a website contained, the higher it was shown in the search results. We don't know who was the first person to realize that he could make some changes to the pages of his website to make it rank higher, but it was a brilliant idea!
So, SEO is something that helps your site rank better in search engines. There are a number of SEO methods; some of them are legitimate, while others are restricted and considered "blackhat" techniques. Search engines don't like blackhat SEO, and the consequences of using it may be disastrous for your website. Anyway, we'll cover this thoroughly later in this SEO FAQ.
Back to Table of Contents
1.2. Do I need SEO?
Well, the answer "Yes" is the first thing that comes to mind, isn't it? But let's think a bit more. Does SEO help, say, an oil-extracting company sell its product? Does it help to promote a small local grocery in your neighbourhood? Does it help Obama manage his bureaucrats? Well, I guess you've got the idea. SEO is effective mostly for Internet businesses. Do you have one? Then you need SEO. Otherwise, SEO is only one of the possible channels to spread the word about your product or service, and not necessarily the best one.
Back to Table of Contents
1.3. Should I hire someone or make it all myself?
One of the most frequent unspoken questions is: should I hire an SEO professional or save a few bucks and do it myself? There is no single universal answer for all situations, so here are some pros and cons.
Hired SEO. Pros: you don't have to waste your time; you don't have to learn SEO yourself; SEO pros can be quite effective. Cons: you still need to supervise the hired SEO yourself; SEOs usually don't give any guarantees, so you must be very cautious when choosing one to hire; the hired guy may be an SEO professional, but not necessarily a professional in your field; finally, you have to pay this guy.
Do it yourself. Pros: if you want something done right, do it yourself - you are the one running the whole show, so you know best what is right and what is wrong about it; you really do save some bucks; you can constantly monitor the trends and adjust your SEO strategy on the fly. Cons: you will need to spend some time reading SEO FAQs and tutorials like this one, posting newbie questions on forums and doing the other things newbies always do - it doesn't kill, but it does take time; you may get very little SEO benefit for all of your effort and time spent - after all, you are not a guru, right?
Back to Table of Contents
2. Basic concepts
2.1. Search engines
Before we start talking about search engine optimization we need to understand how search engines work. Basically, each search engine consists of three parts.
The Crawler (or the spider). This part of a search engine is a simple robot that downloads the pages of a website and crawls them for links. Then it opens and downloads each of those links to crawl (spider) them too. The crawler visits websites periodically to find changes in their content and modify their rankings accordingly. Depending on the quality of a website and the frequency of its content updates, this may happen anywhere from once per month up to several times a day for high-popularity news sites.
The crawler does not rank websites itself. Instead, it simply passes all crawled websites to another search engine module called the indexer.
The Indexer. This module stores all the pages crawled by the spider in a large database called the index. Think of it as the index in a paper book: you look up a word and see which pages mention it. The index is not static; it is updated every time the crawler finds a new page or re-crawls one already present in the index. Since the volume of the index is very large, it often takes time to commit all the changes into the database. So one may say that a website has been crawled, but not yet indexed.
Once the website with all its content is added to the index, the third part of the search engine begins to work.
The ranker (or the search engine software). This part interacts with the user and asks for a search query. Then it sifts through millions of indexed pages and finds all those that are relevant to that query. The results are sorted by relevance and finally shown to the user.
What is relevance and how would one determine if a page is more or less relevant to a query? Here comes the tricky part - the ranking factors...
Back to Table of Contents
2.2. Terminology
Here are the basic terms you need to know. All others will be explained along the way.
Anchor text
This is simply the text of a link. Let's suppose you have a link like this:
<a href="...">The essentials of SEO - a complete guide</a>
The link would be looking as follows:
The essentials of SEO - a complete guide
The text "The essentials of SEO - a complete guide" - is the anchor text in this case. The anchor text is the key parameter in a link building strategy. You should always make sure that the anchor text of a link meets the theme of that page. If your page is about dogs, do not link to it with the "cats" anchor text. Obviously, you cannot control all and every link on the web, but at least you should make all links within your own website have an appropriate anchor text.
Inbound link
...or backlink is a link that points to your site. The more you have, the better. But there are many exceptions to this rule, so read the Off-Page optimization section to learn more.
Keyword
One or more words describing the theme of a website or page. Strictly speaking, we should distinguish keyWORDS from keyPHRASES, but in SEO practice they are all called keywords. For instance, the keywords for this page are: SEO FAQ, SEO tutorial, etc.
Short-tail and long-tail keywords
An easy one. Short-tail keywords are general, common words and phrases like "rent a car", "seo", "buy a toy", "personal loan" and so on. Long-tail keywords, on the contrary, describe a theme precisely: "rent bmw new york", "seo in florida", "buy a plush teddy bear", etc. The more precise a keyword is, the less popular it is - the fewer people type that exact query into the search box. But the other side of the coin is: since each such query is highly targeted, once a visitor comes to your website from it and finds what he is looking for, it is very likely that this visitor will soon become a customer. This part is very important! Long-tail queries are not very popular, but the conversion rate for such queries is much greater than for short-tail ones.

SERPs
You may have heard this term without knowing what it means. SERP stands for "Search Engine Result Page". When a user types a query and hits Enter, he is taken to a SERP. There he can click one of the results to open that website. Obviously, the results shown in the first positions get far more visitors than the ones on pages 2-3 and lower. This is the actual purpose of SEO: to make a website move higher in the SERPs.
Snippet
This is the short description shown by a search engine in the SERP listings. The snippet is often taken from the Meta Description tag, or it can be created by a search engine automatically based on the content of the page.
Landing page
A landing page is the page that opens when a visitor comes to the site by clicking a result in the SERP. For example, for the "google monitor" query, the page that the chosen search result points to is the landing page for that query.
Link juice
This funny term means the value that passes from one page to another by means of a link between them. To be precise: the linked page (the acceptor) gets link juice from the linking page (the donor). The more link juice flows into a page, the higher it is ranked. Let's imagine a page that is worth $10 - this is the value of that page. If the page has 2 links, each one is then worth $5 - that is the amount of link juice passed to each linked page. If it has 5 links instead, each one only passes $2 of the initial link juice.
This means that the more links Page A has, the less value each linked Page B gains from Page A. Obviously, the real link juice value is not measured in dollars.
Nofollow links
A nofollow link is a link that a search engine should not follow. To make a link nofollow you use code like this:
<a href="..." rel="nofollow">Some anchor text</a>
Google does not follow nofollow links and does not transfer the link juice across such links. You can read more about nofollow links here.
Link popularity
This term designates the number of inbound links pointing to a site. Popular sites have more links. However, the number of inbound links is only half of the pie. Read the off-page optimization section below to learn more.
Keyword stuffing
When you put a long list of keywords in a tag - this is keyword stuffing. For instance, a title tag for this page could look like: <TITLE>SEO guide, SEO FAQ, SEO tutorial, best seo faq, seo techniques, seo strategy guide</TITLE> and so on. This would be the keyword stuffing. Instead, the current title of this page (the one you're reading now) looks quite natural and adequately describes its contents. Do not use the keyword stuffing as a) it does not work; b) it is a bad practice that can hurt your rankings.
Robots.txt
robots.txt is a file intended to tell search engine spiders whether or not they are allowed to crawl the content of the site. It is a simple txt file placed in the root folder of your website. Here are some examples:
This one blocks the entire site for GoogleBot:
User-agent: Googlebot
Disallow: /

This one blocks all files within a single folder except myfile.html, for all crawlers:
User-agent: *
Disallow: /folder1/
Allow: /folder1/myfile.html


Back to Table of Contents
3. Ranking factors
In general, there are only two groups of them: on-page and off-page ranking factors. It has long been argued which group is more important; we'll answer that question later in this FAQ. For now you should understand that both are crucial and both need proper attention.
3.1. On-Page ranking factors
There are many on-page ranking factors, and even more have been talked about since the first days of SEO. Some of them are really important, while others are said to be crucial for SEO but are actually useless or even hurt your rankings. You know, search engines are evolving, they change their algorithms, and something that used to work in 2003 has now become a piece of useless garbage. So, here is the list of on-page ranking factors, sorted by their importance and SEO value.
3.1.1. Important stuff
Title
This seems to be one of the most important on-page factors, so pay close attention to the title tag. A good title is short, written naturally, adequately describes the content of the page, and has its most important keywords near the beginning (see the keyword placement section below).
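For illustration, a hypothetical title along those lines (the page and wording are invented for the example, not taken from any real site):
<TITLE>Plush Teddy Bears - Soft Toys for Kids</TITLE>
Compare it with the keyword-stuffed title shown earlier in the Keyword stuffing entry - the natural one reads better and still carries the important keywords up front.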
Content
The next important factor is the content of a page, which may sound pretty naive at first glance, right? Wrong! Content is king, as SEOs like to repeat. Quality content not only describes your product or service, it also converts your visitors into customers and customers into returning customers. Quality content increases your ranking in search engines, because they like quality content. Moreover, quality content even helps you get more inbound links to your website (see the off-page ranking factors below)!
The basic tips for content boil down to the rule stated at the beginning: write for humans, not for search engines, and keep your content unique and genuinely useful.
Navigation and internal linking
Again, an important ranking factor. It seems obvious that you should create proper navigation so that the search engine crawler can follow all the links on a website and then index all of its pages. However, this factor is still highly underestimated. Clear, simple plain-text navigation helps both search engines and human visitors.
Avoid using JavaScript or Flash links, since they are hard for search engines to read. Always provide an alternative way to open any page on your website with simple text links. Make sure a sitemap of your website is reachable from any other page within one click.
Also keep in mind that quality internal linking spreads the link juice across the pages of your website, and this strongly helps your landing pages rank better in SERP for long-tail keywords. Use this wisely, though. Link only to pages that really need to be linked to.
Let's suppose you have two pages: one generates $10 of income for every visitor, while the other one brings only $0.10. Which one would you link to first? Think of it that way and link to the most important and valuable pages of your website, using a relevant anchor text for each link.
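As a rough sketch, plain-text navigation with a sitemap reachable in one click might look like this (the page names and URLs are invented for the example):
<a href="/index.html">Home</a> | <a href="/widgets/">Widgets</a> | <a href="/about.html">About</a> | <a href="/sitemap.html">Sitemap</a>
Every item is a simple text link, so both crawlers and human visitors can follow it without JavaScript or Flash.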
Back to Table of Contents


3.1.2. Helpful stuff
The factors and techniques below are not as crucial as the ones described above, but they still help a bit in gaining a higher rank in SERPs.
Headings
Once upon a time search engines paid close attention to the heading tags (H1 through H6), but those days are gone. Heading tags are easily manipulated, so their value is not very high nowadays. Nevertheless, you still want to use headings to mark the beginning of a text, to split an article into parts, and to organize sections and sub-sections within your document. In other words, even though headings provide merely a small SEO value, they are still crucial for making your texts easily readable by human visitors.
Use the H1 tag for the main heading of the page, the H2 tags for the article headings, and the H3 tags to split the different parts of an article with sub-headers. That is pretty good practice and is enough to make your site readable by humans. It also adds some SEO points, which you should not neglect either.
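A minimal sketch of that structure, using this very tutorial as the hypothetical page:
<h1>The essentials of SEO</h1>
<h2>On-Page ranking factors</h2>
<h3>Title</h3>
<h3>Content</h3>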
Bold/Strong and Italic/Emphasized text
Both are nearly useless, but still have some SEO value (very little, though). As with headings, you had better use them for the benefit of your human visitors, emphasizing the key parts of the text. But do not put every 5th keyword in bold: it looks ugly and does not give any significant boost to your rankings anyway. Moreover, such a page would be very hard to read.
Keyword placement
The value of keywords in a text depends on their placement across the page. Keywords placed near the top of the document get higher value than ones residing near the bottom. Important: when I say top or bottom I mean the source of the HTML document, not its visual appearance. That is why you want to put your navigation and supplemental texts near the bottom of the source file and all important and relevant content - near the top.
This rule also works in more specific cases: keywords placed in the beginning of the title tag are more important than ones placed 4th or 5th. Keywords placed in the beginning of the anchor text are more important and get more value too.
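A simplified sketch of this idea: in the HTML source the content block comes before the navigation block, while CSS (not shown here) can still display the menu at the top of the page. The class names are made up for the example:
<body>
<div class="content">Important, keyword-rich content goes here...</div>
<div class="menu">Navigation and supplemental links go here...</div>
</body>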
Keywords in filenames and domain name
An old trick: put your target keywords into a filename or into the domain name itself. It still works, but don't expect too much of a boost from it.
Image Alt attribute
This one was very popular in 2003, but now keyword stuffing of the Alt attribute does not give any SEO value to a page. A better use of the Alt attribute is a natural description, something like this: <img src="..." alt="A brown plush teddy bear sitting on a bookshelf">
Write a natural description for each image and make sure it reads well. This helps you in two ways: a) your site ranks better in the image search; b) Google often takes the Alt text to create a snippet for the SERP.
Meta Description
One of the most popular and persistent myths (alongside keyword density) is the Meta Description tag. They say it helps you rank better. They say it is crucial to have it filled with an appropriate description of the page content. They say you must have it on each page of your website. None of this is true. Nowadays, the only way the Meta Description is used by search engines is that its content is taken to create a snippet for the SERP. That is all! You don't get any other benefit from using the Meta D on your page, nor do you incur any penalty for not using it.
There is an opposite opinion suggesting not to use the Meta D at all, since a search engine creates a snippet based on the content of a page anyway, and you can't do this job better than a search engine, so why waste your time? Personally, I would not agree with this point, since according to Google's guidelines the Meta Description tag is still the preferred source of information for the snippet. It is up to you to decide whether you want it on your page or not, since, as stated above, it doesn't give any additional SEO impact, neither positive nor negative.
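If you do decide to keep it, a hypothetical Meta Description for this page might look like this:
<META NAME="description" CONTENT="A free SEO tutorial and FAQ covering on-page and off-page ranking factors, SEO strategies, software and tools.">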
Back to Table of Contents
3.1.3. Useless stuff (no pain, but no gain either)
Meta Keywords
A long time ago the Meta Keywords tag was intended to tell search engines which keywords were relevant to a particular page. Nowadays search engines download websites and extract the relevant keywords from their content, so the Meta Keywords tag is not used for web ranking anymore. Simply forget it; it is useless for SEO.
Keyword Density
One of the most overestimated web ranking factors is keyword density. What is keyword density, and why does this myth live so long? The keyword density of each particular word on a page is calculated as follows:
KD = Word_Count / Total_Words * 100%
That is, if a page has 150 words and the word "SEO" is mentioned 24 times on that page, its keyword density would be: 24 / 150 * 100% = 16%
But why is this value useless? Because search engines have evolved and no longer rely on keyword density, since it is very easily manipulated. There are thousands of factors that search engines consider when ranking a page, so why would they need such a simple (not to say primitive) method as counting the number of times a word appears in the page text? You may hear that a keyword density of 6% is the best rate, or that you should keep it within 7% to 10%, or that search engines like keyword density within 3% to 7%, and other nonsense. The truth is...
Search engines like pages written in a natural language. Write for humans, not for search engines! A page can have any keyword density from 0% (no keyword on a page at all) to 100% (a page consisting of only one word) and still rank high.
Well, of course you may want to control the keyword density of your pages, but please consider that there is no "good" value for this factor. Any value will work if your text is written with a human reader in mind. Why would one still want to check keyword density if it is not counted any more? Because it is a quick and dirty way to estimate the theme of a page. Just do not overestimate this thing: it is merely a number, nothing more, and it is useless for SEO.
Another interesting question: why is this myth still alive, and why are there so many people still talking about keyword density as an important ranking factor? Perhaps because keyword density is easy to understand and to modify if needed. You can see it with your naked eye and quickly decide whether your site is doing well or badly. Well, it only seems that way - keyword density is useless, remember?
Dynamic URLs vs. static URLs
Believe it or not, there is no difference. Both are of the same SEO value. The days when search engines had difficulties indexing dynamic-URL websites are gone for good.
www.site.com vs. site.com
No difference either. If you want your site to be accessible both ways, add something like this to your .htaccess file:
RewriteEngine on
RewriteCond %{HTTP_HOST} ^domain.com
RewriteRule (.*) http://www.domain.com/$1 [R=301,L]

Underscore vs. hyphen in URLs
Once again, there is no difference from the SEO point of view. You can use an underscore, or a hyphen, or no separator at all - it neither helps nor hurts your position in SERPs.
Subfolders
Is it better to have a /red-small-cheap-widget.php file rather than /widgets/red/small/cheap/index.php? Does it hurt your rank if you put the content deep into the subfolders? The answer is no, it won't hurt your rankings and actually it doesn't matter at all how deep in the folder tree a file is located. What matters is how many clicks it takes to reach that file from the homepage.
If you can reach a file in one click, it certainly is more important and will have more weight than, say, some other file located 5 clicks away from the index page. The homepage usually has a lot of link juice to share, so the pages it links to directly are obviously more important than others (well, because they receive more link juice, that is).
W3C validation
W3C is World Wide Web Consortium - an international consortium where Member organizations, a full-time staff, and the public work together to develop Web standards. Basically speaking, they are guys who invented HTML, CSS, SOAP, XML and other web technologies.
Validation is the process of checking a page or website for its compliance with W3C standards. You can run a validation of any website for free here. Note that this validator catches not only trivial things like unclosed quotation marks, undefined tags or wrong attribute values; it also checks for encoding problems, compliance with the specified DOCTYPE, obsolete tags and attributes, and much more.
Why is validation needed? A 100% valid website ensures that it will display correctly (and identically!) in all browsers that support the standards. Unfortunately, in real life some browsers do not strictly follow the W3C standards, so cross-browser problems on a number of websites are not a rare thing all over the web. This doesn't belittle the importance of W3C standards, however.
From the SEO point of view, validation doesn't look so crucial, though. Run a validation of google.com and you'll see a bunch of warnings and errors on their website. This example pretty clearly shows that Google doesn't care about W3C validation itself - at least not enough to give a strong rank boost to valid websites or to penalize erroneous ones. The recommended W3C validation strategy is: perform it to make your site work and be accessible in all common browsers, and don't bother doing it for SEO purposes only - if you don't experience any cross-browser issues, it works fine as it is.

Back to Table of Contents
3.1.4. Stuff that hurts your rankings
Keyword stuffing
Google defines that term pretty clearly. Once again: write for humans. Repeating keywords across the page can trigger Google's spam filter, and this will result in a huge loss of positions, if not a total ban of your website. Write naturally and optimize a bit where needed - that's the best way of using keywords nowadays.
Hidden text / Invisible links
First, let's see what Google says about hidden text. Obviously, Google doesn't like it, and if your site uses such a technique it may be excluded from Google's index. You may ask: how would Google know whether I use hidden text or not? OK, I can set "display:none" in my external CSS file and block access to that CSS file with my robots.txt. Will Google be able to tell that the page has hidden text then? Yes and no. This might work in the short term, but in the long run your disguise will fail sooner or later. Also, it has been reported that GoogleBot does not always strictly follow robots.txt instructions, so it can actually read and parse JS and CSS without any problems, and once it does, the consequences for your website and its rankings will be disastrous.
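Just to be clear about what NOT to do, the classic (and easily detected) form of hidden text looks something like this, with the keywords invented for the example:
<div style="display:none">cheap widgets best widgets buy widgets online</div>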
Doorway pages
About as bad as an SEO method can be. Doorway pages are special landing pages created for the sole sake of obtaining good positions for some particular keyword. A doorway page doesn't have any valuable content, and its only purpose is to catch the visitor from the SERP and redirect him to some other, non-doorway page, which by the way is usually completely irrelevant to the visitor's initial query.
Splogs
Splogs (derived from Spam Blogs) are the modern version of the old evil doorways. The technique was as follows: one created thousands of blogs on some free blog service like blogspot.com, linked them to each other and obtained some backlinks via blog comment spam and other blackhat methods (see below). The splogs themselves did not contain any unique information; their content was always automatically generated articles stuffed with keywords. However, due to the large number of inbound links such splogs ranked very well in SERPs, dislodging many legitimate blogs. Later, Google implemented filters to protect itself from the flood of splogs, and now any splog gets banned pretty quickly.
If you own a blog - do not make it spammy. Instead focus your attention on writing good and interesting content. This works better in fact.
Cloaking
Not as bad in some particular cases, but still a blackhat technique. The method is based on determining whether a visitor is a human or search engine spider and then deciding which content to show. Humans then get one variant of the website while search engines get another one, stuffed with keywords.
Duplicate content
Although it is a scarecrow for many webmasters, duplicate content is not actually as dangerous as it is said to be. There are two types of content that can be called duplicate. The first case is when a website has several different ways to access the same page, for instance:
http://site.com/
http://www.site.com/
http://site.com/index.html
http://www.site.com/index.html
All four refer to the same page, but are actually treated as different pages having the same content. This type of duplicate content issue is easily resolved by Google itself and does not lead to any penalty from Google.
The other type is duplicate content on different domain names. The content of a website is considered duplicate if it doesn't add any value to the original content. That is, if you simply copy-paste an article to your site, it is duplicate content. If you copy-paste an article and add some comments or review it from your own point of view, that's not duplicate content. The key feature here is the added value: if a site adds value to the initial information, it is not duplicate.
There are two other points here that are worth mentioning. First, if someone copies your text and then posts it on another site, it is very unlikely that you will be penalized for that. Google tracks the age of each page and tends to consider the older one - your website, in this case - as the source of the original text. Second, you can still borrow materials from other websites without a significant risk of being penalized for duplicate content simply by re-writing the text in your own words. There are ways to produce unique random texts using Markov chains, synonymizers and other methods, but I would not recommend using them, since the output looks too spammy and is not natural anyway, so it really can hurt your Google positions. Write for humans. Write it yourself.
Frames
The frames technology, while not blackhat SEO by itself, can still hurt your rankings, because search engines do not like frames: they break the basic concept of the web - a single page for a single URL. With frames, one page may load and display content from many other URLs, which makes it very hard to crawl and index. Avoid using IFRAME and the other associated tags unless you really, really have to, and if you do, provide an alternative way to index the contents of each frame with direct links, or use the NOFRAMES tag with some backup content shown to search engines.
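If you are stuck with frames, a minimal NOFRAMES fallback might look like this (the page names are hypothetical):
<frameset cols="20%,80%">
  <frame src="menu.html">
  <frame src="content.html">
  <noframes><body><a href="menu.html">Menu</a> <a href="content.html">Content</a></body></noframes>
</frameset>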
JavaScript and Flash
Google can read both JS and Flash (well, their text parts, of course), but it is not recommended to build your site solely on these two. There should always be a way for a visitor (whether human or bot) to reach the content of a website through simple plain-text links. Do not rely exclusively on JS or Flash navigation - this will kill your SEO prospects as quickly as a headshot.
Back to Table of Contents
3.1.5. On-page factors summary
Well, if you've read the above parts carefully, you can already figure out the summary yourself. Content is king, but only quality content is. Do not try to trick or cheat search engines, as this only works in the short run and it is always just a matter of time before your rankings drop for good. Providing high-quality, relevant content that is interesting both to you and to your visitors is the key to on-page ranking success and (paradoxically!) half of the way to success with off-page ranking factors.
Back to Table of Contents
3.2. Off-page ranking factors
3.2.1. What is it?
At the end of the 20th century search engines ranked websites based solely on their content. The situation changed after Google's triumph: Google's algorithms were based on link popularity, not only on the content of websites. So, the more inbound links a website had, the higher it was ranked by Google. The concept hasn't changed very much since those days - popular websites often get linked to, so this factor is used for calculating web rankings alongside the content of those websites. These days it is even possible to rank for a keyword that a website does not contain in its text at all! (the proof link)
Needless to say, you should pay as much attention to off-page ranking factors as you do to on-page optimization. This SEO tutorial describes all the things you should keep in mind while building and maintaining your inbound links. Read along.

Back to Table of Contents
3.2.2. PageRank
First of all, we must separate two things: the real PageRank and the green PageRank bar shown in the Google Toolbar and in other online and offline PageRank tools. The Google Toolbar PageRank (I'll call it the green PageRank, or gPR) is merely an indicator. The real PageRank of a website (I'll call it the PageRank, or PR, from now on) is a mathematical value reflecting the probability that a visitor randomly following links on websites will open this particular website. A value of 1 means 100% probability, that is, a visitor randomly surfing the web will always open the website, sooner or later. At the other extreme, a value of 0 means that a random visitor never comes to that particular website through a link on some other site.
I won't go deep into the mathematics of PageRank, since this info can easily be found on the web. I'll only underscore the key points of PageRank's statistical nature.
First of all, you must understand the following: the number of websites grows every day, while the overall PageRank value always stays the same: 1 (one). In other words, there is a 100% probability that a visitor opens SOME site on the web, but the odds of each particular website get lower and lower every minute. If you have 3 apples and two of them are maggoty, what are your odds of picking a good apple? They are 1/3, or 33%. If you have to pick the one good apple out of 100, you only have a 1% probability. That's the case with PageRank - it decreases naturally every day.
Due to point 1 and the overall enormous number of indexed websites, it is not possible to show the exact PageRank value every minute. That is why we need the green PageRank, which is updated every 3 or 4 months and shows the PageRank value in a more comprehensible form: as a number from 0 to 10. This number correlates with the actual PageRank only loosely; it merely shows the basic trends.
Also, the gPR scale is non-linear. One may think that a website with PR2 is two times more popular (or at least has twice the chance of getting that random visitor we were talking about earlier) than its unlucky brother with PR1, but that is not true. In fact it is more likely that a PR2 website is 10 times more popular than a PR1 one, but 10 times less popular than a PR3 one. Something like that; the number 10 is only an example here, since we don't know the exact formula.
PageRank models a random user who is surfing the web and following random links on websites. From the practical point of view this means: the more links all over the web point to your website, the higher its PageRank is.
So, now you know that the key off-page ranking factor is the number of inbound links to a website, and the green PageRank is an indirect indicator of that number. However, the PageRank mathematical mechanism considers only the quantity of links, while in fact there is also a quality factor. This is implemented via various filters and value damping factors that Google applies to each link before including it in the PageRank calculation.
Back to Table of Contents
3.2.3. Important stuff
This part describes the crucial off-page ranking factors you should always pay attention to.
The theme of the linking website
This one is very important, since links from a relevant website are worth much more. In your link building efforts, try to find websites that are close, or at least similar, to your own site's theme. Having a link from an unrelated site is not bad by itself, and even Google admits that a webmaster doesn't have full control over who links to his website and how. Nevertheless, avoid links from unrelated websites or from sites with illegal or unethical content (porn, malware, etc.).
The theme of websites you link to
On the other hand, you have full control over the links placed on YOUR own website, so if you link to some unrelated website, it is you who is responsible for that and it is your site that will be penalized. So be careful about which sites you link to. Linking to unrelated content does not necessarily lead to a penalty, but you should still be cautious and link only to quality websites.
Anchor text
The anchor text of an inbound link is very important, and if you can adjust it, try to squeeze everything you can out of it. First of all, avoid using the same anchor text across all the links: use synonyms, paraphrases, different keywords, whatever else. Second, put the important keywords at the beginning of the anchor text. Finally, do not try to put all of your keywords into the link; there is really no reason to use anchor text longer than 50-55 characters or 10-12 words. Keep it short.
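For instance, two inbound links to the same hypothetical guide could use different but relevant anchors, each with the key words up front:
<a href="http://www.example.com/seo-guide.html">SEO tutorial for beginners</a>
<a href="http://www.example.com/seo-guide.html">search engine optimization guide and FAQ</a>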
Landing pages
This factor is often ignored even by some professional SEOs and webmasters. It is not enough to simply have a link to your site! The link must be a) relevant and b) of high quality, and you must be sure that both websites - the linking one and the linked one - meet these requirements. As for the theme of the linking website, see point 1 above. But your own landing page should also be of high quality and relevant both to the donor website and to the anchor text of the link.
Strictly speaking it is not required, but it helps a lot to have a properly optimized landing page for every link you have. What does this "proper optimization" include? The landing page must include the keywords mentioned in the anchor text in its content; the keywords should appear in all the important places like the title tag, the headings, etc.; and the overall topic of the page must match those keywords. If a landing page meets all of these, it gets a significant boost to its rank, because the corresponding inbound links now get much more value.
PageRank
PageRank doesn't do anything by itself, and the green PageRank does even less - it is simply an ego-meter. However, the PR of a linking website (or a candidate one) gives you an approximation of what a link from that website is worth, what value it has. Also, high-PR websites are considered trusted and get some extra value from Google. See below for more trust factors.
A single link from a PR10 website (if you somehow manage to get one, of course) will quickly boost your own website's PR to 7 or even 8, giving you a comparable boost in your search positions. But this factor comes last in the list of important off-page factors, because first you should find a relevant, quality website that is willing to link to you, and only then check its PageRank. Exactly in that order, because quality content is worth more than a high PageRank.
Back to Table of Contents
3.2.4. Helpful stuff
Reciprocal linking
The basic reciprocal linking is very simple: site A links to site B while site B links to site A.

There are other schemes though:
Cross linking. Site A links to site B from page A1, while site B links to site A from page B1.

Circular linking. Site A links to site B, site B links to site C,... site Z links to site A.

Three-in-a-row linking. Site A links to site B, site B links to site C. No link back from C.

Combined.

There is a widespread misbelief that reciprocal linking does not work anymore. That's not true. It does work, but the efficiency of this method is much lower than it was in 2003. In 2009 Google greatly reduced the value of reciprocal links, especially for schemes a and c, but the whole concept still works and really helps to gain rankings in the early and middle stages of SEO promotion, when literally every link counts.
There are some exceptions though (as always). Needless to say, you still have to choose your partners for reciprocal linking very carefully. Consider the theme of the linking website, its quality, its neighbourhood (the other sites it links to); consider the page that would point to your site; pay attention to the anchor text of the link, etc. You don't want to exchange links with spammy websites, or with websites that use e-mail spam to propose the partnership. You don't want a link buried 17 clicks away from the homepage.
Usually you don't want a nofollow link, but even a nofollow link from a relevant site can bring a load of targeted visitors to your site, so it is up to you to decide whether it is only the link juice that you expect from the link exchange, or the audience too. By the way, you also don't want a link from a page that already has 50+ links on it. And a final yet important note: do not e-mail website owners all over the web with link exchange proposals! That sucks, no one answers anyway, you will put your karma down to zero with such activity, and you may well end up reported to SpamCop or other anti-spam services. Don't do that, I tell you.
Web directories
One more technique that everyone says doesn't work anymore. Well, to be honest, the efficiency of web directories was never that amazing. In fact there is only one web directory you certainly want to be included in: the Open Directory (DMOZ). It is a free, human-edited web directory of high value. It is a bit tricky to get included in it, because it often takes months before your submission is approved (if it ever is), but the game is worth the candle - a link from DMOZ is a significant boost to your website's value and a drop of life-giving link juice too.
If you have some free funds to spend, you may also want to be included in several paid-inclusion directories, starting with the Yahoo Directory, which seems to be the most respected of them. Also, there is a great article on directory submission you should definitely read, and don't miss the accompanying list of directories to submit to.
Social bookmarks
They used to work very well, but due to the enormous amount of spam on social bookmarking websites the method is not as efficient as it was 2-3 years ago. Promoting a website through social bookmarking (SB) websites has its pros and cons:
Pro: SB websites get crawled very frequently - every 2 or 3 hours. This means that if you manage to get there, you will get your piece of search engine traffic pretty soon.
Pro: Social bookmarks not only help you raise your link strength, but also bring some amount of pure traffic from the bookmarking sites themselves. Depending on the popularity of the article posted, the traffic to your website can vary from a few grains of sand to a pure avalanche.
Con: Unfortunately, you cannot simply bookmark a link to your website and wait for traffic. This might have worked in the first days of social bookmarking, but it doesn't now. First, the number of posts (diggs, reddits, stumbles, etc.) per minute does not leave each particular post many chances to get popular; your bookmark may simply get lost among hundreds of thousands of others. Second, bookmarking websites often have either moderators or some way for other users to flag an inappropriate post or bookmark. So if you post a bookmark of your own website, the link gets deleted and your account gets banned. Too bad.
There are some workarounds for this, though.
The whitehat one: post an article or some other valuable (do you hear me? I said valuable!) content on your website and wait until someone else links to it. Then you bookmark that site instead. This won't increase your link popularity or PageRank, but it will still bring you visitors. Oh, did I forget to mention that the other linking site could be yours too? So you may have a commercial website with the article and a non-commercial blog where you mention that commercial article. Or you may even have another blog where you mention the blog that mentions the commercial article (in the house that Jack built).
The blackhat one: create as many fake accounts as you need to promote your bookmark on every social bookmarking site. This is getting tough now, since the method was revealed ages ago and SB websites are already heavily loaded with such spam.
Con: One more bad thing about social bookmarks: they tend to work for a limited period of time. They bring you a splash of traffic in the short term, but then they simply deplete and only give a few visits a week. On the other hand, even a few visits are still more than no visits at all.
Con: And the worst thing about social bookmarking is the quality of the traffic it produces. The traffic from a social bookmarking website is not well targeted; it is based on the impulse of curiosity, not on intent. This means that you will (or will not - see above) receive a large load of traffic, but if you manage to convert merely 1% of it into customers you can congratulate yourself - you've done a good job! The conversion rate for this type of traffic is extremely low, so this recipe only works well for a limited class of websites and products. It is still good if you want to build a community or simply need many people on your website for some reason (AdSense and so on).
Trust factors
A bunch of ranking factors that you have only limited control over. Each of these factors does not add value directly to the rank of a website; instead, they increase its trust rating. Google (and other search engines too) prefers trusted websites and gives a boost to their rankings. Trust factors include:
Domain name age. Old websites seem more trustworthy than others. If the domain never changed its owner, it gets even more trust points (though don't ask how many).
The number and quality of inbound links, the PageRank. If many other websites link to this one, it is considered trustworthy. The quality of the links doesn't play a significant role here though, since you cannot be held responsible for the links pointing to your website - you have no control over that. Otherwise it would be possible to hurt your competitors by posting links to them from malware sites.
Website content. If a website uses pop-ups, pop-unders or some of the blackhat SEO methods, its trust rating gets lower.
Outbound links. If a website links to other trustworthy sites, it gets a boost to its own trust rating.
I believe there are more, but these are the most important.
Press releases, related resources, word of mouth etc.
That's a bit off the SEO theme, but it can still help you obtain a couple of links. Do you have some exclusive info? Share it with the community on some thematic resource. Do you have some astonishing news in your industry? Tell the world about it with a press release. Are you running a special promotion or offering a discount coupon? Let others know about it.
None of this usually requires a single cent from you. You can send a press release via PRWeb, register for free on forums within your industry to share your thoughts, or tell others about your promo coupons at Giveaway of the Day, RetailMeNot and other similar sites. Don't neglect the power of word of mouth!
Back to Table of Contents
3.2.5. Useless stuff
These off-page ranking factors do not work any more (some never did) or their value is negligibly tiny.
Many links pointing to the same page
If a page has several links pointing to the same URL, this won't give any additional SEO value, since Google only considers the very first link on the page. From the on-page optimization point of view this means you want to put your navigation menu links somewhere near the end of your HTML source. From the off-page point of view this means that you only need one link from any given URL, because only the first one counts anyway.
Moreover, it may even hurt a bit. Let's suppose there is an external page that has 3 links in total, one of which points to your site. This means that one third of the overall link juice of that page flows into your website. Now let's imagine that you asked the webmaster to add one more link to your site on that page. Now the page has 4 links, two of which point to your site. The link juice is now divided into 4 parts instead of 3, but hey - the second inbound link from that page is not counted anyway, so you are now getting one quarter of that page's link juice instead of the one third you got before!
Nofollow links
The SEO value of nofollow links is close to zero, as they don't pass PageRank and link juice doesn't flow through them either. However, a link is always a link. Would you decline a nofollow link from the homepage of Google? It would not give you any SEO value, but the traffic stream it would generate could smash any dam.
Links in signature
A forum post signature is a popular place for links, but the SEO value of this method is extremely low. The fact is that nobody reads your signature unless you become a significant figure in that community, and even then the signature links are not worth much. How many times have you opened someone's signature link yourself?
The direct SEO impact of such links is also tiny - the links are usually nofollow, and even if they are dofollow, they are still buried in the depths of forum topics. The amount of link juice you could obtain through them is not worth mentioning.
Does this mean "forget signature links"? No. If you manage to become part of the community and gain some authority there, every word you say (and your signature links as well) will attract the attention of the whole community. It takes time and effort for sure, but there is no such thing as a free lunch, you know.
Guestbook links
The old as hell technique that never worked.
Blog comment links
Do not post comments on a blog for the sole sake of the link. First, this is SPAM. Second, it doesn't work anyway. Third, most blogs make the links in comments nofollow, so don't waste your time on something that doesn't help you but annoys everyone else at the same time.
No-PR links
The amount of link juice that a no-PR link carries is utterly small, and, more importantly, the trust rating it passes to the linked sites is small as well. This means that it is crucial to obtain links from high-PageRank sites. Wasting your time on PR0 or no-PR sites is not worth the candle, because you would need a bulk load of such links for changes in your website rankings that you'll probably not even notice.
PageRank by itself (as stated in the sections above) does not directly affect the position of your website, but it does affect the trust rating that other sites pass to you when they link to you. Since you want links from trusted websites in the first place, you should prefer high-PR links over all others.
Article submission
This one comes up in many SEO FAQs and guides all over the web: write an article and submit it to article websites. That doesn't work. Well, OK, maybe it used to work in the past, but now it doesn't. What is an article? It is a piece of useful text that is interesting to its readers. Now imagine a guy who is interested in reading 100,000+ very similar articles on some article website. You can't? What's the problem? The problem is that such a guy never existed. Nobody wants to read an article made from parts of another ten articles, each of which was in turn constructed from some initial article written in early 2003 using keyword synonym auto-replace software. Who wants to read those articles? Who wants to link to their authors? Nobody.
Surely, articles are good and you definitely want to write some. But submitting them to article websites is useless. Try some link bait instead.
Submitting your site to Google
Useless, because if you have some inbound links Google crawls you anyway, and if you don't, there's no difference whether you submitted the site or not - it won't show up in SERPs. Though you may need this if your site has been excluded from the index for some reason (usually for some blackhat SEO), to get it included back once you have fixed the issue.
Back to Table of Contents
3.2.6. Stuff that hurts your rankings
Link Farms
Well, even a child knows: link farms are evil, never participate in link farms, and so on and so forth. What is a link farm? It is a group of websites that simply link to each other. The technique worked very well in the beginning of the 2000s due to the high influence of the link popularity parameter on the SERPs in those days. Then search engines introduced a filter, and now participating in a link farm has only negative consequences for your website's position. Do not participate in link farms, and do not create one either.
FFA (Free-For-All) sites
Another old-as-hell example of a useless SEO method that works the opposite of the way it was supposed to. The idea is to have a site that links to others, but the number of links is limited and all links are shifted down every time a new link is submitted to the FFA site. In practice this means that thousands of webmasters submit their links to an FFA site and each particular link is only displayed for about 5 minutes before it gets dislodged off the site by another load of links. Guess what SEO value such links have?
If you were hoping to get a bunch of human traffic, you are wrong too. All the traffic on an FFA site is generated by other webmasters submitting their links. Most of them do that automatically, so they do not visit other links anyway; this side of the FFA value is negative too. Finally, FFA sites are often a way to collect working e-mail addresses for SPAM purposes. Summarizing all of the above: by submitting to an FFA site you get a 5-minute link that nobody visits and tons of spam to your e-mail address. Doesn't sound too attractive, right?
Forum/Blog/E-mail Spam
Simply put - spam is spam. The first rule of ethical SEO is: do not use spam methods. The second rule of ethical SEO is: do not use spam methods! Read this carefully and remember: do not use spam methods. Ever. I'll curse you if you do.
Now, leaving the emotions behind, here is a more technical explanation of why spam is bad. Spam annoys forum readers, blog readers and owners, and e-mail recipients. Spam links on forums or blogs are useless in terms of SEO, because forums where you can freely post spam comments are usually of very low quality, and thus such links won't give you any link juice. On the other hand, quality resources are usually human-moderated, so your spam comments won't get through anyway. Someone can report your spam activity to SpamCop or other anti-spam services. Spamming requires an abuse-proof host, an abuse-proof domain registrar, an abuse-proof payment processor and an abuse-proof conscience. Do you happen to have all of those?
Paid links
Well... It was a hard decision whether to put paid links into the "Harmful stuff" part, the "Useless" part or the "Helpful" part, because paid links are all of these: helpful, useless, and potentially harmful to your rankings, depending on how you use them. Personally, I have never bought any links and I would not recommend doing so. A paid link breaks the whole concept of the WWW, turning "I link to it because it is interesting or relevant" into "I link to it because I was paid for it". That's not linking, that's advertising. And that is why many search engines treat paid links very cautiously these days. The value of paid links is very low now, and if Google somehow finds that a website prefers paid links over natural ones, it may get a penalty or get sandboxed.
Still, despite the above, paid links can be useful for promoting your website. However, you should keep in mind that since paid links are advertising, they must be nofollow according to Google's paid links guidelines. This way a paid link simply promotes a website and does not transfer any link juice to it. That's fine, but since the link is now nofollow, you should pay closer attention to where you buy the link: it must bring you relevant visitors. Choose appropriate websites close to your theme, check their trust rating, and don't hesitate to make a phone call or send an e-mail inquiry if you have any doubts about them. One high-quality link is better than ten garbage links, whether the links are natural or paid.
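If you do sell or buy links, a small script can double-check that every paid link on a page actually carries nofollow. This is only a sketch, assuming Python with BeautifulSoup and that you mark paid links with a CSS class of your own choosing; the class name "paid-link" here is purely illustrative.

# Sketch: make sure every link you sold carries rel="nofollow".
# Assumes paid links are marked with an illustrative "paid-link" class.
from bs4 import BeautifulSoup

html = '<p>Check out <a class="paid-link" href="http://example.com">this sponsor</a>.</p>'
soup = BeautifulSoup(html, "html.parser")

for a in soup.find_all("a", class_="paid-link"):
    a["rel"] = "nofollow"

print(soup)  # the sponsor anchor now carries rel="nofollow"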
Inappropriate neighbours
This works in two ways: a) some unsavoury site links to you; b) you link to some unsavoury site. Both are bad. The first case is bad because it hurts the trust rating of your site: if a bad guy links to you, you look like a bad guy too. The second is no better - it directly hurts your ranking positions.
What do we mean by "inappropriate neighbours"? These are malware, porn, hacking, phishing, casino and other websites of a questionable kind. Do not link to such websites, and try not to have any inbound links from them either, though of course you can't control that directly.
Unrelated websites
Not as bad as the above, but it can still bring your rankings down a bit. Try to obtain links from relevant sources - websites on your own theme, or at least related to it. Why? Because search engines not only consider the anchor text of a link, they also read the surrounding text preceding and following it and collate it with the text of your site. If the subjects of the two sites differ significantly, the link is filtered out and its value is disregarded by the search engine. Simply put, it is better to have one link from a relevant site than five links from sites as far from yours as the Sun is from the Earth.
Back to Table of Contents
3.2.7. Off-page ranking factors summary
Let's summarize the above part of this SEO guide. Here is a quick synopsis of what you have already read:
- A good inbound link is a link from a relevant, high-PR site closely related to yours.
- A good inbound link has target keywords in its anchor text and around the link itself.
- A good inbound link points to a landing page made specifically for each target keyword.
- Reciprocal links and directories still work, but don't expect miracles.
- Don't overlook social bookmarking sites.
- Article submission is useless.
- Do not spam.
- Paid links may both help and hurt. To avoid a penalty, paid links must be nofollow.
- Keep your link neighbourhood clean and relevant.
Back to Table of Contents
4. SEO strategies
In this section of the SEO tutorial we'll describe the strategies essential for raising the position of your site in Google and other search engines. As before, we will talk about on-page and off-page strategies separately; however, you should understand that in order to achieve the highest efficiency all of the strategies must work together. There must be no favouring content over link building, or vice versa. Both are crucial and both require your constant attention.

Back to Table of Contents
4.1. On-Page SEO strategies
The content is king. Your main on-site SEO strategy is to build content that is interesting to your visitors, provides information and value, and correctly describes the services you offer, without being a dull piece of text that leaves the memory two seconds after you have finished reading it. At the same time, the content must serve its SEO purpose: to be a landing page for search engine queries. Sure, writing a good copy is a creative act, not hack-work, but nevertheless there are a few steps you can follow along this path:
Discover your keywords
This is the first thing you should think of when starting to optimize your site content. It doesn't matter what industry you're in - every industry, every sphere of human work can be described in hundreds if not thousands of different words! And your potential customers enter many of those words in the search box every day. You can't afford to sit and watch this river of money flow right past you.
Think of what you're doing. Think of what your customers want from you. Try to capture this in simple keywords, then add synonyms. That will be the basis of your keyword list. Then use the Google Keyword Tool or the Search-based Keyword Tool to find more synonyms and related searches that other people type into the search box. Conveniently, you can see the search volume per month and the approximate competition for each keyword, which also helps you filter out the best terms to target.
Write your keywords down and sort them by demand (the number of searches according to the Google Keyword Tool) and then by relevance. Then run a quick Google search for those keywords to reveal your competitors, and analyze their websites to extract a few more keywords for your niche. Select the most relevant and most demanded keywords from the final list and proceed to the next step.
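As a small illustration of this sorting step, here is a sketch in Python. The keyword list, volumes and relevance scores are invented; in practice you would export them from the Google Keyword Tool.

# A minimal sketch of the sorting step described above. The numbers are
# made up; export real ones from the Google Keyword Tool or a similar service.
keywords = [
    # (keyword, monthly searches, relevance 0..1 as you judge it)
    ("orange gadgets", 2400, 0.9),
    ("buy gadgets online", 8100, 0.7),
    ("gadget repair manual", 590, 0.4),
    ("cheap widgets", 12100, 0.3),
]

# Sort by demand first, then drop weakly relevant terms, as the text suggests.
by_demand = sorted(keywords, key=lambda k: k[1], reverse=True)
shortlist = [k for k in by_demand if k[2] >= 0.5]

for kw, volume, rel in shortlist:
    print(f"{kw:25s} {volume:6d} searches/month, relevance {rel}")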
Prepare the content
Now that you know which keywords people are searching for, you need to give them what they want - that is, you should prepare a landing page for each search term. Obviously, it is not enough to simply write a copy and stuff it with the keyword. Quite the contrary: you should build your copy around the chosen keyword and the theme it describes. Write naturally, and don't neglect the headers or an attention-grabbing title with sticky words that draw the reader in.
Remember the important places where your keywords should appear? We covered this matter thoroughly earlier in this FAQ. Write a plain, natural-language title with the target keyword placed somewhere near the beginning, write a couple of headers, name the file accordingly, and so on. How many keywords should each page cover? The best strategy here is to make a landing page for 2-3 keyword phrases, no more. You don't want to disperse the focus of your efforts. Each page should concentrate on a few short-tail keywords and a dozen or two long-tail ones.
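A minimal sketch of that checklist, assuming nothing about your CMS: it slugifies the target keyword into a file name and places the keyword near the beginning of the title and the main header. All names and the template are illustrative, not a prescribed API.

# Sketch: derive the file name, title and main header from a target keyword.
import re

def slugify(keyword):
    return re.sub(r"[^a-z0-9]+", "-", keyword.lower()).strip("-")

keyword = "orange gadgets"
filename = slugify(keyword) + ".html"          # orange-gadgets.html
title = f"{keyword.capitalize()} - hand-picked models and prices"
h1 = f"{keyword.capitalize()}: how to choose one"

print(filename)
print(f"<title>{title}</title>")
print(f"<h1>{h1}</h1>")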
Repeat the step for all of your keywords. It may take a while to write proper content for each of your target phrases, and more importantly, it may be hard to write unique, non-duplicate content for each of the pages. Do not simply copy-paste one text into another, changing "gadgets" to "widgets" and "foo" to "bar". That won't work. Instead, write in normal language, write for humans (I never tire of repeating this one!). You don't have to prepare all of the pages at once - this is a marathon, not a sprint.
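If you want a quick self-check that two of your pages are not near-duplicates of each other, a rough similarity score is enough to raise a flag. This is not how search engines detect duplicate content; it is just a sketch using Python's standard difflib, with made-up page texts.

# Rough self-check for near-duplicate copy. Not a search engine's algorithm,
# just a quick warning signal while you write.
from difflib import SequenceMatcher

page_a = "Our orange gadgets are the best gadgets for home and office use."
page_b = "Our orange widgets are the best widgets for home and office use."

ratio = SequenceMatcher(None, page_a, page_b).ratio()
print(f"similarity: {ratio:.2f}")   # a value close to 1.0 means near-duplicates

if ratio > 0.8:
    print("These two pages are too similar - rewrite one of them properly.")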
Interlink your pages
A very important step! As you already know, relevant links to a page with the proper anchor text are one of the most important ranking factors. Most likely, at this point you don't have many backlinks from other websites, so building internal links is crucial for you: it gives the very first boost to your search engine position.
So, review the pages you have and link them to each other. I do not mean the usual navigation stuff here. I mean find a keyword on a page and enclose it in an anchor tag pointing to the page that is most relevant to that keyword. Here is an example to illustrate the concept:

In that example, the gadgets page links to the widgets page with some general term, and to a more detailed page (or a product page) with the "orange gadgets" anchor text. Interlinking your pages is extremely important, but please do not overload your pages with links! Too many links hurt your positions and make the page difficult to read. Use a link only when it's necessary, that is, when it helps a visitor better understand the information, choose a proper product, or otherwise adds value to the page. Also, keep in mind that only the first link to a given URL counts, so make sure your navigation menu links are placed near the very bottom of your HTML source, while the links with proper anchor text sit somewhere near the top.
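Here is a sketch of such interlinking done programmatically, assuming Python with BeautifulSoup; the keyword, URL and page snippet are made up for illustration. It wraps the first plain-text occurrence of a keyword in a link with a descriptive anchor text.

# Sketch: turn the first plain-text occurrence of a keyword into an internal
# link with a descriptive anchor text. Keyword and URL are illustrative.
from bs4 import BeautifulSoup

html = "<p>We also stock orange gadgets in three sizes and many colours.</p>"
soup = BeautifulSoup(html, "html.parser")

keyword, target_url = "orange gadgets", "/orange-gadgets.html"

for text_node in soup.find_all(string=True):
    if keyword in text_node:
        before, _, after = text_node.partition(keyword)
        link = soup.new_tag("a", href=target_url)
        link.string = keyword
        text_node.insert_before(before)
        text_node.insert_before(link)
        text_node.insert_before(after)
        text_node.extract()
        break   # only the first occurrence - one good link is enough

print(soup)
# <p>We also stock <a href="/orange-gadgets.html">orange gadgets</a> in three sizes and many colours.</p>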
Create navigation and site map
Aside from the interlinking, your site must provide proper navigation both for humans and for search engines. Ensure you have linked all the pages, check the whole site for broken links, create a site map that puts all pages together in one place, and make it accessible with one click from any other page of your site. It is not strictly necessary to create an XML sitemap and submit it to Google, though - it won't give you any preference anyway.
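As a rough sketch of the "link all pages, check for broken ones, put them in one place" step, the snippet below builds a plain HTML site map from a hand-written page list and flags any URL that does not respond with HTTP 200. The URLs are invented; the check relies on the widely used requests library.

# Sketch: build a plain HTML site map and flag broken links.
# The page list is illustrative; in practice you would collect it from your CMS.
import requests

pages = {
    "Home": "http://www.example.com/",
    "Orange gadgets": "http://www.example.com/orange-gadgets.html",
    "Widgets": "http://www.example.com/widgets.html",
}

items = []
for title, url in pages.items():
    try:
        ok = requests.head(url, allow_redirects=True, timeout=5).status_code == 200
    except requests.RequestException:
        ok = False
    if not ok:
        print(f"Broken link, fix before publishing: {url}")
    items.append(f'  <li><a href="{url}">{title}</a></li>')

sitemap_html = "<h1>Site map</h1>\n<ul>\n" + "\n".join(items) + "\n</ul>"
print(sitemap_html)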
Keep working
While your newly created pages start to gain their positions in the SERPs, you should keep working - keep writing content, optimize or change it if needed, monitor your competitors and so on. Things change pretty fast in SEO, so don't let yourself rest on your laurels.


Back to Table of Contents
4.2. Off-Page SEO strategies
Perhaps you'll be a little surprised, but off-page optimization starts with the content (which is the king, as you remember). The very first step you should accomplish

