
Thursday, July 21, 2011

SEO positioning: techniques that work and tactics that get penalized

Create a site map
A site map is advisable when the website is very large, or when search engines may have trouble following the links in your menus (for example, menus built with JavaScript or Flash, which crawlers cannot read). The site map should contain text links to every page of the site, so that search engines can reach the whole website from it. However, do not put more than 100 links on a single page, as that can be penalized.
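To make the cap concrete, here is a small Python sketch that splits a list of URLs into site-map pages of at most 80 text links each; the sample URLs, the file names and the 80-link margin are all just this sketch's assumptions.

```python
# Sketch: split a site's URLs across several site-map pages, keeping
# each page well under the 100-link limit (80 here, for margin).
# The URLs and file names are illustrative placeholders.

def sitemap_pages(urls, per_page=80):
    for start in range(0, len(urls), per_page):
        chunk = urls[start:start + per_page]
        items = "\n".join(f'<li><a href="{u}">{u}</a></li>' for u in chunk)
        yield f"<ul>\n{items}\n</ul>"

urls = [f"https://example.com/page-{n}" for n in range(250)]
for i, page in enumerate(sitemap_pages(urls), start=1):
    with open(f"sitemap-{i}.html", "w") as f:
        f.write(page)  # 250 URLs -> four site-map pages
```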
Avoid Flash, PDF and JavaScript as far as possible
Avoid unnecessary Flash files: search engines find them very difficult to index and in most cases do not index them at all. The same goes for PDF files which, although less problematic than Flash, can also create indexing difficulties. JavaScript, for its part, presents a different problem: most search engines cannot follow the links it generates, so if your menu is built in JavaScript, probably only your home page will get indexed.
Adhere to web standards
Validating our pages against the standards helps our search engine ranking indirectly: while validating we catch bugs that can make it hard for robots to index the pages, or that keep them from displaying correctly for the user. The easier we make it for the robots to index the site, the better. To achieve this, study the manuals of the W3C (the organization that creates web standards) and use its validator.
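As a rough sketch of automating that check, the snippet below sends a page's HTML to the W3C "Nu" checker's JSON interface and prints the reported problems. The endpoint's exact behavior and the User-Agent string are assumptions here, so check the validator's documentation and usage policy before relying on it.

```python
# Sketch: submit HTML to the W3C Nu checker and print its messages.
# Assumes the checker's JSON interface at validator.w3.org/nu/.
import json
import urllib.request

CHECKER = "https://validator.w3.org/nu/?out=json"

def validate_html(html: str):
    req = urllib.request.Request(
        CHECKER,
        data=html.encode("utf-8"),
        headers={
            "Content-Type": "text/html; charset=utf-8",
            "User-Agent": "validator-sketch/0.1",  # arbitrary made-up UA
        },
    )
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)
    for msg in result.get("messages", []):  # "error" / "info" entries
        print(f"{msg.get('type')}: {msg.get('message')}")

validate_html("<!DOCTYPE html><html><head><title>t</title></head><body></body></html>")
```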
Maximum page size and number of links
The magic number in both cases is 100: no more than 100 links per page (better not even to get close) and no more than 100 KB of weight. If you put more than 100 links on a page you run the risk of being penalized by search engines, so your position will drop dramatically or your page may even be erased from their database. Try not to exceed 80 links to stay on the safe side. As for page weight, if it exceeds 100 KB many search engines will not index all of it; besides, the heavier the page, the longer it takes to load, which is bad for the user.
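A quick way to keep an eye on both limits is a small audit script. This Python sketch fetches a page and reports its weight in KB and its link count; the URL is a placeholder, and pages that rely on compression or on scripts to inject links will not be measured accurately.

```python
# Sketch: report a page's size and link count against the informal
# "100 links / 100 KB" limits described above.
from html.parser import HTMLParser
import urllib.request

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.links += 1

def audit(url: str):
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
    counter = LinkCounter()
    counter.feed(body.decode("utf-8", errors="replace"))
    size_kb = len(body) / 1024
    print(f"{url}: {size_kb:.1f} KB, {counter.links} links")
    if size_kb > 100:
        print("  warning: over 100 KB")
    if counter.links > 80:
        print("  warning: over 80 links, approaching the 100-link limit")

audit("https://example.com/")  # placeholder URL
```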
Monitor the site: positioning is continuous work
Monitoring the site and its positions in the major search engines is a useful strategy for detecting potential ranking losses early. There are programs and websites that make tracking positions easier, for example Free Monitor for Google. When tracking positions in Google, however, bear in mind that sharp drops, or even the temporary disappearance of our pages, are frequent, especially when the site is new. This need not be bad; on the contrary, it is common and normal. Only if the situation has not returned to normal after a couple of days should you start to worry.
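As a sketch of what a home-made monitor might record, the snippet below logs a keyword's position over time to a CSV file. Where the list of search results comes from is left open (scraping Google directly is fragile and may violate its terms), and every name in the snippet is made up for illustration.

```python
# Sketch of a position-history logger. The result list would come
# from whatever tool or service you use (e.g. Free Monitor for Google).
import csv
from datetime import date

def rank_of(domain, result_urls):
    """1-based position of the first result belonging to `domain`, 0 if absent."""
    for pos, url in enumerate(result_urls, start=1):
        if domain in url:
            return pos
    return 0

def log_rank(keyword, domain, result_urls, history_file="ranks.csv"):
    rank = rank_of(domain, result_urls)
    with open(history_file, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), keyword, domain, rank])
    return rank

# Example with a made-up result list; prints 2:
print(log_rank("blue widgets", "example.com",
               ["https://other.example/a", "https://example.com/widgets"]))
```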
Moreover, in website positioning new things appear constantly and old ones become obsolete, so what you found useful at one time may now do nothing for you, or may even harm you. Reason enough to keep up to date on this subject and renew your knowledge continually.
Tactics penalized by search engines
Penalizing websites that use unethical practices to place themselves among the first search results is a strategy search engines use to offer more interesting results to their users. When a search engine detects that a page is doing something that is not allowed, it penalizes it, which can mean losing positions in the results or even being removed from them and de-indexed. Among the reasons for being penalized are:
1. Hidden text.
2. Duplicate websites.
3. Artificial links.
4. Cloaking and doorways.
Hidden text
It basically consists of placing text designed specifically for search engines where it is not visible to users, in order to obtain a better ranking. Webmasters who engage in this practice fill the site with keywords to raise their density. So that the user does not see them, a common trick is to give the text the same color as the background.
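The crudest version of the trick is easy to spot mechanically. This Python sketch flags inline styles whose text color equals their background color; real pages also style text through external CSS and in other property orders, so the regex only illustrates the idea.

```python
# Sketch: flag inline styles where text color == background color,
# the classic hidden-text trick. The regex is illustrative only.
import re

INLINE = re.compile(
    r'style="[^"]*color:\s*(?P<fg>#[0-9a-fA-F]{3,6})[^"]*?'
    r'background(?:-color)?:\s*(?P<bg>#[0-9a-fA-F]{3,6})'
)

def find_hidden_text(html):
    return [m.group(0) for m in INLINE.finditer(html)
            if m.group("fg").lower() == m.group("bg").lower()]

sample = '<p style="color:#fff; background:#fff">stuffed keywords</p>'
print(find_hidden_text(sample))  # flags the white-on-white paragraph
```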
Duplicate Sites
This consists of creating identical sites (with the same content, etc.) that link to the web page you really want to position. The technique is deeply frowned upon by search engines, so it must be avoided. Also be careful, when putting a site online, not to leave it reachable both with and without the www prefix, as it could be penalized for duplicate content.
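A quick way to check that variant problem is to fetch both versions of the domain and see whether they converge on one canonical URL; in this Python sketch the domain is a placeholder.

```python
# Sketch: check that the bare and www. variants of a site converge on
# one canonical URL (otherwise engines may see duplicate content).
import urllib.request

def final_url(url):
    # urlopen follows redirects; geturl() reports where we ended up
    with urllib.request.urlopen(url) as resp:
        return resp.geturl()

def check_canonical(domain):
    plain = final_url(f"http://{domain}/")
    www = final_url(f"http://www.{domain}/")
    if plain == www:
        print(f"OK: both variants end up at {plain}")
    else:
        print(f"warning: {plain} and {www} differ; pick one and redirect the other")

check_canonical("example.com")  # placeholder domain
```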
Artificial links
It consists of artificially inflating the number of inbound links to a website. It can be done in several ways, all of which end up hurting our website sooner or later:
Spamming blogs and guestbooks, posting comments with the sole intention of leaving a link to the web.
FFA (Free For All) pages and link farms: pages dedicated to linking to other pages, whether paid or free. Over time they are severely penalized by search engines, and so are the pages they link to, since they are nothing but pages with thousands of links that contribute nothing.
Cross-linking: building link chains with other known sites so that all of them link to each other in a circle. If search engines detect that the only reason for those links is to obtain a better position, they can penalize them.
Cloaking and doorways
Cloaking consists of serving different pages depending on who is visiting. The webmaster builds fully optimized pages for the robots and separate pages for users, in order to obtain high rankings in the search engines. When a robot crawls the site it is identified as such and is served the page built specifically for it, while normal visitors are served the regular one.
Doorways are a special case of cloaking in which optimized pages are created with the sole purpose of ranking well. Normally, when a user tries to access them he is redirected to the "real" page by a JavaScript link (which search engines cannot read), while the robot, unable to follow that JavaScript link, indexes the page that was optimized for it.
However attractive these techniques may seem, do not use them: search engines keep adding new methods of detecting them, so sooner or later your pages will be penalized.
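One simple detection idea, usable both by engines and by anyone auditing a site, is to request the same URL with a normal and a crawler-like User-Agent and compare the responses, since naive cloaking keys on that header. The UA strings and URL below are placeholders, and dynamic pages can differ between any two fetches, so a mismatch is only a hint.

```python
# Sketch: fetch a URL as a "browser" and as a "robot" and compare.
# Differing responses hint at User-Agent-based cloaking.
import hashlib
import urllib.request

def fetch(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def looks_cloaked(url):
    as_browser = fetch(url, "Mozilla/5.0 (audit sketch)")   # placeholder UA
    as_robot = fetch(url, "Googlebot/2.1 (audit sketch)")   # placeholder UA
    return hashlib.sha1(as_browser).digest() != hashlib.sha1(as_robot).digest()

print(looks_cloaked("https://example.com/"))  # placeholder URL
```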
