Tag Archives: Algo Cracking

Website Promotion Laboratory

How to tell real facts from marketing pitches?

Every week several new tools for web promotion and SEO appear on the market. Most of them are useless for the average web promoter, but a few can bring significant value. Which ones? Hard to tell.

Every new product comes out with an impressive amount of cleverly designed marketing mumbo-jumbo, mixed with technicalities and dubious favourable testimonials. Knowledgeable testing is rarely performed. Objective reviews are hard to find.

Continue reading

Good WordPress rankings

WordPress is an interesting platform for ranking articles in Google, because all posts share the same structure in several respects. There are also several WordPress plugins that add positive factors for SEO purposes.

So far we achieved good rankings with some of our older blogs: DomainGrower and PromotordeSitios (this one in Spanish). However, some postings are very well ranked and others not at all. Why is that? It would be nice to know. And it would be excellent to have a formula for consistent good rankings in WordPress. Continue reading

Sell software online

Steps for Selling Software online

Selling software is an activity that, quite obviously, needs to be done online. There are significant advantages to being able to do all these steps in a few minutes: find a useful software product, learn about it, download it, install it, test it, and reach for the credit card to buy it.
Paradoxically, many software programmers are unaware of the huge resources for selling software online, and do not even have a website. Or they have a nice one, full of Flash animations, with no elements for web ranking.

Continue reading

SEO Audit: what to check and how

We have a SEO Audit service, and we are always on the lookout for new issues to detect. So far we check about 50, and most webmasters should be aware of them. On average, a website incurs 30-35 faults that negatively influence its rankings.

The number of on-site factors to be addressed is quite large. The most important ones are revealed by Google itself, if you have a Webmaster account. But many others are not divulged, and you need to hang around the good webmaster forums for a while before you can identify them.
Continue reading

What a web promoter learns from a Trojan site infection

I suddenly noticed that the visits to a website of mine had multiplied fivefold in a day. After some initial satisfaction at the success of my promotion efforts, I saw something unusual: the requested pages had nothing to do with my main theme, but with foods and drinks. And when you visited one of those pages, you were redirected to a malware site.

I entered my sites by FTP and found many offending pages that some bot had placed there. And when I say many, I mean about 5000 files in dozens of directories on several sites across 3 servers… Many hours were needed to clean up.

Of course I also had to clean my PC of malware, apparently coming from a mailing program I had downloaded two weeks earlier.

Continue reading

Google’s algorithm (or close to it)

First of all, the algorithm is the mathematical formula that Google uses to decide which website ranks first. Knowing this formula would be of great value, as it would make our web positioning job easier, but it is a very well kept technical secret.

I have been collecting clues on the algorithm for a while, and running some quiet experiments. The latest non-official disclosure of the algorithm comes from Rand Fishkin, on his seomoz.org site, itself very well positioned for the SEO keyword. The article is titled “A little piece of the Google algorithm revealed”.

And the formula is:

GoogScore = (KW Usage Score * 0.3) + (Domain Strength * 0.25) + (Inbound Link Score * 0.25) + (User Data * 0.1) + (Content Quality Score * 0.1) + (Manual Boosts) – (Automated & Manual Penalties)

The different factors are calculated as follows:

KW Usage Score
• KW in Title
• KW in headers H1, H2, H3…
• KW in document text
• KW in internal links pointing to the page
• KW in domain and/or URL

Domain Strength
• Registration history
• Domain age
• Strength of links pointing to the domain
• Topical neighbourhood of domain based on inlinks and outlinks
• Historical use and links pattern to domain

Inbound Link Score
• Age of links
• Quality of domains sending links
• Quality of pages sending links
• Anchor text of links
• Link quantity/weight metric (Pagerank or a variation)
• Subject matter of linking pages/sites

User Data
• Historical CTR to page in SERPs
• Time users spend on page
• Search requests for URL/domain
• Historical visits/use of URL/domain by users GG can monitor (toolbar, wifi, analytics, etc.)

Content Quality Score
• Potentially given by hand for popular queries/pages
• Provided by Google raters
• Machine-algos for rating text quality/readability/etc

Automated & Manual Penalties are a mystery, but they seem to lower the ranking by 30 positions or more.
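To make the weighting concrete, here is a minimal sketch of Rand's formula in Python. The weights are the ones quoted above; the individual factor scores (on a hypothetical 0-10 scale) and the example page values are purely illustrative assumptions, not anything Google or Rand has published.

```python
# Weights from Rand's published formula.
WEIGHTS = {
    "kw_usage": 0.30,         # KW Usage Score
    "domain_strength": 0.25,  # Domain Strength
    "inbound_links": 0.25,    # Inbound Link Score
    "user_data": 0.10,        # User Data
    "content_quality": 0.10,  # Content Quality Score
}

def goog_score(factors, manual_boosts=0.0, penalties=0.0):
    """Weighted sum of the factor scores, plus boosts, minus penalties."""
    weighted = sum(WEIGHTS[name] * score for name, score in factors.items())
    return weighted + manual_boosts - penalties

# Hypothetical page: strong keyword usage, average everything else.
page = {
    "kw_usage": 8.0,
    "domain_strength": 5.0,
    "inbound_links": 6.0,
    "user_data": 4.0,
    "content_quality": 5.0,
}
print(goog_score(page))  # 2.4 + 1.25 + 1.5 + 0.4 + 0.5 = 6.05
```

Note how the keyword and link factors (0.30 + 0.25 + 0.25) together carry 80% of the weight under this model, which is why on-page keyword work and link building dominate SEO practice.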

The mentioned factors are generally known in the experts’ forums, but the relative weight that Rand gives them is useful. Rand’s conclusion is that there is little we can do to apply this algorithm, other than improving content quality.

Some factors are too basic for Rand to mention: selecting a good domain, writing with a reasonable keyword density, planning links intelligently, clean code, sensible writing, etc.

Surprisingly, very few companies publish results on the Google algorithm. However, competing search engines do their research very well, since they have managed to replicate almost the same ranking features as Google. Most of the time, when I get a good ranking result in Google, Yahoo follows. A clear difference between the two algos lies in the penalties, Yahoo being more lenient.

Most algo crackers show only a small sample of their knowledge, to prevent their competitors from taking advantage of their findings, and to avoid identification and possible penalties. However, some of us are a bit more open, trying to use distributed thinking to achieve our algo cracking goals.