Website Promotion Laboratory

How do you tell real facts from marketing pitches?

Every week several new tools for web promotion and SEO appear on the market. Most of them are of little use to the average web promoter, but a few can bring significant value. Which ones? It is hard to tell.

Every new product comes out with an impressive amount of cleverly designed marketing mumbo-jumbo, mixed with technicalities and dubious favourable testimonials. Knowledgeable testing is rarely performed, and objective reviews are hard to find.


Good WordPress rankings

WordPress is an interesting platform for ranking articles in Google, because all posts share the same structure in several respects. There are also several WordPress plugins that add positive SEO factors.

So far we have achieved good rankings with some of our older blogs: DomainGrower and PromotordeSitios (the latter in Spanish). However, some posts rank very well and others not at all. Why is that? It would be nice to know. And it would be excellent to have a formula for consistently good rankings in WordPress.

Sell software online

Steps for selling software online

Selling software is an activity that, quite obviously, needs to be done online. There are significant advantages to being able to complete all of these steps in a few minutes: find a useful software product, learn about it, download it, install it, test it, and reach for the credit card to buy it.
Paradoxically, many software programmers are unaware of the huge resources for selling software online, and do not even have a website. Or they have a nice one, full of Flash animations and with no elements for web ranking.


SEO Audit: what to check and how

We have an SEO Audit service, and we are always on the lookout for new issues to detect. So far we have about 50, and most webmasters should be aware of them. On average, a website incurs 30-35 faults that negatively influence its rankings.

The issues are divided into on-site and off-site. The on-site issues are the easiest to fix, while the off-site factors, mostly link related, must be addressed with method and patience.
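As an illustration of how simple the on-site side can be to automate, here is a minimal Python sketch (a hypothetical example, not our actual audit script) that flags a few of the most common on-site faults on a single page. It assumes the requests and beautifulsoup4 packages are installed.

import requests
from bs4 import BeautifulSoup

def basic_onsite_checks(url):
    """Return a list of common on-site faults found on a single page."""
    faults = []
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Title tag: present and of a reasonable length.
    title = (soup.title.string or "").strip() if soup.title else ""
    if not title:
        faults.append("missing <title> tag")
    elif len(title) > 70:
        faults.append("title longer than about 70 characters")

    # Meta description: present and not empty.
    meta = soup.find("meta", attrs={"name": "description"})
    if not meta or not meta.get("content", "").strip():
        faults.append("missing meta description")

    # Exactly one H1 heading is the usual recommendation.
    h1_count = len(soup.find_all("h1"))
    if h1_count != 1:
        faults.append("expected 1 <h1>, found %d" % h1_count)

    # Images without alt text waste a keyword-usage opportunity.
    missing_alt = [img for img in soup.find_all("img") if not img.get("alt")]
    if missing_alt:
        faults.append("%d images without alt text" % len(missing_alt))

    return faults

for fault in basic_onsite_checks("https://example.com/"):
    print("-", fault)

A real audit repeats dozens of checks like these across the whole site, and then adds the off-site, link-related analysis on top.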

We offer an installment plan that also includes a monthly update on the outstanding issues. The idea is that sites should be permanently monitored for negative ranking factors.

We have collected a number of software resources to make the checking faster. Some are links, some are software packages, and some are scripts that run in our own pages. This arsenal allows us to run a deep analysis in a day or two. This is not intended to discourage imitators: most of the tools are publicly known. But of course the work requires a lot of dedicated programmers' time, which we offer to our clients.

Check the page cited above and, if you are really interested, ask for a sample of our work or for the complete model report that we send to our clients.

Our interest is mainly to earn money by providing this service, but also to correlate the SEO Audit issues with rankings: as a result, we improve our knowledge of what is important for ranking.

Google’s algorithm (or close to it)

First of all, the algorithm is the mathematical formula that Google uses to decide which website goes first. Knowing this formula would be of great value, as it would make our web positioning job easier, but it is a very well-kept technical secret.

I have been collecting clues about the algorithm for a while, and running some quiet experiments. The latest unofficial disclosure of the algorithm comes from Rand Fishkin, on his seomoz.org site, which is obviously very well positioned for the SEO keyword. The article is titled “A little piece of the Google algorithm revealed”.

And the formula is:

GoogScore = (KW Usage Score * 0.3) + (Domain Strength * 0.25) + (Inbound Link Score * 0.25) + (User Data * 0.1) + (Content Quality Score * 0.1) + (Manual Boosts) – (Automated & Manual Penalties)

The different factors are calculated as follows:

KW Usage Score
• KW in Title
• KW in headers H1, H2, H3…
• KW in document text
• KW in internal links pointing to the page
• KW in domain and/or URL

Domain Strength
• Registration history
• Domain age
• Strength of links pointing to the domain
• Topical neighbourhood of domain based on inlinks and outlinks
• Historical use and links pattern to domain

Inbound Link Score
• Age of links
• Quality of domains sending links
• Quality of pages sending links
• Anchor text of links
• Link quantity/weight metric (Pagerank or a variation)
• Subject matter of linking pages/sites

User Data
• Historical CTR to page in SERPs
• Time users spend on page
• Search requests for URL/domain
• Historical visits/use of URL/domain by users Google can monitor (toolbar, wifi, analytics, etc.)

Content Quality Score
• Potentially given by hand for popular queries/pages
• Provided by Google raters
• Machine-algos for rating text quality/readability/etc

Automated & Manual Penalties are a mystery, but it seems they lower the ranking by 30 positions or more.
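To make the weighting concrete, here is a small Python sketch that simply plugs sub-scores into Rand's formula. The 0-10 sub-scores in the example are invented for illustration; Google does not expose any of these values.

def goog_score(kw_usage, domain_strength, inbound_links, user_data,
               content_quality, manual_boosts=0.0, penalties=0.0):
    """Rand's published weighting of the main ranking factors."""
    return (kw_usage * 0.3
            + domain_strength * 0.25
            + inbound_links * 0.25
            + user_data * 0.1
            + content_quality * 0.1
            + manual_boosts
            - penalties)

# A page strong on keywords and links, weak on user data:
score = goog_score(kw_usage=8, domain_strength=6, inbound_links=7,
                   user_data=3, content_quality=5)
print(round(score, 2))  # 6.45

The obvious reading of the weights: keyword usage, domain strength and inbound links together account for 80% of the score, which is where most of the practical work goes.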

The factors mentioned are generally known in the experts' forums, but the relative weight that Rand assigns to them is useful. Rand's conclusion is that there is little we can do to apply this algorithm other than improve content quality.

Some factors are too basic for Rand to mention: selecting a good domain, writing with a reasonable keyword density, programming links intelligently, clean code, sensible writing, etc.
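For the keyword-density point, a quick way to measure it is shown in the sketch below. The function and the sample sentence are illustrative; there is no official target figure from Google.

import re

def keyword_density(text, keyword):
    """Percentage of words in the text that match the given keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

sample = "WordPress rankings improve when WordPress posts target one topic."
print(round(keyword_density(sample, "wordpress"), 1))  # 22.2

A density that high in a real article would look like keyword stuffing; the sketch only shows how the measurement itself works.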

Surprisingly, there are very few companies publishing results on the Google algorithm. However, the competing search engines do their research very well, because they have been able to copy almost the same ranking features as Google. Most of the time, when I get a good ranking in Google, Yahoo follows. A clear difference between the two algorithms lies in the penalties, with Yahoo being more lenient.

Most algo crackers show only a small sample of their knowledge, to prevent their competitors from taking advantage of their findings, and to avoid identification and possible penalties. However, some of us are a bit more open, trying to use distributed thinking to achieve our algo-cracking goals.