Blog

  • My very naked special way to promote my ideas

    I just released my ebook “Naked Business Proposals”. A free version can be downloaded from my site; a paid version costs $39.


    The book presents 40 business ideas that I created and intend to take to market. Most of them are Web projects, each based on a different subject or business model. Five or six are health related, and the rest cover many other fields. I have the concept and the technical elements to carry them out, but not the financing.

    (more…)

  • Our automated site creation software for Google ranking

    We keep improving our page-generation software, obtaining better sites and consistently good positioning results each time.

    We recently completed a project for our Spanish SEO site, www.yo1ro.com, and got excellent results for most of the keywords. It is very interesting to analyze which of the nearly 80 pages obtained good results. All pages use one of 5 templates, depending on their role in the site network: sitemap, 1st-level keyword, 2nd-level keyword, index to secondary domain pages, or randomly combined keywords.

    Since we can change the structure of the templates, obtaining different keyword densities and element combinations (image, metatag, heading, tag), there are many experiments to be made. These experiments tell us which factors have an important effect, positive or negative, on the results.
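    To compare template variants across experiments, the one number you need per page is its keyword density. As a minimal sketch (not our actual generation software, just the measurement it relies on):

```python
import re

def keyword_density(text, keyword):
    """Return the fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

page = "SEO tips: good SEO starts with content, and SEO experiments confirm it."
print(keyword_density(page, "seo"))  # 3 of 12 words -> 0.25
```

    Running this over every generated page, grouped by template, shows which densities correlate with good rankings.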

    If you understand some Spanish, ask us for the detailed results of this experiment. Otherwise, wait a couple of weeks for the results of the next English experiment.

  • Investing in Domains: revenue thru Adsense – How much and when

    Investing in domains is, these days, more profitable than putting the money in bonds, stocks or any other financial instrument. Of course, this is November 2008 and the markets are quite upset. My view is that they will return to near normal in a month or two. In the meantime, domain development is a very good alternative.

    Since I started DomainGrower.com I have been trying to predict how a domain will perform, depending on its name, the intended market, and the investment the owner is ready to make. Every domainer wants to make sure the investment will be highly profitable, and they ask about Google ranking, traffic, Adsense revenue, maintenance cost, increased domain-site value and, finally, profit.

    I am not able at this point to provide exact projections or a formula. However, I now have a pretty good idea of how a domain can grow and how much it will return.

    Very good domains convert visits into sales at 1% or more. This is quite unusual, but sometimes we see it. When your domain does not sell any particular item, just Adsense clicks, a conversion rate of 0.5% can still be very good. It is also important to assess the quality and quantity of the traffic. If your visitors are real estate investors ready to buy expensive property, the clicks will be very valuable, maybe USD 40 or 50.

    On the other hand, if they are poor teenagers looking for free music, their clicks will be much less valuable, maybe $0.03.
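    These three numbers (daily visits, click rate, value per click) are all you need for a rough monthly estimate. A sketch, using the 0.5% click rate from above and an assumed mid-range click value of $1.50:

```python
def monthly_adsense_revenue(daily_visits, click_rate, value_per_click):
    """Rough monthly estimate: visits x click rate x average click value x 30 days."""
    return daily_visits * click_rate * value_per_click * 30

# A niche site with 400 visits/day, 0.5% click rate, $1.50 per click (assumed)
print(monthly_adsense_revenue(400, 0.005, 1.50))  # 90.0 dollars per month
```

    Swap in your own click value: the same traffic is worth $2,400/month at $40 clicks and $1.80/month at $0.03 clicks, which is the whole argument for picking the niche carefully.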

    My advice would be to focus on a niche audience, providing some free service that has real value for them. Adsense revenue will help pay the bills.

    After that, you can start thinking about selling some valuable items. As your audience grows you can find new in-demand fields where you can offer quality content. At that point, maybe 6 months after the start, you will go from red to black numbers. Within a year, the increased value of your site will also be reflected in the value of the domain name.

    Ask us about the value of your existing domain or developed site, and listen to our domain growing suggestions.

  • What a web promoter learns from a Trojan site infection

    I noticed yesterday that visits to one of my websites had increased fivefold in a day. After a moment of satisfaction at the success of my promotion efforts, I noticed something unusual: the requested subjects had nothing to do with my main theme, but with foods and drinks. And when you visited one of those pages, you were redirected to a malware site.

    I logged into my sites by FTP and found many offending pages that some bot had placed there. And when I say many, I mean about 5000 files in dozens of directories across several sites on 3 servers… Many hours were needed to clean up.

    Of course I also had to clean my PC of malware, apparently coming from a mailing program that I had downloaded 2 weeks earlier.

    (more…)

  • Results of the 3rd Positioning Experiment

    Once the “Familia Raimundez” positioning experiment was completed, results were available in just 10 days, which makes them very interesting and convenient compared to the times when one had to wait 45 days for an indexation cycle.
    The results matched the previous ones: some domains seem to be penalized in an arbitrary way. And I say arbitrary because there is no obvious sign in the website for it to be penalized for: no black hat techniques, no links to questionable sites, no keyword abuse, no obvious content duplication.
    (more…)

  • Support for purchasers of our GGG software

    The GGG (Great Gateway Generator) project was born in 2002 as a gateway page generator. Over the years there have been 2 spinoff products: Keyword Thief (a keyword-retrieving spider) and Synonymizer (a content generator). There are also some beta-stage PHP scripts that complement the main product: Octopus Link Quadrangulator, Search Engine Optimized FAQ System, and Delinker.

    Over the years we collected several testimonials from customers who succeeded in improving their SE rankings using GGG, and we used it to provide services to our own customers. Several buyers requested special features and commissioned custom programming from us. Most of those improvements were incorporated into the products.

    We suddenly realized that the gateway pages WERE NOT the reason for the good ranking results obtained with GGG… Something else was, and the links had something to do with it. Thus, the product should perhaps be renamed from Gateway Generator to “Optimized Site Generator”, “Website Ranking Enhancer” or something like that. The new name would also reflect the fact that the product does not rely on black hat techniques, like gateway pages.

    The many changes in the Google algorithm have not affected the ranking power of GGG. However, a strategy change is needed. Very large projects are no longer acceptable, because a sudden increase in pages and links is seen as unnatural by the search engines. Slow but steady site growth is now required. In GGG, that means periodically adding more domains or directories as secondary hosts holding random pages.

    The other change in Google is the subtle penalization of domains that do not follow certain hidden guidelines, like “no content duplication” and “no bad-neighbourhood linking”. For that, we are developing special tests that set off the alarm before the offending pages are indexed.

    Having said this, we admit that we have never completed extensive testing of the optimal settings for GGG projects: the minimum and maximum values for keywords, domains and pages, and the limits for safe operation without being penalized.

    At this point we have 5 goals:

    – create an effective communication channel for users of GGG. At this point, I publish the results of the Web Ranking Experiments in this public blog. However, the most critical info will be mailed to paid subscribers of the future Page Generators Newsletter.

    – release GGG 4.0 in no more than 3 months.

    – offer a Website Ranking Enhancer service for those who do not have the time and patience to obtain the new software, create the contents and upload the pages. We will use the latest GGG, Synonymizer and Keyword Thief, plus all the knowledge on optimized page generation that we have collected and keep collecting.

    – run a permanent Testing System for our GGG pages, in order to be aware of the best site settings according to Google, and to provide the best possible service to our customers.

    – acquire more hosting services with different IPs, in order to host our pages without loss of ranking value.

    For all those web promoters out there who want to enhance their web rankings, we will announce the different service packages on this site.

  • Results of the Antolinez Family Experiment

    In only 10 days I got results for the experiment that tried to detect Google penalization. It turned out that 2 domains are penalized with a -20 fall.

    One result refers to the extent of indexation. It seems that penalized sites receive only superficial indexation. For instance, if a site is well indexed, all of its word strings will be indexed, and searching for phrases within quotes will find them. If the site is badly indexed due to penalization, only individual words will be indexed, and the strings will not be found. Interesting…

    Another result is that penalization covers all subjects, even those unrelated to the main one. For instance, a domain penalized for duplicate content will also be penalized for content that has nothing to do with the abused content.

    However, I am not yet distinguishing between penalization types, which probably exist. In the next test I will include one domain that was penalized for duplicates and another penalized for linking to a bad neighbourhood. Let’s see if the penalizations are similar.

    I am now working on a new experiment (3rd of my controlled series) using more domains, mixing penalized with healthy domains.  

    On the other hand, I am analyzing the directories I use for submission with reciprocal link exchange. It seems that some of them are considered bad neighbours, maybe because they include black hat sites, or they sell links, or whatever. My analysis can only cover existing factors, because I am not free to upload test pages to them.

    A few factors warn you against bad directories: poor ranking in Google when searching for their own home page title or description, as compared with Yahoo or MSN; also, few indexed incoming links, among other parameters. We are trying to establish which of those parameters are the most reliable.

    For all the penalization detection experiments we need to focus on keywords that return 10-200 results. Fewer is not enough to detect a fall in rankings; more are difficult to track and count.
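    Given a table of candidate keywords and their result counts (however obtained), selecting the usable ones is a one-line filter. A sketch with made-up counts:

```python
def suitable_test_keywords(result_counts, low=10, high=200):
    """Keep only keywords whose search result count falls in the usable 10-200 window."""
    return {kw: n for kw, n in result_counts.items() if low <= n <= high}

counts = {"antolineck": 5, "raimundez family": 120, "real estate": 50_000_000}
print(suitable_test_keywords(counts))  # {'raimundez family': 120}
```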

    We plan to offer a standard Penalization Detection service (exact value), and a Penalization Diagnosis which will try to find an explanation for the issue. In most cases we detect bad linking, code problems or duplicate content that explain the problem and can be corrected.

  • Detection of Google Domain Penalization

    Sometimes the effort we make to position a website is fruitless, and the client and the SEO wonder why.
    A full penalization is obvious: the website disappears from the search engine under every keyword, even under its own domain name. It won’t have a Google PageRank, not even zero. And it is common for a domain banned from Google to rank perfectly well in Yahoo and the other search engines, which are more lenient penalizers.
    Nevertheless, partial penalization is hard to detect. The website has a worse ranking than it should, and there is no way to know why or for how long.
    A good website and link strategy analysis can suggest some of the exclusion reasons, and the measures to be taken. However, even after asking and begging the search engine for forgiveness, a response could take months. The best thing to do is to rapidly create a new site with similar content.
    To be sure, the best option is to run the Penalization Test. It involves selecting a group of uncommon keywords, or even made-up ones. For example, the Antolineck family. Its members are Gualter Antolineck, Serap Antolineck, Rupert Antolineck, Torib Antolineck and Egbert Antolineck. The name and last name of each brother must appear in the URL and in the title, and once in the page body under the H1 header.


    We create one page for each brother, hosted on each tested domain. We also create a sitemap on an independent, well-ranked domain, pointing to each brother. Something like: “The Antolineck family is composed of (Gualter link), (Serap link), (Rupert link), (Egbert link), (Torib link).”
    That’s it. After a couple of weeks we will see the results. We will search Google for every brother and record the rankings. And if we search for Antolineck, every indexed page will appear, ordered by the real positioning value of each domain. This value is not necessarily the same as Google’s PageRank.
    For this example I used a made-up last name, completely absent from search engines, for clearer results. However, the test can be performed with less common but existing words, say ones with 1000 results. In this way the search is more natural and detects -30 or -100 penalizations (a fall of 30 or 100 positions in any search).
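    The test pages are so uniform that generating them by hand is a waste of time. A minimal sketch of the generation step (not our actual tooling; the filenames and HTML skeleton are illustrative):

```python
import os

BROTHERS = ["Gualter", "Serap", "Rupert", "Torib", "Egbert"]
SURNAME = "Antolineck"

PAGE = """<html><head><title>{name} {surname}</title></head>
<body><h1>{name} {surname}</h1>
<p>A page about {name} {surname}.</p></body></html>
"""

def build_test_pages(out_dir):
    """Write one page per brother: name and surname in the URL, title and H1."""
    os.makedirs(out_dir, exist_ok=True)
    links = []
    for name in BROTHERS:
        filename = f"{name.lower()}-{SURNAME.lower()}.html"
        with open(os.path.join(out_dir, filename), "w") as f:
            f.write(PAGE.format(name=name, surname=SURNAME))
        links.append(f'<a href="{filename}">{name} {SURNAME}</a>')
    # The sitemap goes on an independent, well-ranked domain
    sitemap = ("<html><body><p>The %s family is composed of %s.</p></body></html>"
               % (SURNAME, ", ".join(links)))
    with open(os.path.join(out_dir, "sitemap.html"), "w") as f:
        f.write(sitemap)

build_test_pages("antolineck_test")
```

    One such directory is uploaded per tested domain; only the sitemap needs to live elsewhere.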

    Are the domains penalized? We will soon see. Could the domains be penalized because of the test? There is no breaking of the search engines’ rules here.


    I plan to repeat the test periodically with my own and my clients’ domains, because if a penalization occurs I need to be aware of it instantly and fix whatever it is that I did wrong. The test is also open to individual SEO webmasters who want to share the data.

    Ask an SEO-focused webmaster to position your site. Webmasters today are quite specialized, and the guy who designs, programs, writes and hosts does not necessarily get you a good ranking…

  • Google’s algorithm (or close to it)

    First of all, the algorithm is the mathematical formula that Google uses to decide which website goes first. Knowing this formula would be of great value, as it would make our web positioning job easier, but it is a very well kept technical secret.

    I have been collecting clues about the algorithm for a while, and running some quiet experiments. The latest unofficial disclosure of the algorithm comes from Rand Fishkin, on his seomoz.org site, obviously very well positioned under the SEO keyword. The article’s name is “A little piece of the Google algorithm revealed”.

    And the formula is:

    GoogScore = (KW Usage Score * 0.3) + (Domain Strength * 0.25) + (Inbound Link Score * 0.25) + (User Data * 0.1) + (Content Quality Score * 0.1) + (Manual Boosts) – (Automated & Manual Penalties)

    The different factors are calculated as follows:

    KW Usage Score
    • KW in Title
    • KW in headers H1, H2, H3…
    • KW in document text
    • KW in internal links pointing to the page
    • KW in domain and/or URL

    Domain Strength
    • Registration history
    • Domain age
    • Strength of links pointing to the domain
    • Topical neighbourhood of domain based on inlinks and outlinks
    • Historical use and links pattern to domain

    Inbound Link Score
    • Age of links
    • Quality of domains sending links
    • Quality of pages sending links
    • Anchor text of links
    • Link quantity/weight metric (Pagerank or a variation)
    • Subject matter of linking pages/sites

    User Data
    • Historical CTR to page in SERPs
    • Time users spend on page
    • Search requests for URL/domain
    • Historical visits/use of URL/domain by users GG can monitor (toolbar, wifi, analytics, etc.)
    Content Quality Score
    • Potentially given by hand for popular queries/pages
    • Provided by Google raters
    • Machine-algos for rating text quality/readability/etc

    Automated & Manual Penalties are a mystery, but it seems they lower the ranking by 30 entries or more.
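    The formula above is a plain weighted sum, so it is easy to play with. A sketch that encodes Rand’s published weights, with each factor normalized to a 0-1 score (the normalization is my assumption, not part of the disclosure):

```python
def goog_score(kw_usage, domain_strength, inbound_links, user_data,
               content_quality, manual_boost=0.0, penalties=0.0):
    """Weighted sum with the published coefficients; factor inputs scored 0-1."""
    return (kw_usage * 0.3 + domain_strength * 0.25 + inbound_links * 0.25
            + user_data * 0.1 + content_quality * 0.1
            + manual_boost - penalties)

# A page strong on keywords, middling elsewhere, no boosts or penalties
print(round(goog_score(0.8, 0.6, 0.7, 0.5, 0.5), 3))  # 0.665
```

    Note how the weights confirm Rand’s point: keyword usage, domain strength and links account for 80% of the score, so content quality alone cannot rescue a weak site.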

    The factors mentioned are generally known in the experts’ forums, but the relative weight that Rand gives them is useful. Rand’s conclusion is that there is little we can do to apply this algorithm other than improve content quality.

    Some factors are too basic for Rand to mention: selecting a good domain, writing with a reasonable keyword density, intelligent link programming, good code, sensible writing, etc.

    Surprisingly, very few companies publish results on the Google algorithm. However, the competing search engines do their research very well, because they have been able to copy almost the same ranking features as Google. Most of the time, when I get a good ranking result in Google, Yahoo follows. A clear difference between the two algorithms lies in the penalties, with Yahoo being more lenient.

    Most algo crackers show only a small sample of their knowledge, to prevent their competition from taking advantage of their findings, and to avoid identification and possible penalization. However, some of us are a bit more open, trying to use distributed thinking to achieve our algo-cracking goals.

  • How to value a website

    At some point most of us have wanted to sell our websites. The prospect of selling a virtual item for real dollars is very attractive, but most developers are disappointed when they face the website market.

    Before Adsense, the outlook was even worse, because not even high-traffic websites achieved revenue. Thanks to Adsense, the advertisement-serving program run by Google, things have improved for webmasters, who can now collect some money. Advertisement serving was Google’s and Yahoo’s great success.

    So, it is all about signing up for Adsense, placing a little JavaScript on the pages, and starting to make money. For example, PodcastDirectory.com gets $35,000 per month from Adsense, and seatguru.com gets $15,000. Right, the first one has a million hits per month, and the second, 700,000. Humble websites with 400 visits a day (which are not that easy to obtain) can collect $40 per month, with luck and a tail wind…

    There is also a theme-related matter. A website in an “expensive” field, such as medical malpractice, will get more valuable clicks than a website with poor advertising opportunities, such as a personal blog. If we multiply the monthly revenue by 18, we get the approximate value of the website, following a fairly conventional valuation model.
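    The revenue-multiple model is one multiplication. As a sketch, applied to the PodcastDirectory.com figure quoted above:

```python
def site_value(monthly_revenue, multiple=18):
    """Conventional revenue-multiple valuation: monthly revenue x 18."""
    return monthly_revenue * multiple

print(site_value(35_000))  # PodcastDirectory.com at $35,000/month -> 630000
```

    The multiple of 18 (a year and a half of revenue) is the conventional figure used here; buyers in a hurry or sellers of risky sites will negotiate it down.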

    Let’s suppose our site has no Adsense, either because we do not want to damage our image, because the revenue would not be enough, or for other reasons. Thus, we have a highly visited but profitless site. How do we value it?

    Websiteoutlook.com is an online free tool for website valuations. According to it:

    PodcastDirectory.com is worth $60,580
    seatguru.com is worth $70,200

    This is less than the value implied by the real revenue, but it is still pretty good for a tool that pops up a number, live and free, without asking for anything but the domain name.

    See these SEO related sites:

    seochat.com is worth $340,000
    seobook.com is worth $342,000
    webuildpages.com is worth $37,529
    seoadministrator.com is worth $23,000
    domaingrower.com (this site) is worth $2,379

    If I were to value a website, I would ask lots of things. Firstly, how the website makes money, its development and maintenance cost, etc. And after offering my valuation to the clients, I would listen to their opinion and, maybe, adjust my valuation algorithm based on it.

    Analyzing some small and medium-size sites, my own and clients’, I find that the automatic valuation comes quite close to the real asking price for developed sites. The predicted values for very large sites (Google, Yahoo, Microsoft, CNN, Wired) are not related to real market values, as expected, because many other considerations apply besides the site metrics.

    So, what is this website valuation model based on? The linked spreadsheet gives some details, such as indexed pages, Alexa ranking, backlinks, and the relations between these values.

    I have a file named: Website valuation data.xls
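    To make the idea concrete, here is a hypothetical linear model in the spirit of such tools: more indexed pages and backlinks raise the value, a worse (higher) Alexa rank lowers it. Every weight below is an illustrative guess of mine, not WebsiteOutlook’s real formula and not the figures from the spreadsheet:

```python
def metric_value(indexed_pages, alexa_rank, backlinks,
                 w_pages=0.05, w_links=2.0, rank_scale=1_000_000):
    """Toy metric-based valuation: dollars per indexed page and backlink,
    plus a traffic factor that shrinks as the Alexa rank worsens."""
    traffic_factor = rank_scale / max(alexa_rank, 1)  # higher rank -> lower value
    return indexed_pages * w_pages + backlinks * w_links + traffic_factor

# WebsiteOutlook.com's own metrics from the post (backlink count assumed)
print(round(metric_value(indexed_pages=370_000, alexa_rank=11_000, backlinks=5_000)))
```

    Fitting the weights against a handful of known asking prices, as the spreadsheet attempts, is what would turn this toy into a usable estimator.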

    Another factor, not taken into account here, is web positioning. A website positioned for SEO keywords, which allows it to offer services, should have an increased value. Namely, a positioning index should be established for each site over a group of keywords, like the one I suggest elsewhere, and multiplied by an activity-specific rate.

    The site itself, WebsiteOutlook.com, has an Alexa rank of 11,000 and 370,000 indexed pages and, according to itself, is worth $217,000. The many indexed pages arise because every search is saved as a new page; they double as cache and indexed content.

    Having figured out their algorithm, we should be able to duplicate their valuable site, improve it, and also calculate the increase in value of any client website enhanced by SEO (Google ranking) and domain development services. It is also a nice tool to establish link value, predict traffic and check the value of promotion campaigns.