Don’t Detect Visitor OS

March 1st, 2006

The beauty of the web is its cross-compatible, OS-independent nature. Or so it should be! Today I tried to view a video on the MTV site and was greeted with the following:

Detecting OS…

PC Users with Windows 95 or 98: you need to run Windows 2000 or Windows XP to use MTV Overdrive.

Um, hello! I’m running Windows XP 64-bit Edition! The occasional program doesn’t want to run on x64 because someone’s OS detection code sucks - that’s almost understandable… but to have a web site reject you when you clearly do meet the required specs is sh*t! I can only imagine what Mac users go through…
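
My guess is the detection code matches Windows version strings too literally. MTV’s actual code isn’t public, so here’s a hypothetical PHP sketch of the failure mode (show_video() is made up):

    <?php
    // Naive OS sniffing: Windows 2000 reports "Windows NT 5.0" in its
    // user agent string, and 32-bit Windows XP reports "Windows NT 5.1"...
    $ua = $_SERVER['HTTP_USER_AGENT'];
    if (strpos($ua, 'Windows NT 5.0') !== false ||
        strpos($ua, 'Windows NT 5.1') !== false) {
        show_video(); // hypothetical function that serves the video
    } else {
        // ...but XP x64 reports "Windows NT 5.2", so it lands here
        // alongside Windows 95/98 and gets turned away.
        die('You need Windows 2000 or XP to use this site.');
    }
    ?>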

The moral: Don’t lock out visitors to your web site based on their OS. If your web app doesn’t work on all PC platforms (mobile devices are a different story) then fix it or use a different technology!

Lock Down and Say Goodbye to Bots

March 1st, 2006

I recently wrote about how spam bots were using my site to send spam. To combat this I did a thorough security check of both my own web forms and my customers’. A little bit of sanitization/validation code and everything was locked down. The bad part was that the spam bots kept hitting the previously vulnerable PHP script, trying to exploit it. Obviously they failed, but there were tens of bots hitting the site.
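
The classic hole in contact forms is mail header injection, and guarding against it only takes a few lines. A simplified sketch of the idea (the field names and address are placeholders, not our actual script):

    <?php
    // Newlines in a user-supplied value let a bot inject extra headers
    // (e.g. Bcc: a few thousand addresses) into the outgoing mail.
    function clean_field($value) {
        return trim(str_replace(array("\r", "\n", "%0a", "%0d"), '', $value));
    }

    $email   = clean_field($_POST['email']);
    $subject = clean_field($_POST['subject']);
    $message = trim($_POST['message']); // the body may keep its newlines

    // Validate before sending anything at all.
    if (!preg_match('/^[^@\s]+@[^@\s]+\.[a-z]{2,}$/i', $email)) {
        die('Invalid email address.');
    }

    mail('us@example.com', $subject, $message, "From: $email");
    ?>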

The bots couldn’t achieve anything, as all the scripts were now secure. They were still annoying, however: they skew server stats and remain a constant threat.

Thankfully the bots have stopped hitting the page. I was worried they might hit it forevermore, which could have become a major bandwidth issue, as I couldn’t simply block an IP - the attacks came from a zombie network, so the IP was different each time. It seems the scripts powering these zombie networks are quite smart indeed: they kept hammering the site each day for around 7-10 days after the security hole was fixed, before the old insecure script got wiped off the vulnerable list.

So the moral of the story? If you’ve just had a script exploited and you’ve fixed the security hole, be patient and the bots will go away. :)

PigeonRank

February 24th, 2006

I was doing a search today on SEO-related stuff and didn’t realise Google had developed a technology called PigeonRank. They introduced it way back in April 2002 and I only stumbled across it today… ;)

http://www.google.com/technology/pigeonrank.html

Are Spammers Using Your Web Site?

February 21st, 2006

Last week we had a spammer probe one of our contact forms (on another web site) until they found a potential vulnerability. I’m still not sure whether the spammer succeeded in sending anything, as the log files were full of errors. In any case, we quickly locked down all our scripts and added some extra layers of security just to be sure.

The most interesting part of the attack was what happened afterwards and where the attack came from. Our exploitedform.php must have been put on some sort of attack list. Over the next few days we received repeated attempts to use exploitedform.php to send spam. These attempts failed each time and we set traps to log IP addresses and the attack strings.
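
The traps needn’t be anything fancy. Something like this stub (a sketch - the log path is a placeholder) records each attempt while sending nothing:

    <?php
    // exploitedform.php - honeypot stub: logs every attempt, sends no mail.
    $line = date('Y-m-d H:i:s') . "\t"
          . $_SERVER['REMOTE_ADDR'] . "\t"
          . str_replace(array("\r", "\n"), ' ', serialize($_POST)) . "\n";
    file_put_contents('/path/to/attack.log', $line, FILE_APPEND);
    echo 'Thank you.'; // give the bot nothing useful to parse
    ?>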

It seems the attempts to use exploitedform.php came from a whole range of IPs, most likely a bot network. This bot network would consist of thousands of PCs that had been turned into zombies.

It amazes me to see how far a spammer will go to send their sh*t. They somehow infect PCs around the world to turn them into zombies, send out commands to those zombies to scan web sites for vulnerabilities, and then exploit the resulting network of vulnerable sites to send spam. Even though they’re a$$holes, the software that powers all of that must be quite impressive…

Still, it’s time to work out a way to shut these botnets and spammers down. More thoughts to come in future postings…

Google Toolbar to Combat Scraper Sites

February 14th, 2006

A few weeks ago I downloaded Google Toolbar Version 4. Two new features struck me:

  1. The toolbar can store your bookmarks if you have a Google Account.
  2. The toolbar has a type of predictive input (or autocomplete), so that when you type part of a keyphrase it will automatically suggest similar searches that have been conducted in the past.

The bookmark feature is a huge development for the toolbar. I have always toyed with the idea of creating a search engine with a toolbar plugin (big call, I know :) ). Users could rate sites, blacklist sites, highlight potential threats, etc. This information would be fed back to the server to sort the good from the bad based on human input. The biggest hurdle would be abuse of the system.

The new Google Toolbar may in fact work on a similar user rating system by monitoring the bookmarks people add. This is fantastic, as it could finally mean the end of scraper sites and automatically generated sites (i.e. bot sites, machine-generated content, etc.).

Here’s hoping the system is in fact working this way. Perhaps in the future it can be expanded with the blacklist/threat warning ideas… 8)

Can Bandwidth Thieves Increase Your PR?

February 7th, 2006

Recently an image used on one of our sites was linked to in a popular forum. When I say linked, I mean the picture was embedded in someone’s post and was feeding off our server (i.e. hotlinking). Sure, we could put in some hotlink protection, but I thought it would be a good opportunity to perform a little experiment…
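
For the record, the protection we’re skipping is simple enough: route images through a script and check the referer (most people do this with mod_rewrite instead, but the idea is the same). A rough sketch, with placeholder domain and paths:

    <?php
    // image.php?f=photo.jpg - only serve the file to our own pages.
    $referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

    // Empty referers stay allowed - proxies and privacy tools strip them.
    if ($referer !== '' && strpos($referer, 'oursite.com') === false) {
        header('HTTP/1.0 403 Forbidden');
        exit;
    }

    $file = isset($_GET['f']) ? basename($_GET['f']) : ''; // basename() blocks ../ tricks
    if ($file === '' || !file_exists('/path/to/images/' . $file)) {
        header('HTTP/1.0 404 Not Found');
        exit;
    }

    header('Content-Type: image/jpeg');
    readfile('/path/to/images/' . $file);
    ?>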

Do search engines look at off-site images the same way they look at links? Will someone hotlinking my images in a forum actually count as a vote toward my site’s PageRank? Probably not, but it would be nice if it did - hotlinking generally only happens when the original content is worth ‘taking’ in the first place.

I’ll keep an eye on the stats over the next few months, but chances are the only thing that will increase as a result of this hotlinking is our bandwidth usage, not our visitors from search engines.

What is Considered to be a Good CTR?

January 30th, 2006

I often get asked what is considered to be a good Click Through Rate (CTR). The answer to this question will depend on a number of things:

  1. The industry you’re advertising in.
  2. How well your ad is written (or produced for non-text ads).
  3. How targeted your keywords and keyphrases are.
  4. Where your ad is being shown.

Before we answer what a good CTR is, let’s look at how we can improve our performance in each of the above 4 areas.

The Industry

The industry you’re advertising in doesn’t require much attention, as there’s nothing you can do to change this - unless you’re still at the product planning stage. But if you’re looking up PPC tricks and tips, chances are you’re past planning and ready to sell.

Your Ad

If your ad is well written, you can achieve a massive increase in CTR. So how do you write better ads for PPC? Learn more about human psychology and experiment. You may have read that using the word because in a request increases the likelihood of a favourable answer. In a similar way, you can use tricks like this to increase the probability that someone will click on your PPC ad. Learn. Test. Learn more. Test more. Achieve results.

Target Keywords and Keyphrases

Common keywords will get you a lot of traffic, but the market for these keywords may be so saturated that you won’t achieve a good CTR. Even worse, the popularity of these keywords means you’ll have to pay top dollar to reach the first page of search results. If you use specific, targeted keywords instead, you can achieve a massive increase in CTR.

To target particularly well you can group your terms in “inverted commas”, or simply keep adding keywords to your phrase. You probably won’t receive much traffic, as highly targeted keywords and phrases don’t get searched as often, but you’ll get a great CTR for a very low Cost Per Click (CPC).
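
For example, in Google AdWords the same base term can be tightened step by step (illustrative keywords, not real campaign data):

    web design                        broad match - maximum traffic, weak CTR
    "sydney web design"               phrase match - more targeted
    "affordable sydney web design"    very targeted - little traffic, great CTR, cheap clicks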

Filter out Sites

Ads shown on the Google Content Network often have a particularly bad CTR, and the conversion rate can be even worse. The reason is that a visitor is generally at a web site to read particular content or to find specific answers. If they find what they’re looking for at the site, they won’t click on your ad. And on the odd chance they do click, the probability of your site capturing their interest is very low unless you have exactly the information they’re after.

It’s not all doom and gloom though. Some sites on the content network have visitors who are very open or who don’t mind spending money to save time or just get the job done. If you have the answers (or products) they need, you’ll get the clicks and the conversions. So the trick is to filter out the bad sites wisely.

So What is a Good CTR!?

So after all this, what is a good CTR? As you can see above, there are so many factors that there’s no way to predict a good CTR for your individual case. However, here are some rules of thumb that we have observed:

  • A CTR under 1% is almost always bad.
  • A CTR of 1-2% is mediocre.
  • A CTR greater than 2% is pretty good and what you should be aiming for.
  • A CTR greater than 4% is fantastic.
  • A CTR of 75% is mind-blowing (and yes, it’s possible with targeted keywords and phrases - just ask our clients :) ).
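
To put those numbers in context, the arithmetic behind them is simple:

    CTR = (clicks ÷ impressions) × 100
    e.g. 50 clicks from 2,000 impressions = 2.5% - pretty good territory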

Why HTML Sucks

January 27th, 2006

Have you ever spent hours trying to work out why your latest CSS design doesn’t work, only to find you had a single character wrong in your code?

A few days ago I spent 1.5 hours trying to finalise a design for a client, only to find that all the layout issues were the result of an extra bracket accidentally left in the CSS code. This is such a waste of time and money. It’s lost revenue, because we like to charge on a per-project basis here, not per hour (in the majority of cases).
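
The offending pattern looks something like this (an illustrative reconstruction, not the client’s actual stylesheet):

    #header {
        background: #369;
    }
    }   /* the accidental extra bracket */

    #content {
        width: 760px;    /* never applied - the stray bracket above corrupts
                            this rule, and the browser drops it silently */
        margin: 0 auto;
    }

Browsers recover from the stray bracket in slightly different ways, but typically the next rule is thrown away whole - with no error message anywhere.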

I often blame CSS for many days and nights of hair pulling. Things were much easier in the days of table-based layouts, right?

Then it hit me…

The challenge with modern web design isn’t the result of CSS, it’s a problem with HTML itself: HTML is too forgiving!

Think about it. Is it harder to find a problem with your simple HTML layout code or harder to debug a simple PHP application? It’s harder to find the problem with the HTML layout because web browsers are too forgiving. PHP on the other hand will usually stop in its tracks when there’s a problem. Even better, it will tell you the line number the error is on!
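
For example, feed a browser this snippet - note the missing closing quote after menu - and it will still render something; the parser just swallows markup until the next quote it can find:

    <div class="menu>
        <a href="/">Home</a>
        <a href="/about">About</a>
    </div>

No error, no line number - just a mysteriously broken menu.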

Sure, there are services such as the W3C Markup Validation Service that can find these errors, but it would be nice if validation were integrated into all web browsers, to the point where the HTML won’t ‘run’ if there are clear errors (such as doubled brackets, missing inverted commas, etc.).

/end rant :)

Keyword Rich vs Branded Domain Names

January 21st, 2006

There are two theories on what a good domain name is for the purpose of Search Engine Optimization (SEO).

Keyword Rich/Targeted Domain Names

One belief is that you should have generic keywords you want to target in your domain name. For example, if you are a web designer in Sydney you might choose a domain name such as qualitywebdesigninsydney.com.

Branded Domain Names

The second belief is that your domain name should represent your brand. You might run a company called Wicked Reality Inc, so your domain name would then be wickedreality.com or something along those lines.

Keyword vs Branded Domain Names - Which is better?

So which is better for SEO, the keyword or the branded domain name? Branded domain names are the cleaner, safer bet.

Sure, search engines analyse URLs for keywords, but they clearly filter out questionable domain names. Of the two examples above, qualitywebdesigninsydney.com looks horrible to the human eye, so chances are it will look horrible to the search engine algorithms too. That is, it will look horrible unless the page contains some great quality content.

You only have to look up a popular search term in Google for proof that long, wordy domain names aren’t generally the way to go. SEO is a term where the top-ranking sites have certainly been optimized to the nth degree. Put it into Google and you won’t find any mega-long-silly-domain-names.com. It’s a similar story when you search for the term search engine optimization.

Some of the sites in the SERPs do contain the term seo in their domain, but it is always used within a branding context and not simply to boost a search engine ranking.

Keywords in Page Names

If you really feel the need to have some keywords in your URL, then I suggest putting those keywords in either directory or page names. For example, if you run a web hosting and web design company, you might structure your pages like so:

www.companyname.com/web-design/quote.html
www.companyname.com/web-hosting/packages.html

As you can see, we’ve hit the target keywords of web design and web hosting without compromising a simple and classy looking companyname.com.

Conclusion

  • Choose a simple, relevant, branded domain name. The one exception is if you manage to pick up a nice one- or two-word generic keyword domain name, but most of those are taken already.
  • Structure your site/URLs so that you hit the most important 1 or 2 keywords in your directory or page structure.
  • Keep optimizing your site with other SEO tips and tricks.
  • Always remember - content is king - so make your pages useful and informative for people.

Welcome to Goo Theory

January 20th, 2006

Welcome to Goo Theory! What’s this site all about, I hear you ask? The main purpose of this site is to reveal the mysteries and secrets behind search engines, online marketing and making money on the web. The title of this site is a spin on the term Google Theory; however, the site is not specific to Google.

We aim to seek out the truth. Can you really make a million dollars selling e-books on how to make a million dollars? Can you really earn a living off Adsense? What about affiliate programs?

Join Goo Theory as we seek the answers…