The latest news about web marketing, SEO, PPC & Analytics. But only the stuff that matters from a New Zealand perspective. Less noise, more signal!
If your business has anything to do with the online world, you know how much information is out there. It’s overwhelming and constantly changing. Finding what’s relevant to your business amongst all the noise is time-consuming and can result in information overload.
At SureFire Search we want to help by sifting through all the noise and highlighting what’s new and noteworthy in SEO, PPC and Web Analytics. More importantly, we answer the question: why might this matter to YOU and YOUR business here in New Zealand?
Here’s what caught our attention this week…
- Google really doesn’t want people reverse engineering local search listings
- Site quality is now part of the rich snippets algorithm
- Amazon takes aim at Google’s online ad business
- Sick of ‘click bait’ articles – so is Facebook
- Google has ended all authorship functionality in the search results
“Reverse engineering circumvention of spam detection algorithms” is the name of a patent recently filed by Google. Follow the link if you want to get into the nuts and bolts of the patent, but essentially Google wants to stop local business owners from gaming the local search results.
The most common way of gaming the system is a business trying to show up in the local results for locations where it doesn’t actually have a physical presence. Google calls these listings “fake business spam”.
Google Local was designed to help people find local businesses easily. Google doesn’t want people searching for a specific type of local business and getting results for businesses that aren’t actually local, yet in some cases that’s exactly what is happening. Business owners are creating bogus listings with a real phone number attached, and the fake listing is used to generate leads.
The patent aims to stop this and protect the experience of Google’s users. The plan is to fight fake business spam by assigning a spam score to a “business listing when the listing is received at a search entity. A noise function is added to the spam score such that the spam score is varied.”
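If “adding a noise function to a spam score” sounds abstract, here’s a toy sketch of the idea. This is purely our own illustration of the concept described in the patent, not Google’s actual code; the signal names and weights are made up.

```python
import random

def noisy_spam_score(listing_signals, weights, noise_scale=0.1):
    """Toy illustration of the patent's idea: compute a spam score
    from a listing's spam signals, then add random noise so repeated
    test submissions can't reveal the exact detection threshold.
    Signal names and weights here are hypothetical examples."""
    # Weighted sum of spam signals (e.g. keyword stuffing, fake address)
    base_score = sum(weights[s] * v for s, v in listing_signals.items())
    # The noise makes the accept/reject boundary unpredictable, which
    # frustrates reverse engineering by trial and error.
    return base_score + random.gauss(0, noise_scale)
```

The point of the noise is that a spammer submitting slightly tweaked listings can no longer work out exactly where the spam threshold sits, because the same listing can score differently each time.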
Why should you care?
What this patent shows is just how much Google values local search. They don’t want spammers disrupting the results and putting people off trusting them. Keeping the local search results spam-free can only be a good thing for genuine local businesses. Mobile and desktop searches with local intent continue to rise, making it very important that you have your local SEO house in order.
Hopefully you have a Google My Business profile set up so that potential customers can find you on Google Maps and in the local search results.
Rich snippets are search results that are a bit ‘fancy’. They often contain a visual element, such as stars or a thumbnail image, that catches the searcher’s attention and therefore attracts more clicks.
Unsurprisingly, there is a rich snippet algorithm, which is highlighted in the rich snippet guidelines.
Until the recent Panda 4.0 update this algorithm was rather basic and wasn’t integrated with site quality.
It is now.
This means that Google plans on only showing rich snippet results for websites that are deemed to be quality.
Why should you care?
From Google’s perspective it makes sense for site quality to be taken into account when showing rich snippets. This is because rich snippets attract attention and clicks.
Google’s search quality team doesn’t want to attract attention to poor sites by allowing them to show rich snippets. They want the websites that they deem to be quality to get the clicks.
What this means for your business is that just because you implement structured markup on your website it won’t necessarily result in a rich snippet being returned in the search results.
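For context, “structured markup” means annotating your HTML with schema.org properties that describe your content. Here’s a generic example (the product and numbers are invented) of the kind of markup that makes a page *eligible* for a star rating snippet:

```html
<!-- Illustrative schema.org microdata for a product rating.
     Markup like this makes a page eligible for a star rich snippet,
     but Google may still withhold it if the site isn't deemed quality. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Manuka Honey 500g</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.6</span> stars from
    <span itemprop="reviewCount">27</span> reviews
  </div>
</div>
```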
It may seem like a broken record but it’s important that your website has good content. Google’s algorithms are just going to keep getting more advanced and more selective in what they show in the search results.
Give your website the best chance of being considered quality by always producing informative content.
If you had to list the companies that could pick a fight with Google and possibly win, it would be a short list. One of the names on that list, however, would probably be Amazon.
The multi-billion dollar online retailer wants in on the multi-billion dollar online advertising industry currently being dominated by Google.
The Wall Street Journal is reporting that Amazon is building a new ad platform that will be very similar to Google’s. This ad platform is apparently called ‘Amazon Sponsored Links’ and it could open up a very large new revenue stream for the company.
As it stands right now this could be a double blow for Google. Amazon has the technology and data to be a contender in the online advertising space and right now Amazon is one of Google’s largest ad buyers.
Although Google has a head start of around 14 years – Amazon has a huge amount of data about what people buy. This information is invaluable.
As it was recently put:
“Google knows what we SEARCH for, Facebook knows what we LIKE but Amazon knows what we BUY.”
Why should you care?
Right now Amazon runs a small ad program that can place ads on other sites, but it is small-time compared to what this new advertising platform could be.
If Amazon does go ahead with Amazon Sponsored Links there is huge potential for businesses that sell products.
One can only imagine the type of targeting Amazon could offer. Instead of targeting what people search for, you could target what people buy. This could mean an advert for your product being shown to people who are buying products that are known to be bought alongside yours.
SureFire will keep you updated with this.
“Click-bait” articles are designed by publishers so that a user has to click the link to see what the post is actually about. These posts tend to get a lot of clicks but often don’t contain any relevant information.
However, due to the number of clicks that they receive they get shown to a large number of people while at the same time being promoted up the Facebook news feed.
Facebook recently conducted a survey where it discovered that “80% of the time people preferred headlines that helped them decide if they wanted to read the full article before they had to click through.”
To try to decide which posts are ‘click-bait’ and which aren’t, Facebook is going to take a measure similar to bounce rate into its algorithmic equation: how long people spend on an article before coming back to Facebook. If people leave quickly (in effect, bouncing straight back), Facebook will deem that article low quality because it isn’t engaging, and classify it as click-bait.
Facebook will also look at how often the article is shared as an indicator of quality.
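Put together, the two signals above amount to a simple rule of thumb. This sketch is our own back-of-the-envelope illustration, not Facebook’s actual algorithm; the thresholds are invented for the example.

```python
def looks_like_clickbait(seconds_on_article, clicks, shares,
                         min_dwell=30, min_share_rate=0.01):
    """Toy sketch (not Facebook's real algorithm) of the two signals
    described above: how long readers stay on the article, and how
    often it is shared relative to how often it is clicked.
    The thresholds (30 seconds, 1% share rate) are made up."""
    share_rate = shares / clicks if clicks else 0.0
    quick_bounce = seconds_on_article < min_dwell   # readers left fast
    rarely_shared = share_rate < min_share_rate     # clicked, not shared
    # Lots of clicks but fast bounces and few shares -> likely bait
    return quick_bounce and rarely_shared
```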
Why should you care?
From a personal perspective it should mean less “junk” in your news feed. From a professional perspective, if you are using Facebook as a marketing tool you will have to do away with click-baiting (if you use that method to attract visitors).
As with Google, quality content is king.
Just like a high bounce rate on your website can be cause for concern, a high bounce rate on your Facebook posts could negatively affect your Facebook marketing efforts.
Following the recent removal of authorship photos from the search results, and after three years of pushing webmasters and authors to use authorship markup, John Mueller of Google Webmaster Tools announced in a Google+ post that Google will stop showing authorship results in search completely.
John Mueller stated in his post that the information the markup provided wasn’t as useful to users as Google had hoped, and because of this Google has decided to stop showing authorship altogether in the search results.
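For those who never used it, authorship markup was simply a link from an article to the author’s Google+ profile (the profile URL below is a made-up example):

```html
<!-- The now-defunct authorship markup: a rel="author" link from an
     article to the author's Google+ profile. Profile ID is invented. -->
<a href="https://plus.google.com/112345678901234567890"
   rel="author">Jane Author</a>
```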
As if reading the minds of the many skeptics out there who believe authorship simply attracted too much attention away from Google adverts, he wrote:
“(If you’re curious — in our tests, removing authorship generally does not seem to reduce traffic to sites. Nor does it increase clicks on ads. We make these kinds of changes to improve our users’ experience.)”
The two main reasons that Google has removed authorship are:
- Low adoption rates by authors and webmasters.
- Low value to searchers.
Why should you care?
As Google pointed out, authorship had a low adoption rate, so chances are you haven’t implemented it. To avoid confusion going forward: there are many articles on the net talking up the benefits of authorship markup, and these should now be ignored.
If you have implemented authorship, then SureFire’s advice is to leave it.
It’s important to note that author photos continue to appear for Google+ content from people in the searcher’s Google network, provided the searcher is logged in to their Google+ account.
OK, that’s what we think. We’re keen to hear your thoughts on any of the above – please comment below.
Click here to read previous editions of No Noise Friday web marketing news.
Want this delivered to your inbox each Friday?
If you found this useful, please tell your friends.