Agencies have kept cool heads despite Google's encryption of natural searches now obscuring keyword data for an average of 80 per cent of their traffic, with some predicting this will hit 100 per cent by mid-November.
Google started encrypting natural searches in the UK
for logged-in account users last March as part of its
overall strategy to protect consumer privacy. It did so
by applying the encryption protocol Secure Sockets
Layer (SSL) to natural searches made by people
signed in to accounts such as Google+, Gmail and
YouTube.
The move obscured the keyword-referral data available to advertisers, which tells them what natural search terms people used to arrive on their sites.
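To illustrate the mechanism the article describes: analytics packages traditionally recovered the search keyword from the `q=` parameter in the HTTP referrer, and encrypted searches strip that query string. The following is a minimal, hypothetical Python sketch of that behaviour (the URLs and function name are illustrative, not from the article or any specific analytics product):

```python
# Sketch of how keyword-referral data was read from the HTTP referrer,
# and why encrypted (HTTPS) searches leave analytics with "(not provided)".
from urllib.parse import urlparse, parse_qs

def keyword_from_referrer(referrer: str) -> str:
    """Return the search term from a Google referrer URL, or
    '(not provided)' when the query string has been stripped."""
    query = parse_qs(urlparse(referrer).query)
    terms = query.get("q")
    return terms[0] if terms else "(not provided)"

# Before encryption: the keyword travels in the referrer's q= parameter.
print(keyword_from_referrer("http://www.google.co.uk/search?q=blue+shirts"))
# -> blue shirts

# After encryption: Google omits the query string, so analytics sees nothing.
print(keyword_from_referrer("https://www.google.co.uk/"))
# -> (not provided)
```

This is why the change hit organic (natural) search only: paid-search clicks pass keyword data through AdWords tracking rather than the referrer, so PPC reporting was unaffected.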
The changes rang alarm bells at the time, with some seeing it as a commercially driven move by Google to encourage more paid-search spend, given that referral data isn't obscured for PPC.
Agencies have been monitoring the changes since the initial rollout, which began in the US in 2011. Google has now extended encryption to all users, whether signed in or not, causing the amount of obscured keyword-referral data to spike; many major agencies cite losses of up to 80 per cent.
A Google spokesperson said: “We added SSL
encryption for our signed-in search users in 2011, as
well as searches from the Chrome omnibox earlier
this year. We’re now working to bring this extra
protection to more users who are not signed in.”
Agencies are resigned to the changes, yet it remains a hot topic with their clients. iProspect's chief performance officer Mark Fagan told The Drum that with SEO being one of the most important channels for brands, it is a major point of interest and concern for clients.
“It hasn't changed how brands need to approach
SEO, but it has changed how they evaluate
performance of SEO. In the short term, there is still a
lot of data available that can be modelled with a high
degree of accuracy but in the long term we are
looking at pulling in various other data sources to give
our clients the most accurate picture possible.”
TM Lewin’s head of search, Francisco Lema, said:
“With Google it has always been a ‘Their way or the
highway’ kind of relationship, so it’s not really a
surprise. The surprise nowadays would be if they do
something thinking of someone else but them.
“It obviously affects the quality and
representativeness of the data we use to improve our
customer experience, and it’s harder to find new
opportunities now. However, there are always going to
be other search engines, especially if you run an
international website like us, and you can always rely
on your historic data.”
Agencies and brands, accustomed to Google’s
continuous rolling iterations, have turned to other
viable tracking methods which can help fill the gap.
Havas Media head of SEO David Freeman said the changes have raised challenges in every market in which Google is the dominant search engine; the US, where rival engines hold a larger share of search, is therefore less affected.
“We had forewarning that Google was switching
everything to secure search and so we have been
using other methods. Things like Searchmetrics offer
market-share data based on rankings, which is useful
as it shows how you stack up against the
competition.
“If I get 50,000 visits for X keywords and that drives me X revenue, is that really useful? Can you make actual business decisions off it? It’s useful but it doesn’t change much. If you have a client like Amazon, for example, given the amount of products they have it could give you an indicator of what’s selling and what’s not, but they have sales data to give you that; you can get a lot of trend data out of your business data.”
He believes it will be companies like Hitwise that suffer most from the loss of data, given they rely on the kind of data now being obscured. “It puts a question mark over the value of Hitwise data now,” he added.
However, there are wider implications to the changes, which clients are likely to feel more keenly, according to Freeman. The most “critical” effect of the data loss will be to hamper holistic SEO and PPC strategies.
“When it comes to the integration level clients care
very much as that impacts their budget and spend
and bottom line in terms of what they are making. For
example they can’t now tell whether they should bid
on certain keywords for PPC for a number one
position, or if they can already achieve the same in an
organic position," he said.
Meanwhile, it is also not possible to split brand and
non-brand traffic, making it harder to understand the
search trends following major above-the-line
campaigns, according to Freeman.
Other agencies agree that there are workarounds,
albeit ones that give less accurate results than the keyword-referral data itself.
DigitasLBi head of media innovations Andrew Girdwood said Google’s complete rollout of secure connections (HTTPS) to all natural search in recent weeks has created a “stink” in the industry.
Currently agencies are reporting up to 90 per cent of
traffic affected by the loss of keyword-referral data,
and this is likely to hit 100 per cent in the near future.
“It does feel like 100 per cent ‘not provided’ is a very
real thing. The keyword intelligence you get now for
free is now very small, which is why people are angry
because you can pay for it with PPC.”
He advises using Google’s Webmaster Tools, which still provide keyword-level data for free, albeit at a less accurate and more limited level that is not always popular with SEO heads.
Yesterday, Google announced the introduction of Estimated Total Conversions, a new tool for its AdWords platform that will help advertisers monitor their conversion rates across separate platforms and differing browsers.
Thursday, 3 October 2013
Agencies keep cool heads as Google’s SEO keyword-referral encryption hits 80% of traffic