Thursday, 3 October 2013

What The Government Shutdown Means For .gov SEO

As part of the US government shutdown, many .gov
sites have also shut down, replaced by messages like
this:
“Due to the lapse in federal government funding, this
website is not available. We sincerely regret this
inconvenience.”
But one day, the government will start up again and
government agencies will once again focus on ensuring
that the information their sites provide is easily
available to US citizens.
We know that Americans primarily use search to
access government information, so once the shutdown
is over, it won’t be enough that the web sites power
up. They also need to be indexed and ranking in
search engines such as Bing and Google for relevant
queries.
And that may take a while.
The .gov sites that have shut down have done so in
one of two ways:
1. The page content is replaced by an unavailable message. The URLs remain the same and return a 200 status code.
2. The pages 302 redirect to a different page that contains the unavailable message (and that second page returns a 200 status code).
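The two patterns can be told apart from the raw HTTP response alone. A minimal sketch in Python (the helper and its labels are illustrative, not from the article):

```python
def classify_outage(status, location=None):
    """Map an HTTP status code (plus Location header, if any) onto the
    two shutdown patterns described above. Hypothetical helper for
    illustration only."""
    if status == 200:
        # Scenario 1: the URL answers normally, but its content has been
        # swapped for the unavailable message.
        return "scenario 1"
    if status == 302 and location:
        # Scenario 2: a temporary redirect points at a separate page
        # carrying the unavailable message (which itself returns 200).
        return "scenario 2"
    return "other"
```

From a crawler's point of view, both answers look like a live page, which is why the unavailable message, not the original content, ends up in the index.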
Both scenarios result in the same outcome from an
SEO perspective: the original URL remains indexed,
but with the contents of the unavailable message. All of
the context of what the pages are about will be lost
and the only relevance clues will come from links to
those pages.  That means that as the pages are
recrawled they may stop ranking entirely for some
queries and will rank lower for others.
That’s fine during the shutdown, since the pages
aren’t available anyway, but once the sites are live
again, it will take some time before the search engines
recrawl the pages (think of the thousands and
thousands of pages most .gov web sites contain) and
the rankings return.
Let’s take a closer look at why this happens. (And of
course, I’m not positive that this will in fact happen
for .gov sites. The search engines are certainly aware
of this situation and may implement special handling,
such as pausing .gov site crawls and leaving the
previous versions of the pages in the index during the
shutdown.)
Scenario 1: Modified Page Content
In this scenario, the page content is replaced by an
unavailable message and the pages return a 200
response code. You can see this, for instance, with
www.census.gov: every page on the site serves the
unavailable message quoted above in place of its usual content.

Google’s Matt Cutts: NoFollow Attributes On Internal Links Don’t Hurt But Generally Don’t Do It

In a video published today, Google’s head of search
spam Matt Cutts answered the question, “Should I
use rel="nofollow" on internal links to a login page?”
Matt Cutts basically said you shouldn’t, but said it
won’t hurt you if you did. Matt said, “It doesn’t hurt if
you want to put a nofollow pointing to a login page or
to a page that you think is really useless.” But Matt
said, “in general” it also doesn’t hurt to not add a
nofollow, and in general, you should let Googlebot
crawl and explore your site.
In most cases, using noindex on the target page may be
better than adding nofollow to the link.
This is interesting because of what Matt Cutts
explained in 2009 about using nofollow for
PageRank sculpting, where it doesn’t work as you’d
expect.
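The two mechanisms discussed above live in different places: nofollow is an attribute on the link itself, while noindex is a robots meta tag on the target page. A small sketch using Python’s standard-library HTML parser to spot each (the sample markup is made up):

```python
from html.parser import HTMLParser

class RobotsAudit(HTMLParser):
    """Collect rel="nofollow" links and detect a robots noindex meta tag."""

    def __init__(self):
        super().__init__()
        self.nofollow_links = []
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "nofollow" in (attrs.get("rel") or ""):
            self.nofollow_links.append(attrs.get("href"))
        elif tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            if "noindex" in (attrs.get("content") or "").lower():
                self.noindex = True

# Hypothetical markup showing both mechanisms side by side.
page = """<html><head><meta name="robots" content="noindex"></head>
<body><a rel="nofollow" href="/login">Log in</a></body></html>"""
audit = RobotsAudit()
audit.feed(page)
```

The difference in effect: nofollow only shapes how the link is treated, while noindex keeps the page itself out of the index regardless of who links to it.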
Here is the full video:


After Blocking Keywords In Referrers, Now Google Stops Sharing Through Google Webmaster Tools – A Bug?

Last week, Google moved to send all searches through
Google SSL Search, setting up the ultimate end of
keyword data passed along via referrers to
non-advertisers. Now, Google’s official alternative channel
for this information — Google Webmaster Tools — has
also stopped sharing the data, most likely due to a
bug.
A Bug Or An Intentional Move?
Google Webmaster Tools has had outages before, so
maybe this is just a temporary bug. But if so, it’s
terrible timing.
NOTE: See the postscript below that suggests this is
indeed a bug. We’ve added that to the headline and
the lead of the story.
Google stopped reporting keyword data through the
“Search Queries” feature as of September 25. There’s
no data for that day or any of the following ones, from
what I can see — and others on Twitter have reported
a similar omission.
That puts the cut-off only two days after
Google confirmed that it had ramped up the use of
Google SSL Search. Even people not logged into
Google are now directed to Google SSL Search, also
called Google Secure Search. That causes search
terms to be stripped from the “referrer” data passed
along to publishers, except when people click on ads.
Advertisers still get referrer data off their ads.
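The mechanics behind the lost data are simple: analytics tools read the search term out of the referrer’s q parameter, and secure search sends a referrer with the query stripped. A minimal sketch (the example URLs are illustrative):

```python
from urllib.parse import parse_qs, urlparse

def search_term_from_referrer(referrer):
    """Pull the search term out of a Google referrer the way analytics
    packages did pre-SSL. Returns None when the query has been stripped."""
    terms = parse_qs(urlparse(referrer).query).get("q")
    return terms[0] if terms and terms[0] else None

# Pre-SSL referrer: the search term rides along in the q parameter.
old = "http://www.google.com/search?q=census+data&hl=en"
# Post-SSL referrer: the query string is gone, so nothing can be extracted.
new = "https://www.google.com/"
```

Once every search goes through SSL, this extraction yields nothing, which is exactly the “not provided” bucket publishers see in their analytics.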
The data withholding, coupled with an earlier move to
allow publishers to archive search term information if
they use Google’s AdWords system — rather than the
Google Webmaster Tools system — has led to much
suspicion that Google’s holding back the data not just
for privacy reasons (and some doubt even this) but
also to push ads.
Google Stays Silent
Early yesterday, we asked Google about the Google
Webmaster Tools issue — whether it was a bug or a
permanent change. An entire business day went by
without the company responding.
That fits in with Google’s earlier refusal to answer
further questions I sent last week about the reasons
why it increased the use of secure search — and it’s
not encouraging for frustrated publishers.
For more background, see our earlier stories:
Google’s Plan To Withhold Search Data & Create New
Advertisers
Post-PRISM, Google Confirms Quietly Moving To Make
All Searches Secure, Except For Ad Clicks
“100% Not Provided Is Not The End Of SEO” Insider
Opinions On Google’s Move To Withhold Data
This is a related column I wrote for CNET that may also
be of interest:
How Google could have made the Web secure and
failed — again
Postscript (1:30am ET): As noted in the comments
below, one of Google’s webmaster relations people,
John Mueller, has acknowledged the issue and
promises it will be fixed:
The team is aware of the problem and working on
speeding that data back up again. Thanks for your
patience in the meantime.


Agencies keep cool heads as Google’s SEO keyword-referral encryption hits 80% of traffic

Agencies have kept cool heads despite Google’s
encryption of all natural searches hitting an average
80 per cent of their traffic, with some predicting this
to hit 100 per cent by mid-November.
Google started encrypting natural searches in the UK
for logged-in account users last March as part of its
overall strategy to protect consumer privacy. It did so
by integrating encryption protocol Secure Sockets
Layer (SSL) to natural searches made by people
signed in to accounts such as Google+, Gmail and
YouTube.
The move obscured the keyword-referral data available
to advertisers, which tells them what natural
search terms people have used to arrive on their
sites.
The changes rang alarm bells at the time, with some
seeing it as a commercially-driven move from Google
to encourage more paid-search spend, given that
referral data isn’t obscured for PPC.
All have been monitoring the changes since the
initial rollout, which began in the US in 2011, but
Google has now extended the rollout to all users,
whether signed in or not, causing the amount
of obscured keyword-referral data to spike, with many
major agencies citing up to 80 per cent losses.
A Google spokesperson said: “We added SSL
encryption for our signed-in search users in 2011, as
well as searches from the Chrome omnibox earlier
this year. We’re now working to bring this extra
protection to more users who are not signed in.”
Agencies are resigned to the changes, yet it remains a
hot topic with their clients. iProspect’s chief
performance officer Mark Fagan told The Drum that,
with SEO being one of the most important channels
for brands, it is a major point of interest and concern
for clients.
“It hasn't changed how brands need to approach
SEO, but it has changed how they evaluate
performance of SEO. In the short term, there is still a
lot of data available that can be modelled with a high
degree of accuracy but in the long term we are
looking at pulling in various other data sources to give
our clients the most accurate picture possible.”
TM Lewin’s head of search, Francisco Lema said:
“With Google it has always been a ‘Their way or the
highway’ kind of relationship, so it’s not really a
surprise. The surprise nowadays would be if they do
something thinking of someone else but them.
“It obviously affects the quality and
representativeness of the data we use to improve our
customer experience, and it’s harder to find new
opportunities now. However, there are always going to
be other search engines, especially if you run an
international website like us, and you can always rely
on your historic data.”
Agencies and brands, accustomed to Google’s
continuous rolling iterations, have turned to other
viable tracking methods which can help fill the gap.
Havas Media head of SEO David Freeman said the
changes have raised challenges in all markets in
which Google is the dominant search engine, though
the US, where the rollout began back in 2011, is less affected.
“We had forewarning that Google was switching
everything to secure search and so we have been
using other methods. Things like Searchmetrics offer
market-share data based on rankings, which is useful
as it shows how you stack up against the
competition.
“If I get 50,000 visits for X keywords and that drives
me X revenue, is that really useful? Can you
make actual business decisions off of it? It’s useful
but it doesn’t change much. If you have a client like
Amazon, for example, given the amount of products
they have it could give you an indicator to show you
what’s selling and what’s not, but they have sales
data to give you that. You can get a lot of trend data out
of your business data.”
He believes it will be companies like Hitwise that
suffer most from the loss of data, given they rely on
the kind of data now being obscured. “It puts a
question mark over the value of Hitwise data now,” he
added.
However, there are wider implications to the changes,
which clients are likely to feel more keenly, according
to Freeman: the most “critical” effect of the
loss of data will be to hamper holistic SEO and PPC
strategies.
“When it comes to the integration level clients care
very much as that impacts their budget and spend
and bottom line in terms of what they are making. For
example they can’t now tell whether they should bid
on certain keywords for PPC for a number one
position, or if they can already achieve the same in an
organic position," he said.
Meanwhile, it is also not possible to split brand and
non-brand traffic, making it harder to understand the
search trends following major above-the-line
campaigns, according to Freeman.
Other agencies agree that there are workarounds,
albeit ones that give less accurate results than the
keyword-referral data itself.
DigitasLbi head of media innovations Andrew
Girdwood said Google’s complete rollout of secure
connections (HTTPS) to search in recent weeks has
created a “stink” in the industry.
Currently agencies are reporting up to 90 per cent of
traffic affected by the loss of keyword-referral data,
and this is likely to hit 100 per cent in the near future.
“It does feel like 100 per cent ‘not provided’ is a very
real thing. The keyword intelligence you now get for
free is very small, which is why people are angry,
because you can pay for it with PPC.”
He advises using Google’s Webmaster Tools console,
which does still provide keyword-level data for free,
albeit at a less accurate and more censored level
not always popular with SEO heads.
Yesterday, Google announced the introduction of
Estimated Total Conversions, a new tool for its
AdWords platform which will help advertisers monitor their
conversion rates across separate platforms and
differing browsers.

Video: Expanding your site to more languages

Webmaster Level: Intermediate to Advanced

We filmed a video providing more details about expanding your site to more languages or country-based language variations. The video covers details about rel="alternate" hreflang and its potential implementation on your multilingual and/or multinational site.
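The hreflang annotations boil down to one link element per language or language/country variation. A minimal sketch generating them in Python (the helper and the example URLs are hypothetical):

```python
def hreflang_links(variants):
    """Build rel="alternate" hreflang link elements from a mapping of
    hreflang codes (e.g. "en-gb") to the URLs serving those variations.
    Hypothetical helper for illustration."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in variants.items()
    )

# One entry per language or language/country pair; each page should list
# all variants, including itself.
tags = hreflang_links({
    "en": "http://example.com/",
    "en-gb": "http://example.com/uk/",
    "de": "http://example.com/de/",
})
```

Each variant page carries the full set of these elements in its head, so the annotations cross-reference one another.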

You can watch the entire video or skip to the relevant sections:
Additional resources on hreflang include:
Good luck as you expand your site to more languages!