Google: The App Install Interstitial Penalty Is Now Live

Two months ago, Google warned us that app install interstitials may lead to a penalty if they take up too much of the page, covering most of the content. Well, yesterday, November 2nd, that penalty went live as part of the overall mobile-friendly algorithm.

Google posted the news yesterday afternoon on Google+ saying:

An update to the mobile-friendly algorithm

Starting today, pages with an app install interstitial that hide a significant amount of content on the transition from the search result page won’t be considered mobile-friendly.

Instead of full page interstitials, we recommend that webmasters use more user-friendly formats such as app install banners. We hope that this change will make it easier for searchers to see the content of the pages they are looking for.
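For reference, the banner approach Google recommends is commonly wired up in two pieces: Safari reads an `apple-itunes-app` meta tag (its Smart App Banner), while Chrome’s native app install banner is driven by the web app manifest. Here is a minimal sketch that just assembles those pieces; the app ID and package name are invented for illustration:

```python
import json

# Hypothetical identifiers -- replace with your real app IDs.
IOS_APP_ID = "123456789"
ANDROID_PACKAGE = "com.example.app"

# Safari's Smart App Banner: a single meta tag placed in <head>.
smart_app_banner = f'<meta name="apple-itunes-app" content="app-id={IOS_APP_ID}">'

# Chrome's native app install banner: declared in the web app manifest.
manifest = {
    "name": "Example App",
    "prefer_related_applications": True,
    "related_applications": [
        {"platform": "play", "id": ANDROID_PACKAGE}
    ],
}

print(smart_app_banner)
print(json.dumps(manifest, indent=2))
```

Either approach surfaces the app promotion without covering the page content, which is exactly the distinction the new penalty draws.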

No one should be surprised by this, but I have yet to see people complaining in the forums that their sites were hit by this penalty.

Structured Markup May Become A Ranking Factor

I was a bit shocked to hear this, since Google has said time and time again that rich snippets and structured markup don’t help your site rank higher in Google. But that might change in the future.

Google’s John Mueller dropped a hint this morning in a Google+ Hangout at the 21:40 mark that Google may, one day, use structured markup in their ranking algorithm.

John said that “over time, I think it [structured markup] is something that might go into the rankings as well.” He added his rationale:

If we can recognize someone is looking for a car, we can say oh well, we have these pages that are marked up with structured data for a car, so probably they are pretty useful in that regard. We don’t have to guess if this page is about a car.

John Mueller, while not in charge of Google’s ranking algorithms, does have insight into what Google is thinking on that front. He added that it “definitely makes sense to use structured data” for ranking in those situations:

So I think in the long run, it definitely makes sense to use structured data where you see that as being reasonable on the web site. But I wouldn’t assume that using structured data markup will make your site jump up in rankings automatically. So we try to distinguish between a site that is done technically well and a site that actually has good content. Just because it is done technically well, it doesn’t mean it is as relevant to the users as content that is not done as technically well.
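To make John’s car example concrete, here is a hedged sketch of what such markup could look like as schema.org JSON-LD, assembled in Python; the property values are invented for illustration:

```python
import json

# A hypothetical car listing marked up with schema.org's Car type.
car = {
    "@context": "https://schema.org",
    "@type": "Car",
    "name": "2015 Example Roadster",  # invented example values
    "brand": {"@type": "Brand", "name": "Example Motors"},
    "vehicleTransmission": "Automatic",
}

# JSON-LD is embedded in the page inside a script tag.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(car, indent=2)
print(snippet)
```

Per John’s own caveat, markup like this doesn’t make a page jump in rankings today; it only removes the guesswork about what the page is about.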

The Mysterious Disappearing Act Of Ripoff Report In Google, Again

On Wednesday, I reported Ripoff Report was deindexed by Google, but that only lasted about 30 minutes. By the time people started seeing the blog post and social media mentions, Ripoff Report was already back in Google’s index.

But it was legit; the site was completely gone from Google’s index. I even pulled a screenshot of a site: command on both the www and non-www versions:

[Screenshot: a site: command for Ripoff Report returning no documents in Google]

I even asked Google and they have yet to explain what happened. They did confirm the site is back, but they didn’t explain why or what happened, or even offer a “no comment.”

Was it something that Ripoff Report did? They did remove themselves back in 2011, but we knew that. This time, we saw no clear evidence that Ripoff Report removed themselves.

I did notice that the non-www version didn’t load, and still doesn’t load. So I am not sure if that was the issue.
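One quick way to separate a server or DNS problem from something Google is doing is a plain DNS lookup on the hostname. A small sketch, using `localhost` as a stand-in hostname:

```python
import socket

def resolves(hostname):
    """Return True if the hostname resolves in DNS, else False."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False

# A host that fails this check can't load at all -- a server/DNS issue,
# not an indexing one.
print(resolves("localhost"))
```

If the non-www host never resolves, Google can’t crawl that version at all, regardless of what is happening in the index.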

I’ve never seen a whole site delisted and relisted in Google within an hour.

Jarrod Wright was the first to spot this and post it on Twitter.

Do you have any ideas on how they dropped completely out of Google and came back in full within an hour?

Forum discussion at Twitter.

Google: The Slow Panda Roll Out Is Not Designed To Confuse Webmasters

When Google rolled out Panda 4.2, they told me it would roll out over months and months, a very slow roll out. That, they said, was just technically how this roll out was happening.

John Mueller, in a Google+ Hangout, added more on the record about why it is rolling out slowly. Well, the “why” is still not fully answered, but he did add some more detail around the slow roll out.

In summary he said:

  • The roll out is similar to previous roll outs
  • This one is slower because of “technical reasons”
  • John actually said there is “an internal issue” for the technical reason but I am not sure he actually meant that.
  • This is not rolling out slowly to confuse webmasters

He mentioned it at three different points in the hangout; let’s go through each:

At the 25:30 mark the question was:

Hi John, my question is Panda related. First of all, why did you guys decide to roll this update out so slowly? And second, could you please tell us how this update is different than the other updates? Is it targeting more than just content?

John answered:

This is actually, pretty much a similar update to before. For technical reasons we are rolling it out a bit slower. It is not that we are trying to confuse people with this. It is really just for technical reasons.

At the 37:14 mark the question asked was:

Panda has run many times over the past few years. This Panda run has been said to be a “crawl” lasting several months… Does that mean it’s moving slowly for a single pass over the internet? Does Panda affect a site’s rank immediately?

John answered:

So, it is not that we are crawling slowly. We are crawling and indexing normal, and we are using that content as well to recognize higher quality and lower quality sites. But we are rolling out this information in a little bit more slower way. Mostly for technical reasons.

And then later on, at the 39:50 mark, John added:

It is not like we are making this process slower by design, it is really an internal issue on our side.

Take what you want from these clarification points. It still doesn’t answer all the questions webmasters and SEOs have on the slow Panda 4.2 roll out.

Forum discussion at Google+.

New TLDs Have No Magical SEO Bonus

John Mueller at Google must have a pet peeve against the misinformation domain name registrars push out about TLDs and SEO. He has time and time again said they play no role in ranking at Google.

But that didn’t stop John and Google from posting something official on the Google blog yesterday. John said Google has heard “questions and misconceptions” about the new TLDs and wanted to set the record straight. On Google+ John said the short version:

Somewhat simplified: if you spot a domain name on a new TLD that you really like, you’re keen on using it for longer, and understand there’s no magical SEO bonus, then go for it.

But the blog post has the full FAQs; again, none of this is new, and we have covered John Mueller’s past public comments on TLDs and SEO before.

Here are the FAQs as posted:

Q: How will new gTLDs affect search? Is Google changing the search algorithm to favor these TLDs? How important are they really in search? 
A: Overall, our systems treat new gTLDs like other gTLDs (like .com & .org). Keywords in a TLD do not give any advantage or disadvantage in search.

Q: What about IDN TLDs such as  .みんな? Can Googlebot crawl and index them, so that they can be used in search?
A: Yes. These TLDs can be used the same as other TLDs (it’s easy to check with a query like [site:みんな]). Google treats the Punycode version of a hostname as being equivalent to the unencoded version, so you don’t need to redirect or canonicalize them separately. For the rest of the URL, remember to use UTF-8 for the path & query-string in the URL, when using non-ASCII characters.

Q: Will a .BRAND TLD be given any more or less weight than a .com?
A: No. Those TLDs will be treated the same as other gTLDs. They will require the same geotargeting settings and configuration, and they won’t have more weight or influence in the way we crawl, index, or rank URLs.

Q: How are the new region or city TLDs (like .london or .bayern) handled?
A: Even if they look region-specific, we will treat them as gTLDs. This is consistent with our handling of regional TLDs like .eu and .asia. There may be exceptions at some point down the line, as we see how they’re used in practice. See our help center for more information on multi-regional and multilingual sites, and set geotargeting in Search Console where relevant.

Q: What about real ccTLDs (country code top-level domains) : will Google favor ccTLDs (like .uk, .ae, etc.) as a local domain for people searching in those countries?
A: By default, most ccTLDs (with exceptions) result in Google using these to geotarget the website; it tells us that the website is probably more relevant in the appropriate country. Again, see our help center for more information on multi-regional and multilingual sites.

Q: Will Google support my SEO efforts to move my domain from .com to a new TLD? How do I move my website without losing any search ranking or history?
A: We have extensive site move documentation in our Help Center. We treat these moves the same as any other site move. That said, domain changes can take time to be processed for search (and outside of search, users expect email addresses to remain valid over a longer period of time), so it’s generally best to choose a domain that will fit your long-term needs.
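Mechanically, a site move of this kind boils down to 301-redirecting every old URL to its counterpart on the new domain. A minimal sketch of building that mapping, with invented domain names:

```python
from urllib.parse import urlsplit, urlunsplit

def moved_url(old_url, new_host):
    """Map a URL onto a new host, preserving path, query, and fragment."""
    parts = urlsplit(old_url)
    return urlunsplit((parts.scheme, new_host, parts.path,
                       parts.query, parts.fragment))

# Hypothetical move from a .com to a new TLD.
old = "https://example.com/widgets?page=2"
print(moved_url(old, "example.guru"))  # https://example.guru/widgets?page=2
```

A table of such pairs becomes your 301 redirect map; Google’s site move documentation covers the rest of the process.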

Forum discussion at Google+ and WebmasterWorld.

Returning To SEO: But My Google Rankings Have Dropped.

An interesting topic sprang up at WebmasterWorld. Typically, we see webmasters and SEOs who have been at it forever and need help ranking better. But in this thread, we have someone who was involved in SEO back in the day, left for a while, and has come back wanting to get their web site ranking again.

This webmaster said, “We have been away from the SEO the past year working on other projects but now we are getting back into it and we are trying to catch up on all the Google changes.” The webmaster said they went mobile friendly, went HTTPS, changed and cleaned up the UI and navigation, and decreased the code bloat. It led to a huge decrease in bounce rate and a nice uptick in conversions, but what about rankings?

The webmaster asked, “does anyone have any more suggestions of things we can look at, target and work on?”

So far the suggestions seem to revolve around Panda: remove thin content pages, and make sure there are no empty pages created via your CMS. Improve your pagination, sorts and filtering, session IDs and other URL issues. One says to just focus on making sure you have a “positive user experience,” and that should help.

The web always has a surprise for us… we love the web!

Google Is Hiring SEOs To Help Them Rank Better

Yesterday, Google posted a job listing on their careers site for an SEO position. I kid you not! Google is looking to hire an SEO to help them “drive organic traffic,” as the job description says, to their Google Cloud Platform (GCP) marketing web pages.

This SEO will specifically work with the Google Cloud Platform Marketing team and the Cloud Web Development team as the “Cloud SEO program manager.”

As I documented at Search Engine Land the other day, here are the requirements:

The responsibilities include:

  • Architect, design, develop and maintain innovative, engaging and informative sites for a worldwide audience.
  • Maintain and develop the web code to ensure quality, content and readability by search engines.
  • Keep pace with SEO, search engine and internet marketing industry trends and developments and report changes as needed.
  • Advise, collaborate with, and synthesize feedback from Marketing, Product and Engineering partners to push for technical SEO best practices.

The qualifications include:

  • BA/BS degree in Computer Science, Engineering or equivalent practical experience.
  • 4 years of experience developing websites and applications with SQL, HTML5, and XML.
  • 2 years of SEO experience.
  • Experience with Google App Engine, Google Custom Search, Webmaster Tools and Google Analytics and experience creating and maintaining project schedules using project management systems.
  • Experience working with back-end SEO elements such as .htaccess, robots.txt, metadata and site speed optimization to optimize website performance.
  • Experience in quantifying marketing impact and SEO performance and strong understanding of technical SEO (sitemaps, crawl budget, canonicalization, etc.).
  • Knowledge of one or more of the following: Java, C/C++, or Python.
  • Excellent problem solving and analytical skills with the ability to dig extensively into metrics and analytics.

Most SEOs know that Google doesn’t always do a great job with their on-page optimization across their divisions. This specific job doesn’t require you to do link building, as you can imagine. :)
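On the robots.txt requirement in that listing: Python’s standard library can parse and test robots rules directly, which is handy for exactly this kind of technical-SEO work. A small sketch with made-up rules and a hypothetical site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, parsed from a list of lines.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Check which URLs a generic crawler may fetch under those rules.
print(rp.can_fetch("*", "https://example.com/private/report"))  # False
print(rp.can_fetch("*", "https://example.com/docs"))            # True
```

In practice you would point `RobotFileParser` at the live robots.txt URL, but parsing rules from strings like this makes the checks easy to unit test.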

Penguin Refresh Is Months Away

Google’s Gary Illyes said over the weekend on Twitter that a Google “Penguin refresh is months away.”

The last official Penguin update we had was Penguin 3.0, which took place on October 18, 2014, almost 9 months ago. The last time we saw a shift with the Penguin algorithm was just before December 11, 2014, which was over 7 months ago.

So realistically, even though Google has said time and time again that they will be making Penguin run faster, we are likely not to see a Penguin refresh for a while. Maybe it will take over a year, like the gap we saw between Penguin 2.1 on October 4, 2013 and Penguin 3.0 on October 18, 2014. Before that, Penguin 2.0 came on May 22, 2013, and the original Penguin release on April 24, 2012.

At this rate, I suspect a Penguin refresh won’t happen until a year or so after Penguin 3.0.

Panda Still Coming Soon But Maybe Delayed For Technical Reasons

I am not sure what he means by “technical reasons,” but it does seem to imply that Panda will not be refreshing today or this weekend. Will it be next week? I am not sure; I am starting to believe this is going to take longer than any of us expected.

The last official Panda update was Panda 4.1, which was on September 25, 2014, over 9.5 months ago. I believe the last unofficial Panda refresh was October 24th, over 8.5 months ago.

I really do hope it comes soon so those sites that have “cleaned up” their web sites can come back to life. Of course, new sites will be hit, but Google needs to refresh these algorithms more regularly, both for the sake of the businesses that are failing and for the quality of the search results.

Links Within PDF Documents Pass PageRank

The question came up after security company Sophos uncovered link spammers cloaking PDF documents with links and keywords and then redirecting users to spam sites.

Dan Petrovic commented on Glenn’s post on Google+ saying “last time I checked PDF files didn’t pass PageRank.”

To which Gary from Google came in and said, “you asked me in Sydney and I told you they [PDFs] pass PageRank.”

Google also said so in a blog post from 2011: “generally links in PDF files are treated similarly to links in HTML: they can pass PageRank and other indexing signals, and we may follow them after we have crawled the PDF file.”