Tuesday, June 21, 2016
Friday, June 10, 2016
Ruben Gomez spotted another test, this one from Google; he posted about it on Twitter and called it material design for desktop. It looks like the card style design we reported on a few weeks ago.
But Ruben explained it is also visible on the home page and shared screen shots showing how the Google home page is gray in the new test design, with a bigger search box in the center.
Here is the new home page test in gray (click to enlarge):
Here is the current white home page (click to enlarge):
Ruben shared more examples as well as a video.
GoogleBot typically crawls from the United States, though not 100% of the time. In fact, Google recently began crawling on a limited basis from other countries, but only to check on locale-aware features.
In any event, what if you have a web site that is not accessible to US users for legal or other reasons? Google says that GoogleBot crawling from the US wouldn't be able to access it, which will probably cause major indexing issues.
Google's John Mueller said this in a Google Webmaster Help thread yesterday. He wrote, "In general, our cloaking guidelines say that you must show Googlebot the same content as you would show other users from the region that it's crawling from. So if you're blocking users in the US, then you'd need to block Googlebot when it's crawling from the US (as is generally the case)."
He did advise that you can make some content legally accessible to US users, so that GoogleBot can index that content. But without allowing US users onto your web site, you have to assume GoogleBot won't access it either - unless you do things that are against Google's Webmaster Guidelines. John said, "one suggestion would be to have content that's globally accessible, for both users & Googlebot from the US, which can then be indexed in search."
This isn't a new topic; we actually wrote about it a few times, including in 2008 and 2011. The interesting part is that the advice hasn't changed, even since the January 2015 news that GoogleBot gained locale-aware crawling smarts.
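John's guideline boils down to one rule: whatever a US user sees, Googlebot crawling from the US must see too. Here is a minimal sketch of that logic; the country lookup, the blocked-country set, and the function names are all hypothetical illustrations, not anything Google publishes:

```python
# Hypothetical sketch of the cloaking guideline John describes: the decision
# to block must depend only on the visitor's country, never on whether the
# visitor is Googlebot. The blocked-country set is an assumed example.

BLOCKED_COUNTRIES = {"US"}  # example: content legally unavailable to US users


def should_block(country_code: str, user_agent: str) -> bool:
    """Return True if the request should be denied.

    The user agent is deliberately ignored: Googlebot crawling from the US
    must be treated exactly like any other US visitor.
    """
    return country_code in BLOCKED_COUNTRIES


# Googlebot crawling from the US gets blocked, just like a US user:
print(should_block("US", "Googlebot/2.1"))   # True
# Visitors (and crawlers) outside the blocked region get through:
print(should_block("DE", "Mozilla/5.0"))     # False
```

Special-casing the crawler here (letting Googlebot through while blocking US users) is exactly the cloaking Google's guidelines prohibit.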
Forum discussion at Google Webmaster Help.
Joy Hawkins asked in a Local Search Forum thread if Google filters the gold stars shown in the organic search results for local businesses in specific industries. She said she can't get the stars to come up for certain industries and wondered if Google filters them out for those types of queries.
Tim Capper responded that he asked Google's John Mueller (he didn't share the source of this conversation) and John said they do not filter by industry. Here is what Tim wrote:
Just asked John Mu and he is not aware of any filter being applied to industry. He did say that they don't like if the review markup is sitewide or on irrelevant pages. Also they don't like if testimonials are used within the markup.
There was a time back in 2009 when Google would not show local results for SEO or web design companies, but those came back a few years later. So it isn't too far-fetched to think there might be a gold star review filter.
I do see gold review stars for my company:
Forum discussion at Local Search Forum.
Bing announced you can now submit your news site, like this one, to Bing News at the new Bing News PubHub.
When you go to Bing News PubHub, you can fill out a form to submit your news site to Bing. Here is a screen shot of the form:
To be accepted, first you need to make sure your site is verified in Bing Webmaster Tools and that it complies with the Bing Webmaster guidelines.
Then Bing will review your site and see if they want to include it in Bing News.
Here are their criteria:
- Newsworthiness – Report on timely events and topics that are interesting to users. Content that doesn’t focus on reporting, such as how-to articles, job postings, advice columns, product promotions, is not considered newsworthy. Similarly, content that consists strictly of information without including original reporting or analysis, such as stock data and weather forecasts, is not considered newsworthy.
- Originality - Provide unique facts or points of view. Faced with numerous sources frequently reporting similar or identical content, originality or uniqueness becomes a critical way to determine the value to a user of an individual story.
- Authority – Identify sources, authors, and attribution of all content. News sites with authority maintain the highest level of trust and respect from our users.
- Readability – Create content with correct grammar and spelling, and keep site design easy for users to navigate. Advertising should never interfere with the user experience.
Apple is bringing search ads to their App Store. Apple announced it last night saying, "starting this summer, you'll be able to participate in the Search Ads beta and see the ads in action."
In short, people search in the App Store, and if you want to advertise your apps there by keyword, you can, using Apple's Search Ads platform. "Search Ads is an efficient and easy way for you to promote your app directly within the U.S. App Store search results, helping customers discover or reengage with your app, while respecting their privacy," Apple added.
Here are screen shots of what it looks like:
See the ad highlighted at the top?
Here is the campaign builder:
Here is the reporting engine:
If you want to try the beta, apply here.
There are more details on this product over here.
Like it or not, many SEOs use the disavow backlinks feature within Google, which launched in 2012 based mostly on webmaster feedback.
But sometimes the disavow file can cause a headache for webmasters. When you have weird characters in your URLs, Google can get confused by them and either not disavow them or show errors.
John Mueller of Google responded to one such complaint with advice on how to test it. He said on Twitter to "try to submit a file with just that line, then rewrite the line." Use this method to check which URLs are causing you problems, then fix them one by one. It sounds like this method can take some time to debug, but I don't know of a better way.
He added that "sometimes special characters are tricky" for Google's disavow system to handle.
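Before resorting to line-by-line resubmission, you could pre-check the file yourself by percent-encoding any special characters in each URL line. This is a hedged sketch of that idea - the function name and the exact line formats it handles are my assumptions, and Google has not published how its parser treats these characters:

```python
# Illustrative pre-check for a disavow file: percent-encode special
# characters in URL lines so the file contains only ASCII-safe URLs.
# Lines are either comments (#), "domain:" rules, or full URLs.
from urllib.parse import quote, urlsplit, urlunsplit


def normalize_disavow_line(line: str) -> str:
    """Return the line with special characters in the URL percent-encoded."""
    line = line.strip()
    if not line or line.startswith("#") or line.startswith("domain:"):
        return line  # comments and domain rules pass through unchanged
    parts = urlsplit(line)
    # Re-encode path and query; "%" stays safe so existing %XX escapes survive.
    safe_path = quote(parts.path, safe="/%")
    safe_query = quote(parts.query, safe="=&%")
    return urlunsplit((parts.scheme, parts.netloc, safe_path, safe_query,
                       parts.fragment))


print(normalize_disavow_line("https://example.com/päge?q=tëst"))
# https://example.com/p%C3%A4ge?q=t%C3%ABst
```

Running every line through a normalizer like this at least tells you which lines contain non-ASCII or otherwise tricky characters, which are the likeliest candidates for the one-line resubmission test John suggests.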
I always find it interesting when a Googler responds to a specific SEO question with an audit-like response. By that I mean, if someone complains about their SEO efforts and rankings in Google, and Google responds with additional qualification questions, it makes you wonder: what does that all mean?
Let me share an example. In a Google Webmaster Help thread, a webmaster for slant.co is complaining that his traffic from Google is flat. It has been flat for six months, and he isn't sure why, since the content is improving and more content is added daily. His bullet points are:
- This is a good example page: http://www.slant.co/topics/341/~2d-game-engines
- Been around for about 3 years
- Mostly works like a structured Q&A site with wikipedia elements to it as products are a rapidly changing area and our site keeps up with new releases etc.
- Google traffic is about 350k a month and pretty flat.
- We've done a lot of work on algorithmically noindexing content until it hits a quality threshold. A lot of currently noindexed pages are still in the index (such as our /comments links) unfortunately.
- Our content/community is growing really quickly, around 30/40% each month. We also have 24/7 moderation.
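The "algorithmically noindexing content until it hits a quality threshold" point is a real technique worth unpacking. Here is a minimal sketch of how such a gate might work; the scoring inputs, weights, and threshold are entirely made up for illustration - the thread doesn't say how slant.co actually scores pages:

```python
# Hypothetical "noindex until quality threshold" gate for a Q&A page.
# The score formula and threshold are illustrative assumptions only.

def quality_score(answers: int, upvotes: int, word_count: int) -> float:
    """Naive quality score: answers weigh most, long content caps out."""
    return answers * 2.0 + upvotes * 0.5 + min(word_count, 1000) / 200.0


def robots_meta(score: float, threshold: float = 10.0) -> str:
    """Emit the robots meta tag for a page based on its score."""
    directive = "index,follow" if score >= threshold else "noindex,follow"
    return f'<meta name="robots" content="{directive}">'


# A thin page stays out of the index until it matures:
print(robots_meta(quality_score(answers=0, upvotes=2, word_count=200)))
# A well-developed page gets indexed:
print(robots_meta(quality_score(answers=5, upvotes=20, word_count=1200)))
```

Note the poster's own caveat applies here too: flipping a page from noindex to index doesn't remove already-indexed noindexed URLs instantly; Google only drops them after recrawling.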
Then he pinged John Mueller of Google about it, and John's response on Twitter was interesting. He asked, "If it's flat for a while, look at the pages with traffic: do they change? do the new ones show up?"
The response is interesting.
Look at the pages that have traffic and look to see if they change. Look if your new pages are showing up in Google and have traffic.
Makes you wonder...
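In practical terms, John's diagnostic could be sketched as a diff between two periods of top-pages data (for example, two Search Console exports). This is my own illustration of the idea, not anything John or Google describes:

```python
# Illustrative diff of two months of pages-with-traffic reports, following
# John's advice: do the pages with traffic change, and do new ones show up?

def traffic_diff(last_month: set, this_month: set) -> dict:
    """Compare two sets of URLs that received search traffic."""
    return {
        "new": this_month - last_month,      # new pages earning traffic
        "dropped": last_month - this_month,  # pages that stopped earning it
        "stable": last_month & this_month,   # overlap; all-stable = flat site
    }


last = {"/topics/a", "/topics/b", "/topics/c"}
this = {"/topics/b", "/topics/c", "/topics/d"}
print(traffic_diff(last, this))
```

If "new" stays empty month after month while the site adds content daily, that points at the new pages not ranking - which may be exactly what John's questions were probing for.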