Sensible SEO
 
Rescooped by Norman Pongracz from Link Building and Linkers
onto Sensible SEO

How to systemise your link building process (without compromising quality) | Skyrocket SEO


Via wayneb77
Norman Pongracz's insight:

I am in favour of a more flexible approach; however, this guide can help a lot in understanding the goals of the link building process and the factors one should consider.

Norman Pongracz's comment, January 15, 2013 5:17 AM
Thanks. Even though I am in favour of a more flexible approach, I like the underlying idea of systematic link building. I think it can help to see through an otherwise confusing process, so I re-scooped it.
Sensible SEO
Technical SEO and Link Building
Scooped by Norman Pongracz

Understanding web pages better

Norman Pongracz's insight:

Google indexing JavaScript: Implications and Risks

 

The “Googlebot” finally has the ability to interpret JavaScript, the last remaining core construct used to create and manipulate content on web pages (HTML and CSS being the other two).

 

Implications and potential risks with solutions:

 

Better Flow of Link Juice

Entire navigation menus are sometimes fully reliant on JavaScript. The ability to parse these links will result in better “link juice” distribution.
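For illustration, here is a minimal, hypothetical sketch (URLs and labels invented) of a navigation menu that exists only after script execution – exactly the kind of links Googlebot could not previously follow:

<ul id="main-nav"></ul>
<script>
// Hypothetical menu data: these links only exist once the script has run.
var sections = [
  {href: '/products/', label: 'Products'},
  {href: '/blog/', label: 'Blog'}
];
var nav = document.getElementById('main-nav');
for (var i = 0; i < sections.length; i++) {
  var item = document.createElement('li');
  item.innerHTML = '<a href="' + sections[i].href + '">' + sections[i].label + '</a>';
  nav.appendChild(item);
}
</script>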

 

Poor Load Times

The use of excessive JavaScript is rampant, and a browser often has to make a significant number of additional requests and spend time downloading that JavaScript. Now that Googlebot has to do this too, many sites' load times in the eyes of Google are likely to increase. To see if you're affected, log in to Google Webmaster Tools and check your "Crawl Stats" graph over the past few months. Also, if your web server is unable to handle the volume of crawl requests for resources, it may have a negative impact on Google's capability to render your pages. If you'd like to ensure that your pages can be rendered by Google, make sure your servers are able to handle crawl requests for resources.

 

Blocking

If resources like JavaScript or CSS in separate files are blocked (say, with robots.txt) so that Googlebot can't retrieve them, Google's indexing systems won't be able to see your site the way an average user does. Google recommends allowing Googlebot to retrieve JavaScript and CSS so that your content can be indexed better.

 

Graceful Degradation

It's always a good idea to have your site degrade gracefully. This helps users enjoy your content even if their browser doesn't have a compatible JavaScript implementation. It also helps visitors with JavaScript disabled, as well as search engines that can't execute JavaScript yet.
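As a hedged example of graceful degradation (the URL and widget are hypothetical), critical links rendered by JavaScript can be mirrored in a noscript block so visitors and crawlers without JavaScript still reach the content:

<div id="recommendations"></div>
<script>
// Enhanced experience: the widget is filled in by JavaScript.
document.getElementById('recommendations').innerHTML =
  '<a href="/guides/getting-started/">Getting started guide</a>';
</script>
<noscript>
  <!-- Fallback for visitors and crawlers that cannot execute JavaScript -->
  <a href="/guides/getting-started/">Getting started guide</a>
</noscript>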

 

Content indexation

Some JavaScript removes content from the page rather than adding it, which prevents Google from indexing that content.

 

Additional resource: http://www.business2community.com/seo/googles-crawler-now-understands-javascript-mean-0898263

Scooped by Norman Pongracz

Top 20 SEO requirements for scoping your eCommerce platform

Having spent the last 6 years Client side as Head of eCommerce and agency side managing digital marketing teams, one constant has been confusion in new platform builds over what a “search engine friendly” website actually is.
Norman Pongracz's insight:

SEO requirements for setting up new eCommerce domain

 

Accessibility and Navigation
The key content is still visible to search engine spiders/bots as well as to visitors when elements like JavaScript are disabled
The site can be used in different browsers and devices
XML sitemap is generated dynamically and submitted on a regular basis
HTML sitemap is auto-generated based on the product catalogue and site structure
Robots.txt file is provided
Rich snippets are supported within platform
Custom 404 error page and automated report to flag error pages
Flat information architecture
Thin content pages are blocked from crawling
Site search is blocked from crawling
Flash objects are search engine friendly
PDF content is readable
Page load time to meet agreed threshold

 

Canonicalisation
301 redirects from legacy pages, non-canonical URLs and orphaned pages (non-www or missing trailing slash) to preserve search engine rankings
Canonical tag used to avoid duplicate content
Hreflang directives are set up for pages that have versions in different languages and regions (see the example below)
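As a rough sketch of the two requirements above (all URLs hypothetical), the relevant head markup might look like this:

<head>
<!-- One canonical URL per page, regardless of tracking parameters or session IDs -->
<link rel="canonical" href="http://www.example.com/dresses/red-dress/" />
<!-- Language/region alternates of the same page -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/dresses/red-dress/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/dresses/red-dress/" />
</head>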

 

URLs
Dynamically generated search-engine-friendly URLs for product and content pages
Ability to specify / edit URLs for individual pages via CMS for campaign landing pages and microsites

 

On-page Elements
Keyword-optimised heading tags within the HTML – structured use of h1 to h6 to provide a relevant hierarchy of content (see the sketch after this list)
Core provision for meta content (title, description, keywords) that is auto-generated
Images have appropriate ALT attributes
Title tag is unique for each page and includes target keywords
An h1 containing the keyword can be found on each page
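A minimal sketch of these on-page elements together (all values hypothetical):

<head>
<title>Red Dresses | Example Store</title> <!-- unique, keyword-bearing title -->
<meta name="description" content="Shop our range of red dresses with free delivery." />
</head>
<body>
<h1>Red Dresses</h1> <!-- one keyword-bearing h1 per page -->
<h2>Evening Dresses</h2> <!-- h2-h6 provide the content hierarchy -->
<img src="/images/red-dress.jpg" alt="Knee-length red evening dress" />
</body>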

 

Content
Major headings are clear & descriptive
Critical content is above the fold
Font size/spacing is easy to read
Clear path to company information and contact information
Main navigation is easily identifiable
Navigation labels are clear & concise
Number of buttons/links is reasonable
Company logo is linked to the home page
Links are consistent & easy to identify
Site search is easy to access
Provide text alternatives for all non-text content
For all non-text content that is used to convey information, text alternatives identify the non-text content and convey the same information. For multimedia, provide a text-alternative that identifies the multimedia.
For non-text content that is intended to create a specific sensory experience, text alternatives at least identify the non-text content with a descriptive label (for instance: colour guide).
Captions are provided for pre-recorded multimedia.
HTML page titles are explanatory
Social media content such as blogs are hosted on your primary website domain

 

Visitor tracking
Visitor analytics (such as Google Analytics or Omniture) is implemented
Google & Bing Webmaster Tools accounts set up
Event and Goal Tracking set up
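For reference, this is the standard Google Analytics (analytics.js) page tag from Google's documentation; only the UA-XXXXX-Y property ID is a placeholder:

<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-XXXXX-Y', 'auto'); // placeholder property ID
ga('send', 'pageview');
</script>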

 

References:

 

Main article:
https://econsultancy.com/blog/5525-top-20-seo-requirements-for-any-ecommerce-platform

 

Supporting articles:
http://searchengineland.com/seo-checklist-for-startup-websites-170965
https://docs.google.com/file/d/0B31KfhEE-e3oZGI5NmI1NGUtZmZlZC00YTVlLTg4MWQtNTVkMTAyMzZkNjEy/edit?hl=en
http://www.w3.org/TR/2005/WD-WCAG20-20050630/checklist
http://www.usereffect.com/topic/25-point-website-usability-checklist
http://moz.com/blog/launching-a-new-website-18-steps
http://www.newmediacampaigns.com/page/the-8-minimum-requirements-for-seo-features-in-a-cms

Scooped by Norman Pongracz

15 Great Citation Resources for Local Search

For many SMBs & SEOs that are new to local search, understanding citations and what’s important about them can be a bit mystifying. On the surface, local directory listings seem as plain as day – how complex can business listings on a website be right?! But once you start getting drawn into the murky world
Norman Pongracz's insight:

Citation Building Basics (Summary)

 

In simple SEO terms a Local Citation is simply where your company is mentioned on other websites and places found on the Internet. Local citations are used heavily in helping you to rank in local search results.
An example of a citation could be a business directory such as Yell, Thomson Local or Brown Book where your company is mentioned explicitly by name. Local citations do not need to include a link to your site. A citation could also be anywhere your company is mentioned, cited, referenced or spoken about on other local websites.

Citation sources come in 6 main shapes & sizes (see below). Some are specific to an industry or city, while some are much broader in scope and provide listings for all types of business in all towns across the country. As long as the site has some relevance to your business (e.g. offers correct category to list or covers same geographic location) and is decent in quality then it’s a goer.

-Local directories
-Niche or vertical directories
-General directories
-Event sites
-Social platforms
-Local news & blog sites

How can I find out where I’m already listed?
-CitationTracker (by BrightLocal)
-CitationFinder (by WhiteSpark)

Knowing where you’re listed gives you ½ the picture. To really bring your citation situation into focus you need to know what your business data looks like on these sites.
-Do they have your business name stored correctly?
-Do they have your exact address & zipcode?
-Are they using the right local number for your business?

Tools:
-Yext Local listings scan tool
-Brightlocal SEO Check Up
-UBL Visibility Tool

Where else can I get myself a listing? -The best way to work this out is to spy on your competitors and see where they’re listed. If your competitors can get a listing on a site it follows that you should – in most cases – also be able to get a listing. The same 2 tools that help you find your existing citations (CitationTracker & CitationFinder) can also be used to spy on your competitors.

What category should I use for my business? - Selecting the right category/categories to list your business on aggregator & citation sites is very important. But identifying & selecting the right category can be tricky for some businesses. Tools: https://moz.com/local/categories

How long does it take for listings to go live? If you submit listings manually, direct to sites then the speed of go live tends to be much faster than if you submit via a 3rd party or aggregator service. We typically see 70% of our direct submissions go live within 4 weeks of submission, with many going live instantly or in 48-72 hours.

Most important UK Citation Sources:
192.com
AccessPlace.com
AgentLocal.co.uk
ApprovedBusiness.co.uk
BizWiki.co.uk
Britaine.co.uk
Brownbook.net
BTLinks.com
Business.Unbiased.co.uk
BusinessNetwork.co.uk
City-Listings.co.uk
City-Visitor.co.uk
CityLocal.co.uk
CompaniesintheUK.co.uk
Cylex-UK.co.uk
Directory.TheSun.co.uk
FindtheBest.co.uk
ForLocations.co.uk
Foursquare.com
FreeBD.co.uk
FreeIndex.co.uk
Fyple.co.uk
GoMy.co.uk
HotFrog.co.uk
InfoServe.co.uk
It2.biz
Listz.co.uk
LocalDataCompany.com
LocalDataSearch.com
LocalLife.co.uk
LocalMole.co.uk
LocalSecrets.com
LocaTrade.com
Manta.com
MarketLocation.com
MisterWhat.co.uk
MiQuando.com
My118Information.co.uk
MySheriff.co.uk
MyLocalServices.co.uk
Near.co.uk
Opendi.co.uk
Qype.co.uk
Recommendedin.co.uk
Scoot.co.uk
SmileLocal.com
TheBestof.co.uk
TheBusinessPages.co.uk
TheDirectTree.com
TheDiscDirectory.co.uk
ThomsonLocal.com
Tipped.co.uk
TouchLocal.com
UFindUs.com
UK.Uhuw.com
UK.WowCity.com
UK-Local-Search.co.uk
UK-Locate.co.uk
UKSmallBusinessDirectory.co.uk
VivaStreet.co.uk
Wampit.com
WeLoveLocal.co.uk
WheresBest.co.uk
WhoseView.co.uk
Yalwa.co.uk
Yell.com
Yelp.co.uk (verification required only for claiming listing)
Zettai.net

Reference:

Main article:
http://www.brightlocal.com/2014/05/21/15-great-citation-resources-local-search/#learn

Other articles:
http://www.localvisibilitysystem.com/definitive-local-search-citations/#uk
http://www.hallaminternet.com/2012/what-is-a-citation/

Scooped by Norman Pongracz

Google Analytics Troubleshooting Guide & Auditing Resources - MarketingVOX

Norman Pongracz's insight:

Debugging Google Analytics Setup - Common Mistakes (Summary)

 

GA 101: accounts, trackers, domains:
1) the tracking code is in your website's HTML source code,
2) you are using the right tracking code,
3) you are checking the right GA account in the application's settings,
4) GA is acknowledging that it is receiving data for that account (for testing this, see: http://www.webanalyticsworld.net/2012/02/debugging-google-analytics-code-ii-a-tutorial-video-on-fiddler%E2%80%99s-inspector-and-autoresponder-functions.html), and
5) there are no "rogue" sites using your UA code out there. (If someone puts your Google Analytics tracking code on their site (the same UA-#), visits to their site will show up in your Google Analytics profiles - for more details read: http://www.blastam.com/blog/index.php/2011/06/are-rogue-sites-influencing-your-google-analytics-data/)

Goals, funnels, and filters
If your goals are not being tracked (i.e. GA never reports any match), make sure your URLs are an exact match, or double-check your regular expressions, depending on how the goal rules have been written. Exact match is easier to use but less flexible and more brittle if you are going to change URLs or want to match a whole family of similar URLs (for example, a regex goal such as ^/order/thank-you could cover every variant of a confirmation URL).
Badly set up funnels can show incoherent data such as everyone leaving after an early step while your goal conversion does show conversion events further down the funnel. See http://www.lunametrics.com/blog/2008/06/25/funnel-problems-google-analytics/
When running filters, check you understand their syntax. If you are using several filters, bear in mind they are executed one after the other and "feed" into each other. Using more than one Include filter can lead to data loss and should be done with caution.

Campaign tracking
If you have already made sure your various traffic generation efforts (e.g. in email newsletters) embed the right URL parameters, the next step is to verify that the redirects work properly. In larger teams it is advisable to use a centralised online document or spreadsheet to keep track of normalised campaign parameters. See https://developers.google.com/analytics/resources/articles/gaTrackingTroubleshooting?csw=1#campaignTrackingIssues

Site search, site overlay, site speed
Some GA features do not work if you rewrite URLs, so if you want to use site search or site overlay, make sure to use a separate profile from the one where you generate readable "fake" URLs. If your CMS allows it or if you can run custom server-side code (e.g. in PHP) you can also make GA believe you are using a search query parameter even if you are not.
http://www.lunametrics.com/blog/2010/08/19/site-search-without-query-parameters/
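A hedged sketch of that trick using the classic ga.js API (the /search path and q parameter are invented; GA's site search settings would then be pointed at q):

<script>
// Assumes the standard asynchronous ga.js snippet has already defined _gaq.
// Report a virtual URL containing a search parameter, even though the real URL has none.
var searchTerm = 'red dress'; // hypothetical: extracted from the rewritten URL by your own code
_gaq.push(['_trackPageview', '/search?q=' + encodeURIComponent(searchTerm)]);
</script>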

Asynchronous tracking, ecommerce and custom variables
While CMS upgrades and migrations are a leading cause of analytics problems on accounts that used to work, moving to the newer async GA code has to be done with similar caution. Among specific problems with the latest generation of GA scripting:
-Stick to the exact spelling and casing of method names (e.g. _trackPageview) - they are case sensitive.
-Be careful to not have leading or trailing whitespace when you're pushing the tracking code.
-Pass along strings within quotes, but do not otherwise use quotes for other value types such as booleans.
When coming from the older (synchronous) syntax, make sure you have converted everything, including the ecommerce integration. Speaking of which, check that you do not have improperly escaped special characters or apostrophes getting in the way.
If you are using custom variables, verify that you are following Google's guidelines. Mixing page, session and visitor-level variables in the same slot is not recommended. Migration from the deprecated _setVar method to _setCustomVar should be done carefully. And while the dreaded "%20" bug was finally fixed in May 2011, it means existing filters need to be rewritten. Also, know that custom variables typically take longer to appear in GA than "regular" data – it can take up to 48 hours on very large sites.
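To make the casing and quoting rules concrete, here is a hedged example of well-formed asynchronous calls (account ID, names and values are placeholders):

<script>
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']); // method names are case-sensitive: _setAccount, not _setaccount
_gaq.push(['_setCustomVar',
  1,         // slot (1-5); do not mix scopes within one slot
  'Section', // name: strings are quoted
  'Blog',    // value: strings are quoted
  3          // scope 3 = page-level; numbers are not quoted
]);
_gaq.push(['_trackPageview']); // no leading or trailing whitespace in the method name
</script>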

Auditing and support tools
With a mix of a JavaScript console and an HTTP live-header sniffer, it is possible to instantly know what's going on with your code. To that end, the Google Analytics Tracking Code Debugger is a very convenient Chrome browser extension.

 

Reference:

 

Main article:
http://www.marketingvox.com/google-analytics-troubleshoot-tools-049532/

 

Other articles:
http://www.blastam.com/blog/index.php/2011/06/are-rogue-sites-influencing-your-google-analytics-data/
http://psgrep.com/
http://www.lunametrics.com/blog/2008/06/25/funnel-problems-google-analytics/
http://www.cardinalpath.com/the-math-behind-web-analytics-mean-trend-min-max-standard-deviation/
http://www.webanalyticsworld.net/2012/02/debugging-google-analytics-code-ii-a-tutorial-video-on-fiddler%E2%80%99s-inspector-and-autoresponder-functions.html

Scooped by Norman Pongracz

There's no place like home (page)

Your home page is one of the most visited pages on your website. Few people will visit your site without seeing it. But a lot of home pages suck. Read this, and make sure yours doesn’t.
Norman Pongracz's insight:

Homepage Optimisation (Summary)

 

-Show whatever it is you’re selling - sounds obvious but often sites don’t show off their products. The first thing users should see is whatever you’re selling. It should be big, bold and beautiful.
-If you sell a service rather than a physical product (see MailChimp), try to encapsulate what you do in the simplest, shortest way you can. If you're a lawyer, you could say 'No-nonsense legal advice'; an accountant, 'Tax returns the easy way!'
-Keep it short, punchy and use real language. And try to include (where appropriate) words of quality like ‘easy’, ‘simple’, ‘discover’, ‘free’ as these are the words that people tend to respond to. Visitors then know what they’re getting and if they’re interested, they’ll stick around.
-Use really eye-catching images. For instance, ASOS draws your eyes in so you can't help but look at 'SHOP MEN' and 'SHOP WOMEN'.
-The key here is that all your products need to be “above the fold”. That is to say, you shouldn’t need to scroll down to see them.
-Make the next step clear - make sure it’s blindingly obvious. It’s sometimes difficult to summarize what your business does as succinctly as this. But try, because it will make your website much more compelling. Retailers can’t use the home page to sell individual products (that’s what product pages are for), but they can, should and do, sell themselves. (see: ASOS)
-Use a disruptive homepage design! A very linear, blocky site, where everything is aligned and there are lots of right angles, might be clear and even, but it’s very difficult to make anything stand out. This means users’ attention won’t be channelled towards your ‘call to action’. Use elements that break up alignments and hierarchies, and push your users’ eyes to where you want them to go. The more something sticks out, the more people will click on it.
-Testify! As a webmaster, you need to do all you can to reassure your customers that you’re trustworthy. One of the easiest ways is with testimonials, quotes from people who have used, and liked, your service. A lot of companies ask for feedback automatically after a purchase is complete.
-Videos. You don’t need to produce the next blockbuster, but you should turn your hand to making a video or two. Videos, as well as being great content that search engines love, reassure your customers.
-Remember the visual hierarchy - Some of the pages on your website will make you lots of money, others won’t. Navigation should always be easy and intuitive, but you can still nudge your users in the right direction. Don’t feel you have to treat all pages equally. You can make some pages easier to find than others.
-Consider making the social icons bigger to make your content easier to share. (And try to put the icons on the right-hand side of the page, because more people will click them.)

Scooped by Norman Pongracz

Ecommerce SEO Tips: User-Focused SEO Strategies For Deleted Products | Linchpin SEO

Norman Pongracz's insight:

Options for handling changes in product pages

 

 

Redirect to the Deleted Product's Category Page - this should be the category one level up from the product page; if there are fewer than three products in that category, keep climbing the taxonomy until you reach a category with at least three products.
Once this category is defined, 301 redirect the old product page to it.
Pros:
-Good for seasonal products
-Pushes ranking value into the category page.
-Allows the ranking value to be split between remaining products in that category
-Gives users the ability to find other relevant products that could fit their needs
-Lowers the risk of users going back to the search results page and visiting a competing website
-Once the search engine re-crawls the page and finds the 301 redirect, the product will be removed from the search engine's index.
Cons:
-Possible user confusion. This risk can be mitigated by serving a small JavaScript overlay on the category page (which must not interfere with the search engine's ability to crawl the category page) explaining that the previous item is not available, but that the products shown might be helpful.

A 301 Redirect to an Internal Search Results Page, coupled with a noindex/follow meta tag on that page - whenever a user arrives via an external link, return a search results page that includes similar products.
Pros:
-Keeps users engaged with the website.
-Using the noindex/follow meta tag allows ranking metrics to flow through the internal links in the search results set, but keeps the search results page out of the Google index.
-Allows for discovery of similar products.
-Once the search engine re-crawls the page and finds the 301 redirect, the product will be removed from the search engine's index.
Cons:
-Possible user confusion. This risk can be mitigated by serving a small JavaScript overlay on the page (which must not interfere with the search engine's ability to crawl the page) explaining that the previous item is not available, but that the products shown might be helpful.
-Leaves the product selection up to the user. Thus, the website owner can't control the outcome of the user journey or directly match/recommend a single product that best matches their intent.

Manually Redirect to a Similar Product - Manually create a 301 mapping by selecting a similar product or page from the remaining product set, so that whenever a user clicks on an external link – from the search results, a bookmark, a social website, or a link on another website – they are taken to the new page. Create an environment that allows the deleted item to be redirected to this newly identified page.
Pros:
-Ability to easily match relevancy based on user need.
-Ability to redirect to a similar product that has a high conversion rate – or even a new product that is highly relevant to the deleted product.
-Keeps users engaged within the website.
-Allows for direct flow of ranking and social metrics from one product to another.
-Once the search engine re-crawls the page and finds the 301 redirect, the product will be removed from the search engine's index.
Cons:
-Possible user confusion. This risk can be mitigated by serving a small JavaScript overlay on the page (which must not interfere with the search engine's ability to crawl the page) explaining that the previous item is not available, but that the products shown might be helpful.
-This is done manually and can be time-consuming for large ecommerce websites.

Redirect Based on Relevancy Value - whenever a user clicks on an external link and it is detected that a 404 error would occur, dynamically run a back-end search using the title of the deleted product:
-If a product matches at a high enough relevancy (the threshold is defined based on the product set), 301 redirect the user directly to that product.
-If no product is relevant enough, send the user to a search results page with a group of related products.
Pros:
-This combination serves the best option to the user based on value and relevancy.
-Keeps users engaged within the website.
-Keeps ranking and social metrics flowing throughout the website.
-Once the search engine re-crawls the page and finds the 301 redirect, the product will be removed from the search engine's index.
Cons:
-Possible user confusion. This risk can be mitigated by serving a small JavaScript overlay on the page (which must not interfere with the search engine's ability to crawl the page) explaining that the previous item is not available, but that the products shown might be helpful.

Custom 404 Page - whenever a user clicks on an external link, serve a custom 404 page (a sketch follows below). This page should: inform the user the product is no longer available; provide related product selections; provide a search box so the user can search the website for other products.
Pros:
-Directly informs the user that the product is no longer available.
-Once the search engine re-crawls the page and finds the 404, the product will be removed from the search engine's index.
Cons:
-Loss of the ranking and social value that the deleted item had built.
-Higher risk that the user will hit the back button and go to a competitor who still has the product.
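A minimal sketch of such a page (product names and URLs hypothetical); note that it should be served with a genuine 404 status code, not a redirect:

<!-- Served with an HTTP 404 status code -->
<h1>Sorry – this product is no longer available</h1>
<p>These similar products might help:</p>
<ul>
<li><a href="/products/widget-v2/">Widget v2</a></li>
<li><a href="/products/widget-mini/">Widget Mini</a></li>
</ul>
<form action="/search" method="get">
<input type="text" name="q" placeholder="Search our products" />
<input type="submit" value="Search" />
</form>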

Permanently delete the expired product's pages, content and URLs. When you have no products closely related to the one that's expired, you may choose to delete the page completely using a 410 (Gone) status code, which notifies Google that the page has been permanently removed and will never return.

Reuse URLs. If you sell generic products where technical specifications and model numbers are not relevant, you could reuse your URLs. That way you will preserve the page’s authority and increase your chances of ranking on Google.

Some items deserve to live on. Certain products may have informational value for existing customers or others wanting to research it. Leave these pages intact. Previous buyers can get information, help and service through these pages.

In case of out-of-stock items
-Leave the pages up. If the items will be in stock later, leave pages up just the way they are. Don’t delete, hide or replace them. Don’t add another product to them or redirect visitors to other pages.
-Inform users when it will return. Always offer an expected date when the product will be back in stock so visitors will know when to come back and buy.
-Offer to backorder the product. Let them order and promise to have it sent out to them as soon as fresh supplies arrive. Prospective buyers who really want the product won’t mind waiting a few extra days for it.

References: 
http://www.linchpinseo.com/ecommerce-seo-tips-user-focused-seo-strategies-for-deleted-products
http://searchengineland.com/best-practices-in-e-commerce-seo-176921
http://moz.com/blog/how-should-you-handle-expired-content

Scooped by Norman Pongracz

Page Title & Meta Description By Pixel Width In SERP Snippet | Screaming Frog

Norman Pongracz's insight:

For page titles, Google now uses 18px Arial for the title element; previously it was 16px. Interestingly, Google is still internally truncating based on 16px, but the CSS truncation kicks in well before their ellipsis is shown, due to the larger font size. The upshot of this change is that text is no longer truncated at word boundaries. Bolding increases the pixel width of the text. We also see Google moving brand phrases to the start of a title dynamically.

For meta descriptions, the CSS truncation appears to occur at around 920 pixels.

Scooped by Norman Pongracz

How to Remove a Manual Penalty | World of Search

This guide explains, in depth, how to get over a manual penalty for inbound links and uses: This Excel template DISCLAIMER: There are undoubtedly faster and shorter processes to audit your links and submit a reconsideration request. Taking shortcuts like that may work, or it may not. This process is the full-blown, no cut corners …
Norman Pongracz's insight:
Summary

First Steps / Basic Analysis at Pitch Level
-Did the client receive any link warning message in Google Webmaster Tools?
-Did the client experience any sharp decline in visibility via Search Metrics? (Penguin and manual penalties tend to show as a sharp decline in rankings, compared to the slow downfall of a Panda penalty.)
-Is it possible that the decline was caused by competitors' sites getting their penalties removed and regaining their rankings?
-Did the client experience any sharp decline in traffic via GA? Does this correspond with the decline in visibility?
-Is it possible that the decline in traffic was caused by tracking or some issue other than a penalty?
-Did the company lose its rankings for brand terms (Google search "Brand term")?
-Has any link removal been done beforehand? Is there any documentation of it?
-Does the client have any white/safe-list of links/domains?
-Does the domain interlink with any other domains (such as Debenhams.com and Debenhams.ie)? Did they implement hreflang or any other solution to avoid looking spammy?

What Is a "Bad" Link?

Basically a “bad” link in Google’s eyes is anything that isn’t editorial – any link that you created for the purpose of SEO.

If someone created a random link to your website on some unrelated forum, that might be a link that we consider not great from an SEO perspective, but from a penalty perspective there’s theoretically nothing wrong with it.

However, if you discover that you have many backlinks from low quality and unrelated domains, they may be worth removing – even if you didn't make them. Look for patterns: one link from a spam directory will not result in a penalty, but dozens of links from spam directories might.

In addition, Google seems to give the most weight, when it comes to penalties, to your most recent links (as opposed to links you made five years ago). Review your latest links to see if there is any suspicious recent activity.

Link Metrics Analysis

Link Research Tools Detox Analysis

-What risk did LRT assign to the domain?
-Did you classify at least 80% of keywords?
-What is the risk distribution?
-Does the anchor text distribution look natural?

LRT is not that useful for determining the site's spam backlinks, but it provides an excellent benchmark for understanding toxicity.

Backlink Data Collection

-Majestic (Historic index preferably)
-OSE
-aHrefs
-Majestic and/or OSE API
-Upload the list of links to LRT

Spreadsheet analysis

The best practice is to review each domain (or a linking page/domain) manually, but many times this isn't possible. So here are some shortcuts.

-Are any of the links appearing in previous disavow files?
-Do they have a white list of links?
-What are the top referring domains? (Worth reviewing the top domains manually.)
-Do links have suspicious domain names and/or URL paths? Spam links tend to have at least one of the following words in their URLs: SEO, link, directory (or often: dir), submit, web, site, search, Alexa, moz, domain, list, engine, bookmark, rank etc.
-Spam domains tend to have unusually long URL names (example: best-shoes-for-wedding.wordpress.com) or marketing-sounding path names and titles (example: importance-of-selling-your-stuff-online).
-Spam domains tend to have low citation and trust flow metrics (under 15).
-Spam domains tend to have a low number of backlinks.
-Spam domains were often created within a short period of time – i.e. they have a low domain age.
-Do the referring domains all have unique IP addresses? Do some of the domains look like a link network?
-Are press releases and sponsored posts, as well as guest blog posts, no-followed?
-Review for high quality directories that can be whitelisted (DMOZ, Yahoo Directory, Yell etc.).
Scooped by Norman Pongracz

Case Study: The Impact of HrefLang Tag | SEER Interactive

A client came to us a few months ago wanting to improve their organic presence in Google.com (US) for their sub-domain (us.example.com). Their main domain
Norman Pongracz's insight:

Case Study and Examples of Hreflang Implementation:

 

 

1) Different URLs with the same language

Google UK:

<link rel="alternate" hreflang="en" href="http://www.example.com"/>
(in this example the .com domain is targeted at Google UK)

<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk"/>
(for the .co.uk site)

Google US:
<link rel="alternate" hreflang="en-us" href="http://us.example.com"/>

Google Australia:
<link rel="alternate" hreflang="en-au" href="http://www.example.com/au/"/>

 

2) ccTLDs

Google France: <link rel="alternate" hreflang="fr" href="http://www.example.fr/"/>

Google Germany: <link rel="alternate" hreflang="de" href="http://www.example.de/"/>

 

3) Subfolders

Google Italy: <link rel="alternate" hreflang="it" href="http://www.example.com/it/"/>

Google Spain: <link rel="alternate" hreflang="es" href="http://www.example.com/es/"/>

 

 

 
Scooped by Norman Pongracz

Help Google serve the correct language or regional URL - Webmaster Tools Help

rel="alternate" hreflang="x"Many websites serve users from around the world, with content that's translated or targeted to users in a certain region. Google uses the rel="alternate" hreflang=
Norman Pongracz's insight:

Help Google serve the correct language or regional URL

Google uses the rel="alternate" hreflang="x" annotations to serve the correct language or regional URL to searchers.

Using language annotations - Imagine you have an English language page hosted at http://www.example.com/, with a Spanish alternative at http://es.example.com/. You can indicate to Google that the Spanish URL is the Spanish-language equivalent of the English page in one of three ways:
HTML link element in header. In the HTML <head> section of http://www.example.com/, add a link element pointing to the Spanish version of that webpage at http://es.example.com/, like this:
<link rel="alternate" hreflang="es" href="http://es.example.com/"; />
HTTP header. If you publish non-HTML files (like PDFs), you can use an HTTP header to indicate a different language version of a URL: Link: <http://es.example.com/>; rel="alternate"; hreflang="es"
Sitemap. Instead of using markup, you can submit language version information in a Sitemap.

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
xmlns:xhtml="http://www.w3.org/1999/xhtml">

<url>
<loc>http://www.example.com/english/</loc>
<xhtml:link
rel="alternate"
hreflang="de"
href="http://www.example.com/deutsch/"
/>
<xhtml:link
rel="alternate"
hreflang="de-ch"
href="http://www.example.com/schweiz-deutsch/"
/>
<xhtml:link
rel="alternate"
hreflang="en"
href="http://www.example.com/english/"
/>
</url>

<url>
<loc>http://www.example.com/deutsch/</loc>
<xhtml:link
rel="alternate"
hreflang="en"
href="http://www.example.com/english/"
/>
<xhtml:link
rel="alternate"
hreflang="de-ch"
href="http://www.example.com/schweiz-deutsch/"
/>
<xhtml:link
rel="alternate"
hreflang="de"
href="http://www.example.com/deutsch/"
/>
</url>

<url>
<loc>http://www.example.com/schweiz-deutsch/</loc>
<xhtml:link
rel="alternate"
hreflang="de"
href="http://www.example.com/deutsch/"
/>
<xhtml:link
rel="alternate"
hreflang="en"
href="http://www.example.com/english/"
/>
<xhtml:link
rel="alternate"
hreflang="de-ch"
href="http://www.example.com/schweiz-deutsch/"
/>
</url>

</urlset>
Be sure to specify the xhtml namespace as follows: xmlns:xhtml="http://www.w3.org/1999/xhtml"
You must create a separate url element for each URL. Each url element must include a loc tag indicating the page's URL, and an xhtml:link rel="alternate" hreflang="XX" subelement for every alternate version of the page, including itself.
This example uses the language code de for the URL targeted at German speakers anywhere, and the language-locale code de-ch for German speakers in Switzerland. If you have several alternate URLs targeted at users with the same language but in different locales, it's a good idea to provide a URL for geographically unspecified users. For example, you may have specific URLs for English speakers in Ireland (en-ie), Canada (en-ca), and Australia (en-au), but want all other English speakers to see your generic English (en) page. In this case you should specify the generic English-language (en) page for searchers in, say, the UK.

If you have multiple language versions of a URL, each language page must identify all language versions, including itself. For example, if your site provides content in French, English, and Spanish, the Spanish version must include a rel="alternate" hreflang="x" link for itself in addition to links to the French and English versions. Similarly, the English and French versions must each include the same references to the French, English, and Spanish versions.

It's a good idea to provide a generic URL for geographically unspecified users if you have several alternate URLs targeted at users with the same language, but in different locales.
You can annotate this cluster of pages using a Sitemap file or using HTML link tags like this:
<link rel="alternate" href="http://example.com/en-ie"; hreflang="en-ie" />
<link rel="alternate" href="http://example.com/en-ca"; hreflang="en-ca" />
<link rel="alternate" href="http://example.com/en-au"; hreflang="en-au" />
<link rel="alternate" href="http://example.com/en"; hreflang="en" />

For language/country selectors or auto-redirecting homepages, you should add an annotation for the hreflang value "x-default" as well:
<link rel="alternate" href="http://example.com/"; hreflang="x-default" />
Update the HTML of each URL in the set by adding a set of rel="alternate" hreflang="x" link elements. For the default page that doesn’t target any specific language or locale, add rel="alternate" hreflang="x-default"
<link rel="alternate" hreflang="x-default" href="http://www.example.com/"; />
<link rel="alternate" hreflang="en-gb" href="http://en-gb.example.com/page.html"; />
<link rel="alternate" hreflang="en-us" href="http://en-us.example.com/page.html"; />
<link rel="alternate" hreflang="en" href="http://en.example.com/page.html"; />
<link rel="alternate" hreflang="de" href="http://de.example.com/seite.html"; />

Scooped by Norman Pongracz

How to Get Clean Links From Affiliate Bloggers | SEER Interactive

Think you could never get a clean link directly to your site from a blogger DESTINED to add an affiliate tag? Think again!
Norman Pongracz's insight:

Getting links through scanning affiliate blogs posts

Scooped by Norman Pongracz

Google kills links in bios to drive authors to Google+

Back in 2011, Google CEO Larry Page sent a memo to all his employees that their bonuses depend on the success of Google’s social efforts.
Norman Pongracz's insight:

According to Google, you no longer deserve a link just for writing a post – at least not a link that Google will use to determine rankings. As Google begins discounting author bio links, it will become clear that the only way to use your status as an expert author to improve your site's position in the SERPs is to have Google+ authorship implemented; and if you want to increase your authority in any subject area, Google+ is the only thing that will count in the algorithm.

Scooped by Norman Pongracz

SEO Ryan Gosling

Building Links and Breaking Hearts
Norman Pongracz's insight:

Because you can never have enough of Ryan Gosling :)

Scooped by Norman Pongracz

Whitepaper: Taking on big competitors with local SEO | STAT Search Analytics

How do people really do local search? We looked at auto insurance in the USA, and found some surprising things about local SEO that folks in every industry should know.
Norman Pongracz's insight:

Case study on American auto insurance local queries by GetStat

 

The auto insurance industry was chosen because it offers localised products and services but isn't dependent on brick-and-mortar sales, and it includes one of the biggest American brands, Geico.

 

The study was based on syntax variants of 33 basic short-tail keywords related to auto insurance, each with search volume greater than 100 per month.

 

Findings

 

Query Structure Matters
As we expected, when you only look at search volume, the short-tail unmodified queries win hands-down. A strong majority of desktop searchers do not geo-modify (geo-modify: add location to query) queries for local services that are not tied to brick-and-mortar locations. Instead, they are using simple short-tail keywords like [auto insurance quotes] and letting Google geo-locate (show localised result for any keyword query) them.

 

Every Query is Local
Even if your sales aren't tied to brick-and-mortar locations, you still need to be looking at the local picture. The reality is that every search is now local. Google routinely modifies desktop and mobile search results based on location – even for queries that do not explicitly include geo-modifiers. That means if you're not tracking keywords on a local level, you're missing the majority of the competitive landscape.

 

Big players have weaknesses as well
Things can seem hopeless looking at the national picture. But by going deeper with localised ranking data and analysis, you'll find that even the biggest brands aren't dominating the SERPs in every local market. These are opportunities that you can identify and strategically exploit in any industry. And it's not just about large regions like states or provinces. You can dig down to the level of individual cities or even postal codes and ZIP codes to micro-target your SEO strategy.

Rescooped by Norman Pongracz from SEO Tips, Advice, Help

Google To Warn Searchers When A Mobile URL Redirects To The Homepage


Google alerted webmasters late yesterday that it will let smartphone searchers know if it thinks a website has a “faulty redirect” in place that sends the searcher to your home page, not the page they clicked on.


Via Bonnie Burns
Norman Pongracz's insight:

We’d like to spare users the frustration of landing on irrelevant pages and help webmasters fix the faulty redirects. Starting today in our English search results in the US, whenever we detect that smartphone users are redirected to a homepage instead of the page they asked for, we may note it below the result. If you still wish to proceed to the page, you can click “Try anyway.”


But Google’s not just warning searchers; there’s also help for webmasters. The “Crawl Errors” section of Webmaster Tools will offer specific information about faulty redirects affecting smartphone crawling.


Scooped by Norman Pongracz

Basics of Debugging Google Analytics Code: GA Chrome Debugger and other tools

An overview of debugging tracking code, with information on common debugging tools and a deeper look into Chrome GA Debugger
Norman Pongracz's insight:

Debugging GA with the GA Chrome Debugger and other tools, without code access (summary)

 

Common Debugging Tools
-Fiddler2 makes it easy to view every request to Google Analytics (and any other JavaScript-based web analytics tool). You can even try out changes to your tracking code on your live system before releasing them to everyone; the tool is browser-independent – and it is free!
-Web Analytics Solution Profiler (WASP)
-Charles Debugger
-Firebug/Chrome Developer Console https://www.youtube.com/watch?v=nOEw9iiopwI
The function I use most of the time, apart from the Console (where errors are logged), is the Network tab. It can tell us whether the tracking beacon has been sent to Google Analytics successfully. To find out, look for the __utm.gif request. If it displays a "200 OK" status code, you know that Google Analytics has received the current pageview or event. You can inspect what is inside that request in the "Headers" tab (the "Cheat Sheet" by Cardinal Path's Kent Clark helps with interpreting the values): http://www.cardinalpath.com/wp-content/uploads/ruga_cheat_sheet.pdf

Chrome GA Debugger / ga_debug.js
Google’s recommended debugging tool for Google Analytics is Chrome’s Add-On “GA Debugger”. It is basically a form of using the “ga_debug.js” script without having to alter your page’s code at all (if you use ga_debug.js, you will have to change ga.js into /u/ga_debug.js on every page you want to debug). Chrome GA Debugger is a nice and easy-to-use tool that logs every Pageview and Event that you send to Google Analytics in your Chrome Developer Console (right-click on any part of the page => “Inspect Element” → go to tab “Console”):
Chrome GA Debugger shows you in an easy-to-read format what is being sent to Google Analytics without having to understand or inspect cookie variables or the Network Tab of your Console. It gives you hints like:
-Does my visit have the correct source/medium/campaign?
-Are there pages that accidentally override those sources?
-Are there pages where conflicting JavaScript or other reasons hinder the Tracking Code from being executed?

Fiddler, the browser-independent HTTP debugger and manipulator
With Fiddler, you can even debug your iPhone apps or anything else that does not run through a classic browser.

https://www.youtube.com/watch?v=jlQYf1DiA3U
-The filters to capture only the requests you need (e.g. the Google/Adobe/Webtrends Analytics HTTP requests)
-The inspector tab where you can investigate all the request’s parameters under “Web Forms”
-The AutoResponder that allows you to “kill” specific files or replace one (JavaScript or other) file by another one on your computer or somewhere else

Rewrite the HTML with FiddlerScript
With the AutoResponder, you can easily have your browser load the file you want instead of the default one. So if the code you want to debug is in that specific analytics_code.js file, you just download that file, change it the way you think it could work, save it to your hard drive as “my_analytics_test_code.js”, and then tell the AutoResponder that whenever it encounters “analytics_code.js”, it shall replace that file by the test file on your PC.
But there are some cases where you really have to alter the very HTML code of the page, not simply replace an entire file. Examples when you need this are:
-You want to change a line (or more) of the tracking code inside of your HTML, e.g. add a Webtrends HTML meta tag or a custom variable for Adobe Analytics or Google Analytics
-You want to rewrite an inline event tracking call ("inline" means those "onclick" handlers that sit inside the link, e.g. <a onclick="yourcall">Link</a>. You should avoid them anyway, to keep JavaScript and HTML separate, so it is less likely that you have to resort to the method I am describing here)
-You want to test-drive another tool, for example a Tag Management System, a heatmap tool, some conversion tracking for your email marketing tool, or whatever else that requires you to insert code into the web page – which would usually mean working yourself through your development release cycle first (and wait months, ask for budget etc…)
-You want to check whether it is the Tag Management System’s fault that a tag is not working. I had this case recently with a tag in Google Tag Manager (GTM). When I inserted it “the traditional way” by rewriting the HTML via Fiddler, I saw that the tag fired correctly. I realized that it was one of those tags that have to load synchronously, but GTM can only fire tags asynchronously (see some more drawbacks of Google Tag Manager). After some more hours of re-coding the tag, I finally got it working via GTM as well.

Additional tools: the FiddlerScript add-on and the "Syntax-Highlighting Add-Ons" from http://www.telerik.com/fiddler/add-ons

...


References:

Main article:
http://www.webanalyticsworld.net/2012/01/basics-of-debugging-google-analytics-code-ga-chrome-debugger-and-other-tools.html
http://www.webanalyticsworld.net/2014/05/advanced-analytics-debugging-no-code-access-no-problem.html

Additional articles:
http://www.cardinalpath.com/wp-content/uploads/ruga_cheat_sheet.pdf

Scooped by Norman Pongracz

Homepage Sliders: Bad For SEO, Bad For Usability

One of the most prevalent design flaws in B2B websites is the use of carousels (or sliders) on the homepage. Carousels are an ineffective way to target user personas, which ends up hurting the site’s SEO and usability. In fact, at the recent Conversion Conference in Chicago, about 25% of the speakers mentioned carousels — […]
Norman Pongracz's insight:

Problem with B2B Homepage Sliders (Summary)

 

With B2B websites, carousels seem to only be used for one of three reasons: branding, thought leadership or product/service promotion.

Problems:

Alternating Headings - in many of the carousels, the headings in the slider were wrapped in an h1 tag. Basic SEO best practice states that there should be only one h1 tag per page, and it should appear before any other heading tag. The problem with using an h1 (or any heading tag) in the carousel is that every time the slide changes, the h1 changes. A page with five slides in the carousel will have five h1 tags, which greatly devalues the keyword relevance. A sketch of the fix follows below.
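One hedged fix (class names and copy hypothetical): keep a single static h1 for the page and demote the slide headings to non-heading markup:

<h1>Acme Industrial Pumps</h1> <!-- the page's single, static h1 -->
<div class="carousel">
  <div class="slide">
    <p class="slide-heading">New: high-pressure pump range</p>
  </div>
  <div class="slide">
    <p class="slide-heading">Case study: 40% energy savings</p>
  </div>
</div>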

SEO issues
-Flash Usage - a few of the websites serve up slider content using Flash. Avoiding Flash for serving content is SEO 101.
-Poor Performance - as with any website, the more you complicate and add things, the slower the page loads. A few sites feature full-width carousels packed with high-resolution images, which greatly impacts page load speed.
-Content Replacement - as stated earlier, carousels are used as an ineffective method of targeting user personas. Many websites take this to an extreme by using shallow content on the page.

Usability Issues
-Nobody Clicks On The Carousels
-Content Is Pushed Below The Fold
-The Megaphone Effect - When a user lands on a page, his or her attention is drawn to the carousel because it has revolving content, alternating text, colour changes, and all sorts of other attention-stealing features.
-Confusing Objectives - When a carousel is used, the user will assume the page talks about whatever heading is used in the carousel.

Scooped by Norman Pongracz

Official Google Webmaster Central Blog: Infinite scroll search-friendly recommendations

Norman Pongracz's insight:

Infinite scroll search-friendly recommendations

With infinite scroll, crawlers cannot always emulate manual user behaviour - like scrolling or clicking a button to load more items - so they don't always access all individual items in the feed or gallery. To make sure that search engines can crawl individual items linked from an infinite scroll page, your content management system should produce paginated series (component pages) to go along with your infinite scroll.
-Chunk your infinite-scroll page content into component pages that can be accessed when JavaScript is disabled.
-Determine how much content to include on each page while maintaining reasonable page load time.
-Divide content so that there’s no overlap between component pages in the series

Implement replaceState/pushState on the infinite scroll page (the decision to use one or both depends on your site's user behaviour) for the following (see the sketch after this list):
-Any user action that resembles a click or actively turning a page.
-To provide users with the ability to serially backup through the most recently paginated content.
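A minimal sketch of the pushState piece (the ?page= URL scheme is hypothetical): after each chunk of items loads, update the address bar so the user's position maps onto a crawlable component-page URL:

<script>
// Called after the next chunk of items has been appended to the infinite-scroll feed.
function onChunkLoaded(pageNumber) {
  if (window.history && window.history.pushState) {
    // Give each component page its own crawlable, shareable URL.
    window.history.pushState({page: pageNumber}, '', '?page=' + pageNumber);
  }
}
</script>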

Scooped by Norman Pongracz

Beginners Guide to Universal Analytics - Creating Custom Dimensions & Metrics

Beginners Guide to Universal Analytics.Learn some quick tips to get started. Learn creating custom dimensions and custom metrics.
Norman Pongracz's insight:

Difference between Universal Analytics (UA) and Google Analytics (GA) - Excerpt

 

Data Collection and integration - UA provides more ways to collect and integrate different types of data (across multiple devices and platforms) than Google Analytics (GA). UA provides better understanding of relationship between online and offline marketing channels.

Data Processing - UA is visitor based instead of visit based.

Custom Dimensions and Metrics - UA allows 'custom dimensions' and 'custom metrics' to collect the type of data GA does not automatically collect (like phone call data, CRM data etc.). GA uses 'custom variables' instead. Custom variables are still available in UA (though it is unclear for how long). The user interface only changes when using custom dimensions and metrics; a sketch follows below.
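As a hedged illustration (the property ID, dimension index and value are placeholders, and the dimension must first be registered in the UA property settings), setting a custom dimension with analytics.js looks like this:

<script>
// Universal Analytics (analytics.js); assumes the standard snippet has loaded and defined ga().
ga('create', 'UA-XXXXX-Y', 'auto'); // placeholder property ID
ga('set', 'dimension1', 'logged-in'); // custom dimension at index 1 (hypothetical value)
ga('send', 'pageview'); // the dimension is sent along with the pageview
</script>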

Javascript library - UA uses ‘analytics.js’ JS library whereas GA uses ‘ga.js’.

Tracking Code - UA uses a different tracking code from GA (both reference the same UA-XXXXX-Y property ID format).

Remarketing - UA does not support ‘Re-marketing’ yet.

Referrals Processing - in UA, returning referrals are counted as two web sessions.

Cookies - While GA can use up to 4 cookies (_utma,_utmb,_utmz and _utmv) to collect visitors’ usage data, UA uses only 1 cookie (called _ga).

Privacy and data usage - You need to give your end users proper notice and get consent about what data you will collect via UA. You also need to give your end users the opportunity to ‘opt out’ from being tracked. That means you need to make changes in your privacy and data usage policies. Google recommends using Google Analytics opt out browser add on if you want to block Google Analytics.

Scooped by Norman Pongracz

Bruce Clay EU - Theming Through Siloing

Theme building through directory based and virtual link silos. Europe
Norman Pongracz's insight:

Possible Alternatives to Eliminate Excessive Navigation or Cross Linking (Excerpt)


When it is impossible to remove menus that contradict subject relevant categories, instead use technology to block the search engine spider's indexing of those specific elements to maintain quality subject relevance.

 

IFRAMEs: If you have repetitive elements, use an IFRAME to isolate the object in one location and eliminate subject confusion from interlinking. The content of an IFRAME is an external document that is not part of the page; it belongs only to the HTML of the IFRAME contents file itself. As such, IFRAME content does not count as part of any page displaying the IFRAME code; a sketch follows below.
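A minimal sketch of the IFRAME approach (file name and links hypothetical): the repeated menu lives in its own document, so its links do not count as part of every page that embeds it:

<!-- On every page: embed the shared menu instead of repeating its HTML -->
<iframe src="/shared/menu.html" title="Site menu"></iframe>

<!-- /shared/menu.html is a separate document holding the actual links -->
<ul>
<li><a href="/widgets/">Widgets</a></li>
<li><a href="/gadgets/">Gadgets</a></li>
</ul>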

 

Ajax: Ajax code included dynamically in a web page cannot be indexed by search engines, providing the perfect haven for content, menus and other widgets meant for users' eyes only.

Scooped by Norman Pongracz

Local Landing Pages: A Guide To Great Implementation In Every Situation

With the right types of landing pages, you can dramatically increase your local visibility. These helpful guidelines and tips will help you ensure proper implementation for your specific business model.
Norman Pongracz's insight:

Local Landing Pages: A Guide To Great Implementation In Every Situation - Summary

 

1) Single-location service area business (SAB) - e.g. a plumber with a 30-mile work radius incorporating multiple cities or even a state
Most SABs will be unable to obtain rankings in Google's local pack of results for any city other than the one in which they are physically located, which leaves business owners wondering how they can accurately represent the fact that they serve a variety of locations. The answer is to pursue organic rankings, rather than local ones, for these other service cities.

How it works:
- Identify the key cities you serve, beyond your city of location.
- Create a unique page of content on your website for each of these cities.
- Link to these pages from a top-level menu, perhaps under a heading such as "Cities We Serve."
- If possible, earn social mentions and links for these pages.

Start by identifying your most important cities (maybe 5 or 10 of them) and develop well-planned, high-quality pages for each. You can then continue to build out new pages over time, or consider developing an on-site blog to publish ongoing content about your less important service cities as well as your important ones.

2) Single-location brick-and-mortar business - the restaurant, dental office, or retail shop with just one physical location
Creating pages for surrounding towns or cities only makes sense if your business has a genuine link to them; if it does, you might have something of value to write about. A legitimate connection might include, but is not limited to, a handful of hypothetical scenarios.

3) Multi-location brick-and-mortar or service area business - e.g. solicitors
In this scenario, you have more than one office, either from which your staff travels to offer services or to which your customers come to do business. In both cases you will be creating local landing pages for each physical address. Provided that each location has a unique phone number and is staffed during stated open hours, you are allowed to create a Google+ Local page for each office, too.

4) National company desiring a local presence - e.g. estate businesses
If you have staffed, physical locations in some cities and make in-person contact with your customers, then you are eligible to create a local landing page and attached Google+ Local page for each physical office.

Scooped by Norman Pongracz

Getting hreflang Right: Examples and Insights for International SEO

If you're trying to figure out exactly what hreflang will and will not do for your sites, this post (complete with examples of hreflang implementations from several major brands) should help set things straight.
Norman Pongracz's insight:

Summary

 

Section 1: How to check international SERPs the right way

Mimicking international searches: https://www.google.com/search?hl=es&gl=us&pws=0&q=seo
Where:
google.com - national search engine (change it to .co.uk, .es, etc.)
hl=es - language of search (change it to en, de etc.)
gl=us - country of search (change it to uk, au, etc.)
pws=0 - de-personalised search, don't change it
q= - query, add query parameters (for example q=seo, q=red+dress, etc.)

Full list of language/country codes:
https://www.distilled.net/blog/uncategorized/google-cctlds-and-associated-languages-codes-reference-sheet/
Or: http://www.davidsottimano.com/google-language-country-code-reference-sheet/
Tool: http://isearchfrom.com/
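A tiny helper for building these check URLs from the parameters above (the function name is just for illustration):

// Build a de-personalised Google search URL for a given national engine,
// interface language, search country and query.
function serpCheckUrl(engine, hl, gl, query) {
  return 'https://www.' + engine + '/search' +
    '?hl=' + encodeURIComponent(hl) + // language of search
    '&gl=' + encodeURIComponent(gl) + // country of search
    '&pws=0' +                        // de-personalised results
    '&q=' + encodeURIComponent(query);
}

// Example: a Spanish-language search from the US on google.com
console.log(serpCheckUrl('google.com', 'es', 'us', 'seo'));
// -> https://www.google.com/search?hl=es&gl=us&pws=0&q=seo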

Section 2: What should hreflang do and not do
Hreflang will not replace geo-ranking factors and will not fix (general) duplicate content issues.
It will help the right country/language version of your cross-annotated pages appear in the correct version of google.*

Section 3: Examples of hreflang behaviour

Case 1: CNN.com <head> hreflang, 302 redirect on homepage, and subdomain configuration
<link href="http://www.cnn.com" hreflang="en-us" rel="alternate" title="CNN" type="text/html"/>
<link href="http://mexico.cnn.com" hreflang="es" rel="alternate" title="CNN Mexico" type="text/html"/>

This configuration ranks the Spanish site neither for Mexican searches nor for Spanish-language searches in the US.
Let's try to explain this behaviour:
Cnn.com actually 302s to edition.cnn.com; this is regular SEO behaviour that causes the origin URL to display in search results while the content comes from the redirect target.
Mexico.cnn.com is not the right answer for "es" (Spanish language) IMO, because it's the Mexican version and should be annotated as "es-mx" ;)
Since cnnespanol.cnn.com exists and seems to carry worldwide news, I would use it as the "es" version.
Cross hreflang annotations are missing, so the whole thing isn't going to work anyway.

Case 2: play.google.com
Configuration: <head> hreflang, language/country variations and duplicate content
*FYI - I've shortened this for simplicity
x-default - https://play.google.com/store/apps/details?id=com....
en_GB - https://play.google.com/store/apps/details?id=com....
en - https://play.google.com/store/apps/details?id=com....

The x-default version ranks for UK searches.
Let's try to explain this behaviour:
One thing you may not notice is that the EN, x-default, and GB versions are almost entirely duplicate (around 99%). Which one should the algorithm choose? This is a good example of hreflang not handling duplicate content.
The GB version doesn't display in UK search results, and the rankings differ (the US ranking is higher than the UK one on average); the hreflang annotation uses an underscore rather than the standard hyphen (EN_GB versus EN-GB), which is likely why it is ignored.
They use a self-referencing canonical, which, contrary to some beliefs, has absolutely no effect on the targeting

Case 3: Musicradar.com
Configuration: <head> hreflang, subdomain & cctld, country targeting and x-default
<link rel="alternate" hreflang="en-gb" href="http://www.musicradar.com/"; />
<link rel="alternate" hreflang="x-default" href="http://www.musicradar.com/"; />
<link rel="alternate" hreflang="en-us" href="http://www.musicradar.com/us/"; />
<link rel="alternate" hreflang="fr-fr" href="http://www.musicradar.com/fr/"; />

The x-default works. The US and FR queries work, but the FR sitelinks do not.
Let's try to explain this behaviour:
A perfect example of a correct implementation.
One thing to notice is that they double-list the EN-GB page, also using it as the x-default.
The English sitelink in the French results is pretty weird, but I think this is the perfect situation to escalate to Google as their implementation is correct as far as I can tell.

Case 4: Ridgid.com
Configuration: XML sitemaps hreflang, subfolders, rel canonical and dupe content
Sample of hreflang annotations:

<loc>https://www.ridgid.com/</loc>
<xhtml:link hreflang="en-US" href="https://www.ridgid.com/" rel="alternate" />
<xhtml:link hreflang="en-CA" href="https://www.ridgid.com/ca/en" rel="alternate" />
<xhtml:link hreflang="en-PH" href="https://www.ridgid.com/ph/en" rel="alternate" />
Ridgid.com should appear in the US, ridgid.com/ca/en should appear for Canadian English queries (google.ca) and ridgid.com/ph/en should appear in Google Philippines for English queries.

The Canadian results are wrong; the US and PH results are OK.
Let's try to explain this behaviour:
All 3 homepages are almost exactly identical, hence duplicate content
The Canadian version contains <link rel="canonical" href="https://www.ridgid.com/" /> - that means it's being canonicalized to the main US version
The Philippines version does not contain a canonical tag
Google is choosing which is the right duplicate version to show, unless there is a canonical instruction

Section 5: Tips from many screw-ups and successes
Use either the <head> implementation or XML sitemaps, not both. It can technically work, but trust me, you'll probably screw something up - just stick to one or the other.
If you don't cross-annotate, it won't work. Plain and simple; use Aleyda's tool to help you (see also the reciprocity-check sketch after this list).
Google says you should self-reference hreflang, but I have also seen it work without (check out en.softonic.com). If you want to play it safe, self-reference; we don't know what Google will change in the future.
Try to eliminate the need for duplicate content, but if you must, it's okay to use canonical + hreflang as long as you know what you're doing. Check out this cool isolated test, which is still relevant. Remember, mo' dupes, mo' problems.
Hreflang needs time to work properly. At a bare minimum, Google needs to crawl both cross-annotations for the switch to happen. Help yourself by pinging sitemaps, but be aware of at least a two-day lag.
You can double-annotate a URL when using x-default, in case you were afraid to. Don't worry, it's cool.
Make sure you're actually having a problem before you go ranting on webmaster forums. Double-check what you're seeing and ask other people to check as well. Check your Google parameters and personalised results!
You can 302 your homepage when you're using a country redirect strategy. Yes, I know it sounds crazy; a little bird told me, and I thoroughly tested this and didn't see a loss. There are two sites I know of using this, so check them out: The Guardian & Red Bull.
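A rough sketch of the reciprocity check mentioned above, assuming a modern Node.js runtime with a global fetch; the regex parsing is deliberately simplistic (it expects rel/hreflang/href in that attribute order), so treat it as an illustration rather than a robust checker:

// For each alternate URL annotated on a page, fetch that alternate and
// confirm it annotates the original URL back (cross-annotation).
async function checkReciprocity(url) {
  const linkRe = /<link[^>]*rel="alternate"[^>]*hreflang="[^"]+"[^>]*href="([^"]+)"/g;
  const html = await (await fetch(url)).text();
  const alternates = [...html.matchAll(linkRe)].map(m => m[1]);
  for (const alt of alternates) {
    if (alt === url) continue; // skip the self-reference
    const altHtml = await (await fetch(alt)).text();
    const returns = [...altHtml.matchAll(linkRe)].map(m => m[1]);
    console.log(returns.includes(url)
      ? 'OK   ' + alt + ' annotates ' + url + ' back'
      : 'MISS ' + alt + ' does not annotate ' + url + ' back');
  }
}

// Example run against one of the sites discussed above.
checkReciprocity('http://www.musicradar.com/');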

Scooped by Norman Pongracz

Web design Hertfordshire: What ranks and why - Distinctly Digital

Search engine optimisation is continually evolving as Google improves its manipulation of search results for its own gain.
Scooped by Norman Pongracz

Competitive Link Analysis: Link Intersect in Excel

Without a doubt, one of the main steps in creating an SEO strategy is the competitive analysis. Competitor backlinks can offer information on their link building strategies, as well as giving you opportunities to strengthen your own link profile.
Scooped by Norman Pongracz

The Complete Guide to Mastering Duplicate Content Issues - Search Engine Journal

There is little doubt that duplicate content on a site can be one of the biggest battles an SEO has to fight against. Too many content management systems…
Norman Pongracz's insight:

Duplicate content issues (excerpt)

If the duplication is off-site, you can file a DMCA request to have the copied content removed. Many scrapers are low-quality sites, though, and beyond that the webmaster can't do much about them.

In many cases content is deliberately published into distribution channels in the hope that it will be picked up and republished on other websites. The value of this duplication is usually one or more links pointing back to the author's website.


