Are Social Media Signals the Answer for Quality Backlinks?

Google needed a way to verify that links are trustworthy, and it created Google+ to do just that. I believe that verifying the author of a link is a fantastic way to achieve this.

There’s been a lot of debate in the SEO community lately regarding social media versus traditional link building methods. While some SEOs argue that social media links are the wave of the SEO future, traditionalists staunchly maintain traditional, authoritative links from quality sources are still the best way to go.

Whatever your stance, I think it’s easy to agree that gaining links from trusted authorities is desirable for any site — but that doesn’t mean the rise of social shouldn’t affect our outreach methods.

We’ll start with the obvious: social sites allow you to network and build relationships with industry players and authorities. Someone who’s gotten to know you over social media is going to be more receptive to a link request than someone receiving a random email from an outside party.

Further, social media offers a quick way to see that you’re a legitimate source with an active interest in the field — you’re not just out to spam any email address or Twitter account you can get your hands on.

However, social media also offers an ideal way to find and target industry users for specific link building outreach campaigns. Of course, before you can start targeting, you’ve got to identify who you’re trying to reach.

External backlinks (inbound links for Google PageRank)

Google Gets Tough on Bad Backlinks and Spammers

I hope the above website marketing how-to information was of help to you.

Your Internet website marketing partner at

Natural – Relevant Backlinks and Google Search Rank

Let’s first briefly go over natural (relevant) backlinks and unnatural backlinks.

Natural or relevant backlinks are like gold to an internet marketer.
It’s when a website or blog owner loves the content on your site so much that they decide to link to it, because your website has relevant, useful information that their readers can benefit from as well.
Matt Cutts of Google has stated that the best way to get links to your site is to create amazing quality content, and I agree.
CLICK HERE for more info on backlinking

Unnatural backlinks are backlinks to your website that come from websites that are not relevant to the information on your website.

What’s being called the Unnatural Links update is rocking the SEO world. In March 2012 people started receiving warnings from Google about “unnatural links” and then many of those sites’ rankings in search results have taken a nosedive. Some sites have been de-indexed, which means their site’s pages no longer show up in Google at all.

The scary part? Unlike other Google algorithm changes, you can’t respond to this one by changing things on your site. This time, it’s about links to your site from other websites (commonly called backlinks) that you may or may not have control over – including sites you may never have asked to link to you.

One way to get relevant, natural backlinks to your website is the old-fashioned way, and it is called work! Start your own informational website or blog using a different website name (URL) and host it on a different server or webhost account, or better yet a different webhost altogether. You want to make sure Google will not connect the new website back to you and your current website.

Now start creating quality content and link to other websites as well as your own. By linking to other websites as well as your own, you are letting Google know that your website is not one-sided, slanted, or biased.

How to Avoid Unnatural Backlinks and Google Penalties

As noted above, the “Unnatural Links Update” is rocking the SEO world. In March 2012 people started receiving warnings from Google about “unnatural links,” and some of those sites’ rankings in search results have taken a nosedive. Some sites have been de-indexed (which means their site’s pages no longer show up in Google at all) or sent to the Google Raters to be checked and rated.

Kristina Weis at AboutUs (this link is a good example of a high-quality natural backlink) suggests the following tips to avoid a Google unnatural backlink penalty and make sure your links don’t look unnatural.

Watch for an “Unnatural Links” notice in your Google Webmaster Tools. If you get one, your rankings are doomed to slip if you don’t get those unnatural links to your site removed. It’s important to note, though, that you may not get a notice before your site takes a dive (or you may not get a warning at all), so don’t assume you’ll be fine just because you haven’t received a notice.

1) Make sure you don’t have any paid links. Buying links or selling links that pass PageRank (meaning they aren’t NoFollow) is against Google’s webmaster guidelines. If you are found to be paying for links, or if it appears you’re paying for links, your site’s rankings will suffer. To be safe rather than sorry, you should ask for any paid links to be removed or made NoFollow. Contact the linking site’s webmaster or customer service department and hope they’re listening.

2) Make sure you don’t have links from blog networks. Google has cracked down on blog networks (like BuildMyRank), which are typically basic-looking WordPress blogs with low-quality content and keyword-infused links to other sites. For more information about blog networks and Google’s update, read this article.

3) Sites with lots of links with keyword-rich anchor text look suspicious. If the vast majority of the links to your site just happen to use one of a few keyword phrases as the anchor text, they aren’t going to look very natural to a person, nor to Google’s algorithm. What are the odds someone would choose one of your top keyword phrases when linking to you? Odds are, most people will use something like your business name, the title of your blog post, or “click here” as the anchor text when linking to you. Make sure the anchor text in your backlinks looks diverse, not like you asked or paid people to give you links with SEO-perfect anchor text.
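To get a feel for how diverse your own anchor-text profile is, you can tally the anchors from a backlink export. A minimal sketch (the anchor texts and the 40% threshold below are invented for illustration, not figures Google publishes):

```python
from collections import Counter

# Hypothetical anchor texts pulled from a backlink report
# (the business name and phrases here are made-up examples).
anchors = [
    "Acme Widgets", "Acme Widgets", "my favorite widget post",
    "click here", "buy cheap widgets", "buy cheap widgets",
    "buy cheap widgets", "buy cheap widgets",
]

counts = Counter(anchors)
total = len(anchors)

# Show the distribution of anchor texts, most common first.
for text, n in counts.most_common():
    print(f"{n}/{total} ({100 * n / total:.0f}%)  {text!r}")

# If one exact phrase dominates, the profile may look unnatural.
top_text, top_n = counts.most_common(1)[0]
if top_n / total > 0.4:  # arbitrary illustrative threshold
    print(f"Warning: {top_text!r} dominates your anchor text profile")
```

Here the keyword phrase accounts for half the links, which is exactly the kind of skew a human reviewer or an algorithm could flag.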

4) Try to have a balance of high-quality and lower-quality links. Many sites will have a few low-quality links that they never asked for, but it becomes a problem when the majority of your backlinks look iffy. Look at the root domains that are linking to you: What is their PageRank? Do they have a decent social media following, and are many people sharing their content on social networks? If most or all of the websites that link to you seem low quality, you may be in trouble. It’s time to build up some quality backlinks and/or try to get rid of some of the low-quality links.

5) Avoid site-wide links. These don’t look too natural, and many sites that sell links will put the links on all their site’s pages.


10 New Google Analytics Features You Need to Start Using

The information below was received in part from SeerInteractive and written by Rachael Gerson.

Over the past eight months, Google has steadily released one revolutionary new feature after another. On March 17, the company announced a new version of Google Analytics. Up until this point, users could decide whether they preferred to stick with the old interface or switch to the new one. However, Google recently announced that the old version of GA will be turned off in January 2012.

If you’re not already familiar with the new version, take the next few weeks to get comfortable with it.

To help you get started, let’s review the top 10 features of the new Google Analytics at


Adding a website to Google

The information below was posted in part from Google.

Inclusion in Google’s search results is free and easy; you don’t even need to submit your site to Google. Google is a fully automated search engine that uses software known as “spiders” to crawl the web on a regular basis and find sites to add to our index. In fact, the vast majority of sites listed in our results aren’t manually submitted for inclusion, but found and added automatically when our spiders crawl the web.

If you’ve just added a URL to your site, or a page has significantly changed since the last time it was crawled, you can ask Google to crawl it.

If your site offers specialized products, content, or services (for example, video content, local business info, or product listings), you can reach out to the world by distributing it on Google Web Search. For more information, visit Google Content Central.

Read more on the above subject at


Google Quality Raters Handbook – Does It Exist?

The information below was received in part from SearchEngineLand and written by Barry Schwartz.

Brian Ussery has discovered a revised copy of the Google Quality Raters Guidelines, which he archived on his own site.

The documents are used by Google Quality Raters to aid them in classifying queries, measuring relevancy, and rating the search results. To do so, a Quality Rater must understand how Google works, and this document covers a lot of that. Let me pull out some of those details in easy-to-read bullet points.

Three Query Types:

* Navigational: someone searching for a site, such as a search for IBM.
* Informational: someone searching for information on a topic of interest, such as finding out more information on Danny Sullivan.
* Transactional: someone searching when seeking to purchase something either online or offline, such as searching for ‘buy ipod touch.’

Quality Rating Scales:

* Vital: This is the highest score a web page can receive for a query. A vital result comes from a query that is most likely navigational and the resulting page is the official web page of the query. When searching for ‘ibm’, the vital result would be
* Useful: This is the second highest score a web page can receive for a given query. A useful rating should be assigned to results that “answer the query just right; they are neither too broad nor too specific.” One of the examples given for a useful rating would be a search on meningitis symptoms with a resulting web page of
* Relevant: This comes after a useful rating, and is used for results that are less useful. The guidelines say such results are often “less comprehensive, come from a less authoritative source, or cover only one important aspect of the query.” An example would be a review of laptop computers that covers only five computers rather than all computers within its class. Since it is not a fully comprehensive review, it would be rated as relevant and not useful.
* Not Relevant: This rating is used for pages that are not helpful to the query but are still somewhat connected to it. Classifications of a not relevant page would be “outdated, too narrowly regional, too specific, too broad” and so on. One of the examples given is a search for the ‘BBC’ that returns one specific BBC article; it is too specific and is not relevant to the query at hand.
* Off-Topic: This is the lowest rating a page can receive for a query. If the returned page is completely not relevant to the query, it would be given a rating of “off topic.” An example given is a query on ‘hot dogs’ that returns a page about doghouses.

Categories For Results That Can’t Be Rated: Not everything can be rated, and those must be classified somehow. The categories for those types of results include:

* Didn’t Load: For pages that return a 404 error, page not found, product not found, server time out, 403 forbidden, login required, and so on.
* Foreign Language: This is given to a page that is in a “foreign language” to the “target language” of the query. English is never a foreign language, no matter what. So, if you search in Chinese for something and a Hebrew page is returned, it is a foreign language, but if an English page is returned, it is not a foreign language. There are exceptions to the rule.
* Unratable: When the rater cannot rate it for any other reason.

Spam Labels: Now for the really good stuff, spam labels. This is a new addition to the quality raters guidelines and is fairly small. The labels include:

* Not Spam: The not spam rating is given to a page that “has not been designed using deceitful web design techniques.”
* Maybe Spam: This label is given when you feel the page is “spammy,” but you are not 100% convinced of that.
* Spam: Given to pages you feel are violating Google’s webmaster guidelines.

Flags: Flags are for pages that require immediate attention, such as:

* Pornographic content
* Malicious code on pages

That is a brief overview of some of the many points in the document. For more, see the archived document and for some history, check out Google Blogoscoped. Here is an additional copy of this document at


Google Raters – Who Are They?

The information below was received in part from SeoMoz and written by

Google Quality Raters are out there rating not only organic search results, but also Google ads (AdWords) and Videos, and probably more things but those are the three types of raters I am sure of.

There is a good forum out there that is all “Quality Raters” info and discussion. I pored through a couple hundred pages of posts, but you are welcome to read over at that forum here.

The Raters I will be talking about today are the ones that rate the organic results – called Search Quality Raters. In a nutshell, Google outsources this job to outside companies and those outside companies hire independent contractors to do the work.
What Is a Google Search Quality Rater?

Here is a quote from one of the forum posts that sums up the position of a Google Search Quality Rater quite well:

“There are a few names for this position. The companies hiring for this are Lionbridge, Leapforce and Butler Hill. I am not sure if Workforce Logic still hires for this or not. Lionbridge titles this position as an internet assessor and Leapforce titles it as Search Engine Evaluator. You do not apply with Google. These companies contract with Google and hire independent contractors.
I work for Lionbridge. I enjoy the pay, but the task availability is not always steady and the work can get boring sometimes. Basically what it entails is assessing the utility of search engine results. In order to get started, you must apply and if they have openings in your area and you qualify, they will invite you to take a test.
The test is long and you must study carefully.
The pay is great (I cannot disclose due to confidentiality agreement), but it is hourly.
It is a great way to earn a part-time income.”

So, to sum up – these people are “stay at home” type folks…moms, between jobs, students, etc.

They can apply for a Quality Rater position through one of these outside companies.

If accepted, they then take a 2-part exam to qualify their ability.

“The exam consists of two sections: Section 1 has 24 theory questions and Section 2 consists of 270 practical exams. “

If they pass the exams, they are then hired to work as an independent contractor (ie, work from home, no taxes taken out of pay – no employee benefits). They tend to work 10-20 hours per week and are paid by the hour. From what I can gather, they make something like $12 – $15 per hour.

These QRs (Quality Raters) are only allowed to work for one year. Then, they must wait 3 months before they can apply for that position to work again.

Now, these folks, for the most part, are not internet marketers or especially experienced with Google or organic rankings from the perspective we are. They are “normal” users of Google. These raters are hired from all over the world, speaking all types of languages.

These quality raters are not new for Google. One Quality Rater in that forum said this:

“Just to reiterate from earlier in the thread, I was in the original group of raters back when the project started in late 2003/early 2004….”

So Google has been doing this for about 8 years.

Got all that? Good – let’s move on.
What Do Google Search Quality Raters Do?

These Quality Raters for organic results in Google are given 2 types of rating assignments. One type is when they are given one keyword and one url and are instructed to rate the relevance (ie, “utility”) of the url to the intent of the keyword. “Intent” according to the quality raters handbook is summed up as a keyword being a “do”, “know”, or “go” type keyword.
The rater decides if the keyword is relevant to something that searcher would want to do (ie, buy something, watch something, etc), something they want to know (ie, info on a topic, reviews, etc), or somewhere they want to go (ie, go to when they search for ‘youtube’).

Then they look at the url and decide if that url is relevant to that specific search query. They are also given an opportunity to mark that url as spam and to give notes about that url. Important to mention: a Rater is allowed to rate a url as BOTH spam AND relevant. An example of this, to me, would be a keyword like ‘buy digital camera’ where the url DOES offer a way to buy a digital camera, but the site is chock full of banner ads and other “spam-like” stuff.

Important Take-Away: Since these raters are typically “normal” users of Google, first impressions of your url count BIG TIME. I think we all agree that when we each come to a web page, we cast judgement within 3 seconds of landing on that page, don’t we? Quality Raters are totally the same.

The other type of assignment these raters get is when they get 2 sets of search results (ie, a first page result for a keyword search). One result page is the “before” page and the other is the “after” (if you don’t know how this works, learn how Google makes algo changes in the post here at – the video is short and very helpful, and yes, there is an info-graphic picture, too).

With this type of assignment the raters pretty much rate which set of results is “better” in their opinion.
What Are The Performance Requirements for a Google Quality Rater?

Since these quality raters are paid by the hour, there is a certain level of performance they are required to maintain. They receive progress reports on their performance that appear to use a star-rating type system.

Their performance is judged by how many urls they do per hour.

For the single url/keyword assignments, it appears they are required to do 30 per hour.

From that forum:

“Hi, I would like to ask you how many URLs per hour do you usually do? More than 60 or less than 60? I have no idea how many is the average because no one ever told me how many should I do. Thanks “

One other rater posted this as a response:

“I believe that 30 is the minimum expected for rating, within a few weeks after you’re hired (not including the test week). You’ll get an email if you fall below standards. They stress quality over quantity, however, so that overall, a slower rater who rates really well may be preferable to a fast one who misses the mark more often. “May” is the operative word, we don’t know what the criteria are for judging our work.”

It really appears as if none of them are actually SURE about what level they are expected to perform at. Interesting.

Another Quality Rater posted this:

“Experienced QRs, on average, how many U*Ls do you personally complete in one hour? I just got my 2nd progress report and productivity is only at 3 stars when I do about 45 to 50 per hour.”

Only a 3-star rating when he/she does 45 to 50 an hour?!?! Yikes!

With these “before and after” type assignments (called “side by side tasks” – SxS) it appears they are required to do 20 per hour.

What REALLY concerns me about this is one main thing – how in the world can you really rate a web page in 2-3 minutes? So, as I mentioned above – First Impressions REALLY matter.

One quality rater posted this in that above mentioned forum (emphasis mine):

“It’s not possible to do 20 SxS tasks an hour if you click through to each result. It’s pretty easy on the occasions when the sides are nearly identical and you just have to decide whether you’d rather have A or B somewhere. When I was doing 20 I did a lot based just on snippets… I had been doing 2-3 an hour without complaint from them, mind you. And then I figured that if they wanted me to go faster maybe they didn’t want me to be QUITE so careful about judging every single result. Now I am not sure what they care about! Maybe it helps them to have a lot of different perspectives and styles of rating. Maybe no one is checking up on our work at all! “

Anyone else thinking, “Oh boy…that’s not good news”?

Your url can be judged simply by your snippet in the search results! It’s possible no one actually LOOKS at your url at ALL!

Important Take-Away – Be SURE your snippet (ie, url meta description) is as relevant to your target query as possible. Now, Google can, and does, auto-generate these on their own MANY times, but try and control what you can.

Another quality rater posted this….which really bothers me:

“I would guess that for most people, the biggest challenge is remembering *precisely* what it is that you’re rating. “

Which backs up my impression that many aren’t exactly sure about what is expected of them. Awesome.
Can ONE Quality Rater Change the Ranking of a Url?

In this interesting 2009 interview with Google’s Engineering Director, Scott Huffman, John Paczkowski asked Mr Huffman this (emphasis mine):

JP: So you’re describing a process in which these evaluators are going to specific Web pages and rating them according to a specific criteria. Do these data have any effect on those sites’ page ranks or pay-per-click and Ad Word bids?

To which Mr Huffman from Google replied (again, emphasis mine):

SH: We don’t use any of the data we gather in that way. I mean, it is conceivable you could. But the evaluation site ratings that we gather never directly affect the search results that we return. We never go back and say, “Oh, we learned from a rater that this result isn’t as good as that one, so let’s put them in a different order.” Doing something like that would skew the whole evaluation by-and-large. So we never touch it.

Now this makes sense to me – ONE rater cannot cause a rankings change. However, I do believe that if a certain percentage of raters mark one url as spam or non-relevant, it does throw up some type of flag in the system that can cause something to happen to that url. Now I naturally do not KNOW this, but I get that sneaky feeling.
Do Quality Raters Rate EVERY Query Space?

It is impossible to have a human rater out there rating every single query space. Heck, 15% or more of searches each and every month are NEW phrases that have never been searched for before. Yes, brand new combinations of words that Google has never had before!

However, especially with the side-by-side rater assignments, Google is testing potential algo changes. The way the sample queries are rated can cause the algo change to roll out…which can affect a MUCH larger set of query spaces without a human ever looking at YOUR url.

In that situation, there are bound to be many “false positives” and there is not a whole lot you can do about it other than rack your brain and try to figure out what the algo change was targeting (and good luck with that!)
How To Survive Google Raters

It’s tough to answer how to survive a Google Rater visit or a subsequent algo change due to Quality Raters at play. Here are 4 tips to help survive a manual review:

1. Accept what you cannot change – There is nothing we can do about Human Raters judging our urls or the things that happen to the algo due to OTHER urls being rated. Therefore, I think the most important thing we can do is: don’t stress over it. Manual reviews have been going on for years and I don’t see them going away any time soon. Just roll with it the best you can.

2. Be proactive – Make sure your site/url gives a great FIRST impression, and don’t look at it as a marketer…look at it as a general CONSUMER. Would YOU keep reading on your site? Would YOU buy something from your site? Be a “normal person” and judge your own urls just as we naturally judge any OTHER url we visit.

3. Check Your Snippets – Keep an eye on how your snippet reads in a Google search result for your target keyword(s). Does it tell a potential visitor that your page IS what they are looking for? Does your snippet match what a visitor will actually FIND on your page?

4. Evaluate Intent and Be Relevant – Lastly, really think about your target keyword(s)….if YOU typed that phrase into Google, what would YOU expect/want to find? Is your url and content truly relevant to the intent of the keyword used to find your web page?

All in all, remember that these raters are people simply trying to earn some money from home. For the most part, they really don’t care what happens to a web page, they just want to do the job that is expected of them. Many aren’t exactly sure what IS expected of them, either. I would also imagine that many aren’t sure, don’t know, or simply don’t care how their actions fit into the BIG picture either.

Google was not created for webmasters – it was created for SEARCHERS. These human raters are the people they (Google) are catering to – not us marketers, ok?

Want a “professional” human review of your web page(s)? Here’s an idea….

The holidays are coming. Many of us will have a house full of people or be IN a house full of people. I’m willing to bet that YOUR house full of people is like mine – “normal people” who use Google to find stuff. Ask THEM what they think about your web page. Heck, show them 2 or 3 results for your search query (one being yours) and ask THEM which page they like best. Also, don’t forget to ask them WHY they chose one web page over another.

Your family can start a new holiday tradition – The Google Game!


Tips from Google AdSense

The below was sent to me by Google AdSense.

Hi there,

As the holiday season approaches, advertisers are preparing new campaigns and internet traffic is likely to increase. To help you take advantage of this, in this issue, we’ve put together the top tips for making your site more visible. You’ll also find updates on:

* Improving page load speed in DFP Small Business
* Using Google+ Pages for brands and businesses

Make your site more visible!

Be relevant
Do you know how users find your pages? Webmaster Tools provides you with detailed information on the top queries for which your site ranks in Google search results. Use this information to make sure that the content users find most interesting is most prominent.

Be social
As we mentioned in our last newsletter, by implementing the +1 button on your pages, you can give users an opportunity to share their opinion with you and guide you to what is most interesting for them on your site. Webmaster Tools now also allows you to track the search impact of +1’s on your pages.

Be fast
A faster site increases user satisfaction, so speed is an important factor in the Google search ranking. Page Speed Online analyzes the content of a web page and generates suggestions to make that page faster. Let Maile Ohye from the Developers Program Team tell you more about the importance of your site’s performance:
Site Speed Performance For Webmasters (VIDEO)

Check how fast your website loads!

Google+ Pages for businesses and brands
Google+ Pages are a way for brands and businesses to have a presence on Google+. Pages bring you closer to your audience, letting you have real conversations with the right people, connecting you face to face with your site’s visitors, and letting current fans recommend new ones.

Giving your business a home on Google+ lets you directly interact with your users, while giving them more chances to share your content with their friends. Create a page today and connect with your audience!


Google Analytics Time on Site, Length of Visit and Bounces

Bounce, as described by Google:

“Bounce rate is the percentage of single-page visits or visits in which the person left your site from the entrance (landing) page. Use this metric to measure visit quality – a high bounce rate generally indicates that site entrance pages aren’t relevant to your visitors. The more compelling your landing pages, the more visitors will stay on your site and convert. You can minimize bounce rates by tailoring landing pages to each keyword and ad that you run. Landing pages should provide the information and services that were promised in the ad copy.”
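The arithmetic behind that definition is simple: bounce rate is single-page visits divided by total visits to the entrance page. A minimal sketch, using invented visit counts:

```python
# Each number is how many pages one visit viewed after entering on a
# given landing page (hypothetical figures for illustration).
pages_per_visit = [1, 3, 1, 2, 1, 5, 1, 1]

# A "bounce" is a single-page visit.
bounces = sum(1 for pages in pages_per_visit if pages == 1)
bounce_rate = bounces / len(pages_per_visit)

print(f"{bounces} bounces out of {len(pages_per_visit)} visits")
print(f"Bounce rate: {bounce_rate:.1%}")  # 5 of 8 visits = 62.5%
```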


To know how long a visitor has spent on a web page, Google needs two things:

1) The time you arrived on a specific web page
2) The time you requested another web page on the SAME website.
Read #2 again… “The time you requested another web page on the SAME website.” You may have Google Analytics installed on your website, but if the visitor leaves your website for any one of the following reasons (entering a URL in the address bar, closing the browser, clicking on an advertisement or an outgoing “self target” link, etc.), Google Analytics does NOT get triggered again and it cannot calculate the time on that page. So… it defaults to 00:00:00.
Therefore, in theory, someone could land on a web page, stay on that web page for 10 minutes and get the info they were looking for, then close the browser, and Google will count it as 00:00:00 time, which equals a bounce.
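That timestamp subtraction can be sketched in a few lines (the page paths and times below are made up): each page’s time is the next pageview’s timestamp minus its own, and the last (or only) page has no next timestamp, so it contributes zero.

```python
from datetime import datetime

# One visitor's pageviews on the same site (hypothetical timestamps).
pageviews = [
    ("/landing", datetime(2012, 4, 1, 10, 0, 0)),
    ("/pricing", datetime(2012, 4, 1, 10, 3, 30)),
    ("/contact", datetime(2012, 4, 1, 10, 5, 0)),
]

# Time on page = next pageview's timestamp minus this page's timestamp.
times = [
    int((t_next - t).total_seconds())
    for (_, t), (_, t_next) in zip(pageviews, pageviews[1:])
]
times.append(0)  # the LAST page has no "next" request, so it defaults to 0

for (page, _), seconds in zip(pageviews, times):
    print(page, seconds, "seconds")

# A single-page visit never sends a second timestamp at all:
# time on site is 00:00:00 and the visit is counted as a bounce,
# no matter how long the person actually read the page.
single_page_visit = [("/landing", datetime(2012, 4, 1, 11, 0, 0))]
print("bounce:", len(single_page_visit) == 1)
```

Notice that the visitor spent 90 seconds on /contact before leaving, but the sketch (like Google Analytics) has no way to know that, so it records 0.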

How to view your BOUNCE RATE

If you have a Google Analytics account connected to your website you can view the bounce rates for your website by going to the Bounce Rate report under Visitors > Visitor Trending > Bounce Rate.

CLICK HERE to go to Google Analytics

Time on Site, as described by Google:

“Time on Site: Time on site is one way of measuring visit quality. If visitors spend a long time visiting your site, they may be interacting extensively with it. However, Time on site can be misleading because visitors often leave browser windows open when they are not actually viewing or using your site.”


This method of calculating Time on Site is also not 100% accurate, because the reported Time on Page it is built from is faulty to begin with. Time on Site is nothing but the sum of Time on Page for all web pages viewed, and we know the reported Time on Page is not 100% accurate because Google Analytics does not record time spent on the page for exits and bounces.

Length of Visit, as described by Google:

“Length of Visit (Visitor Behavior): Length of visit is a measure of visit quality. A large number of lengthy visits suggests that visitors interact more extensively with your site. The graph allows you to visualize the entire distribution of visits instead of simply the ‘Average Time on Site’ across all visits. Keep in mind that ‘Average Time on Site’ is skewed by visitors leaving browser windows open when they are not actually viewing or using your site. You can see whether a few visits are skewing your ‘Average Time on Site’ upward or whether most visits to your site have a high average time.”


The only way for someone to register a visit of more than 0 seconds is if they go to a SECOND page. The reason for this is that Google Analytics can only tell how long someone has been there if there is more than one page view – it uses the timestamps (date and time records) of each page view and “subtracts” them to get the time that someone was on one page. For this reason, you will not know how long someone stays on the FIRST and only page visited, and you will never know how long someone was truly on your site, because you can’t tell how long they were looking at the LAST page they were on (in both instances there isn’t a next page for Google Analytics to subtract time from).

Google Analytics Definitions Help Web Page CLICK HERE

Till the next time,
Your Website Marketing Partner and Friend at ProNetUSA – WebSites and Internet Marketing