Are Social Media Signals the Answer for Quality Backlinks?

Google has needed a way to give people the ability to improve the quality of a link and to verify that links are trustworthy, and it has created Google+. I believe that verifying the author of a link is a fantastic way to achieve this.

There's been a lot of debate in the SEO community lately regarding social media versus traditional link building methods. While some SEOs argue that social media links are the wave of the SEO future, traditionalists staunchly maintain that authoritative links from quality sources are still the best way to go.

Whatever your stance, I think it’s easy to agree that gaining links from trusted authorities is desirable for any site — but that doesn’t mean the rise of social shouldn’t affect our outreach methods.

We’ll start with the obvious: social sites allow you to network and build relationships with industry players and authorities. Someone who’s gotten to know you over social media is going to be more receptive to a link request than someone receiving a random email from an outside party.

Further, social media offers a quick way to see that you’re a legitimate source with an active interest in the field — you’re not just out to spam any email address or Twitter account you can get your hands on.

However, social media also offers an ideal way to find and target industry users for specific link building outreach campaigns. Of course, before you can start targeting, you've got to identify who you're trying to reach.

External backlinks (inbound links for Google PageRank)

Google Gets Tough on Bad Backlinks and Spammers

I hope the above website marketing how-to information was of help to you.

Anthony,
Your Internet website marketing partner at ProNetUSA.com

Natural, Relevant Backlinks and Google Search Rank

Let's first briefly go over natural (relevant) backlinks and unnatural backlinks.

Natural, relevant backlinks are like gold to an internet marketer.
A natural backlink happens when a website or blog owner loves the content on your site so much that they decide to link to it, because your website has relevant, useful information that their readers can benefit from as well.
Matt Cutts of Google has stated that the best way to get links to your site is to create amazing, high-quality content, and I agree.
CLICK HERE for more info on backlinking

Unnatural backlinks are backlinks to your website that come from websites whose content is not relevant to the information on your website.

What's being called the Unnatural Links update is rocking the SEO world. In March 2012 people started receiving warnings from Google about "unnatural links," and since then many of those sites' rankings in search results have taken a nosedive. Some sites have been de-indexed, which means their pages no longer show up in Google at all.

The scary part? Unlike other Google algorithm changes, you can’t respond to this one by changing things on your site. This time, it’s about links to your site from other websites (commonly called backlinks) that you may or may not have control over – including sites you may never have asked to link to you.

One way to get relevant, natural backlinks to your website is the old-fashioned way, and it is called work! Start your own informational website or blog using a different website name (URL) and host it on a different server or web host account, or better yet a different web host altogether. You want to make sure Google will not connect the new website back to you and your current website.

Now start creating quality content and link to other websites as well as your own. By linking to other websites as well as your own, you are letting Google know that your website is not one-sided, slanted, or biased.

How to Avoid Unnatural Backlinks and Google Penalties

What's being called the "Unnatural Links Update" is rocking the SEO world. In March 2012 people started receiving warnings from Google about "unnatural links," and some of those sites' rankings in search results have taken a nosedive. Some sites have been de-indexed (which means their pages no longer show up in Google at all) or sent to the Google Raters to be checked and rated.

Kristina Weis at AboutUs (this link is a good example of a high-quality natural backlink to AboutUs.com) suggests the following tips to avoid a Google unnatural-backlink penalty and make sure your links don't look unnatural.

Watch for an "Unnatural Links" notice in your Google Webmaster Tools. If you get one, your rankings are doomed to slip if you don't get those unnatural links to your site removed. It's important to note, though, that you may not get a notice before your site takes a dive (or you may not get a warning at all), so don't assume you'll be fine just because you haven't received a notice.

1) Make sure you don’t have any paid links. Buying links or selling links that pass PageRank (meaning they aren’t NoFollow) is against Google’s webmaster guidelines. If you are found to be paying for links, or if it appears you’re paying for links, your site’s rankings will suffer. To be safe rather than sorry, you should ask for any paid links to be removed or made NoFollow. Contact the linking site’s webmaster or customer service department and hope they’re listening.

2) Make sure you don't have links from blog networks. Google has cracked down on blog networks (like BuildMyRank), which are typically basic-looking WordPress blogs with low-quality content and keyword-infused links to other sites. For more information about blog networks and Google's update, read this article.

3) Sites with lots of links with keyword-rich anchor text look suspicious. If the vast majority of the links to your site just happen to use one of a few keyword phrases as the anchor text, they aren't going to look very natural to a person, nor to Google's algorithm. What are the odds someone would choose one of your top keyword phrases when linking to you? Odds are, most people will use something like your business name, the title of your blog post, or "click here" as the anchor text when linking to you. Make sure that the anchor text in your backlinks looks diverse and not like you asked or paid people to give you links with SEO-perfect anchor text; one quick way to check your own anchor-text distribution is sketched after this list.

4) Try to have a balance of high-quality and lower-quality links. Many sites will have a few low-quality links that they never asked for, but it becomes a problem when the majority of your backlinks look iffy. Look at the root domains (like example.com) that are linking to you: What is their PageRank? Do they have a decent social media following, or are many people sharing their content in social networks? If most or all of the websites that link to you seem low quality, you may be in trouble. It's time to build up some quality backlinks and/or try to get rid of some of the low-quality links.

5) Avoid site-wide links. These don’t look too natural, and many sites that sell links will put the links on all their site’s pages.
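
Tip 3 above is easy to sanity-check yourself. Below is a minimal Python sketch of the idea, assuming you have exported your backlinks to a CSV file with an anchor_text column; the file name backlinks.csv and the column name are hypothetical, so adjust them to match whatever your export actually contains.

import csv
from collections import Counter

# Tally how often each anchor text appears in a backlink export.
# Assumes a file named backlinks.csv with an "anchor_text" column
# (both names are placeholders for your actual export format).
counts = Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        counts[row["anchor_text"].strip().lower()] += 1

total = sum(counts.values())
for anchor, n in counts.most_common(10):
    print(f"{anchor!r}: {n} links ({n / total:.0%})")

If one or two keyword phrases dominate the output, your anchor text may look unnatural to a person and to Google's algorithm.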

I hope the above website marketing how-to information was of help to you.

Anthony,
Your Internet website marketing partner at ProNetUSA.com

Relative Internal Links vs. Absolute Internal Links and SEO

Many SEOs will tell you there is no difference and that the main thing is to be consistent with your choice. Others (they seem to be in the majority) support absolute URLs, while web developers and designers consider the question silly and unimportant.

For information on Internal Links and Website Navigation Links (SEO), CLICK HERE

First, let's look at the one type of absolute link and the two types of relative links:

1. Absolute link
An absolute link contains the protocol (http://) plus the hostname (the domain name, with subdomain if any), followed by the file path to the actual resource.
Example: http://www.example.com/directory/page.htm

2. Root-relative link
This style of linking begins with a forward slash and omits the protocol and the hostname. Every link written this way is assumed to start with whatever the current protocol and hostname are in the user agent (browser).
Example: /directory/page.htm

3. Relative link
Relative links are computed relative to the current location; they do not begin with a forward slash, and that's where you get the crazy ../../ schemes to go back a few directories toward the domain root.
Examples: ../directory/page.php or directory/page.aspx
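
To make the three link types concrete, here is a short Python sketch (my own illustration, using the standard library's urllib.parse.urljoin) showing how a browser or crawler would resolve each style of link against the same base page; the example URLs are hypothetical:

from urllib.parse import urljoin

# The page the links appear on.
base = "http://www.example.com/blog/posts/page.htm"

# 1. An absolute link is used as-is; the base page is ignored.
print(urljoin(base, "http://www.example.com/directory/page.htm"))
# -> http://www.example.com/directory/page.htm

# 2. A root-relative link keeps the protocol and hostname of the base.
print(urljoin(base, "/directory/page.htm"))
# -> http://www.example.com/directory/page.htm

# 3. Relative links are resolved against the base page's directory.
print(urljoin(base, "../directory/page.php"))
# -> http://www.example.com/blog/directory/page.php
print(urljoin(base, "directory/page.aspx"))
# -> http://www.example.com/blog/posts/directory/page.aspx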

Absolute URLs for internal interlinking:

Example: <a href="http://www.searchenginejournal.com/about-us/4070">About SEJ</a>
* are better when handling canonicalization issues;
* are safer when talking about site hijacking;
* are safer when switching to a new CMS;
* will save you in cases when your content is stolen and the thief does not take time changing the internal references;
* are a better choice if your content is distributed via email (you do want your readers to click the internal links and actually get to the page, don't you?);
* might be easier for search engines to follow, as they resolve all relative URLs to absolute ones before following them.

Relative URLs for internal interlinking:

(First, a short definition of a relative URL)

A URL whose location is specified relative to the address of the base document in which the URL resides. It provides a shorthand way to refer to files or resources that use the same protocol, domain name, or directory path as the current document. (source)

Example: <a href="/about-us/4070">About SEJ</a>
* make it easy to move from one domain to another;
* make the code shorter, which might decrease a page's download time.

I hope the above website marketing information was of help to you.

Anthony,
Your Internet website marketing partner at ProNetUSA.com

WordPress User Capabilities and Manager Plugin

Depending upon what type of WordPress user you are, you will be able to perform all or some of the following actions: publishing, editing, and deleting posts and pages; moderating comments; managing users; and managing themes and plugins.

If the default WordPress user system doesn't meet your requirements, then check out Thomas Schneider's Role Manager plugin. This plugin allows you to edit the capabilities of existing roles, or even create completely new roles with arbitrary collections of capabilities!

Administrator, Author, Editor, Contributor, Subscriber – WordPress User Permissions

I hope the above website design information was of help to you.

Anthony,
Your Internet website marketing partner at ProNetUSA.com

WordPress User Permissions – Administrator, Author, Editor, Contributor, Subscriber

Listed below are the permissions for the five default WordPress users.

In order of most permissions first:

Administrator – Somebody who has access to all the administration and management features, including management of all the users below.

Editor – Somebody who can publish and manage their own posts, as well as manage other people's posts.

Author – Somebody who can publish and manage their own posts.

Contributor – Somebody who can write and manage their own posts but cannot publish them without getting individual post approval from one of the first two users listed above.

Subscriber – Somebody who can read comments, make comments, and receive newsletters.

A WordPress User Capabilities and Manager Plugin is available.

I hope the above WordPress website design information was of help to you.

Anthony,
Your Internet website marketing partner at ProNetUSA.com

Three tips to protect your WordPress installation

The below information was received in part from Matt Cutts.

Here are three easy but important ways to protect yourself if you run a WordPress blog:

1. Secure your /wp-admin/ directory. What I've done is lock down /wp-admin/ so that only certain IP addresses can access that directory. I use an .htaccess file, which you can place directly at /wp-admin/.htaccess. This is what mine looks like:

AuthUserFile /dev/null
AuthGroupFile /dev/null
AuthName "Access Control"
AuthType Basic
order deny,allow
deny from all
# whitelist home IP address
allow from 64.233.169.99
# whitelist work IP address
allow from 69.147.114.210
allow from 199.239.136.200
# IP while in Kentucky; delete when back
allow from 128.163.2.27

I’ve changed the IP addresses, but otherwise that’s what I use. This file says that the IP address 64.233.169.99 (and the other IP addresses that I’ve whitelisted) are allowed to access /wp-admin/, but all other IP addresses are denied access. Has this saved me from being hacked before? Yes.
2. Make an empty wp-content/plugins/index.html file. Otherwise, you leak information about which plugins you run. If someone wanted to hack your blog, they might be able to do it by discovering that you run an out-of-date plugin on your blog and then exploiting it.
3. Subscribe to the WordPress Development blog at http://wordpress.org/development/feed/ . When WordPress patches a security hole or releases a new version, they announce it on that blog. If you see a security patch released, you need to upgrade or apply the patch. You leave yourself open to being hacked if you don’t upgrade.

And here's a bonus tip: in the header.php file for your theme, you might want to check for a line like this:

<meta name="generator" content="WordPress <?php bloginfo('version'); ?>" />

I'd just go ahead and delete that line, or at least the bloginfo('version') part. If you're running an older version of WordPress, anyone can view the source to see what attacks might work against your blog.

Hat tip to Reuben Yau and Shoe.

I hope the above information is of help to you.

Anthony,
Your Internet website marketing partner at ProNetUSA.com

10 New Google Analytics Features You Need to Start Using

The below information was received in part from SeerInteractive and was written by Rachael Gerson.

Over the past eight months, Google has steadily released one revolutionary new feature after another. On March 17, the company announced a new version of Google Analytics. Up until this point, users could decide whether they preferred to stick with the old interface or switch to the new one. However, Google recently announced that the old version of GA will be turned off in January 2012.

If you’re not already familiar with the new version, take the next few weeks to get comfortable with it.

To help you get started, let’s review the top 10 features of the new Google Analytics at Mashable.com

I hope the above information is of help to you.

Anthony,
Your Internet website marketing partner at ProNetUSA.com

Adding a website to Google

The below information was posted in part from Google.

Inclusion in Google’s search results is free and easy; you don’t even need to submit your site to Google. Google is a fully automated search engine that uses software known as “spiders” to crawl the web on a regular basis and find sites to add to our index. In fact, the vast majority of sites listed in our results aren’t manually submitted for inclusion, but found and added automatically when our spiders crawl the web.

If you’ve just added a URL to your site, or a page has significantly changed since the last time it was crawled, you can ask Google to crawl it.

If your site offers specialized products, content, or services (for example, video content, local business info, or product listings), you can reach out to the world by distributing it on Google Web Search. For more information, visit Google Content Central.

Read more on the above subject at Google.com

I hope the above information is of help to you.

Anthony,
Your Internet website marketing partner at ProNetUSA.com

Duplicate Content and Search Engines

The below information was received in part from SEOmoz and was written by Dr. Peter J. Meyers.

Duplicate content exists when any two (or more) pages share the same content.

Duplicate content as an SEO issue was around long before the Panda update, and has taken many forms as the algorithm has changed.

Since Panda (starting in February 2011), the impact of duplicate content has become much more severe in some cases. It used to be that duplicate content could only harm that content itself. If you had a duplicate, it might go supplemental or get filtered out. Usually, that was ok. In extreme cases, a large number of duplicates could bloat your index or cause crawl problems and start impacting other pages.

Panda made duplicate content part of a broader quality equation – now, a duplicate content problem can impact your entire site. If you’re hit by Panda, non-duplicate pages may lose ranking power, stop ranking altogether, or even fall out of the index. Duplicate content is no longer an isolated problem.
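
As a rough illustration of the definition above (my own sketch, not SEOmoz's method), here is a minimal Python example that flags pages serving byte-for-byte identical content by comparing hashes of their HTML; the URL list is hypothetical:

import hashlib
from urllib.request import urlopen

# Hypothetical list of pages to check for exact duplicates.
urls = [
    "http://www.example.com/page-a",
    "http://www.example.com/page-b",
]

seen = {}  # content hash -> first URL seen with that content
for url in urls:
    body = urlopen(url).read()
    digest = hashlib.sha256(body).hexdigest()
    if digest in seen:
        print(f"Duplicate content: {url} matches {seen[digest]}")
    else:
        seen[digest] = url

Real-world duplicate detection is fuzzier than this (near-duplicate text, shared page templates), but an exact-match hash is a quick first pass.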

Read more on the above subject at SEOmoz.com

I hope the above information is of help to you.

Anthony,
Your Internet website marketing partner at ProNetUSA.com

Google Quality Raters Handbook – does it exist?

The below information was received in part from SearchEngineLand and was written by Barry Schwartz.

Brian Ussery has discovered a revised copy of the Google Quality Raters Guidelines, which he archived on his own site.

The documents are used by Google Quality Raters to aid them in classifying queries, measuring relevancy, and rating the search results. To do so, a Quality Rater must understand how Google works, and this document covers a lot of that. Let me pull out some of those details in easy-to-read bullet points.

Three Query Types:

* Navigational: someone searching for a site, such as a search for IBM.
* Informational: someone searching for information on a topic of interest, such as finding out more information on Danny Sullivan.
* Transactional: someone searching when seeking to purchase something either online or offline, such as searching for ‘buy ipod touch.’

Quality Rating Scales:

* Vital: This is the highest score a web page can receive for a query. A vital result comes from a query that is most likely navigational and the resulting page is the official web page of the query. When searching for ‘ibm’, the vital result would be www.ibm.com.
* Useful: This is the second highest score a web page can receive for a given query. A useful rating should be assigned to results that “answer the query just right; they are neither too broad nor too specific.” One of the examples given for a useful rating would be a search on meningitis symptoms with a resulting web page of http://www.webmd.com/hw/infection/aa34586.asp
* Relevant: This comes after a useful rating and is used for less useful results. The guidelines say these results are often "less comprehensive, come from a less authoritative source, or cover only one important aspect of the query." An example would be a review of laptop computers that covers only five computers and not all computers within its class. Since it is not a fully comprehensive review, it would be rated as relevant and not useful.
* Not Relevant: This rating is used for pages that are not helpful to the query but are still somewhat connected to it. Classifications of a not relevant page would be "outdated, too narrowly regional, too specific, too broad" and so on. One of the examples given is a search for 'BBC' that returns a specific article from the BBC; it is too specific and is not relevant to the query at hand.
* Off-Topic: This is the lowest rating a page can receive for a query. If the returned page is completely not relevant to the query, it would be given a rating of “off topic.” An example given is a query on ‘hot dogs’ that returns a page about doghouses.

Categories For Results That Can’t Be Rated: Not everything can be rated, and those must be classified somehow. The categories for those types of results include:

* Didn’t Load: For pages that return a 404 error, page not found, product not found, server time out, 403 forbidden, login required, and so on.
* Foreign Language: This is given to a page that is in a “foreign language” to the “target language” of the query. English is never a foreign language, no matter what. So, if you search in Chinese for something and a Hebrew page is returned, it is a foreign language, but if an English page is returned, it is not a foreign language. There are exceptions to the rule.
* Unratable: When the rater cannot rate it for any other reason.

Spam Labels: Now for the really good stuff, spam labels. This is a new addition to the quality raters guidelines and is fairly small. The labels include:

* Not Spam: The not spam rating is given to a page that "has not been designed using deceitful web design techniques."
* Maybe Spam: This label is given when you feel the page is “spammy,” but you are not 100% convinced of that.
* Spam: Given to pages you feel are violating Google’s webmaster guidelines.

Flags: Flags are for pages that require immediate attention, such as:

* Pornographic content
* Malicious code on pages

That is a brief overview of some of the many points in the document. For more, see the archived document and for some history, check out Google Blogoscoped. Here is an additional copy of this document at Huomah.com.

Read more on the above subject at SearchEngineLand.com

I hope the above information is of help to you.

Anthony,
Your Internet website marketing partner at ProNetUSA.com