Thursday, May 31, 2012

Google Places is Out, Replaced By Google Plus Local


 
In its latest move, Google has updated numerous Google+ features and replaced Google Places in the process. According to industry experts, this change, alongside Google’s Penguin algorithm and Knowledge Graph, will bring big changes that online businesses and internet marketers will have to take heed of.
 
Studies of the new Google+ modification have shown that Google+ Local pages will now appear where Google Places pages used to appear in search results. According to Google’s Marissa Mayer, the company has already replaced 80 million Google Places pages with Google+ Local pages, and many more replacements are expected within the next few days.

In addition, the internet giant has also tweaked its review system, replacing its traditional 5-star rating system with Zagat’s 30-point rating scale. Google has also made the Zagat reviews more in-depth by adding subsections to the review process, such as service, atmosphere and the like.

Industry experts and analysts likewise report that there are other advantages to the new Google+ Local pages, such as their integration with other Google features, like Google Properties, Maps, Mobile, Search and more. Additionally, their integration with Google+ “Circles” is expected to help users find recommendations made by family, friends, colleagues and other connections, making reviews more honest and accurate and minimising false reviews.

Regarding search engine optimisation, SEO experts are saying that because Google+ Local pages (unlike Google Places pages) are actually indexed by Google, optimisation is easier for those who are looking for more traffic, better search engine rankings and more customers in the long run.

Concurrently, since a user’s Google Places page is converted into a Google+ Local page automatically, it can likewise serve as an excellent entry point into internet marketing and social media marketing campaigns. According to experts, this opportunity should be maximised if one wants to earn even more in the end.

This view is shared by Oracle Digital’s Director of Operations, Clint Maher. He says, “This move to replace Google Places with Google+ Local is an excellent opportunity for business owners who are looking to make more out of their online campaigns. After all, not only does it bring more accurate ratings for businesses, but it will also make your social media marketing campaigns a lot more effective as well. You just need to follow the proper procedures – and everything will follow.”

“Although this new interface may seem a bit daunting to some, this is not really the case. You just need to create relevant and valuable content, then share it using the right procedures in order to widen your market. Fortunately, we have the tools and tactics to do this, and we are always willing to use them for all of our clients,” Maher added.
Indeed, with the new Google+ Local pages, businesses will have a better chance of getting more out of their online campaigns. And with the help of a company that is an expert in its field, this should not be a problem at all.

Google Algorithm Updates: Panda And Penguin Part 2


Google Algorithm Updates: Penguin 
 
Google estimates that the Penguin update has affected just over 3% of search queries in the English language. Its main goal has been to rectify the way Google scores backlinks. Google uses backlinks as one aspect of its voting system, generating details on which websites the rest of the internet deems important. If a site receives more quality backlinks than its competitor, it is seen as a better resource with more valuable content. Backlinks have become a major target of spammers as an easy way to quickly increase ranking. Also known as the ‘webspam algorithm update’, Penguin seems to target websites with three types of spammy backlinks:
1. Sites where the majority of backlinks come from low-quality pages, including link directories, link exchange pages, sponsored links or footer links
2. Sites where the majority of a website’s backlinks are from an unrelated niche or webpage
3. Sites where the backlinks pointing to the site all have the same keyword anchor text
Penguin has not affected websites that contain a good mixture of natural high-quality and low-quality backlinks. The general consensus is that creating balance in backlinks is the key. This can only be accomplished through a wide and varied backlinking strategy.
 The Importance of Backlink Variety in Google Algorithm Updates
To remain unaffected by Google algorithm updates, backlinks should appear natural by coming from a variety of locations. Many sites seek to increase their Google ranking by building backlinks through a single method, like blogs or article directories. Backlinks from a single type of site are a giant red flag for Google. Webmasters should always seek to create broad diversity in their backlinks. The more backlinks are spread across many kinds of sites, like directories, wikis, press releases, forums and social media, the more natural they will appear to Google.

The next aspect to this idea is diversity in the page location where the backlink is found.  Generally speaking, links found in on-page clusters can suggest paid placement, which is heavily discouraged by Google.  This commonly occurs when links are added to blog rolls, short text blurbs and forum threads.  Penguin seems to be successful at punishing sites with too many of these link clusters.  Once again, maintaining variety in backlinks creates a much more natural picture, and will prevent loss of ranking due to future Google algorithm updates.

Backlink Authority And Relevance 

The more authority the linking page has, the higher the quality of the link. This is no surprise, as page ranking factors have been at the heart of Google’s link algorithm for a long time. But the Penguin update has changed the algorithm to ensure those high-PR sites are within the same niche or category. Backlinks should also be naturally spread across all page ranks. When they all occur in a specific page-rank zone, it acts as an alert to Google that spamming practices are at work.

Anchor Text


Google algorithm updates have been targeting aggressive building of anchor links since before Penguin. Examination of websites affected by Penguin reveals that many participated in over-optimization of anchor text links. The websites that lost ranking due to Penguin had their ‘target’ keyword appear in at least 60% of their anchor text links. This percentage is far too high to be viewed as natural. Often, when a natural anchor link occurs, it is in the form of ‘click here’, the site name or simply the website’s address. Such a high percentage of keyword-focused anchor text links is highly improbable and an easy sign to Google that spamming practices are at work.
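As a rough illustration of that 60% figure, you can estimate the keyword share of your own anchor texts with a short script. The helper name, sample anchor list and threshold below are assumptions for the example, not an official Google metric:

```python
# Illustrative sketch: measure what share of a link profile's anchor
# texts exactly match the target keyword. The anchors and the 60%
# threshold are made up for the example.
from collections import Counter

def keyword_anchor_ratio(anchors, target_keyword):
    """Return the fraction of anchor texts matching the target keyword."""
    if not anchors:
        return 0.0
    counts = Counter(a.strip().lower() for a in anchors)
    return counts[target_keyword.lower()] / len(anchors)

anchors = [
    "great content writer", "great content writer", "great content writer",
    "click here", "www.example.com", "great content writer",
]
ratio = keyword_anchor_ratio(anchors, "Great Content Writer")
print(f"{ratio:.0%} of anchors use the target keyword")  # 4 of 6 -> 67%
if ratio >= 0.60:
    print("Warning: anchor text profile looks over-optimized")
```

In a real audit the anchor list would come from a backlink export rather than a hard-coded list, but the ratio check is the same.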

What Penguin Means Moving Forward


The bottom line is that Google has been very clear in its guidelines about which practices are acceptable. Penguin did not enact any revolutionary change; it just evolved the algorithm to be more adept at catching sites that break the rules. The update simply discounted the weight given to linking patterns it believes indicate spam tactics. What we know going forward is simple: practice sound linking techniques that emphasize diversity and link relevance. While no site will ever be completely protected against Google’s algorithm updates, by following the guidelines and avoiding spamming activities you can guard against penalties and de-indexing.

Tuesday, May 29, 2012

Google Penguin 1.1 algorithm update confirmed

It has been officially confirmed by Google's Matt Cutts that the Penguin 1.1 algorithm update is now live.

The Distinguished Engineer took to Twitter late on Friday night to announce: "Minor weather report: We pushed 1st Penguin algo data refresh an hour ago. Affects <0.1% of English searches." The original Penguin algorithm change on 24 April 2012 was said to affect 3.1 per cent of English search queries, while last year's major Panda algorithm update affected closer to 12 per cent, so in comparison the Penguin 1.1 update is minor.

The announcement puts paid to rumours that there had been several Penguin changes in the last few weeks, insisting this is the first update since the original took effect; the fluctuations in search engine results noticed in recent weeks may therefore be a delayed effect of Penguin 1.0. Like its predecessor Google Panda, the Penguin algorithm change is intended to curb webspam by penalising sites that violate Google's quality guidelines, including those that are 'over-optimised' and use black hat SEO techniques.

Fighting spam and scams

Although Google has not revealed any of the specific aspects that are penalised by the Penguin update, the official blog post stated: "Our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics".

Richard Frost, managing editor at theEword, commented: "While major algorithm updates such as Panda and Penguin 1.0 have caused noticeable fluctuations in SERPs, it's unlikely we'll see Penguin 1.1 making much of an impact. However, a large number of sites are still trying to recover from the original algorithm update and figure out where they went wrong. Speculation in the SEO industry suggests unnatural linking and low quality content are the culprits, but there is confusion as to why some high quality sites have also been negatively affected."

Friday, May 25, 2012

Way To Use Social Media To Enhance Your Website's Ranking


If you have recently heard that your website’s search engine ranking may decline as a result of Google’s latest “Penguin Update,” and you’d like to help minimize any decrease in traffic, there are a few things you can do to boost your site. In this day and age, numerous businesses operate websites. The businesses operating websites which successfully bring in qualified leads understand the critical importance of maintaining an effective search engine optimization (SEO) strategy. Your SEO plan most likely includes performing keyword research on a regular basis, and strategically placing the most effective content throughout the pages of your ecommerce website. Each page of your site is well optimized, including the programming code for the metadata, titles, headings, alt tags, and anchor text within internal links. You have a system for creating a lot of inbound links, you submit your site to countless directories, you have RSS feeds, you write a blog post every day, you actively participate in social media networks, and much more. Even though you continually work extremely hard to follow Google’s best practices for SEO, your website’s ranking may fall.


According to Wikipedia, “Google Penguin is a code name for a Google algorithm update that was first announced on April 24, 2012. The update is aimed at decreasing search engine rankings of websites that violate Google’s Webmaster Guidelines by using black-hat SEO techniques such as keyword stuffing, cloaking, participating in link schemes, deliberate creation of duplicate content, and others.” Although this update is designed to punish the sites which deliberately “spam” the search engines, your business website could potentially be downranked by Google as well. In order to protect your site from losing web traffic and from decreased sales of your products and services, now is the time to expand your website’s online presence.

A great way to enhance your business website and grow your online marketplace is through the use of social media networks. It’s important to share exciting information, news and events concerning your company’s products and services on the two largest social networking sites, Facebook and Twitter. As your first step, devise your social networking marketing plan. Next, develop content that will create a “buzz” about your products and/or services. Then, post the “buzz-worthy” content, on a regular basis, within Facebook and Twitter. There are many more ways to encourage people to make recommendations about your products to their friends, family members and business associates. For instance, you may want to allow customers to post comments about your products and/or services within your website. The more positive comments you have about your business, the greater the chances people will share this content with their social networks. Make it as simple and easy as possible for your online followers and fans to share your content with their friends, family and associates. In addition, utilize the exciting content about your products within blog posts. Spend a little time researching like-minded businesses and then ask if they would like to exchange blog content. 

Here are three things you can consider for expanding your online demographics and boosting your website’s search engine ranking:
1. Drive more local search traffic to your website by implementing a Facebook pay-per-click ad campaign. Facebook has a massive following, so now is a great time to explore how it can help you grow your business.
2. Build a profile within Google+. This social media site can really help drive a lot of traffic to your website. Google+ allows people to create a profile and share the right things with specific groups of people. Although it has not surpassed Facebook or Twitter in “social” popularity, it does carry great authority when ranking content, authors, and users in search engine results pages (SERPs). So, create a (well optimized) user profile within Google+ and start sharing the fabulous content about your company’s products and/or services.

3. Create a profile account within the social media site called Pinterest. Pinterest allows you to “organize and share all the beautiful things you find on the web. People use pinboards to plan their weddings, decorate their homes, and organize their favorite recipes. Best of all, you can browse pinboards created by other people. Browsing pinboards is a fun way to discover new things and get inspiration from people who share your interests.” As Pinterest is now the third most popular social networking site, you may want to check out the site, submit a request for an invitation, and start pinning as soon as possible!
To some small and midsized business owners, website search engine optimization may seem like a daunting task. Your staff may not have the time necessary to devote to maintaining your website for optimal SEO. We are happy to help you and your team with your business website, social networks campaigns, marketing and search engine optimization efforts. 

Friday, May 18, 2012

Penguin or Panda? How To Determine Which Google Algorithm Update Impacted Your Website

Ever since Google rolled out Penguin 1.0 on April 24th, I’ve been heavily analyzing websites that were hit by the update (I’ve now analyzed close to 75 websites hit by Penguin).  Based on my analysis, I have written several posts covering my findings.  In my latest post, An Update from the Over Optimization Front Lines, I explained how important it is for webmasters to know exactly what hit them before taking action.  I know that sounds simple, but I’ve had several companies contact me believing they were hit by Penguin, when in fact, they were hit by Panda.

Panda, Penguin, and The Algorithm Sandwich

After Penguin 1.0 was released, Google also explained that a Panda update was rolled out a few days before Penguin (on 4/19).  Then, to make matters even more confusing, Google rolled out a Panda refresh on 4/27.  To quickly recap, Panda rolled out on 4/19, then Penguin on 4/24, and then a Panda refresh on 4/27.  Yes, that’s essentially an algo sandwich special, with a side of insanity.  As you can imagine, webmasters that aren’t extremely familiar with SEO could very easily think they were hit by Penguin (since that was the primary topic during the time period).

The Danger of Not Knowing

Since Penguin and Panda target two different issues, it’s extremely important to know the exact algorithm update that hit your website.  Panda targets low quality content, thin content, duplicate content, etc., while Penguin targets webspam (and at this point it’s heavily targeting unnatural inbound links).  So, if you incorrectly believe you were hit by Penguin and start addressing links, then you would be wasting your time…  On the flip side, if you incorrectly believe you were hit by Panda and start addressing low quality content, then you could also be wasting your time.
And to make matters worse, both Penguin and Panda will be rolled out periodically.  That means you won’t know if your latest refinements actually made a difference until Pandas and Penguins come knocking on your door again.  And that is exactly why I wrote this post today.  I’ve had several people mistakenly believe they were hit by Penguin, when it was Panda (or vice versa).  And some were already making changes, based on the wrong assessment.  So, don’t prune your links if you were hit by Panda, and don’t gut content if you were hit by Penguin. Know what hit you, and then act.

How To Determine If You Were Hit by Penguin or Panda


Working in Google Analytics


1) Check Your Dates

The first thing you should do is launch Google Analytics and drill into Google Organic reporting.  Set the timeframe to April 1st through May 15th.  More on why May is important in a minute.  This will give you a good view of traffic by day during the various algorithm updates.  Remember, Panda was on 4/19, Penguin was on 4/24, and then a Panda refresh rolled out on 4/27.
In the graphs below, you can clearly see that one site was hit by Penguin while the other has been hit by Panda (twice).

A Website Hit by Panda Twice:


A Website Hit by Penguin:

 

Note: I explained above that you should set your final date to May 15th for a reason. There has been a lot of chatter recently about another possible Google update. I first received calls from webmasters on Saturday, May 12th about traffic fluctuations beginning on Friday, May 11th. Some actually had their traffic bounce back after getting hit by Panda. Barry Schwartz covered this on Search Engine Roundtable, and Google said it was not a Penguin update or a Panda update. One thing is for sure… there was some type of update.
2) Meeting Panda on a Weekend – Dimension by Keyword and Compare to Past
Now that you know which algorithm update hit you, you can start to determine the keywords that dropped.  Penguin rolled out on a Tuesday, while Panda rolled out on a Thursday, and then followed with a refresh on a Friday!  Since many sites see a natural dip late in the week and on weekends, it’s important to start understanding normal visitor trending, and which keywords potentially were hit.
First, within Google Organic, set the primary dimension to “Keyword”.  This will show you all of the keywords leading to your site from Google Organic during the timeframe.

 
Next, compare the dates after you were hit by Panda or Penguin with a previous timeframe to compare traffic by keyword.  To do this, click the date in the upper right hand corner of the interface and select a timeframe.  If you were hit by Penguin, select 4/24 to 5/15.  If you were hit by Panda, select 4/19 to 5/15.  Then click the checkbox for “compare to past”.  The default comparison will be the number of days immediately prior to the range you selected.  You can change that by selecting new dates to compare, if needed.

 
You will now be presented with all of the keywords leading traffic to the site, along with the percentage of increase or decrease (compared to the previous timeframe). How awesome is that? If you see a keyword drop by 75%, it probably got hit. Then you can dimension that keyword by “Landing Page” to see which webpage got hit. Spend some time here… the insights you glean could be incredibly valuable to your recovery efforts.
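For large keyword lists, the same before/after comparison can be done on data exported from Google Analytics. A minimal sketch, assuming you have the keyword report as rows with hypothetical visits_before/visits_after columns (in practice these rows would come from csv.DictReader over your export, and the column names will differ):

```python
# Illustrative sketch: flag keywords whose visits dropped sharply
# between the two compared periods. Field names and the -50% threshold
# are assumptions for the example.
def flag_keyword_drops(rows, threshold=-0.50):
    """Yield (keyword, pct_change) for keywords that fell past the threshold."""
    for row in rows:
        before = int(row["visits_before"])
        after = int(row["visits_after"])
        if before == 0:
            continue  # no baseline to compare against
        change = (after - before) / before
        if change <= threshold:
            yield row["keyword"], change

rows = [
    {"keyword": "blue widgets", "visits_before": "400", "visits_after": "100"},
    {"keyword": "brand name", "visits_before": "250", "visits_after": "240"},
]
for keyword, change in flag_keyword_drops(rows):
    print(f"{keyword}: {change:.0%}")  # blue widgets: -75%
```

The branded keyword barely moves while the target keyword craters, which is exactly the pattern you then investigate by landing page.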

The Not So Obvious – Google Webmaster Tools and Filters

Although a lot of webmasters are familiar with Google Analytics, I find there are still many who don’t have Google Webmaster Tools set up.  As I mentioned in my post about Avoiding SEO Disaster During a Website Redesign, it’s essential to have GWT set up for your domains.  There is a wealth of information directly from Google… including messages from the Search Giant about the SEO health of your sites.  And yes, Google Webmaster Tools can help you determine which algorithm update hit your site.
1) Search Query Data
There is a tab in Google Webmaster Tools titled “Traffic” that holds a link for “Search Queries”.  This tab reveals the impressions and clicks for queries that returned your webpages in the search results.  Yes, you can see impression data and click data directly from Google properties.  While Google Analytics relies upon a click to your site, this data shows you how many impressions your content is receiving for queries on Google.  For our purposes, we can see the surge or dip in impressions and clicks as the various algorithm updates rolled out.
As you can imagine, this is a great way to see the impact of a certain algorithm update.  The default view is 30 days back, but you can now select a greater time range (up to 90 days).  Again, let’s check April 1st to May 15th to view impressions and clicks.



At this point, you can start to identify impression and click issues. If you were hit by Penguin, then you might see a steep drop-off on 4/24, and then lower levels beyond.  If you were hit by Panda, then you might see a steep drop-off on 4/19, and then again on 4/27 (if you were hit by both updates). Here is data I exported from Google Webmaster Tools for a site hit by Panda twice.



2) Focus on the Problem – Filter by Web
During my analysis of sites hit by Penguin and Panda, I noticed something interesting in Google Webmaster Tools.  For certain sites, using the filters available helped some webmasters hone in on their problem.  There is a “filters” button in the upper left-hand corner of the Search Queries report.  This lets you filter your results based on a number of criteria.  For our purposes, let’s filter by Google property.  Click the dropdown that’s labeled “Search” and choose “Web”.  That will filter your data by web-only searches, and will exclude Images, Video, Mobile, etc.


After doing this, you might see a more pronounced drop during 4/19, 4/24, and 4/27.  It will also enable you to view keywords that dropped from web search without mixing other Google properties in, which can skew the results.  For example, I analyzed several sites that actually received more impressions from Google Images after being hit by Penguin and Panda! Go figure… Removing that data provided a clearer view of the problem.
3) Export Your Data
Although Google Webmaster Tools recently rolled out an update enabling you to view up to 90 days of search query data, you can’t go back further… That means you should export the current data in order to archive it, work with it, and analyze it.  You will notice two buttons labeled “Download this table” and “Download chart data” under the trending graph.  Export your data now.
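Once the data is exported, a quick check for the steepest day-over-day fall in impressions can confirm which update date hit the site; it should line up with 4/19, 4/24 or 4/27. A minimal sketch with made-up impression counts (the function and the numbers are illustrative, not Google's export format):

```python
# Illustrative sketch: find the day on which impressions fell most
# sharply versus the previous day. The series below is invented
# sample data around the Penguin rollout.
def steepest_drop(series):
    """Given (date, impressions) pairs in date order, return the date
    with the largest day-over-day decrease and the size of that drop."""
    worst_date, worst_delta = None, 0
    for (_, prev), (date, curr) in zip(series, series[1:]):
        delta = curr - prev
        if delta < worst_delta:
            worst_date, worst_delta = date, delta
    return worst_date, worst_delta

series = [
    ("2012-04-22", 12000),
    ("2012-04-23", 11800),
    ("2012-04-24", 5400),   # steep drop on the Penguin date
    ("2012-04-25", 5200),
]
print(steepest_drop(series))  # ('2012-04-24', -6400)
```

If the worst drop lands on one of the Panda dates instead, you know which animal to blame before you touch a single link or page.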

Summary – You Must Know the Problem in Order to Address It

Based on how Google rolled out Penguin and Panda recently, I’m finding it’s common for webmasters to be confused about which algorithm update hit their websites.  Penguin 1.0 and the latest Panda updates were so close that it’s easy to believe you were hit by one, when in fact, it could have been the other.  Use the techniques I listed in this post to help you determine which update really hit your site.  Then form a plan of attack knowing which cute animal you are dealing with.  Good luck.



How to Survive the Google Penguin Update with Effective Content Writing


If search engine traffic from Google matters to your business, then there is little chance that you haven’t heard of the recent Google Penguin update. What exactly is this?
 Apparently, on April 24, 2012 Google activated new ranking algorithm changes to take care of websites and blogs that indulge in:
·             Excessive link building with no regard for quality
·             Deceptive doorway pages
·             Lots of keyword stuffing
·             Publishing lots of meaningless content just to get traffic from search engines 
Which, basically, means all websites that don’t comply with Google’s SEO guidelines.
In terms of improving search quality, this is a good change. It is also good for businesses and entrepreneurs legitimately trying to get good rankings without the headache of competing with websites that try to game the system.
But, as happens with most “simple” changes like this, there has been some collateral damage. Although Google claims that the new update has affected just 3 percent of websites, there have been multiple declarations across the internet of it causing a bloodbath. People are even going to the extent of laying off their employees and considerably scaling down their businesses. 
Are you one of those negatively affected by the Google Penguin update? If you are, you can salvage the situation by taking corrective measures. If you’re not, you should also take preventive measures so that you do not get caught in the fray the next time something like this happens.
How do you do this? With effective content writing, of course.

What is effective content writing, and how does it help?

Effective content writing provides true value. It is not done simply to improve your search engine rankings. Although there is nothing wrong with trying to improve your rankings, the problem comes up when you write and publish content for that purpose alone.
The days of cheap, low-quality SEO articles are rapidly going away — thankfully. With its successive updates, Google is trying to push forward content that really deserves its place in the ranking index. In turn, this means pushing down content that doesn’t carry much value: content that just rambles on will not rank well no matter how brilliantly it has been “optimized.”
So how do you create effective content that Google and other search engines love? Here are a few things you can keep in mind while creating content for your website or blog:
·             Use your keywords only when needed: Keywords are great, but don’t over-use them because this will make your content reek of spam. For instance, if I needlessly go on repeating “great content writer” everywhere on my website, not only will I fail to rank well for the phrase, I might even get penalized and removed from the rankings altogether. Use keywords but only when there is a relevant context. Don’t worry too much about keyword-optimizing your copy – just focus on quality and value.
·             Make your content social: Create your content in such a manner that it gains some popularity on social media and social networking websites. This way you don’t have to depend solely on Google for all your traffic. Create compelling and meaningful headlines. Provide content that is bang on target. Develop an original style and focus on quality rather than quantity.
·             Create a resource that is highly useful: An ability to write and publish content is a great privilege. There is so much you can teach and communicate to your audience. Make use of it. Whether you share your own information, or gather it from the internet, make sure you create content that addresses topics your audience is interested in and will have a use for. This will naturally make it irresistible for search engines, bloggers, and social media users, alike.
·             Create content for other websites and blogs: Prepare an editorial calendar for writing articles and guest blog posts that can be published on websites and blogs other than your own. This helps you gain new exposure and earn quality backlinks – just make sure you only offer your content to trusted and reputable content publishers.
·             Create engaging content for online forums and blog comment sections: Online forums are still alive and kicking, and so are blog comment communities. Great interactions go on at these places. There is a misconception that you interact on online forums and blogs just to get backlinks, and when you don’t get those link benefits, there is no use leaving comments there. Yes, sometimes you get some link juice, but even if you don’t, the added exposure you get — and the potential for greater traffic — is well worth the effort.
·             Regularly publish a newsletter: Newsletter publishing still rules the roost, as evidenced by the many quality email marketing newsletter publishing services that have been cropping up. It is the best way of keeping in touch with your readers and subscribers, and once you have built yourself a mailing list of a few thousand subscribers, you can instantly broadcast your ideas and offers to these people without having to rely upon search engine traffic.
·             Maximize your conversion rate: Re-examine your content and see how well it is working to convert your website visitors into customers. A higher conversion rate can compensate for low traffic periods, so look for ways to measure, analyze, and improve your content wherever necessary to make sure that those who do find your site (through search or through other means) are getting what they want from the experience. 
All the points mentioned above will not only help you improve your search engine rankings, they will also strengthen your overall online presence — both on your own blog or website and across the web.

Saturday, May 12, 2012

What is “Google Panda”?

It’s an improvement to Google’s search algorithm that helps Google fight content thieves and spammers. Remember Google Caffeine? This update is similar to the Caffeine update, but it might not impact most users the way Caffeine did.



If you asked anyone involved with SEO and SEM what the biggest SEO update of 2011 was, the Google Panda update would no doubt be at the top of the list. It has raised a lot of debate and quite a few headaches along the way. Soon after the Panda rollout, many websites, including Google’s webmaster forum, became filled with complaints of scrapers/copyright infringers getting better rankings than sites with original content.

Google Panda is a change to Google's search results ranking algorithm that was introduced in February 2011. The change aimed to lower the rank of "low-quality sites" and return higher-quality sites near the top of the search results. There was a surge in the rankings of news websites and social networking sites, and a drop in rankings for sites containing large amounts of advertising. This change reportedly affected the rankings of almost 12 percent of all search results.

To help affected publishers, Google published an advisory on its blog, thus giving some direction for self-evaluation of a website’s quality.

The Panda process

Google Panda was built through an algorithm update that used artificial intelligence in a more complicated and scalable way than previously possible. Human quality testers rated thousands of websites based on measures of quality, including design, trustworthiness and speed.

Many new ranking factors have been introduced to the Google algorithm as a result, while older ranking factors like PageRank have been downgraded in importance. Google Panda is updated from time to time and the algorithm is run by Google on a regular basis.

Google Panda affects the ranking of a whole site, or a particular section of it, rather than just individual pages on a site.

In addition to other changes, Panda seems to take the age of a web page into account. Some experts think this has adversely impacted sites with lots of "evergreen content". Because evergreen content usually has an older publication date, Panda seems to reduce its visibility in search results. For searchers looking for in-depth information, many of these evergreen posts are great sources of knowledge on a topic, and if they happen to be on a blog they often carry a long comment thread full of additional, valuable information. Google has not yet said how evergreen pages will be handled in search results in the future.

Google Panda v3.2 was released on January 14th, 2012, and v3.3 on February 29th, 2012. According to Google, these updates were just a "data refresh": a site that had previously escaped a penalty by mistake would now be penalized, and a site that had been wrongly penalized would have its penalty removed.

What is Panda targeting?


It would be quite easy to paint a picture of Panda’s main targets. In summary, the Panda update is designed to:

  • Reduce spam
  • Combat sites such as ‘content farms’
  • Improve scraper detection
  • Filter low quality content
  • Close vulnerabilities in its algorithm
The main factors that are being considered, and are clearly important are:

  • Low quality content
  • Excessive advertisements
  • Branding
  • User signals
Who has been affected by Google Panda?

Based on various reports from around the web, the following sorts of websites have been the clear losers:

  • Free classified websites
  • Advertising websites designed purely to host AdSense ads
  • Price comparison websites with lean content
  • Travel websites with poor or duplicated reviews
  • E-commerce websites with poor product pages
  • Article websites with low quality or reproduced content
  • Websites with poor usability and branding
It is notable that the Panda update has not affected the bigger brands; apparently being rich means being safe!

Friday, May 11, 2012

Web Analytics

Web analytics is the process of analyzing the behavior of visitors to a Web site. The use of Web analytics is said to enable a business to attract more visitors, retain or attract new customers for goods or services, or to increase the dollar volume each customer spends.

Web analytics is often used as part of customer relationship management analytics (CRM analytics). The analysis can include determining the likelihood that a given customer will repurchase a product after having purchased it in the past, personalizing the site to customers who visit it repeatedly, monitoring the dollar volume of purchases made by individual customers or by specific groups of customers, observing the geographic regions from which the most and the least customers visit the site and purchase specific products, and predicting which products customers are most and least likely to buy in the future. The objective is to promote specific products to those customers most likely to buy them, and to determine which products a specific customer is most likely to purchase. This can help to improve the ratio of revenue to marketing costs.

In addition to these features, Web analytics may include tracking the clickthrough and drilldown behavior of customers within the Web site, determining the sites from which customers most often arrive, and communicating with browsers to track and analyze online behavior. The results of Web analytics are provided in the form of tables, charts, and graphs.
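As a rough sketch of the kind of processing an analytics tool performs, the snippet below tallies referring hosts from hypothetical log records (page, referrer); real tools do far more, but the counting principle behind a "traffic sources" report is the same.

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical log records: (requested page, HTTP referrer).
hits = [
    ("/products", "https://www.google.com/search?q=widgets"),
    ("/products", "https://example-blog.com/review"),
    ("/about",    "https://www.google.com/search?q=acme"),
    ("/products", ""),  # direct visit, no referrer
]

def referrer_counts(records):
    """Count visits per referring host ('direct' when no referrer)."""
    counts = Counter()
    for _page, referrer in records:
        host = urlparse(referrer).netloc if referrer else "direct"
        counts[host] += 1
    return counts

top = referrer_counts(hits)
```

The resulting counts are exactly the sort of data an analytics dashboard would render as the tables and charts mentioned above.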

Benefits of a Web Analytics Tool

  • Monitor the return on investment for campaigns, website development, and maintenance spending.

  • Determine which promotional tools effectively drive traffic to the website.
  • Know the traffic sources and referring websites of the website.
  • Identify which features of the website need improvement or removal.
  • Know the demographics of the visitors.
  • Identify the most-viewed pages on the website.
  • Evaluate the strengths and weaknesses of the website from the analytics report.
  • Correct the weaknesses and reinforce the strengths of the website.

SEO Friendly Web Site Design Guidelines

If you are making a website to earn money online, or want to popularize your brand on the World Wide Web, then it is important to understand SEO: the technical process of making your website more visible in search engines. To reach the top positions, your website design should be friendly to search engine robots and crawlers.

How to design an SEO-friendly website?

Before applying any optimization technique, we should understand one thing: our primary objective is to make the website more user-friendly. But rankings are also essential to get business from the website, and that is done by making it friendly to search engine robots. Here are the guidelines:

1.         Give More Importance to Content – Content is the basic food of search engine robots, so avoid Flash animations and image-heavy layouts as much as you can, and make your website rich in relevant keywords.

2.         Page Load Speed – Try to keep your web pages light. If a page carries heavy content, divide it into two or three pages to improve load speed. This also makes the site more comfortable for visitors.

3.         Validate your code and keep your markup clean.

4.         Browser Compatibility – Check that your website design is compatible with all major web browsers, such as Mozilla Firefox, Internet Explorer, and Google Chrome.

5.         Say NO to Frames – Search engine spiders are unable to read frames, so there is no point in having a frame-based website.

6.         Search Engine Friendly Navigation – Your website navigation should be user-friendly but also search engine friendly, i.e. every link should be readable by search engine robots. JavaScript and Flash navigation won't help if you are optimizing your website for top rankings.

7.         Alt Attribute Placement – Make sure to add an alt attribute to all of your images. The alt text should be relevant to and descriptive of the image; that way it is readable by robots, and you can benefit from placement in search engine image results.

8.         Convert Dynamic Pages into Static Web Pages – If you have a dynamic website, try to provide static alternatives, or opt for a CMS that generates SEO-friendly URLs. Not every search engine is able to crawl dynamic web pages.

9.         Provide a Sitemap to your users and search engine robots.

10.       Check broken links and all typographical errors.
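Point 10 can be partly automated. Below is a minimal broken-link checker sketch: it takes a list of URLs and a status-lookup function that returns an HTTP status code. The lookup is stubbed out here with a dictionary so the sketch stays offline; in practice you might wire in a real HTTP request (e.g. via urllib.request), and all URLs shown are hypothetical.

```python
def find_broken_links(urls, get_status):
    """Return the URLs whose HTTP status indicates an error (>= 400)."""
    return [url for url in urls if get_status(url) >= 400]

# Stubbed status lookup standing in for real HTTP requests.
fake_statuses = {
    "https://example.com/": 200,
    "https://example.com/old-page": 404,
    "https://example.com/server-error": 500,
}

broken = find_broken_links(fake_statuses, fake_statuses.get)
```

Separating the check from the fetch also makes the logic easy to test without touching the network.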

If you are interested in an SEO-friendly website design and want to rank high in Google, please feel free to contact SEO Info Zone!

Google Penguin & Unnatural Links: How to Protect Your Site!


The past few weeks have been painful for many hundreds of thousands of site owners following the Google Penguin Update, unnatural link warnings, a parked domain bug, and a host of other Google updates. Businesses have folded and lives have been changed irrevocably – and all because of the immense power Google's algorithm wields.
For those who have experienced a ranking catastrophe in the last four weeks or so, you'll know only too well how it feels – and why things have changed for good. The web's 'Wild West frontier' days are over, and for the right businesses with the right approach to digital marketing, that is a great thing.
The issue, of course, is in dealing with the change. If your business was either forced into the spammy links game because that was what it took to rank in your niche, or you simply trusted your agency to 'do the right thing', then you could have an issue. And issues cost money.
It is possible to save your site even if you have a spammy backlink profile cluttered with links Google may view as low quality. The goal of this post is to arm business owners with the tools necessary to understand what may have triggered the drop in Google, and how to avoid the same issue coming back to bite you in the future.

Relative Searches

To begin understanding whether your site may be at risk, or to take a proactive approach to protecting your organic positions, it has never been more important to properly profile your site's backlinks.
You learn little from doing this in isolation, though, because Google's assessment of how "unnatural" your link profile might be isn't based on a single universal standard. Instead it's niche-specific, or search-specific.
For instance, the backlink profile of a site in the uber-competitive world of car insurance and that of one in a mature, buoyant market where social sharing is ripe (such as parenting) will look very different from each other.
Google uses machine learning to build up a picture of what a profile should look like within each genre/niche and uses that to test sites against on the fly. Once you understand this it is a short step to the realization that you should know precisely what your profile looks like and how it could be perceived.

Balance is Key

So how do you do that and how can it help you should a dreaded unnatural links email fall into your inbox via Webmaster Tools? The best way to answer that is for us to take a look at a real world scenario, profiling one site that received just such an email and the steps that were taken to reach resubmission nirvana.

Stage 1: Profiling

There are a variety of tools out there that can help make sense of the data; our favorites include, but aren't limited to, the following:
·          CognitiveSEO
·          LinkResearchTools
·          Ahrefs
·          SEOMoz Tools
·          MajesticSEO

If you can also benchmark against similar sites or competitors in the top five positions, you can quickly build up a multi-faceted picture of where your site may stand out.

Stage 2: What to Look for

As with any problem, finding out where the issue lies requires a segmented approach. We analyze 12 different metrics of the profile, including what types of sites the links come from: blogs, generic sites, parked domains, web directories, article directories, personal sites, ecommerce sites, forums, press releases, wikis, news sites, search engines, and social networks.
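A first pass at that kind of segmentation can be as simple as bucketing each linking URL by source type. The sketch below uses crude substring rules on hypothetical URLs; commercial tools classify far more accurately, but the shape of the output is the same.

```python
from collections import Counter

# Crude, hypothetical rules mapping URL fragments to source types.
TYPE_RULES = [
    ("blog", "blog"),
    ("forum", "forum"),
    ("directory", "web directory"),
    ("press", "press release"),
]

def classify(url):
    """Assign a source type to a linking URL via substring rules."""
    for needle, label in TYPE_RULES:
        if needle in url:
            return label
    return "other"

def profile_by_type(backlinks):
    """Count inbound links per source type."""
    return Counter(classify(url) for url in backlinks)

# Hypothetical backlink export.
links = [
    "https://seo-blog.example/post-1",
    "https://cars-blog.example/review",
    "https://widgets-forum.example/thread/42",
    "https://big-directory.example/listing",
    "https://news.example/story",
]
profile = profile_by_type(links)
```

A lopsided count in any one bucket, such as the blog-heavy profile discussed below, is the signal to dig into those specific links.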
Below we can see real data from the site in question (brand removed for client confidentiality):

Immediately, the number of blog links in our domain's profile (the black column) was a cause for concern, mainly because we know those kinds of links are being hunted right now, and the overall profile balance is upended by what appears to be a heavy reliance on blog posts. Such a find results in an immediate investigation of those specific links and some re-optimization work to reduce the risk of being spotted and chopped.
Next up is understanding where on the page your links are found (e.g., blog post, forum thread, group of links, short paragraph of text, blogroll, image). It's another key metric: generally, the more of your links that sit in clusters of other links, the more likely paid placement becomes (a big frown from Mr. Google).
Below you can see the results of this analysis for the site in question. Once again our 'groups of links' warning bell sounds, and we go off to investigate those specific inbound links:

There are many other ways of testing algorithmically for link or blog networks and one of the easiest ways is for the search engines to look for clusters of links in the web graph. This is where there is a tightly formed ‘cluster’ of sites all linking out to one another and to external sites much more prevalently than is natural. These nodes stand out and are ripe for investigation.
It's for this reason that keeping an eye on the number of outbound links from sites that link to you is key. If you find that a blog you have a link from also links to 200 other sites, it may well be the web equivalent of an escaped convict going to the mall in an orange jumpsuit.
Thankfully our site didn’t score too badly in this area with a decent spread of outbound links from their link partners.
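That outbound-link check is easy to sketch: given the number of external links each linking page carries, flag the ones that look like link-farm nodes. The threshold of 100 below is an illustrative assumption, not a published Google figure, and the page URLs are hypothetical.

```python
def suspicious_linkers(outbound_counts, threshold=100):
    """Return linking pages whose outbound link count exceeds the threshold.

    The threshold is an assumed heuristic, not a known Google cutoff.
    """
    return [page for page, n in outbound_counts.items() if n > threshold]

# Hypothetical pages that link to us, with their outbound link counts.
linkers = {
    "https://friendly-blog.example/post": 12,
    "https://link-farm.example/resources": 200,
    "https://partner.example/about": 8,
}
flagged = suspicious_linkers(linkers)
```

Anything flagged here is a candidate for the manual investigation described above, not automatic proof of a network.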

We could go on, but the one critical metric still to share looks at the authority of the pages that link back. We've all known for a long time that PageRank, MozRank, and other metrics are a key component of the link algorithm, and that is still true today. The emphasis has changed, however, from purely getting 'as many links as you can, quickly' to acquiring topically relevant, high-PR links from content.
Sadly the work of old is now catching up with many webmasters as spammy, low value links are pounced on by the Web Spam team.
The key is diversity and as we can see from the live example graph below (showing inbound links by PageRank) there is much work to do in acquiring those higher PR links. With so many PR1-3 links they are once again at risk of raising suspicions. Efforts here should be pointed at the creation of great relationships with the web influencers in their space and in the content required to reach them. That way they will obtain those PR5 and 6 links that are necessary to put them back in that safe place again.

Stage 3: Anchor Text

You might be wondering why this has its own separate section (surely we've written enough, right?), but anchor text is so important that it warrants its own moment in the sun.
It is also becoming clear to us that Google's move to switch off a link signal has not left a gaping hole in their core algorithm. It's not their style. Instead they have moved things on, and my own take is that the signal has been replaced by a mixture of social signals and, more importantly, greater emphasis on link relevance. That should change the way you create link prospect lists for good.
Profiling and understanding anchor text in this context is perhaps the most critical aspect of avoiding penalties, as much as it is of ranking for specific terms in the first place.
To understand a little more let’s go back to our example for a moment.
As you can see below we have taken a snapshot of the first 3,000 links from the site in question and organized the results by anchor text.

As you would expect, the former agency had been hitting a couple of terms hard with lots of lower-value links, which is why the term with 25% of the anchor text distribution holds only a small share of the overall PR value.
Controlling the distribution and ensuring that no one term is used too often is a key ingredient in the overall recipe for avoiding issues later on.
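Computing that distribution from your own link export is straightforward. The sketch below tallies each anchor text's share and flags any term above a chosen cap; the 20% cap and the sample anchors are illustrative assumptions, not known Google limits.

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of all links, as a percentage."""
    counts = Counter(anchors)
    total = len(anchors)
    return {text: 100.0 * n / total for text, n in counts.items()}

def over_optimized(anchors, cap=20.0):
    """Anchor texts whose share exceeds the cap (an assumed threshold)."""
    dist = anchor_distribution(anchors)
    return [text for text, share in dist.items() if share > cap]

# Hypothetical anchor texts pulled from a backlink export.
sample = (["cheap widgets"] * 5 + ["Acme Widgets"] * 2
          + ["click here"] * 2 + ["https://acme.example"])
risky = over_optimized(sample)
```

A healthy profile is dominated by brand names, bare URLs, and generic phrases, so anything commercial that floats above the cap deserves the kind of re-optimization work described above.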
The final piece of the jigsaw is link acquisition. Understanding and monitoring this helps you both avoid penalties and recognize whether one has been applied in the first place.
In the first instance, checking that the rate at which you acquire links is controlled and natural will help you avoid triggering any algorithmic penalties. The chart below demonstrates this perfectly and suggests that the site in question is prospering. If you see a sudden drop in live links, it can suggest devaluation, or deindexing, of some of your links, and with it a tasty penalty for your own site, so beware.
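Monitoring acquisition velocity boils down to comparing period-on-period link counts. The sketch below flags any month whose count jumps or drops by more than a chosen ratio relative to the previous month; the 50% tolerance and the sample counts are hypothetical.

```python
def velocity_alerts(monthly_counts, tolerance=0.5):
    """Return the indices of months whose new-link count changes by more
    than `tolerance` (as a fraction) versus the previous month.
    Catches both suspicious spikes and worrying drops."""
    alerts = []
    for i in range(1, len(monthly_counts)):
        prev, cur = monthly_counts[i - 1], monthly_counts[i]
        if prev and abs(cur - prev) / prev > tolerance:
            alerts.append(i)
    return alerts

# Hypothetical monthly new-link counts: steady growth, then a spike
# (possible manipulation) followed by a drop (possible devaluation).
counts = [100, 110, 120, 300, 130]
spikes_or_drops = velocity_alerts(counts)
```

Steady months pass silently; the spike month and the crash month both trip the alert, which mirrors the two failure modes described in the surrounding text.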

The other side of the coin is the horror show of peaky link gains that just shout 'manipulation' from the rooftops. Avoid this and you will go some way toward protecting your site from the wrath of Google.

Conclusion

There are two things to remember if you do receive the dreaded email:
·          You are not alone – almost 1 million sites have received the same message and are in the same position.
·          It can be fixed.
The key, however, is to understand how to avoid it in the first place and that begins with a full appraisal of your backlink profile.
The penalty is harsh. There is no way around that fact, and we have been contacted by sites that have 'not built a link for two years or more' and yet still received the email and the resulting demotion. It's an extremely unmeasured response, one that shows this latest filter to be a less-than-polished tool with much still to be improved in its ability to clean up spammy links.
That, of course, is the glass half empty view and we still believe that with the right approach and a fundamentally solid backlink profile full of diversity and relevant links then you will prosper and your business will grow over the long term.
The danger, of course, is that in making the results so volatile, Google risks alienating many businesses from spending time and money reaching digital audiences via search at all. And that's a very real and present danger that must be addressed.
