
Saturday, October 6, 2012

Google Penguin Update 3 Released, Impacts 0.3% Of English-Language Queries

Google’s Matt Cutts used Twitter this afternoon to announce that Google is launching the latest “data refresh” of its Penguin algorithm today, and that it will affect searches across multiple languages. Counting the original Penguin launch in late April, this is the third Penguin release. Here is the expected impact:

0.3% of English queries will be noticeably affected.
~0.4% of Spanish queries.
0.4% of French queries.
~0.3% of Italian queries.

Google Panda Update 20 Released, 2.4% Of English Queries Impacted

Google has confirmed with us that on Thursday, September 27th, they released a Panda algorithm update – this would be the 20th Panda update and thus we are naming it Panda 20. This is a fairly major Panda update that impacts 2.4% of English search queries and is still rolling out.

Late Friday afternoon, Google announced an exact-match domain (EMD) update that reduced the chances of a low-quality exact-match domain ranking well in Google. But over the weekend, many non-exact-match domain site owners noticed their rankings dropped as well. What was it?

Google confirmed that they pushed out a new Panda algorithm update that isn’t just a data refresh but an algorithm update. Google told us this “affects about 2.4% of English queries to a degree that a regular user might notice.”

There is more to come with this update: Google promises to continue rolling out this Panda algorithm update over the next 3-4 days. Here is the comment Google’s Matt Cutts sent us when we asked about this update:

Google began rolling out a new update of Panda on Thursday, 9/27. This is actually a Panda algorithm update, not just a data update. A lot of the most-visible differences went live Thursday 9/27, but the full rollout is baking into our index and that process will continue for another 3-4 days or so. This update affects about 2.4% of English queries to a degree that a regular user might notice, with a smaller impact in other languages (0.5% in French and Spanish, for example).
The confusing part is that many sites were affected by either this Panda update or the EMD update, and it is hard to know which one hurt you. For more on this concern, see The Return of the Google Dance.

Panda Update History
We’ve had a string of updates since Panda first launched in February 2011, as follows, along with the percentage of queries Google said would be impacted:

Panda Update 1, Feb. 24, 2011 (11.8% of queries; announced; English in US only)
Panda Update 2, April 11, 2011 (2% of queries; announced; rolled out in English internationally)
Panda Update 3, May 10, 2011 (no change given; confirmed, not announced)
Panda Update 4, June 16, 2011 (no change given; confirmed, not announced)
Panda Update 5, July 23, 2011 (no change given; confirmed, not announced)
Panda Update 6, Aug. 12, 2011 (6-9% of queries in many non-English languages; announced)
Panda Update 7, Sept. 28, 2011 (no change given; confirmed, not announced)
Panda Update 8, Oct. 19, 2011 (about 2% of queries; belatedly confirmed)
Panda Update 9, Nov. 18, 2011: (less than 1% of queries; announced)
Panda Update 10, Jan. 18, 2012 (no change given; confirmed, not announced)
Panda Update 11, Feb. 27, 2012 (no change given; announced)
Panda Update 12, March 23, 2012 (about 1.6% of queries impacted; announced)
Panda Update 13, April 19, 2012 (no change given; belatedly revealed)
Panda Update 14, April 27, 2012: (no change given; confirmed; first update within days of another)
Panda Update 15, June 9, 2012: (1% of queries; belatedly announced)
Panda Update 16, June 25, 2012: (about 1% of queries; announced)
Panda Update 17, July 24, 2012: (about 1% of queries; announced)
Panda Update 18, Aug. 20, 2012: (about 1% of queries; belatedly announced)
Panda Update 19, Sept. 18, 2012: (less than 0.7% of queries; announced)
Panda Update 20, Sept. 27, 2012: (2.4% of English queries impacted; belatedly announced)
Previously we used numbers like “Panda 2.2” or “Panda 3.92,” but this was proving too confusing. That’s why we’ve shifted to a sequential numbering format.

Wednesday, September 19, 2012

Google's Latest Panda Algorithm Update Affects 0.7 Percent of Searches


Google’s announced that another Panda update is being unleashed on its results, one that it says will impact 0.7% of queries. We’re calling it Panda 3.92, though we’re wondering if it’s time to declare Panda 4.0 upon us.

"Panda refresh is rolling out—expect some flux over the next few days. Fewer than 0.7% of queries noticeably affected," Google tweeted.

The Panda search algorithm update is aimed at spammy and low-quality sites. The initial aim was content farms along with sites that aggregate content but layer a lot of ads on top and others that offer a similarly poor experience.

Since the initial release of the Panda algorithm update, there have been several other updates and tweaks, some with a noticeable impact, but most being smaller improvements and changes.

Thursday, August 30, 2012

How to Survive Google’s Algorithm Updates

There’s only one way out of a deep rut, and that’s to actually work at getting out of it. If you’re going to wait for things to settle down or for other people to do the work for you, you’ll be stuck in the search engine’s penalty mud for a long time.

This is the mentality SEOs should have with regard to Google and its numerous algorithm updates, particularly Panda and Penguin. We say ignore the name, and we mean it: take the updates at face value and don’t let the label intimidate you. That certainly doesn’t mean continuing with widely-practiced strategies that are bound to fail, or underestimating the strength of the updates. It simply means concentrating on the effects they have on your website and coming up with SEO strategies that will work well with them. If all else fails, you can hire a team from a reputable SEO company to do the damage control for you.

No Need for War 



The common reactions among webmasters when each of the updates was introduced were along the lines of rebellion, irritation, frustration, anger, and the desire to boycott the search engine. While some of you have no doubt felt the same way, it won’t change the fact that if your website is sub-par, it will be perceived as sub-par by other search engines as well.

Instead of looking for SEO services that cheat and go around the rules implemented by the Google updates, why not do things the honest way and just work at making your website the best it can be? After all, what matters is that you satisfy your visitors just as much as, if not more than, Google and other search engines.

Besides, what’s the purpose of generating organic traffic if your site’s copy is poor and doesn’t convert to sales?

Must-Dos in the Aftermath of Penguin and Panda

1. High-Quality Content

Google wants to provide the best search experience for its users, and that means it will rank up websites that offer high-quality content that’s valuable and useful for searchers. Those with useless, erroneous, and poorly-written content will be relegated to the last spots in the SERPs.

Writing excellent content for your website is one of the best and most effective ways to convince Google your website is high-quality. All written and visual content should be related to the website’s main topic. The information should be correct, articles well-written, and users should be able to trust the credibility of your website. Always refer to Google’s guidelines for building high-quality websites so that you can be certain of the overall quality of your own website. This is also your way of making peace with Panda.

The following are things you can do to improve the overall quality of your website:

- Check the quality of your articles.
- Remove poorly-written pages.
- Correct text errors.
- Avoid keyword stuffing.
- Publish unique, engaging, and informative articles.
- Recognize and reference authority sources.

Producing high-quality content should now be a priority, although that doesn’t mean you have to leave link building in the dust. Links, after all, are still considered important ranking signals. This leads us to our next item.

2. High-Quality Links

The Penguin update is forcing webmasters to think twice about linking just about everywhere. Webmasters used to acquire as many backlinks as they could, through both white-hat methods and outright spamming strategies (e.g., blog spamming, link buying, concealed links, reciprocal links, creating doorway pages). The problem is, Google is now on to these dishonest methods of manipulating rankings. Thanks to Penguin, websites with excessive and spammy link-building activities will be penalized and demoted in PageRank.

Quality over quantity: this should now be your mantra where links are concerned. You can do the following to make sure your website won’t be penalized:

- Cease link-spamming activities, if you’re currently doing it.
- Request that lower-quality websites remove backlinks to your site.
- Be a guest blogger for authority websites and provide a link to your own site.
- Remove any link that will be deemed manipulative by Google.

Embed the links in genuinely good content as much as possible. If you don’t want your links to look like webspam, you need to make them look as natural as possible. Including them in articles is one way to do that.

3. Website Structure

After you’ve improved your website’s content quality and cleaned up your links, you need an organized website structure so that your pages can be properly crawled by search engines. Recall that an organized website structure is also a factor in website ranking. If you neglect this, your efforts to improve your website’s pages will not be recognized and indexed promptly by Google.
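One concrete aid to proper crawling is an XML sitemap listing your pages. Here is a minimal example following the sitemaps.org protocol; the URLs and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-08-30</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/services</loc>
  </url>
</urlset>
```

Submit the file through Google Webmaster Tools, or reference it from robots.txt with a `Sitemap:` line, so crawlers can find it.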






4. Avoid Black-Hat SEO

Basically, if you avoid doing any black-hat SEO, you should be in Google’s good graces at all times. You may not shoot up to first place in the SERPs right away, but in the long run you’ll be much better off than black-hat practitioners. The updates introduced by Google are here to stay, and scheming your way around them will only bring temporary benefits.

Now isn’t the time to concentrate on technical trickery. Work on your website and truly improve its quality. At the end of the day, users will still be the ones to judge your website; if they are unsatisfied and unconvinced by its quality and credibility, you cannot encourage them to purchase, subscribe, or do anything else you would like them to do.

“Fear of a Name Increases Fear of the Thing Itself”

This is a very popular quote from the book Harry Potter and the Sorcerer’s Stone. It seems a very apt description of how people are reacting to Google Penguin, and indeed Panda. Come to think of it, these two aren’t the only updates to the search engine’s algorithm; many others remain unnamed. In fact, in the very same month that the Penguin update was first introduced, Google conducted a second Panda update and another unnamed update.

If you want to keep up with algorithm changes, pay more attention to Google’s trends in ranking websites than to keeping an ear and an eye out for another Antarctic animal patrolling the nooks and crannies of your website.

Thursday, August 23, 2012

Google Panda Refresh On August 19th: Version 3.9.1

Google has confirmed they have pushed out a Panda refresh this past Monday.


This update affected less than 1% of search queries and is a “minor” Panda refresh. We emailed Google after hearing speculation of a Panda update, and Google confirmed it with a tweet.

Here is that tweet:


Google has said the Panda updates will be smoother and more consistent going forward, unlike the Penguin update, which will be more jolting.

The previous Panda update was version 3.9 on July 24th, so just about a month ago.


Source: http://searchengineland.com

Wednesday, July 25, 2012

Google Pushing Out Panda Update 3.9 Tonight

Google says it will roll out the latest update to its Panda algorithm later tonight.


The company posted the news a few minutes ago on Twitter, saying this update will affect about one percent of search results.


By our count, this is Panda Update 3.9. The previous update, 3.8, occurred just about a month ago — on June 25th.

Panda rolled out initially in February 2011 and was designed to remove low-quality/thin content from Google’s search results.

Resource: http://searchengineland.com/

Thursday, July 5, 2012

Bing Launches Disavow Links Tool Before Google

A couple of weeks ago, Google won the praise of most of the SEO community by announcing it would build a disavow-link tool into Google Webmaster Tools.

So when I heard a disavow-link tool had been built into Webmaster Tools, I was excited, but then a bit taken aback and shocked to hear it was Bing who launched their disavow-link tool first. That’s right: Bing announced you can disavow links within Bing Webmaster Tools. Bing beat Google to the punch!

Is this more of Bing's attempt to win the hearts of SEOs and maybe gain market share? Possibly, but either way, is this a black eye for Google?

Vanessa Fox went into detail on why Bing's tool might not do anything. She explains that Bing's stance is that links do not hurt you. So if links do not hurt you, why would you need this tool?

Duane Forrester from Bing tried to explain why:


Is that clearer? Not really.


In any event, Bing beat Google to the game by adding a disavow link tool. But will you use it? Is it necessary? I am sure many will use it on Google but I am not sure about Bing.

Whether it’s Google, Bing, or another search engine, we cover updates for all of them here at our official blog, SEO Infozone. Get the best, most up-to-date solutions to keep yourself current and ahead of your colleagues and the market.

Tuesday, June 26, 2012

Official Google Panda Update Version 3.8 On June 25th

Google has announced they pushed out a new refresh to the Panda algorithm recently.

This update “noticeably affects only ~1% of queries worldwide,” said Google on Twitter.


There were earlier rumors of an update over the weekend but Google said the rollout started today and not over the weekend.

The previous Panda update was on June 8th and before that on April 26th. Typically, Google pushes out algorithm updates for Panda and Penguin every month or so. While the last Panda update was just over 2 weeks ago, Google felt they wanted to push out a new refresh.

Google said there were no updates to the algorithm or changes in the signals. This was simply a basic data refresh where they ran the algorithm again.

Monday, June 18, 2012

GOOGLE.COM TAKES YOU TO ANOTHER GOOGLE

Google’s latest update concerns country- and region-specific Google searches.

Here is the update, as posted on the Google Inside Search blog:

If you find yourself on a country-specific Google homepage, such as www.google.co.uk, you can always easily get back to Google.com by clicking the link below the search box.

Google Web Search is customized for a number of countries and regions across the world. For example, Google.fr provides search results that are most relevant for users in France; Google.co.jp is the Google domain for Japan. We try to direct users to the site that will give them the most relevant results.
Changing your settings

If you'd rather use a different Google site, like Google.com, no matter where you are, try one of the following tips:
  •     Click the Google.com link on any other domain.
  •     Choose a Google domain manually by visiting the Language Tools page (the section with the flags).
  •     Bookmark www.google.com/ncr. This is an alternative web address for Google.com that always takes you to Google.com without redirecting you.
Make sure your browser's cookies are enabled: your Google settings are saved in a cookie, so if cookies aren't enabled, the preferences you pick won't be saved.



Ref:  http://support.google.com/websearch/bin/answer.py?hl=en&answer=873

Friday, June 15, 2012

Google: Can't Recover From Penguin? Start A New Site

Danny Sullivan published a new story yesterday named Two Weeks In, Google Talks Penguin Update, Ways To Recover & Negative SEO.

In that article, he interviews Google's spam lead, Matt Cutts on ways to recover from the Google Penguin update. There are some solid tips there but scary ones also.

Here is one that is scary for those who were hit by the Penguin update:

If you've cleaned and still don't recover, ultimately, you might need to start all over with a fresh site, Cutts said.

Yes, that is scary for someone who was hit and is frantically trying to make changes but has not seen any recovery. Now, if you have not seen a recovery yet, I wouldn't worry; I don't think they have refreshed the update yet, so there wouldn't be any recoveries, in my opinion.

But Google is not going to roll this back. Google's Matt Cutts said, "It's been a success from our standpoint." Were there false positives? A few, Cutts said: "we've seen a few cases where we might want to investigate more, but this change hasn't had the same impact as Panda or Florida." Very interesting.

Key Takeaways:

(1) Google is not going to roll this update back.
(2) Google says it had less of an impact than Panda or Florida.
(3) Don't take drastic measures yet, do what you can now so when Google does refresh the update, maybe you can break free.

Source: www.seroundtable.com

6 Ways to Boost Your Rankings Using Google Authorship

Google authorship is big on the agenda for a lot of SEOs at the moment, and rightly so considering how heavily Google is looking to push this into search results.

Creating a personal brand is incredibly valuable toward building and strengthening great relationships. After all, you’re always going to have more trust in recommendations from your own circle of friends.

This study from Nielsen strongly backs this up, with 90 percent of people trusting recommendations from people they know. In comparison, only 41 percent trust search engine results!

That stat alone shows why Google had to become more social. If searchers place more trust in their results, advertisers are going to get a better return on investment. And Google authorship appears to be a key part of Google's solution for achieving this, essentially merging personalized search with your social circle.

So what can you do to capitalize on this?

1. Build a Personal Brand Online

Google has realized just how important social media is as an indicator for assessing the quality of a website. And because in many ways social media is more about personal branding (as opposed to company branding), authorship starts to make a lot more sense.

Google doesn't want to just measure the influence of a brand profile on Twitter/Facebook/Google+ – it wants to know about its employees, its writers, and online fans/followers. Treat your own reputation management seriously and look to build a strong profile on key social sites such as Twitter, Facebook, Google+, LinkedIn, etc.

If you’re not an active blogger, you really should start now. It’s never too late! Blogging is a great way to build a personal brand – plus it opens doors to further opportunities such as writing on higher profile sites, conference/event speaking or media interviews, meaning that you can leverage your online profile even further.

2. Set Up Google Authorship For All the Websites You Write On
By connecting your Google+ profile with all of the websites you publish content on, your profile will be displayed whenever an article or blog post you’ve written appears in Google’s search results.

On its own this should naturally increase your following – because people will also click through to your Google+ profile as well. So that’s already helping to build your profile and make you more influential.

Plus it’s going to get you extra traffic by positively influencing rankings within your social circle. That’s also another key reason Google promotes this: it pushes large volumes of traffic through to Google+ and gives users a reason to re-visit.
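At the time, connecting your content to your Google+ profile was typically done with rel="author" markup on the page. As a sketch, here is a small standard-library Python check that scans a page for such markup; the page snippet and profile URL are placeholder assumptions, not a real profile:

```python
from html.parser import HTMLParser

class AuthorLinkFinder(HTMLParser):
    """Collects href values from <a> or <link> tags carrying rel="author"."""
    def __init__(self):
        super().__init__()
        self.author_links = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag in ("a", "link") and "author" in a.get("rel", "").split():
            self.author_links.append(a.get("href"))

# Placeholder page head; the Google+ profile URL is hypothetical.
page = '<head><link rel="author" href="https://plus.google.com/your-profile-id"/></head>'
finder = AuthorLinkFinder()
finder.feed(page)
print(finder.author_links)  # ['https://plus.google.com/your-profile-id']
```

A quick scan like this is handy for confirming the markup survived a template change.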

3. Use Google+ Daily
Even if you have to force yourself to do it, try spending a small amount of time each day commenting on and sharing your connections’ content, not just your own, to strengthen those connections.

By making a small effort each day, you'll gradually start to build up a stronger profile.

4. Use Tools & Interact With Influencers in Your Niche
First, identify the key influencers in your industry. Tools like FollowerWonk, Topsy and FindPeopleOnPlus are great for this.

Then get their attention. Definitely don’t overdo it by stalking them and replying to everything they say, but make the effort to interact with them occasionally and share their content; hopefully this will go both ways.

On Twitter you can build private lists of people you want to interact with more frequently. Why not do the same with a Google+ circle?

If you can start getting retweets from people with 10,000+ followers it’s likely to help build your social reach. And again, the same rules apply with shares on Google+.

Another great tool is Google Ripples (see this guide on SEOmoz for how to use it). Basically, it lets you see who the influential people sharing a post are and how far it has reached. See this screenshot for the recent announcement of Google Drive:


5. Hire Great Writers With Strong Social Profiles
This is one of the most important lessons to learn. Finding the right people is critical. Making mistakes in new hires can set you back massively.

Take the time to find someone who is great, as opposed to someone who “can do the job.” It's much more rewarding in terms of the results you can achieve.

This fits in incredibly well with Google authorship – personally this is something I looked for in a writer beforehand anyway. But now there is even more incentive to find a writer who has a great social profile and can reach out to their own audience and network.

In many ways, authorship is almost a Panda-style algorithm update, meaning that Google will reward quality over quantity. Look for top writers within your industry to hire. Forget about the $10-a-go copywriting services; find an actual individual, using sites such as the Problogger Job Board or AuthorPress, who can help promote your content and, most importantly, reach out to their own blog and social network.

If Google is going to measure the influence of a writer’s social profile when ranking content, you should do the same when looking to hire them too!

6. Meet Bloggers and Writers in Person

Meet people in real life. Go to industry events, find blogger meet-ups – figure out where these guys hang out and get to know them. But do it offline – it’s far more personal!

Recently I analyzed the people I had most frequently interacted with on Twitter using WhoReTweetsMe.com, because I wanted to see how many of them I’d actually met in person. Even I was surprised by the results: I’d met 88 out of the top 100!

Having a personal brand is a great first step – but really getting out there and meeting people is the way to take it to the next level.

What Impact Will Authorship Have on Rankings?

At the moment it’s unclear how much of a direct impact authorship has, or will have, on rankings. However, even if it’s not a ranking factor just yet, Google authorship and Google+ already influence personalized search results, which in itself can boost rankings for connections within your social circle. This opens up huge opportunities for individuals, because it means Google is likely to reward you personally for being an authority online.

Google is also fully aware that a website’s reputation is no longer based just on links. People will much more commonly tweet a link now than blog about it, so ignoring social signals in the algorithm long-term would make no sense if Google wants to continue providing the most relevant search results possible. Which of course they do!

It seems pretty clear that this is the way Google intends on heading. There's no reason not to set up Google authorship. So get in there sooner rather than later.

Ref: http://searchenginewatch.com

How The Latest Google Panda Update Helps Local Search

Local search proves very valuable to tourists, residents and businesses alike. Since Google merged Google+ and its online yellow pages, Google Places, into one entity (Google+ Local), the local search game has changed.





Local Search And Panda 3.7

In line with Google’s aim to provide organized, useful and meaningful search results to people looking for information on the web, local listings and SERPs (search engine results pages) donned a new look and feel that presents a more fine-tuned list of results for the searcher. And with the new Google+ Local, all business listings are combined into one listing that can be used across general search, Maps, mobile and Google+ searches. Click on “Local” on the left pane of your Google+ page to search what’s in your neighborhood.

The latest Panda refresh by Google rolled out last Friday, June 8. The update’s effects were followed closely by SEO watchers the world over, including those concerned with local search indexing, such as local SEO companies. It turned local ranking assessment up a notch. Google’s algorithm for determining local rankings is codenamed “Venice” and was announced as part of the Panda 3.3 update this past February. As per Google’s official statement: “This improvement improves the triggering of Local Universal results by relying more on the ranking of our main search results as a signal.”







How To Get Indexed To The Top

The latest Panda update (we’re now experiencing Panda 3.7) aims to present search users with improved “blended” local results, with a new system for finding reliable results for a query tied to a particular city or place. For local businesses, this means a better opportunity to be seen, provided one has taken care of the basic fundamentals.

With the earlier Venice update on local and the very recent Panda 3.7 refresh last Friday, what does a local business have to do to be on top of the index?

Have a complete and accurate address
Select the proper category or categories for your business
Provide accurate and consistent NAP (name, address, phone number) and other business info, such as a contact email, across major authoritative citation sources (e.g., Localeze, Infogroup, Acxiom)
Get customers to write short reviews about you on the new Google+ Local
Got a website? It is still crucial for your listing! Traditional SEO still plays a huge role in places search.
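The "consistent NAP" advice above can be checked mechanically before you submit listings. Here is a minimal sketch; the normalization rules are my own illustration, not any citation source's actual matching logic, and the listings are hypothetical:

```python
import re

def normalize_nap(name, address, phone):
    """Normalize a (name, address, phone) triple so that cosmetic
    differences (case, extra spaces, phone punctuation) don't
    register as inconsistencies between citation sources."""
    squash = lambda s: re.sub(r"\s+", " ", s.strip().lower())
    return (squash(name), squash(address), re.sub(r"\D", "", phone))

# Hypothetical listings as two citation sources might hold them:
listing_a = normalize_nap("Acme  Plumbing", "12 Main St, Springfield", "(555) 123-4567")
listing_b = normalize_nap("ACME Plumbing", "12 Main St,  Springfield", "555.123.4567")
print(listing_a == listing_b)  # True: the listings agree once normalized
```

If two sources disagree after normalization, fix the listing at the source rather than leaving mixed signals in the citation ecosystem.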

Confirmed: Google Panda 3.7 Update

Google has now confirmed there was a refresh late Friday that affected less than 1% of queries. Google said, "Panda data refresh started rolling out on Friday. Less than 1% of queries noticeably affected in the U.S. & 1% worldwide." We saw reports of this over the weekend and reported on those rumors; Google has now confirmed the update.

Over the weekend there was a large spike in webmaster and SEO chatter at WebmasterWorld around Google's shift in the search results.

At first, many suspected a Panda update, but we did not yet have confirmation from Google. In fact, there had been earlier rumors of a Panda update that Google shot down.

If there was an update, and there were strong signs of one, this would be the Panda 3.7 update. The last Panda refresh was Panda 3.6 on April 27th, so we were due a new Panda refresh any time.

There was a Penguin 1.1 update on May 25th or so, meaning Penguin had already refreshed recently; this shuffling is therefore probably Panda-related.
One webmaster said:

It could be another Panda update, I have been working on a client site to help him escape Panda. His rankings have just jumped back up to Top 3 on his major keywords where he used to be before Panda got him 2 months ago. So these recent fluctuation and Thursday night events might be Panda.


Past Panda Updates:
Panda 3.6 on April 27th
Panda 3.5 on April 19th
Panda 3.4 on March 23rd
Panda 3.3 on about February 26th
Panda 3.2 on about January 15th
Panda 3.1 on November 18th
Panda 2.5.3 on October 19/20th
Panda 2.5.2 on October 13th
Panda 2.5.1 on October 9th
Panda 2.5 on September 28th
Panda 2.4 in August
Panda 2.3 on around July 22nd.
Panda 2.2 on June 18th or so.
Panda 2.1 on May 9th or so.
Panda 2.0 on April 11th or so.
Panda 1.0 on February 24th.

Source: http://www.seroundtable.com

Friday, June 8, 2012

Penguin Update Spearheads the Spam Raid for Google

Google’s search algorithm is something that is changing constantly. New updates and changes to their algorithm help them provide better search results to users while keeping spam and poor quality content lower on the rankings. Unfortunately for web owners these changes can cause their visitor count to plummet within hours. There have been multiple updates over the past year and each one has both negatively and positively affected thousands of webmasters.


The last major update to Google’s algorithm before Penguin was the Panda update, first released back in February of 2011. This update was a massive overhaul that used a sophisticated machine-learning system to distinguish low-quality websites from high-quality ones and apply that to the search results. The update was named after Navneet Panda and initially caused quite a stir amongst webmasters, because many claimed that, at first, it allowed scrapers to outrank quality sites with original content.

Since the initial Panda 1.0 release back in February of 2011, there have been several additional updates to Google Panda. Below is a list with descriptions of each update:

Google Panda Updates
Panda 1.0 – The initial Panda release on February 18, 2011. It affected around 12% of search queries because of the new ranking system.
Panda 2.0 – After the initial release Google rolled out Panda 2.0, which went global – much to many webmasters' despair!
Panda 2.1 (aka Panda 3.0) – Since Google never really officially announced this update, many know it as the third major Panda update, so it goes by Panda 2.1 or 3.0. No one knows exactly what went into it, but many people saw their monitored niches and website rankings change.
Panda 2.2 – A smaller change to Panda that went into effect around June 18th. While some webmasters were finally recovering from the initial hit in February, others were being de-ranked even further!
Panda 2.3 – An update confirmed by Google that occurred on July 22nd. Google stated that it added some new signals to the search results filter to distinguish between high- and low-quality sites.
Panda 2.4 – Simply the previous update (2.3) applied globally to all languages. The change hit many webmasters' international sites. It was released around August 12th.
Panda 2.5 – Another confirmed Panda update from Google, on September 28th.
Panda 2.5.1 – Google 'tweaked' the 2.5 update on October 9th but released no further information. Many sites hit by 2.5 reported a massive recovery, though several saw no change at all.
Panda 2.5.2 – Another confirmed tweak of the 2.5 update, on October 13th, this one confirmed by Google's Matt Cutts via Twitter.
Panda 2.5.3 – This update happened around October 19th–20th. It was never confirmed by Google, though some webmasters happily reported an increase in traffic.
Panda 3.1 – If you look back at Panda 2.1 you'll see it was also known as 3.0, hence the jump in numbering between this update and the last. This update was confirmed on November 18, 2011 as a "minor" data refresh affecting less than 1% of searches.
Panda 3.2 – Google gave webmasters an early holiday gift on the 19th with a minor update. While Google never confirmed it, they did say that no "major" updates would be made until the New Year.
Panda 3.3 – This update on the 26th was another data refresh to make Panda more accurate, also confirmed by Google.
Panda 3.4 – This update occurred on March 23, 2012. Google stated via Twitter that it was another "refresh" and that only about 1.6% of searches would be affected.
Panda 3.5 – This update took place on April 19, 2012. Initially it was thought to be a new algorithm change to fight webspam, but it turned out to be another Panda update.
Panda 3.6 – This update took place on April 27, 2012, and went unnoticed for a week.
As you may have noticed while reading through the various Panda updates, a lot of websites were affected, both positively and negatively. Most of the sites hit negatively were ones built on a variety of SEO mistakes and low-quality content.

So what are the SEO mistakes that have been hurting, and still hurt, thousands of websites since the Panda update? Mainly over-optimization mistakes such as keyword stuffing, duplicate content, spammed links, backlinks to blacklisted sites, and a lack of diverse anchor texts.

Keyword Stuffing:-
Keyword stuffing is a common mistake made by those new to SEO, and even one that experienced webmasters resort to. The keyword density of a page should usually sit between 2% and 5%; those trying to rank higher for a specific keyword often push it above 5%.
Unfortunately, if you use a keyword too many times in the content, readers will catch on, so webmasters resort to hidden text or simply stuff the meta content with keywords.
If you haven't been penalized for keyword stuffing yet, you still need to fix your site, because search engines will catch on and you can lose rank or even be banned from the search results completely.
Instead of stuffing a few pages and posts with too many keywords, produce more original, relevant content that uses the targeted keywords at sensible density levels.
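The 2%–5% density figure above is easy to measure yourself. Here is a minimal Python sketch (the sample page text and keyword are invented for illustration; real SEO tools also account for phrases, stemming and keyword placement):

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of total words, as a percentage.

    A rough illustration only: it counts exact single-word matches.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

page = ("Cheap shoes online. Buy cheap shoes today, because cheap "
        "shoes are what every shopper wants: cheap shoes.")
print(f"{keyword_density(page, 'cheap'):.1f}%")  # prints 23.5%
```

A page like this sample, where one keyword makes up almost a quarter of all words, is exactly the kind of over-optimization Panda was built to catch.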


Duplicate Content:-
This is a huge and ongoing issue for many webmasters, and it will result in severe penalties on your website if you are found plagiarizing or copying another site's content. Making sure all of your web pages pass CopyScape is crucial to avoid any bans or penalties from Google.

Spammed Links:-
In an attempt to get backlinks to their sites, many webmasters use tools that spam out thousands of comments in the hope that some get approved on blogs. Google, however, has begun to work against this, and those wanting backlinks should build them in a consistent and natural manner.

Rather than spamming out thousands of backlinks all at once, acquire good, high-quality backlinks consistently.
Those guilty of over-optimization may believe that having thousands of backlinks is automatically a good thing; however, Google and other search engines constantly stress 'quality over quantity'. If you have spammed backlinks you may already have been penalized by Google, but you can slowly work to fix this by earning higher-quality links.

Backlinks to blacklisted sites:-
Another thing that over-optimizers commonly do is, in their spamming of links for backlinks, associate themselves with blacklisted sites. A blacklisted site is one that has been banned from the search results for a severe infraction such as duplicate content, blackhat methods, or some other offence. Unfortunately, if your site is associated with a blacklisted site, you can easily be penalized for it.

No diverse anchor text:-

When trying to get backlinks out there, many people who do their own SEO use the same anchor text over and over. This overuse of a single keyword as anchor text can cause your site not only to lose rank but, as a consequence, to lose traffic and sales as well.


Not sure how to diversify your anchor texts? Fortunately, there are a variety of ways to change things up and avoid Google's harsh penalties. Using misspelled keywords, using actions rather than keywords (e.g. "click here!"), using your URL, or even getting out your thesaurus to find some synonyms are all excellent ways to create diverse anchor texts and drive more traffic to your website!

Buying or Exchanging Links:-
If you’ve been buying or exchanging links for the sole purpose of increasing PageRank, you might want to reconsider that strategy. In an era where you can actually report link-buying sites to Google, and with the Panda update in place, your rankings will definitely suffer and you may even face a ban from Google.

Instead of buying or exchanging links, try to make something useful that brings links to you. Think outside the box, be creative, be smart, and you'll get the reward sooner rather than later.


Conclusion:-
A rule of thumb for everyone who practices SEO and wants their site to be successful, especially since Google Panda is still being updated regularly: slow and steady wins the race. Google is constantly changing its algorithm, as you can see from the number of recent updates, and it will keep working hard to promote quality sites with original, high-quality information. Patience and self-motivation are the keys to success here.

Rushing, using duplicate content, spamming your links to every site out there, and associating your site with blacklisted sites can end your Internet marketing career before it even begins!


Update:-
Search Engine Land (SEL) just announced that Google has launched a new algorithm targeting webspam, now called the Penguin Update.

Thanks!

 




Thursday, June 7, 2012

Ways to Improve an SEO-Friendly Website


There are a lot of ways to improve an SEO-friendly website:-

1. Add a blog: - Adding a blog to any site gives your clients the opportunity to add fresh content easily and regularly. Search engines love fresh content and if you encourage your client to update the blog on at least a weekly basis, or hopefully even more regularly, you will be giving them a distinct SEO advantage. 

2. Keyword Placement:- One of the major factors of SEO is telling Google what the page is about. This is done by writing great “user-focused” content. Within this content it's important to get the keywords into the right positions on the page. Here are the best places:


·  Title tag
·  Meta description and keywords
·  Website slogans
·  Navigation
·  Breadcrumb trails
·  H1, H2 and H3 tags
·  Bullet points
·  Alt text
·  Title attribute on links
·  The main website copy
·  Internal links
·  Footer links
·  URLs
·  File / folder names

One thing to remember with the above: don't overdo it. Google has become heavily focused on the user, so make sure the content is written for the user; that way it will also become link-worthy content.
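A few of these placements can be audited programmatically. The Python sketch below (page markup and keyword are invented) checks only the title, headings, meta description and alt text; a real audit would cover every location in the list above:

```python
from html.parser import HTMLParser

class KeywordPlacement(HTMLParser):
    """Record which key on-page locations mention the keyword."""
    TRACKED = {"title", "h1", "h2", "h3"}

    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self.found = set()
        self._stack = []  # which tracked tag we are currently inside

    def handle_starttag(self, tag, attrs):
        if tag in self.TRACKED:
            self._stack.append(tag)
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "description":
            if self.keyword in (attrs.get("content") or "").lower():
                self.found.add("meta description")
        if tag == "img" and self.keyword in (attrs.get("alt") or "").lower():
            self.found.add("alt text")

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        if self._stack and self.keyword in data.lower():
            self.found.add(self._stack[-1])

page = """<html><head><title>Handmade Leather Boots</title>
<meta name="description" content="Handmade leather boots, stitched to order.">
</head><body><h1>Leather boots, made by hand</h1>
<img src="boots.jpg" alt="brown leather boots"></body></html>"""

checker = KeywordPlacement("leather boots")
checker.feed(page)
print(sorted(checker.found))  # ['alt text', 'h1', 'meta description', 'title']
```

If a key placement is missing from the result, that page is leaving an easy on-page signal on the table.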

3. Add Google Analytics to each page: - By using such features as Goal Tracking, Event Tracking, bounce rates and Intelligence you can show your client which keywords, search engines and traffic sources are producing sales or leads for their business. You can also use this data as a way of showing them how they can improve their site over time (with your help) through ongoing split testing.

Bonus tip: Sign up for Google Webmaster Tools, which also has a ton of information you can use to learn more about your keywords and web pages.

4. Reduce code bloat: - Google’s spiders are on the lookout for unique content, and JavaScript and CSS in the HTML code make it harder for them to find it. For example, you should already know that all scripts and CSS files should be added as external files to reduce the time it takes for search engine spiders to find the actual content as well as reducing the code-to-content ratio.

Remember that excessive code not only slows the page’s loading time, but it also increases the possibility of coding errors that, whilst they may have no direct impact on the site’s SEO, may still cause difficulty for the search engine spiders.

5. Make each page unique: - Google ranks the relevance of each website according to the content it contains, and is always seeking relevant content not contained anywhere else on the Internet. This means that the content of every page needs to be completely different not just from any other site on the Web, but also any other page on the same site. This raises the issue of duplicate content, the dread of all site owners.

6. Use Meta description tags: - <meta> description tags are what appear in the search engine results pages – they give the web surfer an overview of what the site is about. Put your marketer’s hat on and write a description that convinces visitors to click on the result. This is your site’s first opportunity to attract visitors, so it’s vital that you give your client the best chance of standing out from the other results.

Remember, Google also uses meta description tags to differentiate web pages (although not as much as title tags), so you need to describe each page differently to avoid duplicate content issues. Including free offers, guarantees and a phone number can improve the click-through rate of your clients’ listings.

Make sure to limit the <meta> description tags to 160 characters in length, including spaces.

7. Remove repetitive wording from the website layout: - It’s worth repeating again that unique content is vital to the success of any site’s SEO. When designing a website layout for a client it is tempting to include information such as copyright text, contact details and maybe even company mottos on every page of the site. If there is not enough unique content on every page then you run the risk of your client’s site being penalised for duplicate content. That’s why it’s important to remove such repetitive wording from the website layout so that the true informational content of the site is not diluted in any way.

8. Add footer links to every page: - Linking between web pages using plain text links, with the target SEO keywords in the anchor text, can provide a significant boost to your clients’ SEO rankings. The problem is that most good website designs use graphical, JavaScript or Flash navigation that don’t use anchor text. If this is the case, you can use footer links to link between your pages, with the keywords you want to rank for within the anchor text of the links.

9. Create a separate web page for each keyword or keyword phrase :- The best way for a website to rank for a particular keyword phrase is to create a web page targeted to that phrase with the keywords in the <title> tag, <meta> description tags, <h1>, body copy and URL. This means that it is critically important to create a separate web page for each product or service that the client sells, as well as category pages if they are needed. A dedicated page for each product or service will also ensure a good user experience as they will land directly on this page from the search engine results, making it much easier for them to buy online or submit a form for more information.

10. Use keyword rich title tags on each page :- <title> tags appear in the title bar of the browser and are one of the factors search engines use to determine the content of your page. Rather than including the company name in the <title> tag, use the keywords that your client wants to rank for. This will give your client a solid advantage in the search engine rankings. Make sure to limit <title> tags to 60 characters in length, including spaces, so that the full text of the title appears in the search results and doesn't get cut off by Google.
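The two length limits above (60 characters for the title tag, 160 for the meta description) are trivial to lint for. A small Python sketch, using those limits as assumptions and made-up sample text:

```python
TITLE_LIMIT = 60        # characters, including spaces
DESCRIPTION_LIMIT = 160  # characters, including spaces

def check_lengths(title, description):
    """Return a warning per tag that risks being truncated in the SERPs."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"title is {len(title)} chars (limit {TITLE_LIMIT})")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append(f"meta description is {len(description)} chars "
                        f"(limit {DESCRIPTION_LIMIT})")
    return warnings

print(check_lengths(
    "Handmade Leather Boots | Free UK Delivery",
    "Handmade leather boots stitched to order. "
    "Call 01234 567890 for a free fitting guide."))  # prints [] (both fit)
```

Run a check like this over every page of a site and you catch truncated titles and descriptions before Google does.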

11. 301 Redirects:- Research: pull an inventory of your natural search engine results (in Google, use the site: command) and match it up to your most powerful page URLs by comparing your statistics. Make sure that you implement a 301 redirect for every indexed page and point requests to the new page location. This needs to be done before moving day.
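The mapping logic is simple enough to sketch. In practice the rules live in your server configuration (for example Apache's mod_rewrite) rather than in application code, and the URLs below are made up:

```python
# Hypothetical old-to-new URL mapping compiled from the indexed-page
# inventory (the site: query) and your analytics export.
REDIRECTS = {
    "/products.php?id=17": "/products/leather-boots/",
    "/old-about.html": "/about/",
}

def respond(path):
    """Return (status, location) the way a 301 rule would, or a 200 pass-through."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # permanent redirect preserves link equity
    return 200, path

print(respond("/old-about.html"))  # (301, '/about/')
print(respond("/contact/"))        # (200, '/contact/')
```

The key point is the 301 (permanent) status code: a 302 would tell search engines the move is temporary and the old URL would keep its place in the index.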


12. Custom 404 Page:- Realize up front that there are going to be some loose ends. Design a custom 404 “not found” page that is inside the structure of your site and offers a clear menu of options for the human searcher. This will help them engage with your site. Make sure that the site has a clear text link to your sitemap so that search engine spiders can learn about the structure of your site. NEVER automatically redirect bad requests to the home page.

13. Robots.txt File:- Move and update the robots.txt exclusion file to reflect your site structure change. If your private data, images, or testing area has changed location, make sure to add a line in the file to account for it.
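You can verify an updated exclusion file before deploying it with Python's standard-library robots.txt parser. The file contents below are a hypothetical example for a site whose testing area moved to /staging/:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for the restructured site.
robots_txt = """\
User-agent: *
Disallow: /staging/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "/staging/new-design.html"))   # False
print(parser.can_fetch("*", "/blog/penguin-update.html"))  # True
```

A quick check like this catches the classic mistake of carrying over a Disallow line that now blocks the wrong (or every) directory.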

Tuesday, June 5, 2012

Why the Google Penguin Update is Good For SEO


If you’re not familiar with Google’s latest algorithm update, codenamed Penguin, you might be perplexed by falling search rankings for your websites. Every now and then, Google changes its search ranking algorithms to cut down on spam, penalise duplicate content and generally eliminate weak websites from the first page. It’s an ongoing arms race between black-hat SEO artists and Google, one which will probably never be resolved. The general idea behind Penguin is to crack down on underhanded backlinking techniques and reward strong sites by focusing more on content and less on SEO tricks. Here are a few reasons why Penguin is actually a good thing for quality SEO in general.

Authority Matters More Than Ever

The major focus of Penguin is on backlinks and the manner in which websites garner “link juice” to increase PageRank. Specifically, Penguin places more emphasis on the reputation and quality of a site that links to your domain than on the sheer number of links pointing your way. Basically, this means that SEO technicians will no longer be rewarded for taking shortcuts in link building. Those “$10 a month for 2,000 backlinks” offers are now not only pointless but downright dangerous.

Content is Still King

You’re well within your rights to roll your eyes at the cliché, but content is the lifeblood of the web. If you don’t put out a quality information product, you can’t expect visitors to stick around to be bombarded with irrelevant ads and annoying popups. Penguin incorporates Google’s latest research on Latent Semantic Indexing or LSI deeper into its indexing recipe, which means it’s getting harder and harder to fool the search engine with generic, badly spun articles. The main takeaway with Penguin is clear when it comes to content: if you don’t have time to write something decent, hire somebody who does.

Penguin Rewards Natural Backlinks

Ultimately, there are no real shortcuts when it comes to building solid, all-natural organic backlinks. Reputable SEO experts know this to be one of the primary truisms of the industry. Penguin rewards positive, honest practices like mixed anchor text and on-page optimisation at the expense of sneaky tactics like JavaScript redirects and cloaking. That's a good thing for clients and SEO experts alike. It levels the playing field and gives the good guys a fighting chance against spammers and fly-by-night marketers who aren't above resorting to temporary gimmicks to pull in traffic.

It Brings Stability to SEO

Anyone that works in SEO full-time knows that keeping up with the latest changes in Google’s algorithms and endlessly modifying web pages is a drag on productivity. It takes focus away from the real goal of SEO, which should be to help quality websites and businesses attract more eyeballs online. Though the collateral damage that comes with any Google update can be disheartening in the near term, Penguin is ultimately a good thing for ethical SEO professionals. By punishing the many ne’er-do-wells that inhabit the world of online marketing, Google makes life easier for people who play by the rules.

The Breakdown

There are always going to be a few hiccups with any major update to Google's highly secretive, proprietary ranking system. For the SEO industry in general, all the Penguin update really means is that SEO experts, copywriters, webmasters and marketers will have to step up their game and deliver quality if they want to succeed. If you've been doing that from the start, you don't really have anything to worry about. Google's entire business model is predicated on serving up only the most relevant results to the end user, and its algorithm changes will invariably pursue this objective. You can either swim with the tide, or get washed up on the shore by failing to adapt your SEO practices to the current reality of Google's search algorithms.

Saturday, June 2, 2012

A List of All On-Page and Off-Page SEO Optimization Steps


1. Website analysis = A website analysis is a must for every website owner, and if you are an SEO you must know how to perform one. It is more than a checklist: you review whether everything, from content to code to the competition, is working according to plan.


2. Keyword Research = Keyword research is essential for a website. It helps you find the best keywords, the ones that bring only targeted traffic from search engines.

3. Bold, Italic effect to main keywords = These tags show the emphasis given to a particular word. This kind of tagging helps the bots understand what the owner wants to tell the user and what the main key points of the page's content are, which can help it rank higher in the SERPs.

4. Canonicalization = URL duplication, if not resolved, can harm your website. It happens when you don't redirect unnecessary pages that show the same data, which search engine bots may treat as duplicate content.
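The duplicate forms are usually mechanical: www vs non-www hostnames, /index.html vs /, and fragment-only variations. A simplified Python sketch of collapsing them (real canonicalization is normally signalled with a rel="canonical" link tag or a 301 redirect, and the example URLs are made up):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Collapse common duplicate forms of the same page into one URL."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    netloc = netloc.lower()
    if netloc.startswith("www."):
        netloc = netloc[4:]                      # www and non-www are duplicates
    if path.endswith("/index.html"):
        path = path[: -len("index.html")]        # /index.html duplicates /
    if not path:
        path = "/"
    return urlunsplit((scheme.lower(), netloc, path, query, ""))  # drop fragment

# All three variants collapse to the same canonical form:
for u in ("http://WWW.Example.com/index.html",
          "http://example.com/",
          "http://example.com/#top"):
    print(canonicalize(u))  # http://example.com/ each time
```

Once every variant maps to one canonical URL, the duplicate-content signal the bots would otherwise see disappears.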

5. Competition Analysis = You must understand the market and your competitors. To beat the competition you have to make plans, and that can only happen after analysing every step your competitors have taken so far and their probable future plans; this helps a lot in performing better in the market.

6. CSS Validation = Your website must look good not only on the outside but also on the inside, to the bots that crawl it, so make sure your CSS is 100% correct by running it through a CSS validator.

7. Google Base Feeds = Better known through FeedBurner, feeds will help you gain more subscribers from Google Reader, which leads to more and better traffic.

8. H Tags Optimization (Eg: H1, H2, H3) = An important part of SEO that helps the robots understand what is more important on your website and what is less.

9. HTML Code Clean Up & Optimization = HTML also needs to be optimized to give optimal results. Don't create a website bloated with code; keep pages as small as you can so load times improve, which also helps you rank better in the SERPs.

10. Image Optimization = This refers to using alt attributes properly. Adding an alt attribute to every image will help you get traffic from image search as well.
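Missing alt attributes are easy to detect automatically. A small Python sketch using the standard-library HTML parser (the sample markup is invented):

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collect the src of every <img> that lacks a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        if not (attrs.get("alt") or "").strip():
            self.missing.append(attrs.get("src", "?"))

page = """<p><img src="boots.jpg" alt="brown leather boots">
<img src="spacer.gif"><img src="logo.png" alt=""></p>"""

checker = MissingAltChecker()
checker.feed(page)
print(checker.missing)  # ['spacer.gif', 'logo.png']
```

Note that an empty alt="" is flagged here too; for purely decorative images an empty alt is actually fine, so a real checker would let you whitelist those.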

11. Hyperlink Optimization = The anchor text used in links to your website helps you rank better for that text, so try to use keywords as the anchor text when creating links, internal or external.

12. In depth site Analysis = A deep analysis of your own website is necessary. It helps you detect all the errors and other problems on your site.

13. Link Validation = All links must be correct and none should be broken.

14. Meta Description Tags Optimization = Your meta tags help you rank better, but you must also know how to manage them, as there are limits and rules you must follow to get the maximum benefit from them.

15. Meta Keywords Tags Optimization = Just like the meta description tag, the meta keywords tag is also important. Most people think it does not help in any way, but it really does.

16. Navigation & Design Optimization = Your navigation bar, or any other navigation system, must be good and user friendly, and along with it you need a good design that attracts the user's eye.

17. PR Sculpting = A way to get PR from websites that already have good PR. You can receive PR juice from well-ranking websites by having them link to you.

18. Robots.txt Optimization = A file that tells the search engine bots which places to crawl and which to skip.

19. Text Modification Optimization = Text optimization also plays an important role in ranking better in the SERPs. Apply the keyword density rule to gain more benefit from your site.

20. Title Tag Optimization = The title is the most important tag; a good one attracts not only users but also the search bots.

21. URL Rewrite = URL rewriting helps increase the visibility of your site by making URLs readable in human language, which search engines also highly recommend.

22. W3C Validation = With the help of the W3C validators you can check that your website is up to date and working according to the latest web standards.

23. Broken Links Checking = It's necessary to find every error on your website, even a broken link. These errors can occur internally or externally and may earn you a bad reputation with both users and bots.

24. Directory Submissions = Directory submission helps you set the category of your website and gets you good, free one-way backlinks, which help your website get indexed faster.

25. Extraction of Site Url's ( Link Level Depth) = Extracting all site URLs helps in detecting duplicate title and meta tags across those pages.

26. Internal Link Structuring = Internal link structuring lets you link the pages of your website to other relevant pages; these links can also serve as backlinks.

27. Link Building ( Link Bait ) = A procedure of reciprocal linking with websites that already have good PR, in order to share the link juice.

28. One way link (PR4 or Greater) = This step may include any of the off-page optimization techniques for getting one-way backlinks; sometimes they are paid and sometimes they are free.

29. Site Back-links count = Try to get more backlinks every day; more backlinks mean better rankings and more traffic, which leads to more income.

30. Local Search Engine Optimization = If your business is local, then local search engine optimization can help you a lot. Submitting to local directories and search engine apps will increase your chances of appearing in local searches.

31. Customer Review Submission = This is just like testimonials but on a larger scale; it includes testimonials, customer feedback, comments etc. You can show these on many pages of your website, which will increase customers' trust in you.

32. hCard Integration = hCard is one of several open microformat standards; it helps represent vCard properties and values in semantic HTML or XHTML.

33. Testimonial Submission = Try to get more testimonials from reputable customers of yours; this will increase your goodwill.

34. Local Search Engine Optimization = This helps you get a better response from Internet users if you have a local business. It is helpful for small businesses like shoemakers, restaurants, pizza shops, marts etc.

35. Google Webmaster Tools account setup & monitoring = This helps you understand the flaws and errors in your website, with many other features you can use to make your website better.

36. Installing Usability Tools on Website = If there are any tools that could be useful to your customers, provide them.

37. Optimization for Multiple Browsers = Making a website look good in all web browsers is necessary, because you don't want to lose a user just because your layout doesn't render correctly in some browser.

38. Article Submission = This helps you gain good backlinks, as well as some visitors from good article websites.

39. Blog Comment on Relevant Blogs = Commenting on relevant blogs increases your popularity and provides good backlinks; just make sure you do not spam.

40. Blog Designing for the website = Your website's blog must look attractive, and it must also match your website so that users are never confused about who owns it.

41. Classified submission = A free facility offered by many websites, mostly forums, to promote your website, its products, or any service you provide for free; just don't spam there.

42. Creating Promotional pages on hubpages, squidoo, etc = Hub pages are good sources of users; keeping the content you submit there relevant can bring more users to your website.

43. Facebook & Twitter Marketing = Social Media Marketing (SMM) on social networks like Facebook and Twitter can also help you spread the word about your website.

44. Integration of page bookmarking tools = You can also bookmark your bookmarks, creating a link wheel between them.

45. Integration of page sharing tools = Sharing tools help increase your link popularity through users who like to share information with others via those tools.

46. Paid Submission = On good websites, people usually buy paid links on targeted pages, from which they get a backlink, PR juice and traffic as well.

47. Photo Sharing = Sharing images on websites like Flickr, Photobucket etc. helps increase traffic through image searches.

48. PPT Submission = Uploading PPT presentations or PDF files full of good information also attracts users' eyes.

49. Press Release = Any change, small or major, should be announced to the market to keep your customers aware; a press release does this job.

50. RSS Feeds = An RSS feed lets readers consume your content without any fancy layout or irritating ads. Many users rely on RSS feed directories to find content they like to read, and these websites also keep track of your every post, which also counts as backlinks.

51. Social Bookmarking = Bookmarking is one of the best ways to get good one-way dofollow backlinks; all you need is a good list of bookmarking websites.

52. Video Submission = A good video can make a huge impression on your users; submitting to video websites like YouTube, Metacafe, Vimeo etc. can bring more targeted traffic.

53. Article Writing = This is the same as blog writing, but the articles you write can also be submitted to other article websites or article directories.

54. Blog Writing = Providing good content that users like to read, about your website, your service or your product, keeps those users in contact with you.

55. Press Release Writing = A press release helps you look more professional and helps spread the word about your site or product.

56. Website Spell Check = Keep your content correct, even from the point of view of spelling and grammar.

57. XML Site Map Creation & Submission = An XML sitemap helps bots crawl your website without running into the difficulties usually created by JavaScript, HTML errors and other problems.
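Generating a minimal sitemap in the sitemaps.org format takes only a few lines. A Python sketch using the standard library (the page URLs and dates are made up; a real generator would walk your site or database):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemaps.org-style <urlset> document from (loc, lastmod) pairs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    ("http://example.com/", "2012-06-01"),
    ("http://example.com/blog/", "2012-06-05"),
])
print(sitemap)
```

Save the output as sitemap.xml at the site root and submit it through Google Webmaster Tools so the bots learn about every URL directly.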

58. HTML Sitemap for users = An HTML sitemap helps users find all the links available on a website so they can go straight to the page they want.


59. Log file analysis = By keeping track of who visits and when, you can tailor content or SEO to that region. It also helps you track the visits of search engine bots, so you know when to expect them again.

60. Google, Yahoo & Bing Site Map Creation = Submitting XML sitemaps gets a site indexed much faster by letting the bots know about all the URLs on your website.

61. Deep Indexing Recommendations = Deep indexing helps you get more backlinks from your own website; even as internal links they still help a lot.

62. Check Search Engine Road Blocks = Check whether robots have any problem crawling your website; you can test this with Fetch as Googlebot in Google Webmaster Tools.

 