Wednesday, 11 June 2014

Miscellaneous Techniques



Don't do the following:

   - Don't keep hidden text on your web pages.
   - Don't spam image alt attributes with irrelevant keywords.
   - Don't stuff your meta tags with keywords.
   - Don't use frames or Flash on your site.
   - Don't exchange links with blacklisted sites.
   - Don't try to fool your site visitors with misspelled keywords.
   - Don't send spam emails to thousands of email IDs.
   - Don't use too many graphics on your site.
   - Don't create too many doorway pages.
   - Don't duplicate page content.
   - Don't submit your website to the same search engine multiple times.
   - Don't use a sub-directory depth of more than 1-2 levels.
   - Don't create too many dynamic pages; try to convert them into static pages.
   - Don't bloat your pages with code.

Tuesday, 10 June 2014

XHTML Verification for Web Site



You design and develop a web site, but how do you know whether you have written all of your HTML syntax correctly? Almost no browser will complain about incorrect syntax, but wrong is still wrong. Many SEO experts claim that SEO does not depend on HTML/XHTML verification of a site, but below are several reasons why your site should be W3C compliant.

Why Is HTML/XHTML Verification Required? There are several reasons to validate your web pages before hosting them on the internet:

   - The quality of any web page depends on how well it is written. It should be syntactically correct and should pass all quality gates.
   - When a search engine indexes your page content, it can get confused if your HTML tags are not written properly, and much of the content may not be indexed correctly.
   - You may be using HTML tags that have since been deprecated and that many search engines no longer support.
   - Consistency, clean HTML code, and process compliance are always appreciated by good webmasters.


What Is W3C Compliance? The W3C is the World Wide Web Consortium, and since 1994 it has provided the guidelines by which websites and web pages should be structured and created. You can check your pages against these standards:

   - Validate your HTML/XHTML files with the W3C Markup Validator.
   - Validate your CSS files with the W3C CSS Validator.
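
As a small illustration, here is a bare-bones XHTML 1.0 Strict page (a made-up example, not taken from any real site) that should pass the W3C Markup Validator, since every element is closed and properly nested:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en">
<head>
  <title>Example Page</title>
  <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
</head>
<body>
  <h1>Example Page</h1>
  <p>Every tag is closed and properly nested, so the validator should report no errors.</p>
</body>
</html>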



SEO Optimized Keywords


We are discussing everything in a web context, so in web terminology a keyword is a term that a person enters into a search engine to find specific information. Most people enter search phrases of between two and five words. Such phrases may be called search phrases, keyword phrases, query phrases, or just keywords. Good keyword phrases are specific and descriptive. The following concepts help in optimizing the keywords on a web page.

Keyword Frequency  

This is a measure of how often a keyword appears in a site's title or description. You don't want to go overboard with frequency, however, since on some engines, if you repeat a word too many times, you'll be penalized for "spamming" or keyword stuffing. In general, though, repeat your keyword in the document as many times as you can get away with, and up to 3-7 times in your META tags.
Keyword Weight

This refers to the number of keywords appearing on your web page compared to the total number of words on that same page. Some search engines consider this when determining the rank of your site for a particular keyword search. One technique that often works well is to create some smaller pages, generally just a paragraph long, which each emphasize a particular keyword. By keeping the overall number of words to a minimum, you increase the "weight" of the keyword you are emphasizing.
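
For example (with made-up numbers): if a 250-word page uses the phrase "custom web design" 5 times, its keyword weight for that phrase is 5 / 250 = 2%; the same 5 mentions on a 1,000-word page would give a weight of only 0.5%.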

Keyword Proximity 

This refers to the placement of keywords on a web page in relation to each other or, in some cases, in relation to other words with a similar meaning to the queried keyword.
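
For example, for the query "custom web design", a page containing the exact phrase "custom web design" generally scores better on proximity than a page where "custom", "web", and "design" appear far apart from one another.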

Friday, 27 September 2013

Google Hummingbird algorithm


Google has a new search algorithm, the system it uses to sort through all the information it has when you search and come back with answers. It's called "Hummingbird," and below is what we know about it so far.

What’s a “search algorithm?”

That’s a technical term for what you can think of as a recipe that Google uses to sort through the billions of web pages and other information it has, in order to return what it believes are the best answers.

What’s “Hummingbird?”

It’s the name of the new search algorithm that Google is using, one that Google says should return better results.

So that “PageRank” algorithm is dead?

No. PageRank is one of over 200 major “ingredients” that go into the Hummingbird recipe. Hummingbird looks at PageRank — how important links to a page are deemed to be — along with other factors like whether Google believes a page is of good quality, the words used on it and many other things (see our Periodic Table Of SEO Success Factors for a better sense of some of these).

Why is it called Hummingbird?

Google told us the name comes from the algorithm being "precise and fast."

When did Hummingbird start? Today?

Google started using Hummingbird about a month ago, it said. Google only announced the change today.

What does it mean that Hummingbird is now being used?

Think of a car built in the 1950s. It might have a great engine, but it might also be an engine that lacks things like fuel injection or be unable to use unleaded fuel. When Google switched to Hummingbird, it’s as if it dropped the old engine out of a car and put in a new one. It also did this so quickly that no one really noticed the switch.

When’s the last time Google replaced its algorithm this way?

Google struggled to recall when any type of major change like this last happened. In 2010, the “Caffeine Update” was a huge change. But that was also a change mostly meant to help Google better gather information (indexing) rather than sorting through the information. Google search chief Amit Singhal told me that perhaps 2001, when he first joined the company, was the last time the algorithm was so dramatically rewritten.

What about all these Penguin, Panda and other "updates" — haven’t those been changes to the algorithm?

Panda, Penguin and other updates were changes to parts of the old algorithm, but not an entire replacement of the whole. Think of it again like an engine. Those things were as if the engine received a new oil filter or had an improved pump put in. Hummingbird is a brand new engine, though it continues to use some of the same parts of the old, like Penguin and Panda.

The new engine is using old parts?

Yes. And no. Some of the parts are perfectly good, so there was no reason to toss them out. Other parts are constantly being replaced. In general, Hummingbird — Google says — is a new engine built on both existing and new parts, organized in a way to especially serve the search demands of today, rather than one created for the needs of ten years ago, with the technologies back then.

What type of “new” search activity does Hummingbird help?

“Conversational search” is one of the biggest examples Google gave. People, when speaking searches, may find it more useful to have a conversation.

“What’s the closest place to buy the iPhone 5s to my home?” A traditional search engine might focus on finding matches for words — finding a page that says “buy” and “iPhone 5s,” for example.

Hummingbird should better focus on the meaning behind the words. It may better understand the actual location of your home, if you’ve shared that with Google. It might understand that “place” means you want a brick-and-mortar store. It might get that “iPhone 5s” is a particular type of electronic device carried by certain stores. Knowing all these meanings may help Google go beyond just finding pages with matching words.

In particular, Google said that Hummingbird is paying more attention to each word in a query, ensuring that the whole query — the whole sentence or conversation or meaning — is taken into account, rather than particular words. The goal is that pages matching the meaning do better, rather than pages matching just a few words.

I thought Google did this conversational search stuff already!

It does (see Google’s Impressive "Conversational Search" Goes Live On Chrome), but it had only been doing it really within its Knowledge Graph answers. Hummingbird is designed to apply the meaning technology to billions of pages from across the web, in addition to Knowledge Graph facts, which may bring back better results.

Does it really work? Any before-and-afters?

We don’t know. There’s no way to do a "before-and-after" ourselves, now. Pretty much, we only have Google’s word that Hummingbird is improving things. However, Google did offer some before-and-after examples of its own, that it says shows Hummingbird improvements.

A search for "acid reflux prescription" used to list a lot of drugs (such as this, Google said), which might not be necessarily be the best way to treat the disease. Now, Google says results have information about treatment in general, including whether you even need drugs, such as this as one of the listings.

A search for "pay your bills through citizens bank and trust bank" used to bring up the home page for Citizens Bank but now should return the specific page about paying bills

A search for "pizza hut calories per slice" used to list an answer like this, Google said, but not one from Pizza Hut. Now, it lists this answer directly from Pizza Hut itself, Google says.

Could it be making Google worse?

Almost certainly not. While we can’t say that Google’s gotten better, we do know that Hummingbird — if it has indeed been used for the past month — hasn’t sparked any wave of consumers complaining that Google’s results suddenly got bad. People complain when things get worse; they generally don’t notice when things improve.

Does this mean SEO is dead?

No, SEO is not yet again dead. In fact, Google’s saying there’s nothing new or different SEOs or publishers need to worry about. Guidance remains the same, it says: have original, high-quality content. Signals that have been important in the past remain important; Hummingbird just allows Google to process them in new and hopefully better ways.

Does this mean I’m going to lose traffic from Google?

If you haven’t in the past month, well, you came through Hummingbird unscathed. After all, it went live about a month ago. If you were going to have problems with it, you would have known by now.

By and large, there’s been no major outcry among publishers that they’ve lost rankings. This seems to support Google saying this is very much a query-by-query effect, one that may improve particular searches — particularly complex ones — rather than something that hits “head” terms that can, in turn, cause major traffic shifts.

But I did lose traffic!

Perhaps it was due to Hummingbird, but Google stressed that it could also be due to some of the other parts of its algorithm, which are always being changed, tweaked or improved. There’s no way to know.

How do you know all this stuff?

Google shared some of it at its press event today, and then I talked with two of Google’s top search execs, Amit Singhal and Ben Gomes, after the event for more details. I also hope to do a more formal look at the changes from those conversations in the near future. But for now, hopefully you’ve found this quick FAQ based on those conversations to be helpful.

By the way, another term for the “meaning” connections that Hummingbird does is “entity search,” and we have an entire panel on that at our SMX East search marketing show in New York City, next week. The Coming “Entity Search” Revolution session is part of an entire “Semantic Search” track that also gets into ways search engines are discovering meanings behind words. Learn more about the track and the entire show on the agenda page.

Saturday, 16 February 2013

10 SEO Techniques


1. Title Tag
Near the very top of a web site’s source code you’ll find various meta tags — the standard ones being the Title, Description and Keyword tags. The title tag is technically not a meta tag, though it is commonly associated with them. The title tag plays such a large role in the indexing of your web site, that it is considered the most important of the three.
A page title is the first thing a search engine will look at when determining just what the particular page is about. It is also the first thing potential visitors will see when looking at your search engine listing.
It’s important to include a keyword or two in the title tag — but don’t go overboard – you don’t want to do what’s known as “keyword stuffing” which does nothing but make your web site look like spam. Most people will include either the company name, or title of the particular page here, as well.
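As a sketch, a title tag for a hypothetical Chicago web design firm (the company name and keywords are invented for illustration) might look like this:

<title>Custom Web Design in Chicago | Acme Design Studio</title>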
2. Meta Tags
There are two primary meta tags in terms of SEO — the description and the keyword tag. It’s debatable whether the search engines use the description tag as far as ranking your results. However it is one of the more important tags because it is listed in your search result — it is what users read when your link comes up and what makes them decide whether or not to click on your link.
Be sure to include a few relevant keywords in this tag, but don’t stuff it with keywords either. The description tag should read like a sentence — not a keyword list.
Due to “keyword stuffing” many search engines now completely disregard the keyword tag. It is no longer nearly as important as it was years ago, however it doesn’t hurt to include them in your source code.
When creating your keyword list, you’ll want to think of the specific terms people will type in when searching for a site like yours. Just don’t go overboard — too many duplicates are not a good thing (as in “web designer” “web designers” “custom web designer” “html web designer” “your state here web designer” – you get the idea). Those are all basically the same, so pick one or two variations at the most and move onto the next keyword.
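Continuing the same hypothetical example, the two tags might look like this; note that the description reads as a sentence, while the keyword list sticks to a few variations:

<meta name="description" content="Acme Design Studio builds custom, standards-compliant web sites for small businesses in Chicago." />
<meta name="keywords" content="web design, custom web design, Chicago web designer" />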
3. Proper Use of Heading Tags
This is a very important element to consider when writing out your site copy. Use of heading tags helps users, web browsers and search engines alike know where the major key points of your copy are.
Your main page title should use the <h1> tag — this shows what your page is about. Use of additional tags, such as <h2> and <h3> are equally important by helping to break down your copy. For one, you’ll see a visual break in the text. But as far as the search engines are concerned, it will automatically know what your topics are on a page. The various heading tags give a priority to the content and help index your site properly.
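For instance, a hypothetical services page might break its copy down like this, with one <h1> for the page topic and <h2>/<h3> tags for the sub-topics:

<h1>Custom Web Design Services</h1>
<h2>Our Design Process</h2>
<h3>Discovery and Planning</h3>
<h3>Design and Build</h3>
<h2>Recent Projects</h2>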
4. Alt Attributes on Images
Putting alt attributes on your images actually serves two purposes. In terms of SEO, putting a brief yet descriptive alt attribute along with your image, places additional relevant text to your source code that the search engines can see when indexing your site. The more relevant text on your page the better chance you have of achieving higher search engine rankings.
In addition, including image alt attributes help the visually impaired who access web sites using a screen reader. They can’t see the image, but with a descriptive alt attribute, they will be able to know what your image is.
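A brief yet descriptive alt attribute might look like this (the file name and description are made up for illustration):

<img src="images/acme-homepage-design.jpg" alt="Home page design for a Chicago bakery client" />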
5. Title Attributes on Links
Including title attributes on links is another important step that any good web site will have. That’s the little “tool tip” that pops up when you place your mouse over a link. These are especially important for image links, but equally useful for text links.
As a note, you should use descriptive text for your links. "Click here" doesn't really tell a person, or more importantly the search engines, what the link is. At the very least, add a title attribute that explains that "Click Here" really means "Web Design Portfolio," for example. Better yet, make the main link text something like "View my web design portfolio"; this gives some value to the link, showing that the resulting page is relevant to searches for portfolios.
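Putting both ideas together, a hypothetical portfolio link could combine descriptive link text with a title attribute like this:

<a href="/portfolio/" title="Web Design Portfolio">View my web design portfolio</a>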
6. XML Sitemap
My last post referenced the sitemaps used by human visitors to help them navigate through your site themselves. However, there's another version, the XML sitemap, that is used by the search engines to index your site as well.
This list of ALL pages, posts, etc. on your site also includes information such as the date each page was last modified, as well as a priority number indicating which pages of your site you feel are most important. All of these elements help the search engines properly find and link to all of the content on your site.
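A minimal XML sitemap following the sitemaps.org protocol might look like the sketch below (the URLs, dates, and priority values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-06-01</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/portfolio/</loc>
    <lastmod>2014-05-20</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>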
7. Relevant Content
Having content relevant to your main page or site topic is perhaps the most important SEO aspect of a page. You can put all the keywords you want in the meta tags and alt image tags, etc — but if the actual readable text on the page is not relevant to the target keywords, it ends up basically being a futile attempt.
While it is important to include as many keywords in your page copy as possible, it is equally as important for it to read well and make sense. I’m sure we’ve all seen keyword stuffed pages written by SEO companies that honestly don’t make much sense from the reader’s point of view.
When creating your site copy, just write naturally, explaining whatever information you’re discussing. The key is to make it relevant, and to have it make sense to the reader. Even if you trick the search engines into thinking your page is great — when a potential customer arrives at the site and can’t make heads or tails of your information and it just feels spammy to them — you can bet they’ll be clicking on the next web site within a matter of seconds.
8. Link Building
We’ve probably all heard of Google Page Rank — it seems to be every web site owner’s dream to have as high a page rank as possible. While the algorithm for determining page rank encompasses many elements, and is constantly changing, one item is the number of links pointing to your web site.
Now, you’ll want to steer clear of link farms and other spammy attempts at getting links to your site. However there are many reputable and niche directory sites that you can use to submit your web site, or specific blog articles to.
With genuine content — especially if you have a blog — you’ll be able to generate links with other web sites and blogs, as well. It’s somewhat of a give and take, in that if you link out to other sites, you’ll find sites linking back to you  — and hopefully see your page rank going up, as well!
9. Social Media
Although technically not SEO, Social Media is such a growing factor in getting your web site noticed, that it’s an important element to include in your plan.
Social media ranges from social networks like Twitter, Facebook and LinkedIn — to social bookmarking sites such as Delicious, Digg, StumbleUpon and many more. There is a lot of relationship building involved, but as you build your own networks and build quality content on your web site or blog, you’ll see traffic to your web site increasing, as well.
As with any relationship, it is a give and take. Don’t just expect to join a site like Twitter for the pure sake of pushing your content. That just won’t fly — your true intentions will stick out like a sore thumb and do nothing but turn people off.
Even if you are on the site purely for networking reasons, the key is to make friends. Help out members of your network if they ask for a “retweet” or Digg, give helpful advice if asked, etc. You’ll see the same in return.
If you write a great post and have built meaningful relationships with peers in your  niche, you’ll often find that friends will submit your posts and give you votes on the social bookmarking sites. The more votes you receive, the more likely your post is to be noticed by others and shared around, often resulting in additional link backs from other blogs, etc.
10. A Few SEO Don’ts — Flash and Splash
Along with any list of Do’s come the Don’ts. As far as SEO is concerned, two of these items are splash pages (often consisting of a flash animation) and all flash web sites.
Yes, flash is pretty! Full flash web sites can actually be amazing to look at — their own bit of interactive artwork. But unfortunately the search engines don’t get along well with Flash. Although there is talk of possible advancement in this area, for the most part the search engines cannot read Flash.
All that great content that you wrote for your site will not be seen by the search engines if it’s embedded into a Flash web site. As far as the search engines are concerned, your all flash web site might as well be invisible. And if the search engines can’t see your site content, a good chunk of potential customers will miss out on what you have to offer, too.
Equally as “pointless” are splash pages. Once very popular, the splash page should no longer be an important feature of any site. While splash pages used to serve as an introduction into a web site (often with a flash animation), it is no longer seen as helpful, and often times might actually annoy visitors.
For one — it’s an extra click to get into your content. Worse is when you don’t give a “skip intro” option or set of links into your main site content — because you’re essentially forcing your visitors to sit through the full animation. If you’re lucky, this will only annoy them… if not — they’ll just leave without giving your main web site a shot. And without an html link pointing into your site, the search engines have no way to continue either (unless you made use of a sitemap.xml file — but still…)
A good alternative in both cases is to make use of a flash header. There's no problem with including a flash animation at the top of your main site, or as a feature within the content area, because this is an addition to your web site rather than a fully separate element.
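
As a rough sketch of that approach (the file names and dimensions here are placeholders), you can embed the flash header with a plain <object> tag and put an ordinary image inside it as a fallback, so browsers and search engines without Flash still see real content:

<object type="application/x-shockwave-flash" data="header.swf" width="960" height="150">
  <param name="movie" value="header.swf" />
  <img src="header.jpg" alt="Acme Design Studio - custom web design" />
</object>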