Friday, April 10, 2009

The relationship between SEO and your business:

Websites are quickly becoming one of the most popular advertising channels. Whether the goal is promoting a business, its products and services, or something else entirely, people of all ages are turning to the web to get their message out. With the popularity of this marketing medium growing and the number of websites always increasing, it is no surprise that everyone wants to appear at the top of Google's search engine rankings. Achieving that goal is not an easy task; however, with a bit of perseverance, you can definitely improve your chances of reaching that first page of results.

Given the bulk of websites already sitting on that first page, what is their secret? It is an industry term called "SEO", which stands for Search Engine Optimization. SEO essentially consists of customizing your website and its pages, their content, and their internal and external links to assist the overall indexing and ranking of your website in popular search engines. Many contributing factors determine a website's ranking, and every search engine weighs them differently, which makes optimizing your site for Google, Yahoo, MSN, Live, AltaVista and the many others quite a painstaking task.
As most of us know, Google is currently the most popular search engine among the majority of Internet users. It is therefore only natural to focus first on achieving a higher ranking within Google, in the hope that the rest will follow. Doing so means starting a journey that could take months before we see any real change, but we have to start somewhere.

Our journey begins by defining some of the key contributing factors that Google uses to determine a website's and a web page's ranking within its results. These factors range from keyword use to the handling of internal and external links, and the list goes on. To get you started, we have listed the top twenty factors you should focus on to help get your website that little bit closer to the top of the search engine results listings.

Keyword Use Factors:

The following components relate to the use of search query terms in determining the rank of a particular page.

A- Keyword Use in Title Tag - Placing the targeted search term or phrase in the title tag of the web page's HTML header.

B- Keyword Use in Body Text - Using the targeted search term in the visible, HTML text of the page.

C- Amount of Indexable Text Content - Refers to the literal quantity of visible HTML text on a page.

D- Keyword Use in H1 Tag - Creating an H1 tag with the targeted search term/phrase.

E- Keyword Use in Domain Name & Page URL - Including the targeted term/phrase in the registered domain name, i.e. keywordphrase.com, plus target terms in the webpage URL, i.e. seoservices.org/keyword-phrase.
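To make the title and H1 factors concrete, here is a minimal sketch in Python using only the standard library's html.parser module. The sample page and the keyword phrase are made up for illustration; the checker simply reports whether a target phrase appears in a page's title tag and H1 tag.

```python
from html.parser import HTMLParser

class KeywordTagChecker(HTMLParser):
    """Collects the text inside the <title> and <h1> tags of an HTML page."""
    def __init__(self):
        super().__init__()
        self._current = None   # tag we are currently inside, if any
        self.title_text = ""
        self.h1_text = ""

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title_text += data
        elif self._current == "h1":
            self.h1_text += data

def keyword_in_key_tags(html, keyword):
    """Return (in_title, in_h1) flags for a target keyword phrase."""
    parser = KeywordTagChecker()
    parser.feed(html)
    kw = keyword.lower()
    return kw in parser.title_text.lower(), kw in parser.h1_text.lower()

# Hypothetical page for demonstration only.
page = """<html><head><title>Affordable SEO Services</title></head>
<body><h1>SEO Services for Small Business</h1><p>...</p></body></html>"""
print(keyword_in_key_tags(page, "seo services"))  # → (True, True)
```

A real audit would of course fetch live pages and handle multiple H1 tags, but the principle is the same: the phrase you target should appear in both places.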

Page Attributes:

The following elements describe how Google interprets specific data about a web page independently of keywords.

F- Link Popularity within the Site's Internal Link Structure - Refers to the number and importance of internal links pointing to the target page.


G-Topical Relevance of Inbound Links to Site - The subject-specific relationship between the sites/pages linking to the target page and the target keyword.


H- Age of Document - Older pages may be perceived as more authoritative while newer pages may be more temporally relevant.


I -Relationship of Body Text Content to Keywords - Topical relevance of text on the page compared to targeted keywords.


J- Quality of the Document Content (as measured algorithmically) - Assuming search engines can use text, visual or other analysis methods to determine the validity and value of content, this metric would provide some level of rating.


Site/Domain Attributes:


The factors below contribute to Google's rankings based on the site/domain on which a page resides.


K- Global Link Popularity of Site - The overall link weight/authority as measured by links from any and all sites across the web (both link quality and quantity).


L- Age of Site - Not the date of original registration of the domain, but rather the launch of indexable content seen by the search engines (note that this can change if a domain switches ownership).


M- Quality/Relevance of Links to External Sites/Pages - Do links on the page point to high quality, topically-related pages?


N- Link Popularity of Site in Topical Community - The link weight/authority of the target website amongst its topical peers in the online world.


O- Rate of New Inbound Links to Site - The frequency and timing of external sites linking in to the given domain.


Inbound Link Attributes:


These pieces affect Google's weighting of links from external websites pointing to a page and will ultimately assist in the ranking of that page.


P- Anchor Text of Inbound Link.


Q- Global Link Popularity of Linking Site.


R- Topical Relationship of Linking Page.


S- Link Popularity of the Linking Site in Topical Community - The link weight/authority of the linking website amongst its topical peers in the online world.


T- Age of Link.


Negative Crawling/Ranking Attributes:


There are also some points we should make before you start getting your hands dirty. With any type of SEO marketing, some practices can actually have a negative impact on your ranking. The following components may negatively affect a spider's ability to crawl a page, or the page's rankings at Google.

Server is Often Inaccessible to Bots.

Content Very Similar to or a Duplicate of Existing Content in the Index.

External Links to Low Quality/Spam sites.

Duplicate Title/Meta Tags on Many Pages of the website.

Overuse of Targeted Keywords (Stuffing/Spamming) on the Site.
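One of the negative factors above, duplicate title and meta tags across many pages, is easy to audit mechanically. Here is a small Python sketch (the site paths and titles are hypothetical) that flags title tags reused on more than one page:

```python
from collections import Counter

def find_duplicate_titles(pages):
    """pages: dict mapping URL path -> title tag text.
    Returns the set of titles that appear on more than one page."""
    counts = Counter(title.strip().lower() for title in pages.values())
    return {title for title, n in counts.items() if n > 1}

# Hypothetical crawl results for demonstration.
site = {
    "/index.html":   "Acme Widgets",
    "/about.html":   "Acme Widgets",
    "/widgets.html": "Blue Widgets by Acme",
}
print(find_duplicate_titles(site))  # → {'acme widgets'}
```

In practice you would feed this from a crawl of your own site; any title it flags is a candidate for a rewrite so each page carries a unique, descriptive title.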

It's now time to get busy! Start prioritizing your tasks, modifying your content and building your internal and external links to meet some of the above guidelines. Keep in mind that improving indexing is mostly a technical task and improving ranking is mostly a business/marketing strategy. What might work now may not work in the future and finally, it takes time. Loads of time. Still, with a bit of trial and error and a good dose of persistence, you can achieve the search engine ranking you're after.

Thursday, April 9, 2009

What is spamming, and how many types of it are there?

Spamming in any form can be harmful to your web page or website. Spamming simply means using unethical techniques to promote your website on the web; however, all the major search engines track these kinds of unethical activities. The vast majority of web users would agree, and most would never even consider such tactics. Even the word itself is infectious in all the worst ways, being used to describe the dark and often deceptive side of everything from email marketing to abusive forum behaviour. In the search engine optimization field, spam describes techniques and tactics thought to be banned by search engines or considered unethical business practices.

Types of Spamming:

A- Keyword Stuffing

B- Blogs or Forums Spam

C- Link Farms

D- Cloaking

E- IP Delivery

F- Leader Pages

G- Mini-Site Networks

H- Hidden Text

I- Useless Meta Tags

J- Misuse of Directories

K- Misuse of Web 2.0

L- Redirect Spam

M- Email Spam

N- Hidden Tags

O- Organic Site Submissions

Keyword Stuffing: At one time, search engines were limited to sorting and ranking sites based on the number of keywords found in those documents. That limitation led webmasters to put keywords everywhere they possibly could. When Google emerged and incoming links became a factor, some even went as far as keyword stuffing their anchor text. The most common continuing example of keyword stuffing can be found near the bottom of far too many sites still in circulation.

Blogs or Forums Spam : Blogs and forums are amazing and essential communication technologies, both of which are used heavily in the daily conduct of our business. As with other Internet based media, blogs and forum posts are easily and often proliferated. In some cases, blogs and certain forums also have established high PR values for their documents. These two factors make them targets of unethical SEOs looking for high-PR links back to their websites or those of their clients. Google in particular has clamped down on Blog and Forum abuse.

Link Farms : Link farms emerged as free-for-all link depositories when webmasters learned how heavily incoming links influenced Google. Google, in turn, quickly devalued and eventually eliminated the PR value it assigned to pages with an inordinate collection or number of links. Nevertheless, link farms persist as uninformed webmasters and unethical SEO firms continue to use them.

Cloaking : Also known as "stealth(ing)", cloaking is a technique that involves serving or feeding one set of information to known search engine spiders or agents while displaying a different set of information on documents viewed by general visitors. While there are unique situations in which the use of cloaking might be considered ethical in the day-to-day practice of SEO, cloaking is never required. This is especially true after the Jagger algorithm update at Google, which uses document and link histories as important ranking factors.

IP Delivery: IP delivery is a simple form of cloaking in which a unique set of information is served based on the IP address from which the query originated. IP addresses known to belong to search engines are served one set of information, while unrecognized IP addresses (assumed to be live visitors) are served another.

Leader Pages: Leader pages are a series of similar documents, each designed to meet the requirements of a different search engine algorithm. This is one of the original SEO tricks, dating back to the earliest days of search when almost a dozen leading search engines sorted fewer than a billion documents. The major search engines consider it spam, as they see multiple incidents of what is virtually the same document. Aside from that, the technique is no longer practical, since search engines now consider a far wider range of factors than the arrangement or density of keywords in individual documents.

Mini-Site networks: Designed to exploit a critical vulnerability in early versions of Google's PageRank algorithm, mini-site networks were very much like leader pages except they tended to be much bigger. The establishment of a mini-site network involved the creation of several topic or product related sites all linking back to a central sales site. Each mini-site would have its own keyword enriched URL and be designed to meet specific requirements of each major search engine. Often they could be enlarged by adding information from leader pages. By weaving webs of links between mini-sites, an artificial link-density was created that could heavily influence Google's perception of the importance of the main site.

Hidden Text: It is amazing that some webmasters and SEOs continue to use hidden text as a technique but, as evidenced by the number of sites we find it on, a lot of folks still use it. They shouldn't.

There are two types of hidden text. The first is text that is coloured the same shade as the background thus rendering it invisible to human visitors but not to search spiders. The second is text that is hidden behind images or under document layers. Search engines tend to dislike both forms and have been known to devalue documents containing incidents of hidden text.
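The first kind of hidden text can sometimes be spotted mechanically. Below is a rough Python sketch, assuming the keywords are hidden via inline styles, that flags an element whose declared text colour matches its declared background colour. Real detection (external CSS files, near-identical shades, layered elements) is far more involved.

```python
import re

def flags_hidden_text(style):
    """Return True if an inline style sets the text colour equal to the
    background colour -- the classic 'invisible text' trick."""
    # Lookbehind keeps 'color' from matching inside 'background-color'.
    color = re.search(r"(?<![-\w])color\s*:\s*([#\w]+)", style)
    background = re.search(r"background(?:-color)?\s*:\s*([#\w]+)", style)
    return bool(color and background
                and color.group(1).lower() == background.group(1).lower())

print(flags_hidden_text("color: #ffffff; background-color: #ffffff"))  # → True
print(flags_hidden_text("color: #000000; background-color: #ffffff"))  # → False
```

Search engines almost certainly use far more sophisticated analysis than this, but the sketch shows how trivially the crudest version of the trick can be caught.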

Useless Meta Tags: Most meta tags are absolutely useless. The unethical part is that some SEO firms actually charge for the creation and insertion of meta tags. There seems to be a meta tag for virtually every possible factor, but most of them are simply not considered by search spiders.

In general, StepForth only uses the description and keywords meta tags (though we are dubious about the actual value of the keywords tag), along with relevant robots.txt files. All other identifying or clarifying information should be visible on a contact page or included in the footers of each page.

Misuse of Directories: Directories, unlike other search indexes, tend to be sorted by human hands. Search engines traditionally gave links from directories a bit of extra weight by considering them links from trusted authorities. A practice of spamming directories emerged as some SEOs and webmasters hunted for valuable links to improve their rankings. Search engines have since tended to devalue links from most directories. Some SEOs continue to charge directory submission fees.

Misuse of Web 2.0: An emerging form of SEO spam is found in the misuse of user-input media formats such as Wikipedia. Like blog comment spamming, the instant live-to-web nature of Web 2.0 formats provides an open range for SEO spam technicians. Many of these exploits may even find short-term success, though it is only a matter of time before measures are taken to devalue the efforts.

Search engine optimization spam continues to be a problem for the SEO industry as it tries to move past the perceptions of mainstream advertisers. When unethical techniques are used, trust (the basis of all business) is abused and the efforts of the SEO/SEM industry are called into question. Fortunately, Google's newer algorithms appear to be on the cutting edge of SEO spam detection and prevention. Let's hope the entire SEO industry soon goes on a spam-free diet.

Redirect Spam: There are several ways to use redirects to fool a search engine, or even to hijack traffic destined for another website. Whether the method used is a 301, a 302, a meta refresh or a JavaScript redirect, the end result is search engine spam.

Email Spam: Placing a URL inside a "call-to-action" email continues to be a widely used form of search marketing spam. With the advent of desktop search appliances, email spam has actually increased. StepForth does not use email to promote your website in any way.

Hidden Tags: A number of different tags are used by browsers or website designers to perform a variety of functions: comment tags, style tags, alt tags, noframes tags and http-equiv tags. For example, the alt attribute is used by screen readers for the blind to describe visual images. Inserting keywords into these tags was a technique used by a number of SEOs in previous years. Though some continue to misuse these tags, the practice overall appears to be receding.

Organic Site Submissions: One of the most unethical things a service-based business can do is charge clients for a service they don't really need. Charging for, or even merely claiming, submissions to the major search engines is an example. Search engine spiders are advanced enough that they no longer require site submissions to find information; they find new documents by following links. Site submission services or SEO firms that charge clients a single penny for submission to Google, Yahoo, MSN or Ask Jeeves are radically and unethically overcharging those clients.

Search Engine Spider Behaviour:

Search engines use automated software to browse the web, looking for new sites to index. This software is referred to as a spider, a crawler or a worm. Most search engine spiders operate with roughly the technology of an early-generation web browser.

What does that mean to us? It means spiders can't execute JavaScript, can have trouble with frames, and generally can't follow image maps. Spiders can't see text contained in graphic images, although some of them read and use the alt text assigned to an image. Think about how the navigation is set up on your site: do you have rollover images or drop-down lists in your navigation? Search engines may never see those interior pages. They require text links to get into the deeper content of your site.

Have you ever noticed how some designers always place text links at the bottoms of their pages? One good reason is to keep your site in compliance with U.S. accessibility regulations, since text readers also can't execute JavaScript. But the best reason is to get the spiders into the site. Section 508 compliance has become an important aspect of web design, and it goes hand in hand with search engine optimization.

The text on the pages should be HTML text. Text that is contained in images can't be seen by the spiders, so the content won't be reported back to the search engine.

The goal is to have the search engine spider send back information on every page of your site.

Spiders also browse through the directory structure of your site, and it might not be a good idea to give them access to certain areas, such as your CGI-bin directory. Particular meta tags can give instructions to search engine spiders, but they are mostly ignored; the best way to keep spiders out of areas you don't want them in is a robots.txt file. The robots.txt file lists the paths you want to hide from search engines. It sits at the root of your server and is the first thing a search engine spider looks for when it accesses your site. Take a look at your website statistics program: if you don't have a robots.txt file, check your error section and see whether it is listed as a "file not found"; if you do have one, it should appear in your accessed-files section. Another good way to keep your site search engine friendly is to provide an XML sitemap.
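Python's standard library can parse robots.txt rules directly, which is a handy way to confirm what you are (and are not) blocking before a spider finds out for you. A minimal sketch with urllib.robotparser, using a made-up rule set:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking the CGI-bin and a private area.
robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# can_fetch(user_agent, url) answers: may this spider crawl this path?
print(rp.can_fetch("Googlebot", "/index.html"))       # → True
print(rp.can_fetch("Googlebot", "/cgi-bin/form.cgi"))  # → False
```

In production you would point RobotFileParser at your live file with set_url() and read(); parsing the text directly, as here, is convenient for testing a draft before you upload it.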

Saturday, April 4, 2009

What are the steps to follow in an SEO process?

The search engine optimization process involves changing, or optimizing, a website so that it becomes search engine friendly and appears at the top of the SERPs for its targeted keywords.
Search engine optimization has gained a great deal of importance today because it allows you to market your product on the Internet and drive business and traffic to it. A well-planned search engine optimization strategy can help you gain the business you are after. There are many possible steps in an SEO process, but we recommend the most important and necessary ones below.

The SEO process is divided into the following steps, which have consistently delivered the desired results for our clients.
These SEO steps are:
A- Website analysis.
This step comprises a website review, competitor analysis and the site's current positioning. The initial analysis serves as a benchmark and is used to track progress as the SEO process proceeds. It involves an in-depth review of the site to be optimized: we check the site's current position in the SERPs, its search engine friendliness and the traffic it is currently receiving.

B- Client requirements.
This step involves understanding the client's requirements before starting any SEO work.
It covers the reasons for doing SEO, the client's expectations and goal setting. Here we identify the purpose behind the SEO being done for the site. Client expectations might include increased traffic, higher search engine rankings, more business or greater visibility. Once we know the client's expectations, we can work accordingly.

C- Keyword research:
This step involves building an initial keyword list, expanding it and then finalizing it. Search engines will drive traffic to your website only if you achieve top rankings for your targeted keywords.


D- Content writing:
Content writing follows the keyword research. Search engine spiders treat content as an important factor when ranking a web page or a whole website.

E- Website optimization:
In this step we make the website itself search engine friendly, which is the essence of on-page optimization. It involves a new navigation plan, HTML code optimization, removal of unnecessary Flash and JavaScript, and on-page updates such as title tags, meta tags, internal links, headings, images and ALT tags, plus the addition of a sitemap.
The website optimization step involves a number of sub-steps carried out on the website's pages to make them search engine friendly. We propose and implement a new navigation structure if problems with the existing one could cause trouble with the search engines. A sitemap is incorporated into the website, and title tags, meta keywords and meta description tags are added to the pages. Internal linking is improved, and heading tags and ALT tags are added to the web pages. All of this makes the website easier for search engine spiders to navigate.
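A basic XML sitemap can be generated with nothing more than Python's standard xml.etree module. The sketch below builds a minimal urlset in the sitemaps.org 0.9 format; the URLs and dates are placeholders for illustration.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal XML sitemap from a list of (loc, lastmod) pairs."""
    ET.register_namespace("", SITEMAP_NS)  # serialize without a prefix
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url, "{%s}loc" % SITEMAP_NS).text = loc
        ET.SubElement(url, "{%s}lastmod" % SITEMAP_NS).text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs and dates for demonstration only.
sitemap_xml = build_sitemap([
    ("http://www.example.com/", "2009-04-01"),
    ("http://www.example.com/services.html", "2009-03-15"),
])
print(sitemap_xml)
```

The result would be saved as sitemap.xml at the site root and referenced from robots.txt, so spiders can discover every page without depending on your navigation.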


F- Website submission:
Submitting your website to web directories is a good way to obtain one-way links, using carefully chosen anchor text. The submission process puts the website in front of some of the most prominent search engines and directories on the Internet, which allows quick indexing of the website and its pages.

G- Link building:
This step involves finding quality link partners, contacting them and replacing dead links. Links also help drive traffic to your website from other pages on the Internet. Link building is the most important step in the off-site optimization of a website. Finding quality link partners can be a tedious and time-consuming task; we hand-pick quality sites and contact them to finalize link deals, using appropriate anchor text in the incoming links.

H- Reporting:
This step covers the information we report to the client about our work progress. Reporting allows us to show our progress to the client on a regular basis: keyword ranking reports, website traffic reports, link popularity reports, and targets and goals achieved.

What is SEO, that is, Search Engine Optimization?

Before tackling SEO it helps to know what a search engine is. A search engine is an online application that processes your search request according to its parameters, using algorithms designed specifically for this purpose. How does a search engine actually work? It first examines its own database for the requested query. Alongside this, many other tasks run simultaneously, and the combined results of these tasks produce the view that is shown to the user.
Search engine optimization, then, is the process of making a web page search engine friendly, so that the engine has no difficulty reading the page and can show it near the top of its results. A page's SERP position depends on various factors, so if we want our pages at the top of the SERPs we have to work to certain rules and emphasize the major points recommended by the main search engines such as Google, Yahoo and MSN. In short, to make a page search friendly we have to follow the guidelines recommended by the experts.