Saturday, January 2, 2010

WordPress Plugins To Monitor, Cache And Speed Up Your Blog

1. WP System Health

WP System Health displays basic server information and WordPress PHP memory usage. It also shows the PHP settings relevant to WordPress and the utilization status of your WordPress database tables. With WP System Health, bloggers can easily detect system misconfigurations and memory issues. Administrators can install this plugin to track the system's health status and quickly identify problems.
Wordpress System Health Plugin

2. Hosting Monitor Plugin

Hosting Monitor is a WordPress plugin developed by WebHostingSearch.com. It displays system information about your current hosting server, such as uptime, load averages, current users, hardware details and memory usage. The information is displayed in a sidebar widget, and the administrator controls which items are shown or hidden.
Hosting Monitor Plugin
3. WP Super Cache

WP Super Cache is a well-known caching plugin for WordPress. Once installed, it generates static HTML pages and serves your visitors static content instead of dynamically generated pages. Caching reduces your server's memory usage, your bandwidth and, ultimately, your hosting bill. Static pages also load faster, so your visitors get the content sooner. Blogs that regularly hit Digg.com's front page are strongly encouraged to activate this plugin.
WP Super Cache
4. IP Host Blocker

This plugin can redirect logged-out users based on their IP addresses. Administrators can also use it to redirect visitors coming from a certain proxy to a customized page, blocking anonymous visits to the blog and saving bandwidth and server resources. A good example is monitoring spammers' IP addresses and redirecting them before they can comment on your blog; this saves database space in addition to bandwidth and server usage.
IP Host Blocker
5. Throttle

Throttle is a WordPress plugin that can limit bandwidth and script load. It provides an API that monitors server load and reports it to plugins and themes. Plugin and theme developers can then use the Throttle API to disable non-essential features when the system load peaks. This plugin is useful if your blog often receives traffic spikes or regularly hits the front pages of popular social media sites.
Wordpress Throttle Plugin
6. Error Reporting

This plugin logs errors to a file or sends them as email notifications to the site administrators. Ping errors are also displayed in a dashboard widget. Administrators can check the reports and fix errors as soon as they occur.
Error Reporting
7. Amazon S3 for WordPress

This plugin lets you host your WordPress blog's media on the Amazon S3 service. Amazon S3 is a cheap and effective way to handle traffic spikes on your blog. All you need is an Amazon S3 account and a PHP5 server environment. After setting up your account and the plugin, you can upload and browse files hosted in your Amazon S3 account. You can even create folders, and generated thumbnails are stored in your S3 bucket too. That means you can manage your Amazon S3 files without leaving the WordPress admin dashboard.
Amazon S3 for Wordpress
8. http:BL WordPress Plugin

The http:BL WordPress Plugin lets bloggers verify visitors' IP addresses against the Project Honey Pot database. It helps determine whether a visitor is an email harvester, comment spammer or other malicious bot. By verifying IP addresses, you can block potentially harmful visitors from accessing your website's content, saving both server bandwidth and memory.
http:BL WordPress Plugin
Essential Tips And Tricks To Optimize Your WordPress Blog
1. Combine and compress Javascript and CSS files

Browsers can only download a limited number of items at the same time. If several Javascript files need to be loaded, the browser requests each one individually and queues the downloads. One way to resolve this is to append the names of the other files to the URL of the first file. For example, you may have:
http://www.domain.com/js/jquery.js
http://www.domain.com/js/jquery.slider.js
http://www.domain.com/js/jquery.tooltip.js

You can actually combine these files into a single file:
http://www.domain.com/js/jquery.js,jquery.slider.js,jquery.tooltip.js

The server will know that the page requires multiple JS files; it will concatenate the requested files, compress them and send them to the browser as a single file. You can read more in the source article.
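To illustrate, here is a minimal sketch of such a combiner in PHP. The file name combine.php, the js/ directory and the query-string format are assumptions made for this sketch; the comma-in-path URL shown above would be mapped onto something like this with a rewrite rule.

<?php
// combine.php - sketch of a server-side script that concatenates
// several JavaScript files into one compressed response.
// Example request: combine.php?files=jquery.js,jquery.slider.js
ob_start('ob_gzhandler'); // gzip the output when the browser supports it

$dir   = dirname(__FILE__) . '/js/';
$files = isset($_GET['files']) ? explode(',', $_GET['files']) : array();

header('Content-Type: application/javascript');

foreach ($files as $file) {
    // basename() strips path components so a request cannot escape js/.
    $path = $dir . basename($file);
    if (is_file($path) && substr($path, -3) === '.js') {
        readfile($path);
        echo "\n";
    }
}
?>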
2. Remove unnecessary PHP executions

You should replace WordPress PHP template tags with static HTML so that the number of PHP executions and database calls is reduced. For example, you can replace the bloginfo('description') template tag with your blog's actual description text, instead of calling the template tag each time a visitor loads your blog. Other template tags that always print the same value, such as the stylesheet URL, can be replaced with static HTML in the same way.

Most free and even premium themes use WordPress template tags, since these tags let developers deploy the same theme across different sites. But you can easily replace the template tags by viewing your rendered source code and editing the theme files in your favorite text editor.

More information can be found in the source article.
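As a small sketch, a line in a theme's header.php could go from the dynamic version to a hard-coded one (the description text below is a made-up placeholder):

<?php
// Before: WordPress runs PHP and queries the options table for the
// tagline on every page view.
?>
<meta name="description" content="<?php bloginfo('description'); ?>" />

<?php
// After: the same value pasted in as static HTML, so no PHP call or
// database lookup is needed.
?>
<meta name="description" content="A blog about WordPress performance" />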
3. Optimize CSS and Javascript files

CSS and Javascript files can be optimized to reduce their size. For example, extra blank lines and unused CSS classes should be removed. By shrinking the files, you effectively reduce server bandwidth and save disk space too. There are online tools that can optimize your Javascript and CSS files for you; CSSOptimizer and Javascript Compressor are among the best.
4. Optimize your images

If you use a lot of images on your blog, consider optimizing them to reduce file sizes and speed up loading. We have covered image optimization tools before; you can either optimize the images online or download an application and optimize them before uploading to your server.
5. Move the Javascript calls to the footer

Browsers can only download a limited number of items at the same time. If you use many Javascript files on your WordPress site, combine and compress them using the first technique covered earlier. Besides compressing the scripts, you should also move them to the footer of the page so that the content is served first. If you load many script tags in the head, visitors have to wait before they can read your content, and they may leave if the page takes too long to load.
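WordPress supports this directly: the fifth parameter of wp_enqueue_script() tells it to print the script just before the closing body tag instead of in the head. A minimal sketch, with a placeholder handle and file path:

<?php
// functions.php - load a theme script in the footer.
function mytheme_enqueue_scripts() {
    wp_enqueue_script(
        'mytheme-slider',                                // handle (placeholder)
        get_template_directory_uri() . '/js/slider.js',  // source (placeholder path)
        array('jquery'),                                 // dependencies
        '1.0',                                           // version
        true                                             // true = print in the footer
    );
}
add_action('wp_enqueue_scripts', 'mytheme_enqueue_scripts');
?>

Note that this only works if your theme calls wp_footer() in its footer template, which is where the queued scripts are printed.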
6. Use Google Ajax Libraries API

Google Ajax Libraries API is a content distribution network and loading architecture for popular Ajax libraries. You get a lot of benefits by using the Google Ajax Libraries. For example, the caching of these files is done correctly by Google, and Google's distributed CDN has nodes at various locations, so these files are "close" enough to your visitors, which indirectly reduces loading time.

Even first-time visitors may load your page faster, since these scripts may already be cached on their PC from other sites that use the same libraries. The browser recognizes the versioned URL and loads the script from its local cache instead of downloading it again.
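A common way to take advantage of this in WordPress (the jQuery version below is only an example) is to swap the bundled jQuery for Google's hosted copy:

<?php
// functions.php - serve jQuery from the Google Ajax Libraries CDN.
function use_google_jquery() {
    if (!is_admin()) { // leave the admin area on the bundled copy
        wp_deregister_script('jquery');
        wp_register_script(
            'jquery',
            'http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js',
            array(),
            '1.3.2'
        );
        wp_enqueue_script('jquery');
    }
}
add_action('init', 'use_google_jquery');
?>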
7. Remove unnecessary plugins

WordPress plugins make calls to the database, which slows down loading. Bloggers are encouraged to deactivate unnecessary plugins so that the number of database calls is reduced.

Deactivating unnecessary plugins also removes their unwanted hooks. Hooks are WordPress's mechanism for letting a plugin 'hook into' the rest of WordPress, and every unnecessary hook adds work on each page load, slowing down your site.
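As a tiny illustration of that cost, consider a hypothetical plugin that registers a single filter; its callback now runs for every post on every page view for as long as the plugin is active:

<?php
/*
Plugin Name: Example Footer Note (hypothetical)
*/
// add_filter() hooks the callback into WordPress; it executes on every
// rendering of post content, whether or not you still want the note.
function example_append_note($content) {
    return $content . '<p>Thanks for reading!</p>';
}
add_filter('the_content', 'example_append_note');
?>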

Yahoo SEO Interview

Interview Transcript

Eric Enge: Can you talk a little bit about the role that links play in Yahoo's ranking algorithms?

Priyank Garg: Sure. There's a lot of mythology in the industry sometimes around how links work. Links are a way for us to understand how Web sites and other people on the Web are recognizing other content that they have come across. The anchor text from that link indicates context for the content that it's linking to, and we have used this information in our algorithms for many, many years to better address search queries as they come into the search engine.

So links are important, but anchor text is just as important. What we look for are links that would be naturally useful to users in context, and that add to their experience browsing on the Web. And links of that nature, which are organic, will survive when the user comes across them and finds them interesting. Those are the kinds of links that we are trying to recognize, identify, and attribute to the target content.

Eric Enge: Right. So part of what you are pointing at there is that relevance matters a lot. So getting a link from the bottom of a WordPress template that you create and distribute is completely irrelevant.

Priyank Garg: Exactly, that's the kind of thing that we are trying to do all the time. The irrelevant links at the bottom of a page, which will not be as valuable for a user, don't add to the quality of the user experience, so we don't account for those in our ranking. All of those links might still be useful for crawl discovery, but they won't support the ranking. That's what we are constantly looking at in algorithms. I can tell you one thing: over the last few years, as we have been building out our search engine and incorporating lots of data, the absolute percentage contribution of links and anchor text to the importance in our ranking algorithms has gone down somewhat.

New sources of data and new features that Yahoo! has built and developed have made our ranking algorithm better. Consequently, as a percentage contribution to our ranking algorithm, links have been going down over time. I believe that is somewhat attributable to people abusing links on the Web. As that happens, the net quality of links goes down, and the net contribution directly goes down too. However, we're still working hard to make sure all the high-quality links are effective in providing us information we need to search on queries.

Eric Enge: So from a mind-set point of view it sounds like you are much more focused on the high-quality links, because they are less noisy as a ranking signal?

Priyank Garg: Exactly, that's right. If we take all the links together including the noise, the percentage contribution to our ranking system goes down, because we are discounting the noise more effectively over time.

Eric Enge: Right. I understand, but the links are still a very significant factor even now.

Priyank Garg: Yes. They continue to be a very significant factor.

I'm saying that people and site owners should think about the site in all aspects of the user experience, and not obsess about links as the only thing that drives traffic to them. Links are critical factors; good organic links are earned through great content and great value, and they will add to a site's visibility on search engines. But site owners can do a lot of things in parallel that will improve their visibility both in search engines and beyond search.

Eric Enge: Yes, indeed. So do you picture the role of links continuing to decrease as new ranking signals are developed?

Priyank Garg: We are not focused on doing that. This is a developmental process and anything might happen, right? It somewhat depends on how the Web evolves. For example, if tomorrow there is a whole turnaround and all spammy links are shut down, we might suddenly have the quality of the link signal go up so much that links might increase in importance. I don't want to make predictions about what will happen in the future. All I can say is that we have seen that as we enhance our algorithms with a lot of other features, which we have been building, they have been contributing a lot of information to supplement the links. As a percentage contribution to the ranking function, links are relatively less than they were in the past.

Eric Enge: So it sounds like the process in which you are evaluating the noisiness of the signal is used to attenuate its impact on the results.

Priyank Garg: That's right.

Eric Enge: That's a very interesting concept. If link prices were going up at the same rate that gas prices are going up and people just stopped doing it, then the signal quality would improve and its importance would improve.

Priyank Garg: Yes. Our algorithms are evolving constantly; we are making changes to our systems multiple times a week. Some of the changes are minor. Most of them are so minor that we don't even need to talk about it, but we are constantly evolving our system to keep up with the data on the Web that is also evolving. In search engines, the data is the key part of the output.

The data and the algorithm constantly evolve. The Web evolves, new tools come into play, new ways of interacting with users come into play. It is a constant evolution all the time, and we adapt our algorithms to what we are seeing on the Web to make sure that our end goal of user relevance is optimized.

Eric Enge: Sure. Ultimately, that's what it is all about. So, what kind of non-link based signals do you use?

Priyank Garg: Well, we have lots of data sources that we are recognizing all the time. We build understandings of how a site lays out its contents; what's the distribution of the quality of the content; what's the spamminess of the content on the site; what is the spamminess of an individual page; what is the spamminess of a site in aggregate; what's the emphasis on words on the page; what's the context of anchor text of the page? There are so many factors in there, hundreds and hundreds of elements.

Eric Enge: Right. As for off-page factors, one example is the level of adoption on social media sites, like del.icio.us.

Priyank Garg: All of those factors are a part of it. To give you a general answer, the elements of the locations that provide the most signals are the ones where users are taking active steps to recognize the value of content, whether it be through links they have created on their clean Web pages, or through social media sites like del.icio.us.

So every location where the incentive is aligned for user value is the place where it matters most. If there's a good Web site with reviews for a product and it's generated by users, who have no incentive except to help other users, then those links would be valued more.

That would come out through the algorithm because of the quality of that site itself. If that site is used by users and they value it, that will represent itself on the rest of the Web, and the quality of the site will propagate down to the sources that it links to as well.

Eric Enge: An incredibly important thing to consider is the type of relationships you are trying to develop in your market. It becomes really important to focus on the truly authoritative type of sites in your space, and I can make the argument without even bringing search engines into it, but it sounds like it is a very smart way for an SEO to think about promoting his/her site.

Priyank Garg: That's right.

Eric Enge: Very interesting. Now, as you have already alluded to, there are, unfortunately, a lot of people out there generating spam type tactics, ranging from old-fashioned things like hidden text to purchasing links and these kinds of things.

Priyank Garg: Yes.

Eric Enge: So, what are the kinds of things that Yahoo typically does to fight spam?

Priyank Garg: We use algorithmic and editorial means to fight spam. What we have found is that algorithms are very effective at fighting spam at large scale, and our human editors are very effective at recognizing new techniques and providing us that early signal, which we can use to scale up the detection processes. This two-step approach helps us to be recognized as one of the best in the industry.

We show the least spam among the search engines, because both of our techniques are in action. Our spam detection techniques run on every page, every time we crawl it. Those detection algorithms are fed directly into our ranking function, where the spam detection is actually pretty high in importance.

Eric Enge: Yes, I guess the editorial function, which isn't quite scalable, probably gets directed by the algorithmic detection of things that just smell bad, and by where people are reporting problems.

Priyank Garg: Yes it is. Yahoo! specialists who are doing all the editorial efforts are people who are great experts in this, and they sometimes are ahead of our algorithms in detecting these things. Sometimes, the algorithms point out suspicious things, which they look at. There is a knowledge that builds up over time about what looks suspicious, which only humans can detect in the beginning.

Then, we use that to go to the next level of quality in our spam detection. Both of those mechanisms, algorithmic detection followed by editorial follow-up and editorial detection followed by algorithmic follow-up, are in action all the time. Ultimately, the way to scale up the response is to build algorithmic ways to detect things on every page and every crawl. So everything that our editors do is constantly being mirrored by our spam team as quickly as possible in the algorithms.

Eric Enge: Yes, I understand. Do you have a situation where editors have the ability to take manual action if they see something extreme?

Priyank Garg: Our editors are authorized to take action in various kinds of situations like DMCA or legal deletes, such as for markets like France, where there is a restriction on certain types of content, such as Nazi memorabilia, which other markets don't have. Consequently, there are various tools that are available to them. They are not focused on saying, I need to find a million pages of spam and remove them this month.

We can use it to have our algorithms learn, we can use it to address it directly, we can use it to reach out to the webmaster and warn them that it might not be meeting our guidelines, so we do what is right for the users as best we can.

Eric Enge: Right. And sometimes, you are going to spot something which looks like a mistake. That's the kind of scenario where you might reach out to someone and ask if they know that they have hidden text that is kind of objectionable.

Priyank Garg: Yes, exactly. The point is that we don't want to hurt people who may be doing things innocuously or starting to cross the line without being aware. And our clear intent is not to explicitly remove spam from our index. Our goal is to affect the ranking and reflect the relevance appropriately.

There is a query out there for which each page is relevant, and so the completion of our goal requires our algorithm to keep all the content we can, even the spammy ones. Of course, that's something that becomes egregious on resources, and then sometimes, we have to make other choices. However, if there is a page that is generally okay but has some spamming techniques, someone might search for that URL, and as a search engine we want to make sure we have the most comprehensive experience we can.

But if someone goes out there and creates a hundred million spam DNS hosts, that's just a waste of resources and we may not choose to take that approach. In principle, our desire is to keep as much of the Web available to users on our search engine, to rank it appropriately.

Eric Enge: Right. So, if you saw a page that has some good content, but there are some spammy practices on it, it affects ranking as opposed to indexing?

Priyank Garg: When it crosses a certain line of resource consumption, we may change the approach. But that is the intent, yes.

Eric Enge: Of course, another way you can end up with pages that are not high quality is if you have large sites, which perhaps have enormous reams of really good content, but because of their size, you may end up with pages that are thin in unique content.

You might also have a database of some addresses, so when you look at it it's relatively thin in content, but then again, someone might actually be searching for the address of a particular business that's on that page.

Priyank Garg: Yes. Again, as we work we've tried to make sure that our algorithms are constantly trying to optimize the experience for the largest number of queries that we get from users. The net information content that's available to users and ranks for queries is what we are looking at. If a page has thin unique content, whether it is from a large site or a small site, it may mean that it's less useful for most queries.

Eric Enge: Right. Yes, indeed. So, what about just paid links in general? What's your policy on that?

Priyank Garg: There's no black and white policy that makes sense in our mind for paid links. The principle remains value to the users. If a paid link is not valuable to the users, we will not want to give it value. Our algorithms are organized around detecting value to users. We feel most of the time that paid links are less valuable to users than organic links.

But that's not black and white; it is always a continuum. Yahoo! continues to focus on recognizing links that are valuable to users, building mechanisms in our algorithms that attenuate the signal and capture as much value from that link in context, rather than worrying about it being paid or unpaid. As I said before, paid links are found to be generally less useful to users. That's how we try to capture that aspect of it.

Eric Enge: Right. So now I would like to talk a little bit about some of the common meta tags that have emerged in the past couple of years: the NoIndex and NoFollow meta tags, the NoFollow attribute, and robots.txt. In particular, the context I'd like to talk about is limiting the flow of what I will call link juice, or stopping a page from being crawled, or stopping it from being indexed.

Let's take them one at a time and talk about how you handle NoIndex?

Priyank Garg: NoIndex on the page essentially means that none of the content on that page will be searched or will be indexed in our search engines. If there is a page with a meta NoIndex on it, that page will not be recalled for any of the terms in its HTML.
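For reference, the tag being discussed here is the standard robots meta tag placed in the page's head:

<head>
  <!-- tells search engines not to index this page's content -->
  <meta name="robots" content="noindex">
</head>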

Eric Enge: Right. Now let us say lots of people link to this page, which is NoIndexed and those are good relevant links, and then the NoIndexed page turns around and links to some other pages with good relevant content; is that NoIndexed page passing link value to those other pages?

Priyank Garg: We do index a page and we will show its URL in search results if it is very heavily linked to on the Web, even if it has a NoIndex tag on it.

That is a behavior that we follow. It's essentially applicable to situations where the page itself is high value, and it has many links that are very relevant to a particular query as indicated by anchor text.

Eric Enge: Right. I guess there is sort of a threshold at which the links indicate a high enough demand for that page's content that it's hard not to have it in the index.

Priyank Garg: Exactly. So in that particular case, we will have the URLs show up in the search results, but there will be no abstract. And the URL would show up only because of the anchor text; it will not show up because of any terms on that page.

Eric Enge: Right.

Priyank Garg: We do currently show pages which have a NoIndex tag if the anchor text recommends it. We also discover links from a NoIndex page and pass the link weights and anchors to the destination documents.

Eric Enge: So can we talk a little bit about NoFollow meta tags and NoFollow attributes?

Priyank Garg: Yes, so a NoFollow meta tag means that we will not use the links on that page for attribution, but we may use them for discovery. The same thing applies to the NoFollow attribute on a link.
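Concretely, the two forms he distinguishes are a page-level meta tag and a per-link rel attribute (example.com is a placeholder):

<!-- page level: do not use any link on this page for attribution -->
<meta name="robots" content="nofollow">

<!-- link level: do not use this one link for attribution -->
<a href="http://example.com/" rel="nofollow">example link</a>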

Eric Enge: Right. So the anchor text and the vote represented by the link for a given page are ignored if it's NoFollowed or it's on a page that has the NoFollow meta tag, but you will still look through the page and use it for discovery and potentially indexing if there are other reasons to index it.

Priyank Garg: Yes. Exactly.

Eric Enge: Yes, that makes sense. And then lastly, robots.txt? Say somebody uses robots.txt to say don't crawl a page; is it still possible for that page to get into the index?

Priyank Garg: Yes. If the robots.txt file says don't crawl, we will not crawl; we will not even try to retrieve that page at all. But if the anchor text pointing to that URL, as discovered on the Web, indicates a strong preference for it to show up for certain queries, it might show up.
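A minimal robots.txt rule of the kind described, blocking all crawlers from a directory (the path is a placeholder), looks like this:

User-agent: *
Disallow: /private/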

One example in the past was the Library of Congress, which had a robots.txt denying crawling, but we still had that page show up because it was what users wanted for that query. So it will only show up when lots of anchor text on the Web suggests that this page is relevant to that particular query.

Eric Enge: Right, okay so that makes perfect sense. So if you can't crawl the page because it said don't crawl it, then it's hard to show a search snippet, for example, right?

Priyank Garg: Yes, we won't have a search snippet for that page. We won't even be showing the title of the page; the title we show will be generated by other information sources.

Eric Enge: Right. So all that makes sense and what it ranks for is really driven by the source of the links?

Priyank Garg: Exactly.

Eric Enge: That same phenomenon could be ascribed to other technologies which just aren't in practice crawled, like Flash files for example.

Priyank Garg: Adobe Flash files are somewhat different, because that's not always what happens there. We do have an ability to crawl the HTML of the page, and that might give us an HTML title with the description of that page. There might be a version of the content that is made available to crawlers for Flash. So there is another thing playing into it, but if there is nothing on the page except a link to a Flash file, then the other off-page factors will be what drive the visibility of that URL in the search results.

Eric Enge: Right, and then perhaps a PDF file?

Priyank Garg: We are actually able to convert PDF files to HTML.

Eric Enge: So you can actually look at the text inside the PDF file, and process that, and use that for context and everything else?

Priyank Garg: That's right. We can also do that with Word files, Excel files, and many other formats.

Eric Enge: That's, of course, something that has evolved pretty significantly over the past few years, because not so long ago nobody was looking inside PDFs.

Priyank Garg: Yes, we have continued to evolve the quality of our tools for looking into PDF files, and those efforts have been going on over the last few years, so that has been an evolving area as well.

Eric Enge: Is there any reason for a publisher to fear that issuing content in the form of a PDF file wouldn't rank as well or get exposed to as many terms as the same content rendered in HTML?

Priyank Garg: That's a tough one to say, and the reason is that my sense is users link less to PDF content or non-HTML content, just because it's somewhat slower to view.

Consequently, the effects playing into the visibility of this could be multi-variant. I wouldn't make a blanket statement about HTML being equivalent to PDF, because user attribution and other factors play out differently on the Web for different formats of content. So that is something that publishers will need to think about.

Eric Enge: Right, I understand. So just to step back a second to the NoIndex, NoFollow and robots type stuff. The notion of what people call link juice sculpting has been discussed in many circles on the Web: using tools like the NoFollow attribute a little more explicitly to show which pages you think are important versus which ones you don't. And so a classic example is, you have a Web site and you have your contact us, about us, and legal disclaimer pages linked to from every page of the site. What are your thoughts on that kind of sculpting?

Priyank Garg: It's interesting that this discussion is described in that context. A NoFollow tag creates an alternative state of attribution, but if you think about it, it's not very different from not linking to those pages. When you link to a page, you are saying something about it. When you don't link, that's also an implicit comment, either you didn't know about the page, or you didn't think it was useful.

So if you think about link juice sculpting, this targeting of link attribution existed even before the NoFollow tag, where you could either link or not link to something. Now you have an intermediate stage such that:

1. you can link without NoFollow
2. you can link with NoFollow
3. you can not link.

So, it's not something that is entirely out of the blue, it's just an intermediate stage that's created; and it's not anything terribly new. You should always make sure you link to content that's useful to users and if you link to the right content, that will work best.

One of the things Yahoo! has done is look for template structures inside sites so that we can recognize the boiler plate pages and understand what they are doing. And as you can expect, a boiler plate page like a contact us or an about us is not going to be getting a lot of anchor text from the Web and outside of your site. So there is natural targeting of links to your useful content.

We are also performing detection of templates within your site, and the feeling is that that information can help us better recognize links that are valuable to users. We do that algorithmically, but one of the things we did last year around this time is we launched the robots-NoContent tag, which is a tool that webmasters can use to identify parts of their site that are actually not unique content for that page or that may not be relevant for the indexing of the page.

If you have ads on a page, or if you have navigation that's common to the whole site, you could take more control over our efforts to recognize templates by marking those sections with the robots-NoContent tag. That will be a clear indicator to us that, as the webmaster who knows this content, you are telling us this part of the page is not the unique main content of this page, and we should not recall this page for those terms.
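In markup, Yahoo's robots-nocontent is applied as a class value on the element you want excluded, for example:

<div class="robots-nocontent">
  <!-- sitewide navigation or ad markup that should not count
       as this page's unique content -->
</div>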

That kind of mechanism is something that we provide as a control for site owners to be more explicit about what parts of the page are boiler plate. But NoFollow links are very different from not putting in the link at all, and so I don't see this to be very different in terms of the tools available to webmasters.

Eric Enge: Yes, indeed. So the NoContent tag is interesting too, because I am sure when people use it, it removes potential ambiguities in the interpretation of the page and allows them to focus attention on the things that are most important.

Priyank Garg: Exactly right.

Eric Enge: Yes, so that's a good tool. Have any of the other search engines moved toward supporting that?

Priyank Garg: We've actually brought this up in our conversations. You might recall earlier this month we all blogged about the support we have for robots exclusion protocols.

Eric Enge: Yes.

Priyank Garg: And we resolved a bunch of the small variations among us. So it has been brought up. I don't believe any of the others are supporting it yet, but we will find out in time.

Eric Enge: What is it that Yahoo does when you discover duplicate content across two different sites and how does it deal with that in terms of the quality of the search experience?

Priyank Garg: Our goal is to surface good, unique content for users and provide the maximum amount of relevant information for every query the user makes. So, our efforts are constantly to detect duplicate content sources, recognize the parent source as much as possible, and attribute duplicated content as much as possible to the parent or the original author. Then we try and surface that for every query that we receive that it's relevant for. Say site-A has content which is duplicated on site-B, and we recognize that A is the parent; then for a query related to that content we will likely surface A higher. But if a query says I want content from site B on those terms, we will obviously try to surface that.

Eric Enge: But it's not always that easy to know who the parent is.

Priyank Garg: That's true; it is not always easy to know the best page, but it's part of our algorithmic efforts to detect that intent, and we continue to do that. There are lots of signals that can often work, and in most cases they do work when the duplication is not egregious or intentional. It is entirely a function of how the Web is operating. Usually we do a reasonable job, but sometimes it's not possible.

Eric Enge: Right. And then of course there is the extreme version where it's a copyright violation, and sometimes that escalates itself to you in the form of DMCA requests.

Priyank Garg: That's right, we have a well-documented process for DMCA complaints. Those complaints, when they come in, are investigated directly by our support and editorial teams and can be acted upon in a very targeted manner. So if you or any site owner has any content that you believe has been plagiarized or taken without your consent and you file a DMCA complaint with us, we will investigate that and take down the content that is found to be in violation of the copyright rules.

Eric Enge: Right, although I think it's probably fair to say that if you file a DMCA request, you had best be the owner of the content.

Priyank Garg: That's of course true. You better know what you are pointing out.

Eric Enge: Yes, indeed. Are there situations in which extreme amounts of duplicate content can be flagged?

Priyank Garg: The essential policy on duplicate content is not to treat it as negative; it's essentially to treat it as an optimization in our systems. But there is a point where that no longer holds true. A common example could be a spammer who has hundreds of millions of posts up with the same content.

That's a clear example where you can say that it's not really duplicate content, it is just an egregious practice that can affect the entire site. So, there is a point at which it does become a violation of our editorial guidelines.

Eric Enge: Yes, indeed. To finish, I would like to see if you have some general advice that you would like to offer to publishers and SEOs in terms of what Yahoo views as best practices.

Priyank Garg: Yes. The basic principle remains the same as you said: be aware of users, and that's what we will continue to gear ourselves toward. But be search engine smart so that we can discover your content. The robots NoContent and other tools that we have provided are means that give you control, and if you use them, they can work for you. We don't expect everyone to have to use those controls, and we continue to build algorithms to do much of that work.

Yahoo Site Explorer continues to be a great avenue to learn about what we are doing. We have been doing some work learning from the last feature we launched, which was well-appreciated, the Dynamic URL Rewriting. That is a tool that we have seen in multiple examples as having really significantly increased the quality of the experience of site owners.

I talked about this feature again at SMX Advanced in June 2008, and while I was speaking on the panel, someone from Disneyworld was in the audience. Within five minutes, while I was still describing it, he went to Site Explorer and set it up, and he was already seeing the benefits of what he had implemented.

Eric Enge: Right. I really appreciate you taking the time to speak with me today.

Priyank Garg: Thank you, Eric!

Have comments or want to discuss? You can comment on the Priyank Garg interview here.
Other Yahoo Interviews

* Yahoo's Andrew Tomkins - May 12, 2008
* Yahoo's Tim Mayer - October 1, 2007
* Yahoo's Tomi Poutanen - May 14, 2007
* Yahoo's Tim Mayer - December 4, 2006

About the Author

Eric Enge is the President of Stone Temple Consulting. Eric is also a founder of Moving Traffic Incorporated, the publisher of Custom Search Guide, a directory of Google Custom Search Engines, and City Town Info, a site that provides information on 20,000 US Cities and Towns.

Stone Temple Consulting (STC) offers search engine optimization and search engine marketing services, and its web site can be found at: http://www.stonetemple.com.

Monday, August 10, 2009

NHS Blog Doctor: Erectile dysfunction: size matters

Since generic viagra is produced under the same formulation and quality control as the patented viagra, you should not see any differences in effectiveness and should be able to enjoy the same results. In this respect, you can be certain that you can buy generic viagra and enjoy the same quality at a lower price than by buying the patented medication.

Tuesday, July 21, 2009

Easy Exercises You Can Do At Work

The risk of developing coronary artery disease, stroke, diabetes, certain cancers, arthritis, and osteoporosis is greatly diminished in those who exercise daily and lose fat. Thus, the chances of problems in your sexual life are reduced and you enjoy better sex. Daily exercise gets your body in shape, increases stamina, and makes you look more attractive than before, so you enjoy sex more. If you walk daily for 10 minutes, it reduces your risk of suffering from impotence. Thus, losing weight and exercising daily help you enjoy sex more.


Wednesday, June 24, 2009

TOP SEO TIPS



Okay, it seems like a trend hitting us lately: on most of the SEO or webmaster related forums I frequent, I see the same complaints. "My rankings have disappeared", "Google hates me", "my competitor who only has 2 links is beating my 23 billion links".

So I am going to attempt to explain some of the reasons for a loss in rankings, in the hope that it might help clear out the excessive posting on this issue. Please let me know if I am unclear about something.

First I should explain what causes rankings to drop and/or fluctuate. There are actually a number of reasons. Here are a few:

1. Some of the back links pointing to your site have been devalued or are no longer active (your link was removed)
2. A competitor has gotten the edge on you and taken your positions. Generally this scenario only costs you a couple of positions.
3. New algorithmic changes that put more value on a certain factor or devalue another factor.
4. Earthquake... site structure disrupted, causing pages to be down/missing, etc.
5. Penalization or banning

Okay I will now attempt to dig deeper into each of these scenarios...

1 - Some of the back links to your site have been devalued
Where are the majority of your backlinks coming from? Reciprocal links, links pages, or directories? If so, it is likely they have been devalued/discredited by Google. This happens over time. These are the easiest forms of link building and as such are not seen as very high quality by Google.

The relevancy of the directory/links page has a great deal to do with how much value is originally passed and whether it will be devalued. Directories or links pages that allow irrelevant, low quality trash to be listed will lower the value of your link. As more and more of these links are added to the page your link appears on, the value of your link drops lower and lower. This can cause rankings to drop and even appear to fluctuate. Directories and links pages which have only a few links, which are related to your site, and which are preferably also high quality authorities will hold the most value and are not likely to be devalued as easily. That is why it is a good idea to watch where you get your links and to be selective about quality.

There are sites created for the sole purpose of building links. They put up some makeshift content and then allow webmasters to add links. Such pages generally lose value quite quickly as more and more sites are added to their "resource". Sites that are hard to get a link on are the best links to go after. I don't care what it takes; spend 5 hours getting that one link... it will prove far more valuable than those 50 FFA links you could have gotten in the same time period.

When reciprocating links, look for quality sites which are particular about who they exchange links with: sites that have only a few link exchanges instead of accepting practically anybody.
Sites that are careful about who they exchange with will hold their value much better, and you will get more out of the exchange. Sites which exchange links directly on content pages pass a much higher value and tend not to lose it; instead, the value is likely to rise if the content is good and draws links from other sites.

You must be very careful that you know who you're linking to and what, if anything, you're paying for with that link. DON'T waste time exchanging links with sites if all you get is a link on a links page. It will start out as a greatly devalued link and will lose more and more value as more low quality links are added.

And NEVER EVER exchange links with any site that is not related and that you cannot vouch for as a quality resource. Be safe, stay relevant, and stick to quality links only.


2 - A competitor has gotten the edge on you
This generally causes only a few positional changes, such as a drop of 1-3 positions. However, if you are slacking and haven't been working as hard as you should to keep your rankings, then a good many of your competitors could rise above you just because they are still working and you aren't. If this is the case, you will likely see a slow but steady loss of rankings.
A good many sites in business fields pay for continued SEO/link building, so if you stop working it is highly unlikely that you will stay where you are very long... unless you have high quality link baits which act as high-powered link magnets. In that case you will always be maintaining a quality site and will have a far more secure position. BUT don't let this convince you that you can sit still; even high quality link baits can be beaten... by higher quality link baits.

Also, some of the factors I am mentioning may combine to do a lot of damage. Say you are slacking on your links, and on top of that, the links you were acquiring were low quality and are being devalued by Google. Your competitor, on the other hand, developed a tool which is in very high demand, and this tool acts as a very powerful link bait for their site. Some of these links are of very high value and authority. In such a situation you could lose a lot of ground, because not only are you using very poor means of link building, but at least one or more of your competitors is using VERY GOOD link building techniques. Are you following my logic? That is why I always suggest building up the quality of your site. The back links will follow, but without a good quality site you are going NOWHERE!


3 - New algo changes that put more value on a certain factor or devalue another factor
This factor is similar to the first; actually, it is the same in a way. Algo changes can put more or less value on a certain factor, resulting in a loss of rankings literally overnight.
Google seems to be updating as they please lately, which means you will see changes all the time. Massive updates such as Big Daddy can create a lot of unhappy webmasters, though. But again, you must understand this is generally because of the devaluation of low quality links and link building methods, or because Google is finding new ways of deciding whether or not your site is as good as they once thought. Google is getting smarter; are you?

If you are using low quality link building methods and then Google devalues them even more, you can lose a ton of ground. Ground that you could have held if you had been working on building fewer but higher quality links.

4 - Site structure disruption
If a page on your site that used to rank well suddenly disappears and is not immediately fixed, the ranking is likely to disappear as well.
When moving content to new pages, use 301 redirects from the old to the new (a minimal sketch follows), but if at all possible keep the old pages.
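As a minimal sketch of such a redirect (assuming an Apache server; the paths and domain below are made-up examples, so adjust them to your own URLs), you can add a line like this to your site's .htaccess file:

    # .htaccess: permanently (301) redirect the old page to its new location
    Redirect 301 /old-page.html http://www.mydomain.com/new-page.html

Each old URL gets its own line; the idea is that search engines following the redirect should pass the old page's value on to the new one rather than dropping it.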

I have seen whole sites disappear from Google because of site problems, whether it was server downtime issues or the page disappearing... I once purchased a site off a guy for a wicked price; I mean, I got a wicked deal. The site was hosted with a reseller who had terrible support and not the best uptime, so I decided to move the site. Well, in the process the host decided to suspend my account. To make a long story short, by the time I got the issue resolved (or close to it), I had lost all my rankings and traffic. The site had an increase in traffic of 40-60% each month; it had great potential, but I lost it all due to a stupid host.

Learn from my mistakes, don't ever let one of your sites be down due to server issues or whatever. And don't ever let pages get deleted... you never know what rankings you might lose due to a simple page disappearing from your site.


5 - Penalization or banning
You need to understand the difference here. When your site is banned, a site:mydomain.com query will show no results. However, this is not fool-proof, as a new site will also appear this way until it is indexed, so this check only works if you know for a fact your site was once indexed. As long as there is at least ONE PAGE indexed, you are not banned.

Your site can be penalized without you even knowing it, however, and without the loss of any pages from Google's index.
Don't use methods such as hidden text, cloaking, or other black hat SEO techniques. Some sites do rank well WHILE using these methods, but NOT because of them; in fact, removing the black hat methods could improve their rankings. The -30, -49, -950 and other such penalties are myths IMO; I have yet to see proof of them. "Oh no, today my site dropped 7 positions, I got the -7 penalty." This, IMHO, is nonsense.
Design and build your site as though search engines were nonexistent. Design and build your site for the visitor, not the spider. If you do this you will find your sites are higher quality and will actually rank much better, because links are more easily obtained.


So, to end this article: if you have recently lost rankings, it is probably a result of one of the above factors. I suggest you work on building up the quality of your site as a whole and on creating individual link bait resources. Create new content regularly, and write on unique subjects (within your niche) which other bloggers miss or have not fully covered. Provide answers to questions not easily found, and provide easy navigation both for your visitors and your robot friends. Create an HTML sitemap (a minimal sketch follows); the XML version is not needed.
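As a minimal sketch of such an HTML sitemap (the page names and URLs here are hypothetical), a plain page of links is all it takes:

    <!-- sitemap.html: a simple hand-built HTML sitemap (hypothetical URLs) -->
    <h1>Site Map</h1>
    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/articles/">Articles</a></li>
      <li><a href="/about/">About</a></li>
      <li><a href="/contact/">Contact</a></li>
    </ul>

Link to it from your footer so both visitors and crawlers can reach every page in a click or two.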

Google is working to improve their search engine and make it even more relevant than it is now. Content is a very high priority. Don't underestimate that; its indirect SEO benefits are worth as much as the links it attracts.
And I mean content as in wiki-quality content: content which is unique, high quality, and very informative. As such it attracts links very well. I am not saying create your own wiki; I am saying create wiki-quality articles.

If you follow the above suggestions, you will improve your chances of turning your site into a virtual link bait. This will cause webmasters to link to you without even being asked to do so. This type of link building is the most effective, of the highest value, and, believe it or not, the easiest to get.

Build sites based on your knowledge. Don't start a mesothelioma site JUST BECAUSE it has high AdSense earnings. Build sites based on subjects you are very knowledgeable about, are passionate about, and can provide useful information on. These types of sites perform very well in Google.
I find most of the ranking losses are for sites that are not following this method. The webmasters of a good many of these sites are not very knowledgeable about the subject they are writing about, but they thought there was some good money in it.
As Google devalues these low quality sites, they will continue to drop... Instead of blaming Google, however, use the tips I have given you to create sites that will perform well.



Sunday, June 21, 2009

8 Tips for Success in Online Business

Today I will talk about how to do business online. Just read the following points.





If you want to succeed in your online business, there are some very basic tips that will apply to any type of online venture. A home based business requires a lot of self discipline, but it also requires a good direction to focus your efforts. These should help you get started.

1. Take care of your potential customers - Any communication you receive from a potential customer should be answered promptly and courteously. That first communication from a customer is almost like an audition. They may have a question, but they are also testing you to see if they would like to do business with you, not your company, but you. They want to know who is behind the web page.

2. Be persistent - Be determined to make your business a success. You will likely encounter set-backs, but keep working to overcome them. If you chip away at the walls that appear in front of you, you will eventually tear them down.

3. Don't expect instant success - It will take time and effort to make your online business a success. Work hard and take the necessary action to make your business work. In the end, your hard work will pay off.

4. Create a budget for your business - It is very rare that you will make your business a success without spending some money along the way. Set up your plan for how much you will spend in any given area, so you do not overspend without realizing it. Don't be afraid to rethink your budget allocations if you see that they need to be changed.

5. Promote aggressively - The only way you will make money in your business is if people are aware that it exists. You have to spend a majority of your time and energy marketing and advertising your business. Do your research and try out different ideas, then put more effort towards the ideas that work. Your goal is to make sure as many people know about you as possible.

6. Never stop learning - Every day is a chance to learn how to do things more efficiently. Keep your eyes open for new ideas and strategies to implement. Look for tools and techniques that will help you.

7. Communicate - If you have business partners and employees, communicate with them often. Talk to them about decisions and the direction of the company. Also, talk to them about any idea they may have and look objectively at their suggestions. They may see something you have overlooked.

8. Believe in your success - If you believe that you can do this, you will inherently work harder. You will set high goals and standards for yourself. You will not give up easily.

Remember, by following these steps, you will greatly increase your chances for success. You can do this. Grow your business!