Archive for the ‘Matt Cutts’ Category

About Panda Effects, Automated Page Generation and No-Follow Links


Once again, some common SEO questions have been posed to Matt Cutts, head of Google's webspam team. As always, we elaborate on what he said about SEO and Google. There were three main questions on which Cutts enlightened webmasters around the world. The first concerned no-follow links and whether including them can harm a website's position. Cutts answered with a straight "no", explaining that such links typically cannot inflict any damage on a regular website. He warned, however, that excessive blog-comment links tend to prompt people to report spam to Google, in which case Google may take manual action against the site. Even then, the largest search engine on the Web considers the scale of the abuse and does not act harshly unless the offence is severe.

The second issue concerned websites with scripts that automatically generate pages from users' search queries. These pages offer little or no relevant content and often consist of advertisements surrounding a message saying that the link the user clicked has no search results. The user asked what action Google takes against such websites. Cutts explained that they add no value to Google's search results and violate the Google Webmaster Guidelines; far from helping, they yield adverse SEO results. He asked users to report such websites and advised webmasters to block empty pages on their own sites so that they do not get indexed.
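Cutts' advice about keeping empty result pages out of the index can be implemented with a robots meta tag. A minimal sketch, assuming a site that serves its own internal search pages (the markup here is illustrative, not from the video): emit `noindex` on any internal search page that returns zero results.

```html
<!-- Illustrative internal search-results page with no hits.
     The robots meta tag asks search engines not to index it. -->
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <meta name="robots" content="noindex">
  <title>Search results</title>
</head>
<body>
  <p>No results were found for your query.</p>
</body>
</html>
```

An `X-Robots-Tag: noindex` HTTP response header achieves the same effect without touching the page markup, which can be easier when the empty pages are generated by a script.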

Panda was the concern of the third SEO question discussed this week. The user asked about its effect and sought a way to tell whether his website had been affected by it. Panda was released a few years ago and was followed by periodic incremental updates from Google. Eventually the updates became small and gradual enough for Google to integrate Panda into its regular indexing process. Generally, Panda can be avoided with good content that does not depend on short-term techniques such as keyword stuffing. In reality, SEO is more about structuring the website and its content properly so that the content has a meaningful standing on the Web. A steady increase in visitors comes only through consistency, which in turn requires the right strategy.

How Google Assists Webmasters with Search Engine Optimization


Google engineer Matt Cutts recently received a question from a YouTube user asking when Google will start supporting webmasters' questions with something other than automated responses. The user had faced some problems and sent the details to Google, but the replies that came back were again automated.

Scale is a huge problem for Google: it has to deal with millions of domain names, and offering individual responses is difficult at this stage. But it cannot be said that Google is doing nothing to support the SEO efforts webmasters make to get their sites proper exposure.

Google's primary concern is returning the best web results every time a visitor conducts a search query. The problems webmasters face with automated responses are a secondary concern. Even so, Google has already responded to 400,000 or more requests, answering various SEO questions from webmasters about optimizing their sites in the best possible way.

Even though responding to every individual query is hard for Google, scalable ways are being devised for answering the various types of questions that come in about websites. YouTube videos are an innovative channel Google has adopted for this: instead of replying to the same question numerous times, a video answers it once, saving Google the extra effort. The videos have become highly popular, and more people in the website business are finding them a good way to get answers to their doubts and queries.

New SEO-related questions addressed by Matt Cutts


In his latest webmaster video, Matt Cutts discussed some questions related to penalty link examples, SEO and Webmaster Tools.

It is important to fully understand a website's link strategy. Put simply, one should be fully aware of the sites that link to one's own. A higher PageRank matters a lot here, and anyone planning for successful SEO should make sure they are teaming up with quality websites. Webmaster Tools will not comment on incoming links directly, but their quality is reflected in the site's ranking. The tool will, however, show a notice if the webspam team takes manual action, giving examples of the penalised links. This does not always happen, which can leave site managers in dire straits.

The example links are not always comprehensive, as a particular website might contain numerous offensive links. The ones listed help to put everything back on track and to recognise the trouble-making links. Cutts added that it is next to impossible to notify webmasters of every offensive link, as the number is huge.

There are plans to include more details in these messages from here on, including two or three example links that can be followed as references.

The SEO implementation process is tricky and can be a hard task when dealing with inbound links. A number of sites get punished for reasons unknown to them. Webmaster Tools therefore needs to be used by anyone planning to maintain successful SEO.

Content Freshness and Concealed Content in SEO


Today's discussion concerns website optimization and webspam, focusing on what Google engineer and SEO expert Matt Cutts has to say on the subject.

The first part of the discussion regards the safety of concealed content, which the visitor can see only after clicking. Cutts says this kind of content appears very often on the web. He cites the example of store sites where, on the way to buying a product, visitors are often required to click buttons that reveal the specifications as a drop-down. This is done to keep the website tidy. Such elements are not harmful as long as they are designed solely for the visitor's benefit; in fact, they are quite safe.
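The kind of click-to-reveal product specifications Cutts describes can be built with ordinary markup. One safe pattern is the HTML5 `<details>` element, sketched below; the product details are invented for illustration and are not from the video.

```html
<!-- Content hidden behind a click for tidiness, not deception:
     the full text is present in the page source and readable by crawlers. -->
<details>
  <summary>Product specifications</summary>
  <ul>
    <li>Weight: 1.2 kg</li>
    <li>Dimensions: 30 × 20 × 5 cm</li>
    <li>Warranty: 2 years</li>
  </ul>
</details>
```

The point of the pattern is that the hidden text serves visitors and is fully visible in the source, which is exactly the distinction Cutts draws between tidy design and deceptive concealment.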

However, a very tiny button that is hardly visible to a visitor yet hides many pages of content is treated by Google as a violation. Such sites are consequently sent down the rankings in search results.

The next part of the discussion concerns the importance of fresh content for SEO. If, for instance, a person searches for the term "earthquake", he is presented with results mentioning any current occurrence of such an event. Sites with the freshest information therefore rank higher in these results. However, if the term is more generalized and evergreen, freshness counts for little in ranking the search results.

Finding Bad Links Directed at Your Site


While building any website, SEO is used to optimize the site, which in turn can land webmasters in all sorts of trouble, such as handling bad links. If several links from illegitimate sites point towards a site, its rank can drop drastically. The purpose of this is to eliminate sites that use bad backlinking strategies, but legitimate businesses also get punished in the process. The most unfortunate thing is that Google does not provide examples of the "bad" links when it sends reports to sites.

People face various SEO troubles after submitting a reconsideration request to Google. They generally ask for clarification about which links cause the actual problem, but are surprised to be met with another denial. This is mainly because the system replies only with "No", "Processed" and "Yes"; it is not capable of providing the clarification webmasters need.

Matt Cutts recommends that people visit the Webmaster Tools forum and get answers to their questions there. Experienced webmasters are available who can provide helpful feedback, leading to a better understanding of the actual cause of the problem. The Tools also offer examples of "bad links", which can be quite beneficial when it comes to SEO.

Cutts has acknowledged that transparency is currently lacking, which can affect legitimate businesses adversely. The main reason the system works this way is the growing number of spammers: it is simply not possible to respond to every individual trying to fool the system with illegitimate tactics.

Matt Cutts answers SEO-related questions


Google's Matt Cutts faced some "website downtime" and "voice search" questions in this week's one-to-one.

Someone asked whether a website will suffer a lot, particularly in its ranking and SEO, if it stays down for 24 hours. The man from Google answered that a few hours of downtime is not a big concern, but if the hours start turning into weeks, the site surely needs immediate attention. Because Google always tries to maintain the integrity of its results, it will not lead visitors to a malfunctioning website. The best way to preserve a site's ranking is therefore to resolve such issues as early as possible.

Voice search was next in line, as one user wanted to know whether the popularity of voice searches influences search syntax in some way. Cutts said that voice queries are comparatively more natural, and added that the company is trying to make voice search better with every passing day. The search algorithms boil queries down to get hold of what the user is really asking about. One can safely say that query syntax has changed a great deal over the last few years and will continue to do so. Cutts hinted that this process will continue indefinitely, so SEO professionals would be wise to perform consistently if they want to survive in this game.


Attributing Source Content with SEO


While writing original content for a blog, people always wonder how the sources they used should be attributed; in other words, they want to learn where to place, within an article, the links to the resources used in writing it. Recently, one webmaster contacted Matt Cutts to learn more about this issue and its importance for SEO. Cutts offered some answers that were genuinely surprising.

The thing is, where the attribution links are placed does not actually matter much for rankings. While building up SEO and staying within the Google Webmaster Guidelines, people tend to think that where the links sit within the article determines PageRank flow and helps the site rank better, but that is not actually true. Instead, Matt offered the webmaster tips that are preferable from a reader's point of view.

With that in mind, Matt feels that the link to an article's original source (the article from which the writer got the information) should be placed high up in the post, ideally in the first paragraph. He also stated that one cannot just write "According to popular auto blog…" or "According to (name of the website)…": the webmaster should place the actual link in the article so that readers are directed to the source material. Otherwise, although the source is technically attributed, PageRank does not flow to it, and the writer is not being considerate towards readers looking for additional information, which is no doubt an essential part of SEO.

There is no harm in placing the source link at the article's bottom, as long as it is written like "Source: Website", where the website name is the link. However, when the article is long, there is little chance readers will scroll to the bottom, which means the source is effectively not being acknowledged at all.
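Put concretely, the two attribution styles discussed above might look like the following sketch; the site name and URL are invented for illustration.

```html
<!-- Preferred: a real link to the source, high up in the post
     (ideally the first paragraph), so readers can reach it easily. -->
<p>According to <a href="https://example-autoblog.com/report">Example
Auto Blog</a>, the new model launches next spring.</p>

<!-- Acceptable: an explicit source line at the bottom of the article,
     with the site name itself serving as the link. -->
<p>Source: <a href="https://example-autoblog.com/report">Example Auto Blog</a></p>
```

The first form follows Cutts' advice directly: the reader gets the link where they are actually reading, and the attribution is a real hyperlink rather than a bare mention of the site's name.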

Matt Cutts advises bloggers and webmasters to acknowledge the sources from which they obtained their information. This means not only citing the sources but also being considerate about where the attribution links are placed within the article. Readers should find it easy to reach additional information related to the article's topic. Responsibility on the web is about much more than SEO.


SERP subjectivity and Advertorial questions answered by Matt Cutts


Google's webspam team head, Matt Cutts, recently shared his views regarding advertorials on his YouTube channel.

Advertorials may look like age-old advertising, but they read more like editorials or natural content. The problem is that when people go through an advertorial piece, they quickly conclude that it is genuine editorial content written independently. Many a time, advertorials pass PageRank, which creates quite a problem because their paid nature is rarely disclosed. Cutts stated that anyone found doing this could see a number of SEO-related penalties heading their way.

To avoid these penalties, one needs to disclose the advertorial nature of the content to Google. This can be done with the rel="nofollow" link attribute, which tells Google not to pass PageRank through the link.
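As a sketch, a paid link marked up with rel="nofollow" would look like this (the URL and anchor text are illustrative):

```html
<!-- The rel="nofollow" attribute tells Google not to pass
     PageRank through this paid advertorial link. -->
<p>Sponsored: learn more at
  <a href="https://example.com/widgets" rel="nofollow">Example Widgets</a>.</p>
```

The attribute goes on each paid link individually; the rest of the page's links are unaffected and pass PageRank as usual.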

The next question Cutts dealt with concerned Google's view on search quality when it relies on subjective signals. The assumption behind the question is faulty, he explained: many think that, as far as SERPs are concerned, Google cannot determine the context of a search query, but in fact it can do so quite often. Human testers can deduce the context simply by looking at the query and the domain; it is quite easy for them to see how the two are related, even when they are not familiar with the subject matter.

SEO in coming months from Google’s perspective


It is time once again to analyse one of Matt Cutts' videos and extract useful information from it by dividing it into smaller, itemised sections. One user asked Cutts where SEO could be heading in the coming months from Google's perspective.

One of the key things he suggested was developing a really impressive website with comprehensive, relevant content that gets shared. However, he cautioned that, as far as Google's plans are concerned, strategies may change at any time for any reason. Still, he confirmed that his company has completed work on Penguin 2.0 and briefed viewers on some of its features. According to him, the updated Penguin will impact 2.3% of typical search queries in the US. This is the second major update to the algorithm, and Google encourages people to report any spam that slips through it, as Google always aims to improve.

Checking advertorials that violate Google's quality guidelines is another thing his team is doing; simply stated, paid reviews should not pass PageRank. Engineers are also working on new approaches to link analysis. Although this work is at the earliest stage of development, it looks promising to Mr. Cutts.

Google is also looking for new ways to inform webmasters about hacking and about websites that dispense malware. Webmasters would then have a resource they can consult for information on what needs to be corrected on a hacked website. Repairing the damage that hacking causes is crucial to restoring all the hard SEO work.

Google rolled out Penguin 2.0


As per Matt Cutts' blog post, Google started rolling out the newest version of the Penguin webspam algorithm on May 22, 2013.

Only 2.3% of US queries will be affected, and the goal of the update is to have a greater impact on webspam sites.

During the rollout of Penguin 2.0, some websites were downgraded as a result of negative SEO, which is perhaps an indicator that negative SEO can now be used more easily by your competition to knock your site out of the high rankings.

If your website has lost ranking since May 22nd, then perhaps you need to research whether it has been targeted by negative SEO.

